Seven Trends Reshaping the Data Ecosystem Around Oracle


In recent years, the networks of developers, integrators, consultants, and manufacturers committed to supporting database systems have morphed from one-on-one partnerships into huge, interdependent ecosystems, subject to cross-winds of trends and shifts. Nowhere is this more apparent than in the enormous ecosystem that has developed around Oracle.

With Oracle’s never-ending string of acquisitions, new functionality, and widespread adoption by enterprises, trends that shape this ecosystem are certain to have far-reaching effects on the rest of the IT world. Concerns that percolate through the ecosystem reflect — and influence — broad business concerns. New paradigms — from cloud computing to big data to competing on analytics — are taking root within the Oracle ecosystem long before they do anywhere else.

With this in mind, DBTA reached out to leading experts and vendors in the database world to discuss the issues and concerns seen within their parts of the ecosystem. Seven key trends emerged from those conversations.

More Big Data

Not surprisingly, big data is the one trend most frequently cited as driving new change within the data ecosystem. What role will Oracle play, and what is the future of its traditional relational database management system?

Data ecosystem participants expressed the following concerns about the big data phenomenon:

On the overall impact of big data:

Jeff Reser, manager of cloud and DataDirect marketing, Progress Software: “Big data solutions will begin to support—out of necessity—whatever mixed bag of information is out there. This means more than storing and retrieving different types of information. It means giving other ISVs and customers the flexibility to leverage the information in the way they need to today and in the future, without redesigning the data store.”

Srinivas Prabhala, CTO of Financial Services and Insurance at Infosys: “We are seeing a lot more of big data. Unstructured data being analyzed is mostly external and received in files and so Hadoop is the preferred way to go. For structured data, if the data is internal then it would probably reside on an RDBMS. Hence, most companies are looking for in-RDBMS big data analytics where the huge data does not have to be replicated into a file or Hadoop to process it.”

Gerry Miller, partner with Kalypso: “Established data warehouse, data integration and business analytics vendors have been slow to respond to big data analytics, opening the door for smaller niche players that are forming an open source framework to process big data sets across clustered systems. Over the next 2 years, I expect to see the largest three to five business intelligence software companies acquire smaller big data vendors — the same thing that happened with the last generation of business analytics innovators.”

On the impact of big data on the database market:

Amaresh Tripathy, partner with PwC: “To a large extent, the growth of unstructured data has driven much of the opportunity in the big data arena. This growth brings new challenges in the form of increased data volume, velocity, and complexity. But it also presents great opportunities for companies to mine data — both structured and unstructured—to create entirely new business opportunities. And, in the process, to gain competitive advantages that were not possible a short time ago.”

On Oracle’s response:

Sam Alapati, president of Miro Consulting: “Companies are starting to incorporate big data into their BI and data warehouse data stores, to take advantage of big data just as they do with traditional relational data. Oracle has taken a big step in this direction by releasing its Oracle Big Data Appliance, an engineered system consisting of both hardware and software, all ready to go. Oracle has hooked up with Cloudera to provide this platform for handling big data, using the Apache Hadoop software framework, which is well known for its processing capabilities in regard to big data.”

Ravi Vishwanath, general manager of the BPM Practice of Persistent Systems: “Oracle’s approach is to reduce the management cost associated with disparate systems by providing a centrally managed implementation of Hadoop and NoSQL offerings. Their approach is well thought out and practical. The fundamental issue with big data is figuring out which part of the data is useful and which needs to be categorized and saved for the future. Oracle NoSQL allows for this to happen by providing a categorization mechanism as data is acquired from all kinds of sources — both transient (streaming) and persistent web data. Oracle provides adaptors to move this data into analysis mode via Cloudera Hadoop, and provides their Big Data Appliance to further simplify the management of this process. The analyzed data can then be pushed into the relational Oracle DB for analytics, either by their Exalytics systems, leveraging in-memory database technology and other performance improvements, or by traditional BI systems.”
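Stripped to its essentials, the pipeline Vishwanath describes is a staged flow: tag data as it arrives, analyze it in bulk, then land the results in a relational store for querying. The Python sketch below illustrates that flow under loose, stated assumptions: a plain list and a batch aggregation stand in for the categorization and Hadoop stages, and Python's built-in sqlite3 module stands in for the relational target. Every function, table, and field name is hypothetical and is not part of any Oracle product.

```python
import sqlite3
from collections import Counter

# Hypothetical stand-ins for the staged flow described above: categorize data
# on ingest, aggregate it in batch, then load the results into a relational
# store for BI queries. Nothing here uses Oracle, Hadoop, or NoSQL APIs.

RAW_EVENTS = [  # stand-in for streaming and persistent web data
    {"source": "web", "category": "clickstream", "value": 3},
    {"source": "web", "category": "clickstream", "value": 5},
    {"source": "feed", "category": "sensor", "value": 7},
]

def categorize(event):
    """Tag each incoming event as it is acquired (the categorization stage)."""
    return (event["source"], event["category"]), event["value"]

def batch_analyze(categorized):
    """Aggregate categorized events in bulk (the batch analysis stage)."""
    totals = Counter()
    for key, value in categorized:
        totals[key] += value
    return totals

def load_results(totals):
    """Push the analyzed results into a relational store for querying."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE event_totals (source TEXT, category TEXT, total INTEGER)")
    conn.executemany(
        "INSERT INTO event_totals VALUES (?, ?, ?)",
        [(src, cat, total) for (src, cat), total in totals.items()],
    )
    return conn

if __name__ == "__main__":
    conn = load_results(batch_analyze(categorize(e) for e in RAW_EVENTS))
    for row in conn.execute("SELECT * FROM event_totals ORDER BY total DESC"):
        print(row)
```

Each stage consumes only the output of the one before it, which is the property that would let a real deployment swap in a key/value store, a Hadoop cluster, or an Oracle database at any of the three steps without disturbing the others.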

On the impact of Oracle Exadata:

Jeremiah Wilton, senior principal, solution design, Datavail: “Increasingly, we are seeing engineered systems finally gain traction with Oracle customers. Over the years, Oracle has tried to promote database ‘appliances’ as a way to eliminate the engineering needed to design and deploy a hardware solution for Oracle. With Exadata, and now the [Oracle Database Appliance], Oracle’s offering has hit the mark for many more customers than their previous attempts along these lines.”

Ash Ashutosh, CEO of Actifio: “Oracle Exadata comes fully assembled, debugged, and ready to run — which follows the ease-of-use movement towards unified infrastructure. However, Exadata and similar platforms require protection and availability tools. Typically, backing up an Exadata system requires a secondary system for data mirroring, which creates a challenge around extended RTOs and RPOs. For test and development, converged infrastructure solutions also present a challenge for organizations from a cost/complexity perspective. Optimized storage platforms that directly [integrate] with big data systems—such as Exadata—and can provide copy data management for backup and test/development easily and cost-effectively are emerging to address these challenges.”

Rick Caccia at Delphix: “One clear trend we see is Oracle’s aggressive expansion of the ‘Exa’ machines within customer accounts. There is a great deal of commentary and noise regarding big data, typically viewed as some form of Hadoop plus unstructured data sets. Certainly every one of our customers — who are also joint customers with Oracle — is looking at this approach for analysis of large data sets. However, existing relational databases are also growing significantly, on average 25% CAGR in our joint Oracle customer base, and those databases aren’t going to be replaced anytime soon — ERP typically doesn’t run on Hadoop. Within these environments, Oracle appears to be very focused on Exadata for RDBMS management, Exalogic for app server management, and Exalytics for structured analysis.”

Wilton (Datavail): “The reason for the success of Exadata and the ODA is simple. They both address problems that customers couldn’t easily solve by building their own systems.”

On the convergence of big data and cloud:

Oren Elias, executive vice president of strategy and business development at Correlsense: “Big data is clearly in the driver’s seat at Oracle — just look at their recent Oracle Cloud announcement, it’s powered by Exadata and Exalogic. Anybody who wants to stay on board with Larry Ellison had better be prepared to jump in — at times head-first — to the cloud, and hope for a soft landing, to keep the metaphor running.”

On Hadoop:

Laura Teller, chief strategy officer for Opera Solutions: “Much data is unstructured, which makes it useless for all practical purposes unless it can be transformed into computable form. Arguably, then, some of the greatest opportunities to expand the data ecosystem lie in advancing the science that can extract computable features, information, and patterns from this unstructured data. Database vendors are seeking ways to accommodate a constant and very large flow of unstructured data from outside enterprises, and to insert some analytics so that some information can be extracted from the data. We are seeing a wave of acquisitions and partnerships that allow this to happen. We are also seeing a lot of exploration and some early stage implementation of distributed processing capabilities such as Hadoop — we are running one of our applications on a Hadoop cluster right now.”

More Cloud

Cloud has become a platform of choice across many enterprises. According to “Enterprises Advance into the Cloud: 2011 IOUG Cloud Computing Survey,” a study sponsored by Oracle and conducted by Unisphere Research among 257 IOUG members, adoption of both private and public clouds is up. Thirty percent of respondents report having limited-to-large-scale private clouds, up from 24% only a year ago. Another 25% are either piloting or considering private cloud projects. Public cloud services are also being adopted by more than one out of five respondents. In addition, cloud services are carrying larger workloads within organizations. A large segment of respondents, 37%, report that they now use or offer between one and 10 services through a private cloud. And among organizations adopting public cloud services, a large segment have replaced applications previously offered by their own IT departments.

Industry partners and experts agree that cloud is becoming a force across the data ecosphere as well.

On moving data to the cloud:

Alan Santos, vice president, Red Hat: “Many of the qualities that make cloud appealing for processing — elasticity, flexibility, transient availability — don't apply to data. But for most types of applications the processing is only as valuable as the data being used, thus creating the pull for data to follow. There is certainly appeal to offsite hosting for data; it’s just not as compelling given certain technical qualities. There are also unique concerns around history, privacy, and security, as data is usually much more sensitive and transparent than processing. Some of these concerns are superficial or already addressed, but there are very real technical and legal limitations that sometimes make it impossible. With respect to legal issues, there are very real constraints around redundancy and geography that many cloud providers do not address. Regardless, many of the concerns are superficial. Consider the success of Salesforce.com: customer data is the most sensitive information a company has, yet Salesforce has been fantastically successful getting companies to move this information onto their cloud.”

On cloud and business resiliency:

Ashutosh (Actifio): “With cloud, users are putting all eggs in one basket, which dictates a multi-layer data protection and disaster recovery approach. Cloud also makes deploying new workloads increasingly easy and this has led to the ‘VM-sprawl’ challenge.”

Alapati (Miro): “Oracle has been increasingly focusing on how to build applications, automate processes, and secure the enterprise in a cloud environment. Oracle has demarcated its cloud services into cloud application, platform, and social services. The cloud application services support mission-critical business services such as sales and marketing, while the platform services let companies quickly test and deploy enterprise business applications on Oracle’s database and application server product lines. Oracle’s cloud-based social services help companies transform business processes with social collaboration and insight services.”

More In-Memory and Real-Time Analytics

With many organizations facing the challenges of big data, in-memory is seen as a way to dramatically speed up the ability to access and analyze information. In-memory analytics brings numerous business and technical benefits to organizations. Running existing applications in-memory, or refactoring them to exploit in-memory approaches, can result in improved transactional application performance and scalability, latency of less than a microsecond, enhanced application messaging, dramatically faster batch execution, and faster response times in analytical applications. A Unisphere Research survey of 446 data managers and professionals, “The Post-Relational Reality Sets In: 2011 Survey on Unstructured Data,” conducted in partnership with MarkLogic among readers of Database Trends and Applications, finds that 18% are exploring in-memory technology as a strategy for managing their growing data assets.
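As a rough illustration of why moving the working set into memory cuts latency, the sketch below loads a small lookup table once and then serves reads from an in-process dictionary rather than issuing a query per request. It is a minimal sketch in plain Python, not Oracle TimesTen or Exalytics, and the table, column, and function names are hypothetical.

```python
import sqlite3

def open_source_db():
    # In this self-contained demo the "source" database is itself an in-memory
    # SQLite database; in practice it would be a disk-backed RDBMS.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE fx_rates (currency TEXT PRIMARY KEY, rate REAL)")
    conn.executemany("INSERT INTO fx_rates VALUES (?, ?)",
                     [("EUR", 1.08), ("GBP", 1.27), ("JPY", 0.0067)])
    return conn

def lookup_per_query(conn, currency):
    # One round trip per lookup: latency is dominated by query overhead.
    row = conn.execute("SELECT rate FROM fx_rates WHERE currency = ?",
                       (currency,)).fetchone()
    return row[0] if row else None

def load_working_set(conn):
    # Load the table once; later lookups are plain in-memory dict reads.
    return dict(conn.execute("SELECT currency, rate FROM fx_rates"))

if __name__ == "__main__":
    conn = open_source_db()
    rates = load_working_set(conn)
    assert lookup_per_query(conn, "EUR") == rates["EUR"]
    print(rates)
```

The trade-off is the same one in-memory products manage at much larger scale: memory holds a copy of the data, so the application has to decide how the cached working set stays consistent with the system of record.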

A number of industry ecosystem partners are also responding to the in-memory opportunity that may change the way analytics are delivered within enterprises:

Mark Troester, global product marketing manager for CIO and IT strategy for SAS: “The ability to move processing or analytics closer to the data is a key to performance, scalability and manageability, and that goes double when it comes to big data. One way to accomplish this is to leverage in-database capabilities, which involves taking the analytics capabilities and running the processing directly inside the database. Organizations can achieve finer analysis and segmentation that enables a better understanding of customer behaviors so they can build and deliver more effective marketing campaigns. Similarly, the ability to embed analytics within the database improves the speed of marketing activity — these organizations can respond promptly to customer needs, can more easily align resources to high value campaigns and react more quickly to market trends so they can drive profitable revenue growth. We are working directly with the Oracle engineering team to accomplish this.”
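The in-database idea Troester describes comes down to where the computation runs. The minimal sketch below uses Python's built-in sqlite3 as a stand-in for Oracle, with a hypothetical purchases table and segmentation rule: instead of pulling every row to the client and segmenting it in application code, the same segmentation is expressed in SQL so only the summary leaves the database.

```python
import sqlite3

# sqlite3 stands in for Oracle here; the table, columns, and the ">200 spend"
# segmentation rule are all hypothetical, chosen only to keep the demo small.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE purchases (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO purchases VALUES (?, ?)",
                 [(1, 20.0), (1, 180.0), (2, 35.0), (3, 400.0), (3, 90.0)])

# Client-side approach: every row is fetched, then segmented in application code.
totals = {}
for customer_id, amount in conn.execute("SELECT customer_id, amount FROM purchases"):
    totals[customer_id] = totals.get(customer_id, 0.0) + amount
client_segments = {cid: ("high" if t > 200 else "standard") for cid, t in totals.items()}

# In-database approach: the same segmentation runs inside the database engine,
# and only one summary row per customer is returned.
rows = conn.execute("""
    SELECT customer_id,
           CASE WHEN SUM(amount) > 200 THEN 'high' ELSE 'standard' END AS segment
    FROM purchases
    GROUP BY customer_id
""").fetchall()
db_segments = dict(rows)

assert client_segments == db_segments
print(db_segments)
```

On a toy table the two paths are interchangeable; at warehouse scale the second avoids shipping raw rows across the network, which is the performance and manageability argument being made here.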

More Competing on Analytics

In an era of hypercompetitive global markets, organizations recognize that the one true advantage they have is their data, and the ways they can go about analyzing it to understand market trends, as well as to better engage with customers. Data provides the needed edge.

Data ecosystem participants made the following observations regarding their increasing support for analytics-driven organizations:

Oliver Halter, partner with PwC: “We see vendors emphasizing speed and performance in their technologies, and incorporating more analytical capabilities — including forecasting, fraud detection, and business activity monitoring — into their product suites. Many of the traditional infrastructure and application vendors are adding capabilities to their product suites, either directly or through acquisition of smaller players that have focused directly on mining and interpreting unstructured data.”

More Heterogeneity

More and more, the all-Oracle or all-IBM or all-Microsoft shop is fading into memory. Companies employ a variety of systems and databases — from multiple vendors, and supporting multiple types of data and implementations. For example, according to a recent Unisphere Research survey, “Moving Data: Charting The Journey From Batch To Blazing, 2012 DBTA Survey On Data Integration Strategies,” while traditional relational database management systems are common at nine out of 10 sites, close to a third, 31%, also support data warehouse appliances. In addition, the survey of 338 DBTA readers, sponsored by Attunity, finds that 15% are moving into data services and data virtualization. NoSQL and in-memory databases are also employed by up to one out of 10 enterprises.

Industry ecosystem partners provided the following insights about the growing heterogeneity of the database world:

Alan Santos (Red Hat): “We’re now in the midst of a Cambrian explosion with respect to data management. Relational storage is no longer enough, hence the proliferation of key/value stores, columnar databases, and document-oriented databases. At some point in the future there will be consolidation of vendors and technologies, but a lasting result will be the recognition that there’s no one right tool for every job, and IT developers, operations staff, and others will be more selective, choosing different tools for different projects. Overall, it’s a great thing for technology practitioners.”

Don Bergal, CMO of Confio Software: “The ecosystem is expanding as customer sites become more complex and more heterogeneous. Today, there is no such thing as a pure Oracle shop, and Oracle can no longer dictate a complete solution to its technical customers. Microsoft SQL Server has become robust and sophisticated enough to handle the heavy lifting. Enterprise installations are migrating to mixes of traditional servers, private clouds, and public clouds. Open source at both the high and low end — from Hadoop to PostgreSQL and MySQL — creeps into the mix.”

Alapati (Miro): “Oracle can fully expect to remain a leader in the market. However, when it comes to new data management trends, Oracle has no such lock on the market. A good example is big data, where smaller companies still lead the market. Of course, mobile applications and the huge outpouring of unstructured data from social platforms represent another major new trend that Oracle, as well as other database companies, is seriously addressing.”

On the rise of open source databases:

Terry Erisman, CMO of Percona: “There’s been tremendous growth in the MySQL ecosystem, due to a combination of factors. One is the growth of large enterprise usage of MySQL for mission-critical applications which is driving our MySQL support business. We also see tremendous growth in the need for MySQL performance improvements to increase application responsiveness and decrease data storage costs.”

Mobility and BYOD

Ecosystem partners are also carefully watching the mobile bring-your-own-device (BYOD) phenomenon, spurred by the relentless rise of smartphones and tablets, which company employees are increasingly using to access data and run applications:

Peter Price, CEO of Webalo: “On the heels of the BYOD movement, employees now demand mobile access to back-end systems on their smartphone or tablet of choice as a way to do their job more conveniently and effectively. It’s not necessary to mobilize entire enterprise systems. Rather, employees need access to specific, discrete functions to do their jobs more effectively. This may include access to inventory management, timesheets, sales projections, and so on, depending on the specific job responsibilities of an employee.”

More Database Market Consolidation

Paradoxically, while the data ecosphere continues to grow more diverse, there is also a notable trend toward consolidation of data products and platforms, as organizations seek to clear up the sprawl and discontinuity seen in their IT assets.

Some industry partners see a tendency for large vendors such as Oracle to simplify and standardize their product lines:

Bill Abbott, partner with PwC: “In the early 2000s, Google and Yahoo had to deal with previously unknown data volumes as they indexed the World Wide Web. That challenge led to a host of innovations driven by the two internet giants. This in turn led to investments and start-ups that commercialized projects — such as Hadoop — and developed novel ideas, both proprietary and open source. Traditional database vendors are now playing catch-up. We can expect to see consolidation as large players acquire small ones and integrate their technologies.”

Wilton (Datavail): “Oracle has recognized that the vast majority of deployments don’t require extensive custom in-house engineering. Many of Oracle’s efforts over the last 5 years have been aimed at creating a single road most traveled. This means reducing the vast number of OS platforms, Oracle versions, hardware combinations, and diverse configurations in use at customer sites.”

