The Year Ahead in Information Management: Smarter, Cloudier and Greener


As we enter the next decade of the millennium, information technology will become even more ubiquitous, driving an ever greater share of business decision making and operations. IT has proven its mettle through the recent downturn as both a tactical and strategic weapon for streamlining as well as for maintaining competitive edge. Now, as the next round of economic recovery begins, companies will rely on IT even more to better understand and serve their markets and customers. Yet managing a growing array of IT hardware, software, and services presents many challenges. To address these requirements, businesses continue to look to approaches such as analytics, virtualization, and cloud computing. To capture the trends shaping the year ahead, Database Trends and Applications spoke to a range of industry leaders and experts.

Business analytics will usher in the 'democratization of data.' For years, industry observers have talked about making business intelligence capabilities available across the enterprise rather than relegating them to a small number of business analysts armed with sophisticated tools. That is changing. "Increasing data volumes and the desire for faster insight into information is driving business analytics as the next megatrend that will shape the data management industry in 2010," Don Campbell, CTO of business intelligence and performance management for IBM, tells DBTA. "Business analytics initiatives will combine the use of information integration, analytics, business intelligence, search, automatic Web spiders, data visualization, and complex event processing powered by high bandwidth and 'extreme' transaction processing to instantly parse diverse, unstructured and disconnected pieces of data."

Predictive, self-service BI and analytics will emerge. Along with democratization, business intelligence will take on a futuristic cast, predicts John Callan, director of product marketing at TIBCO Spotfire. "In 2010, putting the power of predictive analytics into the hands of everyday knowledge workers - and taking self-service, self-guided BI to the next level - will drive competitive advantages and greater predictability for businesses," he tells DBTA. "While traditional BI provides the ability to report on past and present trends, predictive analytics provides much-needed future insight. In order to achieve this next level of the 'predictive business,' organizations must address two things throughout 2010 - the need for predictive, self-guided analysis so that everyday business users are making decisions with a future-focused frame of mind, not about what has been but what will be; and the challenge that static, pre-packaged BI reports put on workers with immediate analysis needs who work in the context of a specific business process with its own unique data and analysis requirements."

Social media monitoring and CRM data management converge. All the social networking activity that has emerged over the past two years will begin to deliver information that helps in making business decisions, says John Cass, author of "Strategies & Tools for Corporate Blogging." "Companies have been driving toward using measurement and response for social networking," he tells DBTA. "The natural conclusion is to use CRM tools to manage the engagement processes with customers. The volume of content in social media means that companies simply have to automate the process, but they also have to put thinking, intelligent professionals in charge of those tools. Social media analysis requires levels of analysis that go beyond what we've seen before in marketing efforts. I think this convergence of measurement and the decision as to when to respond puts demands on the data center in 2010."

More data security through logging. As the economy improves and operations begin to expand again, there will be renewed attention to data security issues. More companies will start turning to a tried-and-true security mechanism - database logging. "Database logging is often the neglected stepchild in the large enterprise because of the performance hit that the average database takes when auditing is turned on," Dimitri McKay, security architect at LogLogic, tells DBTA. "Although there are a host of tools on the market today that will collect logs from the database - via JDBC connections, via flat file collection, via network monitoring, or even agent-based - there is still a huge gap between databases that are logged and those that are not. Within the next year, I expect we'll see database security become a much larger conversation. With a host of tools available that are not so invasive in logging these databases, more and more databases will be added to the overall logging scope. As a result, databases will become more secure in 2010."
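As a rough illustration of the JDBC-style collection approach McKay describes, the sketch below polls a database audit table for new events and forwards them to a log pipeline. The table, columns, and functions (audit_log, collect_audit_events, forward) are hypothetical placeholders, not any vendor's implementation.

```python
# Minimal sketch of pulling new audit events from a database audit table and
# forwarding them to a central log collector. The table and column names
# (audit_log, event_time, db_user, action) are hypothetical placeholders;
# a real deployment would read the vendor's own audit views, typically over
# JDBC/ODBC, as described in the article.
import sqlite3
import json

def collect_audit_events(conn, last_seen_id):
    """Return audit rows newer than last_seen_id as JSON-ready dicts."""
    cur = conn.execute(
        "SELECT id, event_time, db_user, action FROM audit_log WHERE id > ? ORDER BY id",
        (last_seen_id,),
    )
    return [
        {"id": row[0], "time": row[1], "user": row[2], "action": row[3]}
        for row in cur.fetchall()
    ]

def forward(events):
    """Stand-in for shipping events to a log management system."""
    for event in events:
        print(json.dumps(event))

if __name__ == "__main__":
    # In-memory database with sample audit rows, purely for illustration.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE audit_log (id INTEGER PRIMARY KEY, event_time TEXT, db_user TEXT, action TEXT)"
    )
    conn.executemany(
        "INSERT INTO audit_log (event_time, db_user, action) VALUES (?, ?, ?)",
        [("2010-01-04T09:00", "app_user", "SELECT"), ("2010-01-04T09:01", "dba", "GRANT")],
    )
    forward(collect_audit_events(conn, last_seen_id=0))
```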

A rise in memory virtualization. As demand for greater capacity and speed grows, companies will increasingly turn to memory virtualization. "Next year, we believe memory virtualization will be as widely accepted in the data center as server virtualization, desktop virtualization and storage virtualization," Clive Cook, CEO of RNA Networks, tells DBTA. "In 2009, we saw a number of vendors like Cisco and HP focus on the problem of memory limitations in the data center. Attention to this age-old problem will only heat up in 2010. Three specific trends are fueling this growth - memory is a hundred times faster than storage; data proliferation is putting a strain on the data center, and a new caching model that reduces replication and supports a broad range of data types is needed; and memory virtualization's ability to dynamically shift support to different applications and automatically be configured, set up, provisioned and decoupled from servers enables cloud providers to maximize resources."

Virtualization will morph into internal cloud computing. "Cloud computing is a developing capacity paradigm that shows promise for future data centers, providing either a fixed or variable pool of resources from which to rapidly provision virtual server images with customer-specified capacity allocations - such as CPU, memory, and storage," Vinod Kachroo, vice president of enterprise infrastructure for MetLife, tells DBTA. "In 2010, MetLife looks to build on the progress it has made in utilizing virtualization in its data centers, which has already delivered both improved capacity and significant cost savings and cost avoidance. It seeks to do so by further exploring cloud computing."

Databases in the cloud. "In 2010, customers desiring cloud deployments will be seeking cost-effective ways to decouple their applications from the database to avoid potential scalability bottlenecks, but in ways that still provide the data consistency and high availability of the database," Jeff Hartley, vice president, marketing and products for Terracotta, tells DBTA. "Distributed caches, clustered memory solutions, and in-memory data grids will receive more attention as complements to the database that can enhance an application for more efficient operations in clouds."
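To make the "complement to the database" idea concrete, here is a minimal cache-aside sketch: reads are served from an in-memory cache when possible and only fall through to the database on a miss. The in-process dict stands in for a distributed cache or in-memory data grid, and the names (load_from_db, get_customer) are illustrative assumptions, not Terracotta's API.

```python
# Minimal cache-aside sketch: the application checks an in-memory cache first
# and only falls through to the database on a miss or expired entry. A simple
# dict stands in for a distributed cache or in-memory data grid.
import time

CACHE = {}               # stand-in for a distributed cache / data grid
CACHE_TTL_SECONDS = 60   # assumed freshness window, for illustration

def load_from_db(customer_id):
    """Simulated (slow) database read."""
    time.sleep(0.05)  # pretend network and query latency
    return {"id": customer_id, "name": f"customer-{customer_id}"}

def get_customer(customer_id):
    """Cache-aside read: serve from memory when possible, else hit the DB."""
    entry = CACHE.get(customer_id)
    if entry and time.time() - entry["stored_at"] < CACHE_TTL_SECONDS:
        return entry["value"]                    # cache hit, no database round trip
    value = load_from_db(customer_id)            # cache miss, go to the database
    CACHE[customer_id] = {"value": value, "stored_at": time.time()}
    return value

if __name__ == "__main__":
    get_customer(42)   # first call hits the database
    get_customer(42)   # second call is served from the cache
```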

Platforms in the cloud. "The idea of having groups of individual servers will fade and be replaced with the reality of flexible, functional and cost-effective platforms," Matt Heinrichs, vice president of data operations for Ratchet, tells DBTA. "In order for platform as a service (PaaS) to become a reality, enterprises need to trust all of their data and processes to a public cloud. This will happen in the near future. For now, what they can accept are the advantages of PaaS hosted in their own data centers. Enterprises also need vendors with the vision and ability to make this happen. VMware is the most motivated player in this space with standard virtualization becoming commoditized and marginalized, but bigger players like Microsoft and IBM are much more capable of pushing all the pieces forward to make this become a reality."

Cloud will extend data analysis. "The continued explosion in data volumes and the vast increase in compute power for data analysis will shape the data center as we enter 2010," Jason Stowe, CEO of Cycle Computing, tells DBTA. "As a result of these demands, IT will continue to look to cloud computing to manage and execute computations quickly and easily. In the cloud, using one processor for 1,000 hours costs the same as running 1,000 processors for one hour - therefore utilizing this type of technology speeds the time to result for calculations."
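Stowe's point is simple processor-hour arithmetic, sketched below with an assumed, purely illustrative hourly rate: the spend is identical either way, but the parallel run finishes roughly 1,000 times sooner, provided the job parallelizes cleanly.

```python
# Back-of-the-envelope illustration: renting one processor for 1,000 hours
# costs the same as renting 1,000 processors for one hour, but the wall-clock
# time to result differs by roughly 1,000x. The hourly rate is an arbitrary
# illustrative figure, not a vendor price.
HOURLY_RATE = 0.10  # assumed price per processor-hour, for illustration only

def cloud_cost(processors, hours, rate=HOURLY_RATE):
    """Cost scales with total processor-hours, not with elapsed time."""
    return processors * hours * rate

serial = cloud_cost(processors=1, hours=1000)     # 1,000 processor-hours
parallel = cloud_cost(processors=1000, hours=1)   # 1,000 processor-hours
print(serial, parallel)   # both 100.0: same spend
print(1000 / 1)           # but results arrive ~1,000x sooner if the job parallelizes
```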

Cloud will put more of the spotlight on data security. Cloud computing has its advantages, but over the coming year, more questions will be asked about data security, Mike Logan, CEO of Axis Technology, tells DBTA. "Increasingly, the ease of getting data will be met with the eternal question: 'Is my data secure?' Customers want to know their data is safe and secure. They only want parties they trust and explicitly approve to have access to their data, and then only for the specific purpose it is needed. In 2010, these concerns are going to catch up with the technology. In response to the financial crisis, new regulations have been adopted and will be implemented that include provisions that provide security to consumers. Companies that do nothing will be held accountable and face significant penalties for neglecting data security issues, including fines, customer attrition and bad publicity."

Large scale, end-to-end cloud adoption. In 2009, "CIOs sank their teeth into what cloud computing is and how it can be applied," Dennis Quan, director of development for autonomic computing at IBM, tells DBTA. "We predict the real game-changing event is imminent - when companies move beyond simply virtualizing their servers and start applying cloud computing concepts in earnest - self-service, automated processes, and elastic, massive scalability." Quan predicts that in 2010, companies will be rolling out "cloud-based applications on a large scale, realizing significant business value and increased competitiveness from solutions tailored to specific business problems such as lower-cost development and test environments, cross-organization collaboration, and secure desktop virtualization."

SOA revives. Many commentators declared SOA dead this year, but the methodology will revive with a business cast in 2010. "In 2010, enterprises will change the way they define and use SOA. Driven by business needs within a challenging and rapidly changing economy, IT organizations have had to broaden their use of SOA to enable more intelligent, flexible processes," Craig Hayman, general manager for IBM WebSphere, tells DBTA. "The scope has expanded to integrating business processes and technology, driven by requirements of business users in more extended areas of the enterprise."

IT moves from maintenance to more strategic activities. For too long, IT departments have been in the "plate-spinning" business, Lynda Stadtmueller, senior research analyst covering business communication services for Frost & Sullivan, tells DBTA. But this is changing, and we're likely to see more evidence of it in the year ahead. "Like the plate spinners on the old Ed Sullivan show, IT personnel find themselves running from task to task just to keep all the parts moving," she explains. "Instead of propelling the business forward, the goal is to avoid a spectacular crash." This mindset is changing, she continues. "Faced with resource constraints and a never-ending onslaught of application demands from colleagues, partners, and customers, IT will stop trying to do it all. Data center managers will choose to give up non-strategic 'grunt-work' maintenance functions to free up time, budget and resources for more strategic undertakings."

More IT automation. Consolidation through cloud and virtualization will result in "future data center architectures increasingly looking like traditional supercomputing facilities, where application services are deployed across pools of physical and virtual resources - servers, storage, networks - by intelligent workload management software that decides how to best allocate workloads in time and space across a heterogeneous computing landscape," David Jackson, CEO of Adaptive Computing, tells DBTA. "In order to deliver both efficiency and agility, this workload-aware automation and management software must enforce service-level agreements (SLAs), be driven by business policies, and be capable of end-to-end automation with minimal human intervention."
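As a loose sketch of what "workload-aware, policy-driven" placement can mean in practice, the example below orders workloads by SLA urgency and places each onto the pooled resource with the most free capacity, queuing anything that does not fit. The data structures and policy here are illustrative assumptions, not Adaptive Computing's product logic.

```python
# Minimal sketch of SLA- and policy-driven workload placement: tighter-deadline
# workloads are placed first, each onto the pooled resource with the most free
# capacity. Real workload managers weigh many more factors (locality, power,
# licensing); the data structures here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str
    capacity: int                 # e.g. free cores
    assigned: list = field(default_factory=list)

@dataclass
class Workload:
    name: str
    demand: int                   # cores required
    sla_hours: float              # deadline; smaller = more urgent

def place(workloads, resources):
    """Greedy placement: most urgent SLA first, onto the least-loaded resource."""
    for wl in sorted(workloads, key=lambda w: w.sla_hours):
        candidates = [r for r in resources if r.capacity >= wl.demand]
        if not candidates:
            print(f"{wl.name}: no capacity, queue or burst to cloud")
            continue
        target = max(candidates, key=lambda r: r.capacity)
        target.capacity -= wl.demand
        target.assigned.append(wl.name)
    return resources

if __name__ == "__main__":
    pool = [Resource("rack-a", 16), Resource("rack-b", 8)]
    jobs = [Workload("nightly-batch", 8, sla_hours=12),
            Workload("risk-calc", 12, sla_hours=2),
            Workload("report-gen", 6, sla_hours=4)]
    for r in place(jobs, pool):
        print(r.name, r.assigned, "free cores:", r.capacity)
```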

Emergence of an 'ERP of energy management.' "Energy efficiency can be achieved through sector-specific technologies such as data center in-row cooling, air containment, right-sizing, and through end-to-end energy management and control systems - for example, by linking data center management and building management systems," Steven Carlini, senior director at APC by Schneider Electric, tells DBTA. He predicts more enterprises will embrace reference architectures, which he equates to "the beginning of a technical framework for an 'ERP of energy management' - a controlling, open, web-services-based energy management system and dashboard for an entire business that links into the domains."

