Trends and Applications

With its January 2010 acquisition of Sun Microsystems, Oracle gained the MySQL open source database management system (DBMS) for enterprise IT environments. MySQL is designed to let users build and manage complex applications and data sets, and has gained a substantial share of the overall DBMS market.

Posted January 07, 2011

DBTA Hadoop Webcast Now Available on Demand

Posted January 07, 2011

When designing a system, an architect must conform to all three corners of the CIA (Confidentiality, Integrity and Availability) triangle. System requirements for data confidentiality are driven not only by business rules but also by legal and compliance requirements. As such, data confidentiality (when required) must be preserved at any cost, irrespective of performance, availability or any other implications. Integrity and Availability, the other two sides of the triangle, may allow some flexibility in design.

Posted January 07, 2011

As security threats increase and become more sophisticated, organizations face pressure to implement strong processes and technology solutions to ensure compliance and the safety of critical assets. The risks associated with a data breach can be devastating, regardless of whether it results from a simple mistake or a stolen endpoint device such as a laptop. The impact goes beyond fines and lost revenue to negatively affecting an organization's brand identity and equity, or jeopardizing customers' trust. Providing greater clarity, as well as aligning with industry changes and best practices, Version 2.0 of the PCI DSS standard went into effect earlier this month.

Posted January 07, 2011

The idea of moving off IMS might seem compelling at first glance, but once you look at the whole picture, you might think otherwise. Most people think of cost as the primary reason to move off IMS. But if you weigh all of the comparative costs of IMS on a mainframe against those of a Windows/UNIX solution, you will find that running IMS is actually cost-effective. The obvious cost elements are hardware, software, and the huge expense of converting hundreds of thousands of lines of code and hundreds of databases. However, these are only a small part of the story.

Posted January 07, 2011

These days, many companies recognize that there are severe repercussions to ignoring or undervaluing data security, and a sizable segment of organizations - at least one-third, in many cases - have been taking additional measures to bolster their data security.

Posted November 30, 2010

When Data Virtualization?

Posted November 30, 2010

One common challenge I have observed during ITIL service catalog implementations pertains to the handling of out-of-band requests. That is, how should one manage a request for a service that is not in the catalog?

Posted November 30, 2010

The year 2010 brought many new challenges and opportunities to data managers' jobs everywhere. Companies, still recovering from a savage recession, increasingly turned to the power of analytics to turn data stores into actionable insights, and hopefully gain an edge over less data-savvy competitors. At the same time, data managers and administrators alike found themselves tasked with managing and maintaining the integrity of rapidly multiplying volumes of data, often presented in a dizzying array of formats and structures. New tools and approaches were sought, and the market churned with promising new offerings embracing virtualization, consolidation and information lifecycle management. Where will this lead in the year ahead? Can we expect an acceleration of these initiatives and more? DBTA looked at new industry research, and spoke with leading experts in the data management space, to identify the top trends for 2011.

Posted November 30, 2010

If data is the lifeblood of an enterprise, a robust master data management (MDM) solution may well be the heart, pumping purified data downstream to vital applications and databases while simultaneously accepting inaccurate and old data for cleansing and enrichment. This "bloodstream," as we know it, comprises a myriad of different subject areas, or domains. Though the MDM market may well consider itself conceptually and technically mature, end users still struggle to determine whether they should embrace specialist MDM solutions dedicated to supporting one subject area, or make one strategic acquisition and implement truly multi-domain software that addresses multiple subject areas.

Posted November 09, 2010

Leveraging Data Models for Business Intelligence and Data Warehousing Agility

Posted November 09, 2010

There has been a lot of interest lately in NoSQL databases and, of course, many of us have strong backgrounds and experience in traditional relational "SQL" databases. For application developers, this raises questions about the best way to go. One recurring truth that eventually surfaces with all new software technologies is that "one size does not fit all." In other words, you need to use the right tool for the job, as each has its own strengths and weaknesses. In fact, a danger of many new architectural approaches is one of "over-adoption" - using a given tool to address a wide array of situations when it was originally designed for the specific problem domain in which it excels.

Posted November 09, 2010

When IBM developers set out to build the next version of Informix, their goal was to build on the foundation of one of the more mature, effective and reliable pieces of information management software in the industry. With the 10th anniversary of the IBM acquisition of Informix fast approaching, they knew that the 11.7 release would be closely watched by clients and partners alike.

Posted October 12, 2010

Cloud computing offers the promise of greater agility, resource optimization, and user performance, yet many businesses are understandably leery about jumping onto the cloud bandwagon until they have assurances that hosted resources will be secure. In fact, security concerns are the main obstacle to widespread cloud computing adoption among enterprises today. Before taking advantage of these capabilities, businesses need to assure users they have a simple way to access all their applications, and trust that their information is secure in the cloud.

Posted October 12, 2010

The flood of digital information increases the need for accuracy - including knowing which data to leave out. Remember when we used to ride around in our cars and listen to AM radio? Maybe you're not quite old enough to remember, but there was a time when AM radio was all we had - and that was fine. There also used to be only a handful of television channels, which we had to get up out of our chairs to change. That was fine, too. We didn't long for a wider variety of music on the radio, or more channels to watch on TV. We had what we had, and it was all fine - it was all "good enough."

Posted October 12, 2010

The relational database - or RDBMS - is a triumph of computer science. It has provided the data management layer for almost all major applications for more than two decades, and when you consider that the entire IT industry was once described as "data processing," this is a considerable achievement. For the first time in several decades, however, the relational database stranglehold on database management is loosening. The demands of big data and cloud computing have combined to create challenges that the RDBMS may be unable to adequately address.

Posted October 12, 2010

InterSystems Corporation has rolled out a new version of its Caché high-performance object database. The new release targets the growing demand by CIOs for economical high availability by introducing database mirroring, while also addressing Java developers' need for high-volume, high-performance processing combined with persistent storage for event processing systems. Robert Nagle, InterSystems vice president for software development, recently chatted with DBTA about the release and the new features it offers. Commenting on the growing interest in NoSQL databases, Nagle observes that many of the beneficial characteristics people see in NoSQL are in fact true of Caché - a flexible data model and zero DBA cost. "But for us, what is unique is not that it is NoSQL, it is that it needs to be SQL - without the overhead of relational technology - because I think SQL is extremely important for almost every class of application that is deployed."

Posted October 12, 2010

Oracle is a fast-changing company, and in recent years, its pace has accelerated to blinding speed. The software giant has expanded well beyond its relational database roots to encompass applications, management tools, service-oriented architecture and middleware, and even hardware. There are now many components to Oracle - from three major databases to enterprise resource applications, web applications, development languages and open source desktop tools.

Posted September 07, 2010

Organizations turn to master data management (MDM) to solve many business problems - to reach compliance goals, improve customer service, power more accurate business intelligence, and introduce new products efficiently. In many cases, the need for an MDM implementation is dictated by the business challenge at hand, which knows no single data domain. Take a manufacturing customer, for example. The company decided to deploy an MDM solution in order to improve buy-side and sell-side supply chain processes, to more effectively manage the procurement of direct and indirect materials, and to improve the distribution of products. To meet these goals the solution must be capable of managing vendor, customer, material and product master data. Unfortunately, quite a few vendors sell technology solutions that focus exclusively on either customer data integration (CDI) or product information management (PIM), which solves only a piece of the business problem.

Posted September 07, 2010

Brent Ozar achieved SQL Server 2008 Master status earlier this year, becoming the fifth person in the U.S. outside of Microsoft to achieve the company's highest technical certification. A Quest Software SQL Server expert at the time, Ozar has since joined SQLskills.com, a provider of training and consulting focused on Microsoft SQL Server, as a principal consulting partner. In this issue, he provides a glimpse into the arcane, intense 3-week-long onsite program that included the most difficult exams he had ever seen.

Posted September 07, 2010

Many organizations now have, in their possession, the sophisticated analysis tools and dashboards that connect to back-end systems and enable them to peer deeply into their businesses to assess progress on all fronts - from revenues to stock-outs to employee performance. However, a recent survey of 279 Oracle applications managers reveals that when it comes to decision making, simple spreadsheets still remain the tool of choice. And business users still wait days, weeks, and months for their IT departments to deliver reports, despite significant investments in performance management systems.

Posted September 07, 2010

IBM has entered into a definitive agreement to acquire Storwize, a privately held company based in Marlborough, Mass. Storwize provides real-time data compression technology to help clients reduce physical storage requirements by up to 80%, improving efficiency and lowering the cost of making data available for analytics and other applications. With Storwize, IBM says, it is acquiring storage technology that is unique in the industry due to its ability to compress primary data, or data that clients are actively using, of multiple types - from files to virtualization images to databases - in real-time while maintaining performance. "This is in contrast to what we see our competitors doing, which is primarily focusing on compressing data that is inactive, or data at rest - backup data, as an example," explained Doug Balog, vice president of IBM Storage, during a conference call announcing the planned acquisition.
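
The mechanics of such inline compression can be pictured as a thin layer on the block write path. What follows is a minimal, illustrative Python sketch of the general idea - compressing data as it is written and decompressing it transparently on read - assuming a simple key-addressed block store; the names are hypothetical and this is not Storwize's actual technology or interface.

    # Illustrative sketch of write-path (inline) compression, as opposed to
    # compressing inactive data at rest. All names here are hypothetical.
    import zlib

    class CompressedStore:
        def __init__(self):
            self.blocks = {}  # block_id -> compressed bytes

        def write(self, block_id, data):
            # Compress on the write path, before the data lands on "disk."
            self.blocks[block_id] = zlib.compress(data, 6)

        def read(self, block_id):
            # Decompress transparently; callers always see the original bytes.
            return zlib.decompress(self.blocks[block_id])

    store = CompressedStore()
    payload = b"row data, row data, row data " * 100  # repetitive sample input
    store.write("blk0", payload)
    assert store.read("blk0") == payload
    saved = 1 - len(store.blocks["blk0"]) / len(payload)
    # Highly repetitive data compresses far better than typical primary data.
    print(f"space saved: {saved:.0%}")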

Posted August 10, 2010

First elected to the Oracle Applications Users Group board of directors in 2009, David Ferguson became president of the OAUG this year. He talks with DBTA this month about how the users group is getting "back to basics" with educational sessions and networking opportunities as well as the new approaches it is taking to meet its members' evolving needs.

Posted August 10, 2010

Earlier this year, Andy Flower took over as president of the Independent Oracle Users Group from Ian Abramson. With Oracle OpenWorld right around the corner, Flower talks with DBTA about how the IOUG is changing to best meet the challenges and opportunities presented by the expanding Oracle ecosystem, despite what continues to be a difficult economy. For the IOUG, it is "the year of the member" and it all starts with the database, he says.

Posted August 10, 2010

Many see 2010 shaping up as a boom year for cloud computing, with cloud adopters capable of realizing significant reductions in administrative IT costs compared to non-adopters. However, it's not enough to simply develop and implement a cloud strategy. Rather, enterprises must take into account the performance of their cloud-based assets and the impact of the cloud on their end users' and customers' experiences. After all, the apparent cost and elasticity advantages of the cloud won't yield any business benefit if the direct consequence is a poor end user experience. For this reason, businesses considering the cloud must do their due diligence and insist on performance guarantees from cloud service providers that map directly to business objectives - or risk impacting revenue, brand image and customer satisfaction.

Posted August 10, 2010

Mid-sized businesses are using and saving more data than ever before. Indeed, the phenomenon that IT engineers have come to refer to as "big data" is being felt in businesses of all sizes. At the same time, however, organizations are facing reduced budgets. Regardless of how much data their likely overburdened IT staff must manage today and tomorrow, mid-sized businesses must find ways to save money by keeping a tight rein on both capital and operational expenditures.

Posted July 12, 2010

Oracle has introduced Oracle Business Intelligence 11g. "The new release allows customers to integrate relational and OLAP analysis across a multitude of different federated data sources and presents that in a very simple way to end users so that they can do analysis on their own without understanding or needing to know that there might be potentially multiple data sources beneath," Paul Rodwick, vice president of product management for Oracle Business Intelligence, tells DBTA. Representing the result of a large investment in simplifying the end user experience, adds Rodwick, companies will see "very interactive dashboards that are completely live and completely interconnected and allow business people to do their own analysis without really needing to go into any kind of query tool." The new release also provides new capabilities for search and collaboration, and enhanced performance, scalability, and security through deeper integration with Oracle Enterprise Manager 11g and other components of Oracle Fusion Middleware.

Posted July 12, 2010

Everybody seems to agree with the need for organizations to do a better job of protecting personal information. Every week the media brings us reports of more data breaches, and no organization is immune. Hospitals, universities, insurers, retailers, and state and federal agencies all have been the victims of breach events, often at significant costs. State privacy laws such as the new Massachusetts privacy statutes have placed the burden of protecting sensitive information squarely on the shoulders of the organizations that collect and use it. While some managers might view this as yet one more compliance hurdle to worry about, we feel it presents an excellent opportunity to evaluate existing practices and procedures. The good news is that there are some great solutions available today that can help organizations of all stripes address these requirements while at the same time tightening data security practices, streamlining operations, and improving governance.

Posted July 12, 2010

Despite its many advances, the modern data center is still a complicated mess of technology silos and components that are manually cobbled together and managed. This complexity imposes tremendous operational burden and cost. Fortunately, data centers are going through a major transformation, driven in large part by virtualization. This sea change promises to simplify and automate management, allowing IT to focus less on data center plumbing and more on delivering IT services that drive the business forward.

Posted July 12, 2010

VoltDB, LLC, has begun shipping the VoltDB OLTP database management system, which is intended to offer faster transaction processing capabilities due to lower overhead. VoltDB, developed under the leadership of Postgres and Ingres co-founder Mike Stonebraker, is a next-generation, open source DBMS that, according to the company, has been shown to process millions of transactions per second on inexpensive clusters of off-the-shelf servers. The VoltDB design is based on an in-memory, distributed database partitioning concept that is optimized to run on today's memory-rich servers with multi-core CPUs. Data is held in memory (instead of on disk) for maximum throughput, which eliminates buffer management. VoltDB distributes data - and a SQL engine to process it - to each CPU core in the server cluster. Each single-threaded partition operates autonomously, eliminating the need for locking and latching. Data is automatically replicated for intra-cluster high availability, which eliminates logging.
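
The single-threaded partition idea is easiest to see in miniature. Below is a purely illustrative Python sketch - hypothetical names, not VoltDB's actual engine - showing how routing every transaction to the one thread that owns its partition serializes access and removes the need for locks and latches.

    # Illustrative sketch: hash-partitioned data with one single-threaded
    # worker per partition. Because each partition is touched by exactly one
    # thread, its transactions run serially with no locking. Names are hypothetical.
    import queue
    import threading

    NUM_PARTITIONS = 4  # stand-in for one partition per CPU core

    class Partition:
        def __init__(self):
            self.rows = {}             # in-memory store: key -> value
            self.work = queue.Queue()  # serialized transaction queue
            threading.Thread(target=self._run, daemon=True).start()

        def _run(self):
            # A single thread drains the queue, so every transaction against
            # this partition executes one at a time - no locks, no latches.
            while True:
                txn, done = self.work.get()
                done.put(txn(self.rows))

    partitions = [Partition() for _ in range(NUM_PARTITIONS)]

    def execute(key, txn):
        # Route the transaction to the partition that owns the key.
        done = queue.Queue()
        partitions[hash(key) % NUM_PARTITIONS].work.put((txn, done))
        return done.get()

    def increment(rows, key="acct42"):
        rows[key] = rows.get(key, 0) + 1
        return rows[key]

    print(execute("acct42", increment))  # -> 1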

Posted June 07, 2010

Customer service is always important but never more critical than during difficult economic times, when wasting time is just not acceptable and customer satisfaction is an imperative. Atlantic Detroit Diesel-Allison sells and services a full line of Detroit Diesel, Mercedes-Benz, and MTU engines, and the automatic transmissions offered by the Allison Transmission Division of General Motors Corporation. Atlantic DDA had initiated a plan to improve its service department by improving order processing, thus driving more revenue, with additional goals of gaining operational efficiency and heightening customer satisfaction. In order to achieve these objectives, Atlantic DDA needed to make real-time information available to its teams of service representatives.

Posted June 07, 2010

In only a few years' time, the world of data management has been altered dramatically, and this is a change that is still running its course. No longer are databases run in back rooms by administrators worrying about rows and columns. Now, actionable information is sought by decision makers at all levels of the enterprise, and the custodians of this data need to work closely with the business. That's because, in the wake of the recent financial crisis and economic downturn, there's a push from both high-level corporate management and regulators to achieve greater understanding and greater transparency across the enterprise, Jeanne Harris, executive research fellow and a senior executive at the Accenture Institute for High Performance, and co-author, along with Tom Davenport, of Competing on Analytics and Analytics at Work, tells DBTA. "In many ways, I think the ultimate result of the financial crisis is that executives realized they cannot delegate analytics to subordinates; they can't view it as technology or math that doesn't really affect them."

Posted June 07, 2010

In the midst of turbulent times, many successful businesses learned an important lesson: The closer IT works with the business, the better an organization can weather the storms that blow in. Thus, many savvy companies understand that the managers and professionals who oversee information technology and applications need to be well incentivized to stay on. At the same time, these professionals understand the need to develop expertise in business management and communications. Many companies are looking to information technology to provide an additional competitive edge, and see their Oracle enterprise systems as the cornerstone of this strategy. As a result, a survey finds that Oracle enterprise application managers and professionals appear to have weathered the economic storm. The survey, conducted among 334 members of the Oracle Applications Users Group (OAUG) by Unisphere Research, and sponsored by Motion International, finds an increase in the number of Oracle technology professionals who are near or surpassing the $100,000 mark in their base salaries.

Posted June 07, 2010

You're probably familiar with the old saying that "it's not what you know, it's who you know." That may have been true back in the days when conversations about competitive advantage concerned memberships at prestigious golf clubs and lavish expense accounts; in the days when data was regarded as mere records of transactions, production and inventory levels. Today's conversations about competitive advantage may still include talk of personal relationships. More frequently, though, these conversations reflect the relatively recent appreciation of the intrinsic value of enterprise data - a value seen not just by senior executives, but also by employees in virtually every department. There is broad consensus in most organizations that enterprise data, and perhaps more importantly, the ability to analyze large volumes or smaller subsets of data at will, in real time, are crucial business differentiators.

Posted May 10, 2010

Mergers and acquisitions often come quickly, and when they do, it is critical to have tools and utilities capable of scaling to meet new challenges so that operations continue seamlessly, customer service standards are upheld, and costs are contained. This was the case for UGI Utilities, a large natural gas and electric service provider in the eastern U.S. In 2006, UGI acquired the natural gas utility assets of PG Energy from Southern Union Company. A longtime customer of BMC, UGI found it was aligned with the right software company to provide implementation of mainframe service management solutions as well as first class support to get the job done and successfully integrate the newly acquired company's data into its environment, saving time and money.

Posted May 10, 2010

Application monitoring started off with a simple need - to help an administrator or manager of an application monitor and manage its performance. It was narrow in scope and limited in use - to monitor a single application and provide metrics useful for managing that application only. Monitoring tools were often provided by application vendors, but the growing and complex nature of IT environments necessitated the entry of third-party monitoring tools. These were more specialized, with the ability to centrally monitor several different applications. They helped administrators gain visibility across several different applications, understand where problems occurred, and quickly resolve them.

Posted May 10, 2010

Today it's all about optimizing the business. IT is being charged with finding ways to simplify and automate business processes, making them both more reliable and less expensive to operate. It's a never-ending process, and as time goes on, the demands get greater. Yet while IT has largely been successful to date in helping other parts of the organization save time, money and effort, the same cannot be said for itself.

Posted May 10, 2010

The drivers and benefits of using open source in the enterprise have been widely documented. Ultimately, enterprise users adopt open source for two primary reasons: cost and vendor independence. These are the same drivers and benefits that apply to well-known categories such as operating systems, application servers and web browsers. However, because the database plays a role in all enterprise IT applications, the scrutiny and rigor that enterprises apply to the selection, implementation and deployment of open source databases is far more intense and deliberate.

Posted April 07, 2010

It's hard enough to lock down sensitive data when you know exactly which server the database is running on, but what will you do when you deploy virtualization and these systems are constantly moving? And making sure your own database administrators (DBAs) and system administrators aren't copying or viewing confidential records is already a challenge - how are you going to know when your cloud computing vendor's staff members are not using their privileges inappropriately? These are just two of the obstacles that any enterprise must overcome in order to deploy a secure database platform in a virtual environment, or in the cloud. In some cases, these concerns have been preventing organizations from moving to virtualization or cloud computing.

Posted April 07, 2010

COLLABORATE 10 will at long last bring together Sun and Oracle users under one roof for the first time. Ian Abramson, president of the Independent Oracle Users Group, talks with DBTA about what the group is planning for the April conference in Las Vegas, and how the integration of customers and technology is being handled by Oracle and the IOUG.

Posted April 07, 2010

IBM acquired predictive analytics vendor SPSS in October 2009. Erick Brethenoux, predictive analytics strategist for SPSS, an IBM Company, talks about the growing importance of the technology in helping enterprises address customer needs, what is driving the demand for it now, and how it fits into IBM's idea for a Smarter Planet.

Posted March 04, 2010

An overwhelming challenge - expanding volumes of data - threatens to gum up any productivity improvements seen to date as a result of information technology deployments. All that data is coming in from systems, sensors, and storage area networks, pressuring organizations to expand database inventories, while grappling with associated licensing and hardware costs. Plus, many compliance mandates demand that this data be stored for long periods of time, but remain accessible to auditors and business end users.

Posted March 04, 2010

For many organizations, application information lifecycle management, or ILM, now offers expedient - and badly needed - measures for properly defining, managing, and storing data. Many enterprises are being stymied by a massive proliferation of data in their databases and applications. Growing volumes of transaction data are being digitally captured and stored, along with unstructured forms of data files such as email, video, and graphics. Adding to this tsunami are multiple copies of all this data being stored throughout organizations. At the same time, increasingly tight mandates and regulations put the onus on organizations to maintain this data and keep it available for years to come. Much of this data still resides on legacy systems, which are costly to operate and maintain.

Posted March 04, 2010

Faced with growing data volumes and limited budgets, companies, educational institutions, and government agencies are increasingly relying on IT to help them gain a competitive edge and better serve their customers and constituents. DBTA recently asked key MultiValue vendors to explain their strategies for enabling data integration from different repositories and for supporting business intelligence and analytics that provide meaningful insight for customers.

Posted March 04, 2010

Managing and measuring costs has taken on a new urgency with the emergence of virtualization and new computing models. With virtualization, customers get a shared infrastructure that shifts the cost from a clear 1:1 relationship between servers, applications and users to a more dynamic model. We're just beginning to realize the tremendous impact this has on cost management and measurement in the data center. To make effective decisions about how to deploy resources, the business needs to clearly understand the associated costs.

Posted February 09, 2010

There's no question that databases are the heart of nearly every application running these days. Moreover, the information stored in databases is now being routinely used as a competitive and operational business weapon by all businesses and organizations regardless of size or industry. Whether used internally in business intelligence applications or utilized externally via the exposure of data tools that let customers view and search through vast amounts of data on websites, data is being maximized in many different ways.

Posted February 09, 2010

When the Sarbanes-Oxley Act (SOX) was first enacted in 2002 in the wake of several very visible accounting scandals, small to medium enterprises may have felt they dodged a very expensive bullet. The requirement to document processes for governance, risk management and compliance (GRC), and have them confirmed by outside auditors only applied to publicly traded companies. Unlike their publicly traded brethren, SMEs were not forced to purchase costly GRC software, did not have to re-direct resources from their normal daily tasks to prepare for audits, and did not have to change their methods of operation to comply with a government mandate.

Posted February 09, 2010

With the increased merging of disparate core business systems in the enterprise - as well as the emergence of additional systems in the form of enterprise resource management, customer relationship management, hierarchical storage strategies, and other business-driven initiatives - many companies today find themselves moving mountains of data on a daily basis. Business intelligence (BI) initiatives in particular typically rely on data warehousing strategies to provide critical information and reports to management in support of business decisions. Such strategies often require the timely transfer of enormous amounts of data from line-of-business systems. Too much time taken in data transfer can adversely impact a company's agility and could mean lost windows of business opportunities. It can also encroach on processing resources better devoted to core business applications.

Posted January 11, 2010

Enterprises that downplay the importance of storage management may be putting other key enterprise objectives at risk. That's the message from Kyle Fitze, Director of Marketing, Storage Platforms Division, HP StorageWorks. With IT shops facing constrained budgets and data volumes continuing to escalate, Fitze says, greater efficiency in the IT infrastructure is a requirement so that more money and time can be targeted at IT projects that will drive business growth. "Today, we believe that most customers spend upward of 70% of their budget just keeping the systems running and the lights on and everything cooled, on maintenance and operations, and the remainder of their budget on innovative IT projects," he observes. What HP would like to do, "is flip that ratio, so that customers, while they spend less on IT overall, are spending a smaller percentage of their budget on operations and the larger percentage then on innovation and business intelligence, and the kind of IT projects that can help them navigate these rough waters of economic decline."

Posted January 11, 2010

If you have ever been involved in configuring, installing and maintaining enterprise software, I don't have to tell you that it's time-consuming and complex. The cumbersome process of installing and tuning the operating system (OS), middleware, and database, then integrating and configuring the software is manual and error-prone. Even if you get it all correct, the process alone can delay time-to-value for the end user and introduce challenges for independent software vendors (ISVs) looking to shorten sales cycles. The whole process is daunting and expensive, discouraging customers and inhibiting sales. In order to maximize their financial return and eliminate installation and maintenance challenges, many ISVs are building appliances - versions of their product, packaged with a "just enough operating system" required to perform the desired tasks. Pre-configured for specific use cases, these compact, self-contained appliances can be deployed in a matter of minutes, requiring only last-mile setup.

Posted January 11, 2010
