Trends and Applications



Mid-sized businesses are using and storing more data than ever before. Indeed, the phenomenon that IT engineers have come to call "big data" is being felt in businesses of all sizes. At the same time, however, organizations are facing reduced budgets. However much data their likely overburdened IT staffs must manage today and tomorrow, mid-sized businesses must find ways to save money by keeping a tight rein on both capital and operational expenditures.

Posted July 12, 2010

Oracle has introduced Oracle Business Intelligence 11g. "The new release allows customers to integrate relational and OLAP analysis across a multitude of different federated data sources and presents that in a very simple way to end users so that they can do analysis on their own without understanding or needing to know that there might be potentially multiple data sources beneath," Paul Rodwick, vice president of product management for Oracle Business Intelligence, tells DBTA. The release also represents a large investment in simplifying the end user experience, Rodwick adds; companies will see "very interactive dashboards that are completely live and completely interconnected and allow business people to do their own analysis without really needing to go into any kind of query tool." The new release also provides new capabilities for search and collaboration, and enhanced performance, scalability, and security through deeper integration with Oracle Enterprise Manager 11g and other components of Oracle Fusion Middleware.

Posted July 12, 2010

Everybody seems to agree on the need for organizations to do a better job of protecting personal information. Every week the media brings us reports of more data breaches, and no organization is immune. Hospitals, universities, insurers, retailers, and state and federal agencies have all been the victims of breach events, often at significant cost. State privacy laws such as the new Massachusetts privacy statutes have placed the burden of protecting sensitive information squarely on the shoulders of the organizations that collect and use it. While some managers might view this as yet one more compliance hurdle to worry about, we feel it presents an excellent opportunity to evaluate existing practices and procedures. The good news is that there are great solutions available today that can help organizations of all stripes address these requirements while also tightening data security practices, streamlining operations, and improving governance.

Posted July 12, 2010

Despite its many advances, the modern data center is still a complicated mess of technology silos and components that are manually cobbled together and managed. This complexity imposes a tremendous operational burden and cost. Fortunately, data centers are going through a major transformation, driven in large part by virtualization. This sea change promises to simplify and automate management, allowing IT to focus less on data center plumbing and more on delivering IT services that drive the business forward.

Posted July 12, 2010

VoltDB, LLC, has begun shipping VoltDB, an OLTP database management system intended to offer faster transaction processing through lower overhead. VoltDB, developed under the leadership of Postgres and Ingres co-founder Mike Stonebraker, is a next-generation, open source DBMS that, according to the company, has been shown to process millions of transactions per second on inexpensive clusters of off-the-shelf servers. The VoltDB design is based on an in-memory, distributed database partitioning concept that is optimized to run on today's memory-rich servers with multi-core CPUs. Data is held in memory (instead of on disk) for maximum throughput, which eliminates buffer management. VoltDB distributes data - and a SQL engine to process it - to each CPU core in the server cluster. Each single-threaded partition operates autonomously, eliminating the need for locking and latching. Data is automatically replicated for intra-cluster high availability, which eliminates logging.
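To make the partitioning concept concrete, here is a minimal Python sketch of the single-threaded-partition idea. It is not VoltDB's actual API - every name in it is illustrative - but it shows why serial execution per partition removes the need for locking and latching:

```python
# Sketch only: one worker thread per in-memory partition, so all
# transactions against a partition execute serially and lock-free.
import queue
import threading

NUM_PARTITIONS = 4

class Partition:
    """One in-memory partition served by a single worker thread."""
    def __init__(self, pid):
        self.pid = pid
        self.rows = {}                # in-memory store: key -> value
        self.work = queue.Queue()     # transactions run one at a time
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self):
        while True:
            txn = self.work.get()     # serial execution means no locks
            txn(self.rows)            # or latches are ever needed
            self.work.task_done()

partitions = [Partition(i) for i in range(NUM_PARTITIONS)]

def execute(key, txn):
    """Route a transaction to the single partition that owns the key."""
    p = partitions[hash(key) % NUM_PARTITIONS]
    p.work.put(txn)
    p.work.join()                     # wait for the transaction to finish

execute("acct-42", lambda rows: rows.update({"acct-42": 100}))
print(partitions[hash("acct-42") % NUM_PARTITIONS].rows)  # {'acct-42': 100}
```

The lock-free benefit depends on routing every transaction for a given key to the same partition; transactions that span partitions require coordination, which is why such designs favor cleanly partitionable workloads.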

Posted June 07, 2010

Customer service is always important but never more critical than during difficult economic times, when wasting time is simply not acceptable and customer satisfaction is an imperative. Atlantic Detroit Diesel-Allison sells and services a full line of Detroit Diesel, Mercedes-Benz, and MTU engines, as well as the automatic transmissions offered by the Allison Transmission Division of General Motors Corporation. Atlantic DDA had initiated a plan to improve its service department by streamlining order processing, thereby driving more revenue, with the additional goals of gaining operational efficiency and heightening customer satisfaction. To achieve these objectives, Atlantic DDA needed to make real-time information available to its teams of service representatives.

Posted June 07, 2010

In only a few years' time, the world of data management has been altered dramatically, and the change is still running its course. No longer are databases run in back rooms by administrators worrying about rows and columns. Now, actionable information is sought by decision makers at all levels of the enterprise, and the custodians of this data need to work closely with the business. That's because, in the wake of the recent financial crisis and economic downturn, there is a push from both high-level corporate management and regulators to achieve greater understanding and greater transparency across the enterprise, Jeanne Harris tells DBTA. Harris, executive research fellow and a senior executive at the Accenture Institute for High Performance, is co-author, with Tom Davenport, of Competing on Analytics and Analytics at Work. "In many ways, I think the ultimate result of the financial crisis is that executives realized they cannot delegate analytics to subordinates; they can't view it as technology or math that doesn't really affect them."

Posted June 07, 2010

In the midst of turbulent times, many successful businesses learned an important lesson: The closer IT works with the business, the better an organization can weather the storms that blow in. Thus, many savvy companies understand that the managers and professionals who oversee information technology and applications need to be well incentivized to stay on. At the same time, these professionals understand the need to develop expertise in business management and communications. Many companies are looking to information technology to provide an additional competitive edge, and see their Oracle enterprise systems as the cornerstone of this strategy. As a result, a survey finds that Oracle enterprise application managers and professionals appear to have weathered the economic storm. The survey, conducted among 334 members of the Oracle Applications Users Group (OAUG) by Unisphere Research, and sponsored by Motion International, finds an increase in the number of Oracle technology professionals who are near or surpassing the $100,000 mark in their base salaries.

Posted June 07, 2010

You're probably familiar with the old saying that "it's not what you know, it's who you know." That may have been true back in the days when conversations about competitive advantage concerned memberships at prestigious golf clubs and lavish expense accounts; in the days when data was regarded as mere records of transactions, production and inventory levels. Today's conversations about competitive advantage may still include talk of personal relationships. More frequently, though, these conversations reflect the relatively recent appreciation of the intrinsic value of enterprise data - a value seen not just by senior executives, but also by employees in virtually every department. There is broad consensus in most organizations that enterprise data, and perhaps more importantly, the ability to analyze large volumes or smaller subsets of data at will, in real time, are crucial business differentiators.

Posted May 10, 2010

Mergers and acquisitions often come quickly, and when they do, it is critical to have tools and utilities capable of scaling to meet new challenges so that operations continue seamlessly, customer service standards are upheld, and costs are contained. This was the case for UGI Utilities, a large natural gas and electric service provider in the eastern U.S. In 2006, UGI acquired the natural gas utility assets of PG Energy from Southern Union Company. A longtime customer of BMC, UGI found it was aligned with the right software company to implement mainframe service management solutions, as well as to provide first-class support to get the job done and successfully integrate the newly acquired company's data into its environment, saving time and money.

Posted May 10, 2010

Application monitoring started off with a simple need: to help the administrator or manager of an application monitor and manage its performance. It was narrow in scope and limited in use - monitoring a single application and providing metrics useful for managing that application only. Monitoring tools were often provided by application vendors, but the growing complexity of IT environments necessitated the entry of third-party monitoring tools. These were more specialized, with the ability to centrally monitor several different applications. They helped administrators gain visibility across those applications, understand where problems occurred, and resolve them quickly.

Posted May 10, 2010

Today it's all about optimizing the business. IT is being charged with finding ways to simplify and automate business processes, making them both more reliable and less expensive to operate. It's a never-ending process, and as time goes on, the demands get greater. Yet while IT has largely been successful to date in helping other parts of the organization save time, money, and effort, the same cannot be said for IT itself.

Posted May 10, 2010

The drivers and benefits of using open source in the enterprise have been widely documented. Ultimately, enterprise users adopt open source for two primary reasons: cost and vendor independence. These are the same drivers and benefits that apply to well-known categories such as operating systems, application servers and web browsers. However, because the database plays a role in all enterprise IT applications, the scrutiny and rigor that enterprises apply to the selection, implementation and deployment of open source databases is far more intense and deliberate.

Posted April 07, 2010

It's hard enough to lock down sensitive data when you know exactly which server the database is running on, but what will you do when you deploy virtualization and these systems are constantly moving? And making sure your own database administrators (DBAs) and system administrators aren't copying or viewing confidential records is already a challenge - how are you going to know whether your cloud computing vendor's staff members are using their privileges inappropriately? These are just two of the obstacles that any enterprise must overcome in order to deploy a secure database platform in a virtual environment, or in the cloud. In some cases, these concerns have been preventing organizations from moving to virtualization or cloud computing.

Posted April 07, 2010

COLLABORATE 10 will bring Sun and Oracle users together under one roof for the first time. Ian Abramson, president of the Independent Oracle Users Group, talks with DBTA about what the group is planning for the April conference in Las Vegas, and how the integration of customers and technology is being handled by Oracle and the IOUG.

Posted April 07, 2010

IBM acquired predictive analytics vendor SPSS in October 2009. Erick Brethenoux, predictive analytics strategist for SPSS, an IBM Company, talks about the growing importance of the technology in helping enterprises address customer needs, what is driving the demand for it now, and how it fits into IBM's idea for a Smarter Planet.

Posted March 04, 2010

An overwhelming challenge - expanding volumes of data - threatens to gum up any productivity improvements seen to date as a result of information technology deployments. All that data is coming in from systems, sensors, and storage area networks, pressuring organizations to expand database inventories, while grappling with associated licensing and hardware costs. Plus, many compliance mandates demand that this data be stored for long periods of time, but remain accessible to auditors and business end users.

Posted March 04, 2010

For many organizations, application information lifecycle management, or ILM, now offers expedient - and badly needed - measures for properly defining, managing, and storing data. Many enterprises are being stymied by a massive proliferation of data in their databases and applications. Growing volumes of transaction data are being digitally captured and stored, along with unstructured forms of data files such as email, video, and graphics. Adding to this tsunami are multiple copies of all this data being stored throughout organizations. At the same time, increasingly tight mandates and regulations put the onus on organizations to maintain this data and keep it available for years to come. Much of this data still resides on legacy systems, which are costly to operate and maintain.

Posted March 04, 2010

Faced with growing data volumes and limited budgets, companies, educational institutions, and government agencies are increasingly relying on IT to help them gain a competitive edge and better serve their customers and constituents. DBTA recently asked key MultiValue vendors to explain their strategies for enabling data integration from different repositories and for supporting business intelligence and analytics that provide meaningful insight for customers.

Posted March 04, 2010

Managing and measuring costs has taken on a new urgency with the emergence of virtualization and new computing models. With virtualization, customers get a shared infrastructure that shifts the cost from a clear 1:1 relationship between servers, applications and users to a more dynamic model. We're just beginning to realize the tremendous impact this has on cost management and measurement in the data center. To make effective decisions about how to deploy resources, the business needs to clearly understand the associated costs.

Posted February 09, 2010

There's no question that databases are the heart of nearly every application running these days. Moreover, the information stored in databases is now routinely used as a competitive and operational weapon by businesses and organizations of every size and industry. Whether used internally in business intelligence applications or exposed externally through data tools that let customers view and search vast amounts of data on websites, data is being put to work in many different ways.

Posted February 09, 2010

When the Sarbanes-Oxley Act (SOX) was first enacted in 2002 in the wake of several very visible accounting scandals, small to medium enterprises may have felt they had dodged a very expensive bullet. The requirement to document processes for governance, risk management and compliance (GRC), and to have them confirmed by outside auditors, applied only to publicly traded companies. Unlike their publicly traded brethren, SMEs were not forced to purchase costly GRC software, did not have to redirect resources from their normal daily tasks to prepare for audits, and did not have to change their methods of operation to comply with a government mandate.

Posted February 09, 2010

With the increased merging of disparate core business systems in the enterprise - as well as the emergence of additional systems in the form of enterprise resource management, customer relationship management, hierarchical storage strategies, and other business-driven initiatives - many companies today find themselves moving mountains of data on a daily basis. Business intelligence (BI) initiatives in particular typically rely on data warehousing strategies to provide critical information and reports to management in support of business decisions. Such strategies often require the timely transfer of enormous amounts of data from line-of-business systems. Too much time taken in data transfer can adversely impact a company's agility and could mean lost windows of business opportunities. It can also encroach on processing resources better devoted to core business applications.

Posted January 11, 2010

Enterprises that downplay the importance of storage management may be putting other key enterprise objectives at risk. That's the message from Kyle Fitze, Director of Marketing, Storage Platforms Division, HP StorageWorks. With IT shops facing constrained budgets and data volumes continuing to escalate, Fitze says, greater efficiency in the IT infrastructure is a requirement so that more money and time can be targeted at IT projects that will drive business growth. "Today, we believe that most customers spend upward of 70% of their budget just keeping the systems running and the lights on and everything cooled, on maintenance and operations, and the remainder of their budget on innovative IT projects," he observes. What HP would like to do "is flip that ratio, so that customers, while they spend less on IT overall, are spending a smaller percentage of their budget on operations and the larger percentage then on innovation and business intelligence, and the kind of IT projects that can help them navigate these rough waters of economic decline."

Posted January 11, 2010

If you have ever been involved in configuring, installing, and maintaining enterprise software, I don't have to tell you that it's time-consuming and complex. The cumbersome process of installing and tuning the operating system (OS), middleware, and database, then integrating and configuring the software, is manual and error-prone. Even if you get it all correct, the process alone can delay time-to-value for the end user and introduce challenges for independent software vendors (ISVs) looking to shorten sales cycles. The whole process is daunting and expensive, discouraging customers and inhibiting sales. To maximize their financial return and eliminate installation and maintenance challenges, many ISVs are building appliances - versions of their product packaged with a "just enough operating system" required to perform the desired tasks. Pre-configured for specific use cases, these compact, self-contained appliances can be deployed in a matter of minutes, requiring only last-mile setup.

Posted January 11, 2010

As we enter the next decade of the millennium, we will see information technology becoming more ubiquitous, driving an even greater share of business decision-making and operations. IT has proven its mettle through the recent downturn as both a tactical and strategic weapon for streamlining, as well as for maintaining competitive edge. Now, as we begin the next round of economic recovery, companies will be relying on IT even more to better understand and serve their markets and customers. Yet there are many challenges in managing a growing array of IT hardware, software, and services. To address these requirements, businesses continue to look to approaches such as analytics, virtualization, and cloud computing. To capture the trends shaping the year ahead, Database Trends and Applications spoke to a range of industry leaders and experts.

Posted December 14, 2009

Corporate management is complacent about data security. Efforts to address data security are still ad hoc, and not part of an overall database security strategy or plan. Companies are not keeping up with the need to monitor for potential risks. Most monitoring tends to be ad hoc or on the fly, rather than organized, automated, and systematic. These are the findings of new research from Unisphere Research and the Independent Oracle Users Group (IOUG), which shows that the recent economic downturn has taken a toll on data security efforts within enterprises.

Posted December 14, 2009

Credit card security is a top priority - for both consumers and businesses. But what happens if there is a security breach exposing critical data to unknown sources? What can businesses do from an IT perspective to ensure they're protecting consumer information? When sensitive cardholder information resides in legacy host systems, host access technology can be a critical tool to help organizations successfully achieve PCI DSS compliance.

Posted December 14, 2009

Rocket Software recently completed the purchase of the UniData and UniVerse Servers and Tools assets from IBM. Susie Siegesmund, now vice president and general manager for the U2 brand under Rocket, talks with Database Trends and Applications about why the timing was right for this move and what U2 customers and partners can expect under the new ownership.

Posted December 14, 2009

A modern architecture, system stability and strong behind-the-scenes support are key attributes to consider when evaluating new database technology. In December 2008, Brasher's initiated a phased roll-out of its enterprise applications on the InterSystems CACHÉ high-performance database with MultiValue technology, concluding the implementation in January 2009. In all, the migration involved more than 8,000 programs and cataloged procedures ranging from accounting applications through real-time bid processing systems in auction venues. In going live with CACHÉ at each location, says Ty Brewer, Brasher's CIO, "our goal was for people to go home on a Friday and come back on a Monday and not notice anything different, other than things being faster. By and large, that's exactly what happened."

Posted November 11, 2009

Cloud computing offers a bright future for enterprise IT in the form of a scalable infrastructure and a pay-as-you-need pricing model. As cloud adoption grows in both hype and value, technologists are keen to know how the story will unfold. One way to examine the future of cloud computing is to look at the recent past of another formerly over-hyped technology that promised agility and cost savings to organizations - service-oriented architecture (SOA).

Posted November 11, 2009

Performance bottlenecks can effectively cripple an entire organization. The lengthy downtime caused by poor database performance interrupts business continuity, reduces end-user productivity, and can have a direct, negative impact on the organization's bottom line.

Posted November 11, 2009

Organizations that really want to take advantage of a higher-performance, more agile, and lower-cost data warehouse architecture should implement master data management (MDM) to improve data quality. Nearly every data warehouse ecosystem has attempted to manage master data within its data warehouse architecture, but has focused on mastering data after transactions occur. This approach does little to improve data quality because data are "fixed" after the fact. The best way to improve data quality is to move the process "upstream" of the data warehouse, before transactions are executed.

Posted November 11, 2009

The Sarbanes-Oxley Act of 2002 (SOX) can be considered the most significant compliance standard of our time. Since the passage of the legislation seven years ago, companies have had to rethink the way they use technology to store company data. This transformation has been anything but an easy ride, and it has significantly impacted the role of the CIO within an organization.

Posted October 13, 2009

High-profile data breaches at major corporations and the usual assortment of state government agencies and educational institutions have highlighted the value of encrypting data. Yet, breach numbers continue to spike and big losses are becoming more common; according to Verizon's 2009 Data Breach Investigations Report, which looks only at breaches that resulted in stolen data being used in a crime, the total number of records breached in Verizon's 2008 caseload—more than 285 million—exceeded the combined total from 2004 to 2007. Apparently the market is now so saturated with stolen data that the price of each record has dropped from a high of $16 in 2007 to less than 50 cents today. But the intensifying number of successful attacks isn't the most distressing part of data breach reports: the Identity Theft Resource Center reports that only 2.4% of the companies involved in all reported breaches utilized encryption.

Posted October 13, 2009

The rising popularity of web 2.0 and cloud computing services has prompted a reexamination of the infrastructure that supports them. More and more people are using web-based communities, hosted services, and applications such as social-networking sites, video-sharing sites, wikis, and blogs. And the number of businesses adopting cloud computing applications such as software as a service and hosted services is climbing swiftly. With all this growth, internet data centers are struggling to handle unprecedented workloads, spiraling power costs, and the limitations of the legacy architectures that support these services. The industry has responded by moving toward a "data center 2.0" model in which new approaches to data management, scaling, and power consumption enable data center infrastructures to support this growth.

Posted October 13, 2009

In art, the expression "less is more" implies that simplicity of line and composition allows a viewer to better appreciate the individual elements of a piece and their relationship to each other as a whole. In engineering, "less is more" when you accomplish the same work with fewer moving parts. And when dining out, "less is more" when the portions may be smaller, but the food is so much better and more satisfying. In IT, the adage is more accurately stated today as "less does more." As IT increases in complexity, mainframe organizations are being asked to handle greater workloads, bigger databases, more applications, more system resources, and new initiatives. All this, without adding—and sometimes while cutting—staff. In addition, IT is undergoing a serious "mainframe brain drain," as the most experienced technicians retire, taking with them their skills and detailed knowledge of the mainframes' idiosyncrasies.

Posted October 13, 2009

Why do business decision makers need to wait for IT to deliver performance reports on the business? Why can't they build their own reports, and gain rapid access to answer the questions they have?

Posted September 14, 2009

A member of the Quest International Users Group and IT specialist at Shell Canada Ltd., Sue Shaw took on the role of president of the users group in June. She talks with Database Trends and Applications about what drew her in as a member and her goals for the PeopleSoft, JD Edwards, and Oracle Utilities association now that she is at the helm.

Posted September 14, 2009

This year, despite a turbulent economy marked by painful layoffs in many sectors, database professionals appear to be weathering the storm. In fact, database professionals reported higher incomes and bonuses this year over last. Still, a sizeable segment of professionals saw changes in their jobs as a result of economic conditions, and many are concerned going forward about the impact of tighter budgets on their departments' performance.

Posted September 14, 2009

The Swiss National Sound Archives is Switzerland's official depository of audio records. Founded by law in 1987 as a private foundation working in close collaboration with the Swiss National Library in Bern, the Swiss National Sound Archives has as its mission the preservation of the country's audio heritage. Devoted strictly to Switzerland's audio heritage, the foundation collects and safeguards anything sound-related, including speeches, theatrical works, interviews, audio books, and all types of music, from rock to classical. It makes these recordings, and detailed information about them such as the people involved in their creation, available through a website accessible to the public in Switzerland's four official languages - German, French, Italian, and Romansh - as well as in English.

Posted September 14, 2009

Microsoft SQL Server has been a favorite for years for organizations that want to implement business intelligence (BI) functionality - even in traditionally non-Microsoft shops. Especially since the SQL Server 2005 release, the ROI of a Microsoft solution coupled with the ease of implementation has driven healthy adoption of the DBMS for BI. And the integration of SQL Server 2008 with Microsoft Office, SharePoint Server, and PerformancePoint Services for delivering BI to end users has created an even stronger end-to-end platform.

Posted September 14, 2009

The concept of database sharding has gained popularity over the past several years due to the enormous growth in transaction volume and size of business-application databases. Database sharding can be simply defined as a "shared-nothing" partitioning scheme for large databases across a number of servers, enabling new levels of database performance and scalability. If you think of broken glass, you can get the concept of sharding—breaking your database down into smaller chunks called "shards" and spreading them across a number of distributed servers.
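As an illustration of the routing idea (a hypothetical sketch, not any particular product's API - the server names and key scheme below are invented), a shard map plus a routing function is often all that stands between the application and its shards:

```python
# Sketch only: key-based sharding. Each shard is a separate database
# server, and a routing function maps every shard key to exactly one
# server, so each server holds only a fraction of the total data.
SHARDS = [
    "postgresql://db-shard-0.example.com/orders",
    "postgresql://db-shard-1.example.com/orders",
    "postgresql://db-shard-2.example.com/orders",
]

def shard_for(customer_id: int) -> str:
    """Route a customer to the shard that owns that customer's rows."""
    return SHARDS[customer_id % len(SHARDS)]

# All reads and writes for customer 1042 go to the same shard, so the
# working set each server must store, index, and cache stays small.
print(shard_for(1042))   # -> postgresql://db-shard-1.example.com/orders
```

A simple modulo scheme like this makes rebalancing painful when shards are added, which is why production systems often prefer range maps or consistent hashing; the shared-nothing principle is the same either way.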

Posted August 14, 2009

Insiders, by virtue of their easy access to organizations' information, systems, and networks, pose a significant risk to employers. Every day, there's a new shocking headline concerning a major network security breach caused (knowingly or unknowingly) by a corporate insider. And the number of security breaches that start from within keeps growing—particularly in this down economy, as the number of disgruntled employees escalates. You'd think that large organizations in particular would be rushing to protect themselves from such headlines and liability, but they just aren't getting the message. Nor are they taking the necessary steps to protect themselves from a policy and technical standpoint.

Posted August 14, 2009

There are four main kinds of information management professionals in an OLTP environment—data architects, database architects, application DBAs, and operational (or production support) DBAs. It should be the aim of an information management professional to master all four roles and shift among them with ease. Once you are able to shift roles easily, be assured that you're adding to the revenue of your business.

Posted August 14, 2009

The data manager's job has never been easy, often presenting significant challenges, including data system rewrites, data security, regulatory compliance, and reporting. And the digital age, with a myriad of new and innovative data sources and more sophisticated analytic models, presents its own unique hurdles to implementing a successful data-management and data-quality program in the modern insurance enterprise.

Posted August 14, 2009

Data encryption serves two purposes: it protects data against internal prying eyes, and it protects data against external threats (hacking, theft of backup tapes, etc.). Encryption in the database tier offers the advantage of database-caliber performance and protection of encryption keys without incurring the overhead and additional cost of using a third-party encryption tool in an application tier.
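As a rough illustration of the mechanics involved, the sketch below uses the third-party Python cryptography package to encrypt a single sensitive value with a symmetric key. Note that this is an application-tier example - exactly the extra moving part the article says database-tier encryption avoids - and the hard part in practice is key management, not the encrypt call itself:

```python
# Sketch only: symmetric encryption of one column value at rest,
# using the "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, kept in a key manager,
fernet = Fernet(key)          # never stored alongside the data

ciphertext = fernet.encrypt(b"4111-1111-1111-1111")   # card number
assert fernet.decrypt(ciphertext) == b"4111-1111-1111-1111"
```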

Posted July 13, 2009

As organizations grow and evolve, they must implement technology changes to accommodate evolving infrastructure needs, often within complex systems running business-critical applications. Along with this, there frequently is increased demand to reduce the costs of technology by sharing hardware and software resources, a demand many companies try to meet by establishing virtual environments. Virtualization consolidates often-underutilized physical machines onto a single physical host that runs multiple virtual machines (VMs), all sharing the four core resources—CPU, memory, disks, and network cards—of that one host.

Posted July 13, 2009

Using historically standard analysis techniques for file placement on disks within the Unisys 2200 environment, it is possible to significantly improve performance and capacity without significant additional outlays for hardware. Guidelines for disk usage and file placement have generally been dismissed as no longer relevant, on the "understanding" that modern disks provide sufficient native speed that file placement no longer matters. That is not a valid assumption.

Posted July 13, 2009

In today's competitive and crisis-ridden market, companies are under pressure to rapidly deliver results and make necessary changes—which requires that decision makers have accurate and timely information readily available. However, many executives have doubts about the timeliness of the information they now receive through their current BI and analytics systems.

Posted June 15, 2009
