



Trends and Applications



Clinical Data Management (CDM) is a company headquartered in Colorado that provides clinical information database software, enabling medical institutions to report and compile data on patient care. A longtime user of Revelation Software, dating back to Revelation G and continuing through OpenInsight, CDM was pleased with both the quality of the products and the service from Revelation. However, CDM had come to realize it needed to provide a web interface for data entry to better support its customers and also stay current with evolving technology requirements. That need was answered when Revelation launched the OpenInsight for Web (O4W) Development Toolkit, a web development toolkit that makes it possible for OpenInsight developers with limited or no HTML, XML or JavaScript experience to develop feature-rich web pages.

Posted March 07, 2012

Valuable data and trusted applications depend on MultiValue databases at many organizations, but there is also a need to integrate that data and those applications with other systems and provide access to users in new ways. In this special section, DBTA asks leading MultiValue vendors: What is your organization doing to help customers modernize and stay current with new technologies to address their evolving requirements?

Posted March 07, 2012

SAP application performance (speed and availability) is becoming a major focus as companies rely increasingly on their SAP systems to support employee productivity, partner collaboration, customer relationships, revenues, brand equity and growth. With many companies running their critical business processes on SAP, high availability and acceptable speed of the business software environment are essential requirements.

Posted February 23, 2012

In celebration of ODBC's 20th anniversary this year, Progress Software Corporation has unveiled its Platinum ODBC drivers, Progress DataDirect Connect for ODBC 7.0. The standards-based, fully interoperable Progress DataDirect Connect for ODBC 7.0 driver allows application developers to reliably exchange data between cloud-based and other disparate data sources.

Posted February 23, 2012

Oracle has announced the availability of Oracle Advanced Analytics, a new option for Oracle Database 11g that combines Oracle R Enterprise with Oracle Data Mining. According to Oracle, Oracle R Enterprise delivers enterprise class performance for users of the R statistical programming language, increasing the scale of data that can be analyzed by orders of magnitude using Oracle Database 11g.

Posted February 23, 2012

The challenges of maintaining security and regulatory compliance as applications increasingly move to the cloud - whether public, private or hybrid - will come into greater focus in 2012, says Ryan Berg, cloud security strategy lead for IBM. The need to manage security among an increasingly mobile workforce, with many employees choosing to use their own personal devices, will also be a key concern in 2012, says Berg.

Posted February 23, 2012

Kalido, a provider of agile information management software, unveiled the latest release of the Kalido Information Engine, which helps organizations decrease the time for data mart migrations and consolidations. With this new release, customers will be able to import existing logical and physical models and taxonomies to build a more agile data warehouse. Enabling customers to take advantage of existing assets and investments "is going to dramatically reduce the time and the cost that it takes to bring together data marts into more of a data warehouse scenario," says John Evans, director of product marketing at Kalido.

Posted February 23, 2012

While emails have been "the smoking gun" in many recent court cases, the new big wave in what is "discoverable" is structured (database) data. Accessing data is simpler and much faster from structured data than from non-structured data. If the response to e-discovery can come from a structured data format, it is usually much faster than the alternatives and can mitigate the risk of steep fines due to delayed response times. A combination of stringent new regulations and new technology is giving judges and litigators more muscle to subpoena more data. Structured data in any application and database, no matter how old or obsolete, can be used in court as evidence, and increasingly it is being asked for.

Posted February 09, 2012

Today's organizations must capture, track, analyze and store more information than ever before - everything from mass quantities of transactional, online and mobile data, to growing amounts of "machine-generated data" such as call detail records, gaming data or sensor readings. And just as volumes are expanding into the tens of terabytes, and even the petabyte range and beyond, IT departments are facing increasing demands for real-time analytics. In this era of "big data," the challenges are as varied as the solutions available to address them. How can businesses store all their data? How can they mitigate the impact of data overload on application performance, speed and reliability? How can they manage and analyze large data sets both efficiently and cost effectively?

Posted February 09, 2012

Businesses are struggling to cope with and leverage an explosion of complex and connected data. This need is driving many companies to adopt scalable, high performance NoSQL databases - a new breed of database solutions - in order to expand and enhance their data management strategies. Traditional "relational" databases will not be able to keep pace with "big data" demands as they were not designed to manage the types of relationships that are so essential in today's applications.

Posted January 25, 2012

Tableau Software, a provider of business intelligence software, has announced the general availability of Tableau 7.0. The release offers improvements in performance and scalability, adds new visualization types and improves the product's overall analytical power and ease-of-use. In addition, the new Tableau Data Server capabilities will make it easy to share large volumes of data, share data models in large groups, and provide enhanced management of data assets, says Chris Stolte, chief development officer, co-founder, and inventor of Tableau Software.

Posted January 25, 2012

RainStor, a provider of big data management software, has unveiled RainStor Big Data Analytics on Hadoop, which the company describes as the first enterprise database running natively on Hadoop. It is intended to enable faster analytics on multi-structured data without the need to move data out of the Hadoop Distributed File System (HDFS) environment. There is architectural compatibility between the way RainStor manages data and the way HDFS manages CSV files, says Deirdre Mahon, vice president of marketing at RainStor.

Posted January 25, 2012

The Oracle Big Data Appliance, an engineered system of hardware and software that was first unveiled at Oracle OpenWorld in October, is now generally available. The new system incorporates Cloudera's Distribution Including Apache Hadoop (CDH3) with Cloudera Manager 3.7, plus an open source distribution of R. The Oracle Big Data Appliance represents "two industry leaders coming together to wrap their arms around all things big data," says Cloudera COO Kirk Dunn.

Posted January 25, 2012

"Big data" and analytics have become the rage within the executive suite. The promise is immense - harness all the available information within the enterprise, regardless of data model or source, and mine it for insights that can't be seen any other way. In short, senior managers become more effective at business planning, spotting emerging trends and opportunities and anticipating crises because they have the means to see both the metaphorical trees and the forest at the same time. However, big data technologies don't come without a cost.

Posted January 11, 2012

CIOs and IT departments are on the frontlines of a monumental IT shift. With the number of mobile devices and applications exploding and bandwidth soaring, they are being asked to find ways to enable the brave new world of enterprise mobility. All involved - from users to IT - recognize the productivity and business efficiency benefits of this trend, but it is typically only IT that also recognizes the dangers unchecked mobility poses to sensitive corporate data.

Posted January 11, 2012

The argument that "everyone is doing it and you should too" holds no value for strategic decision-making in IT. Yet critical thinking often goes by the wayside when a hot, new trend catches on and it seems like the masses are following along. Cloud computing is certainly in vogue - most industry analysts are bullish on cloud computing adoption and anticipate enterprise spending to increase - but organizations need to steer clear of falling into the trap that moving to the cloud always delivers cost savings.

Posted January 11, 2012

In this, our last E-Edition of Database Trends and Applications for 2011, we're taking a look back at some of the most widely read articles of the past year. These articles cover a range of topics. Some provide an examination of just-emerging or quickly evolving technologies, others highlight best practices in a specific discipline, while others comment on trends observed by industry experts. Click on the "December 2011 E-Edition UPDATE" headline above to access the articles. If you missed one earlier in the year, here's your second chance. All DBTA E-Editions are archived by month on the DBTA website.

Posted December 16, 2011

Stacks of statistics from many sources share a common theme - growth rates for digital information are extremely high and undeniable. A tsunami of e-information is fueling the engine of today's corporate enterprise, and many businesses are aiming to ride the information wave to prosperity. However, many companies are not sufficiently attentive to all the potential liabilities lurking in the depths of this digital information, including the risks involved in using real, live personal customer and employee data for application development and testing purposes. There's real potential for serious data security, legal and noncompliance risks when businesses fail to protect this data.

Posted December 01, 2011

Until recently, companies were only warming up to the possibilities of cloud computing. Lately, however, for many enterprise-IT decision makers, cloud is hot, hot, hot. The sea change now underway means many companies are quickly moving from "dipping their toes into cloud computing" to a full-fledged immersion, says Thom VanHorn, vice president of marketing for Application Security, Inc. In 2012, expect to see those same companies dive right in. "The move will only accelerate," he tells DBTA.

Posted December 01, 2011

A new survey of 421 data managers and professionals who are members of the Independent Oracle Users Group (IOUG) finds that while most companies have well-established data warehouse systems, adoption is still limited within their organizations. Many respondents report a significant surge of data within their data warehouses in recent times, fueled not only by growing volumes of transaction data but by unstructured data as well. Now, the challenge is to find ways to extend data analysis capabilities to additional business areas.

Posted December 01, 2011

Although it is possible to develop an efficient Vehicle Tracking System using any database server, and many such solutions are already available in the market, Informix offers many advantages. Informix can reduce your disk space requirement, improve your query performance, and reduce your application development efforts, without special training or buying additional technology. The built-in technologies of Informix will reduce your total cost of ownership, so if these benefits are important to you, keep reading!

Posted November 10, 2011

Increasing concerns over security breaches from external and internal threats, regulatory compliance requirements from HIPAA, the HITECH Act, PCI DSS, and other mandates, plus the migration from physical servers to virtual machines and the cloud are prompting companies to adopt encryption as never before. Encryption protects data by transforming it into unintelligible strings of characters (called ciphertext) and is today widely considered a security best practice.
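As a toy illustration of the transformation described above - readable plaintext becomes unintelligible ciphertext without the key - the sketch below XORs data against a keystream chained from SHA-256. This is deliberately simplified teaching code, not production cryptography; real deployments should rely on a vetted library or the database's native encryption features.

```python
import hashlib

def keystream(secret: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by repeatedly hashing the secret."""
    out, block = b"", secret
    while len(out) < length:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:length]

def toy_encrypt(data: bytes, secret: bytes) -> bytes:
    """XOR data with the keystream; XOR is its own inverse, so this also decrypts."""
    ks = keystream(secret, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

record = b"patient 4711: glucose 5.4 mmol/L"
cipher = toy_encrypt(record, b"clinic-secret")
assert cipher != record                                  # unintelligible without the key
assert toy_encrypt(cipher, b"clinic-secret") == record   # round-trips back to plaintext
```

The round trip at the end is the essential property: anyone holding the key recovers the original record, while the stored ciphertext reveals nothing readable on its own.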

Posted November 10, 2011

Few things in the world are changing as dramatically as data. Data has tapped out a powerful rhythm to keep time with technology's bleeding edge, leaving many technologies struggling to keep up. It should come as no surprise, then, that many of the data strategies that IT departments developed, and still widely rely upon, are no longer sufficient for today's needs. You can put data marts near the top of that list.

Posted November 10, 2011

Customer centricity has become a watchword for major corporations worldwide, but a recently released survey has revealed that many enterprises are lacking in the basic knowledge of who their customers are, not to mention their attributes, tastes, purchasing histories and relationships with other customers.

Posted October 26, 2011

Columnar database technology burst onto the data warehouse scene just a couple of years ago with promises of faster query speeds on vast amounts of data. Columnar databases delivered on that promise, but at a cost that is no longer worth paying. Here's why.

Posted October 26, 2011

At Oracle OpenWorld in San Francisco earlier this month, Oracle users groups were out in force hosting SIG meetings, providing educational information, and presenting sessions by subject matter experts. User group presidents were also onsite, outlining ambitious plans for the year ahead.

Posted October 26, 2011

Teradata, a data analytics solutions provider, announced the latest update to its flagship data warehouse product, as well as new features in its data warehouse appliance. Teradata Database 14 is designed as the analytical engine for powering all of the vendor's "purpose-built" platform family members - from enterprise data warehouses to appliances. "We're including big data application support," said Scott Gnau, president of Teradata Labs, at a briefing at the vendor's recent user group event.

Posted October 26, 2011

DataStax, a provider of solutions based on the open source Apache Cassandra database platform, announced it is shipping an enterprise database platform designed to enable the management of both real-time and analytic workloads from a single environment. The new platform, DataStax Enterprise, is designed to leverage the features of Cassandra to provide performance enhancements and cost savings over traditional database management solutions, the vendor claims.

Posted October 26, 2011

Legacy IT systems were developed for - and still run - about 60% of all mission-critical applications. They are still stable and reliable. However, the maintenance on legacy applications over time has made them so complex that rather than helping to solve business problems, most legacy systems have become a challenge that enterprises are grappling with.

Posted October 15, 2011

Cloud computing has taken the enterprise IT world by force. IT managers and CIOs are evaluating private, public and hybrid cloud infrastructures for running corporate applications and services. Many are doing pilots and evaluating large-scale migrations to the cloud, with the hope of not only saving money but increasing services for users.

Posted October 15, 2011

Joe Clabby authors a comprehensive overview on the SHARE website this week focusing on the issue of the IT Skills Gap. While the need for mainframe professionals remains high, for instance, the supply of young mainframers remains stubbornly short. Certainly, the supply of mainframers coming out of universities is hamstrung by a number of specific factors including an uninformed perception by students about the platform's ongoing importance, a dearth of curriculum and faculty covering the subject in many computer science programs, and a lack of outreach by industry into local computer science departments to foster both academic and internship focus.

Posted October 15, 2011

During a keynote presentation last week at Oracle OpenWorld 2011, the new Oracle Big Data Appliance, an engineered system optimized for acquiring, organizing and loading unstructured data into Oracle Database 11g, was announced by Thomas Kurian, executive vice president, Product Development, Oracle.

Posted October 15, 2011

Smartphones, tablets and other handhelds are changing the way companies do business. And when these revolutionary devices can be combined with existing tried-and-true software for evolutionary change, as opposed to ripping and replacing, the results are even better.

Posted September 14, 2011

As companies learn to embrace "big data" - terabytes and gigabytes of bits and bytes, strung across constellations of databases - they face a new challenge: making the data valuable to the business. To accomplish this, data needs to be brought together to give decision makers a more accurate view of the business. "Data is dirty and it's hard work; it requires real skills to understand data semantics and the different types of approaches required for different data problems," Lawrence Fitzpatrick, president of Computech, Inc., tells DBTA. "It's too easy to see data as 'one thing.'"

Posted September 14, 2011

As data grows, the reflex reaction within many organizations is to buy and install more disk storage. Smart approaches are on the horizon but still prevalent among only a minority of companies. How has data grown so far, so fast? Technology growth along the lines of Moore's Law (doubling every 18 months) has made petabyte-capable hardware and software a reality, and data growth itself appears to be keeping pace with the hardware and systems. In fact, a petabyte's worth of data is almost commonplace, as shown in a new survey conducted by Unisphere Research among members of the Independent Oracle Users Group (IOUG). In "The Petabyte Challenge: 2011 IOUG Database Growth Survey," close to 1 out of 10 respondents report that the total amount of online (disk-resident) data they manage today - taking into account all clones, snapshots, replicas and backups - now tops a petabyte.
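To make the growth arithmetic above concrete: capacity that doubles every 18 months multiplies by 2^(t/1.5) after t years, so a data store growing at that pace expands roughly a hundredfold in a decade. A quick sketch (illustrative arithmetic only, not figures from the survey):

```python
def growth_factor(years: float, doubling_period_years: float = 1.5) -> float:
    """Capacity multiplier after `years`, doubling every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# Ten years of doubling every 18 months is roughly a 100x multiplier,
# which is how a 10 TB warehouse finds itself approaching a petabyte.
print(round(growth_factor(10)))  # → 102
```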

Posted September 14, 2011

Virtualization is a broad term and a hot topic among IT professionals. However, just because your organization has conquered server virtualization, or is confidently well underway, does not mean the same practices will carry over: if you apply them unchanged to desktop virtualization, you will be setting yourself up for failure.

Posted August 29, 2011

Rampant data growth has been the stimulus for over-spending on data storage. Technology advances have enabled us to gather more data faster than at any time in our history. This has been beneficial in many ways, providing businesses with more data to optimize their sales, marketing, customer relations and product offerings. Unfortunately, in order to keep pace with data growth, businesses have had to provision more and more storage capacity, costing them millions of dollars.

Posted August 29, 2011

Objectivity, Inc., a database provider, is releasing a commercial version of InfiniteGraph, a distributed and scalable graph database designed to provide a new, cost-effective, and efficient way of navigating multiple types of databases to discover deeper and more relevant intelligence and support real-time decisions. InfiniteGraph already has a large following of users within government agencies, says Jay Jarrell, president and CEO of Objectivity.

Posted August 29, 2011

Informatica Corporation has announced the availability of what the company describes as the industry's first dynamic data masking (DDM) solution. Informatica Dynamic Data Masking provides real-time, policy-driven obfuscation of sensitive data to address a wide range of common data security and privacy challenges without requiring any changes to database or application source code. It is intended to address problems that cannot be solved by other technologies such as IAM (identity access management) and SDM (static data masking). Informatica Dynamic Data Masking is based on technology developed by ActiveBase, which was acquired by Informatica in July 2011.
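The idea behind dynamic data masking can be sketched in a few lines: sensitive values are obfuscated in the result set at read time according to policy, while the stored data and the application code stay untouched. The column names and rules below are hypothetical examples for illustration, not Informatica's actual implementation:

```python
import re

# Masking policy: which columns to obfuscate and how (assumed example rules).
MASK_POLICY = {
    "ssn":  lambda v: "***-**-" + v[-4:],              # keep only the last four digits
    "card": lambda v: re.sub(r"\d(?=\d{4})", "*", v),  # star all but the last four
}

def mask_row(row: dict) -> dict:
    """Apply the policy to one result-set row; unlisted columns pass through."""
    return {col: MASK_POLICY.get(col, lambda v: v)(val) for col, val in row.items()}

row = {"name": "Ada", "ssn": "123-45-6789", "card": "4111111111111111"}
print(mask_row(row))
# {'name': 'Ada', 'ssn': '***-**-6789', 'card': '************1111'}
```

Because the masking happens on the way out, a privileged process reading the same table would still see the unmasked values - which is precisely what distinguishes dynamic masking from static masking, where the stored data itself is rewritten.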

Posted August 29, 2011

Endeca Technologies, Inc., an information management software company, has unveiled native integration of Endeca Latitude with Apache Hadoop. "This native integration between Endeca Latitude and Hadoop brings together big data processing from Hadoop and interactive search, exploration and analysis from Latitude," says Paul Sonderegger, chief strategist of Endeca Technologies.

Posted August 29, 2011

Another IT initiative is in the news. What does it really mean for you? Is it an opportunity? Or is it a distraction? Whatever your perspective, it seems clear that internet computing standards have reached another plateau of standardization and capability, such that vendors see an opportunity to pursue new models of computing.

Posted August 11, 2011

The rise of big data has garnered much of the attention in the data management arena lately. But it is not simply the sheer volume of data that is challenging data professionals. Many new types and brands of DBMSs are also popping up across organizations, bringing new problems for the data professionals who are tasked with managing them, and also giving rise to scores of "accidental database administrators" with no formal DBA training, a new Unisphere Research study reveals.

Posted August 11, 2011

Over the years, countless papers and articles have been written on enterprise resource planning (ERP) implementation project success rates and why ERP projects fail. Interestingly enough, the reasons that projects fail are the same today as they were 10 years ago: lack of top management commitment, unrealistic expectations, poor requirements definition, improper package selection, gaps between software and business requirements, inadequate resources, underestimating time and cost, poor project management, lack of methodology, underestimating impact of change, lack of training and education, and last, but not least, poor communication.

Posted August 11, 2011

There is no doubt that virtualization is radically changing the shape of IT infrastructure, transforming the way applications are deployed and services delivered. Databases are among the last tier 1 applications to be hosted on virtual servers, but the past year has seen a huge increase in production Oracle, SQL Server and other databases running on VMware platforms. For all the benefits of virtualization, including cost-effectiveness, there are some impacts on the IT staff involved. Unfortunately for DBAs, virtualization often means losing control and visibility of their systems, which can ultimately hinder their ability to deliver database-oriented business solutions. In the past, DBAs had perfect visibility into the physical servers hosting their databases; now the virtualization layers, and the tools to manage them, are typically out of bounds. So while all the excitement of late has centered on VMware and other virtual machine systems, DBAs have a valid reason for skepticism.

Posted July 27, 2011

Given all of the recent discussion around big data, NoSQL and NewSQL, this is a good opportunity to visit a topic I believe will be (or should be) forefront in our minds for the next several years - high velocity transactional systems. Let's start with a description of the problem. High velocity transactional applications have input streams that can reach millions of database operations per second under load. To complicate the problem, many of these systems simply cannot tolerate data inconsistencies.

Posted July 27, 2011

Oracle has introduced the Oracle Exadata Storage Expansion Rack to offer customers a cost-effective way to add storage to an Oracle Exadata Database Machine. "There are customers, earlier Exadata Database Machine customers, that have now started to fill up the disks that they have on the Database Machine and they are starting to look for ways to expand their storage capacity, and so this is going to be really welcome for them," says Tim Shetler, vice president of Product Management, Oracle.

Posted July 27, 2011

Sybase, an SAP company, has announced the general availability of Sybase IQ 15.3, which aims to help enterprise IT departments overcome the scalability limitations of many data warehouse approaches. By implementing a business analytics information platform that allows sharing of computing and data resources through the new Sybase IQ PlexQ technology, the company says enterprises can break down user and information silos to increase analytics adoption throughout their entire organization. There is a lot of talk about big data, but how to manage it and analyze it is only half the problem, observes David Jonker, senior product marketing manager of Sybase IQ. "The other half is how do you make it more pervasive throughout the enterprise and from our perspective that is where a lot of the existing data warehousing solutions fall down."

Posted July 27, 2011

The big data playing field grew larger with the formation of Hortonworks and HPCC Systems. Hortonworks is a new company consisting of key architects and core contributors to the Apache Hadoop technology pioneered by Yahoo. In addition, HPCC Systems, which has been launched by LexisNexis Risk Solutions, aims to offer a high performance computing cluster technology as an alternative to Hadoop.

Posted July 27, 2011

Time series data is a sequence of data points, typically measured at successive times and often spaced at uniform intervals. Time-stamped data can be analyzed to extract meaningful statistics or other characteristics, and can also be used to forecast future events based on known past events. Time series data enables applications such as economic forecasting, census analysis and forecasting, fleet management, stock market analysis, and smart energy metering. Because it is time-stamped, time series data has a special internal structure that differs from relational data. Additionally, many applications such as smart metering store data at frequent intervals, requiring massive storage capacity. For these reasons, it is not sufficient to manage time series information using the traditional relational approach of storing one row for each time series entry.
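The storage argument above can be made concrete. A relational design stores one row per reading; a time series design stores one container per series per interval, with the readings packed inside and each timestamp implied by a start time and a fixed interval. The schema below is a minimal hypothetical sketch, not any particular product's format:

```python
from array import array

# Relational style: one row per reading -> (meter_id, timestamp, value).
# 96 readings = one day of 15-minute smart-meter intervals from 2012-01-01 00:00 UTC.
rows = [("meter-1", 1_325_376_000 + i * 900, 20.0 + i * 0.1) for i in range(96)]

# Time series style: one record per meter per day; values packed in a typed
# array, timestamps implied by the start time and the fixed interval.
packed = {
    ("meter-1", "2012-01-01"): {
        "start": 1_325_376_000,
        "interval_s": 900,
        "values": array("d", (20.0 + i * 0.1 for i in range(96))),
    },
}

rec = packed[("meter-1", "2012-01-01")]
assert len(rows) == 96 and len(rec["values"]) == 96          # 96 rows vs. 1 record
assert rows[10][1] == rec["start"] + 10 * rec["interval_s"]  # timestamps reconstructed
```

One packed record replaces 96 rows and carries no per-reading key or timestamp overhead, which is why this layout scales so much better for high-frequency sources such as smart meters.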

Posted July 07, 2011

Today, we operate in a global economy at internet speed. Globalization of our workforce has shifted the way work gets done. The explosion of wireless and edge technology has raised the expectations of consumers, who are more informed, educated, and knowledgeable about products and services. This changing landscape places immense pressure on business applications in organizations worldwide. Critical application outages caused by software defects can cost the business millions of dollars in revenue for every hour of downtime.

Posted July 07, 2011

