Trends and Applications



Social media business intelligence is on the rise, according to a new Unisphere Research study sponsored by IBM and Marist College. The study found that while social media monitoring and analysis is in its early stages, many organizations plan to monitor, collect, stage and analyze this data over the next 1 to 5 years. In particular, line-of-business (LOB) respondents, who are closer to customers, show a keen appreciation for the benefits of monitoring social media networks (SMNs).

Posted March 21, 2012

MarkLogic Corporation has joined the technology partner program of Hortonworks, a leading vendor promoting the development and support of Apache Hadoop. According to the vendors, by leveraging MarkLogic and Hortonworks, organizations will be able to seamlessly combine the power of MapReduce with MarkLogic's real-time, interactive analysis and indexing on a single, unified platform. There are two main reasons that MarkLogic has chosen to partner with Hortonworks, says Justin Makeig, senior product manager at MarkLogic: one is Hortonworks' extensive experience with Hadoop installations, and the second is that its core product is 100% open source.

Posted March 21, 2012

Pentaho Corporation, an open source business analytics company, has formed a strategic partnership with DataStax, a provider of big data solutions built upon the Apache Cassandra project, a high performance NoSQL database. The relationship will provide native integration between Pentaho Kettle and Apache Cassandra. This will merge the scalable, low-latency performance of Cassandra with Kettle's visual interface for high-performance ETL, as well as integrated reporting, visualization and interactive analysis capabilities. According to the companies, organizations seeking to leverage their big data have found it difficult to implement and employ analytics technologies. "One of the big challenges today is ease of use of these tools," says Ian Fyfe, Pentaho's chief technology evangelist. Often built on open source projects, it "takes a lot of deep skills to use these systems, and these are skills that are hard to find," he explains.

Posted March 21, 2012

Novell has announced an update to its ZENworks suite, which adds integrated Mac device management and full disk encryption capabilities. ZENworks 11 Support Pack 2 enables customers to lock out threats without shutting down IT access, the vendor says. ZENworks 11 now offers a more holistic approach to supporting Mac devices in the enterprise. With this release, Mac support is provided through Remote Management for Mac, Asset Management for Mac, Mac OS X Patching and Mac Bundles.

Posted March 21, 2012

The volume of business data under protection is growing rapidly, driven by the explosion of mobile computing, the use of powerful business applications that generate more data, and stringent regulations that require companies to retain data longer and maintain it in a format that is readily available upon request. The problem of massive data growth is particularly acute in traditional, large data-intensive enterprises that have become increasingly reliant on database-driven business automation systems, such as Oracle, SQL Server, and SAP. These organizations are also increasingly adopting a new wave of data-intensive applications to analyze and manage their "big data" - further compounding the problem.

Posted March 07, 2012

Organizational focus has turned to the emergence of "big data" - large-scale data sets that businesses and governments use to create new value with today's computing and communications power. Big data poses many opportunities, but managing its rapid growth adds challenges, including complexity and cost. Leaders must address the implications of big data: the increasing volume and detail of information captured by enterprises, and the rise of multimedia, social media, and the internet.

Posted March 07, 2012

For enterprises grappling with the onslaught of big data, a new platform has emerged from the open source world that promises to provide a cost-effective way to store and process petabytes' worth of information. Hadoop, an Apache project, is already being eagerly embraced by data managers and technologists as a way to manage and analyze mountains of data streaming in from websites and devices. Running data such as weblogs through traditional platforms such as data warehouses or standard analytical toolsets often cannot be cost-justified, as these solutions tend to have high overhead costs. However, organizations are beginning to recognize that such information ultimately can be of tremendous value to the business. Hadoop packages up such data and makes it digestible.
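Hadoop itself is implemented in Java, but the MapReduce model it applies to data like weblogs can be sketched in a few lines. The following Python example is an illustration only - the log format and field positions are invented - showing how a mapper/reducer pair would count hits per URL:

```python
from collections import defaultdict

# Sample web-log lines; the "ip method url status" layout is made up
# purely for illustration.
log_lines = [
    '10.0.0.1 GET /index.html 200',
    '10.0.0.2 GET /index.html 200',
    '10.0.0.1 GET /about.html 404',
]

def map_phase(lines):
    """Map step: emit a (url, 1) pair for every request line."""
    for line in lines:
        parts = line.split()
        yield parts[2], 1  # the URL field

def reduce_phase(pairs):
    """Reduce step: sum the counts for each distinct URL."""
    counts = defaultdict(int)
    for url, n in pairs:
        counts[url] += n
    return dict(counts)

hit_counts = reduce_phase(map_phase(log_lines))
# hit_counts == {'/index.html': 2, '/about.html': 1}
```

In a real Hadoop cluster the map and reduce steps run in parallel across many nodes, with the framework shuffling the intermediate pairs between them.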

Posted March 07, 2012

Clinical Data Management (CDM) is a company headquartered in Colorado that provides clinical information database software, enabling medical institutions to report and compile data on patient care. A longtime user of Revelation Software, dating back to Revelation G and continuing through OpenInsight, CDM was pleased with both the quality of the products and the service from Revelation. However, CDM had come to realize it needed to provide a web interface for data entry to better support its customers and also stay current with evolving technology requirements. That need was answered when Revelation launched the OpenInsight for Web (O4W) Development Toolkit, a web development toolkit that makes it possible for OpenInsight developers with limited or no HTML, XML or JavaScript experience to develop feature-rich web pages.

Posted March 07, 2012

Valuable data and trusted applications at many organizations depend on MultiValue databases, but there is also a need to integrate that data and those applications with other systems and provide access to users in new ways. In this special section, DBTA asks leading MultiValue vendors: What is your organization doing to help customers modernize and stay current with new technologies to address their evolving requirements?

Posted March 07, 2012

SAP application performance (speed and availability) is becoming a major focus as companies rely increasingly on their SAP systems to support employee productivity, partner collaboration, customer relationships, revenues, brand equity and growth. With many companies running their critical business processes on SAP, high availability and acceptable speed of the business software environment are essential requirements.

Posted February 23, 2012

In celebration of ODBC's 20th anniversary this year, Progress Software Corporation has unveiled its Platinum ODBC drivers, Progress DataDirect Connect for ODBC 7.0. The standards-based, fully interoperable Progress DataDirect Connect for ODBC 7.0 driver allows application developers to reliably exchange data between cloud-based and other disparate data sources.

Posted February 23, 2012

Oracle has announced the availability of Oracle Advanced Analytics, a new option for Oracle Database 11g that combines Oracle R Enterprise with Oracle Data Mining. According to Oracle, Oracle R Enterprise delivers enterprise class performance for users of the R statistical programming language, increasing the scale of data that can be analyzed by orders of magnitude using Oracle Database 11g.

Posted February 23, 2012

The challenges of maintaining security and regulatory compliance as applications increasingly move to the cloud - whether public, private or hybrid - will come into greater focus in 2012, says Ryan Berg, cloud security strategy lead for IBM. The need to manage security among an increasingly mobile workforce, with many employees choosing to use their own personal devices, will also be a key concern in 2012, says Berg.

Posted February 23, 2012

Kalido, a provider of agile information management software, unveiled the latest release of the Kalido Information Engine, which helps organizations decrease the time for data mart migrations and consolidations. With this new release, customers will be able to import existing logical and physical models and taxonomies to build a more agile data warehouse. Enabling customers to take advantage of existing assets and investments "is going to dramatically reduce the time and the cost that it takes to bring together data marts into more of a data warehouse scenario," says John Evans, director of product marketing at Kalido.

Posted February 23, 2012

While emails have been "the smoking gun" in many recent court cases, the new big wave in what is "discoverable" is structured (database) data. Accessing data is simpler and much faster from structured sources than from non-structured ones. If the response to e-discovery can come from a structured data format, it is usually much faster than the alternatives and can mitigate the risk of steep fines due to delayed response time. A combination of stringent regulations and new technology is giving judges and litigators more muscle to subpoena more data. Structured data in any application and database, no matter how old or obsolete, can be used in court as evidence, and increasingly it is being asked for.

Posted February 09, 2012

Today's organizations must capture, track, analyze and store more information than ever before - everything from mass quantities of transactional, online and mobile data, to growing amounts of "machine-generated data" such as call detail records, gaming data or sensor readings. And just as volumes are expanding into the tens of terabytes, and even the petabyte range and beyond, IT departments are facing increasing demands for real-time analytics. In this era of "big data," the challenges are as varied as the solutions available to address them. How can businesses store all their data? How can they mitigate the impact of data overload on application performance, speed and reliability? How can they manage and analyze large data sets both efficiently and cost effectively?

Posted February 09, 2012

Businesses are struggling to cope with and leverage an explosion of complex and connected data. This need is driving many companies to adopt scalable, high performance NoSQL databases - a new breed of database solutions - in order to expand and enhance their data management strategies. Traditional "relational" databases will not be able to keep pace with "big data" demands as they were not designed to manage the types of relationships that are so essential in today's applications.

Posted January 25, 2012

Tableau Software, a provider of business intelligence software, has announced the general availability of Tableau 7.0. The release offers improvements in performance and scalability, adds new visualization types and improves the product's overall analytical power and ease-of-use. In addition, the new Tableau Data Server capabilities will make it easy to share large volumes of data, share data models in large groups, and provide enhanced management of data assets, says Chris Stolte, chief development officer, co-founder, and inventor of Tableau Software.

Posted January 25, 2012

RainStor, a provider of big data management software, has unveiled RainStor Big Data Analytics on Hadoop, which the company describes as the first enterprise database running natively on Hadoop. It is intended to enable faster analytics on multi-structured data without the need to move data out of the Hadoop Distributed File System (HDFS) environment. There is architectural compatibility between the way RainStor manages data and the way HDFS manages CSV files, says Deirdre Mahon, vice president of marketing at RainStor.

Posted January 25, 2012

The Oracle Big Data Appliance, an engineered system of hardware and software that was first unveiled at Oracle OpenWorld in October, is now generally available. The new system incorporates Cloudera's Distribution Including Apache Hadoop (CDH3) with Cloudera Manager 3.7, plus an open source distribution of R. The Oracle Big Data Appliance represents "two industry leaders coming together to wrap their arms around all things big data," says Cloudera COO Kirk Dunn.

Posted January 25, 2012

"Big data" and analytics have become the rage within the executive suite. The promise is immense - harness all the available information within the enterprise, regardless of data model or source, and mine it for insights that can't be seen any other way. In short, senior managers become more effective at business planning, spotting emerging trends and opportunities and anticipating crises because they have the means to see both the metaphorical trees and the forest at the same time. However, big data technologies don't come without a cost.

Posted January 11, 2012

CIOs and IT departments are on the frontlines of a monumental IT shift. With the number of mobile devices and applications exploding and bandwidth soaring, they are being asked to find ways to enable the brave new world of enterprise mobility. All involved - from users to IT - recognize the productivity and business efficiency benefits of this trend, but it is typically only IT that also recognizes the dangers unchecked mobility poses to sensitive corporate data.

Posted January 11, 2012

The argument that "everyone is doing it and you should too" holds no value for strategic decision-making in IT. Yet critical thinking often goes by the wayside when a hot, new trend catches on and it seems like the masses are following along. Cloud computing is certainly in vogue - most industry analysts are bullish on cloud computing adoption and anticipate enterprise spending to increase - but organizations need to steer clear of falling into the trap that moving to the cloud always delivers cost savings.

Posted January 11, 2012

In this, our last E-Edition of Database Trends and Applications for 2011, we're taking a look back at some of the most widely read articles of the past year. These articles cover a range of topics. Some provide an examination of just-emerging or quickly evolving technologies, others highlight best practices in a specific discipline, while others comment on trends observed by industry experts. Click on the "December 2011 E-Edition UPDATE" headline above to access the articles. If you missed one earlier in the year, here's your second chance. All DBTA E-Editions are archived by month on the DBTA website.

Posted December 16, 2011

Stacks of statistics from many sources share a common theme - growth rates for digital information are extremely high and undeniable. A tsunami of e-information is fueling the engine of today's corporate enterprise, and many businesses are aiming to ride the information wave to prosperity. However, many companies are not sufficiently attentive to all the potential liabilities lurking in the depths of this digital information, including the risks involved in using real, live personal customer and employee data for application development and testing purposes. There's real potential for serious data security, legal and noncompliance risks when businesses fail to protect this data.

Posted December 01, 2011

Until recently, companies were only warming up to the possibilities of cloud computing. Lately, however, for many enterprise-IT decision makers, cloud is hot, hot, hot. The sea change now underway means many companies are quickly moving from "dipping their toes into cloud computing" to a full-fledged immersion, says Thom VanHorn, vice president of marketing for Application Security, Inc. In 2012, expect to see those same companies dive right in. "The move will only accelerate," he tells DBTA.

Posted December 01, 2011

A new survey of 421 data managers and professionals affiliated with the Independent Oracle Users Group (IOUG) finds that while most companies have well-established data warehouse systems, adoption is still limited within their organizations. Many respondents report a significant surge of data within their data warehouses in recent times, fueled not only by growing volumes of transaction data but by unstructured data as well. Now, the challenge is to find ways to extend data analysis capabilities to additional business areas.

Posted December 01, 2011

Although it is possible to develop an efficient Vehicle Tracking System using any database server, and many such solutions are already available in the market, Informix offers many advantages. Informix can reduce your disk space requirement, improve your query performance, and reduce your application development efforts, without special training or buying additional technology. The built-in technologies of Informix will reduce your total cost of ownership, so if these benefits are important to you, keep reading!

Posted November 10, 2011

Increasing concerns over security breaches from external and internal threats, regulatory compliance requirements from HIPAA, the HITECH Act, PCI DSS, and other mandates, plus the migration from physical servers to virtual machines and the cloud are prompting companies to adopt encryption as never before. Encryption protects data by transforming data into unintelligible strings of characters (called cipher text) and today is widely considered to be a security best practice.
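The plaintext-to-ciphertext transformation described above can be illustrated with a toy round-trip. The cipher below is a sketch only - the XOR-with-hash construction is NOT a security best practice, and production systems should use vetted algorithms such as AES via a maintained cryptography library - but it shows how readable bytes become unintelligible without the key:

```python
import hashlib
from itertools import cycle

# Toy symmetric cipher for illustration ONLY. Do not use in production;
# it exists solely to show plaintext -> ciphertext -> plaintext.
def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # Derive a repeating keystream from the key and XOR it with the data.
    keystream = cycle(hashlib.sha256(key).digest())
    return bytes(b ^ k for b, k in zip(data, keystream))

key = b"secret-key"
ciphertext = toy_encrypt(b"patient record 1234", key)
# XOR is its own inverse, so applying the same function decrypts.
recovered = toy_encrypt(ciphertext, key)

assert recovered == b"patient record 1234"
assert ciphertext != b"patient record 1234"  # unintelligible without the key
```

Real deployments add key management, authenticated modes, and random initialization vectors on top of this basic idea.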

Posted November 10, 2011

Few things in the world are changing as dramatically as data. Data has tapped out a powerful rhythm to keep time with technology's bleeding edge, leaving many technologies struggling to keep up. It should come as no surprise, then, that many of the data strategies that IT departments developed, and still widely rely upon, are no longer sufficient for today's needs. You can put data marts near the top of that list.

Posted November 10, 2011

Customer centricity has become a watchword for major corporations worldwide, but a recently released survey has revealed that many enterprises are lacking in the basic knowledge of who their customers are, not to mention their attributes, tastes, purchasing histories and relationships with other customers.

Posted October 26, 2011

Columnar database technology burst on the data warehouse scene just a couple years ago with promises of faster query speeds on vast amounts of data. They delivered on that promise, but at a cost that is no longer worth paying. Here's why.

Posted October 26, 2011

At Oracle OpenWorld in San Francisco earlier this month, Oracle users groups were out in force hosting SIG meetings, providing educational information, and presenting sessions by subject matter experts. User group presidents were also onsite, outlining ambitious plans for the year ahead.

Posted October 26, 2011

Teradata, a data analytics solutions provider, has announced the latest update to its flagship data warehouse product, as well as new features in its data warehouse appliance. Teradata Database 14 is designed as the analytical engine powering all of the vendor's "purpose-built" platform family members - from enterprise data warehouses to appliances. "We're including big data application support," says Scott Gnau, president of Teradata Labs, at a briefing at the vendor's recent user group event.

Posted October 26, 2011

DataStax, a provider of solutions based on the open source Apache Cassandra database platform, announced it is shipping an enterprise database platform designed to enable the management of both real-time and analytic workloads from a single environment. The new platform, DataStax Enterprise, is designed to leverage the features of Cassandra to provide performance enhancements and cost savings over traditional database management solutions, the vendor claims.

Posted October 26, 2011

Legacy IT systems were developed for - and still run - about 60% of all mission-critical applications. They are still stable and reliable. However, the maintenance on legacy applications over time has made them so complex that rather than helping to solve business problems, most legacy systems have become a challenge that enterprises are grappling with.

Posted October 15, 2011

Cloud computing has taken the enterprise IT world by force. IT managers and CIOs are evaluating private, public and hybrid cloud infrastructures for running corporate applications and services. Many are doing pilots and evaluating large-scale migrations to the cloud, with the hope of not only saving money but increasing services for users.

Posted October 15, 2011

Joe Clabby authors a comprehensive overview on the SHARE website this week focusing on the issue of the IT Skills Gap. While the need for mainframe professionals remains high, for instance, the supply of young mainframers remains stubbornly short. Certainly, the supply of mainframers coming out of universities is hamstrung by a number of specific factors including an uninformed perception by students about the platform's ongoing importance, a dearth of curriculum and faculty covering the subject in many computer science programs, and a lack of outreach by industry into local computer science departments to foster both academic and internship focus.

Posted October 15, 2011

During a keynote presentation last week at Oracle OpenWorld 2011, Thomas Kurian, executive vice president, Product Development, Oracle, announced the new Oracle Big Data Appliance, an engineered system optimized for acquiring, organizing and loading unstructured data into Oracle Database 11g.

Posted October 15, 2011

Smartphones, tablets and other handhelds are changing the way companies do business. And when these revolutionary devices can be combined with existing tried-and-true software for evolutionary change, as opposed to ripping and replacing, the results are even better.

Posted September 14, 2011

As companies learn to embrace "big data" - terabytes and gigabytes of bits and bytes, strung across constellations of databases - they face a new challenge: making the data valuable to the business. To accomplish this, data needs to be brought together to give decision makers a more accurate view of the business. "Data is dirty and it's hard work; it requires real skills to understand data semantics and the different types of approaches required for different data problems," Lawrence Fitzpatrick, president of Computech, Inc., tells DBTA. "It's too easy to see data as 'one thing.'"

Posted September 14, 2011

As data grows, the reflex reaction within many organizations is to buy and install more disk storage. Smart approaches are on the horizon but still only prevalent among a minority of companies. How is it that data has grown so far, so fast? Technology growth along the lines of Moore's Law (doubling every 18 months) has made petabyte-capable hardware and software a reality. And data growth itself appears to be keeping pace with the hardware and systems. In fact, a petabyte's worth of data is almost commonplace, as shown in a new survey conducted by Unisphere Research among members of the Independent Oracle Users Group (IOUG). In "The Petabyte Challenge: 2011 IOUG Database Growth Survey," close to 1 out of 10 respondents report that the total amount of online (disk-resident) data they manage today - taking into account all clones, snapshots, replicas and backups - now tops a petabyte.
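A back-of-the-envelope calculation shows how quickly that doubling compounds. Assuming an 18-month doubling period and a 1TB starting point (the starting figure is illustrative, not from the survey):

```python
# How long until 1 TB of managed data reaches 1 PB, if volume doubles
# every 18 months? 1 PB = 2**10 TB, so ten doublings are required.
doubling_period_months = 18
doublings_needed = 10          # 1 TB * 2**10 = 1024 TB = 1 PB

months = doubling_period_months * doublings_needed
years = months / 12
# years == 15.0
```

In other words, at that rate a terabyte-scale shop crosses into petabyte territory in about a decade and a half - which matches how quickly petabyte-class environments have gone from exotic to "almost commonplace."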

Posted September 14, 2011

Virtualization is a broad term and a hot topic among IT professionals. However, even if your organization has conquered server virtualization, or is confidently well underway, proceeding with the same practices for desktop virtualization will set you up for failure.

Posted August 29, 2011

Rampant data growth has been the stimulus for over-spending on data storage. Technology advances have enabled us to gather more data faster than at any time in our history. This has been beneficial in many ways, providing businesses with more data that can enable them to optimize their sales, marketing, customer relations and product offerings. Unfortunately, in order to keep pace with data growth, businesses have had to provision more and more storage capacity, costing them millions of dollars.

Posted August 29, 2011

Objectivity, Inc., a database provider, is releasing a commercial version of InfiniteGraph, a distributed and scalable graph database designed to enable a new, cost effective, and efficient way of navigating multiple types of databases for discovery of deeper and more relevant intelligence, enabling real-time decision support. InfiniteGraph already has a large following of users within government agencies, says Jay Jarrell, president and CEO of Objectivity.
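The kind of relationship navigation a graph database is built for can be sketched with a tiny in-memory graph. The entities and edges below are invented, and a product like InfiniteGraph would distribute such traversals across many stores - this is only a minimal model of the idea:

```python
from collections import deque

# Toy relationship graph: who is connected to whom (names are made up).
edges = {
    'alice': ['bob', 'carol'],
    'bob':   ['dave'],
    'carol': ['dave'],
    'dave':  ['eve'],
    'eve':   [],
}

def shortest_hops(graph, start, goal):
    """Breadth-first search: fewest hops between two entities, or None."""
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        node, hops = queue.popleft()
        if node == goal:
            return hops
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, hops + 1))
    return None

# alice -> bob (or carol) -> dave -> eve: three hops
assert shortest_hops(edges, 'alice', 'eve') == 3
```

Queries like "how closely are these two entities connected?" are exactly where graph stores outperform relational joins, since each hop is a direct edge lookup rather than a table scan.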

Posted August 29, 2011

Informatica Corporation has announced the availability of what the company describes as the industry's first dynamic data masking (DDM) solution. Informatica Dynamic Data Masking provides real-time, policy-driven obfuscation of sensitive data to address a wide range of common data security and privacy challenges without requiring any changes to database or application source code. It is intended to address problems that cannot be solved by other technologies such as IAM (identity access management) and SDM (static data masking). Informatica Dynamic Data Masking is based on technology developed by ActiveBase, which was acquired by Informatica in July 2011.
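Conceptually, dynamic data masking intercepts query results and rewrites sensitive fields on the way out, leaving the stored records untouched. The sketch below is a hypothetical model of that policy-driven approach - it is not Informatica's API, and the field names and masking rules are invented:

```python
import re

# Hypothetical masking policy: field name -> obfuscation function.
MASKING_POLICY = {
    'ssn':   lambda v: 'XXX-XX-' + v[-4:],            # keep last 4 digits
    'email': lambda v: re.sub(r'^[^@]+', '****', v),  # hide the local part
}

def mask_row(row, policy=MASKING_POLICY):
    """Return a copy of a result row with sensitive fields obfuscated."""
    return {k: policy[k](v) if k in policy else v for k, v in row.items()}

record = {'name': 'Pat Lee', 'ssn': '123-45-6789', 'email': 'pat@example.com'}
masked = mask_row(record)
# masked == {'name': 'Pat Lee', 'ssn': 'XXX-XX-6789',
#            'email': '****@example.com'}
```

Because the masking happens at read time, unprivileged users see obfuscated values while the underlying database and application code stay unchanged - which is the core claim of the dynamic approach versus static masking.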

Posted August 29, 2011

Endeca Technologies, Inc., an information management software company, has unveiled native integration of Endeca Latitude with Apache Hadoop. "This native integration between Endeca Latitude and Hadoop brings together big data processing from Hadoop and interactive search, exploration and analysis from Latitude," says Paul Sonderegger, chief strategist of Endeca Technologies.

Posted August 29, 2011

Another IT initiative is in the news. What does it really mean for you? Is it an opportunity? Or is it a distraction? Whatever your perspective, it seems clear that internet computing standards have reached another plateau of standardization and capability, such that vendors see an opportunity to pursue new models of computing.

Posted August 11, 2011

The rise of big data has garnered much of the attention in the data management arena lately. But it is not simply the sheer volume of data that is challenging data professionals. Many new types and brands of DBMSs are also popping up across organizations, bringing new problems for the data professionals who are tasked with managing them, and also giving rise to scores of "accidental database administrators" with no formal DBA training, a new Unisphere Research study reveals.

Posted August 11, 2011

Over the years, countless papers and articles have been written on enterprise resource planning (ERP) implementation project success rates and why ERP projects fail. Interestingly enough, the reasons that projects fail are the same today as they were 10 years ago: lack of top management commitment, unrealistic expectations, poor requirements definition, improper package selection, gaps between software and business requirements, inadequate resources, underestimating time and cost, poor project management, lack of methodology, underestimating impact of change, lack of training and education, and last, but not least, poor communication.

Posted August 11, 2011
