Trends and Applications



Social media business intelligence is on the rise, according to a new Unisphere Research study sponsored by IBM and Marist College. The study found that while social media monitoring and analysis is in its early stages, many organizations plan to monitor, collect, stage and analyze this data over the next 1 to 5 years. In particular, line-of-business (LOB) respondents, who are closer to customers, show a keen appreciation for the benefits of monitoring social media networks (SMNs).

Posted March 21, 2012

MarkLogic Corporation has joined the technology partner program of Hortonworks, a leading vendor promoting the development and support of Apache Hadoop. According to the vendors, by leveraging MarkLogic and Hortonworks, organizations will be able to seamlessly combine the power of MapReduce with MarkLogic's real-time, interactive analysis and indexing on a single, unified platform. There are two main reasons that MarkLogic chose to partner with Hortonworks, says Justin Makeig, senior product manager at MarkLogic: one is Hortonworks' extensive experience with Hadoop installations, and the second is that its core product is 100% open source.

Posted March 21, 2012

Pentaho Corporation, an open source business analytics company, has formed a strategic partnership with DataStax, a provider of big data solutions built upon the Apache Cassandra project, a high performance NoSQL database. The relationship will provide native integration between Pentaho Kettle and Apache Cassandra. This will merge the scalable, low-latency performance of Cassandra with Kettle's visual interface for high-performance ETL, as well as integrated reporting, visualization and interactive analysis capabilities. According to the companies, organizations seeking to leverage their big data have found it difficult to implement and employ analytics technologies. "One of the big challenges today is ease of use of these tools," says Ian Fyfe, Pentaho's chief technology evangelist. Often built on open source projects, it "takes a lot of deep skills to use these systems, and these are skills that are hard to find," he explains.

Posted March 21, 2012

Novell has announced an update to its ZENworks suite that adds integrated Mac device management and full disk encryption capabilities. ZENworks 11 Support Pack 2 enables customers to lock out threats without shutting down IT access, the vendor says. ZENworks 11 now offers a more holistic approach to supporting Mac devices in the enterprise. With this release, Mac support is provided through Remote Management for Mac, Asset Management for Mac, Mac OS X Patching and Mac Bundles.

Posted March 21, 2012

Improving Data Protection with Deduplication

Posted March 07, 2012

Organizational focus has turned to the emergence of "big data" - large-scale data sets that businesses and governments use to create new value with today's computing and communications power. Big data poses many opportunities, but managing its rapid growth adds challenges, including complexity and cost. Leaders must address the implications of big data: the increasing volume and detail of information captured by enterprises, and the rise of multimedia, social media and the internet.

Posted March 07, 2012

For enterprises grappling with the onslaught of big data, a new platform has emerged from the open source world that promises a cost-effective way to store and process petabytes' worth of information. Hadoop, an Apache project, is already being eagerly embraced by data managers and technologists as a way to manage and analyze mountains of data streaming in from websites and devices. Processing data such as weblogs with traditional platforms such as data warehouses or standard analytical toolsets often cannot be cost-justified, as these solutions tend to have high overhead costs. However, organizations are beginning to recognize that such information ultimately can be of tremendous value to the business. Hadoop packages up such data and makes it digestible.
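
Hadoop's core programming model, MapReduce, splits work into a map phase that emits key/value pairs and a reduce phase that aggregates them. The toy sketch below is plain Python rather than actual Hadoop code, and the weblog format is an assumption; it only illustrates the shape of the model applied to the kind of weblog data described above.

```python
from collections import defaultdict

# Toy weblog lines; a real deployment would stream these from HDFS.
logs = [
    "GET /index.html 200",
    "GET /missing 404",
    "POST /login 200",
]

def map_phase(line):
    # Emit a (status_code, 1) pair for each request line.
    yield line.split()[-1], 1

def reduce_phase(pairs):
    # Sum the counts for each status code.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

counts = reduce_phase(pair for line in logs for pair in map_phase(line))
```

On a real cluster the framework shuffles the mapped pairs so that all values for a given key reach the same reducer; the per-record logic, however, stays this simple.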

Posted March 07, 2012

Clinical Data Management (CDM) is a company headquartered in Colorado that provides clinical information database software, enabling medical institutions to report and compile data on patient care. A longtime user of Revelation Software, dating back to Revelation G and continuing through OpenInsight, CDM was pleased with both the quality of the products and the service from Revelation. However, CDM had come to realize it needed to provide a web interface for data entry to better support its customers and also stay current with evolving technology requirements. That need was answered when Revelation launched the OpenInsight for Web (O4W) Development Toolkit, a web development toolkit that makes it possible for OpenInsight developers with limited or no HTML, XML or JavaScript experience to develop feature-rich web pages.

Posted March 07, 2012

Valuable data and trusted applications are dependent on MultiValue databases at many organizations, but there is also a need to integrate that data and those applications with other systems and provide access to users in new ways. In this special section, DBTA asks leading MultiValue vendors: What is your organization doing to help customers modernize and stay current with new technologies to address their evolving requirements?

Posted March 07, 2012

SAP application performance (speed and availability) is becoming a major focus as companies rely increasingly on their SAP systems to support employee productivity, partner collaboration, customer relationships, revenues, brand equity and growth. With many companies running their critical business processes on SAP, high availability and acceptable speed of the business software environment are essential requirements.

Posted February 23, 2012

In celebration of ODBC's 20th anniversary this year, Progress Software Corporation has unveiled its Platinum ODBC drivers, Progress DataDirect Connect for ODBC 7.0. The standards-based, fully interoperable Progress DataDirect Connect for ODBC 7.0 driver allows application developers to reliably exchange data between cloud and other disparate data sources.

Posted February 23, 2012

Oracle has announced the availability of Oracle Advanced Analytics, a new option for Oracle Database 11g that combines Oracle R Enterprise with Oracle Data Mining. According to Oracle, Oracle R Enterprise delivers enterprise class performance for users of the R statistical programming language, increasing the scale of data that can be analyzed by orders of magnitude using Oracle Database 11g.

Posted February 23, 2012

The challenges of maintaining security and regulatory compliance as applications increasingly move to the cloud - whether public, private or hybrid - will come into greater focus in 2012, says Ryan Berg, cloud security strategy lead for IBM. The need to manage security among an increasingly mobile workforce, with many employees choosing to use their own personal devices, will also be a key concern in 2012, says Berg.

Posted February 23, 2012

Kalido, a provider of agile information management software, unveiled the latest release of the Kalido Information Engine, which helps organizations decrease the time for data mart migrations and consolidations. With this new release, customers will be able to import existing logical and physical models and taxonomies to build a more agile data warehouse. Enabling customers to take advantage of existing assets and investments "is going to dramatically reduce the time and the cost that it takes to bring together data marts into more of a data warehouse scenario," says John Evans, director of product marketing at Kalido.

Posted February 23, 2012

The Advantages of Using Structured Data for E-Discovery

Posted February 09, 2012

Today's organizations must capture, track, analyze and store more information than ever before - everything from mass quantities of transactional, online and mobile data, to growing amounts of "machine-generated data" such as call detail records, gaming data or sensor readings. And just as volumes are expanding into the tens of terabytes, and even the petabyte range and beyond, IT departments are facing increasing demands for real-time analytics. In this era of "big data," the challenges are as varied as the solutions available to address them. How can businesses store all their data? How can they mitigate the impact of data overload on application performance, speed and reliability? How can they manage and analyze large data sets both efficiently and cost effectively?

Posted February 09, 2012

Businesses are struggling to cope with and leverage an explosion of complex and connected data. This need is driving many companies to adopt scalable, high performance NoSQL databases - a new breed of database solutions - in order to expand and enhance their data management strategies. Traditional "relational" databases will not be able to keep pace with "big data" demands as they were not designed to manage the types of relationships that are so essential in today's applications.

Posted January 25, 2012

Tableau Software, a provider of business intelligence software, has announced the general availability of Tableau 7.0. The release offers improvements in performance and scalability, adds new visualization types and improves the product's overall analytical power and ease-of-use. In addition, the new Tableau Data Server capabilities will make it easy to share large volumes of data, share data models in large groups, and provide enhanced management of data assets, says Chris Stolte, chief development officer, co-founder, and inventor of Tableau Software.

Posted January 25, 2012

RainStor, a provider of big data management software, has unveiled RainStor Big Data Analytics on Hadoop, which the company describes as the first enterprise database running natively on Hadoop. It is intended to enable faster analytics on multi-structured data without the need to move data out of the Hadoop Distributed File System (HDFS) environment. There is architectural compatibility between the way RainStor manages data and the way HDFS manages CSV files, says Deirdre Mahon, vice president of marketing at RainStor.

Posted January 25, 2012

The Oracle Big Data Appliance, an engineered system of hardware and software that was first unveiled at Oracle OpenWorld in October, is now generally available. The new system incorporates Cloudera's Distribution Including Apache Hadoop (CDH3) with Cloudera Manager 3.7, plus an open source distribution of R. The Oracle Big Data Appliance represents "two industry leaders coming together to wrap their arms around all things big data," says Cloudera COO Kirk Dunn.

Posted January 25, 2012

"Big data" and analytics have become the rage within the executive suite. The promise is immense - harness all the available information within the enterprise, regardless of data model or source, and mine it for insights that can't be seen any other way. In short, senior managers become more effective at business planning, spotting emerging trends and opportunities and anticipating crises because they have the means to see both the metaphorical trees and the forest at the same time. However, big data technologies don't come without a cost.

Posted January 11, 2012

CIOs and IT departments are on the frontlines of a monumental IT shift. With the number of mobile devices and applications exploding and bandwidth soaring, they are being asked to find ways to enable the brave new world of enterprise mobility. All involved - from users to IT - recognize the productivity and business efficiency benefits of this trend, but it is typically only IT that also recognizes the dangers unchecked mobility poses to sensitive corporate data.

Posted January 11, 2012

The argument that "everyone is doing it and you should too" holds no value for strategic decision-making in IT. Yet critical thinking often goes by the wayside when a hot, new trend catches on and it seems like the masses are following along. Cloud computing is certainly in vogue - most industry analysts are bullish on cloud computing adoption and anticipate enterprise spending to increase - but organizations need to steer clear of falling into the trap that moving to the cloud always delivers cost savings.

Posted January 11, 2012

December 2011 E-Edition UPDATE

Posted December 16, 2011

Stacks of statistics from many sources share a common theme - growth rates for digital information are extremely high and undeniable. A tsunami of e-information is fueling the engine of today's corporate enterprise, and many businesses are aiming to ride the information wave to prosperity. However, many companies are not sufficiently attentive to all the potential liabilities lurking in the depths of this digital information, including the risks involved in using real, live personal customer and employee data for application development and testing purposes. There's real potential for serious data security, legal and noncompliance risks when businesses fail to protect this data.

Posted December 01, 2011

Until recently, companies were only warming up to the possibilities of cloud computing. Lately, however, for many enterprise-IT decision makers, cloud is hot, hot, hot. The sea change now underway means many companies are quickly moving from "dipping their toes into cloud computing" to a full-fledged immersion, says Thom VanHorn, vice president of marketing for Application Security, Inc. In 2012, expect to see those same companies dive right in. "The move will only accelerate," he tells DBTA.

Posted December 01, 2011

A new survey of 421 data managers and professionals affiliated with the Independent Oracle Users Group (IOUG) finds that while most companies have well-established data warehouse systems, adoption is still limited within their organizations. Many respondents report a significant surge of data within their data warehouses in recent times, fueled not only by growing volumes of transaction data but unstructured data as well. Now, the challenge is to find ways to extend data analysis capabilities to additional business areas.

Posted December 01, 2011

Efficient Vehicle Tracking System Software Solution with Informix Dynamic Server

Posted November 10, 2011

Few things in the world are changing as dramatically as data. Data has tapped out a powerful rhythm to keep time with technology's bleeding edge, leaving many technologies struggling to keep up. It should come as no surprise, then, that many of the data strategies that IT departments developed, and still widely rely upon, are no longer sufficient for today's needs. You can put data marts near the top of that list.

Posted November 10, 2011

Customer centricity has become a watchword for major corporations worldwide, but a recently released survey has revealed that many enterprises are lacking in the basic knowledge of who their customers are, not to mention their attributes, tastes, purchasing histories and relationships with other customers.

Posted October 26, 2011

Columnar databases burst onto the data warehouse scene just a couple of years ago with promises of faster query speeds on vast amounts of data. They delivered on that promise, but at a cost that is no longer worth paying. Here's why.

Posted October 26, 2011

Ambitious Plans for 2012 Laid Out by User Group Presidents at OpenWorld

Posted October 26, 2011

Teradata, a data analytics solutions provider, announced the latest update to its flagship data warehouse product, as well as new features in its data warehouse appliance. Teradata Database 14 is designed as the analytical engine for powering all of the vendor's "purpose-built" platform family members - from enterprise data warehouses to appliances. "We're including big data application support," says Scott Gnau, president of Teradata Labs, at a briefing at the vendor's recent user group event.

Posted October 26, 2011

DataStax, a provider of solutions based on the open source Apache Cassandra database platform, announced it is shipping an enterprise database platform designed to enable the management of both real-time and analytic workloads from a single environment. The new platform, DataStax Enterprise, is designed to leverage the features of Cassandra to provide performance enhancements and cost savings over traditional database management solutions, the vendor claims.

Posted October 26, 2011

Legacy IT systems were developed for - and still run - about 60% of all mission-critical applications. They are still stable and reliable. However, the maintenance on legacy applications over time has made them so complex that rather than helping to solve business problems, most legacy systems have become a challenge that enterprises are grappling with.

Posted October 15, 2011

Cloud computing has taken the enterprise IT world by storm. IT managers and CIOs are evaluating private, public and hybrid cloud infrastructures for running corporate applications and services. Many are doing pilots and evaluating large-scale migrations to the cloud, with the hope of not only saving money but increasing services for users.

Posted October 15, 2011

Joe Clabby authors a comprehensive overview on the SHARE website this week focusing on the issue of the IT Skills Gap. While the need for mainframe professionals remains high, for instance, the supply of young mainframers remains stubbornly short. Certainly, the supply of mainframers coming out of universities is hamstrung by a number of specific factors including an uninformed perception by students about the platform's ongoing importance, a dearth of curriculum and faculty covering the subject in many computer science programs, and a lack of outreach by industry into local computer science departments to foster both academic and internship focus.

Posted October 15, 2011

During a keynote presentation last week at Oracle OpenWorld 2011, the new Oracle Big Data Appliance, an engineered system optimized for acquiring, organizing and loading unstructured data into Oracle Database 11g, was announced by Thomas Kurian, executive vice president, Product Development, Oracle.

Posted October 15, 2011

Smartphones, tablets and other handhelds are changing the way companies do business. And when these revolutionary devices can be combined with existing tried-and-true software for evolutionary change, as opposed to ripping and replacing, the results are even better.

Posted September 14, 2011

As companies learn to embrace "big data" - gigabytes and terabytes of bits and bytes, strung across constellations of databases - they face a new challenge: making the data valuable to the business. To accomplish this, data needs to be brought together to give decision makers a more accurate view of the business. "Data is dirty and it's hard work; it requires real skills to understand data semantics and the different types of approaches required for different data problems," Lawrence Fitzpatrick, president of Computech, Inc., tells DBTA. "It's too easy to see data as 'one thing.'"

Posted September 14, 2011

As data grows, the reflex reaction within many organizations is to buy and install more disk storage. Smart approaches are on the horizon but still only prevalent among a minority of companies. How is it data has grown so far so fast? Technology growth along the lines of Moore's Law (doubling every 18 months) has made petabyte-capable hardware and software a reality. And data growth itself appears to be keeping pace with the hardware and systems. In fact, a petabyte's worth of data is almost commonplace, as shown in a new survey conducted by Unisphere Research among members of the Independent Oracle Users Group (IOUG). In "The Petabyte Challenge: 2011 IOUG Database Growth Survey," close to 1 out of 10 respondents report that the total amount of online (disk-resident) data they manage today - taking into account all clones, snapshots, replicas and backups - now tops a petabyte.
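
The 18-month doubling rate cited above compounds quickly. As a back-of-the-envelope sketch (the starting figure below is illustrative, not from the survey), a data estate doubling every 18 months grows eightfold in four and a half years:

```python
def projected_size(start_tb, years, doubling_months=18.0):
    """Project capacity after `years`, doubling every `doubling_months`."""
    return start_tb * 2 ** (years * 12 / doubling_months)

# An illustrative 125 TB estate doubling every 18 months reaches
# 1,000 TB (a decimal petabyte) in 4.5 years: 125 * 2**3 = 1000.
size = projected_size(125, 4.5)
```

At that rate, an organization managing a few hundred terabytes today should expect to be planning for petabyte scale within its next hardware refresh cycle.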

Posted September 14, 2011

Virtualization is a broad term and a hot topic among IT professionals. However, even if your organization has conquered server virtualization, or is confidently well underway, applying those same practices to desktop virtualization will set you up for failure.

Posted August 29, 2011

Getting Out of Storage Debt

Posted August 29, 2011

Informatica Corporation has announced the availability of what the company describes as the industry's first dynamic data masking (DDM) solution. Informatica Dynamic Data Masking provides real-time, policy-driven obfuscation of sensitive data to address a wide range of common data security and privacy challenges without requiring any changes to database or application source code. It is intended to address problems that cannot be solved by other technologies such as identity access management (IAM) and static data masking (SDM). Informatica Dynamic Data Masking is based on technology developed by ActiveBase, which Informatica acquired in July 2011.
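
The key idea behind dynamic masking, as opposed to static masking, is that sensitive values are rewritten on the fly as results flow back to the application, leaving the stored data untouched. The hypothetical Python sketch below is not Informatica's implementation; it only illustrates one such policy, obfuscating all but the last four digits of card-number-like strings in a result row:

```python
import re

# Matches 16-digit card-number-like strings, optionally separated
# by hyphens or spaces.
CARD_RE = re.compile(r"\b(\d{4})[- ]?(\d{4})[- ]?(\d{4})[- ]?(\d{4})\b")

def mask_row(row):
    """Return a copy of a result row with card numbers obfuscated."""
    return {
        col: CARD_RE.sub(lambda m: "****-****-****-" + m.group(4), str(val))
        for col, val in row.items()
    }

row = {"name": "Alice", "card": "4111-1111-1111-1234"}
masked = mask_row(row)  # card becomes ****-****-****-1234; name is untouched
```

A production DDM product sits between the application and the database and applies such policies per user or role, so privileged users can still see real values while everyone else sees masked ones.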

Posted August 29, 2011

Endeca Technologies, Inc., an information management software company, has unveiled native integration of Endeca Latitude with Apache Hadoop. "This native integration between Endeca Latitude and Hadoop brings together big data processing from Hadoop and interactive search, exploration and analysis from Latitude," says Paul Sonderegger, chief strategist of Endeca Technologies.

Posted August 29, 2011

Another IT initiative is in the news. What does it really mean for you? Is it an opportunity? Or is it a distraction? Whatever your perspective, it seems clear that internet computing standards have reached another plateau of standardization and capability, such that vendors see an opportunity to pursue new models of computing.

Posted August 11, 2011

The rise of big data has garnered much of the attention in the data management arena lately. But it is not simply the sheer volume of data that is challenging data professionals. Many new types and brands of DBMSs are also popping up across organizations, bringing new problems for the data professionals who are tasked with managing them, and also giving rise to scores of "accidental database administrators" with no formal DBA training, a new Unisphere Research study reveals.

Posted August 11, 2011

Over the years, countless papers and articles have been written on enterprise resource planning (ERP) implementation project success rates and why ERP projects fail. Interestingly enough, the reasons that projects fail are the same today as they were 10 years ago: lack of top management commitment, unrealistic expectations, poor requirements definition, improper package selection, gaps between software and business requirements, inadequate resources, underestimating time and cost, poor project management, lack of methodology, underestimating impact of change, lack of training and education, and last, but not least, poor communication.

Posted August 11, 2011

There is no doubt that virtualization is radically changing the shape of IT infrastructure, transforming the way applications are deployed and services delivered. Databases are among the last of the tier 1 applications to be hosted on virtual servers, but the past year has seen a huge increase in production Oracle, SQL Server and other databases running on VMware platforms. For all the benefits of virtualization, including cost-effectiveness, there are some impacts on the IT staff involved. Unfortunately for DBAs, virtualization often means losing control and visibility of their systems, which can ultimately hinder their ability to deliver database-oriented business solutions. While in the past DBAs had full visibility into the physical servers hosting their databases, the virtualization layers and the tools to manage them are typically off-limits to them. So while all the excitement of late has centered on VMware and other virtual machine systems, DBAs have a valid reason for skepticism.

Posted July 27, 2011

Given all of the recent discussion around big data, NoSQL and NewSQL, this is a good opportunity to visit a topic I believe will be (or should be) forefront in our minds for the next several years - high velocity transactional systems. Let's start with a description of the problem. High velocity transactional applications have input streams that can reach millions of database operations per second under load. To complicate the problem, many of these systems simply cannot tolerate data inconsistencies.
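
Why consistency matters at millions of operations per second: without atomic updates, concurrent writes silently lose operations. A minimal Python illustration (a lock standing in for a database's concurrency control, not any particular product's mechanism):

```python
import threading

class Counter:
    """Consistent counter: every increment is applied exactly once."""
    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:  # the "transaction" boundary
            self.value += 1

counter = Counter()
threads = [
    threading.Thread(
        target=lambda: [counter.increment() for _ in range(10_000)]
    )
    for _ in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
# With the lock, counter.value is exactly 40,000; without it, the
# read-modify-write can interleave and updates are silently lost.
```

The engineering challenge for high-velocity systems is delivering this exactly-once guarantee without the lock becoming the bottleneck, which is precisely the trade-off the NoSQL and NewSQL camps resolve differently.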

Posted July 27, 2011
