Trends and Applications



IBM announced that all of its cloud services and software will be based on an open cloud architecture. As the first step, IBM unveiled a new private cloud offering based on the open source OpenStack software that it says speeds and simplifies managing an enterprise-grade cloud. The offering provides businesses with a core set of open source-based technologies to build enterprise-class cloud services that can be ported across hybrid cloud environments. The IBM announcement "goes a long way" toward positioning OpenStack against other, more proprietary solutions, Jim Curry, senior vice president and general manager of Rackspace's Private Cloud business, tells DBTA.

Posted March 27, 2013

At the recent Strata conference, CitusDB showcased the latest release of its scalable analytics database. According to the vendor, CitusDB 2.0 brings together the performance of PostgreSQL and the scalability of Apache Hadoop, enabling real-time queries on data that is already in Hadoop. This new functionality is made possible by CitusDB's distributed query planner and PostgreSQL's foreign data wrappers.
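
To make the mechanism concrete, below is a minimal sketch of how a PostgreSQL-style foreign data wrapper exposes external data to ordinary SQL - the general technique the CitusDB release builds on. The wrapper, server, table, and connection names are hypothetical, not CitusDB's actual configuration:

    # Minimal sketch: querying Hadoop-resident data through a PostgreSQL
    # foreign data wrapper. All names here are hypothetical.
    import psycopg2

    conn = psycopg2.connect("dbname=analytics user=postgres")
    cur = conn.cursor()

    # Register a foreign server backed by a (hypothetical) HDFS wrapper,
    # then expose a Hadoop data set as a foreign table.
    cur.execute("""
        CREATE SERVER hadoop_cluster
            FOREIGN DATA WRAPPER hdfs_fdw
            OPTIONS (host 'namenode', port '8020');
    """)
    cur.execute("""
        CREATE FOREIGN TABLE page_views (
            user_id bigint,
            url     text,
            viewed  timestamp
        ) SERVER hadoop_cluster
          OPTIONS (path '/data/page_views');
    """)

    # Ordinary SQL now runs against data that still lives in Hadoop.
    cur.execute("SELECT url, count(*) FROM page_views GROUP BY url LIMIT 10;")
    for url, hits in cur.fetchall():
        print(url, hits)

    conn.commit()
    cur.close()
    conn.close()

Once a foreign table is declared, the planner treats Hadoop-resident data like any local table; that is the behavior CitusDB's distributed query planner extends across a cluster.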

Posted March 27, 2013

Two big questions are on the minds of data professionals these days: How are increasing complexity and the inevitable onslaught of big data shaping the future of database administrators and data architects? How will these roles change? In the interest of studying the evolving landscape of data, the Independent Oracle Users Group (IOUG) took the pulse of the community. The Big Data Skills for Success study polled individuals across the IOUG Oracle technology community to identify just how the responsibilities of handling data are changing and what the future of these roles looks like.

Posted March 14, 2013

Special Report: Gaining Maximum Advantage with MultiValue Technologies

Posted March 14, 2013

Data keeps growing, systems and servers keep sprawling, and users keep clamoring for more real-time access. The result of all this activity is pressure for faster, more effective data integration that can deliver more expansive views of information while still maintaining quality and integrity. Enterprise data and IT managers are responding in a variety of ways, looking to initiatives such as enterprise mashups, automation, virtualization, and cloud to pursue new paths to data integration. In the process, they are moving beyond the traditional means of integration they have relied on for years to pull data together.

Posted March 14, 2013

Databases are restricted by their reliance on disk-based storage, a technology that has been in place for several decades. Even with the addition of memory caches and solid state drives, the model of relying on repeated access to information storage devices remains a hindrance to capitalizing on today's "big data," according to a new survey of 323 data managers and professionals who are part of the Independent Oracle Users Group (IOUG). The survey was underwritten by SAP AG and conducted by Unisphere Research, a division of Information Today, Inc.

Posted March 14, 2013

A new survey of nearly 200 data managers and professionals who are part of the Independent Oracle Users Group (IOUG) looks at the role of the data scientist - a data professional who can aggregate data from internal enterprise data stores as well as outside sources to provide the forecasts and insight required to help lead an organization into the future. The research was conducted by Unisphere Research, a division of Information Today, Inc.

Posted February 27, 2013

The expansion of structured and unstructured data storage seems never-ending. At the same time, the pressure on database administrators to reduce storage consumption is accelerating as its cost becomes more visible. Today, however, there are data optimization technologies available that can help manage this continued data growth.

Posted February 27, 2013

SAP AG has introduced a new version of its Sybase IQ disk-based column store analytics server. The overriding theme of this new release, which will be generally available later in the first quarter, "is positioning IQ 16 to go from terabytes to petabytes," Dan Lahl, senior director of product marketing at SAP, tells 5 Minute Briefing. To accomplish this, IQ 16 provides enhancements in three critical areas.

Posted February 27, 2013

DataCore Software, a provider of storage virtualization software, has made enhancements to its SANsymphony-V Storage Hypervisor. The new capabilities are intended to support customers who are facing high data growth, as well as the need to enable faster response times and provide continuous availability for business-critical applications.

Posted February 27, 2013

HP announced two new software-as-a-service (SaaS) solutions intended to speed application delivery and improve visibility, collaboration and agility across often siloed or geographically dispersed application development and operations teams. HP Agile Manager accelerates application time to market with an intuitive, web-based experience that offers visibility for planning, executing and tracking Agile development projects; and HP Performance Anywhere helps resolve application performance issues before they impact business services by providing visibility and predictive analytics.

Posted February 27, 2013

Oracle president Mark Hurd and Oracle executive vice president of product development Thomas Kurian recently hosted a conference call to provide an update on Oracle's cloud strategy and a recap of product-related developments. Oracle is trying to do two things for customers - simplify their IT and power their innovation, said Hurd.

Posted February 27, 2013

Hortonworks, a leading contributor to Apache Hadoop, has released Hortonworks Sandbox, a learning environment and on-ramp for anyone interested in learning, evaluating or using Apache Hadoop in the enterprise. The tool seeks to bridge the gap between wanting to learn Hadoop and the complexity of setting up a cluster, offering an integrated environment with demos, videos, and tutorials.

Posted February 27, 2013

Having vast amounts of data at hand doesn't necessarily help executives make better decisions. In fact, without a simple way to access and analyze the astronomical amounts of available information, it is easy to become frozen with indecision, knowing the answers are likely in the data but unsure how to find them. With so many companies claiming to offer salvation from all data issues, one of the most important factors to consider when selecting a solution is ease of use. An intuitive interface based on how people already operate in the real world is the key to adoption and usage throughout an organization.

Posted February 13, 2013

Delivering Information Faster: In-Memory Technology Reboots the Big Data Analytics World

Posted February 13, 2013

A profound shift is occurring in where data lives. Thanks to skyrocketing demand for real-time access to huge volumes of data—big data—technology architects are increasingly moving data out of slow, disk-bound legacy databases and into large, distributed stores of ultra-fast machine memory. The plummeting price of RAM, along with advanced solutions for managing and monitoring distributed in-memory data, means there are no longer good excuses to make customers, colleagues, and partners wait the seconds—or sometimes hours—it can take your applications to get data out of disk-bound databases. With in-memory, microseconds are the new seconds.

Posted February 13, 2013

The explosion of big data has presented many challenges for today's database administrators (DBAs), who are responsible for managing far more data than ever before. And with more programs being developed and tested, more tools are needed to help optimize data handling and efficiency efforts. Using techniques such as DB2's Multi-Row Fetch (MRF), DBAs can cut down on CPU time and improve application efficiency. MRF was introduced in DB2 version 8 in 2004. Stated simply, it is the ability for DB2 to send multiple rows back to a requesting program at once, rather than one row at a time.
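
For illustration, true multi-row fetch is declared in embedded SQL, roughly: DECLARE C1 CURSOR WITH ROWSET POSITIONING FOR SELECT ..., then FETCH NEXT ROWSET FROM C1 FOR 100 ROWS into host-variable arrays. The Python sketch below shows the client-side analogue - pulling rows in blocks rather than one at a time - with illustrative connection details and the EMPLOYEE table from DB2's sample database:

    # Minimal sketch of rowset-style fetching against DB2. Connection
    # details are illustrative; fetchmany() is the DB-API analogue of
    # DB2's FETCH ... FOR n ROWS rowset fetch.
    import ibm_db_dbi  # IBM's DB-API driver for DB2

    conn = ibm_db_dbi.connect(
        "DATABASE=sample;HOSTNAME=db2host;PORT=50000;UID=user;PWD=secret"
    )
    cur = conn.cursor()
    cur.execute("SELECT empno, lastname FROM employee")

    ROWSET_SIZE = 100  # rows per block, mirroring FOR 100 ROWS
    while True:
        rows = cur.fetchmany(ROWSET_SIZE)  # one block, not 100 fetches
        if not rows:
            break
        for empno, lastname in rows:
            print(empno, lastname)

    cur.close()
    conn.close()

The CPU savings come from amortizing the per-fetch overhead across the whole rowset instead of paying it on every row.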

Posted January 24, 2013

Databases are hampered by a reliance on disk-based storage, a technology that has been in place for more than two decades. Even with the addition of memory caches and solid state drives, the model of relying on repeated access to permanent information storage devices is still a bottleneck in capitalizing on today's "big data," according to a new survey of 323 data managers and professionals who are part of the IOUG. Nearly 75% of respondents believe that in-memory technology is important to enabling their organization to remain competitive in the future. Yet almost as many also indicate they lack the in-memory skills to deliver even current business requirements. The research results are detailed in a new report, titled "Accelerating Enterprise Insights: 2013 IOUG In-Memory Strategies Survey."

Posted January 24, 2013

EMC Greenplum has qualified Attunity RepliWeb for Enterprise File Replication (EFR) and Attunity Managed File Transfer (MFT) with EMC Greenplum Hadoop (HD). Attunity RepliWeb for EFR and Attunity MFT are high-performance, easy-to-use solutions for automating, managing and accelerating the process of making data available for big data analytics with Hadoop. According to Attunity, the products, launched earlier this year, are the first and only solutions currently qualified by EMC for Greenplum HD. "Greenplum has come into the marketplace by storm and has had a strong vision of being data-independent or data-agnostic. They want to make sure that their analytic platform can handle both structured and unstructured data and this aligns very well with Attunity's mission statement of any data, any time, anywhere," Matt Benati, vice president of Global Marketing at Attunity, tells DBTA.

Posted January 24, 2013

Enterprise NoSQL database provider MarkLogic Corporation has partnered with business intelligence vendor Tableau Software to offer analytics and visualization over unstructured big data. The partnership allows business users to leverage Tableau's business intelligence and reporting solutions to access disparate sets of structured and unstructured data housed in a MarkLogic NoSQL database. "Not only can you build rich, sophisticated applications, but you can also make use of that data where it is, and have business users connect to that data, visualize it, and do analytics over it, without involving the development center," Stephen Buxton, MarkLogic's director of product management, tells DBTA.

Posted January 24, 2013

Oracle has merged the core capabilities of the Oracle Audit Vault and Oracle Database Firewall products, creating the new Oracle Audit Vault and Database Firewall product which expands protection beyond Oracle and third-party databases with support for auditing the operating system, directories and custom sources. "It is really one single, streamlined solution to do both security and compliance for Oracle and non-Oracle databases, and extending beyond databases, to operating systems, file systems, and directories - essentially the structure surrounding your database," notes Vipin Samar, vice president, Database Security, Oracle. "Data governance is increasingly important in many organizations and, as we know from the IOUG survey that we did earlier this year, we have very few organizations that are monitoring sensitive data access," adds Roxana Bradescu, director of product marketing, Data Security, Oracle.

Posted January 24, 2013

The recent explosion of digital data has affected businesses of all sizes and has opened opportunities for companies that adopt machine learning technology - including predictive analytics - to mine intelligence from data assets. Predictive analytics has the potential to transform traditional small to medium businesses (SMBs), which have the same desire to take better advantage of their data assets as larger organizations - but the process with which they can glean strategic value from that data is significantly different.

Posted January 03, 2013

The emergence of web-scale apps has put us in the midst of a database crisis. Mobile apps, cloud-based SaaS/PaaS architectures and the distributed nature of the web have forced the software industry to make difficult compromises on how it collects, processes and stores data. While traditional databases provide the power and simplicity of SQL and the reliability of ACID, they don't scale without herculean workarounds. Newer NoSQL solutions come close but don't quite make the last mile. They're designed to scale elastically, even on commodity hardware, but force developers to program powerful querying features into their applications and throw away years of accumulated SQL skills, tools and languages. To give developers a truly modern solution for this century, we need to rethink how we collect, process and store data. It's time for a revolution.

Posted January 03, 2013

As organizations strive to deliver always-on access to applications for users, it can be challenging to provide authorized access while simultaneously protecting against cyber-attacks. To address these challenges, two novel solutions combine the power of application delivery controllers (ADCs) with web access management (WAM) and database security technologies.

Posted January 03, 2013

For many years, enterprise data center managers have struggled to implement disaster recovery strategies that meet their RTO/RPO and business continuity objectives while staying within their budget. While the challenges of moving, managing, and storing massive data volumes for effective disaster protection have not changed, exponential data growth and the advent of big data technologies have made disaster recovery protection more difficult than ever before.

Posted December 19, 2012

Progress Software, a provider of software for connecting business applications to data and services, has released DataDirect Connect and DataDirect Connect XE for ODBC 7.1. The releases provide fast, reliable, secure and scalable connectivity for Apache Hive, as well as expanded support for cloud databases including Microsoft SQL Azure. DataDirect Connect and DataDirect Connect XE for ODBC 7.1 also offer new features for IBM DB2 10.1, IBM DB2 pureScale 9.8, Teradata 14, and Microsoft SQL Server 2012.
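
As a brief aside on usage, an application typically consumes such ODBC connectivity through a configured data source name. The sketch below shows the pattern; the DSN name, credentials, and table are hypothetical:

    # Minimal sketch of querying Hive (or any ODBC source) through a
    # configured DSN. The DSN name, credentials, and table are hypothetical.
    import pyodbc

    conn = pyodbc.connect("DSN=HiveDataDirect;UID=analyst;PWD=secret")
    cursor = conn.cursor()

    # The same calls work whether the DSN points at Hive, SQL Azure, DB2,
    # Teradata, or SQL Server - that portability is the point of ODBC.
    cursor.execute("SELECT category, COUNT(*) FROM sales GROUP BY category")
    for category, total in cursor.fetchall():
        print(category, total)

    cursor.close()
    conn.close()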

Posted December 19, 2012

Despite the rise of big data, data warehousing is far from dead. While traditional, static data warehouses may have indeed seen their day, an agile data warehouse — one that can map to the needs of the business and change as the business changes — is quickly on the rise. Many of the conversations today around big data revolve around volume, and while that is certainly valid, the issue is also about understanding data in context to make valuable business decisions. Do you really understand why a consumer takes action to buy? How do their purchases relate? When will they do it again? Big data alone is limited when it comes to answering these questions. An agile approach — one that gives even big data a life beyond its initial purpose — is the value data warehousing can bring to bear and is critical to long-term business success.

Posted December 19, 2012

Opportunities abound for organizations that are able to gain insight into customers, sales, markets, and processes by analyzing data culled from myriad sources across the enterprise. This rich information can enable executives, managers, and professionals to answer questions never possible before. As a result, companies that can provide business decision makers with quick, efficient access to BI or analytic data, from which they can create their own interfaces and reports, stand to realize a competitive advantage.

Posted December 19, 2012

The latest IOUG study on database security finds that measures need to be taken to safeguard data from internal abuse; however, preventing negligence or malfeasance by privileged users is a serious challenge. According to this year's study, human error has beaten out internal hackers and unauthorized users as the biggest security risk. In addition, more than half of respondents say their organizations still do not have, or are unaware of, data security plans to help address contingencies as they arise. These enterprise data security challenges, and more, are highlighted in a new survey of 350 data managers and professionals by the Independent Oracle Users Group. Underwritten by Oracle Corporation and conducted by Unisphere Research, a division of Information Today, Inc., it covered progress within three key areas of database security - prevention, detection, and administration.

Posted December 19, 2012

Attunity Ltd., a provider of information availability software solutions, has officially launched Attunity CloudBeam, a fully-managed data transfer SaaS platform for Amazon Web Services (AWS) Simple Storage Service (S3). With its beta completed, the high-performance data transfer solution was unveiled and demonstrated live at the AWS re:Invent Customer and Partner Conference in Las Vegas, NV. "This is aimed at folks today that are using AWS or will be using AWS for all kinds of use cases where data is core to their strategy," Matt Benati, vice president of Global Marketing at Attunity, tells DBTA. "All these use cases demand the movement of data and it really has to be a frictionless movement of data at scale. That is what Attunity does best."

Posted December 19, 2012

Enterprise NoSQL vendor MarkLogic recently brought its summit series to New York. Themed as "Big Data, Beyond the Hype: Delivering Results," the one-day conference included presentations by MarkLogic executives as well as partners and customers. In his opening keynote, CEO Gary Bloom highlighted the need for a next-generation database to address the problems and opportunities posed by big data, while also cautioning that there are "a lot of shiny objects" in the market now trying to capture people's attention that may not deliver the necessary results.

Posted December 19, 2012

In the never-ending battle for enterprise data security, industry experts say there has been progress on several fronts, but there is still much work that needs to be done. There is an enormous amount of data that tends to leak out of the secure confines of data centers, creating a range of security issues. "There are many copies of data which have less security and scrutiny than production environments," Joseph Santangelo, principal consultant with Axis Technology, tells DBTA. "The increased reliance on outsourcers and internal contractors leave sensitive data within corporate walls open to misuse or mistakes." Or, as another industry expert describes it, the supply chain often proves to be the greatest vulnerability for data security. "A typical organization has a direct relationship with only 10% of the organizations in its supply chain — the other 90% are suppliers to suppliers," Steve Durbin, global vice president of the Information Security Forum, tells DBTA.

Posted December 06, 2012

Protecting databases using encryption is a basic data security best practice and a regulatory compliance requirement in many industries. Databases represent the hub of an information supply chain. However, securing only the hub by encrypting the database leaves security gaps, because sensitive data also exists alongside the database in temporary files, Extract-Transform-Load (ETL) data, debug files, log files, and other secondary sources. According to the "Verizon 2011 Payment Card Industry Compliance Report," unencrypted data that resides outside databases is commonly stolen by hackers because it is easier to access.

Posted December 06, 2012

Best Practices Special Section on NoSQL, NewSQL, and Hadoop: Bringing Big Data into the Enterprise Fold

Posted December 06, 2012

While no one can dispute the importance of enterprise resource planning (ERP) systems to organizational performance and competitiveness, executives in charge of these systems are under intense pressure to stay within or trim budgets. Close to half of the executives in a new survey say they have held off on new upgrades for at least a few years. In the meantime, at least one out of four enterprises either are scaling back or have had to scale back their recent ERP projects due to budget constraints.

Posted December 06, 2012

For years, data warehouses and extract, transform and load (ETL) have been the primary methods of accessing and archiving multiple data sources across enterprises. Now, an emerging approach - data virtualization - promises to advance the concept of the federated data warehouse to deliver more timely and easier-to-access enterprise data. These are some of the observations made at Composite Software's third Annual Data Virtualization Day, held in New York City. This year's gathering was the largest ever, with nearly 250 customers and practitioners in attendance, Composite reports.

Posted November 13, 2012

TVSN is a 24x7x365 television shopping network that sells clothing, health and beauty aids, electronics, home furnishings, collectibles, and jewelry in Australia. Customers can place orders at any hour of the day or night, any way they desire - by phone or online. Since TVSN is always open and always on, downtime is simply not an option. Originally available only on cable TV, TVSN, in combination with Network Ten Australia, flipped the switch at 8:30 am on October 24, 2012, to make television shopping available to anyone in Australia who has a television. As a result, TVSN now reaches 6.5 million households. TVSN has relied on Revelation Software since the late 1980s, when it was called Demtel, a telemarketing company that was one of the first to run infomercial-style ads Down Under, using a Revelation G application in the call center and warehouse.

Posted November 13, 2012

Companies are facing serious external challenges managing aging IT infrastructures and application portfolios. To decrease costs and risks while increasing flexibility and innovation, many are turning to cloud technologies. By adopting cloud platforms, companies enable the delivery of "everything as-a-service." This empowers the workforce with faster any-device access to solutions that are available, affordable and ready to use. However, in order to realize cloud computing benefits, organizations must first transform and modernize their applications portfolios. Modernization is a key to business success and a significant challenge for chief information officers (CIOs).

Posted November 13, 2012

Software operates the products and services that we use and rely on in our daily lives. It is often the competitive differentiation for the business. As software increases in size, complexity, and importance to the business, so do the business demands on development teams. Developers are increasingly accountable to deliver more innovation, under shorter development cycles, without negatively impacting quality. Compounding this complexity is today's norm of geographically distributed teams and code coming in from third-party teams. With so many moving parts, it's difficult for management to get visibility across their internal and external supply chain. Yet, without early warning into potential quality risks that could impact release schedules or create long term technical debt, there may be little time to actually do something about it before the business or customers are impacted.

Posted October 24, 2012

The recent explosion of mobile applications has dramatically altered the consumer landscape, making it the norm for users and customers alike to expect access and support anytime, anywhere. With Cisco recently reporting that mobile-connected devices are set to exceed the world's population this year, it's no surprise that the surge is overflowing into the enterprise. While there are a few leading innovators in enterprise mobility, the vast majority of businesses are still struggling to take the first steps towards a streamlined strategy. The question is no longer "Do we?" but "How do we?"

Posted October 24, 2012

Melissa Data, a provider of contact data quality and direct marketing solutions, has announced Personator, an integrated data quality web service designed to provide identity verification and fraud prevention for e-commerce applications. Personator offers the ability to determine whether associations between different elements, such as name and address, are correct, thereby increasing accuracy by ensuring a valid and correct link between the data and identity of individual customer contacts.

Posted October 24, 2012

Simba Technologies has partnered with Hortonworks, a commercial vendor of Apache Hadoop, to provide ODBC access to Hortonworks Data Platform. The use of Simba's Apache Hive ODBC Driver with SQL Connector is aimed at providing Hortonworks customers with easy access to their data for BI and analytics using the SQL-based application of their choice. "Simba's Apache Hive ODBC Driver technology makes it easier for our customers to harness the power of their big data using popular and familiar BI and analytics applications," says Shaun Connolly, Hortonworks' VP, Strategy.

Posted October 24, 2012

Percona Live 2012, a MySQL conference, was held in New York City. With nearly 300 attendees participating, the first day of the event featured tutorials with in-depth presentations on specific topics, while the second day focused on conference sessions. Also new at Percona Live this year was an exhibit hall for MySQL ecosystem participants to put their products on display and network with potential customers. Sponsors included Clustrix, Continuent, ScaleArc, Nimbus Data, Fusion-io, Tokutek, Codership, Couchbase, Akiban, Ospero, ParElastic, SkySQL, ScaleBase, and New Relic.

Posted October 24, 2012

Application performance management (APM) software provider Precise has announced the availability of Precise 9.5, a major product release designed to help organizations deliver a better experience for customers using their cloud and mobile applications. Precise 9.5 rapidly detects and analyzes application problems resulting from server and storage virtualization resource contention, and also addresses the challenge of managing mobile traffic growth. The new release focuses on three key themes, all with the common goal of identifying and resolving potential problems before they can affect the customer experience or cause an outage, Sherman Wood, vice president of product at Precise, tells DBTA.

Posted October 24, 2012

It is an understatement to say we're witnessing an example of Moore's Law — which states the number of transistors on a chip will double approximately every two years — as we seek to manage the explosion of big data. Given the impact this new wealth of information has on hundreds of millions of business transactions, there's an urgent need to look beyond traditional insight-generation tools and techniques. It's critical we develop new tools and skills to extract the insights that organizations seek through predictive analytics.

Posted October 10, 2012

When virtualization was first born, IT departments went gangbusters, using this revolutionary change to get better performance out of their servers. In all the excitement of implementation, something not so very small was overlooked — backup and recovery. The lack of proper planning caused backup jobs and recoveries to fail, and backup admins started feeling backed into a corner. Thankfully, times have changed, and IT departments, now very aware of these issues, have gotten savvy at avoiding the potential pains of virtualization infrastructure. But a new challenge has emerged.

Posted October 10, 2012

Hadoop and the Big Data Revolution

Posted October 10, 2012

Best Practices Special Section: Insight at Last - Master Data Management Emerges to Tackle Big Data

Posted September 26, 2012

Enterprise NoSQL database company MarkLogic Corporation has rolled out a new version of its flagship product, MarkLogic 6, which includes new tools for faster application development, improved analytics and new visualization widgets to enable greater insight, and the ability to create user-defined functions for fast and flexible analysis of extremely large volumes of data. Key features of MarkLogic's NoSQL database include ACID transactions, horizontal scaling, real-time indexing, high availability, disaster recovery, government-grade security, and built-in search. With this release, in addition to MarkLogic's NoSQL flexibility, the company is focused on building features into the product that make it easier to use and more accessible to a wider group of users within the enterprise.

Posted September 26, 2012

Percona, Inc. has announced the latest release of Percona Server, which it describes as its "enhanced drop-in replacement for MySQL." According to the company, Percona Server Version 5.5.27-28.0 includes new features that make it more valuable as an alternative for MySQL users. Offered free as an open source solution, Percona Server has self-tuning algorithms and support for high-performance hardware. In addition, the company is planning two-day Percona Live events in New York City in October and London in December, with speakers and tutorials spanning multiple tracks across the MySQL ecosystem. A more expansive, four-day conference, Percona Live MySQL Conference and Expo 2013, is planned for Santa Clara in April.

Posted September 26, 2012
