Big Data Articles
Terracotta, a provider of enterprise big data management solutions, and JackBe, a real-time intelligence software vendor, have announced a collaboration to leverage their complementary technologies for real-time big data solutions. According to the vendors, JackBe's Presto real-time data visualization for analytics, coupled with the high performance and scalability of Terracotta's BigMemory, enables JackBe to deliver visual data exploration and dashboards with the speed, scale and simplicity of in-memory data management.
Posted February 19, 2013
Having vast amounts of data at hand doesn't necessarily help executives make better decisions. In fact, without a simple way to access and analyze the astronomical amounts of available information, it is easy to become frozen with indecision, knowing the answers are likely in the data but unsure how to find them. With so many companies claiming to offer salvation from all data issues, one of the most important factors to consider when selecting a solution is ease of use. An intuitive interface based on how people already operate in the real world is the key to adoption and usage throughout an organization.
Posted February 13, 2013
A profound shift is occurring in where data lives. Thanks to skyrocketing demand for real-time access to huge volumes of data—big data—technology architects are increasingly moving data out of slow, disk-bound legacy databases and into large, distributed stores of ultra-fast machine memory. The plummeting price of RAM, along with advanced solutions for managing and monitoring distributed in-memory data, means there are no longer good excuses to make customers, colleagues, and partners wait the seconds—or sometimes hours—it can take your applications to get data out of disk-bound databases. With in-memory, microseconds are the new seconds.
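The core pattern behind this shift is simple: keep hot data in RAM and fall back to the disk-bound system only on a miss. Below is a minimal, single-process sketch of that read-through pattern in Java; the ReadThroughCache class and its loader function are illustrative only, not any vendor's API, and distributed in-memory products generalize the same idea across the pooled RAM of many machines.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Read-through cache: repeat reads are served from RAM (sub-microsecond)
// instead of a disk-bound database (milliseconds or more per round trip).
public final class ReadThroughCache<K, V> {
    private final Map<K, V> memory = new ConcurrentHashMap<>();
    private final Function<K, V> slowLoader; // e.g., a SQL query against the legacy database

    public ReadThroughCache(Function<K, V> slowLoader) {
        this.slowLoader = slowLoader;
    }

    public V get(K key) {
        // Only a miss pays the disk-bound cost; every later read of the
        // same key is a plain in-memory hash lookup.
        return memory.computeIfAbsent(key, slowLoader);
    }
}
```

The first call to get() for a key pays the database round trip; every subsequent call returns from memory, which is the sense in which microseconds become the new seconds.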
Posted February 13, 2013
Enterprise NoSQL database provider MarkLogic is making available a free developer license for MarkLogic Enterprise Edition. The developer license provides access to MarkLogic Enterprise Edition features such as integrated search, government-grade security, clustering, replication, failover, alerting, geospatial indexing, conversion, as well as a set of application development tools. MarkLogic is also introducing a Java-based tool for importing data from MongoDB into MarkLogic.
Posted February 12, 2013
Hortonworks, a leading contributor to Apache Hadoop, has released Hortonworks Sandbox, a learning environment and on-ramp for anyone interested in learning, evaluating or using Apache Hadoop in the enterprise. The tool seeks to close the gap between wanting to learn Hadoop and the complexity of setting up a cluster, providing an integrated environment with demos, videos, and tutorials.
Posted February 05, 2013
IT management company ManageEngine has announced that its on-premise application performance monitoring solution, Applications Manager, now supports the NoSQL databases Apache Cassandra and MongoDB. The solution monitors application performance and provides operational intelligence for NoSQL technologies, in addition to its existing support for traditional relational databases such as Oracle and MySQL, as well as for memcached.
Posted February 05, 2013
IBM has signed an agreement to acquire the software portfolio of Star Analytics Inc., a privately held business analytics company headquartered in Redwood City, California. Financial terms of the acquisition, which is expected to be completed in the first quarter of 2013, were not disclosed. According to IBM, Star Analytics software helps organizations automatically integrate essential information, reporting applications and business intelligence tools across their enterprises, on premises or in cloud computing environments. The software eliminates the hard-to-maintain custom coding typically required for specialized sources, along with cumbersome, time-consuming manual processes.
Posted February 01, 2013
Actian Corp. and Pervasive Software Inc. have entered into a definitive merger agreement through which Actian will acquire all of Pervasive's outstanding shares for $9.20 per share. Actian's products include Action Apps; Vectorwise, its analytical database; Ingres, an independent, mission-critical OLTP database; and the Versant Object Database, which Actian added to its portfolio through a recent merger in which it acquired all the outstanding shares of Versant Corporation. According to the company, the deal values Pervasive at $161.9 million and will accelerate Actian's ability to deliver on its vision of providing organizations with the capability to take action in real time as their business environment changes.
Posted January 31, 2013
EMC Corporation has updated its appliance-based unified big data analytics offering. The new EMC Greenplum Data Computing Appliance (DCA) Unified Analytics Platform (UAP) Edition expands the system's analytics capabilities and solution flexibility, achieves performance gains in data loading and scanning, and adds integration with EMC's Isilon scale-out NAS storage for enterprise-class data protection and availability. Within a single appliance, the DCA integrates Greenplum Database for analytics-optimized SQL, Greenplum HD for Hadoop-based processing, as well as Greenplum partner business intelligence, ETL, and analytics applications. The appliances have been able to host both a relational database and Hadoop for some time now, Bill Jacobs, director of product marketing for EMC Greenplum, tells 5 Minute Briefing. "The significance of this launch is that we tightened that integration up even more. We make those two components directly manageable with a single administrative interface and also tighten up the security. All of that is targeted at giving enterprise customers what they need in order to use Hadoop in very mission-critical applications without having to build it all up in Hadoop themselves."
Posted January 31, 2013
SAP announced a new option for SAP Business Suite customers — SAP Business Suite powered by SAP HANA — providing an integrated family of business applications that captures and analyzes transactional data in real time on a single in-memory platform. With Business Suite on HANA, "SAP has reinvented the software that reinvented businesses," stated Rob Enslin, member of the Global Executive Board and SAP head of sales, as part of his presentation during the company's recent launch event.
Posted January 30, 2013
The explosion of big data has presented many challenges for today's database administrators (DBAs), who are responsible for managing far more data than ever before. And with more programs being developed and tested, more tools are needed to help optimize database performance and efficiency. Using techniques such as DB2's Multi-Row Fetch (MRF), DBAs can cut down on CPU time - and improve application efficiency. MRF was introduced in DB2 version 8 in 2004. Stated simply, it is the ability for DB2 to send multiple rows back to a requesting program at once, rather than one row at a time.
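In embedded SQL on DB2 for z/OS, MRF appears as rowset cursors: a cursor is declared WITH ROWSET POSITIONING and drained with statements like FETCH NEXT ROWSET FROM C1 FOR 100 ROWS. From a Java application, the closest analogue is the JDBC fetch-size hint, which asks the driver to move rows in blocks rather than one network round trip per row. The sketch below illustrates the idea; the connection URL, table, and column names are hypothetical, and whether the driver actually uses multi-row fetch under the covers depends on the driver and server configuration.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class MultiRowFetchDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details, for illustration only.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:db2://dbhost:50000/SAMPLE", "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                 "SELECT ORDER_ID, AMOUNT FROM ORDERS WHERE STATUS = 'OPEN'")) {

            // Hint the driver to transfer rows in blocks of 100 rather than
            // one at a time - the client-side counterpart of multi-row fetch.
            ps.setFetchSize(100);

            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // Rows still arrive one at a time to the application,
                    // but the driver fills its buffer a rowset at a time.
                    System.out.println(rs.getInt("ORDER_ID") + " " + rs.getBigDecimal("AMOUNT"));
                }
            }
        }
    }
}
```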
Posted January 29, 2013
Databases are hampered by a reliance on disk-based storage, a technology that has been in place for more than two decades. Even with the addition of memory caches and solid state drives, the model of relying on repeated access to permanent storage devices remains a bottleneck in capitalizing on today's "big data," according to a new survey of 323 data managers and professionals who are members of the Independent Oracle Users Group (IOUG). Nearly 75% of respondents believe that in-memory technology is important to enabling their organization to remain competitive in the future. Yet, almost as many also indicate they lack the in-memory skills to deliver even current business requirements. The research results are detailed in a new report, titled "Accelerating Enterprise Insights: 2013 IOUG In-Memory Strategies Survey."
Posted January 24, 2013
EMC Greenplum has qualified Attunity RepliWeb for Enterprise File Replication (EFR) and Attunity Managed File Transfer (MFT) with EMC Greenplum Hadoop (HD). Attunity RepliWeb for EFR and Attunity MFT are high-performance, easy-to-use solutions for automating, managing and accelerating the process of making data available for big data analytics with Hadoop. According to Attunity, the products, launched earlier this year, are the first and only solutions currently qualified by EMC for Greenplum HD. "Greenplum has come into the marketplace by storm and has had a strong vision of being data-independent or data-agnostic. They want to make sure that their analytic platform can handle both structured and unstructured data and this aligns very well with Attunity's mission statement of any data, any time, anywhere," Matt Benati, vice president of Global Marketing at Attunity, tells DBTA.
Posted January 24, 2013
Enterprise NoSQL database provider MarkLogic Corporation has partnered with business intelligence vendor Tableau Software to offer analytics and visualization over unstructured big data. The partnership allows business users to leverage Tableau's business intelligence and reporting solutions to access disparate sets of structured and unstructured data housed in a MarkLogic NoSQL database. "Not only can you build rich, sophisticated applications, but you can also make use of that data where it is, and have business users connect to that data, visualize it, and do analytics over it, without involving the development center," Stephen Buxton, MarkLogic's director of product management, tells DBTA.
Posted January 24, 2013
For many years, enterprise data center managers have struggled to implement disaster recovery strategies that meet their recovery time and recovery point objectives (RTOs/RPOs) and business continuity goals while staying within budget. While the challenges of moving, managing, and storing massive data volumes for effective disaster protection have not changed, exponential data growth and the advent of big data technologies have made disaster recovery protection more difficult than ever before.
Posted December 19, 2012
Despite the rise of big data, data warehousing is far from dead. While traditional, static data warehouses may have indeed seen their day, an agile data warehouse — one that can map to the needs of the business and change as the business changes — is quickly on the rise. Many of today's conversations around big data revolve around volume, and while that is certainly valid, the issue is also about understanding data in context in order to make valuable business decisions. Do you really understand why a consumer takes action to buy? How do their purchases relate? When will they do it again? Big data is limited when it comes to answering these questions. An agile approach — one that gives even big data a life beyond its initial purpose — is the value data warehousing can bring to bear, and it is critical to long-term business success.
Posted December 19, 2012
For years, data warehouses and extract, transform and load (ETL) have been the primary methods of accessing and archiving multiple data sources across enterprises. Now, an emerging approach - data virtualization - promises to advance the concept of the federated data warehouse to deliver more timely and easier-to-access enterprise data. These are some of the observations made at Composite Software's third Annual Data Virtualization Day, held in New York City. This year's gathering was the largest ever, with nearly 250 customers and practitioners in attendance, Composite reports.
Posted November 13, 2012
Attunity Ltd., a provider of information availability software solutions, has released Attunity Managed File Transfer (MFT) for Hadoop. The new enterprise data transfer solution is designed to accelerate big data collection processes and integrate them seamlessly into and out of Hadoop. MFT enables organizations to collect and transfer big data in both the cloud and enterprise data centers for strategic initiatives including log and machine-data analytics, business intelligence, and data archiving. "Attunity MFT for Hadoop, the first of several Hadoop solutions that Attunity will unveil, is designed to deliver on the great promise of Hadoop by helping organizations achieve faster time-to-value for big data analytics projects," says Matt Benati, VP Global Marketing at Attunity.
Posted November 09, 2012
The opportunities and challenges presented by big data are addressed in a new report summarizing the results of a survey of data managers and professionals who are part of the Independent Oracle Users Group. The survey was underwritten by Oracle Corporation and conducted by Unisphere Research, a division of Information Today, Inc. Key highlights from the survey include the finding that more than one out of 10 data managers now have in excess of a petabyte of data within their organizations, and a majority of respondents report their levels of unstructured data are growing.
Posted October 24, 2012
Business analytics vendor OpTier has released OpTier APM 5.0, a solution that provides real-time business transaction analytics and deep diagnostics, enabling improved visibility and gains in productivity. OpTier also released a new Big Data Analytics solution that takes advantage of OpTier's business transaction-based platform, providing real-time big data analytics already in context and reducing time and cost.
Posted October 01, 2012
Data management vendor Terracotta, Inc. has released BigMemory Go, the latest addition to the BigMemory product line, which allows customers to put as much data in memory as desired to speed application performance at big data scale. The product is offered under a free production license covering 32GB per instance, deployable on as many servers as desired.
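BigMemory Go is accessed through the familiar Ehcache API, so putting data in memory looks like ordinary cache calls. Below is a rough sketch under that assumption; it presumes an ehcache.xml on the classpath defining a cache named "quotes," which in BigMemory deployments is also where the off-heap size is configured (for example, via a maxBytesLocalOffHeap setting), up to the licensed 32GB per instance.

```java
import net.sf.ehcache.Cache;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Element;

public class BigMemorySketch {
    public static void main(String[] args) {
        // Loads the ehcache.xml found on the classpath.
        CacheManager manager = CacheManager.newInstance();
        Cache quotes = manager.getCache("quotes");

        quotes.put(new Element("ORCL", 35.71)); // write into the in-memory store
        Element hit = quotes.get("ORCL");       // served from RAM, no disk I/O
        if (hit != null) {
            System.out.println(hit.getObjectValue());
        }
        manager.shutdown();
    }
}
```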
Posted September 25, 2012
The first computer program I ever wrote (in 1979, if you must know) was in the statistical package SPSS (Statistical Package for the Social Sciences), and the second computer platform I used was SAS (Statistical Analysis System). Both of these systems are still around today—SPSS was acquired by IBM as part of its BI portfolio, and SAS is now the world's largest privately held software company. The longevity of these platforms—they have essentially outlived almost all contemporary software packages—speaks to the perennial importance of data analysis to computing.
Posted September 19, 2012
Oracle announced enhanced support for the R statistical programming language, including new platform ports of R for Oracle Solaris and AIX in addition to Linux and Windows, connectivity to Oracle TimesTen In-Memory Database in addition to Oracle Database, and integration of hardware-specific Math libraries for faster performance. "Big data analytics is a top priority for our customers, and the R statistical programming language is a key tool for performing these analytics," says Andrew Mendelsohn, senior vice president, Oracle Database Server Technologies.
Posted September 12, 2012
Business intelligence software vendor Actuate has partnered with VoltDB, provider of ultra-high-throughput relational database systems, to offer a solution that will allow ActuateOne and VoltDB customers to process their big data more quickly and effectively, resulting in improved insights. The VoltDB-ActuateOne alliance is expected to substantially reduce the time from big data access to operational insights, providing customers with a competitive advantage in the market and improving the bottom line.
Posted September 11, 2012
Big data and cloud analytics vendor Kognitio has partnered with Xtremeinsights, a provider of solutions for leveraging Hadoop in existing data management systems. Together, the partners aim to deliver software and integration technologies to businesses that want to leverage the Hadoop platform and gain actionable insights from their big data. Using its in-memory analytical platform, Kognitio speeds up the analysis of data from Hadoop clusters, enabling ad hoc, real-time analytics at a significantly lower cost. "Xtremeinsights can build the underlying infrastructure so that your business users can do ad hoc analysis on ridiculous amounts of data and get answers in real-time," Michael Hiskey, Kognitio's vice president of marketing and business development, tells DBTA.
Posted August 23, 2012
Pentaho's Business Analytics 4.5 is now certified on Cloudera's latest releases, Cloudera Enterprise 4.0 and CDH4. Pentaho also announced that its visual design studio capabilities have been extended to the Sqoop and Oozie components of Hadoop. "Hadoop is a very broad ecosystem. It is not a single project," Ian Fyfe, chief technology evangelist at Pentaho, tells DBTA. "Sqoop and Oozie are shipped as part of Cloudera's distribution so that is an important part of our support for Cloudera as well - providing that visual support which nobody else in the market does today."
Posted August 23, 2012
ParAccel, an enterprise analytics platform provider, has announced the general availability of ParAccel 4.0. "If you consider how analytics have traditionally been run - offline, static, standalone - there's a large gap that needs to be filled if organizations want to meet 21st century demands," says Chuck Berger, CEO at ParAccel. The new release builds on ParAccel's expertise to help organizations achieve high-performance, interactive big data analytics with improved speed and reliability.
Posted August 14, 2012
Cloud operating system provider Nimbula has unveiled its elastic Hadoop solution with MapR Technologies, allowing users to run their Hadoop clusters on private clouds. The elasticity and multi-tenancy of Nimbula Director, paired with the dependability and security of the MapR Hadoop distribution, allow for a fully functional and highly available Hadoop cluster on a single pool of infrastructure. "What we're trying to achieve is have the power of Hadoop on top of a private cloud and bringing the best of each world to the customer," Reza Malekzadeh, Nimbula's vice president of marketing & sales, tells 5 Minute Briefing. Customers can run Hadoop and non-Hadoop workloads on the same shared infrastructure.
Posted August 14, 2012
Syncsort, a global leader in high-performance data integration solutions, has certified its DMExpress data integration software for high-performance loading of Greenplum Database. Syncsort has also joined the Greenplum Catalyst Developer Program. Syncsort DMExpress software delivers extensive connectivity that makes it easy to extract and transform data from nearly any source, and rapidly load it into the massively parallel processing (MPP) Greenplum Database without the need for manual tuning or custom coding. "IT organizations of all sizes are struggling to keep pace with the spiraling infrastructure demands created by the sheer volume, variety and velocity of big data," says Mitch Seigle, vice president, Marketing and Product Management, Syncsort.
Posted July 25, 2012
Hyve Solutions, a division of SYNNEX Corporation, has entered a software licensing agreement with IBM to offer IBM InfoSphere BigInsights software with its BigD family of systems hardware. The turnkey platform is intended to provide mid-market clients with an enterprise-class big data system to help them quickly deploy Hadoop-based analytics without the need for on-premise professional services or developers. To achieve enterprise-class standards in BigD systems, Hyve Solutions and IBM collaborated with Zettaset, Inc. to build in safeguards that provide service management, failover and restart, as well as alerting and monitoring features.
Posted June 19, 2012
The term "big data" refers to the massive amounts of data being generated on a daily basis by businesses and consumers alike - data which cannot be processed using conventional data analysis tools owing to its sheer size and, in many case, its unstructured nature. Convinced that such data hold the key to improved productivity and profitability, enterprise planners are searching for tools capable of processing big data, and information technology providers are scrambling to develop solutions to accommodate new big data market opportunities.
Posted May 23, 2012
Organizations are struggling with big data, which they define as any large-size data store that becomes unmanageable by standard technologies or methods, according to a new survey of 264 data managers and professionals who are subscribers to Database Trends and Applications. The survey was conducted by Unisphere Research, a division of Information Today, Inc., in partnership with MarkLogic in January 2012. Among the key findings uncovered by the survey is the fact that unstructured data is on the rise, and ready to engulf current data management systems. Added to that concern, say respondents, is their belief that management does not understand the challenge that is looming, and is failing to recognize the significance of unstructured data assets to the business.
Posted May 09, 2012
Zettaset has announced SHadoop, a new security initiative designed to improve security for Hadoop. The new initiative will be incorporated as a security layer into Zettaset's Hadoop Orchestrator data management platform. The SHadoop layer is intended to mitigate architectural and input validation issues that exist within the core Hadoop code, and improve upon user role audit tracking and user level security.
Posted April 12, 2012
RainStor, a provider of big data management software, is joining with IBM in the big data market. RainStor will work with IBM to deliver a solution that combines IBM's enterprise-class, Hadoop-based product, InfoSphere BigInsights, with RainStor's Big Data Analytics on Hadoop product to enable faster, more flexible analytics on multi-structured data, without the need to move data out of the Hadoop environment. According to the vendors, the new combined solution can reduce the TCO for customers by significantly reducing physical storage, and also improving the performance of querying and analyzing big data sets across the enterprise.
Posted March 27, 2012
At the Strata Conference today, Calpont announced InfiniDB 3, the latest release of its high-performance analytic database. Designed from the ground up for large-scale, high-performance dimensional analytics, predictive analytics, and ad hoc business intelligence, the new release includes capabilities to capitalize on a variety of data structures and deployment variations to meet organizations' need for a flexible and scalable big data architecture.
Posted February 29, 2012
Composite Software has introduced version 6.1 of its Composite Data Virtualization Platform. The new release offers improved caching performance, expanded caching targets, data ship join for Teradata, and Hadoop MapReduce connectivity. Composite 6.1 also provides improvements to the data services development environment with an enhanced data services editor and new publishing options for Representational State Transfer (REST) and Open Data Protocol (OData) data services.
Posted February 17, 2012
RainStor, a provider of big data management software, has unveiled RainStor Big Data Analytics on Hadoop, which the company describes as the first enterprise database running natively on Hadoop. It is intended to enable faster analytics on multi-structured data without the need to move data out of the Hadoop Distributed File System (HDFS) environment. There is architectural compatibility between the way RainStor manages data and the way HDFS manages CSV files, says Deirdre Mahon, vice president of marketing at RainStor.
Posted January 25, 2012
The Oracle Big Data Appliance, an engineered system of hardware and software that was first unveiled at Oracle OpenWorld in October, is now generally available. The new system incorporates Cloudera's Distribution Including Apache Hadoop (CDH3) with Cloudera Manager 3.7, plus an open source distribution of R. The Oracle Big Data Appliance represents "two industry leaders coming together to wrap their arms around all things big data," says Cloudera COO Kirk Dunn.
Posted January 25, 2012