Big Data

The well-known three Vs of Big Data (Volume, Variety, and Velocity) are placing increasing pressure on organizations that must manage this data as well as extract value from the deluge for predictive analytics and decision-making. Big Data technologies, services, and tools, such as Hadoop, MapReduce, Hive, NoSQL/NewSQL databases, data integration techniques, in-memory approaches, and cloud technologies, have emerged to help meet the challenges posed by the flood of web, social media, Internet of Things (IoT), and machine-to-machine (M2M) data flowing into organizations.



Big Data Articles

Big data, a now well-used term intended to capture the growing volume, variety, velocity, and value of information surging through organizations, has been on our radar screens for more than two years. In that time it has become more than a buzz phrase thrown about at conferences and in the trade press; big data is now seen as the core of enterprise growth strategies.

Posted April 10, 2013

DBTA and Tableau Software will present a webcast on the Top Trends in Business Intelligence in 2013 on Thursday, April 11, at 11 am PT/2 pm ET. This webcast will enable attendees to better understand how visual analytics is helping organizations act on big data, how to realize the value of unstructured data, the new role of cloud in business intelligence, and how to leverage mobile BI.

Posted April 09, 2013

Global software vendor Progress Software has expanded its big data connectivity capabilities with the release of ODBC driver technologies that make it easier to leverage the newest data warehousing applications. Support for HiveServer2 and the Cloudera CDH 4.1 Hadoop distribution in Progress DataDirect Connect XE for ODBC 7.1 is aimed at reducing complexity for developers and providing fast, reliable, and secure access from multiple data sources.

Posted April 09, 2013
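
For context on what ODBC access to Hive looks like in practice, here is a minimal sketch from Python. It assumes a data source name (here "HiveCDH4") has already been configured against a Hive ODBC driver; the DSN name, table, and columns are invented for illustration and are not DataDirect-specific.

    # Minimal sketch of querying Hive over ODBC from Python. Assumes an
    # ODBC DSN named "HiveCDH4" is already configured against a Hive
    # driver; the DSN name, table, and columns are illustrative.
    import pyodbc

    conn = pyodbc.connect("DSN=HiveCDH4", autocommit=True)
    cursor = conn.cursor()
    cursor.execute("SELECT page, COUNT(*) AS hits FROM weblogs GROUP BY page")
    for page, hits in cursor.fetchall():
        print(page, hits)
    conn.close()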

JackBe, a provider of real-time intelligence, and Axeda, a cloud-based service and software vendor, have announced a partnership that unifies their complementary technologies to enhance the value of machine-to-machine (M2M) data. Through the JackBe-Axeda partnership, user-driven tooling from JackBe Presto combines with Axeda Machine Cloud's M2M event processing and data management.

Posted April 09, 2013

Serengeti 0.8.0 adds new capabilities that extend the reach of partner-supported Hadoop versions and features. Project Serengeti is an open source project sponsored by VMware that provides Hadoop users with easy-to-use management tools to provision, manage, and monitor Hadoop clusters on VMware vSphere.

Posted April 09, 2013

New technologies designed to help companies and governments tackle big data have been unveiled by IBM. The new technologies include the new IBM PureData System for Hadoop, designed to make it easier and faster to deploy Hadoop in the enterprise, and "BLU Acceleration," which is aimed at improving analytical performance in data management systems.

Posted April 05, 2013

Oracle Event Processing (OEP) for Oracle Java Embedded, a smaller footprint version of Oracle Event Processing tailored for deployment on gateways, has been introduced by Oracle. According to Oracle, OEP for Oracle Java Embedded is a solution for building embedded device applications to filter, correlate and process events in real-time so that downstream applications, services and event-driven architectures are driven by true, real-time intelligence.

Posted April 03, 2013
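
Oracle's own API is not shown in the announcement; the sketch below is only a generic illustration of the filter-and-correlate pattern it describes, with the threshold, window size, and sensor readings invented for the example.

    # Generic filter-and-correlate sketch in the spirit of the event
    # processing described above; this is not Oracle Event Processing code.
    from collections import deque
    from statistics import mean

    WINDOW = 10        # correlate each event against the last 10 readings
    THRESHOLD = 75.0   # filter: discard readings below this value

    recent = deque(maxlen=WINDOW)

    def on_event(sensor_id, value):
        if value < THRESHOLD:
            return                                  # filtered out
        recent.append(value)
        if len(recent) == WINDOW and value > 1.5 * mean(recent):
            print(f"alert: {sensor_id} spiked to {value}")

    for reading in [70, 80, 82, 81, 200, 83, 79, 84, 85, 90, 300]:
        on_event("gateway-1", reading)              # alert fires on 300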

Data keeps growing, systems and servers keep sprawling, and users keep clamoring for more real-time access. The result of all this frenzy of activity is pressure for faster, more effective data integration that can deliver more expansive views of information, while still maintaining quality and integrity. Enterprise data and IT managers are responding in a variety of ways, looking to initiatives such as enterprise mashups, automation, virtualization, and cloud to pursue new paths to data integration. In the process, they are moving beyond the traditional means of integration they have relied on for years to pull data together.

Posted April 03, 2013

Software integration vendor Talend has partnered with Caserta Concepts, a consulting and technology services firm specializing in data warehousing, business intelligence, and big data analytics, to help Caserta build highly scalable big data analytics solutions for its clients.

Posted April 02, 2013

Big data, in-memory analytics and cloud computing vendor Kognitio has announced that the Kognitio Analytical Platform enables new fully parallel not-only-SQL (NoSQL) capabilities, including the R language for statistical computing and graphics.

Posted April 02, 2013

Dell Software is rolling out the latest version of its Kitenga Analytics solution, which extends the analysis of structured, semi-structured and unstructured data stored in Hadoop. Kitenga was acquired by Dell along with Quest Software in September 2012.

Posted March 27, 2013

The Independent Oracle Users Group (IOUG) will celebrate its 20th anniversary at COLLABORATE 13, a conference on Oracle technology presented jointly by the IOUG, OAUG (Oracle Applications User Group) and the Quest International User Group. The event will be held April 7 to 11 at the Colorado Convention Center in Denver. As part of the conference, the IOUG will host the COLLABORATE 13-IOUG Forum with nearly 1,000 sessions providing user-driven content. The theme of this year's COLLABORATE 13-IOUG Forum is "Elevate - take control of your career and elevate your Oracle ecosystem knowledge and expertise," says IOUG president John Matelski.

Posted March 27, 2013

At the recent Strata conference, CitusDB showcased the latest release of its scalable analytics database. According to the vendor, CitusDB 2.0 brings together the performance of PostgreSQL and the scalability of Apache Hadoop, enabling real-time queries on data that is already in Hadoop. The new functionality is made possible by CitusDB's distributed query planner and PostgreSQL's foreign data wrappers.

Posted March 27, 2013
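
Foreign data wrappers are the PostgreSQL mechanism for exposing external data as ordinary tables. As a hedged illustration, the stock file_fdw wrapper (standing in here for CitusDB's Hadoop-facing wrapper, whose configuration differs) can be driven from Python roughly as follows; the connection string, file path, and schema are invented.

    # Illustration of PostgreSQL foreign data wrappers, the mechanism the
    # item above credits. Uses the stock file_fdw wrapper as a stand-in;
    # connection string, path, and schema are invented for the example.
    import psycopg2

    conn = psycopg2.connect("dbname=analytics")
    with conn, conn.cursor() as cur:
        cur.execute("CREATE EXTENSION IF NOT EXISTS file_fdw")
        cur.execute("CREATE SERVER files FOREIGN DATA WRAPPER file_fdw")
        cur.execute("""
            CREATE FOREIGN TABLE events (ts timestamp, url text)
            SERVER files OPTIONS (filename '/data/events.csv', format 'csv')
        """)
        cur.execute("SELECT url, COUNT(*) FROM events GROUP BY url")
        print(cur.fetchall())   # queried like a normal table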

Two big questions are on the minds of data professionals these days: How are increasing complexity and the inevitable onslaught of big data shaping the future of database administrators and data architects? How will our roles change? To study the evolving landscape of data, the Independent Oracle Users Group (IOUG) took the pulse of the community. The Big Data Skills for Success study polled numerous individuals in the IOUG Oracle technology community to identify just how the responsibilities of handling data are changing and what the future of these roles looks like.

Posted March 27, 2013

Infrastructure software provider TIBCO Software Inc. has announced the latest version of its data discovery and visualization platform, TIBCO Spotfire 5.5. The offering maximizes corporate investment through a unified approach that lets users work with data in-memory and directly query a variety of data systems and engines at the same time. Spotfire 5.5 also integrates with Teradata Aster, TIBCO DataSynapse GridServer, and TIBCO BusinessEvents, giving users the ability to apply predictive analytics to real-time events and big data insights.

Posted March 26, 2013

Platfora, a native in-memory business intelligence platform for Hadoop, is now generally available. This platform puts business users directly in touch with big data and removes the need for data warehouse and ETL software. This provides customers with meaningful insights from their data in hours instead of weeks or months, the company says. Additionally, Platfora enters the market with support from the greater Hadoop community.

Posted March 26, 2013

Google's dominance of internet search has been uncontested for more than 12 years now. Before Google, search engines such as AltaVista indexed web pages and allowed for keyword search with an interface and functionality superficially similar to that provided by Google. However, these first-generation search engines provided relatively poor ordering of results. Because an internet search would return pages ranked by the number of times a term appeared on the website, unpopular or irrelevant sites would be just as likely to achieve top rank as popular sites.

Posted March 20, 2013
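
The shift away from term-frequency ranking is what link-based scoring such as PageRank addressed: a page's rank comes from the ranks of the pages linking to it, not from how often a term appears. Below is a minimal power-iteration sketch; the tiny three-page web is invented for the example.

    # Minimal PageRank power iteration: rank pages by incoming-link weight
    # rather than by how often a search term appears on the page.
    DAMPING = 0.85

    def pagerank(links, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new = {p: (1.0 - DAMPING) / len(pages) for p in pages}
            for page, outlinks in links.items():
                share = DAMPING * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            rank = new
        return rank

    web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
    print(pagerank(web))   # "c", linked from both "a" and "b", ranks highest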

Concurrent, Inc. announced that it has raised $4 million in Series A funding. This follows a $900,000 seed investment in August 2011. The company says that the new investment, led by True Ventures and Rembrandt Venture Partners, will be used to fuel product development, grow the core team and further deliver on the company's vision to simplify big data application development on Apache Hadoop. Concurrent also announced that Gary Nakamura has been named as its CEO.

Posted March 20, 2013

Attunity Ltd., a provider of information availability software solutions, released Attunity Replicate 2.1, a high-performance data delivery solution that adds improvements for data warehousing. Attunity Replicate's new performance enhancements support many data warehouses, including Amazon Redshift, EMC Greenplum and Teradata.

Posted March 13, 2013

Early bird registration is now open for the Big Data Boot Camp, a two-day intensive dive into the world of big data. The conference, produced by Database Trends and Applications, will be held Tuesday, May 21, through Wednesday, May 22, at the Hilton New York. The agenda of the Big Data Boot Camp has been designed to bring together thought leaders and practitioners who will identify emerging technologies and provide case studies and best practices in big data management.

Posted March 12, 2013

In-memory technology provider Terracotta, Inc. has announced that javax.cache, a caching standard for Java applications, has entered the Draft Review Stage under the Java Community Process. The specification provides a standard approach to how Java applications temporarily cache data, an essential technology for in-memory solutions and a critical factor in achieving high performance and scalability in big data applications.

Posted March 12, 2013
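
javax.cache itself is a Java API, so the snippet below is not the standard; it is only a toy sketch of the expiring-entry pattern such a caching standard codifies, with the TTL and keys invented for illustration.

    # Toy sketch of temporary caching with a time-to-live, the pattern a
    # standard such as javax.cache codifies. Not javax.cache itself.
    import time

    class TTLCache:
        def __init__(self, ttl_seconds):
            self.ttl = ttl_seconds
            self.store = {}                       # key -> (value, expiry)

        def put(self, key, value):
            self.store[key] = (value, time.monotonic() + self.ttl)

        def get(self, key):
            entry = self.store.get(key)
            if entry is None:
                return None
            value, expires = entry
            if time.monotonic() > expires:        # expired: evict and miss
                del self.store[key]
                return None
            return value

    cache = TTLCache(ttl_seconds=30)
    cache.put("user:42", {"name": "Ada"})
    print(cache.get("user:42"))                   # hit until the TTL passes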

Oracle has announced the general availability of Oracle Database Appliance X3-2, featuring up to twice the performance and more than four times the storage capacity of the original Oracle Database Appliance. Oracle Database Appliance is a complete package of software, server, storage, and networking designed for simplicity and high availability, helping businesses of all sizes reduce risk and save time and money managing their data and applications.

Posted March 06, 2013

A new survey of nearly 200 data managers and professionals, who are part of the Independent Oracle Users Group (IOUG), looks at the role of the data scientist: a data professional who can aggregate data from internal enterprise data stores as well as outside sources to provide the forecasts and insight required to help lead an organization into the future. The research was conducted by Unisphere Research, a division of Information Today, Inc.

Posted March 06, 2013

Big data analytics provider Pentaho has announced new templates for Instaview, its big data discovery application, simplifying access to and analysis of big data sources. Templates help solve the data integration challenge in gaining value from big data. Pentaho has released a Twitter template and will release templates for MongoDB, Amazon Redshift, Google Analytics and Cloudera Impala on a weekly basis for download from the Pentaho site.

Posted March 05, 2013

EMC announced a new distribution of Apache Hadoop called Pivotal HD that features native integration of EMC's Greenplum MPP database with Apache Hadoop. According to the vendor, Pivotal HD with HAWQ provides a key advancement by offering SQL processing for Hadoop, thereby expanding the platform's reach to SQL programmers.

Posted March 05, 2013

Revolution Analytics has integrated Revolution R Enterprise with Hortonworks Data Platform. In a move designed to help Hadoop customers derive more value from their big data technology investments, Revolution Analytics and Hortonworks are co-developing "in-Hadoop predictive analytics" without the need to import or export data from Hadoop, David Smith, vice president of marketing and community at Revolution Analytics, tells 5 Minute Briefing.

Posted February 28, 2013

SAP AG has introduced a new version of its Sybase IQ disk-based column store analytics server. The overriding theme of this new release, which will be generally available later in the first quarter, "is positioning IQ 16 to go from terabytes to petabytes," Dan Lahl, senior director of product marketing at SAP, tells 5 Minute Briefing. To accomplish this, IQ 16 provides enhancements in three critical areas.

Posted February 27, 2013

Hortonworks, a leading contributor to Apache Hadoop, has released Hortonworks Sandbox, a learning environment and on-ramp for anyone interested in learning, evaluating, or using Apache Hadoop in the enterprise. The tool seeks to bridge the gap between people who want to learn Hadoop and the complexity of setting up a cluster, by providing an integrated environment with demos, videos, and tutorials.

Posted February 27, 2013

MarkLogic said it plans to deliver an enterprise-grade application and analytics software solution based on the new Intel Distribution for Apache Hadoop software. The Intel Distribution will be combined with the MarkLogic Enterprise NoSQL database to support real-time transactional and analytic applications.

Posted February 26, 2013

LucidWorks, a provider of enterprise-grade search development platforms, has partnered with MapR Technologies, a Hadoop technology vendor, to deliver integration between LucidWorks Search and MapR. The solution enables organizations to search their MapR Distributed File System (DFS) to discover actionable insights from information stored in Hadoop.

Posted February 26, 2013

Hortonworks has announced that the Hortonworks Data Platform is now available for Windows in addition to Linux, enabling organizations to run Hadoop-based solutions natively on Windows. According to Hortonworks, making the Hortonworks Data Platform available for Windows is a necessary step in its strategy to broaden the reach of Apache Hadoop across the enterprise.

Posted February 25, 2013

Hortonworks, a contributor to Apache Hadoop, has submitted two new incubation projects to the Apache Software Foundation and also announced the launch of the new "Stinger Initiative." These three projects seek to address key enterprise requirements regarding Hadoop application security and performance.

Posted February 21, 2013

Denodo Technologies, a provider of data virtualization software, has unveiled the Denodo Platform 5.0. The latest version of Denodo's solution enables data virtualization for both agile business intelligence and agile development of connected applications, and helps organizations access all of their information sources, including big data, cloud, and unstructured sources. The new release was previewed at the TDWI conference in Las Vegas.

Posted February 20, 2013

Database Trends and Applications (DBTA) will host a live web event to explore the opportunities enabled by NewSQL in the cloud. Presented by Mark Sarbiewski, CMO at Clustrix, and Matt Aslett, research manager at 451 Research, and moderated by DBTA's Stephen Faig, the webinar will cover how to migrate a SQL database to the cloud; how to get scale from a database in public or private clouds; and how to ensure database availability in the cloud for business-critical applications. The webinar will be presented on Thursday, Feb. 21, at 10 am PT/1 pm ET.

Posted February 19, 2013

Terracotta, a provider of enterprise big data management solutions, and JackBe, a real-time intelligence software vendor, have announced a collaboration to leverage their complementary technologies for real-time big data solutions. According to the vendors, JackBe's Presto real-time data visualizations for analytics, coupled with the high performance and scalability of Terracotta's BigMemory, enable JackBe to deliver visual data exploration and dashboards with the speed, scale, and simplicity of in-memory data management.

Posted February 19, 2013

Having vast amounts of data at hand doesn't necessarily help executives make better decisions. In fact, without a simple way to access and analyze the astronomical amounts of available information, it is easy to become frozen with indecision, knowing the answers are likely in the data but unsure how to find them. With so many companies claiming to offer salvation from all data issues, one of the most important factors to consider when selecting a solution is ease of use. An intuitive interface based on how people already operate in the real world is the key to adoption and usage throughout an organization.

Posted February 13, 2013

In-memory technology—in which entire data sets are pre-loaded into a computer's random access memory, alleviating the need for shuttling data between memory and disk storage every time a query is initiated—has actually been around for a number of years. However, with the onset of big data, as well as an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.

Posted February 13, 2013

A profound shift is occurring in where data lives. Thanks to skyrocketing demand for real-time access to huge volumes of data (big data), technology architects are increasingly moving data out of slow, disk-bound legacy databases and into large, distributed stores of ultra-fast machine memory. The plummeting price of RAM, along with advanced solutions for managing and monitoring distributed in-memory data, means there are no longer good excuses to make customers, colleagues, and partners wait the seconds, or sometimes hours, it can take applications to get data out of disk-bound databases. With in-memory, microseconds are the new seconds.

Posted February 13, 2013
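
As a rough, back-of-the-envelope illustration of that latency gap, the sketch below times lookups against the same data held in RAM and on disk. It is not a rigorous benchmark, and absolute numbers vary widely by machine; the table, file name, and data are invented.

    # Unscientific timing of in-memory lookups vs. on-disk queries, to
    # illustrate the "microseconds are the new seconds" point above.
    import sqlite3
    import time

    rows = [(i, f"value-{i}") for i in range(100000)]

    disk = sqlite3.connect("demo.db")             # file-backed database
    disk.execute("CREATE TABLE IF NOT EXISTS kv (k INTEGER PRIMARY KEY, v TEXT)")
    disk.executemany("INSERT OR REPLACE INTO kv VALUES (?, ?)", rows)
    disk.commit()

    memory = dict(rows)                           # same data held in RAM

    t0 = time.perf_counter()
    for k in range(0, 100000, 7):
        memory[k]
    t1 = time.perf_counter()
    for k in range(0, 100000, 7):
        disk.execute("SELECT v FROM kv WHERE k = ?", (k,)).fetchone()
    t2 = time.perf_counter()

    print(f"in-memory: {t1 - t0:.4f}s, on-disk: {t2 - t1:.4f}s")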
