Big Data

The well-known three Vs of Big Data (Volume, Variety, and Velocity) are placing increasing pressure on organizations that must manage this data deluge and extract value from it for predictive analytics and decision-making. Big Data technologies, services, and tools such as Hadoop, MapReduce, Hive, NoSQL/NewSQL databases, data integration techniques, in-memory approaches, and cloud technologies have emerged to help meet the challenges posed by the flood of Web, social media, Internet of Things (IoT) and machine-to-machine (M2M) data flowing into organizations.



Big Data Articles

As more machines are fitted with internet and other network access, enterprises will be able to capture, and will increasingly be expected to respond to, more customer data than ever before. Machine-to-machine (M2M) network connections, the so-called "Internet of Things," are positioned to become the next source of major competitive advantage. Whatever you call it, M2M is turning out to be the poster child for big data's "Three Vs": Volume, Velocity and Variety. What M2M data requires is a fourth "V," Visualization, to convert big data into value by giving users the ability to identify data patterns through real-time analytics.

Posted April 30, 2013

Adding to its list of recent acquisitions (Pervasive, Versant), Actian Corp., a big data management vendor, announced the purchase of ParAccel, a leader in high-performance analytics. The acquisition adds Amazon, The Royal Bank of Scotland, OfficeMax and MicroStrategy to Actian's big data customer portfolio.

Posted April 25, 2013

The conference agenda and list of speakers are now available for DBTA's Big Data Boot Camp, a deep dive designed to bring together thought leaders and practitioners who will provide insight on how to collect, manage, and act on big data. The conference will be held May 21-22 at the Hilton New York. SAP is the diamond sponsor, and Objectivity and MarkLogic are platinum sponsors of the two-day event.

Posted April 25, 2013

Teradata, a provider of analytic data solutions, is offering business analysts self-service access to Apache Hadoop to enable quicker and smarter business decisions. According to the company, Teradata Enterprise Access for Hadoop, part of the Teradata Unified Data Architecture, allows business analysts to reach through Teradata directly into Hadoop to find new business value from the analysis of big data.

Posted April 23, 2013

Actian Corp. has completed the acquisition of Pervasive. The agreement for Actian to acquire Pervasive was announced earlier this year. Pervasive provides software to manage, integrate and analyze data, in the cloud or on premises, throughout the data lifecycle.

Posted April 16, 2013

Lavastorm Analytics is building a partner ecosystem with its analytics platform at the center to help business analysts optimize their organizations' big data. Cyfeon Solutions and QlikView are two of the most recent partners to integrate with the Lavastorm platform.

Posted April 16, 2013

In business, the rear view mirror is clearer than the windshield, said the sage of Omaha. And that is particularly true of business intelligence, composed almost entirely of such retrospectives. Consider this: Business intelligence proffers neatly organized historical data as a potential source of hindsight. Of course, there are also the dashboards of happenings in the "now" but precious little in terms of prompts to timely action. The time required to traverse that path from data to insight to intelligence to ideas to implementation to results is often the culprit. It's nowhere near quick enough, especially for businesses like banking, telecommunications and healthcare that set great store by the time value of information and the money value of time.

Posted April 10, 2013

"Big data" and the impact of analytics on large quantities of data is a persistent meme in today's Information Technology market. One of the big questions looming in IT departments about big data is what, exactly, does it mean in terms of management and administration. Will traditional data management concepts such as data modeling, database administration, data quality, data governance, and data stewardship apply in the new age of big data? According to analysts at Wikibon, big data refers to datasets whose size, type and speed of creation make it impractical to process and analyze with traditional tools . So, given that definition, it would seem that traditional concepts are at the very least "impractical," right?

Posted April 10, 2013

Oracle announced the availability of the Oracle Big Data Appliance X3-2 Starter Rack and Oracle Big Data Appliance X3-2 In-Rack Expansion. The new Oracle Big Data Appliance X3-2 Starter Rack is intended to help customers jump start their first big data projects and the new Oracle Big Data Appliance X3-2 In-Rack Expansion is aimed at helping them cost-effectively scale the system as their data grows. In addition, the Oracle Big Data Appliance X3-2 (Full Rack configuration) is now available through Oracle Infrastructure as a Service.

Posted April 10, 2013

Global software vendor Progress Software has expanded its big data connectivity capabilities with the release of ODBC driver technologies that make it easier to leverage the newest data warehousing applications. Support for HiveServer2 and Cloudera CDH 4.1 Hadoop distributions in Progress DataDirect Connect XE for ODBC 7.1 are aimed at reducing complexity for developers and providing fast, reliable and secure access from multiple data sources.

Posted April 09, 2013

JackBe, a provider of real-time intelligence, and Axeda, a cloud-based service and software vendor, have announced a partnership that unifies their complementary technologies to enhance the value of machine-to-machine (M2M) data. Through the JackBe-Axeda partnership, user-driven tooling from JackBe Presto combines with Axeda Machine Cloud's M2M event processing and data management.

Posted April 09, 2013

Serengeti 0.8.0 adds new capabilities that extend the range of partner-supported Hadoop versions and features. Project Serengeti is an open source project sponsored by VMware. It provides Hadoop users with easy-to-use management tools to provision, manage, and monitor Hadoop clusters on VMware vSphere.

Posted April 09, 2013

New technologies designed to help companies and governments tackle big data have been unveiled by IBM. The new technologies include the new IBM PureData System for Hadoop, designed to make it easier and faster to deploy Hadoop in the enterprise, and "BLU Acceleration," which is aimed at improving analytical performance in data management systems.

Posted April 05, 2013

Oracle Event Processing (OEP) for Oracle Java Embedded, a smaller-footprint version of Oracle Event Processing tailored for deployment on gateways, has been introduced by Oracle. According to Oracle, OEP for Oracle Java Embedded is a solution for building embedded device applications that filter, correlate and process events in real time, so that downstream applications, services and event-driven architectures are driven by true, real-time intelligence.

Posted April 03, 2013

Data keeps growing, systems and servers keep sprawling, and users keep clamoring for more real-time access. The result of all this frenzy of activity is pressure for faster, more effective data integration that can deliver more expansive views of information, while still maintaining quality and integrity. Enterprise data and IT managers are responding in a variety of ways, looking to initiatives such as enterprise mashups, automation, virtualization, and cloud to pursue new paths to data integration. In the process, they are moving beyond the traditional means of integration they have relied on for years to pull data together.

Posted April 03, 2013

Software integration vendor Talend has partnered with Caserta Concepts, a consulting and technology services firm specializing in data warehousing, business intelligence and big data analytics, to help Caserta build highly scalable big data analytics solutions for their clients.

Posted April 02, 2013

Big data, in-memory analytics and cloud computing vendor Kognitio has announced that the Kognitio Analytical Platform enables new fully parallel not-only-SQL (NoSQL) capabilities, including the R language for statistical computing and graphics.

Posted April 02, 2013

Dell Software is rolling out the latest version of its Kitenga Analytics solution, which extends the analysis of structured, semi-structured and unstructured data stored in Hadoop. Kitenga was acquired by Dell along with Quest Software in September 2012.

Posted March 27, 2013

The Independent Oracle Users Group (IOUG) will celebrate its 20th anniversary at COLLABORATE 13, a conference on Oracle technology presented jointly by the IOUG, OAUG (Oracle Applications User Group) and the Quest International User Group. The event will be held April 7 to 11 at the Colorado Convention Center in Denver. As part of the conference, the IOUG will host the COLLABORATE 13-IOUG Forum with nearly 1,000 sessions providing user-driven content. The theme of this year's COLLABORATE 13-IOUG Forum is "Elevate - take control of your career and elevate your Oracle ecosystem knowledge and expertise," says IOUG president John Matelski.

Posted March 27, 2013

At the recent Strata conference, CitusDB showcased the latest release of its scalable analytics database. According to the vendor, CitusDB 2.0 brings together the performance of PostgreSQL and the scalability of Apache Hadoop, and enables real-time queries on data that's already in Hadoop. This new functionality is made possible by CitusDB's distributed query planner and PostgreSQL's foreign data wrappers.

Posted March 27, 2013

Two big questions are on the minds of data professionals these days. How are increasing complexity and the inevitable onslaught of big data shaping the future of database administrators and data architects? How will our roles change? In the interest of studying the evolving landscape of data, the Independent Oracle Users Group (IOUG) took the pulse of the community. The Big Data Skills for Success study polled numerous individuals in the IOUG Oracle technology community to identify just how the responsibilities of handling data are changing and what the future of these roles looks like.

Posted March 27, 2013

Infrastructure software provider TIBCO Software Inc. has announced the latest version of its data discovery and visualization platform, TIBCO Spotfire 5.5. The offering maximizes corporate investment through a unified approach to working with data in-memory while directly querying a variety of data systems and engines. Spotfire 5.5 also integrates with Teradata Aster, TIBCO DataSynapse GridServer, and TIBCO BusinessEvents, providing users with the ability to apply predictive analytics to real-time events and big data insights.

Posted March 26, 2013

Platfora, a native in-memory business intelligence platform for Hadoop, is now generally available. This platform puts business users directly in touch with big data and removes the need for data warehouse and ETL software. This provides customers with meaningful insights from their data in hours instead of weeks or months, the company says. Additionally, Platfora enters the market with support from the greater Hadoop community.

Posted March 26, 2013

Google's dominance of internet search has been uncontested for more than 12 years now. Before Google, search engines such as AltaVista indexed web pages and allowed for keyword search with an interface and functionality superficially similar to that provided by Google. However, these first-generation search engines provided relatively poor ordering of results. Because an internet search would return pages ranked by the number of times a term appeared on the website, unpopular or irrelevant sites would be just as likely to achieve top rank as popular sites.
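
To make that weakness concrete, here is a minimal, hypothetical sketch in Java (not drawn from the article or from any real search engine's code) of the first-generation approach described above: pages are scored purely by how often the query term appears in them, so a keyword-stuffed page outranks a genuinely relevant one. The page names and contents are invented for illustration only.

import java.util.Arrays;
import java.util.Comparator;
import java.util.Map;

// Hypothetical illustration of first-generation, term-frequency-only ranking.
public class NaiveTermFrequencyRanker {

    // Count whitespace-separated occurrences of the query term in a page's text.
    static long termCount(String pageText, String term) {
        return Arrays.stream(pageText.toLowerCase().split("\\s+"))
                     .filter(term::equals)
                     .count();
    }

    public static void main(String[] args) {
        // Invented page contents, keyed by URL, purely for illustration.
        Map<String, String> pages = Map.of(
            "popular-site.example",    "analysis of big data trends in data management",
            "keyword-stuffed.example", "data data data data data data data data data");
        String query = "data";

        // Ranking by raw term frequency alone: the stuffed page comes out on top,
        // which is the poor ordering attributed above to pre-Google engines.
        pages.entrySet().stream()
             .sorted(Comparator.<Map.Entry<String, String>>comparingLong(
                         e -> termCount(e.getValue(), query)).reversed())
             .forEach(e -> System.out.printf("%d  %s%n",
                         termCount(e.getValue(), query), e.getKey()));
    }
}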

Posted March 20, 2013

Concurrent, Inc. announced that it has raised $4 million in Series A funding. This follows a $900,000 seed investment in August 2011. The company says that the new investment, led by True Ventures and Rembrandt Venture Partners, will be used to fuel product development, grow the core team and further deliver on the company's vision to simplify big data application development on Apache Hadoop. Concurrent also announced that Gary Nakamura has been named as its CEO.

Posted March 20, 2013

Attunity Ltd., a provider of information availability software solutions, released Attunity Replicate 2.1, a high-performance data delivery solution that adds improvements for data warehousing. Attunity Replicate's new performance enhancements support many data warehouses, including Amazon Redshift, EMC Greenplum and Teradata.

Posted March 13, 2013

Early bird registration is now open for the Big Data Boot Camp, a two-day intensive dive into the world of big data. The conference, produced by Database Trends and Applications, will be held Tuesday, May 21, through Wednesday, May 22, at the Hilton New York. The agenda of the Big Data Boot Camp has been designed to bring together thought leaders and practitioners who will identify emerging technologies and provide case studies and best practices in big data management.

Posted March 12, 2013

In-memory technology provider Terracotta, Inc. has announced that javax.cache, a caching standard for Java applications, has entered the Draft Review Stage under the Java Community Process. The standard defines a common approach to how Java applications temporarily cache data, an essential technology for in-memory solutions and a critical factor in achieving high performance and scalability with big data.
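
For readers unfamiliar with the specification, the hedged sketch below (in Java) shows the basic put/get usage pattern of javax.cache (JSR-107). It follows the API shape as it was ultimately standardized, which may differ in detail from the draft under review at the time, and it assumes a JCache-compliant provider (for example, Ehcache or Hazelcast) is on the classpath.

import javax.cache.Cache;
import javax.cache.CacheManager;
import javax.cache.Caching;
import javax.cache.configuration.MutableConfiguration;
import javax.cache.spi.CachingProvider;

// Minimal javax.cache (JSR-107) usage sketch; a JCache provider must be present
// on the classpath at runtime for Caching.getCachingProvider() to resolve.
public class JCacheExample {
    public static void main(String[] args) {
        // Look up the available caching provider and obtain its cache manager.
        CachingProvider provider = Caching.getCachingProvider();
        CacheManager cacheManager = provider.getCacheManager();

        // Configure a simple String -> Long cache with store-by-value semantics.
        MutableConfiguration<String, Long> config =
                new MutableConfiguration<String, Long>()
                        .setTypes(String.class, Long.class)
                        .setStoreByValue(true);
        Cache<String, Long> pageViews = cacheManager.createCache("pageViews", config);

        // Standard put/get operations defined by the javax.cache API.
        pageViews.put("home", 42L);
        Long views = pageViews.get("home");
        System.out.println("home page views: " + views);

        cacheManager.close();
    }
}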

Posted March 12, 2013

Oracle has announced the general availability of Oracle Database Appliance X3-2, featuring up to twice the performance and more than four times the storage capacity of the original Oracle Database Appliance. Oracle Database Appliance is a complete package of software, server, storage and networking designed for simplicity and high availability, helping businesses of all sizes reduce risk and save time and money managing their data and applications.

Posted March 06, 2013

A new survey of nearly 200 data managers and professionals, who are part of the Independent Oracle Users Group (IOUG), looks at the role of the data scientist - data professionals who can aggregate data from internal enterprise data stores as well as outside sources to provide the forecasts and insight required to help lead their organizations into the future. The research was conducted by Unisphere Research, a division of Information Today, Inc.

Posted March 06, 2013

Big data analytics provider Pentaho has announced new templates for Instaview, its big data discovery application, simplifying access to and analysis of big data sources. Templates help solve the data integration challenge in gaining value from big data. Pentaho has released a Twitter template and will release templates for MongoDB, Amazon Redshift, Google Analytics and Cloudera Impala on a weekly basis for download from the Pentaho site.

Posted March 05, 2013

EMC announced a new distribution of Apache Hadoop called Pivotal HD that features native integration of EMC's Greenplum MPP database with Apache Hadoop. According to the vendor, Pivotal HD with HAWQ provides a key advancement by offering SQL processing for Hadoop, thereby expanding the platform's reach to SQL programmers.

Posted March 05, 2013

Revolution Analytics has integrated Revolution R Enterprise with Hortonworks Data Platform. In a move designed to help Hadoop customers derive more value from their big data technology investments, Revolution Analytics and Hortonworks are co-developing "in-Hadoop predictive analytics" without the need to import or export data from Hadoop, David Smith, vice president of marketing and community at Revolution Analytics, tells 5 Minute Briefing.

Posted February 28, 2013

SAP AG has introduced a new version of its Sybase IQ disk-based column store analytics server. The overriding theme of this new release, which will be generally available later in the first quarter, "is positioning IQ 16 to go from terabytes to petabytes," Dan Lahl, senior director of product marketing at SAP, tells 5 Minute Briefing. To accomplish this, IQ 16 provides enhancements in three critical areas.

Posted February 27, 2013

Hortonworks, a leading contributor to Apache Hadoop, has released Hortonworks Sandbox, a learning environment and on-ramp for anyone interested in learning, evaluating or using Apache Hadoop in the enterprise. The tool seeks to bridge the gap between people who want to learn Hadoop and the complexity of setting up a cluster, by providing an integrated environment with demos, videos, and tutorials.

Posted February 27, 2013

MarkLogic said it plans to deliver an enterprise-grade application and analytics software solution based on the new Intel Distribution for Apache Hadoop software. The Intel Distribution will be combined with the MarkLogic Enterprise NoSQL database to support real-time transactional and analytic applications.

Posted February 26, 2013

LucidWorks, a provider of enterprise-grade search development platforms, has partnered with MapR Technologies, a Hadoop technology vendor, to deliver integration between LucidWorks Search and MapR. The solution enables organizations to search their MapR Distributed File System (DFS) and discover actionable insights from information stored in Hadoop.

Posted February 26, 2013
