Big Data

The well-known three Vs of Big Data (Volume, Variety, and Velocity) are placing increasing pressure on organizations that must both manage this data and extract value from the deluge for predictive analytics and decision-making. Big Data technologies, services, and tools have emerged to help meet the challenges posed by the flood of web, social media, Internet of Things (IoT), and machine-to-machine (M2M) data flowing into organizations: Hadoop, MapReduce, and Hive; NoSQL/NewSQL databases; data integration techniques; in-memory approaches; and cloud technologies.



Big Data Articles

Database Trends and Applications has launched a special "Who to See at Oracle OpenWorld" section online where you can find information on what to expect at this year's conference and premium vendors that offer products and services to serve your needs as an Oracle technology professional.

Posted August 09, 2013

Oracle is advancing the role of Java for IoT (Internet of Things) with the latest releases of its Oracle Java Embedded product portfolio - Oracle Java ME Embedded 3.3 and Oracle Java ME Software Development Kit (SDK) 3.3, a complete client Java runtime and toolkit optimized for microcontrollers and other resource-constrained devices. Oracle is also introducing the Oracle Java Platform Integrator program to provide partners with the ability to customize Oracle Java ME Embedded products to reach different device types and market segments. "We see IoT as the next big wave that will hit the industry," Oracle's Peter Utzschneider, vice president of product management, explained during a recent interview.

Posted August 07, 2013

More "things" are now connected to the internet than people, a phenomenon dubbed the Internet of Things. Fueled by machine-to-machine (M2M) data, the Internet of Things promises to make our lives easier and better, from more efficient energy delivery and consumption to mobile health innovations in which doctors can monitor patients from afar. However, the resulting tidal wave of machine-generated data streaming in from smart devices, sensors, monitors, and meters is testing the capabilities of traditional database technologies. They simply cannot keep up, or become cost-prohibitive when pushed to scale.

Posted August 07, 2013

In many ways, Hadoop is the most concrete technology underlying today's big data revolution, but it certainly does not satisfy those who want quick answers from their big data. Hadoop - at least Hadoop 1.0 - is a batch-oriented framework that allows for the economical execution of massively parallel workloads, but provides no capabilities for interactive or real-time execution.
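The batch character described above can be illustrated with a minimal in-process sketch (plain Python standing in for a Hadoop job; no Hadoop API is used, and the word-count workload is just the conventional teaching example): every record must pass through the map and shuffle stages before the reduce stage can produce any answer, which is why interactive queries do not fit the model.

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit (key, value) pairs for every input record.
    for line in records:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group all emitted values by key before reduction.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values into a final result.
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big answers", "batch data"]
# No partial results exist until the whole pipeline has run:
counts = reduce_phase(shuffle(map_phase(lines)))
# counts -> {'big': 2, 'data': 2, 'answers': 1, 'batch': 1}
```

In a real Hadoop 1.0 cluster these phases run as massively parallel tasks over HDFS blocks, but the ordering constraint is the same, which is what makes the framework economical for throughput yet unsuited to real-time queries.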

Posted August 07, 2013

Consider a professional baseball game, or any other popular professional sporting event. A fan sitting in the upper deck of Dodger Stadium in Los Angeles, or in any other sporting arena on earth, is happily distracted from the real world. Ultimately, professional sports constitute a trillion-dollar industry whose product is, on the surface, entertainment; pierce that thin veneer, however, and it quickly becomes clear that the more significant product is data. The fan in the upper deck does not think of it as such, but the data scientist recognizes the innate value of the many forms of data being continuously produced. Much of this data is being used now, but fully exploiting the data as a whole will require a true Unified Data Strategy.

Posted August 07, 2013

Progress Software announced availability of new data connectivity and application capabilities as part of its Progress Pacific application platform-as-a-service (aPaaS) for building and managing business applications on any cloud, mobile or social platform. "Pacific is a platform running in the cloud that is targeted at small and medium-size businesses, ISVs and departmental IT," John Goodson, chief product officer at Progress Software, explained in an interview. Instead of requiring a highly trained IT staff to build applications, Goodson says, the platform provides a visual design paradigm that allows users with limited skills to build powerful applications that can quickly connect to any data sources.

Posted August 06, 2013

Protegrity USA, Inc., a provider of end-to-end data security solutions, has announced general availability of release 6.5 of its Data Security Platform. This latest release expands the Protegrity Big Data Protector capabilities to include support and certification on many Apache Hadoop distributions. In addition, the new File Protector Gateway Server provides another option for fine-grain data protection of sensitive data before it enters Hadoop or other data stores.

Posted August 06, 2013

Datawatch Corporation, a provider of information optimization solutions, introduced new server and management automation capabilities for its Datawatch Monarch Professional, Datawatch Data Pump and Datawatch Enterprise Server products. Datawatch says the new releases of its flagship information optimization software will enable businesses to better secure, simplify and accelerate their big data and business intelligence applications, and also extend the technology to more users.

Posted August 06, 2013

Syncsort, a provider of big data integration solutions, is expanding its partner program to recruit regional systems integrators (RSIs) that have big data practices. The company is looking for RSIs that have specialized systems integration solutions and services expertise that will add value for customers using DMX-h, DMX and MFX ETL and Sort for a variety of use cases to sort, integrate and process big data in support of critical business intelligence and analytics.

Posted August 06, 2013

To encourage partners to build, market and sell software applications on top of technology platforms from SAP, the company has introduced the new SAP PartnerEdge program for Application Development. The new partnering model, a component of the SAP PartnerEdge program, is intended to help partners create and monetize innovative, specific applications in the mobile, cloud, database or high-performance in-memory areas. Participating partners will also be able to get go-to-market support, including the SAP partner logo, free application reviews and the ability to leverage SAP Store, the online channel from SAP for enterprise applications and services.

Posted August 01, 2013

IBM says it is accelerating its Linux on Power initiative with the new PowerLinux 7R4 server as well as new software and middleware applications geared for big data, analytics and next generation Java applications in an open cloud environment. According to IBM, the new PowerLinux 7R4 server, built on the same Power Systems platform running IBM's Watson cognitive computing solution, can provide clients the performance required for the new business-critical and data-intensive workloads increasingly being deployed in Linux environments. IBM is also expanding the portfolio of software for Power Systems with the availability of IBM Cognos Business Intelligence and EnterpriseDB database software, each optimized for Linux on Power.

Posted July 30, 2013

After four years of operating BigCouch in production, Cloudant has merged the BigCouch code back into the open source Apache CouchDB project. Cloudant provides a database-as-a-service, and CouchDB serves as the foundation of Cloudant's technology. The company developed BigCouch, an open source variant of CouchDB, to support large-scale, globally distributed applications. There are three main reasons Cloudant is doing this, Adam Kocoloski, co-founder and CTO at Cloudant, told 5 Minute Briefing in an interview.

Posted July 30, 2013

TIBCO Software and Composite Software have teamed up to provide a complete analytic application stack. With the combination of the TIBCO Spotfire Analytics Platform and the Composite Data Virtualization Platform, the companies say, businesses will be able to get an analytic solution into production much faster than with alternatives and to adapt quickly as data sources and business needs change.

Posted July 30, 2013

While unstructured data may represent one of the greatest opportunities of the big data revolution, it is one of its most perplexing challenges. In many ways, the very core of big data is that it is unstructured, and many enterprises are not yet equipped to handle this kind of information in an organized, systematic way. Most of the world's enterprise databases—based on a model designed in the 1970s and 1980s that served enterprises well in the decades since—suddenly seem out-of-date, and clunky at best when it comes to managing and storing unstructured data. However, insights from these disparate data types—including weblog, social media, documents, image, text, and graphical files—are increasingly being sought by the business.

Posted July 30, 2013

Join DBTA and MarkLogic for a webcast on Wednesday, July 31, to learn about the essential technologies and approaches to succeeding with predictive analytics on Big Data. In a recent survey of Database Trends and Applications subscribers, predictive analytics was cited as the greatest opportunity that big data offers to their organizations. The reason is simple — whether you're fighting crime, delivering healthcare, scoring credit or fine-tuning marketing, predictive analytics is the key to identifying risks and opportunities and making better decisions. However, to leverage the power of predictive analytics, organizations must possess the right technology and skills.

Posted July 25, 2013

A new science called "data persona analytics" (DPA) is emerging. DPA is defined as the science of determining the static and dynamic attributes of a given data set in order to construct an optimized infrastructure that manages and monitors data injection, alteration, analysis, storage and protection while facilitating data flow. Each unique set of data, both transient and permanent, has a descriptive data personality profile, which can be determined through analysis using the methodologies of DPA.

Posted July 25, 2013

Symantec has released Data Insight 4.0, the latest version of its unstructured data governance solution, which provides insight into the ownership and usage of unstructured data, such as documents, presentations, spreadsheets and emails. "Many organizations are in the dark as to what data they have, what information they have, who owns that data, and the relevance of that data to the business. The unstructured data environment is pretty much like a black hole with limited or no visibility," Ketan Shah, product manager, Symantec Storage & Availability Management Group, told 5 Minute Briefing during a recent interview.

Posted July 23, 2013

In the realm of 21st century data organization, the business function comes first. The form of the data, and the tools to manage that data, will be created and maintained for the singular purpose of maximizing a business's ability to leverage its data. Initially, this seems like an obvious statement, but when examining the manner in which IT has treated data over the past four decades, it becomes painfully obvious that the opposite idea has been predominant.

Posted July 09, 2013

Splunk Inc., which provides a software platform for real-time operational intelligence, has introduced the beta version of Hunk: Splunk Analytics for Hadoop. Hunk is a new software product from Splunk that integrates exploration, analysis and visualization of data in Hadoop. According to Splunk, Hunk drives improvements in the speed and simplicity of interacting with and analyzing data in Hadoop without programming, costly integration or forced data migrations.

Posted July 02, 2013

Progress Software Corporation has expanded the range of data sources for its DataDirect Cloud service and DataDirect Connect family of products. With this release, DataDirect supports data sources from Cloudera Impala to Apache Hive to Greenplum, among others. The company is also announcing beta support for social media, relational, NoSQL and ERP data sources.

Posted July 02, 2013

Datameer 3.0, the newest version of Datameer's big data analytics tool for business users, builds on existing self-service data integration, analytics, and visualization capabilities by adding new Smart Analytic functions, which automatically identify patterns, relationships, and even recommendations based on data stored in Hadoop.

Posted July 02, 2013

Composite Software has released the Composite Data Virtualization Platform 6.2 SP3. The release has four elements, Robert Eve, executive vice president of marketing at Composite, tells 5 Minute Briefing: three update the big data integrations for Hortonworks, Cloudera, and the Apache distribution of Hadoop through the HiveServer2 interface, and the fourth provides new access to Cloudera CDH through Impala. "The key problem with big data is that there are not enough people that are skilled at the big data tools to use them effectively," Eve notes.

Posted June 27, 2013

Database Trends and Applications introduces the inaugural "DBTA 100," a list of the companies that matter most in data. The past several years have transformed enterprise information management, creating challenges and opportunities for companies seeking to extract value from a sea of data assets. In response to this, established IT vendors as well as legions of newer solution providers have rushed to create the tools to do just that.

Posted June 27, 2013

RainStor, a provider of an enterprise database for managing and analyzing historical data, says it has combined the latest data security technologies in a comprehensive product update that has the potential to rapidly increase adoption of Apache Hadoop for banks, communications providers and government agencies.

Posted June 27, 2013

The amount of data being generated, captured, and analyzed worldwide is increasing at a rate that was inconceivable a few years ago. Exciting new technologies and methodologies are evolving to address this phenomenon of science and culture, creating huge new opportunities. These new technologies are also fundamentally changing the way we look at and use data. The rush to monetize "big data" makes the appeal of the various "solutions" undeniable.

Posted June 27, 2013

Informatica Corporation, a provider of data integration software, has announced an OEM partnership agreement with Zettaset, a big data management vendor. In this partnership, Informatica PowerCenter Big Data Edition will embed within the Zettaset Orchestrator Hadoop cluster management solution, a management platform that automates, accelerates and simplifies Hadoop installation, cluster management, and security for big data deployments.

Posted June 26, 2013

Database Trends and Applications (DBTA) magazine has announced the inaugural "DBTA 100: The Companies That Matter Most in Data," a list saluting this year's companies in data and enterprise information management—from long-standing industry veterans to fast-growing startups tackling big data. "Beyond the explosion of interest surrounding big data, the past several years have transformed enterprise information management, creating both challenges and opportunities for companies seeking to protect, optimize, integrate, and extract actionable insight from a sea of data assets," remarked Thomas Hogan, group publisher of Database Trends and Applications.

Posted June 26, 2013

Terracotta, a provider of in-memory technologies for enterprise big data, is making Terracotta Universal Messaging available as a standalone product. Terracotta Universal Messaging is a software platform for big data applications using real-time data-streaming across a wide range of devices and networks. Addressing the growing challenge of eliminating bottlenecks in transferring data within and beyond enterprise systems to external devices, such as sensors, smart phones and databases, Terracotta's unified messaging solution extends the use of data beyond current enterprise boundaries and accelerates time to insights and action.

Posted June 25, 2013

These are heady times for data products vendors and their enterprise customers. When business leaders talk about success these days, they often are alluding to a new-found appreciation for their data environments. It can even be said that the tech vendors that are making the biggest difference in today's business world are no longer software companies at all; rather, they are "data" companies, with all that implies. Enterprises are reaching out to vendors for help in navigating through the fast-moving, and often unforgiving, digital realm. The data vendors that are leading their respective markets are those that know how to provide the tools, techniques, and hand-holding needed to manage and sift through gigabytes', terabytes', and petabytes' worth of data to extract tiny but valuable nuggets of information to guide business leaders as to what they should do next.

Posted June 19, 2013

Datawatch Corporation, a provider of information optimization solutions, has agreed to acquire Panopticon Software AB, a privately held Swedish company specializing in the delivery of real-time visual data discovery solutions. The transaction is expected to be completed before the end of Datawatch's fiscal year September 30. The acquisition of Panopticon adds additional capabilities for Datawatch that are important in the big data space, Ben Plummer, senior vice president and chief marketing officer, Datawatch, tells 5 Minute Briefing.

Posted June 17, 2013

Software AG has agreed to acquire Apama, a platform for complex event processing (CEP) and CEP-powered solutions, from Progress Software. Apama CEP allows organizations to correlate and analyze business activities across multiple data streams in real-time and take immediate action in response. "Data can lose its value in an instant," said Robin Gilthorpe, CEO of Software AG's Terracotta and member of the Software AG Group Executive Board. "This acquisition is a major step in delivering on our strategy of empowering enterprises to derive meaningful business insights and value from big data."

Posted June 13, 2013

There is an emerging field of companies looking to take on the challenges presented by the roiling tide of big data. While their visions vary, each has identified a market need that it believes its technology uniquely addresses. Here, DBTA highlights the approaches of 10 companies we think are worth watching.

Posted June 13, 2013

Dell and Oracle have announced an expanded worldwide alliance, through which the two companies will introduce a new x86 infrastructure offering that combines Dell's hardware with Oracle's software. As part of the agreement, Oracle has named Dell a preferred x86 partner, and Dell has named Oracle a preferred enterprise infrastructure partner, including Oracle Linux. The expanded partnership represents an extension of Oracle's engineered systems strategy, said Oracle president Mark Hurd.

Posted June 12, 2013

To help organizations make better decisions with big data, SAS has introduced SAS Decision Manager, which integrates predictive analytics, business rules, and data management. The announcement was made at The Premier Business Leadership Series event in Amsterdam, a business conference presented by SAS.

Posted June 11, 2013

DBTA and Tableau Software will present a webcast on how organizations can take advantage of their big data assets on Wednesday, June 20, at 11 am PT/ 2 pm ET. The webcast will enable attendees to gain a better understanding of the key challenges, the technology consideration, and the best practices involved in realizing fast and easy analytics from big data.

Posted June 11, 2013

At IBM Edge 2013 in Las Vegas, IBM announced enhancements across its systems portfolio. As part of the new offerings, IBM rolled out nine new Power Systems offerings, including the IBM i 25th Anniversary Edition, each providing advanced capabilities in big data analytics and cloud computing.

Posted June 11, 2013

JackBe has released its Presto Real-Time Analytics Add-On With Terracotta BigMemory (RTA Add-On), which bundles Terracotta's enterprise-grade BigMemory in-memory data management platform for high performance in-memory analytics. According to the vendors, this combination of in-memory and analytics allows Presto to mash big data with live and transactional enterprise data into actionable dashboards faster.

Posted June 11, 2013

Dell, Intel Corporation, and Revolution Analytics have announced the launch of the Big Data Innovation Center in Singapore. Merging the expertise of these three enterprises, the center provides extensive training programs, proof-of-concept capabilities and solution development support on big data and predictive analytic innovations for Asian markets.

Posted June 11, 2013

Cloudera, a provider of Apache Hadoop-based software and services, has partnered with VMware, a provider of virtualization and cloud infrastructure solutions. Cloudera Enterprise is now certified to run on VMware vSphere. Through the VMware Ready program, VMware and Cloudera completed a joint validation process, enabling enterprises to simplify and accelerate the use of Apache Hadoop within virtual and cloud environments.

Posted June 04, 2013

Rogue Wave Software, provider of cross-platform software development tools, has partnered with Bright Computing to deliver Rogue Wave's TotalView advanced debugger in the Bright Cluster Manager product. Bright Computing customers will receive a scalable, multi-core debugger with both reverse and memory functions to boost productivity and shorten development lifecycles. This is being offered to Bright Computing customers via registration and download.

Posted June 04, 2013
