Big Data

The well-known three Vs of big data - volume, variety, and velocity - are placing increasing pressure on organizations that must manage this data as well as extract value from the deluge for predictive analytics and decision-making. Big data technologies, services, and tools - such as Hadoop, MapReduce, Hive, NoSQL/NewSQL databases, data integration techniques, in-memory approaches, and cloud technologies - have emerged to help meet the challenges posed by the flood of web, social media, Internet of Things (IoT), and machine-to-machine (M2M) data flowing into organizations.



Big Data Articles

As part of a multi-year strategic initiative with Oracle, Accenture has created a set of solutions incorporating Oracle Engineered Systems into its data center transformation consulting and outsourcing services.

Posted September 18, 2013

RainStor, a provider of an enterprise database for managing and analyzing all historical data, has introduced RainStor FastForward, a new product that enables customers to reinstate data from Teradata tape archives (also known as BAR, for Backup, Archive and Restore) and move it to RainStor for query. The new RainStor FastForward product resolves a pressing challenge for Teradata customers that need to archive their Teradata warehouse data to offline tape, which can make it difficult to access and query that data when business and regulatory users require it, Deirdre Mahon, vice president of marketing, RainStor, explained in an interview.

Posted September 17, 2013

Syncsort, a provider of big data integration solutions, has announced the availability of MFX ZPCopy, a new software product that can offload mainframe copy processing to zIIP engines, and can be licensed separately as an add-on to Syncsort MFX. After analyzing mainframe processing at several customers, Syncsort realized that copy-related processing accounts for hundreds of hours of CPU processing time annually and contributes to batch window bottlenecks, inflating software costs and making it more difficult to meet SLAs, said Jorge Lopez, director of product marketing at Syncsort, in an interview.

Posted September 17, 2013

There may be no more commonly used term in today's IT conversations than "big data." There also may be no more commonly misused term. Here's a look at the truth behind the five most common big data myths, including the misguided but almost universally accepted notion that big data applies only to large organizations dealing with great volumes of data.

Posted September 17, 2013

Attunity Ltd., a provider of information availability software solutions, has released a new version of its data replication software intended to address requirements for big data analytics, business intelligence, business continuity and disaster recovery initiatives. Addressing expanding use cases for the solution, Attunity Replicate 3.0 is engineered to provide secure data transfer over long distances such as wide area networks (WANs), the cloud and satellite connections, said Lawrence Schwartz, vice president of marketing at Attunity, in an interview.

Posted September 16, 2013

IBM has introduced an array of new software, systems and services offerings to help organizations manage big data projects. The technology is aimed at helping customers increase their confidence in their data, gain business value from their data more quickly, and sharpen their skill sets to address big data challenges. "We have to hold that data to the same standards, manage it, and govern it appropriately for the enterprise. You can't drop those standards because it is unstructured data," said Nancy Kopp-Hensley, a director in product marketing and strategy for Big Data Systems at IBM, in an interview.

Posted September 16, 2013

Oracle holds an enviable position in the IT marketplace with a wide array of database systems, development tools, languages, platforms, enterprise applications, and servers. Riding the coattails of this industry giant is a healthy and far-flung ecosystem of software developers, integrators, consultants, and OEMs. These are the partners that will help make or break Oracle's struggle with new forces disrupting the very foundations of IT. And lately, Oracle—long known for its own brand of xenophobia and disdain for direct competitors—has been making a lot of waves by forging new alliances with old foes. This is opening up potentially lucrative new frontiers for business partners at all levels.

Posted September 16, 2013

Pentaho has launched Pentaho Business Analytics 5.0. The new release represents a redesign of the company's data integration and analytics platform, and provides analytics for big data-driven businesses supported by more than 250 new features and improvements. Pentaho Business Analytics 5.0 is the culmination of years of development work and also incorporates the results of usability studies on the Pentaho Business Analytics interface. "This is really an evolution of our platform as a whole. It is a significant release with a simplified analytics experience," Donna Prlich, Pentaho senior director of product and solution marketing, tells 5 Minute Briefing.

Posted September 12, 2013

If you look at what is really going on in the big data space, it is all about inexpensive open source solutions that are facilitating the modernization of data centers and data warehouses, and at the center of this universe is Hadoop. In the evolution of the big data market, open source is playing a seminal role as the "disruptive technology" challenging the status quo. Additionally, organizations large and small are leveraging these solutions, often based on inexpensive hardware and memory platforms, in the cloud or on premises.

Posted September 11, 2013

Oracle has introduced its latest ZFS Storage Appliances, the ZS3 Series, aimed at enabling customers to improve operational efficiencies, reduce data center costs, and increase business application performance.

Posted September 11, 2013

Cloudera has announced the general availability of Cloudera Search, a search engine for interactive exploration of data stored in the Hadoop Distributed File System (HDFS) and Apache HBase. In addition, the accompanying add-on RTS (Real-time Search) subscription provides technical support, legal indemnification and continual influence over the development of the open source project.

Posted September 10, 2013

To encourage partners to build, market and sell software applications on top of technology platforms from SAP, the company has introduced the new SAP PartnerEdge program for Application Development. The new partnering model, a component of the SAP PartnerEdge program, is intended to help partners create and monetize innovative, specific applications in the mobile, cloud, database or high-performance in-memory areas. Participating partners will also be able to get go-to-market support, including the SAP partner logo, free application reviews and the ability to leverage SAP Store, the online channel from SAP for enterprise applications and services.

Posted August 31, 2013

Building on the momentum of SAP's OEM (original equipment manufacturer) partner base in North America, three OEM partners - AlertEnterprise, Clockwork, and PROS - will offer their customers access to solutions with SAP HANA, a real-time in-memory technology platform. SAP says it has quadrupled the total number of OEM partners licensing SAP HANA in the first 6 months of 2013 compared to the last 6 months of 2012, demonstrating strong partner adoption in North America and globally.

Posted August 31, 2013

Ramping up for Oracle OpenWorld 2013, Oracle today announced the Oracle executive keynote schedule. Leading the lineup, CEO Larry Ellison will present the welcome keynote, "Oracle Database 12c In-Memory Database and M6 Big Memory Machine," on Sunday evening, Sept. 22, at 5 pm. Oracle OpenWorld 2013 takes place September 22 - September 26, 2013 at Moscone Center in San Francisco.

Posted August 27, 2013

Revolution Analytics, a commercial provider of software, services and support for the open source R project, plans to offer increased support for Hadoop as a platform for big data analytics with Cloudera CDH3 and CDH4 in its upcoming release of Revolution R Enterprise 7.0.

Posted August 27, 2013

Even before all the new data sources and platforms that big data has come to represent arrived on the scene, providing users with access to the information they need when they need it was a big challenge. What has changed today? The growing range of data types beyond traditional RDBMS data - and a growing awareness that effectively leveraging data from a wide variety of sources will result in the ability to compete more effectively. Join DBTA on Thursday, August 29, at 11 am PT/2 pm ET for a special roundtable webcast to learn about the essential technologies and approaches that help to overcome the big data integration challenges that get in the way of gaining actionable insights.

Posted August 21, 2013

Data analytics, long the obscure pursuit of analysts and quants toiling in the depths of enterprises, has emerged as the must-have strategy of organizations across the globe. Competitive edge not only comes from deciphering the whims of customers and markets but also being able to predict shifts before they happen. Fueling the move of data analytics out of back offices and into the forefront of corporate strategy sessions is big data, now made enterprise-ready through technology platforms such as Hadoop and MapReduce. The Hadoop framework is seen as the most efficient file system and solution set to store and package big datasets for consumption by the enterprise, and MapReduce is the construct used to perform analysis over Hadoop files.
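The division of labor described above - Hadoop files as the stored datasets, MapReduce as the construct that analyzes them - can be sketched in miniature. The toy Python word count below is illustrative only (an assumption of this sketch, not Hadoop's actual API); a real Hadoop job distributes the same three phases across a cluster:

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit a (key, value) pair - here (word, 1) - for each word
    # in each input record, just as a Hadoop mapper emits pairs per line.
    for record in records:
        for word in record.split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle: group values by key, the step the framework performs
    # between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values - here, sum the counts.
    return {key: sum(values) for key, values in groups.items()}

if __name__ == "__main__":
    lines = ["big data big analytics", "hadoop stores big data"]
    counts = reduce_phase(shuffle(map_phase(lines)))
    print(counts["big"])  # prints 3
```

Because each phase operates on independent records or keys, the work parallelizes naturally - which is exactly why the batch-oriented MapReduce construct scales to the large files Hadoop stores.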

Posted August 21, 2013

Progress Software announced availability of new data connectivity and application capabilities as part of its Progress Pacific application platform-as-a-service (aPaaS) for building and managing business applications on any cloud, mobile or social platform. "Pacific is a platform running in the cloud that is targeted at small and medium-size businesses, ISVs and departmental IT," John Goodson, chief product officer at Progress Software, explained in an interview. Instead of requiring a highly trained IT staff to build applications, Goodson says, the platform provides a visual design paradigm that allows users with limited skills to build powerful applications that can quickly connect to any data sources.

Posted August 21, 2013

Even before all the new data sources and platforms that big data has come to represent arrived on the scene, providing users with access to the information they need when they need it was a big challenge. What has changed today? The growing range of data types beyond the traditional RDBMS data - and a growing awareness that effectively leveraging all this data will result in the ability to compete more effectively. Join DBTA for a special roundtable webcast on Thursday, August 29, to learn about the essential technologies and approaches that help to overcome the big data integration challenges that get in the way of gaining actionable insights.

Posted August 20, 2013

Pentaho Corporation, a provider of big data analytics and data integration software, has formed an alliance with Splunk Inc. to provide a big data analytics solution that enables business users to analyze machine data that is generated by websites, applications, servers, storage, network, mobile and other devices, including system sensors. As a result of the new alliance, Splunk customers will have the ability to use Pentaho's platform to do two things, said Eddie White, executive vice president, business development at Pentaho, in an interview.

Posted August 13, 2013

NuoDB has announced the final release of its current product version and a technology preview of some upcoming second-generation features available later in 2013. The preview is contained in the free download of the new NuoDB Starlings Release 1.2. The NewSQL approach is gaining greater acceptance, said Barry Morris, founder and CEO of NuoDB, in an interview. "What people are saying back to us is that they are getting all of the features of NoSQL without throwing SQL or transactions away. And that concept is becoming the popular notion of what NewSQL is."

Posted August 13, 2013

Database Trends and Applications has launched a special "Who to See at Oracle OpenWorld" section online where you can find information on what to expect at this year's conference and premium vendors that offer products and services to serve your needs as an Oracle technology professional.

Posted August 09, 2013

Oracle is advancing the role of Java for IoT (Internet of Things) with the latest releases of its Oracle Java Embedded product portfolio - Oracle Java ME Embedded 3.3 and Oracle Java ME Software Development Kit (SDK) 3.3, a complete client Java runtime and toolkit optimized for microcontrollers and other resource-constrained devices. Oracle is also introducing the Oracle Java Platform Integrator program to provide partners with the ability to customize Oracle Java ME Embedded products to reach different device types and market segments. "We see IoT as the next big wave that will hit the industry," Oracle's Peter Utzschneider, vice president of product management, explained during a recent interview.

Posted August 07, 2013

More "things" are now connected to the internet than people, a phenomenon dubbed The Internet of Things. Fueled by machine-to-machine (M2M) data, the Internet of Things promises to make our lives easier and better, from more efficient energy delivery and consumption to mobile health innovations where doctors can monitor patients from afar. However, the resulting tidal wave of machine-generated data streaming in from smart devices, sensors, monitors, meters, etc., is testing the capabilities of traditional database technologies. They simply can't keep up; or when they're challenged to scale, are cost-prohibitive.

Posted August 07, 2013

In many ways, Hadoop is the most concrete technology underlying today's big data revolution, but it certainly does not satisfy those who want quick answers from their big data. Hadoop - at least Hadoop 1.0 - is a batch-oriented framework that allows for the economical execution of massively parallel workloads, but provides no capabilities for interactive or real-time execution.

Posted August 07, 2013

Consider a professional baseball game or any other popular professional sporting event. When a fan sits in the upper deck of Dodger Stadium in Los Angeles, or any other sporting arena on earth, the fan is happily distracted from the real world. Ultimately, professional sports constitutes a trillion-dollar industry - an industry whose product is, on the surface, entertainment, but one need barely pierce that thin veneer to see that the more significant product produced is data. A fan sitting in the upper deck does not think of it in those terms, but the data scientist recognizes the innate value of the varied manifestations of the different forms of data being continuously produced. Much of this data is being used now, but it will require a true Unified Data Strategy to fully exploit the data as a whole.

Posted August 07, 2013

Protegrity USA, Inc., a provider of end-to-end data security solutions, has announced general availability of release 6.5 of its Data Security Platform. This latest release expands the Protegrity Big Data Protector capabilities to include support and certification on many Apache Hadoop distributions. In addition, the new File Protector Gateway Server provides another option for fine-grain data protection of sensitive data before it enters Hadoop or other data stores.

Posted August 06, 2013

Datawatch Corporation, a provider of information optimization solutions, introduced new server and management automation capabilities for its Datawatch Monarch Professional, Datawatch Data Pump and Datawatch Enterprise Server products. Datawatch says the new releases of its flagship information optimization software will enable businesses to better secure, simplify and accelerate their big data and business intelligence applications, and also extend the technology to more users.

Posted August 06, 2013

Syncsort, a provider of big data integration solutions, is expanding its partner program to recruit regional systems integrators (RSIs) that have big data practices. The company is looking for RSIs that have specialized systems integration solutions and services expertise that will add value for customers using DMX-h, DMX and MFX ETL and Sort for a variety of use cases to sort, integrate and process big data in support of critical business intelligence and analytics.

Posted August 06, 2013

IBM says it is accelerating its Linux on Power initiative with the new PowerLinux 7R4 server as well as new software and middleware applications geared for big data, analytics and next generation Java applications in an open cloud environment. According to IBM, the new PowerLinux 7R4 server, built on the same Power Systems platform running IBM's Watson cognitive computing solution, can provide clients the performance required for the new business-critical and data-intensive workloads increasingly being deployed in Linux environments. IBM is also expanding the portfolio of software for Power Systems with the availability of IBM Cognos Business Intelligence and EnterpriseDB database software, each optimized for Linux on Power.

Posted July 30, 2013

After four years of operating BigCouch in production, Cloudant has merged the BigCouch code back into the open source Apache CouchDB project. Cloudant provides a database-as-a-service, and CouchDB serves as the foundation of Cloudant's technology. The company developed BigCouch, an open source variant of CouchDB, to support large-scale, globally distributed applications. There are three main reasons Cloudant is doing this, Adam Kocoloski, co-founder and CTO at Cloudant, told 5 Minute Briefing in an interview.

Posted July 30, 2013

TIBCO Software and Composite Software have teamed up to provide a complete analytic application stack. With the combination of the TIBCO Spotfire Analytics Platform and the Composite Data Virtualization Platform, the companies say, businesses will be able to get an analytic solution in production much faster than with alternatives, and allow them to quickly adapt as data sources and business needs change.

Posted July 30, 2013

While unstructured data may represent one of the greatest opportunities of the big data revolution, it is one of its most perplexing challenges. In many ways, the very core of big data is that it is unstructured, and many enterprises are not yet equipped to handle this kind of information in an organized, systematic way. Most of the world's enterprise databases—based on a model designed in the 1970s and 1980s that served enterprises well in the decades since—suddenly seem out-of-date, and clunky at best when it comes to managing and storing unstructured data. However, insights from these disparate data types—including weblog, social media, documents, image, text, and graphical files—are increasingly being sought by the business.

Posted July 30, 2013

Join DBTA and MarkLogic for a webcast on Wednesday, July 31, to learn about the essential technologies and approaches to succeeding with predictive analytics on Big Data. In a recent survey of Database Trends and Applications subscribers, predictive analytics was cited as the greatest opportunity that big data offers to their organizations. The reason is simple — whether you're fighting crime, delivering healthcare, scoring credit or fine-tuning marketing, predictive analytics is the key to identifying risks and opportunities and making better decisions. However, to leverage the power of predictive analytics, organizations must possess the right technology and skills.

Posted July 25, 2013

A new science called "data persona analytics" (DPA) is emerging. DPA is defined as the science of determining the static and dynamic attributes of a given data set so as to construct an optimized infrastructure that manages and monitors data injection, alteration, analysis, storage and protection while facilitating data flow. Each unique set of data, both transient and permanent, has a descriptive data personality profile, which can be determined through analysis using the methodologies of DPA.

Posted July 25, 2013

Symantec has released Data Insight 4.0, the latest version of its unstructured data governance solution, which provides insight into the ownership and usage of unstructured data, such as documents, presentations, spreadsheets and emails. "Many organizations are in the dark as to what data they have, what information they have, who owns that data, and the relevance of that data to the business. The unstructured data environment is pretty much like a black hole with limited or no visibility," Ketan Shah, product manager, Symantec Storage & Availability Management Group, told 5 Minute Briefing during a recent interview.

Posted July 23, 2013

In the realm of 21st century data organization, the business function comes first. The form of the data and the tools to manage that data will be created and maintained for the singular purpose of maximizing a business's capability of leveraging its data. Initially, this seems like an obvious statement but when examining the manner in which IT has treated data over the past four decades it becomes painfully obvious that the opposite idea has been predominant.

Posted July 09, 2013

Splunk Inc., which provides a software platform for real-time operational intelligence, has introduced the beta version of Hunk: Splunk Analytics for Hadoop. Hunk is a new software product from Splunk that integrates exploration, analysis and visualization of data in Hadoop. According to Splunk, Hunk drives improvements in the speed and simplicity of interacting with and analyzing data in Hadoop without programming, costly integration or forced data migrations.

Posted July 02, 2013

Progress Software Corporation has expanded the range of data sources for its DataDirect Cloud service and DataDirect Connect family of products. With this release, DataDirect supports data sources from Cloudera Impala to Apache Hive to Greenplum, among others. The company is also announcing beta support for Social Media Data, Relational Data, NoSQL Data, and ERP Data.

Posted July 02, 2013

Datameer 3.0, the newest version of Datameer's big data analytics tool for business users, builds on existing self-service data integration, analytics, and visualization capabilities, adding new Smart Analytic functions that automatically identify patterns, relationships, and even recommendations based on data stored in Hadoop.

Posted July 02, 2013

Composite Software has released the Composite Data Virtualization Platform 6.2 SP3. The release has four main elements, Robert Eve, executive vice president of marketing at Composite, tells 5 Minute Briefing: it updates the Hortonworks, Cloudera, and Apache distributions of Hadoop big data integrations through the HiveServer2 interface, and also provides new access to Cloudera CDH through Impala. "The key problem with big data is that there are not enough people that are skilled at the big data tools to use them effectively," Eve notes.

Posted June 27, 2013
