Big Data

The well-known three Vs of Big Data (Volume, Variety, and Velocity) are placing increasing pressure on organizations that must both manage this data and extract value from the deluge for predictive analytics and decision-making. Big Data technologies, services, and tools such as Hadoop, MapReduce, Hive, and NoSQL/NewSQL databases, along with data integration techniques, in-memory approaches, and cloud technologies, have emerged to help meet the challenges posed by the flood of web, social media, Internet of Things (IoT), and machine-to-machine (M2M) data flowing into organizations.



Big Data Articles

Cloudant, an IBM company, has introduced an on-premises version of Cloudant software that companies can install in their own data centers to run their own DBaaS. According to the company, the addition of Cloudant Local (IBM Cloudant Data Layer Local Edition) provides customers with a way to manage application data across any mix of infrastructure and deployment strategies.

Posted October 30, 2014

Percona, which provides enterprise-grade MySQL support, consulting, training, managed services, and server development services, will be hosting Percona Live 2014 in London from November 3 to 4.

Posted October 29, 2014

Twitter and IBM have formed a new partnership to help improve organizations' understanding of their customers, markets, and trends. The alliance brings together Twitter data with IBM's cloud-based analytics, customer engagement platforms, and consulting services. IBM says the collaboration will focus on three key areas.

Posted October 29, 2014

VMware has acquired the assets of Continuent. The Continuent team is joining VMware's Hybrid Cloud Business Unit. The acquisition offers "concrete benefits" to Continuent customers, said Robert Hodges, CEO of Continuent.

Posted October 29, 2014

While data warehouses have been the main data storage repository for companies since the 1970s, companies have begun to look to the horizon for what is next. To provide information about the key technologies, features, best practices, and pitfalls to consider when evaluating a data lake approach, Database Trends and Applications recently hosted a special roundtable webcast presented by Rich Reimer, VP of marketing and product management, Splice Machine; Rodan Zadeh, director of product marketing, Attunity; and George Corugedo, CTO and co-founder, RedPoint Global Inc.

Posted October 28, 2014

Within many companies' marketing departments there is a greater emphasis than ever before on using big data to make their products more appealing to customers. A major use for the data is not only to provide the best possible experience for the consumer, but to provide it efficiently. Teradata's enhancements to the Teradata Integrated Marketing Cloud are aimed at improving digital asset management and performance, real-time interaction management, and use of data in real time.

Posted October 28, 2014

Platfora, which provides a big data analytics platform built natively on Hadoop and Spark, has introduced Platfora 4.0 with advanced visualizations, geo-analytics capabilities, and collaboration features to enable users with a range of skill levels to work iteratively with data at scale.

Posted October 28, 2014

Rocket Software has announced Rocket Data Virtualization version 2.1, a mainframe data virtualization solution for universal access to data, regardless of location, interface or format.

Posted October 28, 2014

Protegrity, a provider of data security solutions, has announced an expanded partnership with Hadoop platform provider Hortonworks. Protegrity Avatar for Hortonworks extends the capabilities of HDP native security with Protegrity Vaultless Tokenization (PVT) for Apache Hadoop, Extended HDFS Encryption, and the Protegrity Enterprise Security Administrator, for advanced data protection policy, key management and auditing.

Posted October 28, 2014

Big data continues to grow at an exponential rate for many enterprises. One issue that continues to grow as well is the threat to data security.

Posted October 28, 2014

At SAP TechEd & d-code, SAP announced new innovations for the latest release of SAP HANA, the fall update of SAP HANA Cloud Platform, and a new SAP API Management technology.

Posted October 22, 2014

SAP SE has announced the SAP Cloud for Planning solution, an enterprise performance management (EPM) solution designed around user experience and built for the cloud. The SAP Cloud for Planning solution will be built natively on SAP HANA Cloud Platform, the in-memory platform-as-a-service (PaaS) from SAP.

Posted October 22, 2014

Attunity has introduced Replicate 4.0 which provides high-performance data loading and extraction for Apache Hadoop. The solution has been certified with the Hortonworks and Cloudera Hadoop distributions.

Posted October 22, 2014

SAP and BI provider Birst have formed a partnership to provide analytics in the cloud on the SAP HANA Cloud Platform. This collaboration intends to bring together the next-generation cloud platform from SAP with Birst's two-tier data architecture to provide instant access to an organization's data and help eliminate BI wait time.

Posted October 22, 2014

Oracle Platinum Partner Data Intensity, a provider of Oracle-focused application management and cloud services, has acquired business analytics and database management specialist CLEAR MEASURES. According to Data Intensity, the acquisition will enable it to extend its database technology coverage and remote managed services, as well as enter the analytics and business intelligence services market with proven solutions that are already used in more than 200 customer implementations.

Posted October 22, 2014

Oracle has expanded its data integration portfolio with the addition of Oracle Enterprise Metadata Management, a platform to help organizations govern data across the enterprise including structured and unstructured data, and across Oracle and third-party data integration, database, and business analytics platforms. "This is the first time that we have made a comprehensive offering in the area of metadata management," said Jeff Pollock, vice president of product management for Oracle Data Integration.

Posted October 22, 2014

At the most fundamental level, NoSQL and SQL are essentially performing the same core task: storing data to a storage medium and providing a safe and efficient way to retrieve that data later. Sounds pretty simple, right? Well, it really is, with a little planning and research. Here's a simple checklist of five steps to consider as you embark into the world of NoSQL databases.
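
That shared core task can be illustrated with a minimal sketch comparing a relational insert/select against a key-value style put/get; the table schema and key names here are hypothetical, chosen only for illustration:

```python
import sqlite3

# SQL: store and retrieve a record through a relational table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (id, name) VALUES (?, ?)", (1, "Ada"))
sql_result = conn.execute("SELECT name FROM users WHERE id = ?", (1,)).fetchone()[0]

# NoSQL (key-value style): the same store/retrieve expressed as put/get,
# modeled here with a plain dict standing in for a key-value store.
kv_store = {}
kv_store["user:1"] = {"name": "Ada"}
kv_result = kv_store["user:1"]["name"]

assert sql_result == kv_result == "Ada"
```

The differences that matter in practice lie around this core: schemas, query expressiveness, consistency guarantees, and how each model scales out.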

Posted October 22, 2014

Apache Hadoop has been a great technology for storing large amounts of unstructured data, but to do analysis, users still need to reference data from existing RDBMS based systems. This topic was addressed in "From Oracle to Hadoop: Unlocking Hadoop for Your RDBMS with Apache Sqoop and Other Tools," a session at the Strata + Hadoop World conference, presented by Guy Harrison, executive director of Research and Development at Dell Software, David Robson, principal technologist at Dell Software, and Kathleen Ting, a technical account manager at Cloudera and a co-author of O'Reilly's Apache Sqoop Cookbook.

Posted October 22, 2014

In his presentation at the Strata + Hadoop World conference, titled "Unseating the Giants: How Big Data is Causing Big Problems for Traditional RDBMSs," Monte Zweben, CEO and co-founder of Splice Machine, addressed the topic of scale-up architectures as exemplified by traditional RDBMS technologies versus scale-out architectures, exemplified by SQL on Hadoop, NoSQL and NewSQL solutions.

Posted October 22, 2014

Today, many companies still have most of their transactional data in relational database management systems which support various business-critical applications, from order entry to financials. But in order to maintain processing performance, most companies limit the amount of data stored there, making it less useful for in-depth analysis. One alternative, according to a recent DBTA webcast presented by Bill Brunt, product manager, SharePlex, at Dell, and Unisphere Research analyst Elliot King, is moving the data to Hadoop to allow it to be inexpensively stored and analyzed for new business insight.

Posted October 22, 2014

Microsoft SQL Server 2014 finally went RTM (Released to Manufacturing) at the beginning of this month. Here's a look at the key new features within three major areas of enhancement: Mission-Critical Performance, Business Intelligence, and Hybrid Cloud.

Posted October 22, 2014

To help simplify the process for the user with self-service BI tools, Logi Analytics has announced the latest version of its business intelligence platform Logi Info. "Self-service has been around for a while, but it never seems to deliver on its promise. Largely, that is because we are mismatching people and their capabilities with the tool sets and information they need," explained Brian Brinkmann, VP of Product for Logi Analytics.

Posted October 21, 2014

MapR Technologies, one of the top ranked distributors for Hadoop, has announced that MapR-DB is now available for unlimited production use in the freely-downloadable MapR Community Edition. "From a developer standpoint, they can combine the best of Hadoop, which is deep predictive analytics across the data, as well as a NoSQL database for real-time operations," explained Jack Norris, chief marketing officer for MapR Technologies.

Posted October 21, 2014

Companies are facing the "big squeeze" created by IT budgets that are relatively flat, growing by only 3% to 4% a year, versus data growth that is averaging 30% to 40%, and a consensus that data is a valuable commodity that cannot be thrown away, said Monte Zweben, CEO and co-founder of Splice Machine in his presentation at the Strata + Hadoop World conference.
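
The gap Zweben describes compounds quickly; a back-of-the-envelope calculation (using the midpoints of the quoted ranges, not figures from his talk) shows data volume far outpacing budget within a few years:

```python
# Midpoints of the quoted ranges: budgets ~3.5%/year, data ~35%/year.
budget_growth, data_growth = 1.035, 1.35

budget_index = data_index = 1.0
for year in range(1, 6):
    budget_index *= budget_growth
    data_index *= data_growth

# After 5 years the budget has grown about 19%, while the data
# to be managed has more than quadrupled.
print(round(budget_index, 2), round(data_index, 2))
```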

Posted October 21, 2014

Datameer has introduced Datameer 5.0 with Smart Execution, a technology that examines dataset characteristics, analytics tasks and available system resources to determine the most appropriate execution framework for each workload.
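
The general idea behind such workload-aware dispatch can be sketched as follows; the thresholds and framework names are purely illustrative, not Datameer's actual logic:

```python
def choose_framework(rows, memory_gb_available, needs_iteration):
    """Toy dispatcher: pick an execution framework from workload traits.

    A real system would profile dataset characteristics, the analytic
    tasks, and available cluster resources; these cutoffs are made up.
    """
    if rows < 1_000_000:
        return "single-node"   # small data: avoid cluster overhead entirely
    if needs_iteration and memory_gb_available >= 64:
        return "in-memory"     # iterative jobs benefit from cached datasets
    return "mapreduce"         # large batch work falls back to disk-based processing

print(choose_framework(10_000, 8, False))        # small job
print(choose_framework(50_000_000, 128, True))   # iterative, RAM available
print(choose_framework(50_000_000, 16, True))    # large but memory-constrained
```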

Posted October 21, 2014

At Strata + Hadoop World in New York, Microsoft announced an update to Microsoft Azure HDInsight, its cloud-based distribution of Hadoop. Customers can now process millions of Hadoop events in near real time, with Microsoft's preview of support for Apache Storm clusters in Azure HDInsight. In addition, as part of its integration with the Azure platform, Hortonworks announced that the Hortonworks Data Platform (HDP) has achieved Azure Certification.

Posted October 20, 2014

Revolution Analytics, a commercial provider of open source R software, has released Revolution R Open and Revolution R Plus.

Posted October 15, 2014

With Cloudera 5.2 the focus is on building products to deliver on the promise of the enterprise data hub that Cloudera introduced last year, said Clarke Patterson, senior director of product marketing at Cloudera. In particular, new capabilities make the technology more accessible to users who are not data scientists and also increase the level of security, two hurdles which can stand in the way of Hadoop adoption.

Posted October 15, 2014

Share the tips and best practices you use in your work all the time—the code snippets and favorite queries you keep within easy reach. "Best practices" include acknowledged and sensible ways of carrying out an activity; "tips" illustrate how some activity could be done in a better way or reveal some undocumented or not-so-well-known feature.

Posted October 15, 2014

Informatica PowerCenter v. 9.6.1 and Data Quality v. 9.6.1 have achieved Oracle Exadata Optimized and Oracle SuperCluster Optimized status through the Oracle PartnerNetwork (OPN). Customers can utilize Informatica PowerCenter and Data Quality to ingest, cleanse and transform various types of data into Oracle Exadata and Oracle SuperCluster to maximize the value of their engineered systems investment.

Posted October 15, 2014

The newest release of Oracle Exalytics In-Memory Machine, an engineered system for business analytics, includes Intel Xeon processors customized for Oracle business analytics workloads, delivering 50% more speed, 50% more processing cores, and 50% more memory compared to the previous generation. Oracle Database In-Memory has also been certified with Oracle Exalytics In-Memory Machine, expanding the scope of in-memory analytics to include the full capabilities of the Oracle Database.

Posted October 15, 2014

EMC and Pivotal have announced the Data Lake Hadoop Bundle 2.0, generally available today, which includes EMC's Data Computing Appliance (DCA), a high-performance big data computing appliance for deployment and scaling of Hadoop and advanced analytics; Isilon scale-out NAS (network-attached storage); and the Pivotal HD Hadoop distribution with the Pivotal HAWQ parallel SQL query engine. The idea is to provide a turnkey offering that combines compute, analytics, and storage for customers building scale-out data lakes for enterprise predictive analytics.

Posted October 14, 2014

ParStream has introduced an analytics platform purpose-built for the speed and scale of the Internet of Things (IoT). The ParStream Analytics Platform is designed to scale to handle the massive volumes and high velocity of IoT data and is expected to help companies generate actionable insights by enabling analysis with greater flexibility and closer to the source.

Posted October 14, 2014

Building on its data lake approach, Pivotal today announced the next step in this vision: an architecture that layers memory-centric processing frameworks on top of disk-based storage.

Posted October 14, 2014

The new Dell In-Memory Appliance for Cloudera Enterprise is designed to provide customers with a processing engine combined with interactive analytics in a preconfigured and scalable solution, and will begin shipping Oct. 15, 2014.

Posted October 14, 2014

In its first server announcement since completing the IBM System x server acquisition, Lenovo has announced plans to collaborate with VMware. This alliance extends the 16-year development relationship between System x and VMware and broadens the partnership to include the full range of Lenovo's expanded server business.

Posted October 14, 2014

MongoDB has introduced enhancements to MongoDB Management Service (MMS), a cloud service to simplify operations for MongoDB deployments and reduce operational overhead.

Posted October 14, 2014

To help IT organizations extend the agility provided by continuous integration into continuous delivery, VMware has unveiled a major update of VMware vRealize Operations (formerly the VMware vCenter Operations Management Suite). The new VMware vRealize Code Stream enables DevOps teams to deliver frequent, reliable software releases.

Posted October 14, 2014

Dataguise, a provider of security and data governance solutions for big data, has expanded its DgSecure platform to support Hadoop in the cloud, including full support for Amazon EMR (Elastic MapReduce). Additionally, big data cloud service providers, Altiscale and Qubole, have joined Dataguise's Big Data Protection Partner Program (BDP3) to leverage DgSecure in providing comprehensive discovery, protection and visibility to sensitive data for their cloud-based Hadoop customers.

Posted October 13, 2014

Two former Facebook engineers, Bobby Johnson and Lior Abraham, and former Intel engineer Ann Johnson have formed Interana to address what they say is an analytics void in event data. Espousing the philosophy that event data holds the key business metrics that companies care about most, Interana's solution is a database specifically designed for event time data. Many past approaches relied on general-purpose systems that were not designed to answer the types of questions posed by event data and that could take days to process it, according to the company.

Posted October 13, 2014

Splunk, which provides software for machine-generated big data analysis, has announced Splunk Enterprise 6.2, Splunk Mint, and Splunk Hunk 6.2. "What we are doing with this release is fundamentally broadening the number of users that can do advanced analytics," stated Shay Mowlem, VP, product marketing at Splunk.

Posted October 13, 2014

IBM is adding new analytics capabilities to the mainframe platform, helping enable better data security and providing clients with the ability to integrate Hadoop big data. By applying analytic tools to business transactions as they occur, mainframe systems can give clients true real-time insights. With analytics on the System z platform, clients can also incorporate social media into their real-time analytics.

Posted October 13, 2014

GT Software has added enhancements to its flagship Ivory Service Suite line, incorporating greater support for big data elements and messaging formats.

Posted October 13, 2014

IBM, which has made a billion-dollar investment to broaden the use of cognitive computing, is announcing the launch of Watson World HQ today at 51 Astor Place. IBM said it chose NYC's Silicon Alley for Watson World to tap into the ecosystem of talent and capital centered around New York University, Columbia University, CUNY and Cooper Union, as well as venture capital firms and an expanding tech startup and developer community. Starting now, Watson's cognitive services and tools will be available to all users of Bluemix, IBM's open, cloud-based platform for mobile and web app development.

Posted October 13, 2014
