Big Data

The well-known three Vs of Big Data (Volume, Variety, and Velocity) are placing increasing pressure on organizations that must both manage this data and extract value from the deluge for predictive analytics and decision-making. Big Data technologies, services, and tools such as Hadoop, MapReduce, Hive, NoSQL/NewSQL databases, data integration techniques, in-memory approaches, and cloud technologies have emerged to help meet the challenges posed by the flood of web, social media, Internet of Things (IoT), and machine-to-machine (M2M) data flowing into organizations.



Big Data Articles

Big data is causing a big shift in three areas (people, process, and technology), said SAS thought leader Anne Buff, whose Data Summit keynote was dotted with examples of the rich customer insight that can be achieved, as well as the pitfalls lying in wait for companies that fail to use their data wisely.

Posted May 21, 2014

Couchbase Mobile is a new suite of products that combines mobile data synchronization and a native NoSQL mobile database. According to Couchbase, the products are designed to eliminate the network connection as a limitation for application features in order to meet the always-on expectations of today's users.

Posted May 21, 2014

Delphix has formed a new partnership with DBmaestro to increase the speed and lower the cost of application development. According to the companies, teams or individual developers can work in parallel under an enforced version control process that ensures best practices are followed, integrating and merging their individual database changes to complete application development projects in less time and at lower cost.

Posted May 20, 2014

To help clients simplify solution implementation, increase productivity, and accelerate time to value with Oracle engineered systems, Accenture is acquiring Enkitec, an Oracle Platinum partner specializing in Oracle Exadata implementations and Oracle database administration and development.

Posted May 20, 2014

The Internet of Things is resulting in more customer data at executives' fingertips than ever before. In 2009, there were 2.5 billion things connected to the internet, creating data. That number will explode to 30 billion things by 2020. So how can businesses harness the power of all that customer data being generated by this new networked reality to create insights that help build lifelong customer relationships? And how can they keep from being overwhelmed by all that noise?

Posted May 14, 2014

Dell KACE K1000 version 6.0 enables organizations to boost their Internet of Things (IoT) readiness with new features that provide greater visibility across their entire IT infrastructure, and create a foundation for improved endpoint security.

Posted May 13, 2014

Concurrent CEO Gary Nakamura says the latest release of Cascading will give enterprises the flexibility to build data-oriented applications on Hadoop once, and then run the applications on the platform that best meets their business needs. "What we are providing is a standard way to develop data-centric applications without the risk of having to rewrite those applications when distributions or the providers of the computation engines underneath it change direction one day."

Posted May 13, 2014

ParStream, provider of a real-time database for fast analytics, and BI vendor Yellowfin have formed an alliance that will enable organizations to process massive amounts of stored and streaming data, in structured and semi-structured formats, in real time, and deliver it throughout the enterprise as visualizations that support decision-makers with timely, fact-based, actionable intelligence.

Posted May 13, 2014

DataStax and Databricks are partnering to integrate Cassandra and Spark. "More and more, we see customers in the community wanting to do analytics on data in as real time as possible. That is what this is really about," said Martin Van Ryswyk, executive vice president of engineering, DataStax.

Posted May 12, 2014

The value proposition for the Splice Machine database, according to Monte Zweben, CEO and cofounder of Splice Machine, is that it enables companies to replace traditional RDBMSs when they hit a wall, either from a performance or cost perspective, with a full-featured, transactional SQL database on Hadoop, to power both operational applications and real-time analytics.

Posted May 12, 2014

More than 30 new cloud services have been added to IBM's BlueMix Platform-as-a-Service (PaaS), which is designed to help developers more quickly integrate applications and speed deployment of new cloud services. IBM is leveraging BlueMix's foundation on SoftLayer for this expansion.

Posted May 12, 2014

EMC is acquiring DSSD, Inc., a developer of rack-scale flash storage architecture for I/O-intensive in-memory databases and big data workloads. DSSD is complementary to EMC's all-flash and hybrid storage portfolio, and will support the emerging tier of next-generation in-memory and big data workloads, according to David Goulden, CEO of EMC Information Infrastructure.

Posted May 12, 2014

Every software system will inevitably have problems, but an enterprise-grade Hadoop infrastructure puts minimizing and managing these system errors at the forefront. When considering a distribution's dependability, you should evaluate it against five foundational necessities.

Posted May 08, 2014

MongoDB's Kelly Stirman and Cloudera's Yuri Bukhan recently talked with DBTA about the companies' new partnership and what it will mean for the big data ecosystem in the future. There is a need to demystify big data, they say, so that organizations can understand which technologies are right for their individual needs.

Posted May 08, 2014

At an event in NYC, Oracle president Mark Hurd and EVP John Fowler unveiled Solaris 11.2, which represents an evolution from operating system to a complete platform, they said. The phenomena of cloud and engineered systems are driven by the same requirement, which is the need to transfer work from the IT budget to the R&D budget to make things work more efficiently, observed Hurd, who asked, "Who really wants to glue an operating system to a server?"

Posted May 08, 2014

Just after data is created, there is high value attached to it. As data begins to age, its value does not diminish, but the nature of that value begins to change. For many enterprises that are dealing with large data volumes, timely data access can be a major issue, especially when customers demand quick response times. Let's examine the challenge of processing data in real time reliably and meeting customers' expectations for quick responses.

Posted April 30, 2014

CodeFutures has been granted a patent by the U.S. Patent Office for its continuous replication technology that helps DBAs perform maintenance and other processing functions without downtime. CodeFutures' new patent provides the technology backbone for dbShards/Replicate, its replication and failover solution for enterprise databases.

Posted April 29, 2014

New Power Systems servers from IBM, all built on an open server platform, enable data centers to manage big data at high speeds. Built on IBM's Power8 technology and designed for managing large data stores, the new scale-out IBM Power Systems servers are the culmination of a $2.4 billion investment and more than three years of development, and exploit the innovation of hundreds of IBM patents.

Posted April 28, 2014

Splunk is shipping a new version of its virtual environment reporting app. Version 3.1 of the Splunk App for VMware includes 200 out-of-the-box reports, the ability to identify outliers for real-time triage, and built-in correlation into storage systems including a direct drill-down into data from NetApp Data ONTAP.

Posted April 28, 2014

The spring 2014 release of the Dell Boomi AtomSphere integration platform leverages its single-instance, multi-tenant architecture to gain deep insight into customer usage metrics. New features include Boomi Resolve, which helps users solve common errors quickly by automatically listing possible solutions in order of relevance, and Predictive Assistance, which integrates near-real-time customer usage metrics to support optimization of the use of the Boomi integration platform.

Posted April 28, 2014

Helping data analysts and business users make better data-driven decisions, Alteryx Analytics 9.0 enables the blending of new and established sources of data such as social media feeds, Google Analytics, Marketo, and SAS Analytics.

Posted April 24, 2014

The Dell PowerEdge R920 server, the company's highest performing server, has achieved certification for SAP HANA, and initial SAP workload testing shows a world record 4-socket Linux benchmark.

Posted April 24, 2014

TEKsystems, an IT staffing solutions provider, says that employers are finding it increasingly difficult to hire business intelligence and security experts.

Posted April 23, 2014

Cloud database technology may be ready for the enterprise, but enterprises are not quite ready for cloud databases. Even leading cloud database proponents agree that cloud databases are a relatively new and untested phenomenon.

Posted April 16, 2014

Twitter is acquiring Gnip, a provider of social data and a long-standing Twitter data partner. "Together we plan to offer more sophisticated data sets and better data enrichments, so that even more developers and businesses big and small around the world can drive innovation using the unique content that is shared on Twitter," said Jana Messerschmidt, Twitter VP, Global Business Development & Platform, in a blog announcing the acquisition.

Posted April 15, 2014

MapR has added the complete Apache Spark technology stack to the MapR Distribution for Hadoop. Spark is an in-memory processing framework that provides speed, programming ease, and advantages for real-time processing.

Posted April 15, 2014
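Spark's core idea is applying map- and reduce-style transformations to datasets held in memory rather than re-read from disk at each step. As a rough, framework-free illustration (plain Python standing in for what Spark would distribute across a cluster), a word count in that style might look like this:

```python
from collections import Counter
from functools import reduce

# Toy stand-in for a distributed dataset: a list of text "partitions"
# that would normally live in memory on different cluster nodes.
partitions = [
    "spark keeps working data in memory",
    "memory access beats disk access for iterative jobs",
]

# Map phase: turn each partition into its own partial word counts.
mapped = [Counter(p.split()) for p in partitions]

# Reduce phase: merge the partial counts into one final result.
totals = reduce(lambda a, b: a + b, mapped)

print(totals["memory"])  # "memory" appears once in each partition -> 2
```

In actual Spark, the same logic would be expressed with RDD transformations such as `flatMap`, `map`, and `reduceByKey`, executed in parallel across the cluster's memory.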

To accommodate its expanding customer base, the safety science organization UL overhauled its IT infrastructure, implementing Oracle Exalogic Elastic Cloud and Oracle E-Business Suite. The result? The new IT environment runs up to 10 times faster than UL's previous environment, allowing UL to more quickly deliver information to its customers and drive better decision-making.

Posted April 10, 2014

Calypso Technology, Inc., an integrated capital markets platform provider, announced that Calypso version 14 has achieved Oracle Exadata Optimized status through Oracle PartnerNetwork (OPN). According to Tej Sidhu, senior vice president for Engineering and CTO at Calypso Technology, in simulations of data-intensive straight-through processing tasks, Calypso achieved performance gains of up to 500% using Exadata hardware.

Posted April 10, 2014

Kelly Stirman, director of product marketing at MongoDB, emphasized simplicity, reliability, and security in a recent conversation about the new capabilities of MongoDB 2.6. "Don't let the 2.6 fool you. It is absolutely the biggest release we have ever done."

Posted April 08, 2014

IBM, which marked the 50th anniversary of the mainframe today, looked to the future by rolling out new mobile, storage, cloud, and Hadoop big data offerings for System z. According to IBM, as it celebrates this landmark occasion, more than 70% of enterprise data resides on a mainframe, and 71% of all Fortune 500 companies have their core businesses on a mainframe.

Posted April 08, 2014

Splice Machine is the newest member of the more than 800-member Cloudera Connect Partner Program. According to Splice Machine, its technology enables Cloudera users to tap into real-time updates with transactional integrity and standard ANSI SQL, which the company says are necessary features for organizations that are looking to become real-time, data-driven businesses.

Posted April 07, 2014

InfiniDB has announced the results of a new, independent benchmark from Radiant Advisors that examined the performance of leading open source SQL-on-Hadoop query engines, including InfiniDB for Hadoop 4.0.

Posted April 07, 2014

Teradata has introduced the Teradata Database 15 with a new software product called Teradata QueryGrid that provides virtual compute capability within and beyond the Teradata Unified Data Architecture. The company also announced Teradata Active Enterprise Data Warehouse 6750 platform with new capabilities to support customers' most demanding real-time workloads.

Posted April 07, 2014

About 3 years ago, the AMP (Algorithms, Machines, People) lab was established at U.C. Berkeley to attack the emerging challenges of advanced analytics and machine learning on big data. The resulting Berkeley Data Analytics Stack—particularly the Spark processing engine—has shown rapid uptake and tremendous promise.

Posted April 04, 2014

Driving a wide range of applications, from operational applications such as fraud detection to strategic analysis such as customer segmentation, advanced analytics goes deeper than traditional business intelligence activities into the "why" of the situation, and delivers likely outcomes.

Posted April 04, 2014

Using the Oracle Database 12c Multitenant feature sets up the 12c environment in a way that provides additional flexibility and supports a move to a pluggable database architecture. If database instances are created as a CDB with a single PDB, and the requirement later arises to consolidate several of these instances, the move to a Multitenant environment is a whole lot easier.

Posted April 04, 2014

A new partnership between Hortonworks and LucidWorks, which provides a search development platform leveraging Apache Solr, will enable users throughout an organization to easily access and gain insight from big data sets that were previously available only to developers, analysts, and data scientists.

Posted April 03, 2014

The theme for COLLABORATE 14-IOUG Forum is "Become Your Office Superhero," because, while you may look like a mild-mannered technical resource in meetings or at your desk, you fight a daily battle to protect your organization's data, improve performance, and generate new business opportunities. COLLABORATE is your chance to recharge your superpowers and take on new skills.

Posted April 02, 2014

Organizations should exercise caution when it comes to implementing new technologies for data governance. Sometimes, the lowest tech solution is the best one and spending money on more software without laying the groundwork actually sets companies up to fail.

Posted April 02, 2014

Attunity has launched Maestro, a new big data platform designed to simplify mapping, executing and managing the flow of data between heterogeneous systems—both analytical and operational/transactional systems—to help organizations provide the right information to the right system when it is needed.

Posted April 02, 2014

SAP has officially announced Adaptive Server Enterprise (ASE) 16, a major release of the company's RDBMS, as well as Replication Server 15.7 SP200, which includes synchronous replication for HA/DR for SAP Business Suite on ASE. It has been 9 years since Sybase (acquired by SAP in 2010) announced ASE 15 in August 2005. With the new release, said Dan Lahl, SAP VP, Database & Technology Marketing, SAP is providing "the right system of record that is going to be able to handle the new world of transaction processing."

Posted April 02, 2014

Efforts in both data quality and master data management have only been partially successful. Not only is data quality difficult to achieve, it is a difficult problem even to approach. In addition, the scope of the problem keeps broadening.

Posted April 01, 2014
