Big Data

The well-known three Vs of big data (volume, variety, and velocity) are placing increasing pressure on organizations that must manage this data while also extracting value from the deluge for predictive analytics and decision making. Big data technologies, services, and tools (Hadoop, MapReduce, Hive, NoSQL/NewSQL databases, data integration techniques, in-memory approaches, and cloud platforms) have emerged to help meet the challenges posed by the flood of web, social media, Internet of Things (IoT), and machine-to-machine (M2M) data flowing into organizations.



Big Data Articles

Just when you thought NoSQL meant the end of SQL, think again, and hold on to your relational database administrator like it's 1999. NoSQL has proven to be a resilient next-generation database technology for increasingly common internet-era specialized workloads. Now, nearly a decade after its arrival on the scene, NoSQL is moving beyond architectural marvels to become a practical tool in the software development toolkit and, in the process, is rediscovering tried-and-true capabilities long regarded as the scalpels of the enterprise relational database. Let's go back to the future and look at how the DBA is becoming as relevant as ever as NoSQL evolves for the enterprise.

Posted May 19, 2015

RedPoint Global was founded in 2006 by Dale Renner, Lewis Clemmens, and George Corugedo, who previously had worked together at Accenture. Based in Wellesley, Mass., RedPoint collaborates with clients around the world in 11 different verticals. "We have always been very focused on the data, and recognize that a lot of business problems live and die by the quality of the data," says Corugedo.

Posted May 19, 2015

Hadoop is contributing to the success of data analytics. Anad Rai, IT manager at Verizon Wireless, examined the differences between traditional and big data analytics at Data Summit 2015 in a session titled "Analytics: Traditional Versus Big Data." The presentation, part of the IOUG track moderated by Alexis Bauer Kolak, education manager at the IOUG, showed how big data technologies are aiding data discovery and improving the transformation of information and knowledge into wisdom.

Posted May 14, 2015

At Data Summit 2015 in New York City, Tony Shan, chief architect at Wipro, gave a talk on the key components of a successful big data methodology and shared lessons learned from real-world big data implementations. According to Shan, a successful big data framework follows an 8-step process with specific techniques and methods.

Posted May 14, 2015

The data lake is one of the hottest topics in the data industry today: a massive storage reservoir that allows data to be stored in its rawest forms. Hadoop Day at Data Summit 2015 concluded with a panel on all things data lake, featuring James Casaletto, solutions architect for MapR; Joe Caserta, president and founder of Caserta Concepts; and George Corugedo, CTO of RedPoint Global Inc.

Posted May 14, 2015

With the influx of big data solutions and technologies comes a bevy of new problems, according to Data Summit 2015 panelists Miles Kehoe, search evangelist at Avalon Consulting, and Anne Buff, business solutions manager for SAS best practices at the SAS Institute. Kehoe and Buff opened the second day of Data Summit with a keynote discussion focusing on resolving data conundrums.

Posted May 14, 2015

To transform data into value, IT must move from thinking about what it does to data, and instead focus on business outcomes and what can be done with the data to advance the business, according to Edd Dumbill, vice president, strategy, Silicon Valley Data Science, who gave the welcome keynote at Data Summit 2015.

Posted May 14, 2015

Data has changed and, with this change, cracks in the armor of traditional data warehousing approaches are forming. The concept of the data warehouse is still sound. However, businesses can be more successful by acknowledging that the traditional enterprise data warehouse cannot solve all problems today.

Posted May 14, 2015

Ian Abramson, principal senior consultant, SWI Systemware Innovation Corporation, Oracle Ace, and IOUG past president, provided a look back at the evolution of BI and a peek at what may lie ahead for analytics in a talk titled "Analytics in the Time of Big Data," as part of the IOUG track at Data Summit 2015 in New York.

Posted May 13, 2015

If used correctly, machine data can give a company a significant advantage in understanding user and machine behavior, fighting cybersecurity risks and fraud, and improving service levels and the customer experience. In his talk at Data Summit 2015, Dejan Deklich, vice president, engineering platform and cloud at Splunk, discussed issues around machine data analysis and showcased some prominent use cases.

Posted May 13, 2015

At Data Summit 2015 in New York, James Casaletto of MapR and David Teplow of Integra provided deep dives into the world of Hadoop, past, present, and future.

Posted May 12, 2015

In order to break down barriers in creating and storing data, understanding the modern data architecture is key. That was the focus of a Data Summit 2015 presentation by Mike Lamble, CEO at Clarity Solution Group, and Ron Huizenga, product manager at Embarcadero Technologies.

Posted May 12, 2015

Capgemini is extending its long-standing strategic partnership with SAP, allowing Capgemini to act as a single point of contact for customers globally, and delivering SAP products and support services through one consolidated framework. By signing a global value-added reseller (VAR) agreement with SAP, Capgemini is among a select group of global SAP partners that are part of the global program, which has specific entry requirements that include global reach, reseller capabilities and revenue targets.

Posted May 12, 2015

Splice Machine is partnering with Talend to enable customers to simplify data integration and streamline data workflows on Hadoop. Through the partnership, organizations building operational data lakes with Splice Machine can take advantage of Talend's data integration technology and its data quality capabilities.

Posted May 12, 2015

Pentaho users will now be able to run Apache Spark within Pentaho thanks to a new native integration that enables the orchestration of all Spark jobs. The integration of Spark with Pentaho Data Integration (PDI), an effort initiated by Pentaho Labs, will enable customers to increase productivity, reduce maintenance costs, and dramatically lower the skill sets required as Spark is incorporated into big data projects.

Posted May 12, 2015

HP has made multiple contributions to the OpenStack Kilo release, including new converged storage management automation and new flash storage technologies to support flexible, enterprise-class clouds. HP's storage contributions to the OpenStack Kilo release focus on two strategic goals.

Posted May 11, 2015

Teradata has enhanced the Teradata Database's hybrid row and column capabilities to provide quicker access to data stored in columnar tables and drive faster query performance. Other relational database management systems store data tables in rows or in columns, and each method offers benefits depending on the application and type of data; until now, however, the two have been mutually exclusive. Teradata's new hybrid row and column capabilities offer the best of both worlds.

Posted May 08, 2015
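The row-versus-column tradeoff behind such hybrid designs can be shown with a minimal, purely illustrative sketch in Python (hypothetical data; not Teradata's actual storage format): a row layout keeps each record's fields together, which favors whole-record lookups, while a column layout keeps each field's values together, which favors single-column analytic scans.

```python
# Illustrative sketch of row vs. column storage (hypothetical data).
rows = [
    {"id": 1, "region": "east", "sales": 100},
    {"id": 2, "region": "west", "sales": 250},
    {"id": 3, "region": "east", "sales": 175},
]

# Row layout: each record's fields stay together -> cheap whole-record reads.
def get_record(rows, rec_id):
    return next(r for r in rows if r["id"] == rec_id)

# Column layout: each field's values stay together -> cheap column scans.
columns = {key: [r[key] for r in rows] for key in rows[0]}

def total_sales(columns):
    # An aggregate touches only the "sales" column, not the other fields.
    return sum(columns["sales"])

print(get_record(rows, 2))   # {'id': 2, 'region': 'west', 'sales': 250}
print(total_sales(columns))  # 525
```

A transactional lookup favors the row layout; an analytic aggregate favors the columnar one, which is why a hybrid can serve mixed workloads.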

Cloudera is now offering support for Capgemini's new reference architecture for the SAP HANA platform and Cloudera Enterprise. "By bringing the power of Cloudera's enterprise data hub offering to the ecosystem in support of SAP HANA, we can enable Capgemini's clients to expand the amount of data they have within their environment in a cost-efficient manner," said Tim Stevens, vice president of corporate and business development at Cloudera.

Posted May 08, 2015

It's not enough to collect data. In order for data to provide advantage, you have to drive your business with analytics on that data.

Posted May 07, 2015

The certification enables Nimble Storage to participate in SAP's program for SAP HANA tailored data center integration using its certified solutions. Through participation in the program, customers can leverage their existing hardware and infrastructure components for their SAP HANA-based environments, providing further choice for organizations even when working in heterogeneous environments.

Posted May 07, 2015

Software AG has made updates to its Terracotta In-Memory Data Management platform. New improvements in Terracotta Open Source Kit 4.3 include distributed storage and off-heap storage. The platform is used to boost performance and scalability and to build real-time applications. Additionally, Terracotta helps developers leverage in-memory storage for current and emerging data workloads.

Posted May 07, 2015

Tableau's cloud analytic solution, Tableau Online, is being upgraded to version 9.0. The new release enables faster performance, and provides additional live database connection support, single sign-on, and other new features designed to help users do more with their data in the cloud. The new update brings a complete redesign of Tableau Online to deliver a faster, more scalable, resilient, and extensible platform with capabilities such as parallel queries, query fusion, vectorization and smarter query caches that will make Tableau Online as much as 10 times faster.

Posted May 07, 2015

When databases are built from a well-designed data model, the resulting structures provide increased value to the organization. The value derived from the data model exhibits itself in the form of minimized redundancy, maximized data integrity, increased stability, better data sharing, increased consistency, more timely access to data, and better usability.

Posted May 06, 2015
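The redundancy and integrity benefits of a well-designed data model can be seen in a small, hypothetical example (plain Python rather than SQL, purely for illustration): a denormalized layout repeats a customer's name on every order, so a change risks inconsistency, while a normalized model stores the name once and references it by key.

```python
# Hypothetical example of how a normalized model minimizes redundancy.
# Denormalized: the customer's name is repeated on every order row, so a
# name change must touch many rows (an update anomaly).
orders_denormalized = [
    {"order_id": 1, "customer_id": 7, "customer_name": "Acme", "total": 50},
    {"order_id": 2, "customer_id": 7, "customer_name": "Acme", "total": 75},
]

# Normalized: the name lives in one place; orders reference it by key.
customers = {7: {"name": "Acme"}}
orders = [
    {"order_id": 1, "customer_id": 7, "total": 50},
    {"order_id": 2, "customer_id": 7, "total": 75},
]

def order_with_customer(order):
    # Reconstruct the denormalized view with a join-style lookup.
    return {**order, "customer_name": customers[order["customer_id"]]["name"]}

customers[7]["name"] = "Acme Corp"      # one update...
print(order_with_customer(orders[0]))   # ...is seen by every order
```

The single update propagating to every order is the "maximized data integrity" the model provides; a relational database enforces the same idea with foreign keys and joins.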

Pivotal has made updates to its big data suite that include upgrades to the Pivotal HD enterprise-grade Apache Hadoop distribution, which is now based on the Open Data Platform core, and performance improvements for Pivotal Greenplum Database.

Posted May 05, 2015

As organizations embark on big data projects, how do they choose among all the diverse players across NoSQL, NewSQL, and Hadoop? In this special DBTA roundtable webcast, you'll learn why open standards matter to your big data investment.

Posted May 05, 2015

The Spring 2015 release of the SnapLogic Elastic Integration Platform extends the platform's cloud and big data integration capabilities to the Internet of Things (IoT) with support for Message Queuing Telemetry Transport (MQTT), a lightweight machine-to-machine connectivity protocol.

Posted May 05, 2015
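One reason MQTT suits constrained M2M devices is its compact fixed header, whose "remaining length" field is a variable-length integer: each byte carries 7 bits of the value, and the high bit flags that another byte follows. A small sketch of that encoding (based on the MQTT 3.1.1 specification; not SnapLogic code):

```python
def encode_remaining_length(n: int) -> bytes:
    """Encode an MQTT 'remaining length' as a variable-length integer.

    Each byte carries 7 bits of the value; the high bit (0x80) signals
    that another byte follows. Values up to 268,435,455 fit in 4 bytes.
    """
    if not 0 <= n <= 268_435_455:
        raise ValueError("remaining length out of range")
    out = bytearray()
    while True:
        byte, n = n % 128, n // 128
        if n > 0:
            byte |= 0x80  # continuation bit: more bytes follow
        out.append(byte)
        if n == 0:
            return bytes(out)

print(encode_remaining_length(321).hex())  # c102
```

Small packets thus pay only one length byte of overhead, which is part of what makes the protocol "lightweight" relative to heavier transports.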

The importance of BI analytics has skyrocketed with the growth in data over the past few years. To provide more insight on this topic, Eldad Farkash, founder and CTO of Sisense, will be speaking at Data Summit in New York City on the topic of "Succeeding with Big Data Analytics."

Posted May 05, 2015

Deep Information Sciences has closed $8 million in Series A funding, bringing the total invested in Deep to $18 million. The funding will support the growth of the Deep Engine, which breaks down the performance, speed, and scale limitations of databases to help businesses uncover new insights and opportunities in big data.

Posted May 05, 2015

Dell is partnering with Datawatch Corporation to continue growing its analytics business by integrating Datawatch's interactive visualization and dashboarding capabilities directly into its Statistica advanced analytics platform.

Posted April 30, 2015

CA Workload Automation Advanced Integration 1.0 for SAP Business Warehouse has received SAP certification. Specifically, the SAP Integration and Certification Center has certified that CA Workload Automation Advanced Integration 1.0 integrates with SAP Business Warehouse to provide a unified view for jobs running in both SAP and non-SAP applications.

Posted April 30, 2015

BackOffice Associates' HiT Software division, a provider of data replication and change data capture solutions for heterogeneous database environments, has announced the release of version 8.5 of its flagship product DBMoto.

Posted April 29, 2015

In a recent DBTA webcast, Shane Johnson, senior product marketing manager, Couchbase, discussed the relationship between NoSQL and Hadoop, detailing the multiple ways to integrate NoSQL databases with Hadoop. "It's not Hadoop or Couchbase Server. It's Hadoop and Couchbase Server," said Johnson.

Posted April 28, 2015

Splice Machine, a provider of a Hadoop RDBMS, announced that it is partnering with mrc (michaels, ross & cole ltd) to certify and integrate Splice Machine's Hadoop RDBMS with mrc's m-Power platform. "Our partnership with mrc gives businesses a solution that can speed real-time application deployment on Hadoop with the staff and tools they currently have, while also offering affordable scale-out on commodity hardware for future growth," said Monte Zweben, co-founder and CEO, Splice Machine.

Posted April 28, 2015

Embarcadero Technologies, a provider of software solutions for application and database development, has unveiled the new XE7 version of ER/Studio, its flagship data architecture suite.

Posted April 28, 2015

IBM continues to post tough quarters, but its mainframe business stands out as a bright beacon. For a platform that is regularly pronounced to be on the verge of obsolescence, the mainframe has proven again and again that it provides more value to businesses than farms of commodity servers. Able to run clouds, and with the best security of any platform, mainframes continue to prove their worth.

Posted April 27, 2015

ProfitBricks, a provider of cloud infrastructure for IaaS, has announced the release of a Node.js SDK and an SDK for Ruby, written against its recently launched REST API.

Posted April 27, 2015

Cloud technology was a dominant focus at COLLABORATE 15, which took place earlier this month, according to Melissa English, president of the Oracle Applications Users Group (OAUG). "What's on top of everybody's mind is cloud strategy," English noted.

Posted April 27, 2015

Predixion Software, a developer of cloud-based predictive analytics (PA) software, announced that Software AG will lead the company's Series D funding round. The company says that this fourth round of funding, which includes participation from existing financial and strategic investors such as GE Software Ventures, will support Predixion's move into the Internet of Things (IoT) analytics market.

Posted April 27, 2015

Pivotal HAWQ is now available on the Hortonworks Data Platform (HDP), enabling the benefits of SQL on Hadoop to be leveraged by enterprises that are investing in HDP. This marks the first time that the features and capabilities of Pivotal HAWQ have been made available outside of Pivotal. The availability aligns with a common Open Data Platform (ODP) Core that allows users to leverage the best-of-breed technology across providers.

Posted April 27, 2015

The future will flourish with machines. We've been told this in pop culture for decades, from the helpful robots of the Jetsons, to the infamous Skynet of the Terminator movies, to the omniscient "computer" of Star Trek. Smart, connected devices will be ubiquitous and it's up to us, the humans, to decide what's next. But the Internet of Things (IoT) is about more than devices and data.

Posted April 23, 2015

As database administrators (DBAs) take on more responsibilities than ever before, Janis Griffin, database performance evangelist and senior DBA at SolarWinds, with 26 years of experience under her belt, is looking to calm worries over increased workloads. As the benefits of cloud become more apparent, virtualization is going to be the wave of the future, Griffin noted. Virtualizing data or databases abstracts the technical environment, she said, making it more flexible.

Posted April 23, 2015

There are many factors contributing to data environment changes, including users, technology, economics, and data itself. These four sources of change are creating opportunities to deliver competitive advantage but also new management, administration and optimization challenges.

Posted April 23, 2015

SUSE and Veristorm are partnering to provide certified high-performance Hadoop solutions that run directly on Linux on IBM z Systems, IBM Power Systems, and x86-64. Customers with IBM z Systems can team SUSE Linux Enterprise Server for System z with Veristorm zDoop, a commercial distribution of Hadoop supported on mainframes.

Posted April 23, 2015

There are actually many advantages to adopting or subscribing to a cloud-based data services infrastructure. For starters—and this may be the only reason companies need to make the move—there's the simplicity cloud and data as a service can offer. In many ways, cloud and data as a service free enterprises and their data teams from the technical intricacies of deploying systems and solutions.

Posted April 22, 2015

Progress Software has introduced a preview program for a standards-based connectivity solution to deliver fast transactions and analytics for SAP HANA. Called "Progress DataDirect ODBC" for SAP HANA, the connectivity solution will support both high-volume transactional workloads and massive analytics, provide connectivity to virtually any application including all major BI and analytics tools, and meet the demands of low latency, real-time query and analysis with superior throughput and CPU efficiency.

Posted April 22, 2015

SAP SE has announced an Industry 4.0 implementation project with GEA to address condition monitoring and predictive maintenance. GEA, a supplier for the food processing industry and a wide range of process industries, will work with SAP to optimize the performance of its separator and decanter machinery with the SAP Predictive Maintenance and Service solution, cloud edition. Based on SAP HANA Cloud Platform, the solution aims to bring together technology, sensors, and machine data with business processes, applications, and practices.

Posted April 22, 2015

A new release of the HP Haven Big Data Enterprise and OnDemand Platform incorporates advanced analytics and predictive capabilities for enterprises working with large volumes and varieties of information.

Posted April 22, 2015

While the new data stores and other software components are generally open source and incur little or no licensing costs, the architecture of the new stacks grows ever more complex, and this complexity is creating a barrier to adoption for more modestly sized organizations.

Posted April 22, 2015
