Big Data Articles
Oracle announced that fiscal 2014 Q3 total revenues were up 4% to $9.3 billion. In constant currency, Oracle's Cloud Software subscriptions revenues grew 25% and its Engineered Systems revenue grew more than 30% in the quarter, said Oracle president and CFO Safra Catz in a statement released by the company.
Posted March 19, 2014
Today, businesses have an increasingly critical dependency on their data infrastructure. If the underlying database systems are not available, manufacturing floors cannot operate, stock exchanges cannot trade, retail stores cannot sell, banks cannot serve customers, mobile phone users cannot place calls, stadiums cannot host sports games, and gyms cannot verify their subscribers' identities. Here is a look at some of the trends and how they will impact data management professionals.
Posted March 19, 2014
At an event in New York City, IBM and leading researchers at the New York Genome Center (NYGC) unveiled a new collaboration to advance genomic medicine using IBM Watson, the cognitive computing system that won fame and fortune in its 2011 Jeopardy! competition.
Posted March 19, 2014
Cloudera has closed on a new round of funding for $160 million which will be used to further drive the enterprise adoption of and innovation in Hadoop and promote the enterprise data hub (EDH) market; support geographic expansion into Europe and Asia; expand its services and support capabilities; and scale the field and engineering organizations. The funding round was led by T. Rowe Price, and included an investment by Google Ventures and an affiliate of MSD Capital, L.P., the private investment firm for Michael S. Dell and his family.
Posted March 18, 2014
Forward-looking CIOs and IT organizations are exploring new strategies for tapping into non-traditional sources of information such as websites, tweets, and blogs. While this is a step in the right direction, it misses the bigger picture of the big data landscape.
Posted March 17, 2014
Pivotal has introduced Pivotal HD 2.0 and Pivotal GemFire XD, which, along with the HAWQ query engine, form the foundation for the Business Data Lake architecture, a big data application framework for the enterprise.
Posted March 17, 2014
Along with big data come some fundamental challenges. The biggest is that big data cannot be analyzed using standard analytical software. A new technology called "textual disambiguation" determines the specific context of raw, unstructured text.
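No vendor API is documented in this brief, so the following is only a toy sketch of the idea behind textual disambiguation: an ambiguous token in raw text is resolved by the words that surround it. The rules, tokens, and meanings here are hypothetical.

```python
# Toy illustration of context-based disambiguation (not any vendor's actual implementation).
# An ambiguous token is resolved by checking which contextual keywords appear around it.

CONTEXT_RULES = {
    "ha": [                                    # "ha" is ambiguous in raw medical text
        ({"cardiology", "patient", "ekg"}, "heart attack"),
        ({"email", "joke", "funny"}, "laughter"),
    ],
}

def disambiguate(token, surrounding_words):
    """Return a context-specific meaning for an ambiguous token, or the token unchanged."""
    key = token.lower().strip(".,;:")
    nearby = {w.lower().strip(".,;:") for w in surrounding_words}
    for keywords, meaning in CONTEXT_RULES.get(key, []):
        if keywords & nearby:                  # any contextual keyword present nearby?
            return meaning
    return token

words = "Patient admitted to cardiology after suspected HA, EKG ordered".split()
print([disambiguate(w, words) for w in words])
```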
Posted March 14, 2014
Tableau and Splunk have formed an alliance, which allows users of Tableau's visual analytics software to access machine data with Splunk Enterprise. The ability to use Tableau to visualize structured data with machine data in Splunk will enable customers to gain new business insights.
Posted March 10, 2014
SAP underscored the company's strategy, which focuses squarely on HANA and the cloud, in a webcast presented by Jonathan Becher, chief marketing officer of SAP, and Vishal Sikka, member of the Executive Board of SAP AG, Products & Innovation. The company rolled out new and enhanced offerings for the SAP HANA Cloud Platform and the SAP HANA Marketplace, new HANA pricing, and innovations on top of HANA, and announced that HANA had broken the Guinness World Record for the largest data warehouse ever built, at 12.1 PB.
Posted March 05, 2014
AvePoint Compliance Guardian Service Pack (SP) 2, the latest release of AvePoint's enterprise platform for managing information, availability, risk, and compliance, adds support for cloud and social platforms, improved incident tracking and management, and encryption and redaction.
Posted February 28, 2014
The explosion of virtual machines (VMs) has propelled the growth of data storage across all industries. Yet, despite the clear benefits, the popularity of virtual servers is placing strain on traditional data center infrastructure and storage devices. What's the solution?
Posted February 27, 2014
Database Trends and Applications (DBTA) and IBM have launched a Big Data Survey to provide a timely analysis of the key technologies, practices and strategies that businesses across different industries are using to improve confidence in their decision making. Respond with a completed survey by March 2nd, 2014, in order to be included in a drawing to win one of three $200 American Express gift cards to be awarded at the conclusion of the study.
Posted February 26, 2014
The newest version of DataStax's eponymously named enterprise database platform adds a new in-memory option as well as enterprise search enhancements to support high performance. Along with the latest release of its enterprise NoSQL database, DataStax has also introduced version 4.1 of DataStax OpsCenter which supplies improved capacity management capabilities and visual monitoring of production database clusters.
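DataStax describes the new in-memory option as a table-level setting. The sketch below, using the open source Python driver for Apache Cassandra, shows roughly what creating such a table could look like; the keyspace and table are hypothetical, and the MemoryOnlyStrategy compaction class and caching setting are assumptions drawn from DataStax's published guidance for this release, so verify them against the DSE documentation before use.

```python
# Sketch: creating a DSE-style in-memory table via the open source Python Cassandra driver.
# The MemoryOnlyStrategy compaction class and 'NONE' caching value are assumptions based on
# DataStax's documentation for the in-memory option; confirm against your DSE release.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])           # assumes a local DSE/Cassandra node
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS demo
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS demo.hot_sessions (
        session_id text PRIMARY KEY,
        user_id    text,
        last_seen  timestamp
    ) WITH compaction = {'class': 'MemoryOnlyStrategy'}
      AND caching = 'NONE'
""")
cluster.shutdown()
```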
Posted February 26, 2014
The popular "The Hitchhiker's Guide to the Galaxy" outlined, in an entertaining fashion, the core question of big data: finding the "answer to the Ultimate Question of Life, the Universe, and Everything." Today, big data describes our progress along this path toward answering all the questions by collecting and interpreting all the data, but not all data is equal and not all answers are worth computing.
Posted February 26, 2014
ISUG-TECH 2014 in Atlanta, April 14-17, will be the first technology event to explore the features and functionality in Sybase ASE 16. The conference will provide more than 150 hours of technical sessions, more than 60 hours of workshops, and more than 50 expert speakers. The workshop lineup includes the SAP ASE Quick Start for Oracle and SQL Server DBAs, which will double as a refresher course for people who have been Sybase DBAs but have not recently worked in that role.
Posted February 26, 2014
After almost two decades of delivering technology training and education to customers, partners, consultants, and employees through the SAP TechEd conference series, SAP is changing the conference name to SAP d-code.
Posted February 26, 2014
The latest release of Embarcadero's portfolio of database tools adds first-class support for Teradata in addition to updating support for the latest releases of the major RDBMSs. Overall, a key theme for the XE5 releases is an emphasis on scale, as big data, with big models and big applications, requires close collaboration across big teams, said Henry Olson, Embarcadero director of product management.
Posted February 26, 2014
In order to be effective, big data analytics must present a clear and consistent picture of what's happening in and around the enterprise. Does a new generation of databases and platforms offer the scalability and velocity required for cloud-based big data applications, or will more traditional relational databases come roaring back to handle big data challenges at all levels?
Posted February 26, 2014
IBM has agreed to acquire Cloudant, Inc., a database-as-a-service (DBaaS) provider that enables developers to create mobile and web apps. Following the close, which is expected to take place in the first quarter of 2014, Cloudant will join IBM's newly formed Information and Analytics Group led by Senior Vice President Bob Picciano, a business unit within the IBM Software & Systems Group.
Posted February 24, 2014
Infobright has added new Approximate Query functionality to the Infobright Enterprise Edition. The new functionality, now in beta testing, speeds up customer access to the results of their ad hoc queries.
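Infobright has not published the query syntax in this announcement, so the snippet below only illustrates the general idea behind approximate querying: answer an aggregate from a random sample and scale the result, trading a little accuracy for a much faster response. The event data and the 1% sample rate are hypothetical.

```python
# Toy illustration of approximate aggregation by sampling (not Infobright's implementation).
import random

random.seed(42)
events = [{"region": random.choice(["NA", "EU", "APAC"]), "amount": random.random() * 100}
          for _ in range(200_000)]

SAMPLE_RATE = 0.01
sample = [e for e in events if random.random() < SAMPLE_RATE]

# Exact vs. approximate total revenue for one region.
exact = sum(e["amount"] for e in events if e["region"] == "EU")
approx = sum(e["amount"] for e in sample if e["region"] == "EU") / SAMPLE_RATE

print(f"exact={exact:,.0f}  approx={approx:,.0f}  error={abs(approx - exact) / exact:.2%}")
```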
Posted February 21, 2014
Informatica 9.6 aims to simplify use for all types of data projects, ranging from day-to-day reporting to comprehensive data analysis. In addition, the latest release of the data integration platform delivers critical business data up to five times faster than traditional approaches, the company says.
Posted February 20, 2014
MapR Technologies, a provider of an enterprise-grade platform for NoSQL and Hadoop, is expanding its Asia-Pacific presence with a new office in Seoul and a partnership agreement with LG CNS, a global IT service provider that will provide system integration and consulting services across Korea for the MapR Distribution for Hadoop and NoSQL.
Posted February 19, 2014
Splice Machine, provider of a real-time SQL-on-Hadoop database for big data applications, has completed a $15M Series B round of funding, led by InterWest Partners, along with returning Series A investor Mohr Davidow Ventures (MDV). The investment will be used to accelerate product development and expand sales and marketing in preparation for the company's upcoming public beta offering later this quarter.
Posted February 18, 2014
Alpine Data Labs has introduced a collaboration solution that aims to help deliver on the predictive analytics potential of big data. Alpine Chorus 3.0 is a product that allows users to log in, work with colleagues, and explore data.
Posted February 18, 2014
Addressing CFOs' needs for real-time global views of their organizations and the ability to quickly forecast the effects of the dynamic world in which their businesses operate, Oracle has introduced the Real Time Bottom Line, which includes two new applications for PeopleSoft Financials customers. Engineered for Oracle Engineered Systems, the new Oracle In-Memory Applications are designed to help the office of the CFO drive business performance through faster, well-informed decisions based on real-time simulation of business, organizational and regulatory changes.
Posted February 14, 2014
Today at the Strata conference in Santa Clara, MapR Technologies unveiled the latest MapR Distribution including Hadoop 2.2 with YARN for next-generation resource management. The company also announced availability of the MapR Sandbox for Hadoop, which provides a fully-configured virtual machine installation of the MapR Distribution for Apache Hadoop to allow users to jump-start their Hadoop exploration; and the early access release of the HP Vertica Analytics Platform on MapR.
Posted February 11, 2014
Solid State Disk (SSD)—particularly flash SSD—promised to revolutionize database performance by providing a storage medium that was orders of magnitude faster than magnetic disk, offering the first significant improvement in disk I/O latency in decades. Aerospike is a NoSQL database that attempts to provide a database architecture that can fully exploit the I/O characteristics of flash SSD.
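As a concrete point of reference, here is a minimal read/write round trip against Aerospike using its official Python client. It assumes a local node on the default port, and the namespace, set, and bin names are illustrative; the flash-optimized storage engine itself is configured per namespace on the server side, not in client code.

```python
# Minimal Aerospike read/write using the official Python client (pip install aerospike).
# Assumes a local Aerospike node on the default port; namespace/set names are illustrative.
import aerospike

config = {"hosts": [("127.0.0.1", 3000)]}
client = aerospike.client(config).connect()

key = ("test", "sessions", "user42")              # (namespace, set, primary key)
client.put(key, {"last_page": "/checkout", "clicks": 17})

_, metadata, record = client.get(key)             # returns (key, metadata, bins)
print(metadata, record)

client.close()
```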
Posted February 10, 2014
When it comes to implementing a big data strategy in a Microsoft SQL Server shop, you're generally going to consider three approaches, one of which is a cloud implementation. With SQL Server 2012, and even more so with the upcoming SQL Server 2014 release, Microsoft has built out a very strong Apache Hadoop infrastructure on Windows Azure called HDInsight. Despite all its goodness, it is in the cloud—and a lot of people aren't ready to go there yet.
Posted February 10, 2014
Hortonworks and Red Hat have announced an expanded partnership, as well as the introduction of the Hortonworks Data Platform (HDP) plug-in for Red Hat Storage, now available through a beta program.
Posted February 10, 2014
Expanding its big data analytics portfolio, Apigee has acquired InsightsOne, a technology company that provides big data predictive intelligence to help businesses understand consumer preferences. The acquisition of InsightsOne adds a predictive analytics infrastructure that is consistent with Apigee's big data offerings, built on top of a massively scalable distributed processing foundation based on open source Hadoop and in-memory real-time processing.
Posted February 05, 2014
CodeFutures is now the sponsor of MapDB, a database for Java developers. According to CodeFutures, MapDB will remain open source under the Apache 2.0 license, enabling free access to the core technology.
Posted February 04, 2014
Cloudera introduced simplified packaging and pricing for its product line-up to align closely with the way customers use Hadoop. At the top of the stack is Cloudera Enterprise Data Hub Edition which gives customers everything they need to build an enterprise data hub, including unlimited supported use of all Cloudera's advanced components.
Posted February 03, 2014
Combining a high-speed operational database with in-memory analytics, the latest release of VoltDB's flagship product features a ten-fold throughput improvement for analytic queries and can perform writes and reads on millions of data events per second, according to the company. VoltDB says the database, which enables organizations to take advantage of data the moment it arrives, is purpose-built to support telecommunications billing, gaming, sensor management, smart energy, and capital markets applications.
Posted January 29, 2014
With demand growing for easy-to-use analytical applications that can make big data actionable for more users, MicroStrategy has launched MicroStrategy PRIME, a cloud-based, in-memory analytics service. The company says it has been built from the ground up to support the engineering challenges that are associated with development of powerful new information-driven apps.
Posted January 28, 2014
DocAve Online, AvePoint's software-as-a-service platform for Microsoft Office 365 management, has added enhanced data protection, governance, and reporting to help companies improve the protection and control of their cloud-based assets. The company says the policy enforcement enhancements in DocAve Online Service Pack (SP) 3, hosted on Windows Azure, enable companies to have comprehensive governance policies that will address all-in cloud, hybrid, or on-premises deployments.
Posted January 27, 2014
New funding will enable in-memory database technology provider MemSQL to expand product development, support its growing customer base, and capitalize on the market for big data technologies. The $35 million Series B funding round was led by Accel Partners.
Posted January 22, 2014
Volume is only one of the challenges organizations face. Real-time processing of in-motion high-velocity feeds is crucial to truly unlock big data's potential. A look at where data is originating and being consumed puts the opportunity and importance of velocity processing into context. What's the solution?
Posted January 20, 2014
At an event in NYC, Ginni Rometty, IBM chairman, president and CEO, unveiled IBM's new Watson Group, which is aimed at enabling a new class of software, services and apps that think, improve by learning, and discover answers and insights to complex questions from massive amounts of big data. IBM will invest more than $1 billion into the Watson Group, which will be headquartered in New York City's Silicon Alley technology hub.
Posted January 09, 2014
It's time to look back at some of the most interesting big data blog posts of the past 12 months. These 12 posts provide warnings, tips and tricks, and often a touch of humor as well.
Posted January 08, 2014
0xdata, maker of H2O, the open source in-memory prediction engine for big data, has announced H2O's full support of Scala, a programming language and application community for big data and machine learning.
Posted December 30, 2013
Not all Hadoop packages offer a unique distribution of the Hadoop core, but all attempt to offer a differentiated value proposition through additional software utilities, hardware, or cloud packaging. Against that backdrop, Intel's distribution of Hadoop might appear to be an odd duck since Intel is not in the habit of offering software frameworks, and the brand, while ubiquitous, is not associated specifically with Hadoop, databases or big data software. However, given its excellent partnerships across the computer industry, Intel has support from a variety of vendors, including Oracle and SAP, and many of the innovations in its distribution show real promise.
Posted December 18, 2013
A new rapid-deployment solution from SAP aims to address the issue of big data storage, access, and analysis that companies are grappling with as they attempt to balance what information needs to be accessible in real time and what can be stored for historical analysis. The SAP NetWeaver Business Warehouse (BW) Near-Line Storage rapid-deployment solution facilitates seamless data transfer between the business warehouse and the near-line storage that holds historical data. This, the company says, helps limit a business's burden of housing large volumes of big data while also creating an online, accelerated retrieval system with the near-line storage.
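No SAP interfaces are shown in this announcement, so the sketch below only models the near-line concept in plain Python: recent rows stay in a hot warehouse store, older rows move to a cheaper near-line store, and a single query path reads transparently from both. The cutoff policy and data are hypothetical.

```python
# Conceptual sketch of near-line storage (not SAP's implementation): age-based tiering
# with a single query path that spans the hot warehouse and the near-line archive.
from datetime import datetime, timedelta

CUTOFF = timedelta(days=365)              # hypothetical policy: older than a year goes near-line
hot_store, nearline_store = [], []

def load(row):
    """Place a row in the hot store or the near-line store based on its age."""
    if datetime.now() - row["ts"] <= CUTOFF:
        hot_store.append(row)
    else:
        nearline_store.append(row)

def query(predicate):
    """Query both tiers transparently, as the near-line solution promises."""
    return [r for r in hot_store + nearline_store if predicate(r)]

load({"ts": datetime.now(), "order_id": 1, "amount": 120.0})
load({"ts": datetime.now() - timedelta(days=900), "order_id": 2, "amount": 75.0})
print(query(lambda r: r["amount"] > 50))
```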
Posted December 18, 2013