Key findings from a new study, "Big Data Opportunities," will be presented at Big Data Boot Camp at the Hilton New York. Big Data Boot Camp will kick off at 9 am on Tuesday, May 21, with a keynote from John O'Brien, founder and principal of Radiant Advisors, on the dynamics and current issues faced in today's big data analytics implementations. Directly after the opening address, David Jonker, senior director of Big Data Marketing, SAP, will showcase the results of the new big data survey, which revealed a variety of practical approaches that organizations are adopting to manage and capitalize on big data. The study was conducted by Unisphere Research, a division of Information Today, Inc., and sponsored by SAP.
At SAPPHIRE NOW in Orlando, SAP AG unveiled SAP HANA service pack 6 (SP6). In a key new capability, HANA now enables real-time insights across heterogeneous systems - including Hadoop. In addition, there are new geospatial capabilities, as well as a new open hardware architecture deployment model for tailored integration with customers' existing data centers. The added capabilities are part of a continued expansion of HANA as a platform, and will be available by the end of this quarter, David Jonker, senior director of Big Data Marketing, SAP, tells 5 Minute Briefing.
OpenText, a provider of enterprise information management (EIM) software, announced the expansion of the OpenText ECM Suite for SAP Solutions in support of SAP's latest technologies, including the SAP HANA platform, and cloud and mobility solutions. This support builds on a partnership that spans more than two decades and is growing every year, Patrick Barnert, senior vice president, Partners and Alliances, OpenText, tells 5 Minute Briefing. OpenText, he adds, is the first SAP ISV partner to have its products fully tested and confirmed by SAP to be integrated with SAP Business Suite powered by SAP HANA.
Informatica Corporation, a provider of data integration software, has introduced Informatica Cloud Summer 2013, the latest release of its cloud-based integration and data management software as a service (SaaS) solutions. Cloud Summer 2013 features the new Informatica Cloud SAP Connector, new Informatica Cloud Extend capabilities, new cloud connectors and integration templates, as well as general availability of the Informatica Cloud Data Masking service.
Talend, a global open source software provider, has released version 5.3 of its integration platform, which scales the integration of data, application and business processes of any complexity. Talend version 5.3 reduces the specialized skills and development costs needed to leverage big data and Hadoop by enabling integration developers without big data expertise to develop on big data platforms.
Index Engines, an enterprise information management and archiving solutions vendor, has released the Catalyst Data Profiling Engine, providing a cost- and time-effective solution to big data issues. Processing all forms of unstructured files and document types, it creates a searchable index of what exists, where it's located, who owns it, when it was last accessed and what key terms are in it. This information is provided through summary reports for immediate insight into enterprise storage.
BI software vendor Yellowfin has signed an agreement with DIKW, a Dutch provider of business intelligence consulting, training and services. Through this reseller agreement, DIKW will make Yellowfin's solution, and associated training and implementation services, available to its clients and businesses throughout Benelux (Netherlands, Belgium and Luxembourg).
IBM unveiled new business process and integration software and services that will help organizations accelerate adoption of big data, cloud, mobile and social business technologies. The cornerstone of this strategy is IBM MessageSight, a new appliance designed to help organizations manage and communicate with the billions of mobile devices and sensors found in systems such as automobiles, traffic management systems, smart buildings and household appliances.
IBM is spearheading the development and adoption of Kernel-based Virtual Machine (KVM) for financial services firms. On May 2, IBM will launch the KVM Center of Excellence for Wall Street located in New York City with a grand opening event for IBM clients and business partners in the financial services industry. This center will act as the hub for KVM-related training, client and partner briefings and thought leadership.
Trends & Applications
NoSQL databases are becoming increasingly popular for analyzing big data. Few NoSQL solutions, however, provide the combination of scalability, reliability and data consistency required in a mission-critical application. As the open source implementation of Google's BigTable architecture, HBase is a NoSQL database that integrates directly with Hadoop and meets these mission-critical requirements.
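The BigTable architecture that HBase implements stores data as a sparse map from a row key to "column family:qualifier" columns, so different rows can carry entirely different columns. The following Python sketch is an analogy only; the class and method names are invented for illustration and are not the HBase client API.

```python
# Illustrative sketch of the BigTable/HBase data model: a sparse map of
# row key -> "family:qualifier" column -> value. This is a plain-Python
# analogy for explanation, NOT the actual HBase client API.
from collections import defaultdict


class SparseTable:
    def __init__(self):
        # Each row stores only the columns actually written (sparse rows).
        self.rows = defaultdict(dict)

    def put(self, row_key, family, qualifier, value):
        self.rows[row_key][f"{family}:{qualifier}"] = value

    def get(self, row_key, family, qualifier):
        # Missing columns simply return None; no fixed schema is enforced.
        return self.rows[row_key].get(f"{family}:{qualifier}")


table = SparseTable()
table.put("user1", "info", "name", "Alice")
table.put("user2", "info", "email", "bob@example.com")  # different columns per row

print(table.get("user1", "info", "name"))   # Alice
print(table.get("user2", "info", "name"))   # None - column never written
```

The absence of a fixed schema in this sketch mirrors why HBase scales where a rigid relational schema would not: rows are independent, and new columns can appear without any table-wide change.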
Pick Cloud, Inc. has signed a reseller agreement with AccuSoft Enterprises to sell AccuSoft's line of AccuTerm terminal emulation products. According to Pick Cloud, the reseller agreement enables Pick Cloud to offer and leverage AccuSoft's GUI development tools as well as its connectivity and terminal emulation products.
DBTA - COLUMNS
Notes on NoSQL
The term "NoSQL" is widely acknowledged as an unfortunate and inaccurate tag for the non-relational databases that have emerged in the past five years. The databases that are associated with the NoSQL label have a wide variety of characteristics, but most reject the strict transactions and stringent relational model that are explicitly part of the relational design. The ACID (Atomicity, Consistency, Isolation, Durability) transactions of the relational model make it virtually impossible to scale across data centers while maintaining high availability, and the fixed schemas defined by the relational model are often inappropriate in today's world of unstructured and rapidly mutating data.
Computer games have been front-runners in many important developments in the IT industry, including digital distribution, cloud storage, user-driven design and crowdsourcing. So it's not surprising that game developers are in a leading position when it comes to big data analytics and machine learning. Online games have the ability to monitor all aspects of player behavior, so, just as Google is able to refine your search results by analyzing your previous searches and comparing them to the billions of searches done every day, online game companies are able to modify game behavior to ensure a better game experience by observing what works - and what doesn't - in the gamer's world.
It seems that juggling is the most useful of all skills when embarking on a data warehousing project. During the discovery and analysis phase, the workload grows insanely large, like some mutant science fiction monster. Pressures to deliver can encourage rampant corner-cutting to move quickly, while the need to provide value urges caution in order not to throw out the proverbial baby with the bath water as the project speeds along. Change data capture is a glaring example of this necessary juggling and balancing.
Data is not sedentary. Once data has been created, organizations tend to move it around to support many different purposes—different applications, different geographies, different users, different computing environments, and different DBMSs. Data is copied and transformed and cleansed and duplicated and stored many times throughout the organization. Different copies of the same data are used to support transaction processing and analysis; test, quality assurance, and operational systems; day-to-day operations and reporting; data warehouses, data marts, and data mining; and distributed databases. Controlling this vast sea of data falls to the DBA, who uses many techniques and technologies to facilitate data movement and distribution.
SQL Server Drill Down
When you decide to undertake your own benchmarking project, writing up a benchmarking plan is a strongly recommended best practice. A benchmark must produce results that are both reliable and repeatable so that it supports conclusions that are predictable and actionable. Keeping the "reliable and repeatable" mantra in mind necessitates a few extra steps.
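A plan built around the "reliable and repeatable" mantra typically fixes the warm-up policy, run count, and summary statistics up front, so every run of the benchmark is comparable. A minimal sketch of that idea in Python, with invented function names and illustrative run counts (not a SQL Server-specific harness):

```python
# Minimal sketch of a repeatable micro-benchmark: discard warm-up runs,
# time several measured runs, and report median and spread so results
# from different sessions can be compared on equal footing.
import statistics
import time


def benchmark(workload, runs=5, warmup=1):
    # Warm-up runs absorb one-time costs (cold caches, connection setup)
    # that would otherwise make the first measurement unrepresentative.
    for _ in range(warmup):
        workload()
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        workload()
        timings.append(time.perf_counter() - start)
    # Median is more robust to outliers than the mean; the standard
    # deviation shows whether the runs were actually repeatable.
    return {"median": statistics.median(timings),
            "stdev": statistics.stdev(timings)}


result = benchmark(lambda: sum(range(100_000)))
print(result)
```

Recording both a central value and a spread is what makes the "repeatable" claim checkable: if the standard deviation is large relative to the median, the benchmark plan, not the system under test, needs revisiting first.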