Trends and Applications

In 2014, we continued to watch big data enable all things "big" about data and its business analytics capabilities. We also saw the emergence (and early acceptance) of Hadoop Version 2 as a data operating platform, with YARN (Yet Another Resource Negotiator) and HDFS (Hadoop Distributed File System) as its cornerstones. In 2015, mainstream adoption of enterprise data strategies and acceptance of the data lake will continue as data management and governance practices provide further clarity. The cautionary tale of 2014, that business outcomes rather than the hype of previous years should drive big data adoption, will likewise carry into the new year.

Posted January 21, 2015

It is no secret that we are in the data age. Data comes at us from all directions, in all shapes and sizes. Incumbent vendors and startups constantly add new features, build on top of emerging open source projects, and claim to solve the next wave of challenges. Within the Hadoop ecosystem alone, there are (at least) 11 Hadoop-related open source projects. Making sense of it all can be a time-consuming headache. To bring clarity and peace of mind, here are the top 5 big data predictions for 2015 and beyond.

Posted January 21, 2015

Registration is now open for Data Summit 2015, providing the opportunity to connect with the best minds in the industry, learn what works, and chart your course forward in an increasingly data-driven world. The event offers a comprehensive educational experience designed to guide attendees through the key issues in data management and analysis today.

Posted January 21, 2015

As data volumes continue to grow rapidly, organizations are exploring ways to store and analyze data more efficiently and cost-effectively. More than 50% of organizations surveyed by Unisphere Research in August 2014 reported that they currently use cloud-based services. Another study found that the number of big data projects being planned or in production will triple over the next 18 months. As the worlds of big data and cloud computing converge, many businesses have begun to look for ways to use the two together to create competitive advantages for themselves.

Posted January 21, 2015

Cloud is being embraced strongly at Oracle. That is the message loud and clear from the company, which brought its CloudWorld to New York City on January 13. During the event, which drew customers, analysts, and thought leaders, Oracle offered an overview of its cloud strategy and the progress the company has made thus far in addressing consumer and enterprise constituents in the cloud. Oracle's cloud strategy is very simple, said Thomas Kurian, president, Oracle Product Development. The idea is to bring Oracle's business applications, technology, software, database, middleware, analytic tools, and infrastructure "to any customer anywhere in the world through the internet browser."

Posted January 21, 2015

There's good news and bad news on the cybersecurity front, an IBM study finds. Over the past 2 years, there has been a 50% decline in the number of cyberattacks against U.S. retailers. However, the number of records stolen from them remains at near-record highs. IBM security researchers report that in 2014, cyber attackers still managed to steal more than 61 million records from retailers despite the decline in attacks, demonstrating cyber criminals' increasing sophistication and efficiency. Ironically, although IBM's Digital Analytics Benchmark identified Black Friday and Cyber Monday as the two biggest shopping days of the year, cyber attackers actually reduced their activity across all industries on those days.

Posted January 21, 2015

SQL Server 2012/2014 delivers compelling new capabilities that make an upgrade worthwhile. However, along the upgrade path, companies have also discovered key obstacles to realizing those new capabilities. Without awareness and understanding of these challenges - and their potential consequences - an organization's upgrade to SQL Server 2012/2014 can be costly, inefficient, and plagued by system errors. Here are the top 5 SQL Server 2012/2014 'gotchas' as well as tips for addressing them.

Posted January 07, 2015

We are stuck with an antiquated two-dimensional integration and implementation model which assumes that before value can be realized, we need to get components of that value into a more stable and normalized condition. A third dimension is critical for analytics and BI to stay effective against the growth of data and activities at the edge.

Posted January 07, 2015

Statisticians and data miners have been using the R language and software for analytics in research and academic environments for some time. The open source community has contributed thousands of packages to meet diverse needs across research and business analytics. And, to top off the perks of the software and language, R is free. But even in the face of these benefits, programmers still face fundamental challenges.

Posted January 07, 2015

The Sony hack not only sidelined a major film, "The Interview," but it is beginning to look as if the data breach may be worthy of a Hollywood movie itself. The incident, which exposed sensitive company and employee data, has drawn widespread scrutiny of how it happened as well as how it has been handled. The still-unfolding events are placing a renewed focus on the importance of enterprise data security and what must be done to ensure it.

Posted January 07, 2015

Similar to server, storage, and network virtualization, data virtualization simplifies how data is presented and managed for users, while employing technologies - under the covers - for abstraction, decoupling, performance optimization and the efficient use/reuse of scalable resources.
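
To make the abstraction concrete, the following is a minimal sketch of the decoupling idea in Python: consumers query one logical view while the layer hides which physical source actually holds the data. The sources and field names here (an in-memory CRM table and a SQLite "warehouse") are hypothetical and purely illustrative, not any vendor's implementation.

    import sqlite3

    # Hypothetical physical sources: an in-memory CRM table and a SQLite warehouse.
    crm_rows = [{"customer_id": 1, "name": "Acme", "region": "East"}]

    warehouse = sqlite3.connect(":memory:")
    warehouse.execute("CREATE TABLE orders (customer_id INTEGER, total REAL)")
    warehouse.execute("INSERT INTO orders VALUES (1, 250.0)")

    class VirtualCustomerView:
        """Presents one logical 'customer' view while hiding the physical sources."""

        def __init__(self, crm_source, warehouse_conn):
            self.crm = crm_source
            self.warehouse = warehouse_conn

        def get_customer(self, customer_id):
            # Federate at query time instead of copying data into a single store.
            profile = next(r for r in self.crm if r["customer_id"] == customer_id)
            total = self.warehouse.execute(
                "SELECT SUM(total) FROM orders WHERE customer_id = ?", (customer_id,)
            ).fetchone()[0]
            return {**profile, "lifetime_spend": total}

    view = VirtualCustomerView(crm_rows, warehouse)
    print(view.get_customer(1))  # {'customer_id': 1, 'name': 'Acme', 'region': 'East', 'lifetime_spend': 250.0}

The point of the design is that the consumer never learns, or cares, where each attribute physically lives; swapping a source only changes the layer underneath the view.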

Posted December 15, 2014

2015 is going to be a big year for big data in the enterprise, according to Oracle. Neil Mendelson, Oracle vice president of big data and advanced analytics, shared Oracle's "Top 7" big data predictions for 2015. "The technology is moving very quickly and it is getting to the point where a broader set of people can get into it - not just because it is affordable - but because they no longer require specialized skills in order to take advantage of it," he said.

Posted December 15, 2014

2014 has been described as the year of the data breach. But 2015 will be the year of the regulator, says Suni Munshani, CEO of Protegrity, a provider of data security solutions. According to the Identity Theft Resource Center, 708 breaches took place in the past year, grabbing headlines and sending warnings to retailers to prepare for the 2014 holiday shopping season now in full swing.

Posted December 15, 2014

More data has been created over the past 2 years than in all previous years combined. "Ninety percent of data that exists today has been created in the past 2 years," said Bill Brunt, product manager of Dell SharePlex, during a recent DBTA webcast. This increase in data has caused companies to look for more efficient ways of moving data within their organizations. Dell SharePlex strives to offer the best overall customer experience for those working with its software.

Posted December 15, 2014

Big data continues to have an impact on business, government, and society at large. Here's a look back at 10 noteworthy blog posts from 2014. These blog posts explore the benefits and downsides of advancements in big data technologies.

Posted December 15, 2014

The rise of digital platforms is spurring new innovation and new thinking within the data management world to an unprecedented degree, and it is recasting the look, feel, and functionality of solutions and approaches to data applications. There are many business opportunities arising from digital platforms that call for proactive leadership or engagement by data managers. A prime example is the emergence of the Internet of Things, which SMAC (social, mobile, analytics, and cloud) technologies make possible.

Posted December 03, 2014

Exploding data assets and the need for greater agility are helping to drive the move to virtualization. More than two-thirds of organizations in a recent Unisphere Research survey among members of the Independent Oracle Users Group indicate that the number of Oracle databases they manage is expanding. At the same time, many managers admit that their IT departments are sluggish when it comes to responding to new business requirements. For more than 50% of organizations, it takes their IT department 30 days or more to respond to new initiatives or deploy new solutions. For one-quarter of organizations, it takes 90 days or more.

Posted December 03, 2014

Data is increasingly being recognized as a rich resource flowing through organizations from a continually growing range of sources. But to realize its full potential, this data must be accessed by an array of users to support both real-time decision making and historical analysis, integrated with other information, and still kept safe from hackers and others with malicious intent. Fortunately, leading vendors are developing products and services to help. Here, DBTA presents the list of Trend-Setting Products in Data and Information Management for 2015.

Posted December 02, 2014

The increasing pressure of compliance regulations and security policies makes the deployment of high-level database protection a must-have for any organization. For those looking for ways to advance database security, here are 5 SQL Server best practices to maintain database security and streamline compliance.

Posted November 12, 2014

The Juno release of OpenStack, which made its debut on October 16, represents a significant milestone. The success of a cloud platform fundamentally depends on the ability to easily deploy applications on that platform, and the Juno release builds on Icehouse in making OpenStack a much more compelling option for customers wishing to operate a private cloud within their own enterprise, or for service providers who wish to provide a public cloud service.

Posted November 12, 2014

Business intelligence, analytics, and just-in-time reporting have exploded in recent years. The documentation manuals for older database software are filled with tips on shutting down access to data warehouses for the weekend while data loads are performed, dropping and recreating indexes along the way for a performance boost during the load. However, today's high-velocity world does not allow for that; instead, the data supply chain must flow constantly from source to ODS to warehouse to the rich dashboards of eager executives and business analysts.
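
As a concrete illustration of the older weekend-load pattern described above, here is a minimal sketch in Python using SQLite. The table and index names are hypothetical, and a real warehouse loader would obviously run at far larger scale; the sketch only shows the drop-load-rebuild sequence itself.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales_fact (sale_id INTEGER, amount REAL)")
    conn.execute("CREATE INDEX idx_sales_amount ON sales_fact (amount)")

    def weekend_batch_load(conn, rows):
        """Classic batch pattern: drop indexes, bulk load, then rebuild them once."""
        conn.execute("DROP INDEX idx_sales_amount")  # skip per-row index maintenance during the load
        conn.executemany("INSERT INTO sales_fact VALUES (?, ?)", rows)
        conn.execute("CREATE INDEX idx_sales_amount ON sales_fact (amount)")  # rebuild after the load
        conn.commit()

    weekend_batch_load(conn, [(1, 19.99), (2, 5.00), (3, 42.50)])
    print(conn.execute("SELECT COUNT(*) FROM sales_fact").fetchone()[0])  # 3

The trade-off is exactly the one the article calls out: while the index is dropped and the load is running, the warehouse is effectively offline for the analysts who depend on it.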

Posted November 12, 2014

The opportunity and the imperative to optimize energy use in the data center industry are at an all-time high. In the modern digital era, data centers are as essential as power plants, a massive and critical infrastructure upon which our social, business, retail, healthcare, and government services run. Gathering good data streams — metrics that matter to both business and IT — and correlating them through powerful analytics will amplify bottom-line results. Here are 5 key ways more efficient power utilization can enable data centers to be more effective and efficient.

Posted October 22, 2014

We are in the midst of a business performance revolution, one where companies and customers alike expect instant access to the tools of commerce from anywhere at any time. Mobility is integral to this revolution, as the enterprise mobility phenomenon is quickly becoming a key driver of business innovation.

Posted October 22, 2014

At the most fundamental level, NoSQL and SQL databases are essentially performing the same core task — storing data to a storage medium and providing a safe and efficient way to later retrieve that data. Sounds pretty simple — right? Well, it really is with a little planning and research. Here's a simple checklist of 5 steps to consider as you embark on your journey into the world of NoSQL databases.
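
To underline that shared core task, here is a minimal sketch in Python that stores and retrieves the same record two ways: through SQL (SQLite) and through a plain dictionary standing in for a document/key-value store. The record, table, and key names are hypothetical; the point is only that both paths persist data and hand it back safely.

    import json
    import sqlite3

    record = {"id": "user:42", "name": "Ada", "plan": "pro"}

    # Relational path: a fixed schema and SQL to store and retrieve the row.
    sql = sqlite3.connect(":memory:")
    sql.execute("CREATE TABLE users (id TEXT PRIMARY KEY, name TEXT, plan TEXT)")
    sql.execute("INSERT INTO users VALUES (?, ?, ?)",
                (record["id"], record["name"], record["plan"]))
    row = sql.execute("SELECT name, plan FROM users WHERE id = ?",
                      (record["id"],)).fetchone()

    # Document/key-value path: schemaless storage keyed by id (a dict stands in for the store).
    nosql_store = {}
    nosql_store[record["id"]] = json.dumps(record)   # write
    doc = json.loads(nosql_store[record["id"]])      # read

    print(row)                        # ('Ada', 'pro')
    print(doc["name"], doc["plan"])   # Ada pro

Where the two worlds diverge is in areas such as schema flexibility, query language, and scaling model, which is where the planning and research pay off.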

Posted October 22, 2014

Apache Hadoop has been a great technology for storing large amounts of unstructured data, but to do analysis, users still need to reference data from existing RDBMS-based systems. This topic was addressed in "From Oracle to Hadoop: Unlocking Hadoop for Your RDBMS with Apache Sqoop and Other Tools," a session at the Strata + Hadoop World conference presented by Guy Harrison, executive director of Research and Development at Dell Software; David Robson, principal technologist at Dell Software; and Kathleen Ting, a technical account manager at Cloudera and a co-author of O'Reilly's Apache Sqoop Cookbook.

Posted October 22, 2014

In his presentation at the Strata + Hadoop World conference, titled "Unseating the Giants: How Big Data is Causing Big Problems for Traditional RDBMSs," Monte Zweben, CEO and co-founder of Splice Machine, addressed the topic of scale-up architectures as exemplified by traditional RDBMS technologies versus scale-out architectures, exemplified by SQL on Hadoop, NoSQL and NewSQL solutions.

Posted October 22, 2014

Today, many companies still have most of their transactional data in relational database management systems which support various business-critical applications, from order entry to financials. But in order to maintain processing performance, most companies limit the amount of data stored there, making it less useful for in-depth analysis. One alternative, according to a recent DBTA webcast presented by Bill Brunt, product manager, SharePlex, at Dell, and Unisphere Research analyst Elliot King, is moving the data to Hadoop to allow it to be inexpensively stored and analyzed for new business insight.

Posted October 22, 2014
