Percona makes MySQL faster and more reliable for nearly 2,000 customers worldwide. Founded in 2006 to provide MySQL Consulting services, we've grown rapidly with the addition of MySQL Support, Remote DBA, Training, and Server Development services. Our global workforce of nearly 100 now provides 24x7, worldwide coverage to our customer base of leading MySQL users.

Posted June 03, 2013

Big Data has changed the game for all of us. The performance and expertise required to manage and use the ever-increasing flow of data into our companies have effectively sidelined older technologies. A new generation of data analytics tools is emerging: built to scale without losing performance, able to incorporate more data from diverse sources in disparate formats, and designed to deliver high-confidence results in our on-demand world.

Posted June 03, 2013

Today's IT professionals are faced with a number of critical challenges, including rolling out mobile solutions to their employees and contractors, delivering 100% uptime, and seamlessly integrating solutions from many vendors—all while keeping costs under control. That's where Rocket Software can really bring value to organizations that rely on MultiValue databases for their mission-critical work.

Posted June 03, 2013

The volume and diversity of information being exchanged in both structured and unstructured formats in the enterprise have shifted the center of gravity for data from on-premises to cloud architectures. To take advantage of the market opportunities this Golden Age of data has opened, many organizations are innovating in the form of SaaS, or Software-as-a-Service, drawn by its inherent benefits: speedy procurement, on-demand installation, portability to any device, efficient utilization of compute, network and storage resources, and the ability to consume and repurpose services on demand.

Posted June 03, 2013

Our vision is simple: to transform the way in which real-time data is processed. Businesses want to drive latency out of their operations. Machine data generated by their servers, networks and sensors contains valuable insights into transactions, performance and fraud, for example, but it is only useful if acted on in real time. However, achieving this with today's IT technology is like driving using only your rear-view mirror: you'll never see what's coming until it's too late.

Posted June 03, 2013

Customers across a wide variety of industries know analyzing Big Data is critical to sustain competitiveness. More importantly, they are quickly realizing that traditional architectures won't help them, especially when it comes to integrating data - what we commonly know as ETL.

Posted June 03, 2013

Terracotta, Inc. is the leading provider of game-changing, high-volume, high-value Big Data management solutions for global enterprises. Its flagship product, BigMemory, is a Big Data in-memory solution that delivers performance at any scale. Terracotta's other award-winning data management solutions include Ehcache—the de facto caching standard for enterprise Java—and Quartz—a leading job scheduler.

Posted June 03, 2013

It's become clear to me that the Business Intelligence (BI) market is undergoing a period of significant change. Organizations now realize the potential and value of empowering people—throughout the enterprise—with the ability to take better, faster fact-based action.

Posted June 03, 2013

There is an emerging field of companies looking to take on the challenges presented by the roiling tide of big data. While their visions vary, each has identified a market need that it believes its technology uniquely addresses. Here, DBTA highlights the approaches of 10 companies we think are worth watching.

Posted June 03, 2013

Many argue that for big data to truly matter, you need some serious computing power and some data scientist wizards with mathematical prowess. While the volume of big data certainly can present challenges, there is also the velocity and variety of it to consider. Variety is actually the biggest concern and issue for most organizations that find it difficult to track, correlate and glean insight from data that's coming from so many diverse sources.

Posted May 23, 2013

Three things people need to think about in a big data implementation are persistence, context and access, John O'Brien, founder and principal, Radiant Advisors, told attendees during his keynote, "The Big Data Paradigm," at DBTA's Big Data Boot Camp. O'Brien's talk provided an overview of the technologies and issues that attendees would learn about during the conference which took place this week in New York City. Following the opening address, David Jonker, senior director, Big Data Marketing, SAP, highlighted the results of a new big data survey, the "2013 Big Data Opportunities Survey," which revealed a variety of practical approaches that organizations are adopting to manage and capitalize on big data.

Posted May 23, 2013

DBTA's Big Data Boot Camp provided attendees with an immersion experience in the world of big data. Presentations and panel discussions with data experts covered the value provided by Hadoop and related solutions, how it all fits into the overall enterprise data management picture, real-world use cases in which big data technologies are now being deployed, how to get started, the legal implications of big data management that organizations need to be aware of before they initiate a big data project, and how to successfully engage with customers through social media. Ten salient points emerged from the two-day conference which wrapped up at the Hilton New York on Wednesday.

Posted May 23, 2013

OpenText, a provider of enterprise information management (EIM) software, announced the expansion of the OpenText ECM Suite for SAP Solutions in support of SAP's latest technologies, including the SAP HANA platform, and cloud and mobility solutions. This support builds on a partnership that spans more than two decades and is growing every year, says Patrick Barnert, senior vice president, Partners and Alliances, OpenText. OpenText, he adds, is the first SAP ISV partner to have its products fully tested and confirmed by SAP to be integrated with SAP Business Suite powered by SAP HANA.

Posted May 23, 2013

Talend, a global open source software provider, has released version 5.3 of its integration platform, scaling the integration of data, application and business processes of any complexity. Talend version 5.3 reduces the skill sets and development costs necessary to leverage big data and Hadoop by enabling integration developers without specific expertise to develop on big data platforms. "The big challenge that organizations are facing today is not about getting the data; it's not about the platform to explore it; it's really about people who are operating the platform. There is a big shortage of developers with big data and Hadoop skills, and a big shortage of data scientists," Yves de Montcheuil, vice president of marketing at Talend, tells DBTA.

Posted May 23, 2013

NuoDB, Inc., a provider of a cloud data management system offering SQL compliance and guaranteed ACID transactions, has introduced the NuoDB Starlings Release 1.1. Following on from its 1.0 release in January, NuoDB's Starlings Release 1.1 focuses on overall usability in three key areas, Seth Proctor, NuoDB chief architect, tells DBTA. The enhancements focus on greater Microsoft Windows support, general performance and stability, and an improved development and management experience in the web console, says Proctor.

Posted May 23, 2013

What does it mean to be a "real-time enterprise"? It means having a well-connected organization, free of the shackles of siloed and slow-moving data environments. It means being able to compete in today's hyper-competitive economy by giving decision makers immediate access to the right information, at the right time.

Posted May 23, 2013

A new release of Oracle Secure Global Desktop, part of Oracle's Desktop Virtualization portfolio, extends secure access to cloud-hosted and on-premises enterprise applications and desktops from Apple iPad and iPad mini tablets, without the need for a VPN client. Now, through support of the HTML5 standard, Oracle Secure Global Desktop 5.0 allows tablet users to access enterprise applications with just a web browser, so they can use their own devices for work. The new release provides tablet users, in addition to PC, Mac and desktop users, certified access to Oracle Exalogic Elastic Cloud and web-based applications such as Oracle E-Business Suite, Oracle Siebel CRM, Oracle Primavera, and many others.

Posted May 23, 2013

NoSQL databases are becoming increasingly popular for analyzing big data. There are very few NoSQL solutions, however, that provide the combination of scalability, reliability and data consistency required in a mission-critical application. As the open source implementation of Google's BigTable architecture, HBase is a NoSQL database that integrates directly with Hadoop and meets these requirements for a mission-critical database.

Posted May 09, 2013


As machines increasingly are fitted with internet and other network access, enterprises will be able to capture, and will increasingly be expected to respond to, more customer data than ever before. Machine-to-machine (M2M) network connections—the so-called "Internet of Things"—are positioned to become the next source of major competitive advantage. Whatever you call it, M2M is turning out to be the poster child for big data's "Three Vs": Volume, Velocity and Variety. What M2M data requires is a fourth "V" (Visualization) to convert its big data into value by giving users the ability to identify data patterns through real-time analytics.

Posted April 25, 2013

A recent webcast presented by DBTA and Tableau Software on the Top Trends in Business Intelligence in 2013 is now available on-demand. Unisphere Research analyst Joe McKendrick made the case that by now it is well accepted that the ability to leverage data and analytics can be a critical game-changer for organizations, perhaps more so than even pricing or product innovation. However, too often the power of analytics is still not accessible across a broad enough range of enterprise users to make that strong impact. In light of the high volume of data that exists today, visualizations can be highly effective in presenting data, noted Tableau executive Suzanne L. Hoffman.

Posted April 25, 2013

The conference agenda as well as the list of speakers is now available for DBTA's Big Data Boot Camp, a deep dive designed to bring together thought leaders and practitioners who will provide insight on how to collect, manage, and act on big data. The conference will be held May 21-22 at the Hilton New York. SAP is the diamond sponsor, and Objectivity and MarkLogic are platinum sponsors of the two-day event.

Posted April 25, 2013

IBM announced it has enhanced the Tivoli System Automation family with a product focused on helping smaller zEnterprise customers. The IBM Automation Control for z/OS (IACz) product is targeted at single System z customers looking to move from manual scripting to policy-based automation.

Posted April 25, 2013

MarkLogic Corporation, the provider of an enterprise NoSQL database platform, announced that it has closed a $25 million round of growth capital led by Sequoia Capital and Tenaya Capital, with participation from Northgate Capital. MarkLogic CEO Gary Bloom also made a personal investment in this financing round. With capital to fuel sales and marketing, MarkLogic seeks to go after the broader market of enterprise class customers while also targeting several key areas for feature expansion, Bloom told DBTA.

Posted April 25, 2013

Confio Software released version 8.3 of its Ignite database performance monitoring software at COLLABORATE 13 this week. Ignite 8.3 enhancements were developed specifically to address the needs of DBAs with very large database deployments spread out geographically as well as enterprise-level requirements for security and compliance, Don Bergal, chief marketing officer of Confio, tells DBTA.

Posted April 25, 2013

Cloud computing has become a mainstream business technology strategy that is delivering the agility and flexibility that businesses need to move forward. To meet the requirements cloud brings to enterprises, new breeds of databases are emerging—either running in the cloud, or designed to optimize enterprise cloud computing.

Posted April 10, 2013

Big data has unceremoniously ended the era of the "all-purpose database." The days of sticking uniform data into a single database and running all your business applications off it are gone. Business data today comes in a variety of formats, from countless sources, in huge volumes and at fantastic speeds. Some data is incredibly valuable the instant it arrives; other data is only valuable when combined with large amounts of additional data and analyzed over time.

Posted April 10, 2013

In business, the rear view mirror is clearer than the windshield, said the sage of Omaha. And that is particularly true of business intelligence, composed almost entirely of such retrospectives. Consider this: Business intelligence proffers neatly organized historical data as a potential source of hindsight. Of course, there are also the dashboards of happenings in the "now" but precious little in terms of prompts to timely action. The time required to traverse that path from data to insight to intelligence to ideas to implementation to results is often the culprit. It's nowhere near quick enough, especially for businesses like banking, telecommunications and healthcare that set great store by the time value of information and the money value of time.

Posted April 10, 2013

Big data—a now well-used term intended to define the growing volume, variety, velocity, and value of information surging through organizations—has been on our radar screens for more than 2 years. In the process, it has become more than a buzz phrase thrown about at conferences and in the trade press—big data is now seen as the core of enterprise growth strategies.

Posted March 27, 2013

Dell Software is rolling out the latest version of its Kitenga Analytics solution, which extends the analysis of structured, semi-structured and unstructured data stored in Hadoop. Kitenga was acquired by Dell along with Quest Software in September 2012.

Posted March 27, 2013

10gen, the MongoDB company, has released MongoDB 2.4, featuring hash-based sharding, capped arrays, text search, and geospatial enhancements. 10gen has also introduced MongoDB Enterprise as part of a new MongoDB Enterprise subscription level, featuring new monitoring and security features including Kerberos authentication and role-based privileges.

Posted March 27, 2013

The Independent Oracle Users Group (IOUG) will celebrate its 20th anniversary at COLLABORATE 13, a conference on Oracle technology presented jointly by the IOUG, OAUG (Oracle Applications User Group) and the Quest International User Group. The event will be held April 7 to 11 at the Colorado Convention Center in Denver. As part of the conference, the IOUG will host the COLLABORATE 13-IOUG Forum with nearly 1,000 sessions providing user-driven content. The theme of this year's COLLABORATE 13-IOUG Forum is "Elevate - take control of your career and elevate your Oracle ecosystem knowledge and expertise," says IOUG president John Matelski.

Posted March 27, 2013

IBM announced that all cloud services and software will be based on an open cloud architecture. As the first step, IBM unveiled a new private cloud offering based on the open sourced OpenStack software that it says speeds and simplifies managing an enterprise-grade cloud. The offering provides businesses with a core set of open source-based technologies to build enterprise-class cloud services that can be ported across hybrid cloud environments. The IBM announcement "goes a long way" to position OpenStack against other more proprietary solutions, Jim Curry, senior vice president and general manager of Rackspace's Private Cloud business, tells DBTA.

Posted March 27, 2013

At the recent Strata conference, CitusDB showcased the latest release of its scalable analytics database. According to the vendor, CitusDB 2.0 brings together the performance of PostgreSQL and the scalability of Apache Hadoop, and enables real-time queries on data that's already in Hadoop. This new functionality is possible with CitusDB's distributed query planner, and PostgreSQL's foreign data wrappers.

Posted March 27, 2013

Two big questions are on the minds of data professionals these days. How are increasing complexity and the inevitable onslaught of big data shaping the future of database administrators and data architects? How will our roles change? In the interest of studying the evolving landscape of data, the Independent Oracle Users Group (IOUG) took the pulse of the community. The Big Data Skills for Success study polled numerous individuals in the IOUG Oracle technology community to identify just how the responsibilities of handling data are changing and what the future of these roles looks like.

Posted March 14, 2013

Designing solutions that address MultiValue database customers' evolving needs in an affordable way while also maintaining the core attributes on which they have built their IT infrastructure has never been more important. Mobility, software-as-a-service, enhanced analytics on burgeoning data volumes, and easy integration with other computing platforms are areas on which MV companies are focusing their efforts. In this special report, DBTA asks leading MV vendors: What are the new capabilities you are implementing to help fulfill customers' changing requirements? Hear from Revelation Software's Mike Ruane and Rocket U2's Susie Siegesmund; Kore Technologies' Ken Dickinson and Entrinsik's Doug Leupen; as well as BlueFinity's David Cooper, jBASE's David Peters, and Pick Cloud's Mark Pick.

Posted March 14, 2013

Data keeps growing, systems and servers keep sprawling, and users keep clamoring for more real-time access. The result of all this frenzy of activity is pressure for faster, more effective data integration that can deliver more expansive views of information, while still maintaining quality and integrity. Enterprise data and IT managers are responding in a variety of ways, looking to initiatives such as enterprise mashups, automation, virtualization, and cloud to pursue new paths to data integration. In the process, they are moving beyond the traditional means of integration they have relied on for years to pull data together.

Posted March 14, 2013

Databases are restricted by reliance on disk-based storage, a technology that has been in place for several decades. Even with the addition of memory caches and solid state drives, the model of relying on repeated access to information storage devices remains a hindrance in capitalizing on today's "big data," according to a new survey of 323 data managers and professionals who are part of the Independent Oracle Users Group (IOUG). The survey was underwritten by SAP and conducted by Unisphere Research, a division of Information Today, Inc.

Posted March 14, 2013

A new survey of nearly 200 data managers and professionals, who are part of the Independent Oracle Users Group (IOUG), looks at the role of data scientists - data professionals who can aggregate data from internal enterprise data stores as well as outside sources to provide the forecasts and insight required to help lead their organizations into the future. The research was conducted by Unisphere Research, a division of Information Today, Inc.

Posted February 27, 2013

The continued expansion of structured and unstructured data storage seems to be never-ending. At the same time, database administrators' need to reduce their storage consumption is accelerating as its cost becomes more visible. Today, however, there are data optimization technologies available that can help with the continued data growth.

Posted February 27, 2013

SAP AG has introduced a new version of its Sybase IQ disk-based column store analytics server. The overriding theme of this new release, which will be generally available later in the first quarter, "is positioning IQ 16 to go from terabytes to petabytes," Dan Lahl, senior director of product marketing at SAP, tells 5 Minute Briefing. To accomplish this, IQ 16 provides enhancements in three critical areas.

Posted February 27, 2013

DataCore Software, a provider of storage virtualization software, has made enhancements to its SANsymphony-V Storage Hypervisor. The new capabilities are intended to support customers who are facing high data growth, as well as the need to enable faster response times and provide continuous availability for business-critical applications.

Posted February 27, 2013

HP announced two new software-as-a-service (SaaS) solutions intended to speed application delivery and improve visibility, collaboration and agility across often siloed or geographically dispersed application development and operations teams. HP Agile Manager accelerates application time to market with an intuitive, web-based experience that offers visibility for planning, executing and tracking Agile development projects; and HP Performance Anywhere helps resolve application performance issues before they impact business services by providing visibility and predictive analytics.

Posted February 27, 2013

Oracle president Mark Hurd and Oracle executive vice president of product development Thomas Kurian recently hosted a conference call to provide an update on Oracle's cloud strategy and recap of product-related developments. Oracle is trying to do two things for customers - simplify their IT and power their innovation, said Hurd.

Posted February 27, 2013

Hortonworks, a leading contributor to Apache Hadoop, has released Hortonworks Sandbox, a learning environment and on-ramp for anyone interested in learning, evaluating or using Apache Hadoop in the enterprise. This tool seeks to bridge the gap between people who want to learn Hadoop and the complexity of setting up a cluster, with an integrated environment that provides demos, videos, and tutorials.

Posted February 27, 2013

Having vast amounts of data at hand doesn't necessarily help executives make better decisions. In fact, without a simple way to access and analyze the astronomical amounts of available information, it is easy to become frozen with indecision, knowing the answers are likely in the data but unsure how to find them. With so many companies claiming to offer salvation from all data issues, one of the most important factors to consider when selecting a solution is ease of use. An intuitive interface based on how people already operate in the real world is the key to adoption and usage throughout an organization.

Posted February 13, 2013

In-memory technology—in which entire data sets are pre-loaded into a computer's random access memory, alleviating the need for shuttling data between memory and disk storage every time a query is initiated—has actually been around for a number of years. However, with the onset of big data, as well as an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.

Posted February 13, 2013

A profound shift is occurring in where data lives. Thanks to skyrocketing demand for real-time access to huge volumes of data—big data—technology architects are increasingly moving data out of slow, disk-bound legacy databases and into large, distributed stores of ultra-fast machine memory. The plummeting price of RAM, along with advanced solutions for managing and monitoring distributed in-memory data, mean there are no longer good excuses to make customers, colleagues, and partners wait the seconds—or sometimes hours—it can take your applications to get data out of disk-bound databases. With in-memory, microseconds are the new seconds.

Posted February 13, 2013

The explosion of big data has presented many challenges for today's database administrators (DBAs), who are responsible for managing far more data than ever before. And with more programs being developed and tested, more tools are needed to help optimize data access and efficiency. Using techniques such as DB2's Multi-Row Fetch (MRF), DBAs can cut down on CPU time and improve application efficiency. MRF was introduced in DB2 version 8 in 2004. Stated simply, it is the ability for DB2 to send multiple rows back to a requesting program at once, rather than one row at a time.

Posted January 24, 2013

Databases are hampered by a reliance on disk-based storage, a technology that has been in place for more than two decades. Even with the addition of memory caches and solid state drives, the model of relying on repeated access to the permanent information storage devices is still a bottleneck in capitalizing on today's "big data," according to a new survey of 323 data managers and professionals who are part of the IOUG. Nearly 75% of respondents believe that in-memory technology is important to enabling their organization to remain competitive in the future. Yet, almost as many also indicate they lack the in-memory skills to deliver even current business requirements. The research results are detailed in a new report, titled "Accelerating Enterprise Insights: 2013 IOUG In-Memory Strategies Survey."

Posted January 24, 2013
