Trends and Applications

In order to be effective, big data analytics must present a clear and consistent picture of what's happening in and around the enterprise. Does a new generation of databases and platforms offer the scalability and velocity required for cloud-based, big data-based applications—or will more traditional relational databases come roaring back for all levels of big data challenges?

Posted February 10, 2014

There's no doubt that the management at Target had a miserable holiday season at the end of last year, amid the bad PR surrounding the online theft of 40 million customers' data records (a figure later revised even higher), the costs of providing disclosures and working with banks, and the headaches of potentially expensive lawsuits now being filed. Such is every organization's nightmare, the price of openness and accessibility. According to a new survey of 322 data and IT managers, there is growing awareness among enterprise executives and managers of the potential threats to enterprise data security.

Posted February 10, 2014

The hyperscale market is just beginning to drive major transformations in compute infrastructure. Companies deploying hyperscale architectures serve the widest customer segments, build the largest data centers, and break new ground with real-time, data-centric services. To do so, they require hundreds of petabytes of server-side non-volatile memory to effectively provide the services that better the lives of their global customers.

Posted February 10, 2014

New IOUG (Independent Oracle Users Group) research underwritten by EMC looks at the problem of mission-critical-application downtime and its impact on organizations. According to this new global survey, among respondents with at least two data centers and rapid replication solutions, 46% indicate they are less than satisfied with their current strategies.

Posted February 10, 2014

To say that big data is the sum of its volume, variety, and velocity is a lot like saying that nuclear power is simply and irreducibly a function of fission, decay, and fusion. It's to ignore the societal and economic factors that—for good or ill—ultimately determine how big data gets used. In other words, if we want to understand how big data has changed data integration, we need to consider the ways in which we're using—or in which we want to use—big data.

Posted January 20, 2014

DBTA is seeking speakers who possess unique insight into leading technologies, and experience with successful IT and business strategies for the Data Summit conference in New York City, May 12-14, 2014. The deadline to submit your proposal is January 31, 2014.

Posted January 20, 2014

Providing "enterprise BI" that includes social analytics will be a significant challenge to many enterprises in the near future. This is one of the primary reasons for the success of the new wave of innovative and easy-to-use BI and social media analytical tools within the last several years.

Posted January 20, 2014

In today's business landscape, organizations are increasingly focusing on improving the customer experience to ensure that they're staying with, or ahead of, the competition. It's widely understood that in order to improve the customer experience, it's imperative that organizations understand the customer and tailor their services or products to each demographic and customer segment. However, two major developments are bringing about a marked change to this tried-and-true customer experience strategy: the proliferation of big data and the shrinking size of customer segments.

Posted January 20, 2014

Volume is only one of the challenges organizations face. Real-time processing of in-motion high-velocity feeds is crucial to truly unlock big data's potential. A look at where data is originating and being consumed puts the opportunity and importance of velocity processing into context. What's the solution?

Posted January 20, 2014

While all the excitement is currently focused on new-age solutions that have surfaced in the past few years—NoSQL, NewSQL, cloud, and open source databases—there is still a great deal of uncertainty and consternation among corporate and IT leaders as to what role new data sources will play in business futures.

Posted January 20, 2014

Institutions around the world depend on Vernon Systems Limited, based in Auckland, New Zealand, to provide sophisticated collection management and web access for cultural treasures. Revelation Software's products have been core to Vernon Systems' business, providing the main development environment for Vernon Systems since it was founded in 1986.

Posted January 20, 2014

Software-defined networking (SDN) is IT's new black, displacing cloud as the technology darling du jour. But while all the focus on the network layers is ultimately good for applications—after all, an optimized network is critical for applications today—SDN does not address challenges in the application layers that are just as key to ensuring performance, security, and availability of applications in the data center and into the cloud.

Posted January 07, 2014

Before standardizing on a specific master data management (MDM) solution for your IT infrastructure, take the time to look under the hood to make sure the MDM platform is capable of keeping up with your business and providing long-term value. By understanding the advantages of today's MDM technology, and focusing on key and often-overlooked technical functionality, you may just find that you are in fact the unsung hero.

Posted January 07, 2014

In 2013, two major shifts in the big data landscape occurred, which can be described as the Battle Over Persistence and the Race for Access Hill. The acceptance of leveraging the strengths of various database technologies in an optimized Modern Data Platform has more or less been resolved; the recognition of a single point of access and context is next. That race for access will continue well into 2014.

Posted January 07, 2014

Award-winning Attunity Replicate is automated, easy-to-use, high-performance data replication and loading software.

Posted December 20, 2013

The data-driven demands on organizations have never been greater. Two of the most pressing concerns that organizations face today are the need to provide analytic access to newer data types such as machine-generated data, documents and graphics, and the need to control the cost of information management for growing data stores. DBTA's new list of Trend-Setting Products in Data for 2014 highlights the products, platforms, and services that seek to provide organizations with the tools necessary to address rapidly changing market requirements.

Posted December 20, 2013

Dell Boomi AtomSphere®, the world's largest integration cloud, enables customers to connect any combination of cloud and on-premises applications without software, appliances, or coding. Organizations of all sizes, from growing companies to very large enterprises, enjoy rapid time to value as a result of drastically reduced implementation times and substantial cost savings over traditional integration solutions.

Posted December 20, 2013

Companies are rapidly running out of space in their data warehouses. Odds are, most organizations have far less capacity available than they think. Their last upgrade should have provided enough space for at least two years. However, with the rapid growth of Big Data, that's not often the case.

Posted December 20, 2013

An enterprise RDBMS that spans the globe is now a reality with the geographically distributed TransLattice Elastic Database™ (TED). The nodes are placed wherever needed, without the distance limitations often associated with distributed systems. The system scales out easily and the data is spread across the nodes.

Posted December 20, 2013

Revolution REnterprise (RRE) is the fastest enterprise-class big data analytics platform available today. Supporting a variety of big data statistics, predictive modeling, and machine learning capabilities, RRE provides users with cost-effective and fast big data analytics that are fully compatible with the R language, the de facto standard for modern analytics users.

Posted December 20, 2013

OpenInsight, from Revelation Software, is a database development suite that provides Windows, Web 2.0 and .NET tools to develop and deploy mission critical applications. These tools can be used with OpenInsight's proprietary NoSQL database, nearly any flavor of SQL database, or any of the many MultiValue databases.

Posted December 20, 2013

Many of the world's most successful companies use Teradata, the world-class enterprise data warehouse, for their high-stakes analytics needs. With rapid data growth and business demands for access to new data sources, the warehouse is experiencing constantly changing demands. Scale and performance are central to those business needs and organizations are now looking to optimize their most critical analytics environments. As data ages, it is less frequently analyzed so organizations are now taking a serious look at dedicated archiving solutions. Doing so is critical for companies with significant compliance requirements. That's where RainStor comes in.

Posted December 20, 2013

Business intelligence initiatives, real-time dashboards, and improved reporting are on everyone's radar. Unfortunately, data is more dispersed than ever before, increasingly distributed in the various SaaS applications that a business relies upon. This makes the challenge of implementing these initiatives far more difficult than in the "good ole days," when one could simply connect the tools directly to a database. SaaS data is not open and accessible in a standard way. Instead, it is exposed via APIs (application programming interfaces) that require custom coding and expensive integration projects just to connect the pieces together.

Posted December 20, 2013

Objectivity, Inc. is the enterprise database leader in real-time, complex Big Data solutions. Our leading-edge technologies, InfiniteGraph, The Distributed Graph Database™, and Objectivity/DB, a distributed and scalable object management database, enable organizations to develop scalable solutions that discover hidden relationships for improved Big Data analytics, create new ROI opportunities, and improve inter-departmental business processes to achieve greater return on data-related investments.

Posted December 20, 2013

Performance as a Service is the hottest new service every database manager must know about because it delivers rapid problem resolution for application and database performance issues. Ntirety provides Database Performance as a Service through a powerful combination of technology and skills which incorporates our own Ntrust™ Database Appliance with AppDynamics PRO and DBTuna software. By bundling these together in a service, you don't need to purchase, install or manage new hardware and software to gain access to powerful performance management capabilities.

Posted December 20, 2013

Since 1985, data quality's been our obsession—the driving force behind many of our scalable data cleansing and enrichment solutions. It's this passion that's culminated in the creation of Melissa Data's new flagship Personator technology, our next-generation enterprise data quality solution.

Posted December 20, 2013

Kore Technologies is a leading provider of enterprise integration, data warehousing, and business intelligence solutions for the mid-market, specializing in companies that use the UniData/ UniVerse (U2) database. Kourier Integrator is Kore's flagship product for Extract, Transform, and Load (ETL) and Enterprise Application Integration (EAI), enabling companies to integrate and connect with disparate databases and best-in-class applications.

Posted December 20, 2013

A Wal-Mart store in Chicago offers an endcap—will the additional product sales be greater than the trade funds expense? Just as important—are there trade funds available for this particular product right now? To compile the data needed to assess these promotional programs, sales executives logged into 40 (forty!) different applications to monitor dozens of screens. With JackBe Presto, the company consolidated those 40 systems behind a single sign-on.

Posted December 20, 2013

What do FarmVille, Guess and the Obama campaign have in common? These are shining examples of successes created by data-driven organizations. The technology that makes these possible is the HP Vertica Analytic Platform, a highly scalable and purpose-built platform for big data analytics. Founded in 2005 by database legend Michael Stonebraker, and acquired by HP in 2011, Vertica has become the de facto standard for analytics within companies like Zynga, Guess, Twitter, Comcast, Cerner, HP, and many others.

Posted December 20, 2013

EnterpriseDB has created the products and an ecosystem of services and support to enable global enterprises to deploy open source software in the data center using Postgres to power their most important applications. The success of open source has been realized in other layers of the enterprise stack: Xen and KVM for virtualization, Linux for operating systems, and JBoss and Apache for middleware. Forward-thinking CIOs are now turning increasingly to the database layer and to Postgres to reduce their reliance on costly proprietary solutions.

Posted December 20, 2013

Want to stay ahead of the competition? Then you know that this endeavor demands systematic analysis of information on new patents, new technologies, competitors, competing products, market developments, industries and customer expectations. For this purpose, an efficient "radar system" provides essential support in managing these tasks: the Empolis Competitive Intelligence solution is the antenna that brings important information (or signals) to your screen.

Posted December 20, 2013

The race is on! Winners and losers in business are being decided by who can extract more value, with agility, from exponentially increasing information to meet business goals. The many-to-many vision, connecting data sources to data consumers, is very appealing to top executives, but IT is struggling to get there fast enough. Data virtualization offers a solution that is fast and strategic at the same time. With Denodo, business strategists, CIOs, and other IT experts can plan the implementation of a shared data layer across the enterprise, exposing a common data model and a unified interface over a multiplicity of diverse data sources. That layer can feed and support a growing number of business applications, from BI and analytics to portals, operational applications, and web and mobile apps.

Posted December 20, 2013

Delphix delivers agility to enterprise application projects, addressing the largest source of inefficiency and inflexibility in the datacenter—provisioning, managing, and refreshing databases for business-critical applications. With Delphix in place, QA engineers spend more time testing and less time waiting for new data, increasing utilization of expensive test infrastructure. Analysts and managers make better decisions with fresh data in data marts and warehouses.

Posted December 20, 2013

Thank you, DBTA, for this distinctive honor, and for the opportunity to share a few words about what makes DBI Software's pureFeat™ Performance Management suite for IBM DB2 LUW distinctively different.

Posted December 20, 2013

The promise of "Big Data" has driven organizations to rethink their approach to traditional business intelligence. To stay competitive, organizations need to harness all of the relevant information to run the business, regardless of its type (variety), its size (volume), or the speed at which it's delivered (velocity). Datawatch is at the forefront of Next Generation Analytics, providing organizations the ability to analyze and understand Any Data Variety, regardless of structure, at Real-time Velocity, through an unmatched Visual Data Discovery environment.

Posted December 20, 2013

Database monitoring is useless unless your monitoring system can seamlessly raise intelligent alerts to dispatch the optimum level of response. That's why Datavail, the largest pure-play database services company in North America, developed Datavail Delta, a tool built to monitor a wide variety of OS and database parameters. Delta is compatible with Windows Server 2003, 2008, 2008 R2, and 2012, and with SQL Server 2000, 2005, 2008, 2008 R2, and 2012.

Posted December 20, 2013

Seamless access for data analysis across heterogeneous data sources represents "the holy grail" within mainstream enterprises. Designed for Big Data processing and performance at scale, Cirro is a revolutionary approach to bridging corporate analytic data silos.

Posted December 20, 2013

As the sponsor of Cisco's acquisition of Composite Software, I am often asked about expected synergies from combining the leaders in data virtualization and networking. While I cannot divulge all our secrets, analyst firm EMA was prescient in their recent report entitled "Data Virtualization Meets the Network."

Posted December 20, 2013

As a leader in the data modeling space, CA ERwin is privileged to be an integral part of organizations' key strategic initiatives such as business intelligence and analytics, data governance, or data quality—many of which revolve around data. At CA Technologies, we understand that data runs your business, and we've put a strong focus on developing a solution that can act as an "information hub" for these initiatives.

Posted December 20, 2013

Database Plugins is pleased to have its keystone product, the Database Plugin Server, selected as a trend-setting application for 2013 by DBTA. The Database Plugin Server has certainly fostered a continuing line of innovative products.

Posted December 20, 2013

More businesses are beginning to realize the value that big data analytics can provide, according to ParStream, which has compiled a list of five key trends that will impact the big data landscape in the next year. In 2014, "fast data" will become the factor that separates inexperienced big data solution providers from established ones, the company says. Fast data enables companies to make real-time, fact-based decisions by using historical and live data to improve the way they run their business, deliver solutions, and engage with customers.

Posted December 17, 2013

Database technology has gone through something of a renaissance in recent years: new goals and requirements for storage, processing, and analytics have emerged, along with new database vendors. When evaluating modern data management technologies, there are five key characteristics to consider.

Posted December 17, 2013

Oracle has announced the fifth generation database machine, the Oracle Exadata Database Machine X4, which adds enhancements to improve performance and quality of service for OLTP, DBaaS (database as a service), and data warehousing. Tim Shetler, vice president of product management, Oracle, shared his views on the update in an interview. "There is no price change with this generation. It is the same price as before. We are just giving customers more capacity and more performance," said Shetler.

Posted December 17, 2013

As unstructured data overtakes structured data within enterprises, the coming year will see the start of a reassessment of how data is architected, stored, and queried in enterprises. To meet this challenge, new technologies and solutions have already begun to transform data management within enterprises.

Posted December 17, 2013

Big data can be a Siren, whose beautiful call lures unsuspecting sailors to a rocky destruction. The potential value of big data analysis to increase income (or lower expenses) for the company tends to drown out the calls for risk oversight. Understanding the legal and regulatory consequences will help keep your company safe from those dangerous rocks.

Posted December 17, 2013

It's time to look back at some of the most interesting big data blog posts of the past 12 months. These 12 posts provide warnings, tips and tricks, and often a touch of humor as well.

Posted December 17, 2013

Serena Software, a provider of orchestrated application development and release management solutions, has announced Serena Release Manager v5. The new release automates application deployments, provides visibility, control and standardization of the release process, and supports coordination and collaboration for release teams.

Posted December 17, 2013

The latest release of CA ERwin Data Modeler, a solution for collaboratively visualizing and managing business data, addresses two major objectives: the need for organizations to manage more data across more platforms, and the need to easily share that data with an expanding number of users with a range of roles and skill sets.

Posted December 17, 2013

There are many ways that big data can help businesses make better decisions and succeed more quickly, ranging from product innovation to manufacturing to marketing. To help organizations get their big data projects off on the right foot, here are five essential truths about big data analytics.

Posted December 17, 2013

A newly published Unisphere Research survey of 160 data managers and professionals who are part of the Independent Oracle Users Group (IOUG) and currently running Oracle Databases finds that business demand for database services, as well as the associated data volumes, has been on an upward trajectory. The survey reveals deep concern among IT managers and decision makers about meeting demand for database services in a world where both the number of requests and the associated data volumes are steadily climbing.

Posted December 04, 2013

