Trends and Applications



Today, businesses are becoming ever more critically dependent on their data infrastructure. If underlying database systems are not available, manufacturing floors cannot operate, stock exchanges cannot trade, retail stores cannot sell, banks cannot serve customers, mobile phone users cannot place calls, stadiums cannot host sports games, and gyms cannot verify their subscribers' identities. Here is a look at some of the trends and how they are going to impact data management professionals.

Posted March 26, 2014

As the deployment of big data analytics becomes part of a business' "must-do" list, cloud platforms offer the scalability, flexibility, and on-demand ability to handle workload spikes that are hard to come by with on-premises systems. But, as is the case with every system and data environment, it takes careful planning to achieve the desired business value—which is to enable unprecedented opportunities to better understand and engage with key customer segments and markets.

Posted March 12, 2014

Much of the information contained in databases is sensitive and can be sold for cash or, in cases of theft by a disgruntled employee or a politically motivated hacker, used to cause the organization loss of business or reputation, especially if the organization is found to be in breach of regulations or industry standards that demand high levels of data security. However, there are five key steps that can be taken to ensure database security.

Posted March 12, 2014

Data Summit will take place at the New York Hilton Midtown, from May 12 to May 14. The advance program is now available, and registration is open, with a special early bird rate for those who register before April 11, 2014.

Posted March 12, 2014

Rogue IT practices have grown exponentially with the use of public cloud services. Will these herald the end of internal IT or a means for healthy IT transformation? The risk exists for either outcome. It all depends on how IT organizations approach it.

Posted March 12, 2014

To recognize the best information management solutions in the marketplace, Database Trends and Applications has launched the DBTA Readers' Choice Awards, a program in which the winners will be selected by the experts whose opinions count above all others - you. The nominations period will conclude on March 28, 2014, and voting will begin on April 11.

Posted March 12, 2014

Database Trends and Applications (DBTA) and IBM have launched a Big Data Survey to provide a timely analysis of the key technologies, practices and strategies that businesses across different industries are using to improve confidence in their decision making. Respond with a completed survey by March 2nd, 2014, in order to be included in a drawing to win one of three $200 American Express gift cards to be awarded at the conclusion of the study.

Posted February 26, 2014

The newest version of DataStax's eponymously named enterprise database platform adds a new in-memory option as well as enterprise search enhancements to support high performance. Along with the latest release of its enterprise NoSQL database, DataStax has also introduced version 4.1 of DataStax OpsCenter which supplies improved capacity management capabilities and visual monitoring of production database clusters.

Posted February 26, 2014

Enterprise data warehouses aren't going away anytime soon. Despite claims that Hadoop will usurp the role of data warehousing, Hadoop needs data warehouses, just as data warehouses need Hadoop. However, making the leap from established data warehouse environments—the kind most companies still have, based on extract, transform and load (ETL) inputs with a relational data store and query and analysis tools—to the big data realm isn't a quick hop.
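
To make the "established data warehouse environment" concrete, here is a minimal, illustrative Python sketch of the classic ETL step the passage describes: extract from a flat file, transform the rows, and load them into a relational store for query and analysis. The file name, column names, and SQLite target are assumptions made for the example, not details from the article.

```python
# A minimal ETL sketch, assuming a hypothetical sales.csv with
# order_id, region, and amount columns; the SQLite target stands in
# for the relational data store mentioned above.
import csv
import sqlite3

def extract(path="sales.csv"):
    # Extract: read raw rows from a flat-file source.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Transform: normalize types and casing before loading.
    for row in rows:
        yield row["order_id"], row["region"].upper(), float(row["amount"])

def load(rows, db="warehouse.db"):
    # Load: land the cleaned rows in a relational table for analysis.
    conn = sqlite3.connect(db)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (order_id TEXT, region TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    load(transform(extract()))
```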

Posted February 26, 2014

In many ways, IT protection is like a game of poker. There are two things you need to win: a strong ability to play and the best hand you can get. With the former, a lot of it comes down to knowing what not to do.

Posted February 26, 2014

While minimizing risk is important for financial services organizations, companies need to think beyond fraud detection and understand the wider benefits predictive analytics can provide in terms of better customer experience, driving opportunities and increasing revenue in a highly competitive market. Among other uses, a comprehensive customer database, combined with advanced analytics, will allow organizations to spot trends, identify cohorts, and micro-target discrete populations.
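
As an illustration of how a comprehensive customer database combined with analytics can surface cohorts, the sketch below clusters customers with k-means. The table name, column names, and the choice of scikit-learn are assumptions made for this example, not anything specified in the article.

```python
# A minimal cohort-identification sketch, assuming a hypothetical
# customers table with age, annual_spend, and visits_per_month columns.
import sqlite3

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Pull a few behavioral attributes from the (hypothetical) customer table.
conn = sqlite3.connect("customers.db")
rows = conn.execute(
    "SELECT age, annual_spend, visits_per_month FROM customers"
).fetchall()
features = StandardScaler().fit_transform(np.array(rows, dtype=float))

# Cluster customers into a handful of cohorts for targeted campaigns.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0)
labels = kmeans.fit_predict(features)
for cohort in range(kmeans.n_clusters):
    print(f"cohort {cohort}: {np.sum(labels == cohort)} customers")
```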

Posted February 26, 2014

Believing that its ability to perform advanced customer analytics on an increasingly vast data set would be enhanced by a platform that could scale with it, the customer science company dunnhumby turned to Oracle Exadata Database Machine and Oracle Advanced Analytics. dunnhumby's global chief information officer, Yael Cosset, explains what the company gained from the implementation.

Posted February 26, 2014

In order to be effective, big data analytics must present a clear and consistent picture of what's happening in and around the enterprise. Does a new generation of databases and platforms offer the scalability and velocity required for cloud-based, big data-based applications—or will more traditional relational databases come roaring back for all levels of big data challenges?

Posted February 10, 2014

There's no doubt that the management at Target had a miserable holiday season at the end of last year, between the bad PR surrounding the online theft of 40 million customers' data records (a figure later revised even higher), the costs of providing disclosures and working with banks, and the headaches of potentially expensive lawsuits now being filed. Such is every organization's nightmare, the price of openness and accessibility. According to a new survey of 322 data and IT managers, there is growing awareness among enterprise executives and managers of the potential threats to enterprise data security.

Posted February 10, 2014

The hyperscale market is just beginning to drive major transformations in compute infrastructure. Companies deploying hyperscale architectures serve the widest customer segments, build the largest data centers, and break new ground with real-time, data-centric services. To do so, they require hundreds of petabytes of server-side non-volatile memory to effectively provide the services that better the lives of their global customers.

Posted February 10, 2014

New IOUG (Independent Oracle Users Group) research underwritten by EMC looks at the problem of mission-critical application downtime and its impact on organizations. According to this new global survey, among respondents with at least two data centers and rapid replication solutions, 46% indicate they are less than satisfied with their current strategies.

Posted February 10, 2014

To say that big data is the sum of its volume, variety, and velocity is a lot like saying that nuclear power is simply and irreducibly a function of fission, decay, and fusion. It's to ignore the societal and economic factors that—for good or ill—ultimately determine how big data gets used. In other words, if we want to understand how big data has changed data integration, we need to consider the ways in which we're using—or in which we want to use—big data.

Posted January 20, 2014

DBTA is seeking speakers who possess unique insight into leading technologies, and experience with successful IT and business strategies for the Data Summit conference in New York City, May 12-14, 2014. The deadline to submit your proposal is January 31, 2014.

Posted January 20, 2014

Providing "enterprise BI" that includes social analytics will be a significant challenge to many enterprises in the near future. This is one of the primary reasons for the success of the new wave of innovative and easy-to-use BI and social media analytical tools within the last several years.

Posted January 20, 2014

In today's business landscape, organizations are increasingly focusing on improving the customer experience to ensure that they're staying with, or ahead of, the competition. It's widely understood that in order to improve the customer experience, it's imperative that organizations understand the customer and tailor their services or products to each demographic and customer segment. However, two major developments are bringing about a marked change to this tried-and-true customer experience strategy: the proliferation of big data and the shrinking size of customer segments.

Posted January 20, 2014

Volume is only one of the challenges organizations face. Real-time processing of in-motion high-velocity feeds is crucial to truly unlock big data's potential. A look at where data is originating and being consumed puts the opportunity and importance of velocity processing into context. What's the solution?
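
To put velocity processing in concrete terms, the sketch below aggregates a simulated in-motion feed in small time windows as events arrive, rather than landing the data in a store first. The event schema, window size, and simulated source are assumptions made for illustration only.

```python
# A minimal sketch of windowed stream processing over a high-velocity feed.
# The feed is simulated; a real deployment would read from a message bus.
import random
import time
from collections import defaultdict

WINDOW_SECONDS = 0.1  # assumed window size for the example

def event_stream(n=5000):
    """Simulated feed of (timestamp, sensor_id, value) events."""
    for _ in range(n):
        time.sleep(0.0002)  # stand-in for network arrival gaps
        yield time.time(), random.choice(["s1", "s2", "s3"]), random.random()

window_start = time.time()
counts, sums = defaultdict(int), defaultdict(float)

for ts, sensor, value in event_stream():
    if ts - window_start >= WINDOW_SECONDS:
        # Close the window: emit per-sensor averages, then reset the state.
        for s in counts:
            print(f"{s}: avg={sums[s] / counts[s]:.3f} over {counts[s]} events")
        window_start, counts, sums = ts, defaultdict(int), defaultdict(float)
    counts[sensor] += 1
    sums[sensor] += value
```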

Posted January 20, 2014

While all the excitement is currently focused on new-age solutions that have surfaced in the past few years—NoSQL, NewSQL, cloud, and open source databases—there is still a great deal of uncertainty and consternation among corporate and IT leaders as to what role new data sources will play in business futures.

Posted January 20, 2014

Institutions around the world depend on Vernon Systems Limited, based in Auckland, New Zealand, to provide sophisticated collection management and web access for cultural treasures. Revelation Software's products have been core to Vernon Systems' business, providing its main development environment since the company was founded in 1986.

Posted January 20, 2014

Software-defined networking (SDN) is IT's new black, displacing cloud as the technology darling du jour. But while all the focus on the network layers is ultimately good for applications—after all, an optimized network is critical for applications today—SDN does not address challenges in the application layers that are just as key to ensuring performance, security, and availability of applications in the data center and into the cloud.

Posted January 07, 2014

Before standardizing on a specific master data management (MDM) solution for your IT infrastructure, take the time to look under the hood to make sure the MDM platform is capable of keeping up with your business and providing long-term value. By understanding the advantages of today's MDM technology, and focusing on key and often-overlooked technical functionality, you may just find that you are in fact the unsung hero.

Posted January 07, 2014

In 2013, two major shifts occurred in the big data landscape, which can be described as the Battle Over Persistence and the Race for Access Hill. The battle over persistence, that is, the acceptance of leveraging the strengths of various database technologies in an optimized Modern Data Platform, has more or less been resolved; recognition of the need for a single point of access and context comes next, and that race for access will continue well into 2014.

Posted January 07, 2014

Award-winning Attunity Replicate is automated, easy-to-use, high-performance data replication and loading software.

Posted December 20, 2013

The data-driven demands on organizations have never been greater. Two of the most pressing concerns that organizations face today are the need to provide analytic access to newer data types such as machine-generated data, documents and graphics, and the need to control the cost of information management for growing data stores. DBTA's new list of Trend-Setting Products in Data for 2014 highlights the products, platforms, and services that seek to provide organizations with the tools necessary to address rapidly changing market requirements.

Posted December 20, 2013

Dell Boomi AtomSphere®, the world's largest integration cloud, enables customers to connect any combination of cloud and on-premises applications without software, appliances, or coding. Organizations of all sizes, from growing companies to very large enterprises, enjoy rapid time to value as a result of drastically reduced implementation times and substantial cost savings over traditional integration solutions.

Posted December 20, 2013

Companies are rapidly running out of space in their data warehouses. Odds are, most organizations have far less capacity available than they think. Their last upgrade should have provided enough space for at least two years. However, with the rapid growth of Big Data, that's not often the case.

Posted December 20, 2013

An enterprise RDBMS that spans the globe is now a reality with the geographically distributed TransLattice Elastic Database™ (TED). The nodes are placed wherever needed, without the distance limitations often associated with distributed systems. The system scales out easily and the data is spread across the nodes.

Posted December 20, 2013

Revolution REnterprise (RRE) is the fastest enterprise-class big data analytics platform available today. Supporting a variety of big data statistics, predictive modeling, and machine learning capabilities, RRE provides users with cost-effective and fast big data analytics that are fully compatible with the R language, the de facto standard for modern analytics users.

Posted December 20, 2013

OpenInsight, from Revelation Software, is a database development suite that provides Windows, Web 2.0 and .NET tools to develop and deploy mission critical applications. These tools can be used with OpenInsight's proprietary NoSQL database, nearly any flavor of SQL database, or any of the many MultiValue databases.

Posted December 20, 2013

Many of the world's most successful companies use Teradata, the world-class enterprise data warehouse, for their high-stakes analytics needs. With rapid data growth and business demands for access to new data sources, the warehouse is experiencing constantly changing demands. Scale and performance are central to those business needs and organizations are now looking to optimize their most critical analytics environments. As data ages, it is less frequently analyzed so organizations are now taking a serious look at dedicated archiving solutions. Doing so is critical for companies with significant compliance requirements. That's where RainStor comes in.

Posted December 20, 2013

Business intelligence initiatives, real-time dashboards, and improved reporting are on everyone's radar. Unfortunately, data is more dispersed than ever before, increasingly distributed across the various SaaS applications that a business relies upon. This makes the challenge of implementing these initiatives far more difficult than in the ‘good ole days,' when one could simply connect the tools directly to a database. SaaS data is not open and accessible in a standard way. Instead, it is exposed via APIs (application programming interfaces) that require custom coding and expensive integration projects just to connect the pieces together.
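
As a rough illustration of the custom glue code involved, the sketch below pulls records from a hypothetical SaaS REST endpoint and lands them in a local reporting table. The URL, response shape, and field names are invented for this example; real SaaS APIs add authentication schemes, paging, and rate limits on top.

```python
# A minimal SaaS-to-reporting-table sketch. The endpoint and fields are
# hypothetical, and the bearer token is a placeholder.
import sqlite3

import requests

API_URL = "https://api.example-saas.com/v1/invoices"  # hypothetical endpoint

def extract():
    # Pull records from the SaaS API (assumed JSON response shape).
    response = requests.get(API_URL, headers={"Authorization": "Bearer TOKEN"})
    response.raise_for_status()
    return response.json()["records"]

def load(records):
    # Land the records in a local reporting table for BI tools to query.
    conn = sqlite3.connect("reporting.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS invoices (id TEXT PRIMARY KEY, amount REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO invoices (id, amount) VALUES (?, ?)",
        [(r["id"], r["amount"]) for r in records],
    )
    conn.commit()

if __name__ == "__main__":
    load(extract())
```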

Posted December 20, 2013

Objectivity, Inc. is the Enterprise Database leader in real-time, complex Big Data solutions. Our leading-edge technologies, InfiniteGraph, The Distributed Graph Database™, and Objectivity/DB, a distributed and scalable object management database, enable organizations to develop scalable solutions that discover hidden relationships for improved Big Data analytics, develop new ROI opportunities, and improve inter-departmental business processes to achieve greater return on data-related investments.

Posted December 20, 2013

Performance as a Service is the hottest new service every database manager must know about because it delivers rapid problem resolution for application and database performance issues. Ntirety provides Database Performance as a Service through a powerful combination of technology and skills which incorporates our own Ntrust™ Database Appliance with AppDynamics PRO and DBTuna software. By bundling these together in a service, you don't need to purchase, install or manage new hardware and software to gain access to powerful performance management capabilities.

Posted December 20, 2013

Since 1985, data quality's been our obsession—the driving force behind many of our scalable data cleansing and enrichment solutions. It's this passion that's culminated in the creation of Melissa Data's new flagship Personator technology, our next-generation enterprise data quality solution.

Posted December 20, 2013

Kore Technologies is a leading provider of enterprise integration, data warehousing, and business intelligence solutions for the mid-market, specializing in companies that use the UniData/UniVerse (U2) database. Kourier Integrator is Kore's flagship product for Extract, Transform, and Load (ETL) and Enterprise Application Integration (EAI), enabling companies to integrate and connect with disparate databases and best-in-class applications.

Posted December 20, 2013

A Wal-Mart store in Chicago offers an endcap—will the additional product sales be greater than the trade funds expense? As important—are there trade funds available for this particular product right now? To compile the data needed to assess these promotional programs, sales executives logged into 40 (forty!) different applications to monitor dozens of screens. With JackBe Presto, the company transformed those 40 systems into a single sign-on.

Posted December 20, 2013

What do FarmVille, Guess and the Obama campaign have in common? These are shining examples of successes created by data-driven organizations. The technology that makes these possible is the HP Vertica Analytic Platform, a highly scalable and purpose-built platform for big data analytics. Founded in 2005 by database legend Michael Stonebraker, and acquired by HP in 2011, Vertica has become the de facto standard for analytics within companies like Zynga, Guess, Twitter, Comcast, Cerner, HP, and many others.

Posted December 20, 2013

EnterpriseDB has created the products and an ecosystem of services and support to enable global enterprises to deploy open source software in the data center using Postgres to power their most important applications. The success of open source has already been realized in other layers of the enterprise stack: Xen and KVM for virtualization, Linux for operating systems, and JBoss and Apache for middleware. Forward-thinking CIOs are now turning increasingly to the database layer and to Postgres to reduce their reliance on costly proprietary solutions.

Posted December 20, 2013

Want to stay ahead of the competition? Then you know that this endeavor demands systematic analysis of information on new patents, new technologies, competitors, competing products, market developments, industries and customer expectations. For this purpose, an efficient "radar system" provides essential support in managing these tasks: the Empolis Competitive Intelligence solution is the antenna that brings important information (or signals) to your screen.

Posted December 20, 2013

The race is on! Winners and losers in business are being decided based on who can extract more value, with agility, from exponentially increasing information to meet business goals. The vision of many-to-many connections between data sources and data consumers is very appealing to top executives, but IT is struggling to get there fast enough. Data Virtualization offers a solution that is fast and strategic at the same time. With Denodo, business strategists, CIOs, and other IT experts can plan the implementation of a shared data layer across the enterprise, exposing a common data model and a unified interface over a multiplicity of diverse data sources. That layer can feed and support an increasing number of business applications, from BI and analytics to portals, operational applications, and web and mobile apps.

Posted December 20, 2013

Delphix delivers agility to enterprise application projects, addressing the largest source of inefficiency and inflexibility in the datacenter—provisioning, managing, and refreshing databases for business-critical applications. With Delphix in place, QA engineers spend more time testing and less time waiting for new data, increasing utilization of expensive test infrastructure. Analysts and managers make better decisions with fresh data in data marts and warehouses.

Posted December 20, 2013

Thank you, DBTA, for this distinctive honor, and for the opportunity to share a few words about what makes DBI Software's pureFeat™ Performance Management suite for IBM DB2 LUW distinctively different.

Posted December 20, 2013

The promise of "Big Data" has driven organizations to rethink their approach to traditional business intelligence. To stay competitive, organizations need to harness all of the relevant information to run the business regardless of its type (variety), its size (volume) or the speed in which its delivered (velocity). Datawatch is at the forefront of Next Generation Analytics by providing organizations the ability to analyze and understand Any Data Variety, regardless of structure, at Real-time Velocity, through an unmatched Visual Data Discovery environment.

Posted December 20, 2013

Database monitoring is useless unless your monitoring system can seamlessly raise intelligent alerts to dispatch the optimum level of response. That's why Datavail, the largest pure-play database services company in North America, developed Datavail Delta, a tool built to monitor a wide variety of OS and database parameters. Delta is compatible with Windows Server 2003, 2008, 2008 R2, and 2012, and with SQL Server versions 2000, 2005, 2008, 2008 R2, and 2012.

Posted December 20, 2013

Seamless access for data analysis across heterogeneous data sources represents ‘the holy grail' within mainstream enterprises. Designed for Big Data processing and performance at scale, Cirro is a revolutionary approach to bridging corporate analytic data silos.

Posted December 20, 2013

As the sponsor of Cisco's acquisition of Composite Software, I am often asked about expected synergies from combining the leaders in data virtualization and networking. While I cannot divulge all our secrets, analyst firm EMA was prescient in their recent report entitled "Data Virtualization Meets the Network."

Posted December 20, 2013
