Newsletters

Trends and Applications

Institutions around the world depend on Vernon Systems Limited, based in Auckland, New Zealand, to provide sophisticated collection management and web access for cultural treasures. Revelation Software's products have been core to Vernon Systems' business, providing the main development environment for Vernon Systems since it was founded in 1986.

Posted January 20, 2014

Software-defined networking (SDN) is IT's new black, displacing cloud as the technology darling du jour. But while all the focus on the network layers is ultimately good for applications—after all, an optimized network is critical for applications today—SDN does not address challenges in the application layers that are just as key to ensuring performance, security, and availability of applications in the data center and into the cloud.

Posted January 07, 2014

Before standardizing on a specific master data management (MDM) solution for your IT infrastructure, take the time to look under the hood to make sure the MDM platform is capable of keeping up with your business and providing long-term value. By understanding the advantages of today's MDM technology, and focusing on key and often-overlooked technical functionality, you may just find that you are in fact the unsung hero.

Posted January 07, 2014

Big Data: The Battle Over Persistence and the Race for Access Hill

Posted January 07, 2014

Award-winning Attunity Replicate is automated, easy-to-use, high-performance data replication and loading software.

Posted December 20, 2013

Trend-Setting Products in Data for 2014

Posted December 20, 2013

Dell Boomi AtomSphere®, the world's largest integration cloud, enables customers to connect any combination of cloud and on-premise applications without software, appliances or coding. Organizations of all sizes, from growing companies to very large enterprises, enjoy rapid time to value as a result of drastically reduced implementation times and substantial cost savings over traditional integration solutions.

Posted December 20, 2013

Companies are rapidly running out of space in their data warehouses. Odds are, most organizations have far less capacity available than they think. Their last upgrade should have provided enough space for at least two years; with the rapid growth of big data, however, that is often no longer the case.
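A little compound-growth arithmetic shows why a "two-year" capacity buffer can evaporate. The figures below are hypothetical illustrations, not numbers from the article:

```python
# Hypothetical illustration: months until a warehouse fills up,
# given current usage, total capacity, and a monthly growth rate.
def months_until_full(used_tb, capacity_tb, monthly_growth):
    months = 0
    while used_tb < capacity_tb:
        used_tb *= 1 + monthly_growth
        months += 1
    return months

# Doubling headroom lasts about two years at 3% monthly growth...
print(months_until_full(50, 100, 0.03))  # 24 months
# ...but well under a year if growth accelerates to 8% per month.
print(months_until_full(50, 100, 0.08))  # 10 months
```

The point of the sketch is simply that capacity planning based on a fixed historical growth rate breaks down as soon as the growth rate itself changes.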

Posted December 20, 2013

An enterprise RDBMS that spans the globe is now a reality with the geographically distributed TransLattice Elastic Database™ (TED). The nodes are placed wherever needed, without the distance limitations often associated with distributed systems. The system scales out easily and the data is spread across the nodes.

Posted December 20, 2013

Revolution REnterprise (RRE) is the fastest enterprise-class big data analytics platform available today. Supporting a variety of big data statistics, predictive modeling, and machine learning capabilities, RRE provides users with cost-effective and fast big data analytics that are fully compatible with the R language, the de facto standard for modern analytics users.

Posted December 20, 2013

OpenInsight, from Revelation Software, is a database development suite that provides Windows, Web 2.0, and .NET tools to develop and deploy mission-critical applications. These tools can be used with OpenInsight's proprietary NoSQL database, nearly any flavor of SQL database, or any of the many MultiValue databases.

Posted December 20, 2013

Many of the world's most successful companies use Teradata, the world-class enterprise data warehouse, for their high-stakes analytics needs. With rapid data growth and business demands for access to new data sources, the warehouse is experiencing constantly changing demands. Scale and performance are central to those business needs and organizations are now looking to optimize their most critical analytics environments. As data ages, it is less frequently analyzed so organizations are now taking a serious look at dedicated archiving solutions. Doing so is critical for companies with significant compliance requirements. That's where RainStor comes in.

Posted December 20, 2013

Business intelligence initiatives, real-time dashboards, and improved reporting are on everyone's radar. Unfortunately, data is more dispersed than ever before, increasingly distributed in the various SaaS applications that a business relies upon. This makes the challenge of implementing these initiatives far more difficult than in the "good ole days," when one could simply connect the tools directly to a database. SaaS data is not open and accessible in a standard way. Instead, it is exposed via APIs (application programming interfaces) that require custom coding and expensive integration projects just to connect the pieces together.

Posted December 20, 2013

Objectivity, Inc. is the enterprise database leader in real-time, complex big data solutions. Our leading-edge technologies, InfiniteGraph (The Distributed Graph Database™) and Objectivity/DB (a distributed and scalable object management database), enable organizations to develop scalable solutions that discover hidden relationships for improved big data analytics, develop new ROI opportunities, and improve inter-departmental business processes to achieve greater return on data-related investments.

Posted December 20, 2013

Performance as a Service is the hottest new service every database manager must know about because it delivers rapid problem resolution for application and database performance issues. Ntirety provides Database Performance as a Service through a powerful combination of technology and skills which incorporates our own Ntrust™ Database Appliance with AppDynamics PRO and DBTuna software. By bundling these together in a service, you don't need to purchase, install or manage new hardware and software to gain access to powerful performance management capabilities.

Posted December 20, 2013

Since 1985, data quality's been our obsession—the driving force behind many of our scalable data cleansing and enrichment solutions. It's this passion that's culminated in the creation of Melissa Data's new flagship Personator technology, our next-generation enterprise data quality solution.

Posted December 20, 2013

Kore Technologies is a leading provider of enterprise integration, data warehousing, and business intelligence solutions for the mid-market, specializing in companies that use the UniData/UniVerse (U2) database. Kourier Integrator is Kore's flagship product for Extract, Transform, and Load (ETL) and Enterprise Application Integration (EAI), enabling companies to integrate and connect with disparate databases and best-in-class applications.

Posted December 20, 2013

A Wal-Mart store in Chicago offers an endcap: will the additional product sales be greater than the trade funds expense? Just as important: are there trade funds available for this particular product right now? To compile the data needed to assess these promotional programs, sales executives logged into 40 (forty!) different applications to monitor dozens of screens. With JackBe Presto, the company consolidated those 40 systems behind a single sign-on.

Posted December 20, 2013

What do FarmVille, Guess, and the Obama campaign have in common? These are shining examples of successes created by data-driven organizations. The technology that makes these possible is the HP Vertica Analytic Platform, a highly scalable and purpose-built platform for big data analytics. Founded in 2005 by database legend Michael Stonebraker, and acquired by HP in 2011, Vertica has become the de facto standard for analytics within companies like Zynga, Guess, Twitter, Comcast, Cerner, HP, and many others.

Posted December 20, 2013

EnterpriseDB has created the products and an ecosystem of services and support to enable global enterprises to deploy open source software in the data center using Postgres to power their most important applications. The success of open source has already been realized in other layers of the enterprise stack: Xen and KVM for virtualization, Linux for operating systems, and JBoss and Apache for middleware. Forward-thinking CIOs are now turning increasingly to the database layer and to Postgres to reduce their reliance on costly proprietary solutions.

Posted December 20, 2013

Want to stay ahead of the competition? Then you know that this endeavor demands systematic analysis of information on new patents, new technologies, competitors, competing products, market developments, industries and customer expectations. For this purpose, an efficient "radar system" provides essential support in managing these tasks: the Empolis Competitive Intelligence solution is the antenna that brings important information (or signals) to your screen.

Posted December 20, 2013

The race is on! Winners and losers in business are being decided by who can extract more value, with agility, from exponentially increasing information to meet business goals. The vision of many-to-many, data sources to data consumers, is very appealing to top executives, but IT is struggling to get there fast enough. Data virtualization offers a solution that is fast and strategic at the same time. With Denodo, business strategists, CIOs, and other IT experts can plan the implementation of a shared data layer across the enterprise, exposing a common data model and a unified interface over a multiplicity of diverse data sources that can feed and support an increasing number of business applications, from BI and analytics to portals, operational applications, and web and mobile apps.

Posted December 20, 2013

Delphix delivers agility to enterprise application projects, addressing the largest source of inefficiency and inflexibility in the datacenter—provisioning, managing, and refreshing databases for business-critical applications. With Delphix in place, QA engineers spend more time testing and less time waiting for new data, increasing utilization of expensive test infrastructure. Analysts and managers make better decisions with fresh data in data marts and warehouses.

Posted December 20, 2013

Thank you, DBTA, for this distinctive honor, and for the opportunity to share a few words about what makes DBI Software's pureFeat™ Performance Management suite for IBM DB2 LUW distinctively different.

Posted December 20, 2013

The promise of "Big Data" has driven organizations to rethink their approach to traditional business intelligence. To stay competitive, organizations need to harness all of the relevant information to run the business regardless of its type (variety), its size (volume), or the speed at which it's delivered (velocity). Datawatch is at the forefront of Next Generation Analytics by providing organizations the ability to analyze and understand Any Data Variety, regardless of structure, at Real-time Velocity, through an unmatched Visual Data Discovery environment.

Posted December 20, 2013

Database monitoring is useless unless your monitoring system can seamlessly raise intelligent alerts to dispatch the optimum level of response. That's why Datavail, the largest pure-play database services company in North America, developed Datavail Delta, a tool built to monitor a wide variety of OS and database parameters. Delta supports Windows Server 2003, 2008, 2008 R2, and 2012, and SQL Server versions 2000, 2005, 2008, 2008 R2, and 2012.
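The alerting pattern described here is a generic one: compare metric readings against per-parameter thresholds and raise alerts at an appropriate severity. The sketch below illustrates that pattern only; the metric names and thresholds are hypothetical assumptions, not Datavail Delta's actual configuration or implementation:

```python
# Hypothetical sketch of threshold-based monitoring with severity
# levels; not Datavail Delta's actual implementation.
THRESHOLDS = {
    "cpu_percent":       {"warn": 80, "critical": 95},
    "disk_used_percent": {"warn": 85, "critical": 95},
    "blocked_sessions":  {"warn": 5,  "critical": 20},
}

def evaluate(sample):
    """Return (metric, severity) alerts for a dict of metric readings."""
    alerts = []
    for metric, value in sample.items():
        limits = THRESHOLDS.get(metric)
        if limits is None:
            continue  # unmonitored metric: no alert
        if value >= limits["critical"]:
            alerts.append((metric, "critical"))
        elif value >= limits["warn"]:
            alerts.append((metric, "warn"))
    return alerts

print(evaluate({"cpu_percent": 97, "disk_used_percent": 70, "blocked_sessions": 6}))
# [('cpu_percent', 'critical'), ('blocked_sessions', 'warn')]
```

Grading severity, rather than firing a single binary alarm, is what lets a monitoring system "dispatch the optimum level of response" as the blurb puts it.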

Posted December 20, 2013

Seamless access for data analysis across heterogeneous data sources represents the "holy grail" within mainstream enterprises. Designed for big data processing and performance at scale, Cirro is a revolutionary approach to bridging corporate analytic data silos.

Posted December 20, 2013

As the sponsor of Cisco's acquisition of Composite Software, I am often asked about expected synergies from combining the leaders in data virtualization and networking. While I cannot divulge all our secrets, analyst firm EMA was prescient in their recent report entitled "Data Virtualization Meets the Network."

Posted December 20, 2013

As a leader in the data modeling space, CA ERwin is privileged to be an integral part of organizations' key strategic initiatives such as business intelligence and analytics, data governance, or data quality—many of which revolve around data. At CA Technologies, we understand that data runs your business, and we've put a strong focus on developing a solution that can act as an "information hub" for these initiatives.

Posted December 20, 2013

Database Plugins is pleased to have its keystone product, the Database Plugin Server, selected as a trend-setting application for 2013 by DBTA. The Database Plugin Server has certainly fostered a continuing line of innovative products.

Posted December 20, 2013

5 Market Transformations Impacting Big Data in 2014

Posted December 17, 2013

Database technology has gone through something of a renaissance in recent years: new goals and requirements for storage, processing, and analytics have emerged, and with them new database vendors. When evaluating modern data management technologies, there are five key characteristics to consider.

Posted December 17, 2013

Oracle has announced the fifth generation database machine, the Oracle Exadata Database Machine X4, which adds enhancements to improve performance and quality of service for OLTP, DBaaS (database as a service), and data warehousing. Tim Shetler, vice president of product management, Oracle, shared his views on the update in an interview. "There is no price change with this generation. It is the same price as before. We are just giving customers more capacity and more performance," said Shetler.

Posted December 17, 2013

NoSQL, NewSQL and Hadoop - Beyond the Hype and Ready for the Enterprise

Posted December 17, 2013

Big Data Poses Legal Issues and Risks

Posted December 17, 2013

It's time to look back at some of the most interesting big data blog posts of the past 12 months. These 12 posts provide warnings, tips and tricks, and often a touch of humor as well.

Posted December 17, 2013

Serena Software, a provider of orchestrated application development and release management solutions, has announced Serena Release Manager v5. The new release automates application deployments, provides visibility, control and standardization of the release process, and supports coordination and collaboration for release teams.

Posted December 17, 2013

The latest release of CA ERwin Data Modeler, a solution for collaboratively visualizing and managing business data, addresses two major objectives - the need for organizations to manage more data across more platforms, and to easily share that data with an expanding number of users with a range of roles and skill sets.

Posted December 17, 2013

There are many ways that big data can help businesses make better decisions and succeed more quickly, ranging from product innovation to manufacturing to marketing. To help organizations get their big data projects off on the right foot, here are five essential truths about big data analytics.

Posted December 17, 2013

How Enterprises Maintain the Engine Behind Data Growth

Posted December 04, 2013

While there have always been many database choices, it's only recently that enterprises have been embarking on new journeys with their data strategies. Today's database landscape is increasingly specialized and best of breed, due to the expanding range of new varieties of databases and platforms—led by NoSQL, NewSQL, and Hadoop. This is complicating the already difficult job of bringing all these data types together into a well-integrated, well-architected environment.

Posted December 04, 2013

When it comes to service recovery, speed matters. The costs of recovery from failures can be staggering in terms of business service downtime, lost revenues, and damaged reputations. To significantly improve DR preparedness, companies should consider these five dimensions of disaster recovery.

Posted November 13, 2013

There can be many reasons for big data projects failing, but the causes often fall under the umbrella of a lack of careful planning and a failure to effectively reconcile the panoramic visions of business objectives with technical and structural realities. Business objectives are often abstract and high-level, and the failure to operationalize those abstractions into the kind of detailed information that feeds the development of effective big data projects is often the root cause. Precision and measurability are the keys to defining the business objectives that the project will address.

Posted November 13, 2013

Offering DBAs and developers a new means to prove their expertise on MongoDB, the open source document database company has introduced a new program that offers comprehensive exams worldwide through MongoDB University. MongoDB will first offer the Associate MongoDB Certified Developer exam beginning December 3.

Posted November 13, 2013

The number of databases continues to grow, but evolution tells us that not all may survive. Through the natural selection of most useful traits, intensifying the most crucial features, and implementing the best of both, databases will continue to flourish in new remarkable ways, helping organizations achieve specialized goals unique to their business. Here's a look at where the evolutionary path of the data center could take us in the coming years.

Posted November 13, 2013

IBM announced new business analytics and cloud software solutions to help zEnterprise clients take advantage of new workloads. These include a new version of the DB2 and IMS databases, and Cognos analytics tools configured for zEnterprise.

Posted November 13, 2013

The challenges associated with big data can be turned into opportunities for small-to-midsize enterprises (SMEs) with the right data strategy. As SMEs look to increase their businesses, it is critical to incorporate a cost-effective approach that aligns with both existing data challenges and future plans for expansion. Laying a strong base for big data will help SMEs prepare for this growth by providing immediate insight on key business drivers and objectives.

Posted October 23, 2013

Survivorship, known as the Golden Record in data terms, allows for the creation of a single, accurate and complete version of a customer record. A new technique for Golden Record selection offers a much more effective and logical approach when it comes to record survivorship. The most powerful future for data quality lies in the new and unique ability to discern contact data quality information and select the surviving record based on the level of quality of the information provided.
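The selection logic described, choosing the surviving record by contact-data quality rather than, say, most recent update, can be sketched as follows. The field names and the scoring rule are illustrative assumptions, not the vendor's actual algorithm:

```python
# Hypothetical sketch of quality-based record survivorship: among
# duplicate records, keep the one with the highest data-quality score.
def quality_score(record):
    # Assumed scoring rule: one point per field flagged as verified.
    return sum(1 for status in record["field_status"].values() if status == "verified")

def golden_record(duplicates):
    """Select the surviving (Golden) record from a group of duplicates."""
    return max(duplicates, key=quality_score)

dupes = [
    {"id": 1, "field_status": {"email": "verified", "phone": "stale", "address": "stale"}},
    {"id": 2, "field_status": {"email": "verified", "phone": "verified", "address": "verified"}},
]
print(golden_record(dupes)["id"])  # 2
```

The contrast with older survivorship rules is that the winner is determined by the measured quality of its contents, not by metadata such as creation date or source system priority.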

Posted October 23, 2013

Modern data centers contain a mix of physical and virtual systems and must be able to provide access to highly distributed collaborative applications as well as support systems that leverage cloud computing. Here are 8 best practices for achieving data center security and an in-depth analysis of the new security concerns presented by next-generation data centers.

Posted October 23, 2013

Clustrix, provider of a scale-out SQL database engineered for the cloud, has been granted two new patents from the United States Patent and Trademark Office for systems and methods for redistributing and slicing data in relational databases. "For years, people have been trying to figure out how to take a relational database and make sure that you can use it across multiple distributed servers and get linear, better performance, and that is the essence of what these patents do," said Robin Purohit, CEO, Clustrix, in an interview.

Posted October 23, 2013
