Trends and Applications



NoSQL company Couchbase has announced its integration solution for VMware vFabric Application Director. The solution allows deployment and management of Couchbase Server on any VMware vCloud-powered private, public, or hybrid cloud so that application middleware teams can more easily leverage Couchbase Server in their virtual and cloud environments. Because so many users are deploying Couchbase in the cloud, supporting VMware vFabric Application Director seemed a natural fit, Frank Weigel, vice president of products at Couchbase, tells 5 Minute Briefing.

Posted September 26, 2012

Database security company Application Security, Inc. (AppSecInc) has announced the general availability of a major new release of its flagship platform, DbProtect. Version 6.4 incorporates insights gained from 10 years of working with customers, Josh Shaul, CTO of AppSecInc, tells DBTA. DbProtect is intended to let organizations evaluate the security of their database environment and apply preventative controls so they can eliminate security risks without the need to patch or reconfigure databases. With this release, the product, which has been rebuilt from scratch, offers a much easier-to-use interface as well as the ability to provide various groups of stakeholders with individual views based on a single scan, thereby limiting both the burden on the database and user access based on roles, notes Shaul.

Posted September 26, 2012

Quest Software has introduced Toad Business Intelligence Suite, a packaged suite of tools to link traditional and non-traditional data sources, bridging the gap between BI environments and distributed big data sources. In addition, Toad BI Suite aims to span the divide between technical and non-technical users by offering tailored interfaces designed to meet their individual data provisioning and analytic needs.

Posted September 26, 2012

Address data quality is at the heart of all business operations. Inaccurate addresses can cost businesses anywhere from a few cents to many dollars per customer interaction. Undelivered products or information can result in dissatisfied or even lost customers, a cost that is far greater than the fee to implement an automated address check. Here is a list of the top 10 tips for selecting the right address data quality provider.

Posted September 11, 2012

In recent years, the networks of developers, integrators, consultants, and manufacturers committed to supporting database systems have morphed from one-on-one partnerships into huge, interdependent ecosystems, subject to cross-winds of trends and shifts that are reshaping them. Nowhere is this more apparent than in the huge ecosystem that has developed around Oracle. With Oracle's never-ending string of acquisitions, new functionality, and widespread adoption by enterprises, trends that shape this ecosystem are certain to have far-reaching effects on the rest of the IT world. Concerns that percolate through the ecosystem reflect — and influence — broad business concerns. New paradigms — from cloud computing to big data to competing on analytics — are taking root within the Oracle ecosystem long before anywhere else.

Posted September 11, 2012

Are today's data systems — many of which were designed and built for the legacy requirements of the past decade — up to the task of moving information to end users at the moment they need it? And is this information timely enough? In many cases, there is still a lot of work to be done before real-time information, drawn from multiple sources, becomes a reality. A new survey of 338 data managers and professionals who are subscribers to Database Trends and Applications reveals that real-time data access remains a distant goal for at least half of the companies represented in the survey. The survey, conducted by Unisphere Research, a division of Information Today, Inc., in partnership with Attunity in March 2012, finds that close to half of the respondents, 48%, report that relevant data within their organizations still takes 24 hours or longer to reach decision makers. This suggests that much data is still batch-loaded overnight.

Posted September 11, 2012

Big data and cloud analytics vendor Kognitio has partnered with Xtremeinsights, a provider of solutions for leveraging Hadoop in existing data management systems. Together, the partners aim to deliver software and integration technologies to businesses that want to leverage the Hadoop platform and gain actionable insights from their big data. Using its in-memory analytical platform, Kognitio speeds up the analysis of data from Hadoop clusters, enabling ad hoc, real-time analytics at a significantly lower cost. "Xtremeinsights can build the underlying infrastructure so that your business users can do ad hoc analysis on ridiculous amounts of data and get answers in real-time," Michael Hiskey, Kognitio's vice president of marketing and business development, tells DBTA.

Posted August 23, 2012

SAP AG has introduced a new solution to help organizations gain real-time insights into market trends and customer sentiment. The SAP rapid-deployment solution for sentiment intelligence with SAP HANA is intended to allow users to analyze customer sentiment from social networking sites, communities, wikis, blogs, and other sources, and combine the information with CRM data. The customers that have had success getting started with big data analytics are the ones that have set out to solve a very specific use case or problem, David Jonker, director of marketing for database and technology at SAP, tells DBTA. "The rapid deployment solution for sentiment intelligence does exactly that."

Posted August 23, 2012

TransLattice, a provider of distributed databases and application platforms for enterprise, cloud and hybrid environments, has released TransLattice Elastic Database (TED), which the company describes as the world's first geographically distributed relational database management system (RDBMS). A single database can run on multiple TransLattice nodes around the world, allowing for greater data availability, performance, and scalability at a lower cost than traditional databases. "Since we have the ability to pre-position the data close to end users and have a node that's operating on their behalf in distributed queries, we can offer a much higher level of user experience than conventional systems," Michael Lyle, CTO of TransLattice, explains to DBTA. Additionally, TED makes it easier for global enterprises to comply with data jurisdiction policy requirements.

Posted August 23, 2012

Informatica, Talend, Jaspersoft, and Pervasive Software Inc. have joined the Google Cloud Platform Partner Program as Technology Partners. To help customers get the most out of its cloud platform products, Google works closely with technology companies that provide powerful complementary solutions integrated with the platform, explains Eric Morse, head of Sales and Business Development for Google's Cloud Platform.

Posted August 23, 2012

Pentaho's Business Analytics 4.5 is now certified on Cloudera's latest releases, Cloudera Enterprise 4.0 and CDH4. Pentaho also announced that its visual design studio capabilities have been extended to the Sqoop and Oozie components of Hadoop. "Hadoop is a very broad ecosystem. It is not a single project," Ian Fyfe, chief technology evangelist at Pentaho, tells DBTA. "Sqoop and Oozie are shipped as part of Cloudera's distribution so that is an important part of our support for Cloudera as well - providing that visual support which nobody else in the market does today."

Posted August 23, 2012

Oracle has announced Oracle Exalogic Elastic Cloud Software 2.0. According to Oracle, customers in 43 countries across 22 industries have already adopted Oracle Exalogic, and it is the fastest growing Oracle engineered system with 3x Y/Y sales bookings based on the last two quarters of FY 2012. The second generation of Exalogic is raising the bar even further, with a single integrated system that addresses the key business goals of application owners - to seize market opportunities, lower business risk and reduce cost and complexity, noted Hasan Rizvi, senior vice president for product development at Oracle, who spoke during a webcast presentation to launch the new release.

Posted August 23, 2012

Data management software vendor Terracotta has released the latest version of its flagship product, BigMemory 3.7, providing performance at any scale through in-memory data management. The latest release offers improved in-memory access, allowing customers to store big data in real time with high application speed, performance, and scale. "We've made optimizations that allow people to put more data and capacity into our product so that they get more value out of the data that they put in," Gary Nakamura, general manager of Terracotta, explains to DBTA.

Posted August 23, 2012

Flash Masters: Shape Shifting Into the Modern Data Center

Posted August 09, 2012

Oracle has introduced a new migration tool that aims to make it easier for users to migrate data from SQL Server to MySQL. The new migration tool is integrated into MySQL Workbench, which allows the visual design, development and administration of MySQL Databases. According to Oracle, with MySQL Workbench, SQL Server developers and DBAs can easily convert existing applications to run on MySQL, both on Windows and other platforms. In addition, to address the growing demand from business analysts using MySQL for data marts and analytic applications, Oracle has announced a new "MySQL for Excel" application plug-in that allows users to import, export and manipulate MySQL data, without requiring prior MySQL technical knowledge.
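
As a small, hedged illustration (not part of Oracle's tooling), a migrated table can be spot-checked from Python with the mysql-connector-python driver; the connection details and table name below are placeholders:

    # Hypothetical post-migration check: count rows in the MySQL target and compare
    # the figure against the original SQL Server table.
    import mysql.connector

    cnx = mysql.connector.connect(
        host="localhost", user="app_user", password="secret", database="sales_db"
    )
    cursor = cnx.cursor()
    cursor.execute("SELECT COUNT(*) FROM customers")
    print("rows in migrated table:", cursor.fetchone()[0])
    cursor.close()
    cnx.close()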

Posted July 25, 2012

Information availability software provider Attunity has partnered with EMC to offer a high-performance big data replication solution for the EMC Greenplum Unified Analytics Platform, as well as dedicated optimizations to Attunity Replicate from EMC Greenplum. "Enterprise architects have to pay special attention to data flow. Data is their critical asset and right now enterprise architects are really challenged with a ‘data bottleneck,'" Matt Benati, Attunity's vice president of global marketing, tells 5 Minute Briefing.

Posted July 25, 2012

The volume of data now being stored by businesses is at a point where the term "big data" almost feels inadequate to describe it. The size of big data sets is a constantly moving target, ranging from a few dozen terabytes to many petabytes of data in a single data set. And it is estimated that, over the next 2 years, the total amount of big data stored by businesses will be four times today's volume. As business continues its inexorable shift to the cloud, weblogs continue to fuel the big data fire. But there are plenty of other sources as well - RFID, sensor networks, social networks, Internet text and documents, Internet search indexing, call detail records, scientific research, military surveillance, medical records, photography archives, video archives, and large-scale e-commerce transaction records.

Posted July 25, 2012

Big data is one of the most significant industry disruptors in IT today. Even in its infancy, it has shown significant ROI and has almost universal relevance to a wide cross-section of the industry. Why? Big data turns traditional information architecture on its head, putting into question commonly accepted notions of where and how data should be aggregated, processed, analyzed, and stored. Enter Hadoop and NoSQL, the open source data-crunching platforms. Although these technologies are hotter than an Internet IPO, you simply can't ignore your current investments - the investments in SQL that drive everything from your data warehouse to your ERP, CRM, SCM, HCM, and custom applications.

Posted July 25, 2012

Big data analytics provider Datameer has announced that its technology is now integrated into Dell's Emerging Solutions Ecosystem, which offers complementary hardware, software, and services in a pre-packaged bundle. Built with Dell infrastructure, Datameer analytics technology is offered on top of Cloudera's distribution of Apache Hadoop. The software is well suited to organizations that need to make quick decisions based on structured and unstructured data, such as financial services, retail, telecommunications, and Web 2.0 companies. "The advantage to the end user is they can buy all of this from one source, it's all integrated together, it runs seamlessly, and it's an easy solution for people to purchase and use," Joe Nicholson, Datameer's vice president of marketing and business development, tells DBTA.

Posted July 25, 2012

Jaspersoft, maker of business intelligence (BI) software, has announced the availability of Jaspersoft Business Intelligence 4.7, in which reports generated by JasperReports Server now give casual users the ability to interact with more of their data. According to Jaspersoft, its open source business model and zero-cost per-user licensing fees make interactive reporting affordable for even the largest scale reporting projects. Additional improvements in Jaspersoft 4.7 include direct native connectivity to big data sources and expanded mobile device support. "This is exciting for a couple of reasons," Mike Boyarski, director of product marketing for Jaspersoft, tells DBTA. "One is that no BI tool out there so far has this level of interactivity for what we call casual BI users."

Posted July 25, 2012

Syncsort, a global leader in high-performance data integration solutions, has certified its DMExpress data integration software for high-performance loading of Greenplum Database. Syncsort has also joined the Greenplum Catalyst Developer Program. Syncsort DMExpress software delivers extensive connectivity that makes it easy to extract and transform data from nearly any source, and rapidly load it into the massively parallel processing (MPP) Greenplum Database without the need for manual tuning or custom coding. "IT organizations of all sizes are struggling to keep pace with the spiraling infrastructure demands created by the sheer volume, variety and velocity of big data," says Mitch Seigle, vice president, Marketing and Product Management, Syncsort.

Posted July 25, 2012

Compuware Corporation has launched a free online cloud service that enables organizations to compare their website's performance against leading competitor sites. SpeedoftheWeb.org provides a series of industry-relevant indexes with categorized collections of 1,000 of the world's most trafficked websites - as ranked by Alexa and Compuware Gomez benchmarks. As end-user expectations rise, the cost of poor website performance also grows, both in lost revenue and in brand loyalty. End users are also showing less patience for slow-performing websites, according to Compuware. In response to these heightened requirements, organizations need to ensure performance at every point in the web application delivery chain, Alois Reitbauer, Compuware APM's Web 2.0 and mobile performance expert and author of Java Enterprise Performance, tells DBTA.
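
By way of illustration only - this is not Compuware's methodology - a rough feel for relative site speed can be had by timing simple page requests from Python; the URLs below are placeholders:

    # Toy sketch: time a single GET against each site as a crude proxy for
    # responsiveness. Real benchmarks such as SpeedoftheWeb measure full page
    # load, from multiple geographies, across many samples.
    import time
    import requests

    for url in ["https://www.example.com", "https://www.example.org"]:
        start = time.perf_counter()
        requests.get(url, timeout=10)
        print(url, f"{time.perf_counter() - start:.2f}s")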

Posted July 25, 2012

Empowering Business Analysts with Faster Insights

Posted July 11, 2012

Privacy in the Next Gen Data Center: Harmonization Across Regulations to Decrease Complexity

Posted July 11, 2012

Big Data or Right Data: What Really Matters?

Posted July 11, 2012

Datameer has announced a new release of its big data analytics solution, which combines data integration, analytics, and visualization of any data type in one application. The new capabilities offered in Datameer 2.0 fall into two main categories, Joe Nicholson, vice president of marketing at Datameer, tells DBTA: one is new functionality, and the other is bringing Hadoop to the desktop, with Hadoop natively embedded in two of the three new editions of the application.

Posted June 28, 2012

In a live presentation that was also made available on the web, Oracle CEO Larry Ellison unveiled Oracle's cloud strategy and introduced Oracle Cloud Social Services, a new enterprise social platform offering. "We made a decision to rebuild all of our applications for the cloud almost 7 years ago. We called that project ‘Fusion,'" he told the audience. While joking that, at the time, some competitors called it "con-fusion," Ellison also recounted the years of work, input from thousands of people, and billions of dollars that were required to enable Oracle to make the transition from being an on-premise application provider to being a cloud application provider - as well as an on-premise application provider - and to rewrite and modernize all of its applications.

Posted June 28, 2012

Lucid Imagination, a developer of search, discovery and analytics software based on Apache Lucene and Apache Solr technology, has unveiled LucidWorks Big Data, a fully integrated development stack that combines advantages of multiple open source projects including Hadoop, Mahout, R and Lucene/Solr to provide search, machine learning, recommendation engines and analytics for structured and unstructured content in one solution available in the cloud. "With more and more companies being challenged by the explosive growth of information, as has been widely reported, the vast majority of that content is unstructured or semi structured text, and traditional business intelligence or traditional analytics methodologies don't come close to addressing the vast percentage of content," Paul Doscher, CEO of Lucid Imagination, tells DBTA.
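
To give a flavor of the Lucene/Solr layer that underpins a stack like LucidWorks Big Data, here is a hedged sketch (not LucidWorks-specific) of a keyword search against a Solr core over HTTP; the host, core name, and field are placeholders:

    # Illustrative only: query Solr's select handler and print matching document IDs.
    import requests

    resp = requests.get(
        "http://localhost:8983/solr/collection1/select",
        params={"q": "text:analytics", "rows": 5, "wt": "json"},
    )
    for doc in resp.json()["response"]["docs"]:
        print(doc.get("id"))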

Posted June 28, 2012

IBM stepped up its smarter computing initiative with a broad range of performance and efficiency enhancements to its storage and technical computing systems - the engines of big data. As part of its ongoing smarter computing effort, IBM has announced a new strategic approach to designing and managing storage infrastructures with greater automation and intelligence, as well as performance enhancements to several key storage systems and the Tivoli Storage Productivity Center suite. IBM also announced its first offerings that incorporate software from IBM's acquisition of Platform Computing earlier this year. "Enterprises are dealing with data that is increasing exponentially in both size and complexity," said Rod Adkins, senior vice president of IBM Systems & Technology Group. The enhanced systems and storage solutions have the performance, efficiency, and intelligence to handle this big data, he added.

Posted June 28, 2012

Data analytics vendor Teradata and information management software provider Kalido have introduced a new joint solution that they say will allow customers to build or expand a data warehouse in 90 days or less, providing deeper analytics to users for improved business decision-making. This solution combines the Teradata Data Warehouse Appliance with the Kalido Information Engine, providing customers with a streamlined data consolidation tool that aggregates disparate data into a single unified platform.

Posted June 28, 2012

Companies are scrambling to learn all the various ways they can slice, dice, and mine big data coming in from across the enterprise and across the web. But with the rise of big data — hundreds of terabytes or petabytes of data — comes the challenge of where and how all of this information will be stored. For many organizations, current storage systems — disks, tapes, virtual tapes, clouds, in-memory systems — are not ready for the onslaught, industry experts say. There are new methodologies and technologies coming on the scene that may help address this challenge. But one thing is certain: Whether organizations manage their data in their internal data centers, or in the cloud, a lot more storage is going to be needed. As Jared Rosoff, director of customer engagement with 10gen, puts it: "Big data means we need ‘big storage.'"

Posted June 13, 2012

As data continues to grow unabated, organizations are struggling to manage it more efficiently. By better leveraging their expanding data stores and making the information available more widely, organizations hope to put big data to work — helping them to achieve greater productivity and more informed decision making, as well as compete more effectively as a result of insights uncovered by analytics on their treasure troves of information. Improving the management of big data is not something to consider addressing at some point in the hazy future — the big data challenge is already here, according to a new survey of 264 data managers and professionals who are subscribers to Database Trends and Applications.

Posted June 13, 2012

Social media network-based business intelligence represents the next great frontier of data management, promising decision makers vast vistas of new knowledge gleaned from exabytes of data generated by customers, employees, and business partners. Mining data from Facebook, Twitter, blogs, wikis, and internal corporate networks potentially may surface new insights into impending market shifts, patterns in customer sentiment, and competitive intelligence. It's a rich opportunity not lost on today's organizations, a new survey of 711 business and IT managers from across the globe reveals. A majority of respondents are either planning to collect and analyze data from both proprietary and public social media networks, or are doing so already.

Posted June 13, 2012

Informix Genero, a new IBM offering developed in partnership with Four Js Development Tools, is a logical enhancement to the Informix 4GL language and environment that offers extensive capabilities for developing modern web and desktop GUI applications, reports, and web services. With IBM Informix Genero, users can recompile 4GL applications and run them as GUI and web applications while retaining the business logic.

Posted May 23, 2012

Eagle Creek Software Services - CRM and BI Market Leader Profile

Posted May 23, 2012

As a leader in pharmacy technology, National Health Systems, Inc. provides a range of services for the retail pharmacy industry. Built on a foundation of dedication and commitment to its customers and the profession of Pharmacy, NHS companies PDX, NHIN, and Rx.com provide pharmacies with the tools they need to provide the best possible patient care, manage their businesses, and enhance their competitiveness in the marketplace.

Posted May 23, 2012

Google has announced that Google BigQuery, a web service that lets users run interactive analyses of massive data sets, is now available to the public. The company bills the service as enabling customers to "analyze terabytes of data with just a click of a button," and says the data is secured, replicated across data centers, and can be easily exported.
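
As a hedged illustration of the kind of interactive query BigQuery serves, the sketch below uses the current google-cloud-bigquery Python client (not part of the 2012 announcement) against one of Google's public sample tables; the project name is a placeholder:

    # Minimal sketch: run an interactive SQL query with the google-cloud-bigquery
    # client. Assumes default application credentials; "my-project" is a placeholder.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")
    query = """
        SELECT word, SUM(word_count) AS total
        FROM `bigquery-public-data.samples.shakespeare`
        GROUP BY word
        ORDER BY total DESC
        LIMIT 10
    """
    for row in client.query(query).result():  # result() waits for the job to finish
        print(row.word, row.total)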

Posted May 23, 2012

Informatica Corporation, a provider of data integration software and services, has announced the latest release of its Informatica Cloud solution. Offered as an integration platform-as-a-service (iPaaS), the latest release from Informatica features the Cloud Connector Toolkit, Cloud Integration Templates, and new enterprise features, all of which are part of the new Informatica Cloud Developer Edition and allow developers to rapidly embed end-user customizable integration logic and connectivity into cloud applications.

Posted May 23, 2012

Big data has become a big topic for 2012. It's not only the size, but also the complexity of formats and the speed of delivery of data that is starting to exceed the capabilities of traditional data management technologies, requiring the use of new or exotic technologies simply to manage the volume. In recent years, the democratization of analytics and business intelligence (BI) solutions has become a major driving force for data warehousing, resulting in the use of self-service data marts. One major implication of big data is that in the future, users will not be able to put all useful information into a single data warehouse. Logical data warehouses, which bring together information from multiple sources as needed, are replacing the single data warehouse model. Combined with the fact that enterprise IT departments are continually moving toward distributed computing environments, the need for IT process automation to automate the integration and movement of data between these disparate sources is more important than ever.

Posted May 09, 2012

There has been a significant change in the IT world recently: solution developers no longer believe the answer to every data management challenge is a relational database. After 40 years, data management had come to be considered a quiet part of IT in which the products and providers were firmly established. It is now evident that information management has again become quite dynamic, with a broad set of solutions offering new options for managing today's big data challenges.

Posted May 09, 2012

The term "big data" refers to the massive amounts of data being generated on a daily basis by businesses and consumers alike - data which cannot be processed using conventional data analysis tools owing to its sheer size and, in many case, its unstructured nature. Convinced that such data hold the key to improved productivity and profitability, enterprise planners are searching for tools capable of processing big data, and information technology providers are scrambling to develop solutions to accommodate new big data market opportunities.

Posted May 09, 2012

CIOs and IT departments are on the frontlines of a monumental IT shift. With the number of mobile devices and applications exploding and bandwidth soaring, they are being asked to find ways to enable the brave new world of enterprise mobility. All involved - from users to IT - recognize the productivity and business efficiency benefits of this trend, but it is typically only IT that also recognizes the dangers unchecked mobility poses to sensitive corporate data.

Posted May 09, 2012

Learning from the European Union Data Directives on Privacy

Posted April 26, 2012

Oracle addressed the need to make IT infrastructure and business analytics technologies simpler and more efficient in a presentation to OpenWorld Tokyo 2012 attendees that was also made available via live webcast. In addition to presenting its strategy and plans for business analytics, the company also unveiled new additions to its product portfolio. In his keynote address, Oracle president Mark Hurd explained how the business users of tomorrow will require faster and more comprehensive information access. "The true question with analytics is how to get the right information to the right person at the right time to make the right decision," he said.

Posted April 26, 2012

IBM has introduced DB2 10 and InfoSphere Warehouse 10 software that integrates with big data systems, automatically compresses data into tighter spaces to prevent storage sprawl, and slices information from the past, present, and future to eliminate expensive application code. Over the past 4 years, more than 100 clients, 200 business partners, and hundreds of experts from IBM Research and Software Development Labs around the world collaborated to develop the new software.

Posted April 26, 2012

MapR Technologies, Inc., provider of the MapR distribution for Apache Hadoop, has introduced new data connection options for Hadoop to enable a range of data ingress and egress alternatives for customers. These include direct file-based access using standard tools and file-based applications; direct database connectivity; Hadoop-specific connectors via Sqoop, Flume, and Hive; and direct access to popular data warehouses and applications using custom connectors. Additionally, technology providers Pentaho and Talend are partnering with MapR to provide direct integration with MapR's distribution, and MapR has also entered into a partnership with data warehouse and business intelligence platform vendor Tableau Software.
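
For a sense of what Sqoop-style database connectivity looks like in practice, here is a hedged sketch (generic Sqoop usage, not MapR-specific) that drives a Sqoop import from Python; the JDBC URL, credentials, table, and HDFS path are placeholders:

    # Illustrative only: launch a Sqoop import to copy a relational table into HDFS.
    # All connection details below are hypothetical.
    import subprocess

    subprocess.run(
        [
            "sqoop", "import",
            "--connect", "jdbc:mysql://dbhost/sales",
            "--username", "etl_user",
            "--password", "secret",
            "--table", "orders",
            "--target-dir", "/data/raw/orders",
        ],
        check=True,
    )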

Posted April 26, 2012

Helping customers manage the influx of mobile devices, networks and applications in the enterprise, SAP has unveiled a new release of the Afaria mobile device management solution. With the 7.0 release of Afaria, SAP aims to allow enterprise IT to more effectively manage mobile applications and devices through a new user interface (UI) for simplified administration, improved workflow and enterprise integration capabilities. "The consumerization of IT is driving our innovation path and commitment to providing customers with the industry's most comprehensive, robust and streamlined mobility management platform, including mobile device management," says Sanjay Poonen, president, Global Solutions, SAP.

Posted April 26, 2012

Big News with Big Data - Commercial Enhancements to Apache Hadoop Usher in a New Era

Posted April 11, 2012

Seven Steps for a Successful Applications Rationalization Initiative

Posted April 11, 2012

The Next Information Processing Super Engine

Posted April 11, 2012
