Trends and Applications



While there have always been many database choices, only recently have enterprises embarked on new journeys with their data strategies. Today's database landscape is increasingly specialized and best-of-breed, due to the expanding range of new database varieties and platforms—led by NoSQL, NewSQL, and Hadoop. This complicates the already difficult job of bringing all these data types together into a well-integrated, well-architected environment.

Posted December 04, 2013

When it comes to service recovery, speed matters. The costs of recovering from failures can be staggering in terms of business service downtime, lost revenues, and damaged reputations. For DR preparedness to significantly improve, companies should consider these five dimensions of disaster recovery.

Posted November 13, 2013

There can be many reasons for big data projects failing, but the causes often fall under the umbrella of a lack of careful planning and a failure to effectively reconcile the panoramic visions of business objectives with technical and structural realities. Business objectives are often abstract and high-level, and the failure to operationalize those abstractions into the kind of detailed information that feeds the development of effective big data projects is often at the root of the problem. Precision and measurability are the keys to defining the business objectives that the project will address.

Posted November 13, 2013

Offering DBAs and developers a new means to prove their expertise on MongoDB, the open source document database company has introduced a new program that offers comprehensive exams worldwide through MongoDB University. MongoDB will first offer the Associate MongoDB Certified Developer exam beginning December 3.

Posted November 13, 2013

The number of databases continues to grow, but evolution tells us that not all may survive. Through the natural selection of the most useful traits, the intensification of the most crucial features, and implementations that combine the best of both, databases will continue to flourish in remarkable new ways, helping organizations achieve specialized goals unique to their business. Here's a look at where the evolutionary path of the data center could take us in the coming years.

Posted November 13, 2013

IBM announced new business analytics and cloud software solutions to help zEnterprise clients take advantage of new workloads. These include new versions of the DB2 and IMS databases, and Cognos analytics tools configured for zEnterprise.

Posted November 13, 2013

The challenges associated with big data can be turned into opportunities for small-to-midsize enterprises (SMEs) with the right data strategy. As SMEs look to increase their businesses, it is critical to incorporate a cost-effective approach that aligns with both existing data challenges and future plans for expansion. Laying a strong base for big data will help SMEs prepare for this growth by providing immediate insight on key business drivers and objectives.

Posted October 23, 2013

Survivorship, which produces what is known in data quality terms as the Golden Record, allows for the creation of a single, accurate, and complete version of a customer record. A new technique for Golden Record selection offers a much more effective and logical approach to record survivorship. The most powerful future for data quality lies in the new ability to discern contact data quality information and select the surviving record based on the level of quality of the information provided.

Posted October 23, 2013
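The quality-based survivorship described above can be sketched in a few lines. This is a hypothetical illustration, not the vendor's actual technique: each duplicate record receives a quality score, the highest-scoring record survives, and its missing fields are backfilled from the runners-up.

```python
# Quality-based survivorship: pick the duplicate record whose contact
# data scores highest, then backfill any missing fields from the rest.
def select_golden_record(duplicates, quality_of):
    # Rank candidates by quality score (higher is better).
    ranked = sorted(duplicates, key=quality_of, reverse=True)
    golden = dict(ranked[0])
    # Backfill fields the survivor lacks from the next-best records.
    for record in ranked[1:]:
        for field, value in record.items():
            if not golden.get(field) and value:
                golden[field] = value
    return golden

# Toy scoring function: count populated fields. A real implementation
# would weigh verified phone, email, and address quality individually.
def field_count(record):
    return sum(1 for v in record.values() if v)

dupes = [
    {"name": "J. Smith", "phone": "", "email": "js@example.com"},
    {"name": "John Smith", "phone": "555-0100", "email": ""},
]
print(select_golden_record(dupes, field_count))
```

The scoring function is the pluggable part: swapping in a smarter `quality_of` changes which record survives without touching the selection logic.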

Modern data centers contain a mix of physical and virtual systems and must be able to provide access to highly distributed collaborative applications as well as support systems that leverage cloud computing. Here are 8 best practices for achieving data center security and an in-depth analysis of the new security concerns presented by next-generation data centers.

Posted October 23, 2013

Clustrix, provider of a scale-out SQL database engineered for the cloud, has been granted two new patents from the United States Patent and Trademark Office for systems and methods for redistributing and slicing data in relational databases. "For years, people have been trying to figure out how to take a relational database and make sure that you can use it across multiple distributed servers and get linear, better performance, and that is the essence of what these patents do," said Robin Purohit, CEO of Clustrix, in an interview.

Posted October 23, 2013

MarkLogic unveiled the latest version of its Enterprise NoSQL database platform, MarkLogic 7. With this release, MarkLogic automates the process of placing data in the storage tiers that are most cost-effective and performance-appropriate, and introduces the new MarkLogic Semantics option.

Posted October 23, 2013

Datical, Inc. has emerged from stealth mode and introduced Datical DB, which uses a patent-pending, data model approach for automating, simplifying, and managing database schema change, configuration, and complex dependencies as part of the application release process. The solution provides a graphical user interface and wizards for creating a model of the schema from an existing database, which can be simultaneously migrated to any environment and database, such as Oracle, DB2, MySQL, SQL Server, and Postgres.

Posted October 23, 2013

Cloud computing is no longer hype; it is the reality today for most organizations because of the numerous benefits that it brings. There are three main deployment models for cloud computing—private cloud, public cloud, and a hybrid mix of the two. Here is a look at the risks and disadvantages with each of the cloud deployment models.

Posted October 09, 2013

Modern marketers are increasingly targeting their efforts at providing the right information to the right customer at just the right moment. These marketers see the ability to provide relevant content to clients and prospective customers as a way of building their brand awareness and forging lasting relationships. However, according to a new survey, sponsored by Skyword, Inc., and conducted by Unisphere Research, a division of Information Today, Inc., there are obstacles in the path to achieving the full effect of these efforts.

Posted October 09, 2013

Constantly changing tax rules can make payroll deductions and tax payments a time-consuming and costly endeavor for businesses. To get this onerous job done efficiently and cost-effectively, many utilize payroll software specialists that provide tools to support their in-house staff. Read how Revelation Software's OpenInsight and OpenInsight for Web are giving Ardbrook, a Dublin, Ireland-based provider of payroll software, the agility it needs.

Posted October 09, 2013

Why is data integration so difficult? For many organizations, data integration has been handled as a dark art over the years, implemented behind the scenes with ad hoc scripts, extract, transform, and load (ETL) operations, connectors, manual coding, and patching. Often, front-end applications to get at needed data are built and deployed one at a time, requiring considerable IT staff time, as well as creating a waiting period for business decision makers. To address this challenge, there's a renewed push across the industry to elevate data integration from being a series of one-off projects shrouded in mystery to the core of a multidisciplinary, enterprise architecture—incorporating new and existing approaches such as master data management, data virtualization, and data integration automation.

Posted September 26, 2013
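The ETL pattern the article refers to reduces, in miniature, to three steps: pull records out of a source, reshape them to the target schema, and write them into a store. A minimal sketch, in which the CSV format and all field names are illustrative assumptions:

```python
# Extract: parse raw source records (here, comma-separated lines).
def extract(source):
    return [line.split(",") for line in source]

# Transform: normalize names and convert the amount field to a number.
def transform(rows):
    return [{"customer": name.strip().title(), "amount": float(amt)}
            for name, amt in rows]

# Load: append the cleaned records to the target store.
def load(records, warehouse):
    warehouse.extend(records)

warehouse = []
load(transform(extract(["alice ,10.5", "BOB,7"])), warehouse)
print(warehouse[0])  # {'customer': 'Alice', 'amount': 10.5}
```

The one-off scripts the article describes tend to fuse these steps; keeping them as separate, composable functions is one small move toward the reusable integration architecture it advocates.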

EMC's CEO and chairman Joe Tucci gave a keynote at Oracle OpenWorld 2013 on the transition occurring in IT and the data center of the future. There are four key macro trends driving the transformation in IT, said Tucci. These tremendously disruptive and opportunistic trends include mobility, cloud computing, big data, and social networking. Jeremy Burton, EVP at EMC, cited a recent IOUG-Unisphere Research survey report which showed that the daily DBA activities most on the rise are systems monitoring, performance diagnosis, and managing backup and recovery. Oracle and EMC are integrating their technologies to allow customers to spend less time in the back office so they can devote more time to the front office dealing with more impactful business issues, said Burton.

Posted September 26, 2013

At LinuxCon 2013, IBM announced plans to invest $1 billion in new Linux and open source technologies for IBM's Power Systems servers. The new pledge recalled the company's announcement in 2000 that it would embrace Linux as strategic to its systems strategy, followed a year later by the promise of $1 billion dedicated to backing the Linux movement. The new investment will fuel two immediate initiatives - a new client center in Europe and a Linux on Power development cloud.

Posted September 26, 2013

There is no limit to the potential, business-building applications for big data, springing from the capability to provide new, expansive insights never before available to business leaders. However, the new forms of data, along with the speed at which they need to be processed, require significant work on the back end, which many organizations may not yet be ready to tackle. IT leaders agree that to make the most of big data, they will need to redouble efforts to consolidate data environments, bring in new solutions, and revisit data retention policies. These are the conclusions of a new survey of 322 data managers and professionals who are members of the Independent Oracle Users Group (IOUG). The survey was underwritten by Oracle Corp. and conducted by Unisphere Research, a division of Information Today, Inc.

Posted September 26, 2013

Oracle CEO Larry Ellison made three key announcements in his opening keynote at Oracle OpenWorld, the company's annual conference for customers and partners in San Francisco. Ellison unveiled the Oracle Database In-Memory Option to Oracle Database 12c which he said speeds up query processing by "orders of magnitude," the M6 Big Memory Machine, and the new Oracle Database Backup Logging Recovery Appliance. Explaining Oracle's goals with the new in-memory option, Ellison noted that in the past there have been row-format databases, and column-format databases that are intended to speed up query processing. "We had a better idea. What if we store data in both formats simultaneously?"

Posted September 26, 2013
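The row-versus-column trade-off behind the dual-format idea above can be illustrated with a toy example. This is a simplified sketch of the two layouts and the workloads each favors, not Oracle's actual implementation:

```python
# The same table in two layouts.
rows = [
    {"id": 1, "region": "West", "sales": 100},
    {"id": 2, "region": "East", "sales": 250},
    {"id": 3, "region": "West", "sales": 175},
]

# Row format suits OLTP: fetch one whole record at a time.
def get_record(record_id):
    return next(r for r in rows if r["id"] == record_id)

# Column format suits analytics: scan one attribute across all rows.
columns = {key: [r[key] for r in rows] for key in rows[0]}

def total_sales(region):
    return sum(s for reg, s in zip(columns["region"], columns["sales"])
               if reg == region)

print(get_record(2))        # a full row, as an OLTP lookup needs
print(total_sales("West"))  # 275, touching only two columns
```

Maintaining both layouts at once, as the keynote describes, trades extra memory for the freedom to serve each query from whichever format scans less data.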

RainStor, a provider of an enterprise database for managing and analyzing all historical data, has introduced RainStor FastForward, a new product that enables customers to reinstate data from Teradata tape archives (also known as BAR, for Backup, Archive and Restore) and move it to RainStor for query. The new product resolves a pressing challenge for Teradata customers that need to archive their Teradata warehouse data to offline tape, which can make that data difficult to access and query when business and regulatory users require it, Deirdre Mahon, vice president of marketing at RainStor, explained in an interview.

Posted September 26, 2013

If you look at what is really going on in the big data space, it is all about inexpensive open source solutions that are facilitating the modernization of data centers and data warehouses, and at the center of this universe is Hadoop. In the evolution of the big data market, open source is playing a seminal role as the "disruptive technology" challenging the status quo. Additionally, organizations large and small are leveraging these solutions, often based on inexpensive hardware and memory platforms, in the cloud or on premises.

Posted September 11, 2013

Featuring a multi-tenant architecture that streamlines the process of consolidating databases onto the cloud and enables organizations to manage many databases as one, Oracle Database 12c is a next-generation database. To avoid confusion, however, here are 10 things that Oracle Database 12c is not.

Posted September 11, 2013

Oracle holds an enviable position in the IT marketplace with a wide array of database systems, development tools, languages, platforms, enterprise applications, and servers. Riding the coattails of this industry giant is a healthy and far-flung ecosystem of software developers, integrators, consultants, and OEMs. These are the partners that will help make or break Oracle's struggle with new forces disrupting the very foundations of IT. And lately, Oracle—long known for its own brand of xenophobia and disdain for direct competitors—has been making a lot of waves by forging new alliances with old foes. This is opening up potentially lucrative new frontiers for business partners at all levels.

Posted September 11, 2013

As the sponsor of Cisco's acquisition of Composite Software, I am often asked about expected synergies from combining the leaders in data virtualization and networking. While I cannot divulge all our secrets, analyst firm EMA was prescient in its recent report entitled "Data Virtualization Meets the Network."

Posted September 10, 2013

For more than a decade, MarkLogic has delivered a powerful and trusted next-generation Enterprise NoSQL database that enables organizations to turn all data into valuable and actionable information. Organizations around the world rely on MarkLogic's enterprise-grade technology to make better decisions faster.

Posted September 10, 2013

Independent Oracle Users Group (IOUG) members will be out in force at OpenWorld 2013 - presenting more than 40 sessions on the topics you want to learn about most. Celebrating its 20th anniversary this year, the IOUG represents the independent voice of Oracle technology and database professionals and allows them to be more productive in their business and careers through context-rich education, sharing best practices, and providing technology direction and networking opportunities.

Posted September 10, 2013

The Independent Oracle Users Group (IOUG) is a community of Oracle database and technology professionals that has provided virtual and in-person knowledge-sharing/education and networking opportunities for 20 years.

Posted September 10, 2013

Exhibiting at Oracle OpenWorld 2013 in booth 2214, Attunity is a leading provider of data integration software solutions that make data available where and when needed across heterogeneous enterprise platforms and the cloud.

Posted September 10, 2013

The flagship Oracle OpenWorld conference each year in San Francisco is recognized for attracting tens of thousands of highly qualified and influential attendees from more than 140 countries. With a focus on key current and future Oracle technologies and solutions, attendees converge at the conference because they are serious about educating themselves and having a good time as well.

Posted September 10, 2013

On Tuesday morning, September 24, at 8:00 a.m. PT, EMC Chairman and CEO Joe Tucci will deliver a keynote address at Oracle OpenWorld 2013 in San Francisco, CA. In addition, as a Diamond sponsor, EMC will have six breakout sessions covering relevant best practices for optimizing your Oracle infrastructure. On the show floor, attendees can engage with EMC's Oracle experts and learn how to Lead Your Transformation in booth #1301.

Posted September 10, 2013

Melissa Data provides phone and name verification, and customer deduping, within Oracle Forms, Oracle/Java applications, PL/SQL packages, and PeopleSoft. Melissa Data APIs and Web Services for Data Quality lower costs, improve customer relations, enable BI initiatives, and empower sales and marketing teams to generate and close more opportunities.

Posted September 10, 2013

Delphix delivers agility to enterprise application projects, addressing the largest source of inefficiency and inflexibility in the datacenter—provisioning, managing, and refreshing databases for business-critical applications. With Delphix in place, QA engineers spend more time testing and less time waiting for new data, increasing utilization of expensive test infrastructure.

Posted September 10, 2013

Make big ideas happen with a strategic approach to converged cloud, converged infrastructure, big data, security and management tools for your database, middleware and applications. HP booth 1701 is the heartbeat of innovation, your area for conversations and investigation.

Posted September 10, 2013

Database performance issues? Our Oracle customers rely on Ignite to help them quickly pinpoint exactly where the problem lies—in just four clicks. Confio Ignite is a tool for DBAs, developers, and IT managers to collaborate and resolve performance issues faster.

Posted September 10, 2013

Even before all the new data sources and platforms that big data has come to represent arrived on the scene, providing users with access to the information they need when they need it was a big challenge. What has changed today? The growing range of data types beyond traditional RDBMS data - and a growing awareness that effectively leveraging data from a wide variety of sources will result in the ability to compete more effectively. Join DBTA on Thursday, August 29, at 11 am PT / 2 pm ET for a special roundtable webcast to learn about the essential technologies and approaches that help to overcome the big data integration challenges that get in the way of gaining actionable insights.

Posted August 21, 2013

There may be no more commonly used term in today's IT conversations than "big data." There also may be no more commonly misused term. Here's a look at the truth behind the five most common big data myths, including the misguided but almost universally accepted notion that big data applies only to large organizations dealing with great volumes of data.

Posted August 21, 2013

Data analytics, long the obscure pursuit of analysts and quants toiling in the depths of enterprises, has emerged as the must-have strategy of organizations across the globe. Competitive edge not only comes from deciphering the whims of customers and markets but also being able to predict shifts before they happen. Fueling the move of data analytics out of back offices and into the forefront of corporate strategy sessions is big data, now made enterprise-ready through technology platforms such as Hadoop and MapReduce. The Hadoop framework is seen as the most efficient file system and solution set to store and package big datasets for consumption by the enterprise, and MapReduce is the construct used to perform analysis over Hadoop files.

Posted August 21, 2013
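The Hadoop/MapReduce division of labor described above - map emits key-value pairs, a shuffle groups them by key, and reduce aggregates each group - can be sketched locally with the canonical word-count example. Hadoop distributes these same phases across a cluster; this sketch only shows the programming model:

```python
from collections import defaultdict

# Map phase: emit (word, 1) for every word in a document.
def map_phase(document):
    for word in document.split():
        yield (word.lower(), 1)

# Reduce phase: sum the counts collected for each word.
def reduce_phase(grouped):
    return {word: sum(counts) for word, counts in grouped.items()}

documents = ["Big data needs big tools", "Hadoop stores big datasets"]

# Shuffle: group intermediate pairs by key before reducing.
grouped = defaultdict(list)
for doc in documents:
    for word, count in map_phase(doc):
        grouped[word].append(count)

print(reduce_phase(grouped)["big"])  # 3
```

The point of the construct is that map and reduce are independent per key, so the framework can run them on many machines over HDFS blocks without changing the user's code.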

Progress Software announced availability of new data connectivity and application capabilities as part of its Progress Pacific application platform-as-a-service (aPaaS) for building and managing business applications on any cloud, mobile or social platform. "Pacific is a platform running in the cloud that is targeted at small and medium-size businesses, ISVs and departmental IT," John Goodson, chief product officer at Progress Software, explained in an interview. Instead of requiring a highly trained IT staff to build applications, Goodson says, the platform provides a visual design paradigm that allows users with limited skills to build powerful applications that can quickly connect to any data sources.

Posted August 21, 2013

SAP AG introduced new high availability and disaster recovery functionality with SAP Sybase Replication Server for SAP Business Suite software running on SAP Sybase Adaptive Server Enterprise (SAP Sybase ASE). "After only a year and a quarter supporting the Business Suite, ASE has already garnered about 2,000 customer installations. This easily provides that near zero-downtime for HA/DR that is non-intrusive to the system using Replication Server as the key enabling technology," said Dan Lahl, vice president, Database Product Marketing, SAP, in an interview.

Posted August 21, 2013

IBM announced its new zEnterprise BC12 (zBC12) mainframe, designed for enhanced analytics, cloud, and mobile computing. Priced at $75,000 for the base model, IBM says it is targeting smaller organizations. The computer giant says it is also adding new industry solutions and software and operating systems across its zEnterprise portfolio, designed for financial services and government operations.

Posted August 21, 2013

More "things" are now connected to the internet than people, a phenomenon dubbed The Internet of Things. Fueled by machine-to-machine (M2M) data, the Internet of Things promises to make our lives easier and better, from more efficient energy delivery and consumption to mobile health innovations where doctors can monitor patients from afar. However, the resulting tidal wave of machine-generated data streaming in from smart devices, sensors, monitors, meters, etc., is testing the capabilities of traditional database technologies. They simply can't keep up; or when they're challenged to scale, are cost-prohibitive.

Posted August 07, 2013

A former colleague is looking for a database server to embed into an important new factory automation application his company is building. The application will manage data from a large number of sensor readings emanating from each new piece of industrial equipment his company manufactures. These values, such as operating temperature, material thickness, cutting depth, etc., fit into the data category commonly called "SCADA" - supervisory control and data acquisition. Storing, managing and analyzing this SCADA data is a critical enhancement to this colleague's new application. His large customers may have multiple locations worldwide and must be able to view and analyze the readings, both current and historical, from each piece of machinery across their enterprise.

Posted August 07, 2013
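The SCADA storage-and-query requirement described above can be sketched with an embedded database. SQLite here is only a stand-in for whatever embedded engine is ultimately chosen, and the table layout, machine names, and metrics are illustrative assumptions:

```python
import sqlite3

# An embedded, in-process store for timestamped sensor readings.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE readings (
    machine_id TEXT, metric TEXT, ts TEXT, value REAL)""")

samples = [
    ("press-01", "temp_c", "2013-08-07T10:00", 71.5),
    ("press-01", "temp_c", "2013-08-07T10:05", 74.2),
    ("press-01", "cut_depth_mm", "2013-08-07T10:05", 3.1),
]
db.executemany("INSERT INTO readings VALUES (?, ?, ?, ?)", samples)

# Current value: the most recent temperature reading for one machine.
# Dropping the LIMIT clause returns the full history for trend analysis.
row = db.execute("""SELECT value FROM readings
                    WHERE machine_id = ? AND metric = ?
                    ORDER BY ts DESC LIMIT 1""",
                 ("press-01", "temp_c")).fetchone()
print(row[0])  # 74.2
```

For the multi-site scenario the colleague describes, each plant would keep such a local store and replicate readings upward for enterprise-wide analysis.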

Consider a professional baseball game or any other popular professional sporting event. When a fan sits in the upper deck of Dodger Stadium in Los Angeles, or any other sporting arena on earth, the fan is happily distracted from the real world. Ultimately, professional sports constitute a trillion-dollar industry - one whose product is, on the surface, entertainment, but pierce that thin veneer and it quickly becomes clear that the more significant product is data. A fan sitting in the upper deck does not think of it as such, but the data scientist recognizes the innate value of the many forms of data being continuously produced. Much of this data is being used now, but it will require a true Unified Data Strategy to exploit the data fully as a whole.

Posted August 07, 2013

While unstructured data may represent one of the greatest opportunities of the big data revolution, it is one of its most perplexing challenges. In many ways, the very core of big data is that it is unstructured, and many enterprises are not yet equipped to handle this kind of information in an organized, systematic way. Most of the world's enterprise databases—based on a model designed in the 1970s and 1980s that served enterprises well in the decades since—suddenly seem out-of-date, and clunky at best when it comes to managing and storing unstructured data. However, insights from these disparate data types—including weblog, social media, documents, image, text, and graphical files—are increasingly being sought by the business.

Posted July 25, 2013

Join DBTA and MarkLogic for a webcast on Wednesday, July 31, to learn about the essential technologies and approaches to succeeding with predictive analytics on Big Data. In a recent survey of Database Trends and Applications subscribers, predictive analytics was cited as the greatest opportunity that big data offers to their organizations. The reason is simple — whether you're fighting crime, delivering healthcare, scoring credit or fine-tuning marketing, predictive analytics is the key to identifying risks and opportunities and making better decisions. However, to leverage the power of predictive analytics, organizations must possess the right technology and skills.

Posted July 25, 2013

Oracle is advancing the role of Java for IoT (Internet of Things) with the latest releases of its Oracle Java Embedded product portfolio - Oracle Java ME Embedded 3.3 and Oracle Java ME Software Development Kit (SDK) 3.3, a complete client Java runtime and toolkit optimized for microcontrollers and other resource-constrained devices. Oracle is also introducing the Oracle Java Platform Integrator program to provide partners with the ability to customize Oracle Java ME Embedded products to reach different device types and market segments. "We see IoT as the next big wave that will hit the industry," Oracle's Peter Utzschneider, vice president of product management, explained during a recent interview.

Posted July 25, 2013

SAP has launched Sybase ASE (Adaptive Server Enterprise) 15.7 service pack 100 (SP100) to provide higher performance and scalability as well as improved monitoring and diagnostic capabilities for very large database environments. "The new release adds features in three areas to drive transactional environments to even more extreme levels. We really see ASE moving increasingly into extreme transactions and to do that we have organized the feature set around the three areas," said Dan Lahl, vice president, Database Product Marketing, SAP, in an interview with 5 Minute Briefing.

Posted July 25, 2013

A new science called "data persona analytics" (DPA) is emerging. DPA is defined as the science of determining the static and dynamic attributes of a given data set so as to construct an optimized infrastructure that manages and monitors data injection, alteration, analysis, storage and protection while facilitating data flow. Each unique set of data both transient and permanent has a descriptive data personality profile which can be determined through analysis using the methodologies of DPA.

Posted July 25, 2013

The Oracle database provides intriguing possibilities for the storing, manipulating and streaming of multimedia data in enterprise class environments. However, knowledge of why and how the Oracle database can be used for multimedia applications is essential if one is to justify and maximize the ROI.

Posted July 09, 2013
