Trends and Applications



For more than a decade, MarkLogic has delivered a powerful and trusted next-generation Enterprise NoSQL database that enables organizations to turn all data into valuable and actionable information. Organizations around the world rely on MarkLogic's enterprise-grade technology to make better decisions faster.

Posted September 10, 2013

Independent Oracle Users Group (IOUG) members will be out in force at OpenWorld 2013 - presenting more than 40 sessions on the topics you want to learn about most. Celebrating its 20th anniversary this year, the IOUG represents the independent voice of Oracle technology and database professionals and allows them to be more productive in their business and careers through context-rich education, sharing best practices, and providing technology direction and networking opportunities.

Posted September 10, 2013

The Independent Oracle Users Group (IOUG) is a community of Oracle database and technology professionals that has provided virtual and in-person knowledge-sharing/education and networking opportunities for 20 years.

Posted September 10, 2013

Exhibiting at Oracle OpenWorld 2013 in booth 2214, Attunity is a leading provider of data integration software solutions that make data available where and when needed across heterogeneous enterprise platforms and the cloud.

Posted September 10, 2013

The flagship Oracle OpenWorld conference each year in San Francisco is recognized for attracting tens of thousands of highly qualified and influential attendees from more than 140 countries. With a focus on key current and future Oracle technologies and solutions, attendees converge at the conference because they are serious about educating themselves and having a good time as well.

Posted September 10, 2013

On Tuesday morning, September 24th, at 8:00am PST EMC Chairman and CEO Joe Tucci will deliver a keynote address at Oracle OpenWorld 2013 in San Francisco, CA. In addition, as a Diamond sponsor, EMC will have six breakout sessions covering relevant best practices for optimizing your Oracle infrastructure. When on the show floor, attendees can engage with EMC's Oracle experts and learn how to Lead Your Transformation in booth #1301.

Posted September 10, 2013

Melissa Data provides phone and name verification and customer deduping within Oracle Forms, Oracle/Java applications, PL/SQL packages, and PeopleSoft. Melissa Data APIs and web services for data quality lower costs, improve customer relations, enable BI initiatives, and empower sales and marketing teams to generate and close more opportunities.

Posted September 10, 2013

Delphix delivers agility to enterprise application projects, addressing the largest source of inefficiency and inflexibility in the datacenter—provisioning, managing, and refreshing databases for business-critical applications. With Delphix in place, QA engineers spend more time testing and less time waiting for new data, increasing utilization of expensive test infrastructure.

Posted September 10, 2013

Make big ideas happen with a strategic approach to converged cloud, converged infrastructure, big data, security and management tools for your database, middleware and applications. HP booth 1701 is the heartbeat of innovation, your area for conversations and investigation.

Posted September 10, 2013

Database performance issues? Our Oracle customers rely on Ignite to help them quickly pinpoint exactly where the problem lies—in just four clicks. Confio Ignite is a tool for DBAs, developers, and IT managers to collaborate and resolve performance issues faster.

Posted September 10, 2013

Even before all the new data sources and platforms that big data has come to represent arrived on the scene, providing users with access to the information they need when they need it was a big challenge. What has changed today? The growing range of data types beyond traditional RDBMS data - and a growing awareness that effectively leveraging data from a wide variety of sources will result in the ability to compete more effectively. Join DBTA on Thursday, August 29, at 11 am PT/2 pm ET for a special roundtable webcast to learn about the essential technologies and approaches that help to overcome the big data integration challenges that get in the way of gaining actionable insights.

Posted August 21, 2013

There may be no more commonly used term in today's IT conversations than "big data." There also may be no more commonly misused term. Here's a look at the truth behind the five most common big data myths, including the misguided but almost universally accepted notion that big data applies only to large organizations dealing with great volumes of data.

Posted August 21, 2013

Data analytics, long the obscure pursuit of analysts and quants toiling in the depths of enterprises, has emerged as the must-have strategy of organizations across the globe. Competitive edge not only comes from deciphering the whims of customers and markets but also being able to predict shifts before they happen. Fueling the move of data analytics out of back offices and into the forefront of corporate strategy sessions is big data, now made enterprise-ready through technology platforms such as Hadoop and MapReduce. The Hadoop framework is seen as the most efficient file system and solution set to store and package big datasets for consumption by the enterprise, and MapReduce is the construct used to perform analysis over Hadoop files.
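The map-shuffle-reduce construct described above can be sketched in miniature. This is a hedged, toy illustration of the three phases (here a word count over in-memory stand-ins for file blocks), not Hadoop's actual Java API; all names in it are invented for the example:

```python
from collections import defaultdict
from itertools import chain

# Toy corpus standing in for file blocks stored in Hadoop.
files = {
    "part-0": "big data big analytics",
    "part-1": "data analytics data",
}

def map_phase(text):
    # Map: emit a (key, 1) pair for every word in a block.
    return [(word, 1) for word in text.split()]

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework would
    # between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate (here, sum) the values for each key.
    return {key: sum(values) for key, values in groups.items()}

pairs = list(chain.from_iterable(map_phase(t) for t in files.values()))
counts = reduce_phase(shuffle(pairs))
```

In a real Hadoop cluster the map tasks run in parallel against distributed file blocks and the shuffle moves data across the network, but the per-key aggregation contract is the same.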

Posted August 21, 2013

Progress Software announced availability of new data connectivity and application capabilities as part of its Progress Pacific application platform-as-a-service (aPaaS) for building and managing business applications on any cloud, mobile or social platform. "Pacific is a platform running in the cloud that is targeted at small and medium-size businesses, ISVs and departmental IT," John Goodson, chief product officer at Progress Software, explained in an interview. Instead of requiring a highly trained IT staff to build applications, Goodson says, the platform provides a visual design paradigm that allows users with limited skills to build powerful applications that can quickly connect to any data sources.

Posted August 21, 2013

SAP AG introduced new high availability and disaster recovery functionality with SAP Sybase Replication Server for SAP Business Suite software running on SAP Sybase Adaptive Server Enterprise (SAP Sybase ASE). "After only a year and a quarter supporting the Business Suite, ASE has already garnered about 2,000 customer installations. This easily provides that near zero-downtime for HA/DR that is non-intrusive to the system using Replication Server as the key enabling technology," said Dan Lahl, vice president, Database Product Marketing, SAP, in an interview.

Posted August 21, 2013

IBM announced its new zEnterprise BC12 (zBC12) mainframe, designed for enhanced analytics, cloud, and mobile computing. Priced at $75,000 for the base model, IBM says it is targeting smaller organizations. The computer giant says it is also adding new industry solutions and software and operating systems across its zEnterprise portfolio, designed for financial services and government operations.

Posted August 21, 2013

More "things" are now connected to the internet than people, a phenomenon dubbed The Internet of Things. Fueled by machine-to-machine (M2M) data, the Internet of Things promises to make our lives easier and better, from more efficient energy delivery and consumption to mobile health innovations where doctors can monitor patients from afar. However, the resulting tidal wave of machine-generated data streaming in from smart devices, sensors, monitors, meters, etc., is testing the capabilities of traditional database technologies. They simply can't keep up; or when they're challenged to scale, are cost-prohibitive.

Posted August 07, 2013

A former colleague is looking for a database server to embed into an important new factory automation application his company is building. The application will manage data from a large number of sensor readings emanating from each new piece of industrial equipment his company manufactures. These values, such as operating temperature, material thickness, cutting depth, etc., fit into the data category commonly called "SCADA" - supervisory control and data acquisition. Storing, managing and analyzing this SCADA data is a critical enhancement to this colleague's new application. His large customers may have multiple locations worldwide and must be able to view and analyze the readings, both current and historical, from each piece of machinery across their enterprise.
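A store for readings like these typically keys each row by machine, metric, and timestamp, so that both the current value and the full history of any sensor can be queried per machine. The sketch below is a hypothetical illustration using SQLite; the schema, machine names, and metric names are all invented for the example, not taken from the application described above:

```python
import sqlite3

# Hypothetical SCADA reading store: one row per (machine, metric, timestamp).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE readings (
        machine_id TEXT,
        metric     TEXT,      -- e.g. 'operating_temp_c', 'cutting_depth_mm'
        ts         INTEGER,   -- Unix epoch seconds
        value      REAL
    )
""")
# Index supports both "latest reading" and historical range scans per machine.
conn.execute("CREATE INDEX idx_machine_ts ON readings (machine_id, metric, ts)")

rows = [
    ("press-01", "operating_temp_c", 1000, 71.5),
    ("press-01", "operating_temp_c", 1060, 73.2),
    ("press-02", "operating_temp_c", 1000, 68.9),
]
conn.executemany("INSERT INTO readings VALUES (?, ?, ?, ?)", rows)

# Current value per machine: the most recent reading of the metric.
current = conn.execute("""
    SELECT machine_id, value
    FROM readings AS r
    WHERE metric = 'operating_temp_c'
      AND ts = (SELECT MAX(ts) FROM readings
                WHERE machine_id = r.machine_id AND metric = r.metric)
    ORDER BY machine_id
""").fetchall()
```

An embedded server in a production version of such an application would add retention policies and rollups, but the current-versus-historical query shape stays the same.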

Posted August 07, 2013

Consider a professional baseball game or any other popular professional sporting event. When a fan sits in the upper deck of Dodger Stadium in Los Angeles, or any other sporting arena on earth, the fan is happily distracted from the real world. Ultimately, professional sports constitutes a trillion dollar industry - an industry whose product is, on the surface, entertainment; but pierce that thin veneer and it quickly becomes clear that the more significant product is data. A fan sitting in the upper deck does not think of it as such, but the data scientist recognizes the innate value of the varied forms of data being continuously produced. Much of this data is being used now, but it will require a true Unified Data Strategy to fully exploit the data as a whole.

Posted August 07, 2013

While unstructured data may represent one of the greatest opportunities of the big data revolution, it is one of its most perplexing challenges. In many ways, the very core of big data is that it is unstructured, and many enterprises are not yet equipped to handle this kind of information in an organized, systematic way. Most of the world's enterprise databases—based on a model designed in the 1970s and 1980s that served enterprises well in the decades since—suddenly seem out-of-date, and clunky at best when it comes to managing and storing unstructured data. However, insights from these disparate data types—including weblog, social media, documents, image, text, and graphical files—are increasingly being sought by the business.

Posted July 25, 2013

Join DBTA and MarkLogic for a webcast on Wednesday, July 31, to learn about the essential technologies and approaches to succeeding with predictive analytics on Big Data. In a recent survey of Database Trends and Applications subscribers, predictive analytics was cited as the greatest opportunity that big data offers to their organizations. The reason is simple — whether you're fighting crime, delivering healthcare, scoring credit or fine-tuning marketing, predictive analytics is the key to identifying risks and opportunities and making better decisions. However, to leverage the power of predictive analytics, organizations must possess the right technology and skills.

Posted July 25, 2013

Oracle is advancing the role of Java for IoT (Internet of Things) with the latest releases of its Oracle Java Embedded product portfolio - Oracle Java ME Embedded 3.3 and Oracle Java ME Software Development Kit (SDK) 3.3, a complete client Java runtime and toolkit optimized for microcontrollers and other resource-constrained devices. Oracle is also introducing the Oracle Java Platform Integrator program to provide partners with the ability to customize Oracle Java ME Embedded products to reach different device types and market segments. "We see IoT as the next big wave that will hit the industry," Oracle's Peter Utzschneider, vice president of product management, explained during a recent interview.

Posted July 25, 2013

SAP has launched Sybase ASE (Adaptive Server Enterprise) 15.7 service pack 100 (SP100) to provide higher performance and scalability as well as improved monitoring and diagnostic capabilities for very large database environments. "The new release adds features in three areas to drive transactional environments to even more extreme levels. We really see ASE moving increasingly into extreme transactions and to do that we have organized the feature set around the three areas," said Dan Lahl, vice president, Database Product Marketing, SAP, in an interview with 5 Minute Briefing.

Posted July 25, 2013

A new science called "data persona analytics" (DPA) is emerging. DPA is defined as the science of determining the static and dynamic attributes of a given data set so as to construct an optimized infrastructure that manages and monitors data injection, alteration, analysis, storage and protection while facilitating data flow. Each unique set of data, both transient and permanent, has a descriptive data personality profile which can be determined through analysis using the methodologies of DPA.

Posted July 25, 2013

The Oracle database provides intriguing possibilities for the storing, manipulating and streaming of multimedia data in enterprise class environments. However, knowledge of why and how the Oracle database can be used for multimedia applications is essential if one is to justify and maximize the ROI.

Posted July 09, 2013

SAP AG and Esri, a geographic information system (GIS) and location analytics provider, are joining forces to more deeply integrate GIS solutions with platforms and enterprise applications from SAP in order to improve business efficiency and decision-making. The ability to combine the added dimension of location information with enterprise data in real time is aimed at giving businesses greater immediacy in their decision-making capabilities. "We are partnering with the leading GIS vendor in the marketplace, Esri, to provide much deeper integration. We see GIS-related information becoming more important in the future," David Jonker, senior director, Big Data Product Marketing, Technology & Innovation Platform, SAP Labs, tells DBTA.

Posted July 09, 2013

Storage Area Networks (SANs) and Network-Attached Storage (NAS) owe their popularity to some compelling advantages in scalability, utilization and data management. But achieving high performance for some applications with a SAN or NAS can come at a premium price. In those database applications where performance is critical, direct-attached storage (DAS) offers a cost-effective high-performance solution. This is true for both dedicated and virtualized servers, and derives from the way high-speed flash memory storage options can be integrated seamlessly into a DAS configuration. There are three primary reasons now for the renewed interest in DAS.

Posted July 09, 2013

Oracle Database 12c is available for download from Oracle Technology Network (OTN). First announced by Oracle CEO Larry Ellison during his keynote at Oracle OpenWorld 2012, Oracle Database 12c introduces a new multi-tenant architecture that simplifies the process of consolidating databases onto the cloud; enabling customers to manage many databases as one - without changing their applications. During the OpenWorld keynote, Ellison described Oracle Database 12c as "the first multi-tenant database in the world" and said it provides "a fundamentally new architecture" to "introduce the notion of a container database" with the ability to plug in multiple separate, private databases into that single container.

Posted July 09, 2013

In the realm of 21st century data organization, the business function comes first. The form of the data and the tools to manage that data will be created and maintained for the singular purpose of maximizing a business's capability of leveraging its data. Initially, this seems like an obvious statement but when examining the manner in which IT has treated data over the past four decades it becomes painfully obvious that the opposite idea has been predominant.

Posted July 09, 2013

The objective of "old school" capacity management was to ensure that each server had enough capacity to avoid performance issues, and typically focused on trend-and-threshold techniques to accomplish this. But the rapid adoption of the cloud, and now OpenStack, means that supply and demand is much more fluid, and this form of capacity management is now obsolete. Unfortunately, many are ignoring this new reality and continuing to rely on what have quickly become the bad habits of capacity management. These old school methods and measures not only perpetuate an antiquated thought process, but they also lead to, among other problems, low utilization and density. In a world of tightening IT budgets these are problems that can't be ignored.

Posted June 27, 2013

Are we attempting to view 21st-century organizations through a 1990s window? Decision makers' ability to understand what customers are thinking and to be able to deliver service that dazzles and amazes them, depends on their organization's ability to sift through all the available data. However, organizations are not doing a very good job of competing on analytics. Efforts to introduce more analytic power across enterprises to analytics-driven cultures appear to be lukewarm at best. What is needed is a way to embed analytics deeper into day-to-day business operations.

Posted June 27, 2013

If money were not an issue, we wouldn't be having this conversation. But money is front and center in every major database rollout and optimization project, and even more so in the age of server virtualization and consolidation. It often forces us to settle for good enough, when we first aspired to swift and non-stop. The financial tradeoffs are never more apparent than they have become with the arrival of lightning-fast solid state technologies.

Posted June 27, 2013

Oracle expanded its partnerships with key cloud vendors. Microsoft Corp. and Oracle Corp. have formed a partnership that will enable customers to run Oracle software on Windows Server Hyper-V and in Windows Azure, Microsoft's cloud platform. Customers will be able to deploy Oracle software — including Java, Oracle Database and Oracle WebLogic Server — on Windows Server Hyper-V or in Windows Azure and receive full support from Oracle. In addition, Salesforce.com and Oracle also announced a comprehensive nine-year partnership encompassing all three tiers of cloud computing: applications, platform and infrastructure.

Posted June 27, 2013

RainStor, a provider of an enterprise database for managing and analyzing historical data, says it has combined the latest data security technologies in a comprehensive product update that has the potential to rapidly increase adoption of Apache Hadoop for banks, communications providers and government agencies.

Posted June 27, 2013

DBTA and SAP will present a webcast on how extreme transaction processing works and how it is helping data-intensive enterprises overcome their data growth and complexity challenges. Across the public sector, financial services, healthcare, and other industries, organizations are being challenged to manage more data as more users demand access to that data with less time and fewer resources. This trend was underscored in a study of Database Trends and Applications readers conducted in April, which found that faster query performance was the number-one technology challenge that respondents are trying to get their arms around.

Posted June 27, 2013

IBM recently laid out a set of new initiatives to further support and speed up the adoption of the Linux operating system across the enterprise. These include two new Power Systems Linux Centers, as well as plans to extend support for Kernel-based Virtual Machine (KVM) technology to its Power Systems portfolio of server products.

Posted June 27, 2013

The amount of data being generated, captured and analyzed worldwide is increasing at a rate that was inconceivable a few years ago. Exciting new technologies and methodologies are evolving to address this phenomenon of science and culture creating huge new opportunities. These new technologies are also fundamentally changing the way we look at and use data. The rush to monetize "big data" makes the appeal of various "solutions" undeniable.

Posted June 27, 2013

These are heady times for data products vendors and their enterprise customers. When business leaders talk about success these days, they often are alluding to a new-found appreciation for their data environments. It can even be said that the tech vendors that are making the biggest difference in today's business world are no longer software companies at all; rather, they are "data" companies, with all that implies. Enterprises are reaching out to vendors for help in navigating through the fast-moving, and often unforgiving, digital realm. The data vendors that are leading their respective markets are those that know how to provide the tools, techniques, and hand-holding needed to manage and sift through gigabytes', terabytes', and petabytes' worth of data to extract tiny but valuable nuggets of information to guide business leaders as to what they should do next.

Posted June 13, 2013

Mobile devices—including employee-owned and corporate-provided smartphones and tablets—are rapidly becoming a primary point of access to more than just email and texting. However, the proliferation of mobile application users also presents new challenges to IT departments, as the users demand access to business-critical data and processes, creating security, management, and development challenges.

Posted June 13, 2013

There is an emerging field of companies looking to take on the challenges presented by the roiling tide of big data. While their visions vary, each has identified a market need that it believes its technology uniquely addresses. Here, DBTA highlights the approaches of 10 companies we think are worth watching.

Posted June 13, 2013

Database Trends and Applications introduces the inaugural "DBTA 100," a list of the companies that matter most in data. The past several years have transformed enterprise information management, creating challenges and opportunities for companies seeking to extract value from a sea of data assets. In response to this, established IT vendors as well as legions of newer solution providers have rushed to create the tools to do just that.

Posted June 13, 2013

Seamless access for data analysis across heterogeneous data sources represents "the holy grail" within mainstream enterprises. Designed for Big Data processing and performance at scale, Cirro is a revolutionary approach to bridging corporate analytic data silos. Hadoop and NoSQL data sources, along with access to SaaS data sources, are rapidly becoming a high-priority requirement to integrate with traditional corporate analytic data sources. Cirro's Next Generation Data Federation platform delivers on this requirement, providing a single point of access for Hadoop, NoSQL, SaaS, relational and other data sources. Cirro enables users to access all of their data sources with the BI and visualization tools they already have on their desktops. Seamlessly incorporating cloud-based data sources with traditional data sources enables self-service data exploration and analysis previously unavailable in the marketplace.

Posted June 03, 2013

As a leader in the data modeling space, CA ERwin is privileged to be an integral part of organizations' key strategic initiatives such as business intelligence and analytics, data governance, or data quality—many of which revolve around data. At CA Technologies, we understand that data runs your business, and we've put a strong focus on developing a solution that can act as an "information hub" for these initiatives.

Posted June 03, 2013

Delphix is happy to be included in the DBTA 100. We believe that agile data is already transforming how our customers view database and application management. Enterprise applications must constantly evolve to meet changing business demands, triggering expensive projects that are often over budget and behind schedule. Today, many applications have to be migrated into new data centers, private clouds, public clouds, hybrid clouds, onto SSDs, etc., adding more complexity and fragmentation to application portfolios.

Posted June 03, 2013


The success of major IT trends including Big Data, Cloud and Social Media depends on high quality customer data. One crucial part of every data quality strategy is address validation. AddressDoctor significantly improves your global address quality in order to reduce costs, increase productivity and streamline your business processes. Our address validation software automatically corrects and standardizes postal addresses worldwide—no matter if your data is captured in a CRM or online shop, or stored in a database.

Posted June 03, 2013

Today's on-the-go workforce is driving us all to connect more data to more people on more devices. And there's no shortage of "newer and better" ways to help you do that. But despite the persistent buzz about emerging technologies, keep this simple principle in mind: The fastest way to get results is to build on what you have.

Posted June 03, 2013

Over the past year it has been great to see the "Big Data" moniker lose some of its glamour as a catch-all phrase. Fortunately, Hadoop's role is now well understood: processing massive amounts of old data for analytics in use cases that are not sensitive to latency. But increasing numbers of companies are building new applications that are driving real competitive advantage and disrupting established markets—on the Web, over mobile, or in the Cloud—generating massive amounts of "Big Data." In these new cases, latency matters.

Posted June 03, 2013

At CA Technologies, we've been supporting data management professionals for over twenty years with our CA ERwin® data modeling products, and we feel privileged to have been an active part of the rapid transformation in the field of data management during this time. One of the more exciting changes I've seen is the increased interest and participation from the business side of customer organizations.

Posted June 03, 2013

Clustrix is the leading scale-out SQL database engineered for the cloud. The combination of big data and cloud computing has broken the legacy database, creating an industry transition to new scale-out database platforms. Clustrix has delivered on a complete reinvention of the relational database, and the result is a highly differentiated platform that is ideal for the industry transition to cloud computing.

Posted June 03, 2013
