Trends and Applications



Progress Software announced availability of new data connectivity and application capabilities as part of its Progress Pacific application platform-as-a-service (aPaaS) for building and managing business applications on any cloud, mobile or social platform. "Pacific is a platform running in the cloud that is targeted at small and medium-size businesses, ISVs and departmental IT," John Goodson, chief product officer at Progress Software, explained in an interview. Instead of requiring a highly trained IT staff to build applications, Goodson says, the platform provides a visual design paradigm that allows users with limited skills to build powerful applications that can quickly connect to any data sources.

Posted August 21, 2013

SAP AG introduced new high availability and disaster recovery functionality with SAP Sybase Replication Server for SAP Business Suite software running on SAP Sybase Adaptive Server Enterprise (SAP Sybase ASE). "After only a year and a quarter supporting the Business Suite, ASE has already garnered about 2,000 customer installations. This provides near zero-downtime HA/DR that is non-intrusive to the system, using Replication Server as the key enabling technology," said Dan Lahl, vice president, Database Product Marketing, SAP, in an interview.

Posted August 21, 2013

IBM announced its new zEnterprise BC12 (zBC12) mainframe, designed for enhanced analytics, cloud, and mobile computing. Priced at $75,000 for the base model, IBM says it is targeting smaller organizations. The computer giant says it is also adding new industry solutions and software and operating systems across its zEnterprise portfolio, designed for financial services and government operations.

Posted August 21, 2013

More "things" are now connected to the internet than people, a phenomenon dubbed the Internet of Things. Fueled by machine-to-machine (M2M) data, the Internet of Things promises to make our lives easier and better, from more efficient energy delivery and consumption to mobile health innovations where doctors can monitor patients from afar. However, the resulting tidal wave of machine-generated data streaming in from smart devices, sensors, monitors, meters, etc., is testing the capabilities of traditional database technologies. They simply can't keep up, or, when challenged to scale, become cost-prohibitive.

Posted August 07, 2013

A former colleague is looking for a database server to embed into an important new factory automation application his company is building. The application will manage data from a large number of sensor readings emanating from each new piece of industrial equipment his company manufactures. These values, such as operating temperature, material thickness, cutting depth, etc., fit into the data category commonly called "SCADA" - supervisory control and data acquisition. Storing, managing and analyzing this SCADA data is a critical enhancement to this colleague's new application. His large customers may have multiple locations worldwide and must be able to view and analyze the readings, both current and historical, from each piece of machinery across their enterprise.
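The shape of such a store can be sketched with Python's bundled SQLite standing in for an embeddable database server. The machine names, metrics, and schema below are hypothetical illustrations, not the colleague's actual design: one row per sensor reading, indexed so that current and historical values for any machine can be pulled quickly.

```python
import sqlite3

def create_store(path=":memory:"):
    # Hypothetical SCADA schema: one row per reading, keyed by machine and metric.
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS readings (
            machine_id TEXT NOT NULL,
            metric     TEXT NOT NULL,   -- e.g. 'temperature', 'cutting_depth'
            ts         REAL NOT NULL,   -- reading timestamp (Unix time)
            value      REAL NOT NULL
        )""")
    # Index supports the common query: one machine, one metric, a time range.
    conn.execute(
        "CREATE INDEX IF NOT EXISTS idx_readings "
        "ON readings (machine_id, metric, ts)")
    return conn

def record(conn, machine_id, metric, ts, value):
    conn.execute("INSERT INTO readings VALUES (?, ?, ?, ?)",
                 (machine_id, metric, ts, value))

def history(conn, machine_id, metric, since=0.0):
    # Current and historical values for one machine, oldest first.
    cur = conn.execute(
        "SELECT ts, value FROM readings "
        "WHERE machine_id = ? AND metric = ? AND ts >= ? ORDER BY ts",
        (machine_id, metric, since))
    return cur.fetchall()
```

A production SCADA store would add retention policies, downsampling, and replication so readings from plants worldwide can be analyzed centrally; this sketch only shows the basic record-and-query loop.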

Posted August 07, 2013

Consider a professional baseball game or any other popular professional sporting event. When a fan sits in the upper deck of Dodger Stadium in Los Angeles, or any other sporting arena on earth, the fan is happily distracted from the real world. Ultimately, professional sports constitutes a trillion-dollar industry whose product is, on the surface, entertainment; but pierce that thin veneer and it quickly becomes clear that the more significant product is data. A fan sitting in the upper deck does not think of it as such, but the data scientist recognizes the innate value of the many forms of data being continuously produced. Much of this data is being used now, but it will require a true Unified Data Strategy to fully exploit the data as a whole.

Posted August 07, 2013

While unstructured data may represent one of the greatest opportunities of the big data revolution, it is one of its most perplexing challenges. In many ways, the very core of big data is that it is unstructured, and many enterprises are not yet equipped to handle this kind of information in an organized, systematic way. Most of the world's enterprise databases—based on a model designed in the 1970s and 1980s that has served enterprises well in the decades since—suddenly seem out-of-date, and clunky at best, when it comes to managing and storing unstructured data. However, insights from these disparate data types—including weblogs, social media, documents, images, text, and graphical files—are increasingly being sought by the business.

Posted July 25, 2013

Join DBTA and MarkLogic for a webcast on Wednesday, July 31, to learn about the essential technologies and approaches to succeeding with predictive analytics on Big Data. In a recent survey of Database Trends and Applications subscribers, predictive analytics was cited as the greatest opportunity that big data offers to their organizations. The reason is simple — whether you're fighting crime, delivering healthcare, scoring credit or fine-tuning marketing, predictive analytics is the key to identifying risks and opportunities and making better decisions. However, to leverage the power of predictive analytics, organizations must possess the right technology and skills.

Posted July 25, 2013

Oracle is advancing the role of Java for IoT (Internet of Things) with the latest releases of its Oracle Java Embedded product portfolio - Oracle Java ME Embedded 3.3 and Oracle Java ME Software Development Kit (SDK) 3.3, a complete client Java runtime and toolkit optimized for microcontrollers and other resource-constrained devices. Oracle is also introducing the Oracle Java Platform Integrator program to provide partners with the ability to customize Oracle Java ME Embedded products to reach different device types and market segments. "We see IoT as the next big wave that will hit the industry," Oracle's Peter Utzschneider, vice president of product management, explained during a recent interview.

Posted July 25, 2013

SAP has launched Sybase ASE (Adaptive Server Enterprise) 15.7 service pack 100 (SP100) to provide higher performance and scalability as well as improved monitoring and diagnostic capabilities for very large database environments. "The new release adds features in three areas to drive transactional environments to even more extreme levels. We really see ASE moving increasingly into extreme transactions and to do that we have organized the feature set around the three areas," said Dan Lahl, vice president, Database Product Marketing, SAP, in an interview with 5 Minute Briefing.

Posted July 25, 2013

A new science called "data persona analytics" (DPA) is emerging. DPA is defined as the science of determining the static and dynamic attributes of a given data set so as to construct an optimized infrastructure that manages and monitors data injection, alteration, analysis, storage and protection while facilitating data flow. Each unique set of data, both transient and permanent, has a descriptive data personality profile that can be determined through analysis using the methodologies of DPA.

Posted July 25, 2013

The Oracle database provides intriguing possibilities for storing, manipulating and streaming multimedia data in enterprise-class environments. However, knowledge of why and how the Oracle database can be used for multimedia applications is essential if one is to justify and maximize the ROI.

Posted July 09, 2013

SAP AG and Esri, a geographic information system (GIS) and location analytics provider, are joining forces to more deeply integrate GIS solutions with platforms and enterprise applications from SAP in order to improve business efficiency and decision-making. The ability to combine the added dimension of location information with enterprise data in real time is aimed at giving businesses greater immediacy in their decision-making capabilities. "We are partnering with the leading GIS vendor in the marketplace, Esri, to provide much deeper integration. We see GIS-related information becoming more important in the future," David Jonker, senior director, Big Data Product Marketing, Technology & Innovation Platform, SAP Labs, tells DBTA.

Posted July 09, 2013

Storage Area Networks (SANs) and Network-Attached Storage (NAS) owe their popularity to some compelling advantages in scalability, utilization and data management. But achieving high performance for some applications with a SAN or NAS can come at a premium price. In those database applications where performance is critical, direct-attached storage (DAS) offers a cost-effective high-performance solution. This is true for both dedicated and virtualized servers, and derives from the way high-speed flash memory storage options can be integrated seamlessly into a DAS configuration. There are three primary reasons now for the renewed interest in DAS.

Posted July 09, 2013

Oracle Database 12c is available for download from Oracle Technology Network (OTN). First announced by Oracle CEO Larry Ellison during his keynote at Oracle OpenWorld 2012, Oracle Database 12c introduces a new multi-tenant architecture that simplifies the process of consolidating databases onto the cloud, enabling customers to manage many databases as one - without changing their applications. During the OpenWorld keynote, Ellison described Oracle Database 12c as "the first multi-tenant database in the world" and said it provides "a fundamentally new architecture" to "introduce the notion of a container database" with the ability to plug in multiple separate, private databases into that single container.

Posted July 09, 2013

In the realm of 21st century data organization, the business function comes first. The form of the data and the tools to manage that data will be created and maintained for the singular purpose of maximizing a business's capability of leveraging its data. Initially, this seems like an obvious statement but when examining the manner in which IT has treated data over the past four decades it becomes painfully obvious that the opposite idea has been predominant.

Posted July 09, 2013

The objective of "old school" capacity management was to ensure that each server had enough capacity to avoid performance issues, typically through trend-and-threshold techniques. But the rapid adoption of the cloud, and now OpenStack, means that supply and demand are much more fluid, and this form of capacity management is now obsolete. Unfortunately, many are ignoring this new reality and continuing to rely on what have quickly become the bad habits of capacity management. These old school methods and measures not only perpetuate an antiquated thought process, but they also lead to, among other problems, low utilization and density. In a world of tightening IT budgets these are problems that can't be ignored.

Posted June 27, 2013

Are we attempting to view 21st-century organizations through a 1990s window? Decision makers' ability to understand what customers are thinking, and to deliver service that dazzles and amazes them, depends on their organization's ability to sift through all the available data. However, organizations are not doing a very good job of competing on analytics. Efforts to introduce more analytic power across enterprises and to build analytics-driven cultures appear lukewarm at best. What is needed is a way to embed analytics deeper into day-to-day business operations.

Posted June 27, 2013

If money were not an issue, we wouldn't be having this conversation. But money is front and center in every major database roll-out and optimization project, and even more so in the age of server virtualization and consolidation. It often forces us to settle for good enough when we first aspired to swift and non-stop. The financial tradeoffs have never been more apparent than with the arrival of lightning-fast solid state technologies.

Posted June 27, 2013

Oracle expanded its partnerships with key cloud vendors. Microsoft Corp. and Oracle Corp. have formed a partnership that will enable customers to run Oracle software on Windows Server Hyper-V and in Windows Azure, Microsoft's cloud platform. Customers will be able to deploy Oracle software — including Java, Oracle Database and Oracle WebLogic Server — on Windows Server Hyper-V or in Windows Azure and receive full support from Oracle. In addition, Salesforce.com and Oracle also announced a comprehensive nine-year partnership encompassing all three tiers of cloud computing: applications, platform and infrastructure.

Posted June 27, 2013

RainStor, a provider of an enterprise database for managing and analyzing historical data, says it has combined the latest data security technologies in a comprehensive product update that has the potential to rapidly increase adoption of Apache Hadoop for banks, communications providers and government agencies.

Posted June 27, 2013

DBTA and SAP will present a webcast on how extreme transaction processing works and how it is helping data-intensive enterprises overcome their data growth and complexity challenges. Across the public sector, financial services, healthcare, and other industries, organizations are challenged to manage more data while more users demand access to that data with less time and fewer resources. This trend was underscored in an April study of Database Trends and Applications readers, which found that faster query performance was the number-one technology challenge respondents are trying to get their arms around.

Posted June 27, 2013

IBM recently laid out a set of new initiatives to further support and speed up the adoption of the Linux operating system across the enterprise. These include two new Power Systems Linux Centers, as well as plans to extend support for Kernel-based Virtual Machine (KVM) technology to its Power Systems portfolio of server products.

Posted June 27, 2013

The amount of data being generated, captured and analyzed worldwide is increasing at a rate that was inconceivable a few years ago. Exciting new technologies and methodologies are evolving to address this phenomenon of science and culture, creating huge new opportunities. These new technologies are also fundamentally changing the way we look at and use data. The rush to monetize "big data" makes the appeal of various "solutions" undeniable.

Posted June 27, 2013

These are heady times for data products vendors and their enterprise customers. When business leaders talk about success these days, they often are alluding to a new-found appreciation for their data environments. It can even be said that the tech vendors that are making the biggest difference in today's business world are no longer software companies at all; rather, they are "data" companies, with all that implies. Enterprises are reaching out to vendors for help in navigating through the fast-moving, and often unforgiving, digital realm. The data vendors that are leading their respective markets are those that know how to provide the tools, techniques, and hand-holding needed to manage and sift through gigabytes', terabytes', and petabytes' worth of data to extract tiny but valuable nuggets of information to guide business leaders as to what they should do next.

Posted June 13, 2013

Mobile devices—including employee-owned and corporate-provided smartphones and tablets—are rapidly becoming a primary point of access to more than just email and texting. However, the proliferation of mobile application users also presents new challenges to IT departments, as the users demand access to business-critical data and processes, creating security, management, and development challenges.

Posted June 13, 2013

There is an emerging field of companies looking to take on the challenges presented by the roiling tide of big data. While their visions vary, each has identified a market need that it believes its technology uniquely addresses. Here, DBTA highlights the approaches of 10 companies we think are worth watching.

Posted June 13, 2013

Database Trends and Applications introduces the inaugural "DBTA 100," a list of the companies that matter most in data. The past several years have transformed enterprise information management, creating challenges and opportunities for companies seeking to extract value from a sea of data assets. In response to this, established IT vendors as well as legions of newer solution providers have rushed to create the tools to do just that.

Posted June 13, 2013

Seamless access for data analysis across heterogeneous data sources represents "the holy grail" within mainstream enterprises. Designed for big data processing and performance at scale, Cirro is a revolutionary approach to bridging corporate analytic data silos. Hadoop and NoSQL data sources, along with access to SaaS data sources, are rapidly becoming a high-priority requirement to integrate with traditional corporate analytic data sources. Cirro's Next Generation Data Federation platform delivers on this requirement, providing a single point of access for Hadoop, NoSQL, SaaS, relational and other data sources. Cirro enables users to access all of their data sources with the BI and visualization tools they already have on their desktops. Seamlessly incorporating cloud-based data sources with traditional data sources enables self-service data exploration and analysis previously unavailable in the marketplace.
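The idea behind data federation can be illustrated with a toy sketch: two silos, a relational table (SQLite here) and a list of JSON-like documents standing in for a NoSQL or SaaS source, queried through one access function that joins them. This is only an illustration of the concept, not Cirro's actual platform, and the table names and fields are invented for the example:

```python
import sqlite3

def relational_source():
    # Silo 1: a relational table of customers and their regions.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER, region TEXT)")
    conn.executemany("INSERT INTO customers VALUES (?, ?)",
                     [(1, "EMEA"), (2, "APAC")])
    return conn

# Silo 2: documents from a hypothetical NoSQL or SaaS source,
# keyed by customer_id.
DOC_SOURCE = [
    {"customer_id": 1, "clicks": 40},
    {"customer_id": 2, "clicks": 7},
]

def federated_clicks_by_region(conn, docs):
    # Single point of access: pull from both silos and join on customer id,
    # producing one answer the BI tool never could get from either silo alone.
    region_of = {cid: reg for cid, reg
                 in conn.execute("SELECT id, region FROM customers")}
    totals = {}
    for doc in docs:
        reg = region_of.get(doc["customer_id"])
        if reg is not None:
            totals[reg] = totals.get(reg, 0) + doc["clicks"]
    return totals
```

A real federation platform pushes work down to each source and optimizes across them rather than materializing everything client-side; the sketch only shows the cross-silo join that federation makes transparent to the user.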

Posted June 03, 2013

As a leader in the data modeling space, CA ERwin is privileged to be an integral part of organizations' key strategic initiatives such as business intelligence and analytics, data governance, or data quality—many of which revolve around data. At CA Technologies, we understand that data runs your business, and we've put a strong focus on developing a solution that can act as an "information hub" for these initiatives.

Posted June 03, 2013

Delphix is happy to be included in the DBTA 100. We believe that agile data is already transforming how our customers view database and application management. Enterprise applications must constantly evolve to meet changing business demands, triggering expensive projects that are often over budget and behind schedule. Today, many applications have to be migrated into new data centers, private clouds, public clouds, hybrid clouds, onto SSDs, etc., adding more complexity and fragmentation to application portfolios.

Posted June 03, 2013

The success of major IT trends including Big Data, Cloud and Social Media depends on high quality customer data. One crucial part of every data quality strategy is address validation. AddressDoctor significantly improves your global address quality in order to reduce costs, increase productivity and streamline your business processes. Our address validation software automatically corrects and standardizes postal addresses worldwide—no matter if your data is captured in a CRM or online shop, or stored in a database.

Posted June 03, 2013

Today's on-the-go workforce is driving us all to connect more data to more people on more devices. And there's no shortage of "newer and better" ways to help you do that. But despite the persistent buzz about emerging technologies, keep this simple principle in mind: The fastest way to get results is to build on what you have.

Posted June 03, 2013

Over the past year it has been great to see the "Big Data" moniker lose some of its glamour as a catch-all phrase. Fortunately, Hadoop's role is now well understood: processing massive amounts of old data for analytics use cases that are not sensitive to latency. But increasing numbers of companies are building new applications that are driving real competitive advantage and disrupting established markets—on the Web, over mobile, or in the Cloud—generating massive amounts of "Big Data." In these new cases, latency matters.

Posted June 03, 2013

At CA Technologies, we've been supporting data management professionals for over twenty years with our CA ERwin® data modeling products, and we feel privileged to have been an active part of the rapid transformation in the field of data management during this time. One of the more exciting changes I've seen is the increased interest and participation from the business side of customer organizations.

Posted June 03, 2013

Clustrix is the leading scale-out SQL database engineered for the cloud. The combination of big data and cloud computing has broken the legacy database, creating an industry transition to new scale-out database platforms. Clustrix has delivered on a complete reinvention of the relational database, and the result is a highly differentiated platform that is ideal for the industry transition to cloud computing.

Posted June 03, 2013

With so much data today, the difference between business leaders and also-rans is often how well they leverage their data. Significant leverage equals significant business value, and that's a big advantage over the competition.

Posted June 03, 2013

You can't hide from complexity. It's everywhere—from new technologies and rapidly evolving consumer expectations, to the shifting role IT plays in every business.

Posted June 03, 2013

Development and production teams often work in silos, with separate tools and measures of success, and warring tensions between change and stability. Many of us have experienced firsthand how this fundamental tension underlies—and often undermines—IT initiatives. Development wants continuous change and enhancement. Production and operations want stability and controlled change. Finding a way to focus all IT teams on working together toward the common goal of business success is critical.

Posted June 03, 2013

The business world depends on data. Continuent's Tungsten software offers a database-as-a-service that gives cost-effective MySQL databases the enterprise-class clustering and replication required by business-critical applications. Continuent Tungsten provides 24x7 data availability, performance scaling, and simple management without risky application changes or database upgrades. Continuent Tungsten operates at a fraction of the cost of commercial DBMS servers like Oracle and MS SQL Server.

Posted June 03, 2013

Scott Hayes is an IBM DB2 LUW Performance Expert, IBM DB2 GOLD Consultant, IBM Information Management Champion, US patent inventor, published author, blogger on DB2 LUW performance topics, and popular frequent speaker at IBM IOD and IDUG Conferences. He started DBI Software in July 2005 with one simple mission: "Help People!" Eight years later, this simple mission is still DBI's #1 core value, though Mr. Hayes admits, "We are better at helping people with DB2 than their marriages or cars."

Posted June 03, 2013

Embarcadero Technologies gives 97% of the world's top 2000 companies the tools needed to address the biggest challenges in data management. Facing significant growth in complexity, diversity and volume of enterprise data, companies worldwide are increasingly turning to data governance as a strategic solution. Helping our customers manage this complexity, and close the "governance gap" has been a major driver of innovation in our products.

Posted June 03, 2013

Open source software has enjoyed a surge in interest and demand over the last two years for its quality and cost savings. Companies like VMware, Microsoft (Skype), Apple, and Facebook (Instagram) are using PostgreSQL, the open source database. Other PostgreSQL users include the Federal Aviation Administration, the US State Department, ABN Amro, Fujitsu, Sony-Ericsson, and Sony Online Entertainment.

Posted June 03, 2013

At HiT Software, we know that fresh data is key to business success. Our clients rely on information about their systems, business units, customers or vendors that is no more than a few hours, and often only a few minutes, old. Reporting on seasonal pricing and promotions, consumption of goods versus parts, or financial planning on interest earned all demand actionable data in real time.

Posted June 03, 2013

Infobright is proud to be selected for inclusion in the DBTA 100. Infobright's investigative analytic engine enables organizations to extract rich, real-time insight from machine-generated data at the speed of business. Specifically focused on the rapid analysis of volumes of information from Web logs, mobile data, call records, stock tickers, sensors and more, Infobright empowers organizations to ask all the questions they want, and get the detailed answers they need to make better, faster and more profitable decisions.

Posted June 03, 2013

Economic recessions are to the business cycle as wildfires are to a forest. They are dangerous but necessary to clear the overgrowth around the stronger trees and promote new life. When the 2008 recession hit, the stronger companies began strategically adapting to survive. Investing in Business Intelligence ("BI") is one strategy that provides companies a way to look at their forest of data from a bird's-eye view rather than through the trees. From this viewpoint, companies can identify their strengths and weaknesses and gain useful insights into their customers' needs.

Posted June 03, 2013

Data is an enabler—and an inhibitor. It can yield tremendous insights and answers, but when it is in the wrong format, too big or changing too frequently, it becomes a resource-sink—of people, knowledge and technology. Tools need to evolve to eliminate data constraints.

Posted June 03, 2013

For close to 30 years, I have been actively involved with the database community. From one of the first Oracle Database Administrators, to President of the Independent Oracle Users Group; from one of the Founders of the Professional Association of SQL Server, to one of the original Oracle*Press authors; from Oracle ACE to VMware vExpert for database virtualization, I have seen an amazing succession of changes take place in our industry.

Posted June 03, 2013

Within today's ever-changing landscape of technology and consumer requirements, it can be difficult to understand which solutions are relevant for your data management needs. Wading through the hype of new solutions that claim to accomplish everything takes up precious time and energy from the leadership of established organizations. Understanding how to implement the right tool for the job is not as simple as it seems, and in many cases it's best to consider the advantages of proven technologies to drive the most value from your solution stack.

Posted June 03, 2013
