Database Management

Relational Database Management Systems (RDBMSs) continue to do the heavy lifting in data management, while newer database management systems are taking on prominent roles as well. NoSQL database systems such as Key-Value, Column Family, Graph, and Document databases are gaining acceptance due to their ability to handle unstructured and semi-structured data. MultiValue, sometimes called the fifth NoSQL database, is also a well-established database management technology that continues to evolve to address new enterprise requirements.



Database Management Articles

Raima, a provider of database management system technology for both in-memory database usage and persistent storage devices, has announced the availability of Raima Database Manager (RDM) version 12.0, optimized for embedded, real-time, in-memory, and mobile applications.

Posted September 03, 2013

SAP AG introduced new high availability and disaster recovery functionality with SAP Sybase Replication Server for SAP Business Suite software running on SAP Sybase Adaptive Server Enterprise (SAP Sybase ASE). "After only a year and a quarter supporting the Business Suite, ASE has already garnered about 2,000 customer installations. This easily provides that near zero-downtime for HA/DR that is non-intrusive to the system using Replication Server as the key enabling technology," said Dan Lahl, vice president, Database Product Marketing, SAP, in an interview.

Posted August 31, 2013

SAP has launched Sybase ASE (Adaptive Server Enterprise) 15.7 service pack 100 (SP100) to provide higher performance and scalability as well as improved monitoring and diagnostic capabilities for very large database environments. "The new release adds features in three areas to drive transactional environments to even more extreme levels. We really see ASE moving increasingly into extreme transactions and to do that we have organized the feature set around the three areas," said Dan Lahl, vice president, Database Product Marketing, SAP, in an interview with 5 Minute Briefing.

Posted August 31, 2013

In the last several articles, I've been describing the benefits of reading and analyzing the benchmarking case studies released by the Transaction Processing Performance Council (TPC). I've given you a broad overview of the TPC benchmarks, shown how the vendor-published TPC benchmarks can help you save money, and explained how vendors must disclose in disclaimers the ways they tweak their workloads. I have also described how to run your own benchmarks and how to properly prepare your environment for a benchmark test. Now it is time to show you where the rubber really hits the road: testing and benchmarking tools that can run highly scalable benchmarking workloads against your database servers.
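
As a rough illustration of what such a benchmarking tool automates, here is a minimal Python sketch that times a small, repeated query workload against a local SQLite database. The orders table and query mix are hypothetical placeholders; dedicated benchmarking tools drive far larger, multi-user, TPC-style workloads.

```python
# Minimal benchmark-driver sketch: times a fixed batch of read queries
# against a local SQLite database and reports throughput.
# The schema and query are illustrative only.
import sqlite3
import time

def run_benchmark(db_path="sample.db", iterations=1000):
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    # Hypothetical warm-up: create and populate a small test table.
    cur.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)")
    cur.executemany("INSERT INTO orders (amount) VALUES (?)",
                    [(i * 1.5,) for i in range(100)])
    conn.commit()

    start = time.perf_counter()
    for _ in range(iterations):
        cur.execute("SELECT COUNT(*), AVG(amount) FROM orders").fetchone()
    elapsed = time.perf_counter() - start

    print(f"{iterations} queries in {elapsed:.3f}s "
          f"({iterations / elapsed:.0f} queries/sec)")
    conn.close()

if __name__ == "__main__":
    run_benchmark()
```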

Posted August 31, 2013

Ramping up for Oracle OpenWorld 2013, Oracle today announced the Oracle executive keynote schedule. Leading the lineup, CEO Larry Ellison will present the welcome keynote, "Oracle Database 12c In-Memory Database and M6 Big Memory Machine," on Sunday evening, Sept. 22, at 5 pm. Oracle OpenWorld 2013 takes place September 22 - September 26, 2013 at Moscone Center in San Francisco.

Posted August 27, 2013

To help developers and administrators better manage their dynamic data environments, Oracle has released MySQL Workbench 6.0. MySQL Workbench is a unified visual tool that provides data modeling, SQL development, and comprehensive administration for server configuration, user administration, and migration. The new release is a major update that addresses over 200 inputs and requests from the community and is intended to make it easier for administrators and developers to design, develop, and manage their MySQL databases.

Posted August 21, 2013

Independent Oracle Users Group (IOUG) members will be out in force at OpenWorld 2013 - presenting more than 40 sessions on the topics you want to learn about most. Celebrating its 20th anniversary this year, the IOUG represents the independent voice of Oracle technology and database professionals and allows them to be more productive in their business and careers through context-rich education, sharing best practices, and providing technology direction and networking opportunities.

Posted August 21, 2013

GenieDB has announced the launch of the GenieDB Globally Distributed MySQL-as-a-Service. The DBaaS offering allows organizations to take advantage of GenieDB's automated platform to build web-scale applications that gain the benefits of geographical database distribution, continuous availability during regional outages, and better application response time for globally distributed users.

Posted August 20, 2013

NuoDB has announced the last release of its current product version and a technology preview of some upcoming second-generation features available later in 2013. The preview is contained in the free download of the new NuoDB Starlings Release 1.2. The NewSQL approach is gaining greater acceptance, said Barry Morris, founder and CEO of NuoDB, in an interview. "What people are saying back to us is that they are getting all of the features of NoSQL without throwing SQL or transactions away. And that concept is becoming the popular notion of what NewSQL is."

Posted August 13, 2013

Database Trends and Applications has launched a special "Who to See at Oracle OpenWorld" section online where you can find information on what to expect at this year's conference and premium vendors that offer products and services to serve your needs as an Oracle technology professional.

Posted August 09, 2013

Just about every company with a DBMS has that binder full of corporate and/or IT standards. That one over there in the corner with the cobwebs on it — the one that you only use when you need an excuse to avoid work. Okay, well, maybe it's not quite that bad. Your standards documents could be on the company intranet or some other online mechanism (but chances are there will be virtual cobwebs on your online standards manuals, too).

Posted August 07, 2013

I was recently chatting with a good friend of mine who's very highly placed in the Microsoft SQL Server team. Our conversation was wide ranging and covered a lot of topics, such as internal features and upcoming announcements. (I'm under at least three different NDAs, so don't expect me to give up anything too juicy or gossipy.) For example, we spent quite a while discussing the ton of great new features and improvements just over the horizon with the recent release of SQL Server 2014 CTP1.

Posted August 07, 2013

A former colleague is looking for a database server to embed into an important new factory automation application his company is building. The application will manage data from a large number of sensor readings emanating from each new piece of industrial equipment his company manufactures. These values, such as operating temperature, material thickness, cutting depth, etc., fit into the data category commonly called "SCADA" - supervisory control and data acquisition. Storing, managing and analyzing this SCADA data is a critical enhancement to this colleague's new application. His large customers may have multiple locations worldwide and must be able to view and analyze the readings, both current and historical, from each piece of machinery across their enterprise.
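
As a rough sketch of the kind of structure such an embedded database might manage (the machine, site, and metric names below are illustrative assumptions, not the actual application's schema), a time-stamped readings table keyed by machine and location supports both current and historical analysis across sites:

```python
# Illustrative SCADA readings store, using SQLite as a stand-in for an
# embedded database. All identifiers and metric names are hypothetical.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("scada.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS readings (
        machine_id  TEXT NOT NULL,
        site        TEXT NOT NULL,
        metric      TEXT NOT NULL,      -- e.g. 'operating_temp_c'
        value       REAL NOT NULL,
        recorded_at TEXT NOT NULL       -- ISO-8601 timestamp
    )
""")
conn.execute(
    "INSERT INTO readings VALUES (?, ?, ?, ?, ?)",
    ("press-042", "plant-berlin", "operating_temp_c", 71.4,
     datetime.now(timezone.utc).isoformat()),
)
conn.commit()

# Historical analysis across the enterprise: average temperature per site.
for row in conn.execute("""
        SELECT site, AVG(value)
        FROM readings
        WHERE metric = 'operating_temp_c'
        GROUP BY site"""):
    print(row)
conn.close()
```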

Posted August 07, 2013

Consider a professional baseball game or any other popular professional sporting event. When a fan sits in the upper deck of Dodger Stadium in Los Angeles, or any other sporting arena on earth, the fan is happily distracted from the real world. Ultimately, professional sports constitute a trillion-dollar industry whose product is, on the surface, entertainment; but pierce that thin veneer even slightly and it quickly becomes clear that the more significant product is data. A fan sitting in the upper deck does not think of it as such, but the data scientist recognizes the innate value of the many forms of data being continuously produced. Much of this data is being used now, but it will require a true Unified Data Strategy to fully exploit the data as a whole.

Posted August 07, 2013

One of the principles within relational theory is that each entity's row, or tuple, be uniquely identifiable. This means the defined structure includes some combination of attributes whose populated values serve to identify an individual row within the table/relation. This attribute, or these attributes, are the candidate key(s) for the structure. If a structure has a single candidate key, that key serves as the primary key; if a structure has multiple candidate keys, then one of them is designated as the primary key. When building up a logical design, primary keys should be identified by the actual data points in play.
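
As a minimal illustration, assuming a hypothetical employee table, a candidate key drawn from the actual data points (here a badge number) can be designated the primary key so that every row is uniquely identifiable:

```python
# Hypothetical example: badge_number uniquely identifies each row, so it is
# the candidate key chosen as the primary key; email is a second candidate key.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE employee (
        badge_number TEXT PRIMARY KEY,      -- candidate key chosen as primary key
        email        TEXT NOT NULL UNIQUE,  -- another candidate key
        full_name    TEXT NOT NULL
    )
""")
conn.execute("INSERT INTO employee VALUES ('B-1001', 'ada@example.com', 'Ada Lovelace')")

# A duplicate badge_number violates uniqueness and is rejected.
try:
    conn.execute("INSERT INTO employee VALUES ('B-1001', 'grace@example.com', 'Grace Hopper')")
except sqlite3.IntegrityError as e:
    print("Rejected duplicate primary key:", e)
```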

Posted August 06, 2013

NuoDB, provider of a NewSQL database, has announced the beta program for a new tool that facilitates migration from MySQL, Microsoft SQL Server, IBM DB2, PostgreSQL and Oracle RDBMSs. The migration tool is open source and available for free download on the NuoDB DevCenter or on GitHub.

Posted July 30, 2013

IBM says it is accelerating its Linux on Power initiative with the new PowerLinux 7R4 server as well as new software and middleware applications geared for big data, analytics and next generation Java applications in an open cloud environment. According to IBM, the new PowerLinux 7R4 server, built on the same Power Systems platform running IBM's Watson cognitive computing solution, can provide clients the performance required for the new business-critical and data-intensive workloads increasingly being deployed in Linux environments. IBM is also expanding the portfolio of software for Power Systems with the availability of IBM Cognos Business Intelligence and EnterpriseDB database software, each optimized for Linux on Power.

Posted July 30, 2013

After four years of operating BigCouch in production, Cloudant has merged the BigCouch code back into the open source Apache CouchDB project. Cloudant provides a database-as-a-service and CouchDB serves as the foundation of Cloudant's technology. The company developed BigCouch, an open source variant of CouchDB, to support large-scale, globally distributed applications. There are three main reasons Cloudant is doing this, Adam Kocoloski, co-founder and CTO at Cloudant, told 5 Minute Briefing in an interview.

Posted July 30, 2013

A new science called "data persona analytics" (DPA) is emerging. DPA is defined as the science of determining the static and dynamic attributes of a given data set so as to construct an optimized infrastructure that manages and monitors data injection, alteration, analysis, storage and protection while facilitating data flow. Each unique set of data, both transient and permanent, has a descriptive data personality profile which can be determined through analysis using the methodologies of DPA.

Posted July 25, 2013

Oracle has announced the latest 12c releases of its Cloud Application Foundation, which integrates application server and in-memory data grid capabilities into a foundation for cloud computing, representing "a major release of our middleware infrastructure," Mike Lehmann, Oracle vice president of product management, tells 5 Minute Briefing. The focus for the products is to provide mission-critical cloud infrastructure and a lot of work has been done around native cloud capabilities, says Lehmann.

Posted July 17, 2013

The Oracle database provides intriguing possibilities for the storing, manipulating and streaming of multimedia data in enterprise class environments. However, knowledge of why and how the Oracle database can be used for multimedia applications is essential if one is to justify and maximize the ROI.

Posted July 17, 2013

MemSQL, a provider of real-time analytics, announced the availability of MemSQL 2.1, which includes new features and enhancements to enable customers to access, explore and increase the value of data, regardless of size or file format. To meet the demands posed by increasing amounts of data and data types, MemSQL has updated its analytics platform to enable customers to receive real-time results on analytical queries across both real-time and historical datasets.

Posted July 16, 2013

Storage Area Networks (SANs) and Network-Attached Storage (NAS) owe their popularity to some compelling advantages in scalability, utilization and data management. But achieving high performance for some applications with a SAN or NAS can come at a premium price. In those database applications where performance is critical, direct-attached storage (DAS) offers a cost-effective high-performance solution. This is true for both dedicated and virtualized servers, and derives from the way high-speed flash memory storage options can be integrated seamlessly into a DAS configuration. There are three primary reasons now for the renewed interest in DAS.

Posted July 09, 2013

Oracle Database 12c is available for download from Oracle Technology Network (OTN). First announced by Oracle CEO Larry Ellison during his keynote at Oracle OpenWorld 2012, Oracle Database 12c introduces a new multi-tenant architecture that simplifies the process of consolidating databases onto the cloud, enabling customers to manage many databases as one - without changing their applications. During the OpenWorld keynote, Ellison described Oracle Database 12c as "the first multi-tenant database in the world" and said it provides "a fundamentally new architecture" to "introduce the notion of a container database" with the ability to plug in multiple separate, private databases into that single container.

Posted July 09, 2013

In the realm of 21st century data organization, the business function comes first. The form of the data and the tools to manage that data will be created and maintained for the singular purpose of maximizing a business's capability of leveraging its data. Initially, this seems like an obvious statement but when examining the manner in which IT has treated data over the past four decades it becomes painfully obvious that the opposite idea has been predominant.

Posted July 09, 2013

Dell Software has introduced the latest version of the Dell KACE K1000 Management Appliance, which now includes integrated software asset management to boost software license compliance, while helping lower IT costs. The K1000 adds automated software asset identification, tracking and optimization to its capabilities for managing the deployment, operation and retirement of software assets. The need for the appliance is being fueled by a range of factors, including the influx of new technologies such as cloud computing, virtualization, and BYOD, which are adding complexity in terms of systems management, Lisa Richardson, senior product marketing manager for Endpoint Systems Management, Dell Software, tells 5 Minute Briefing.

Posted July 08, 2013

SAP AG announced this week that version 16 of the company's Sybase software has achieved a Guinness World Record for loading and indexing big data. In cooperation with BMMsoft, HP and Red Hat, SAP Sybase IQ 16 achieved an audited result of 34.3 terabytes per hour, surpassing the previous record of 14 terabytes per hour achieved by the same team using an earlier version of SAP Sybase IQ. The latest version of the real-time analytics server and enterprise data warehouse (EDW) provides a new, fully parallel data loading capability and a next-generation column store, enabling the jump in big data performance.

Posted June 27, 2013

Database Trends and Applications introduces the inaugural "DBTA 100," a list of the companies that matter most in data. The past several years have transformed enterprise information management, creating challenges and opportunities for companies seeking to extract value from a sea of data assets. In response to this, established IT vendors as well as legions of newer solution providers have rushed to create the tools to do just that.

Posted June 27, 2013

RainStor, a provider of an enterprise database for managing and analyzing historical data, says it has combined the latest data security technologies in a comprehensive product update that has the potential to rapidly increase adoption of Apache Hadoop for banks, communications providers and government agencies.

Posted June 27, 2013

The amount of data being generated, captured and analyzed worldwide is increasing at a rate that was inconceivable a few years ago. Exciting new technologies and methodologies are evolving to address this phenomenon of science and culture, creating huge new opportunities. These new technologies are also fundamentally changing the way we look at and use data. The rush to monetize "big data" makes the appeal of various "solutions" undeniable.

Posted June 27, 2013

Database Trends and Applications (DBTA) magazine has announced the inaugural "DBTA 100: The Companies That Matter Most in Data," a list saluting this year's companies in data and enterprise information management—from long-standing industry veterans to fast-growing startups tackling big data. "Beyond the explosion of interest surrounding big data, the past several years have transformed enterprise information management, creating both challenges and opportunities for companies seeking to protect, optimize, integrate, and extract actionable insight from a sea of data assets," remarked Thomas Hogan, group publisher of Database Trends and Applications.

Posted June 26, 2013

Oracle announced the general availability of MySQL Cluster 7.3, which adds foreign key support, a new NoSQL JavaScript Connector for node.js, and an auto-installer to make setting up clusters easier. MySQL Cluster is an open source, auto-sharded, real-time, ACID-compliant transactional database with no single point of failure, designed for advanced web, cloud, social and mobile applications. "Foreign key support has been a longstanding feature request from day-one," Tomas Ulin, vice president of MySQL Engineering at Oracle, tells 5 Minute Briefing.
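
For readers new to the feature, the minimal sketch below illustrates the kind of referential integrity that foreign key constraints enforce. It uses SQLite through Python purely as a neutral stand-in, not MySQL Cluster itself, and the customer/orders schema is a hypothetical example.

```python
# Illustration of foreign key enforcement (SQLite stand-in, hypothetical schema).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # SQLite requires enabling enforcement
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("""
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(id),
        total       REAL NOT NULL
    )
""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme Corp')")
conn.execute("INSERT INTO orders VALUES (10, 1, 250.0)")   # valid parent row exists

# An order pointing at a non-existent customer is rejected by the constraint.
try:
    conn.execute("INSERT INTO orders VALUES (11, 99, 75.0)")
except sqlite3.IntegrityError as e:
    print("Foreign key violation:", e)
```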

Posted June 19, 2013

Dell has released Toad for Oracle 12.0 which provides developers and DBAs with a key new capability - a seamless connection to the Toad World user community so they will no longer have to exit the tool and open a browser to gain access to the community. "The actual strength of the product has always been the input of users," John Whittaker, senior director of marketing for the Information Management Group at Dell Software, tells 5 Minute Briefing. The new ability to access the Toad World community from within Toad enables database professionals to browse, search, ask questions and start discussions directly in the Toad forums, all while using Toad.

Posted June 19, 2013

These are heady times for data products vendors and their enterprise customers. When business leaders talk about success these days, they often are alluding to a new-found appreciation for their data environments. It can even be said that the tech vendors that are making the biggest difference in today's business world are no longer software companies at all; rather, they are "data" companies, with all that implies. Enterprises are reaching out to vendors for help in navigating through the fast-moving, and often unforgiving, digital realm. The data vendors that are leading their respective markets are those that know how to provide the tools, techniques, and hand-holding needed to manage and sift through gigabytes', terabytes', and petabytes' worth of data to extract tiny but valuable nuggets of information to guide business leaders as to what they should do next.

Posted June 19, 2013

The grain of a fact table is derived from the dimensions with which the fact is associated. For example, should a fact have associations with a Day dimension, a Location dimension, a Customer dimension, and a Product dimension, then the usual assumption would be for the fact to be described as being at a "by Day," "by Location," "by Customer," "by Product" metrics level. Evidence of this specific level of granularity for the fact table is seen in the primary key of the fact being the composite of the Day dimension key, Location dimension key, Customer dimension key, and Product dimension key. However, this granularity and these relationships are easily disrupted.
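
A minimal sketch of that idea, with illustrative dimension and measure names assumed for the example: the fact table's composite primary key is exactly the set of dimension keys, which is what establishes the "by Day, by Location, by Customer, by Product" grain.

```python
# Illustrative star-schema fact table whose composite primary key is the set
# of dimension keys, fixing the grain at day/location/customer/product.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE day_dim      (day_key      INTEGER PRIMARY KEY, calendar_date TEXT);
    CREATE TABLE location_dim (location_key INTEGER PRIMARY KEY, city TEXT);
    CREATE TABLE customer_dim (customer_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE product_dim  (product_key  INTEGER PRIMARY KEY, sku TEXT);

    CREATE TABLE sales_fact (
        day_key      INTEGER NOT NULL REFERENCES day_dim(day_key),
        location_key INTEGER NOT NULL REFERENCES location_dim(location_key),
        customer_key INTEGER NOT NULL REFERENCES customer_dim(customer_key),
        product_key  INTEGER NOT NULL REFERENCES product_dim(product_key),
        sales_amount REAL NOT NULL,
        -- The composite key below is what defines the fact's grain.
        PRIMARY KEY (day_key, location_key, customer_key, product_key)
    );
""")
print("Star schema created; fact grain = day x location x customer x product")
```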

Posted June 13, 2013
