Database Management

Relational Database Management Systems (RDBMSs) continue to do the heavy lifting in data management, while newer database management systems are taking on prominent roles as well. NoSQL database systems such as Key-Value, Column Family, Graph, and Document databases are gaining acceptance due to their ability to handle unstructured and semi-structured data. MultiValue, sometimes called the fifth NoSQL database, is also a well-established database management technology that continues to evolve to address new enterprise requirements.



Database Management Articles

Embarcadero Technologies, a provider of software solutions for application and database development, has introduced DB PowerStudio XE3.5, the latest release of the company's database management and development platform. DB PowerStudio XE3.5 is a key component of Embarcadero's new metadata governance platform, which allows organizations to leverage diverse data across information management and the software development lifecycle for data governance initiatives.

Posted March 20, 2013

The Independent Oracle Users Group (IOUG) will celebrate its 20th anniversary at COLLABORATE 13, a conference on Oracle technology presented jointly by the IOUG, OAUG (Oracle Applications User Group) and the Quest International User Group. The event will be held April 7 to 11 at the Colorado Convention Center in Denver. As part of the conference, the IOUG will host the COLLABORATE 13-IOUG Forum with nearly 1,000 sessions providing user-driven content. The theme of this year's COLLABORATE 13-IOUG Forum is "Elevate - take control of your career and elevate your Oracle ecosystem knowledge and expertise," says IOUG president John Matelski.

Posted March 20, 2013

Two columns ago, I described how the TPC benchmarks are useful for getting a general idea of the performance characteristics of your preferred database vendor and hardware platform. And in last month's column, I described how the published TPC benchmarks can even help with pricing, especially when you don't have your own quantity discounts in place.

Posted March 19, 2013

Two big questions are on the minds of data professionals these days. How are increasing complexity and the inevitable onslaught of big data shaping the future of database administrators and data architects? How will our roles change? In the interest of studying the evolving landscape of data, the Independent Oracle Users Group (IOUG) took the pulse of the community. The Big Data Skills for Success study polled numerous individuals in the IOUG Oracle technology community to identify just how the responsibilities of handling data are changing and what the future of these roles looks like.

Posted March 14, 2013

When data professionals think about regulatory compliance, we tend to consider only the data in our production databases. After all, it is this data that runs our business and that must be protected. So we work to implement database auditing to know who did what to which data when; or we tackle database security and data protection initiatives to protect our data from prying eyes; or we focus on improving data quality to ensure the accuracy of our processes.

Posted March 14, 2013
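The "who did what to which data when" auditing described above can be sketched in a few lines. This is a minimal illustration using Python's built-in sqlite3 module; the table names, columns, and `audited_update` helper are invented for the example, not taken from any product mentioned here. A real deployment would typically use database-side triggers or a dedicated auditing tool so the trail cannot be bypassed by application code.

```python
import sqlite3
from datetime import datetime, timezone

# Illustrative audit trail: every change records actor, action,
# target table, row, and a UTC timestamp alongside the data change.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
    CREATE TABLE audit_log (
        actor      TEXT,
        action     TEXT,
        table_name TEXT,
        row_id     INTEGER,
        at_utc     TEXT
    );
""")

def audited_update(actor, row_id, new_balance):
    """Apply a change and write the audit record in the same transaction."""
    with conn:  # both statements commit (or roll back) together
        conn.execute("UPDATE accounts SET balance = ? WHERE id = ?",
                     (new_balance, row_id))
        conn.execute(
            "INSERT INTO audit_log VALUES (?, 'UPDATE', 'accounts', ?, ?)",
            (actor, row_id, datetime.now(timezone.utc).isoformat()))

conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
audited_update("alice", 1, 250.0)
print(conn.execute(
    "SELECT actor, action, table_name, row_id FROM audit_log").fetchall())
# → [('alice', 'UPDATE', 'accounts', 1)]
```

Keeping the data change and the audit insert in one transaction is the key design choice: it guarantees the log never disagrees with the data.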

Databases are restricted by reliance on disk-based storage, a technology that has been in place for several decades. Even with the addition of memory caches and solid state drives, the model of relying on repeated access to information storage devices remains a hindrance in capitalizing on today's "big data," according to a new survey of 323 data managers and professionals who are part of the Independent Oracle Users Group (IOUG). The survey was underwritten by SAP Corp. and conducted by Unisphere Research, a division of Information Today, Inc.

Posted March 14, 2013

Showing that there's more to the real-time SAP HANA platform than just business, the National Basketball Association (NBA) and SAP announced they have worked together to launch NBA.com/Stats, a new statistical destination on NBA.com giving fans interactive access to official NBA statistics and analysis.

Posted March 12, 2013

For the second year, Percona, a provider of professional services for MySQL, will host the Percona Live MySQL Conference and Expo. This year's conference will take place April 22 to April 25 at the Hyatt Regency Santa Clara & The Santa Clara Convention Center.

Posted March 11, 2013

Oracle has announced the general availability of Oracle Database Appliance X3-2, featuring up to twice the performance and more than four times the storage capacity of the original Oracle Database Appliance. Oracle Database Appliance is a complete package of software, server, storage and networking designed for simplicity and high availability, helping businesses of all sizes reduce risk and save time and money managing their data and applications.

Posted March 06, 2013

Rackspace Hosting has acquired ObjectRocket, a MongoDB database as a service (DBaaS) provider. Through the acquisition, Rackspace says it will broaden its OpenStack-based open cloud platform to provide a NoSQL DBaaS, and establish a strong presence within the high-growth NoSQL database market.

Posted March 04, 2013

The expansion of structured and unstructured data storage seems never-ending. At the same time, database administrators' need to reduce storage consumption is accelerating as its cost becomes more visible. Today, however, there are data optimization technologies available that can help with the continued data growth.

Posted February 27, 2013

DataCore Software, a provider of storage virtualization software, has made enhancements to its SANsymphony-V Storage Hypervisor. The new capabilities are intended to support customers who are facing high data growth, as well as the need to enable faster response times and provide continuous availability for business-critical applications.

Posted February 27, 2013

Oracle president Mark Hurd and Oracle executive vice president of product development Thomas Kurian recently hosted a conference call to provide an update on Oracle's cloud strategy and a recap of product-related developments. Oracle is trying to do two things for customers: simplify their IT and power their innovation, said Hurd.

Posted February 27, 2013

SAP has announced two new number-one results on the SAP Sales and Distribution (SD) standard application benchmark for SAP Sybase Adaptive Server Enterprise (ASE).

Posted February 27, 2013

SAP and NetApp say they are tightening their collaboration with the goal of better supporting solutions such as SAP's HANA platform and SAP's NetWeaver Landscape Virtualization Management software. According to the companies, by complementing database solutions from SAP with NetApp storage and data management solutions, they can provide customers an agile data infrastructure for SAP applications that can help them be more competitive and reduce cost of ownership.

Posted February 27, 2013

Oracle has added a new utility to its storage software portfolio that is designed to streamline and automate critical tasks for customers using Oracle Database with Oracle's Sun ZFS Storage Appliance.

Posted February 20, 2013

Expanding its cloud portfolio, Oracle has unveiled Oracle Infrastructure as a Service (Oracle IaaS) with "Capacity on Demand." "For the first time, you can get Oracle engineered systems deployed on premise behind your firewall for a monthly fee," said Oracle president Mark Hurd, during a live webcast that presented details of the offering. Announced by Oracle CEO Larry Ellison at Oracle OpenWorld in 2012, Oracle IaaS enables organizations to deploy fully integrated engineered systems, including Oracle Exadata Database Machine, Oracle Exalogic Elastic Cloud, Oracle SPARC SuperCluster, Oracle Exalytics In-Memory Machine and Oracle Sun ZFS Storage Appliance in their data centers.

Posted February 13, 2013

A profound shift is occurring in where data lives. Thanks to skyrocketing demand for real-time access to huge volumes of data—big data—technology architects are increasingly moving data out of slow, disk-bound legacy databases and into large, distributed stores of ultra-fast machine memory. The plummeting price of RAM, along with advanced solutions for managing and monitoring distributed in-memory data, means there are no longer good excuses to make customers, colleagues, and partners wait the seconds—or sometimes hours—it can take your applications to get data out of disk-bound databases. With in-memory, microseconds are the new seconds.

Posted February 13, 2013
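The memory-versus-disk gap described above is easy to demonstrate with a simple read-through cache: pay the slow fetch once, then serve repeat requests from memory. This is a hedged sketch, not any vendor's product; the `slow_fetch` delay merely simulates a disk-bound database read, and real latencies vary widely by hardware.

```python
import time

def slow_fetch(key):
    """Stand-in for a disk-bound database read (simulated latency)."""
    time.sleep(0.05)
    return key.upper()

cache = {}  # the "in-memory store": a plain dict

def cached_fetch(key):
    if key not in cache:          # miss: pay the disk cost once
        cache[key] = slow_fetch(key)
    return cache[key]             # hit: served at memory speed

t0 = time.perf_counter(); cached_fetch("orders"); cold = time.perf_counter() - t0
t0 = time.perf_counter(); cached_fetch("orders"); warm = time.perf_counter() - t0
print(f"cold read: {cold*1000:.1f} ms, warm read: {warm*1000:.4f} ms")
```

The warm read skips the simulated disk access entirely, which is the whole argument for in-memory data stores in miniature: the second access is orders of magnitude cheaper than the first.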

Enterprise developers these days are usually heads down, in the trenches working on in-depth applications using Java or .NET with data stored in SQL Server or Oracle or DB2 databases. But there are other options. One of them is FileMaker, an elegant database system and development platform that can be used to quickly build visually appealing and robust applications that run on Macs, Windows PCs, smartphones, and iPads.

Posted February 13, 2013

Oracle announced the general availability of the MySQL 5.6 open source database. According to Oracle, with this version, users can experience simplified query development and faster execution, better transactional throughput and application availability, flexible NoSQL access, improved replication and enhanced instrumentation.

Posted February 06, 2013

EMC Corporation has updated its appliance-based unified big data analytics offering. The new EMC Greenplum Data Computing Appliance (DCA) Unified Analytics Platform (UAP) Edition expands the system's analytics capabilities and solution flexibility, achieves performance gains in data loading and scanning, and adds integration with EMC's Isilon scale-out NAS storage for enterprise-class data protection and availability. Within a single appliance, the DCA integrates Greenplum Database for analytics-optimized SQL, Greenplum HD for Hadoop-based processing, as well as Greenplum partner business intelligence, ETL, and analytics applications. The appliances have been able to host both a relational database and Hadoop for some time now, Bill Jacobs, director of product marketing for EMC Greenplum, tells 5 Minute Briefing. "The significance of this launch is that we tightened that integration up even more. We make those two components directly manageable with a single administrative interface and also tighten up the security. All of that is targeted at giving enterprise customers what they need in order to use Hadoop in very mission-critical applications without having to build it all up in Hadoop themselves."

Posted January 31, 2013

SAP announced a new option for SAP Business Suite customers — SAP Business Suite powered by SAP HANA — providing an integrated family of business applications that captures and analyzes transactional data in real time on a single in-memory platform. With Business Suite on HANA, "SAP has reinvented the software that reinvented businesses," stated Rob Enslin, member of the Global Executive Board and SAP head of sales, as part of his presentation during the company's recent launch event.

Posted January 30, 2013

Each new year, this column looks back over the most significant data and database-related events of the previous year. Keeping in mind that this column is written before the year is over (in November 2012) to meet publication deadlines, let's dive into the year that was in data.

Posted January 30, 2013

Attunity Ltd., a provider of information availability software solutions, has announced the release of Attunity RepliWeb for Enterprise File Replication (EFR) 6.0, which is designed to allow organizations to quickly and easily replicate data files to and from Apache Hadoop. According to Attunity, the ability to move large amounts of data in and out of Hadoop is especially beneficial to industries that need to process big data on a regular basis, such as e-commerce, healthcare, infrastructure management, and mobile.

Posted January 29, 2013

Today's data warehouse environments are not keeping up with the explosive growth of data volume (or "big data") and the demand for real-time analytics. Fewer than one out of 10 respondents to a new survey say their data warehouse sites can deliver analysis in what they would consider a real-time timeframe. Nearly 75% of respondents believe that in-memory technology is important to enabling their organization to remain competitive in the future. Yet, almost as many also indicate they lack the in-memory skills to deliver even current business requirements. These are among the findings of a new survey of 323 data managers and professionals who are part of the Independent Oracle Users Group (IOUG). The survey was underwritten by SAP Corporation and conducted by Unisphere Research, a division of Information Today, Inc.

Posted January 29, 2013

VoltDB has announced the immediate availability of the newest version of its flagship offering, VoltDB 3.0. VoltDB is an in-memory relational database that aims to address a common problem for enterprises - the ability to build applications that can ingest, analyze and act on massive volumes of data fast enough to deliver business value. According to Bruce Reading, CEO and president of VoltDB, in order for big data to achieve its full value, developers must be able to solve the data velocity problem.

Posted January 22, 2013

Let's talk about database application benchmarking. This is a skill set which, in my opinion, is one of the major differentiators between a journeyman-level DBA and a true master of the trade. In this article, I'll be giving you a brief introduction to TPC benchmarks and, in future articles, I'll be telling you how to extract specific tidbits of very valuable information from the published benchmark results. But let's get started with an overview.

Posted January 22, 2013

Micro Focus, a provider of modernization solutions, launched new interfaces that are designed to enable organizations to more quickly modernize the end user experience in isolation from the application code. The new products, RUMBA 9.0 and RUMBA+, help transpose green-screen presentations onto more familiar Windows, mobile, and web environments. "With wider reaching deployment capabilities, end user productivity improvement is quickly delivered through a simple point and click interface," says Kevin Brearley, senior director of product management for Micro Focus.

Posted January 22, 2013

Revelation Software has announced that David Hendershot has joined the company as a member of its software development team. David has been a MultiValue programmer working primarily on the Universe platform for 10 years. He has supported the CUBS software package for a tax/medical collection agency in Pennsylvania, and started his MultiValue career at Market America, helping develop the back end of its website, which also ran on the Universe database platform.

Posted January 03, 2013

EnterpriseDB, provider of Postgres products, Oracle database compatibility solutions, and add-on tools for PostgreSQL, has released Postgres Plus xDB Replication Server 5.0 with multi-master replication (MMR). By removing a single point of failure, MMR improves availability and enables DBAs to provide consistent and expanded access to near real-time data across geographically disparate data systems, while simultaneously allowing companies to control costs. "MMR allows each master database to stay in sync with one another so that if you change something in one database it would be updated in the other databases in almost real time," Keith Alsheimer, EnterpriseDB's head of marketing, tells 5 Minute Briefing.

Posted December 21, 2012

Idera, a provider of application and server management solutions, is making available two new tools that monitor the health of SQL Server databases and maximize DBAs' efforts. Idera SQL Backup Status Reporter and SQL Fragmentation Analyzer are intended to help DBAs save time and money by reporting on database backup operations and detecting fragmentation levels across the SQL Server environment so that they can gain clear insight in order to optimize productivity.

Posted December 21, 2012

IBM has entered into a definitive agreement to acquire StoredIQ Inc., a privately held company based in Austin, Texas. IBM says StoredIQ will advance its efforts to help clients derive value from big data and respond more efficiently to litigation and regulations, dispose of information that has outlived its purpose and lower data storage costs. The acquisition is about "information economics," Ken Bisconti, VP of ECM Marketing, IBM, tells 5 Minute Briefing. "It is really about managing the value of content and information, and helping also to dispose of it defensibly as it becomes of less value."

Posted December 20, 2012

Oracle has announced new software enhancements to the Oracle SPARC SuperCluster engineered system that are intended to enable customers to consolidate any combination of mission-critical enterprise databases, middleware and applications on a single system and rapidly deploy secure, self-service cloud services.

Posted December 12, 2012

The University of Minnesota, a top research institution comprised of five campuses, 65,000 students and 25,000 employees, has made systematic changes and improved database administration efficiency with Oracle Exadata Database Machine. By hosting its IT environment on two Oracle Exadata Database Machine half racks, the university consolidated more than 200 Oracle database instances into fewer than 20, enabling it to reduce data center floor space and total cost of ownership.

Posted December 12, 2012

At OpenWorld, Oracle's annual conference for customers and partners, John Matelski, president of the IOUG, and CIO for Dekalb County, Georgia, gave his perspective on the key takeaways from this year's event. Matelski also described the user group's efforts to help the community understand the value of Oracle's engineered systems and deal with the broad implications of big data, and how the IOUG is supporting Oracle DBAs in their evolving roles.

Posted December 12, 2012

Within the information technology sector, the term architect gets thrown around quite a lot. There are software architects, infrastructure architects, application architects, business intelligence architects, data architects, information architects, and more. It seems as if any area may include someone with "architect" status. Certainly when laying out plans for a physical building, an architect has a specific meaning and role. But within IT, "architect" is used in a much fuzzier manner.

Posted December 11, 2012

Not long ago, SQL Server licensing was an easy and straightforward process. You used to take one of a few paths to get your SQL Server licenses. The first and easiest path was to buy your SQL Server license with your hardware. Want to buy an HP ProLiant DL380 for a SQL Server application? Why not get your SQL Server Enterprise Edition license with it at the same time? Just pay the hardware vendor for the whole stack, from the bare metal all the way through to the Microsoft OS and SQL Server.

Posted December 06, 2012

A proper database design cannot be thrown together quickly by novices. A practiced and formal approach to gathering data requirements and modeling data is mandatory. This modeling effort requires a formal approach to the discovery and identification of entities and data elements. Data normalization is a big part of data modeling and database design. A normalized data model reduces data redundancy and inconsistencies by ensuring that the data elements are designed appropriately.

Posted December 06, 2012
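The normalization idea above — store each fact once so redundancy and inconsistency cannot creep in — can be illustrated with a toy example. This sketch factors repeated customer details out of a flat orders list into a separate customers table; the field names and data are invented for illustration.

```python
# Denormalized input: customer name and city are repeated on every
# order, so updating "Acme" would require touching multiple rows.
flat_orders = [
    {"order_id": 1, "cust_id": 7, "cust_name": "Acme",   "cust_city": "Denver", "total": 120.0},
    {"order_id": 2, "cust_id": 7, "cust_name": "Acme",   "cust_city": "Denver", "total": 75.5},
    {"order_id": 3, "cust_id": 9, "cust_name": "Globex", "cust_city": "Austin", "total": 300.0},
]

# Normalize: name and city depend only on cust_id, so they move to a
# customers table keyed by cust_id; orders keep just the foreign key.
customers = {}
orders = []
for row in flat_orders:
    customers[row["cust_id"]] = {"name": row["cust_name"], "city": row["cust_city"]}
    orders.append({"order_id": row["order_id"], "cust_id": row["cust_id"], "total": row["total"]})

print(customers)  # each customer stored exactly once
print(orders)     # orders reference customers by key only
```

After the split, a change to a customer's city is a single update in one place — which is exactly the inconsistency-prevention benefit the paragraph describes.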

While no one can dispute the importance of enterprise resource planning (ERP) systems to organizational performance and competitiveness, executives in charge of these systems are under intense pressure to stay within or trim budgets. Close to half of the executives in a new survey say they have held off on new upgrades for at least a few years. In the meantime, at least one out of four enterprises either are scaling back or have had to scale back their recent ERP projects due to budget constraints.

Posted December 06, 2012

In-memory technology provider Terracotta, Inc. has announced that javax.cache, a caching standard for Java applications, has entered Draft Review Stage under the Java Community Process. It provides a standard approach for how Java applications temporarily cache data, an essential technology for in-memory solutions and a critical factor in achieving high performance and scalability for big data applications.

Posted November 27, 2012
