Database Management

Relational Database Management Systems (RDBMSs) continue to do the heavy lifting in data management, while newer database management systems are taking on prominent roles as well. NoSQL database systems such as Key-Value, Column Family, Graph, and Document databases are gaining acceptance due to their ability to handle unstructured and semi-structured data. MultiValue, sometimes called the fifth NoSQL database, is also a well-established database management technology that continues to evolve to address new enterprise requirements.



Database Management Articles

Today, I would like to give you a primer on how to read the benchmark reports that are published by the major database and hardware vendors. You never know when a vendor will publish a new benchmark. There's no set schedule for them to publish their test findings. Of course, you can always look for new advertisements from many of the vendors. But that's very imprecise.

Posted February 13, 2013

ScaleBase has introduced Data Traffic Manager 2.0 for MySQL databases. The new version builds on the attributes of version 1.0, which focused on improving scalability and availability for next-generation applications, as well as providing centralized management, Paul Campaniello, vice president of marketing for ScaleBase, tells 5 Minute Briefing.

Posted February 12, 2013

Oracle announced the general availability of the MySQL 5.6 open source database. According to Oracle, with this version, users can experience simplified query development and faster execution, better transactional throughput and application availability, flexible NoSQL access, improved replication and enhanced instrumentation.

Posted February 06, 2013

Oracle president Mark Hurd and Oracle executive vice president of product development Thomas Kurian recently hosted a conference call to provide an update on Oracle's cloud strategy and recap of product-related developments. Oracle is trying to do two things for customers - simplify their IT and power their innovation, said Hurd.

Posted February 06, 2013

EMC Corporation has updated its appliance-based unified big data analytics offering. The new EMC Greenplum Data Computing Appliance (DCA) Unified Analytics Platform (UAP) Edition expands the system's analytics capabilities and solution flexibility, achieves performance gains in data loading and scanning, and adds integration with EMC's Isilon scale-out NAS storage for enterprise-class data protection and availability. Within a single appliance, the DCA integrates Greenplum Database for analytics-optimized SQL, Greenplum HD for Hadoop-based processing, as well as Greenplum partner business intelligence, ETL, and analytics applications. The appliances have been able to host both a relational database and Hadoop for some time now, Bill Jacobs, director of product marketing for EMC Greenplum, tells 5 Minute Briefing. "The significance of this launch is that we tightened that integration up even more. We make those two components directly manageable with a single administrative interface and also tighten up the security. All of that is targeted at giving enterprise customers what they need in order to use Hadoop in very mission-critical applications without having to build it all up in Hadoop themselves."

Posted January 31, 2013

SAP announced a new option for SAP Business Suite customers — SAP Business Suite powered by SAP HANA — providing an integrated family of business applications that captures and analyzes transactional data in real time on a single in-memory platform. With Business Suite on HANA, "SAP has reinvented the software that reinvented businesses," stated Rob Enslin, member of the Global Executive Board and SAP head of sales, as part of his presentation during the company's recent launch event.

Posted January 30, 2013

Each new year, this column looks back over the most significant data and database-related events of the previous year. Keeping in mind that this column is written before the year is over (in November 2012) to meet publication deadlines, let's dive into the year that was in data.

Posted January 30, 2013

Attunity Ltd., a provider of information availability software solutions, has announced the release of Attunity RepliWeb for Enterprise File Replication (EFR) 6.0, which is designed to allow organizations to quickly and easily replicate data files to and from Apache Hadoop. According to Attunity, the ability to move large amounts of data in and out of Hadoop is especially beneficial to industries that need to process big data on a regular basis such as e-commerce, healthcare, infrastructure management and mobile.

Posted January 29, 2013

Today's data warehouse environments are not keeping up with the explosive growth of data volume (or "big data") and the demand for real-time analytics. Fewer than one out of 10 respondents to a new survey say their data warehouse sites can deliver analysis in what they would consider a real-time timeframe. Nearly 75% of respondents believe that in-memory technology is important to enabling their organization to remain competitive in the future. Yet, almost as many also indicate they lack the in-memory skills to deliver even current business requirements. These are among the findings of a new survey of 323 data managers and professionals who are part of the Independent Oracle Users Group (IOUG). The survey was underwritten by SAP Corporation and conducted by Unisphere Research, a division of Information Today, Inc.

Posted January 29, 2013

The call for papers is now open for the first-ever Big Data Boot Camp produced by Database Trends and Applications (DBTA) magazine and www.dbta.com. The event, which will take place May 21-22 at the New York Hilton, will provide the opportunity for speakers to share their knowledge and experience in the emerging big data arena.

Posted January 24, 2013

VoltDB has announced the immediate availability of the newest version of its flagship offering, VoltDB 3.0. VoltDB is an in-memory relational database that aims to address a common problem for enterprises - the ability to build applications that can ingest, analyze and act on massive volumes of data fast enough to deliver business value. According to Bruce Reading, CEO and president of VoltDB, in order for big data to achieve its full value, developers must be able to solve the data velocity problem.

Posted January 22, 2013

Let's talk about database application benchmarking. This is a skill set which, in my opinion, is one of the major differentiators between a journeyman-level DBA and a true master of the trade. In this article, I'll be giving you a brief introduction to TPC benchmarks and, in future articles, I'll be telling you how to extract specific tidbits of very valuable information from the published benchmark results. But let's get started with an overview.

Posted January 22, 2013

Micro Focus, a provider of modernization solutions, launched new interfaces that are designed to enable organizations to more quickly modernize the end user experience in isolation from the application code. The new products, RUMBA 9.0 and RUMBA+, help transpose green-screen presentations onto more familiar Windows, mobile, and web environments. "With wider reaching deployment capabilities, end user productivity improvement is quickly delivered through a simple point and click interface," says Kevin Brearley, senior director of product management for Micro Focus.

Posted January 22, 2013

Revelation Software has announced that David Hendershot has joined the company as a member of its software development team. David has been a MultiValue programmer working primarily on the Universe platform for 10 years. He has supported the CUBS software package for a tax/medical collection agency in Pennsylvania, and started his MultiValue career at Market America helping develop the back end of its website, which also ran on the Universe database platform.

Posted January 03, 2013

EnterpriseDB, provider of Postgres products, Oracle database compatibility solutions, and add-on tools for PostgreSQL, has released Postgres Plus xDB Replication Server 5.0 with multi-master replication (MMR). By removing a single point of failure, MMR improves availability and enables DBAs to provide consistent and expanded access to near real-time data across geographically disparate data systems, while simultaneously allowing companies to control costs. "MMR allows each master database to stay in sync with one another so that if you change something in one database it would be updated in the other databases in almost real time," Keith Alsheimer, EnterpriseDB's head of marketing, tells 5 Minute Briefing.
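
The sync behavior Alsheimer describes can be illustrated with a last-write-wins merge, one common conflict-resolution strategy in multi-master setups. The sketch below is a generic toy in Python, not how xDB Replication Server is actually implemented; the change-log format and function name are invented for illustration.

```python
# Toy multi-master convergence: each master records (key, value, timestamp)
# updates; merging keeps the newest write per key ("last-write-wins").
# Real replication products use far more sophisticated change capture
# and conflict resolution than this.

def merge_last_write_wins(*change_logs):
    """Merge per-master change logs into one converged key/value state."""
    state = {}  # key -> (timestamp, value)
    for log in change_logs:
        for key, value, ts in log:
            if key not in state or ts > state[key][0]:
                state[key] = (ts, value)
    return {k: v for k, (_, v) in state.items()}

master_a = [("price", 10, 1), ("stock", 5, 2)]
master_b = [("price", 12, 3)]  # the later update to "price" wins
merged = merge_last_write_wins(master_a, master_b)
print(merged)  # -> {'price': 12, 'stock': 5}
```

Whichever master received a change, every replica that applies the same merge rule converges on the same state, which is the property that lets each master "stay in sync with one another."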

Posted December 21, 2012

Idera, a provider of application and server management solutions, is making available two new tools that monitor the health of SQL Server databases and maximize DBAs' efforts. Idera SQL Backup Status Reporter and SQL Fragmentation Analyzer are intended to help DBAs save time and money by reporting on database backup operations and detecting fragmentation levels across the SQL Server environment so that they can gain clear insight in order to optimize productivity.

Posted December 21, 2012

IBM has entered into a definitive agreement to acquire StoredIQ Inc., a privately held company based in Austin, Texas. IBM says StoredIQ will advance its efforts to help clients derive value from big data and respond more efficiently to litigation and regulations, dispose of information that has outlived its purpose and lower data storage costs. The acquisition is about "information economics," Ken Bisconti, VP of ECM Marketing, IBM, tells 5 Minute Briefing. "It is really about managing the value of content and information, and helping also to dispose of it defensibly as it becomes of less value."

Posted December 20, 2012

Oracle has announced new software enhancements to the Oracle SPARC SuperCluster engineered system that are intended to enable customers to consolidate any combination of mission-critical enterprise databases, middleware and applications on a single system and rapidly deploy secure, self-service cloud services.

Posted December 12, 2012

The University of Minnesota, a top research institution comprising five campuses, 65,000 students, and 25,000 employees, has made systematic changes and improved database administration efficiency with Oracle Exadata Database Machine. By hosting its IT environment on two Oracle Exadata Database Machine half racks, the university consolidated more than 200 Oracle database instances into fewer than 20, enabling it to reduce data center floor space and total cost of ownership.

Posted December 12, 2012

At OpenWorld, Oracle's annual conference for customers and partners, John Matelski, president of the IOUG, and CIO for Dekalb County, Georgia, gave his perspective on the key takeaways from this year's event. Matelski also described the user group's efforts to help the community understand the value of Oracle's engineered systems and deal with the broad implications of big data, and how the IOUG is supporting Oracle DBAs in their evolving roles.

Posted December 12, 2012

Within the information technology sector, the term architect gets thrown around quite a lot. There are software architects, infrastructure architects, application architects, business intelligence architects, data architects, information architects, and more. It seems as if any area may include someone with an "architect" status. Certainly, when laying out plans for a physical building, "architect" has a specific meaning and role. But within IT, "architect" is used in a much fuzzier manner.

Posted December 11, 2012

Not long ago, SQL Server licensing was an easy and straightforward process. You used to take one of a few paths to get your SQL Server licenses. The first and easiest path was to buy your SQL Server license with your hardware. Want to buy an HP ProLiant DL380 for a SQL Server application? Why not get your SQL Server Enterprise Edition license with it at the same time? Just pay the hardware vendor for the whole stack, from the bare metal all the way through to the Microsoft OS and SQL Server.

Posted December 06, 2012

A proper database design cannot be thrown together quickly by novices. A practiced and formal approach to gathering data requirements and modeling data is mandatory. This modeling effort requires a formal approach to the discovery and identification of entities and data elements. Data normalization is a big part of data modeling and database design. A normalized data model reduces data redundancy and inconsistencies by ensuring that the data elements are designed appropriately.
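
As a brief aside from the column's point, here is a minimal sketch of what normalization buys you, shown in Python with SQLite; the table and column names are invented for illustration. Customer details are stored once in their own table rather than repeated on every order row, which removes the redundancy and the inconsistencies that come with it.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized design: each fact lives in exactly one place. In an
# unnormalized orders table, the customer's name and city would be
# repeated on every order row, so an address change could leave
# inconsistent copies behind.
cur.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    city        TEXT NOT NULL
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    item        TEXT NOT NULL
);
""")

cur.execute("INSERT INTO customers VALUES (1, 'Acme Corp', 'Austin')")
cur.executemany("INSERT INTO orders VALUES (?, 1, ?)",
                [(10, 'widget'), (11, 'gadget')])

# The city is stored once; a join reassembles the denormalized view.
rows = cur.execute("""
    SELECT o.order_id, c.name, c.city
    FROM orders o JOIN customers c USING (customer_id)
    ORDER BY o.order_id
""").fetchall()
print(rows)  # -> [(10, 'Acme Corp', 'Austin'), (11, 'Acme Corp', 'Austin')]
```

Updating the customer's city now touches one row in customers, and every order reflects it on the next join.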

Posted December 06, 2012

While no one can dispute the importance of enterprise resource planning (ERP) systems to organizational performance and competitiveness, executives in charge of these systems are under intense pressure to stay within or trim budgets. Close to half of the executives in a new survey say they have held off on new upgrades for at least a few years. In the meantime, at least one out of four enterprises either are scaling back or have had to scale back their recent ERP projects due to budget constraints.

Posted December 06, 2012

Big data is here, offering both vast opportunities and vexing challenges for every organization it touches. For a number of years, it has been understood that to be of value, information needs to be readily available, as close to real time as possible, to users in any location. Now, with the onset of "big data," the task gets more daunting. "These are all increasing the demands on both transactional and analytics data systems," says Bernie Spang, director of database software and systems for IBM.

Posted November 27, 2012

In-memory technology provider Terracotta, Inc. has announced that javax.cache, a caching standard for Java applications, has entered Draft Review Stage under the Java Community Process. It provides a standard approach for how Java applications temporarily cache data, an essential technology for in-memory solutions and a critical factor in achieving high performance and scalability of big data.
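
The pattern the standard formalizes can be sketched generically. The cache-aside example below is in Python rather than Java and does not use the javax.cache API itself; the class and method names are invented for illustration of the underlying idea, namely serving repeat lookups from memory instead of recomputing them.

```python
# Minimal cache-aside sketch (illustrative only; javax.cache / the Java
# Community Process draft defines the equivalent standard API for Java).
class SimpleCache:
    def __init__(self):
        self._store = {}

    def get_or_compute(self, key, compute):
        """Return the cached value for key, computing and storing it on a miss."""
        if key not in self._store:
            self._store[key] = compute(key)
        return self._store[key]

calls = []
def expensive(key):
    calls.append(key)  # track how often the value is actually computed
    return key.upper()

cache = SimpleCache()
cache.get_or_compute("user:42", expensive)  # miss: computes and stores
cache.get_or_compute("user:42", expensive)  # hit: served from memory
print(calls)  # -> ['user:42']  (computed only once)
```

A standard API for this pattern is what lets in-memory products be swapped behind the same application code, which is why the draft matters for scalability.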

Posted November 27, 2012

Cloud database startup NuoDB, Inc. has announced the availability of its Release Candidate 1 (RC1) for immediate download, marking the end of the company's year-long private beta trial. SQL-compliant, NuoDB guarantees ACID transactions and scales elastically in the cloud or on premises. Offering high performance and efficient resource utilization, the solution uses an asynchronous, peer-to-peer model. NuoDB will be offered in two editions: Community and Pro. The Community Edition is a free-forever version for smaller-scale applications, while the Pro Edition is designed to support applications that require high levels of transactions per second and concurrency, as well as databases in which uptime is essential.

Posted November 27, 2012

Jeff West, president of Quest International Users Group, joined by Jonathan Vaughn, Quest's executive director, talked with DBTA at Oracle OpenWorld about what's ahead for 2013. The group has launched smaller, product-concentrated events to support JD Edwards and PeopleSoft users' specific areas of interest, and expanded its range of online offerings for users who may not be able to take advantage of in-person conferences. Plans are underway to help members learn about PeopleSoft 9.2 coming in March and to prepare for the looming end of support for JD Edwards World. As always, says West, Quest continues to help get information to members from Oracle and their peers. "It is always about return on investment and aligning IT with the business. That is always on the top of people's minds."

Posted November 27, 2012

A new educational webcast examines the results of the 2012 IOUG Test, Development & QA Survey and covers the best practices and issues it highlights. Mining the data assets gathered from all corners of the enterprise, including transactions, customer data, employee input, and information about market conditions, has been essential to companies in uncovering new opportunities. But in the rush to deliver results, many IT and development departments take shortcuts, pulling live data right out of production environments to run through testing, development, and quality assurance processes.

Posted November 21, 2012

I was privileged to deliver a session entitled Managing SQL Server in a Virtual World at the PASS Summit 2012, the largest annual conference for Microsoft SQL Server. It was a packed house, literally at standing-room-only capacity. I delivered the session with my friend David Klee and we were swarmed by attendees after the session wrapped up. With almost 600 people in the room, we conducted one of those informal polls that speakers like to do along the lines of "Raise your hands if …" and the informal findings were very telling. Probably around 90% of the attendees used VMware and SQL Server in some capacity and at least 60% used it in production environments. Another important fact was that only 10% of the attendees were actually able to get information on the performance of the actual VMs themselves. Most had to get all of their information and support from the VM / System administration staff.

Posted November 13, 2012

Every organization that manages data using a DBMS requires a database administration group to ensure the effective use and deployment of the company's databases. And since most modern organizations of every size use a DBMS, most organizations have DBAs, or at least people who perform the on-going maintenance and optimization of the database infrastructure.

Posted November 13, 2012

In a new survey of 207 IT and data executives, respondents report that their organizations are behind the curve when it comes to managing the risks that could come from exposing live data to less secure settings—including development departments and outside contractors. This is an Achilles' heel that is being overlooked in data security efforts. The survey, which drew responses from the membership of the Independent Oracle Users Group (IOUG), was conducted by Unisphere Research, a division of Information Today, Inc., and sponsored by IBM. The executive summary of the report titled "Testing the Bounds of Data Governance: 2012 IOUG Test Development & QA Survey" is publicly available - and IOUG members may access the full report from the IOUG website.

Posted November 07, 2012

Mark Clark, president of the Oracle Applications Users Group (OAUG), spoke with 5 Minute Briefing at the recent Oracle OpenWorld conference about issues facing Oracle Applications users and the OAUG's plans for the year ahead. Helping members understand the significance of the cloud, Fusion applications, and EBS 12.2 are key issues, as the OAUG continues to provide education, networking opportunities, and a voice for members to Oracle, says Clark, who is also a senior partner at O2Works.

Posted October 30, 2012

HiT Software, a provider of data replication and change data capture (CDC) solutions for heterogeneous database environments, has announced the release of DBMoto 7.2, which includes support for the Actian Vectorwise analytical database. "Vectorwise is a leading analytical database that is designed to deliver incredible speed in returning queries on big data," Carolyn Hughes, director of marketing at HiT Software, tells 5 Minute Briefing. In turn, she notes, users also want to be able to get data into Vectorwise at the same level of speed so that their reporting and analytics can be as close to real time as possible. DBMoto captures data from all major relational databases including Oracle, SQL Server, IBM DB2 (all versions), MySQL, Informix, Sybase and others, and automatically passes data and any updates to Vectorwise.

Posted October 30, 2012

At SAP TechEd 2012 in Las Vegas, SAP unveiled its plans for SAP HANA Cloud, a next-generation cloud platform based on in-memory technology. As part of SAP HANA Cloud, the company also announced the general availability of SAP NetWeaver Cloud, an open standards-based application service, and SAP HANA One, a deployment of SAP HANA certified for production use on the Amazon Web Services (AWS) Cloud, as the first offerings based on SAP HANA Cloud.

Posted October 24, 2012

Oracle CEO Larry Ellison laid out four key products in his opening keynote at Oracle OpenWorld this year. The announcements - all related to the cloud - include an Oracle IaaS offering in addition to PaaS and SaaS; the addition of an Oracle Private Cloud option; Oracle Database 12c; and the new Exadata X3.

Posted October 24, 2012

Percona Live 2012, a MySQL conference, was held in New York City. With nearly 300 attendees participating, the first day of the event featured tutorials with in-depth presentations on specific topics, while the second day focused on conference sessions. Also new at Percona Live this year was an exhibit hall for MySQL ecosystem participants to put their products on display and network with potential customers. Sponsors included Clustrix, Continuent, ScaleArc, Nimbus Data, Fusion-io, Tokutek, Codership, Couchbase, Akiban, Ospero, ParElastic, SkySQL, ScaleBase, and New Relic.

Posted October 24, 2012

The opportunities and challenges presented by big data are addressed in a new report summarizing the results of a survey of data managers and professionals who are part of the Independent Oracle Users Group. The survey was underwritten by Oracle Corporation and conducted by Unisphere Research, a division of Information Today, Inc. Key highlights from the survey include the finding that more than one out of 10 data managers now have in excess of a petabyte of data within their organizations, and a majority of respondents report their levels of unstructured data are growing.

Posted October 24, 2012

Survey respondents to the IOUG Big Data survey were entered into a drawing to win an iPad by providing their email addresses. The winner of the iPad in the recent IOUG Big Data study sweepstakes drawing was Thomas F. Lewandowski, an independent Oracle DBA.

Posted October 24, 2012

Platfora has introduced what it describes as the first in-memory business intelligence platform for Hadoop. The company unveiled its product, which is now in beta, at the Strata + Hadoop World Conference in New York. It will go GA in Q1 2013. "There is a lot of excitement out there among our early customers as well as the Hadoop distribution companies and other systems vendors," Ben Werther, founder and CEO, Platfora, tells 5 Minute Briefing.

Posted October 23, 2012

Fall is my favorite time of the year for a lot of reasons. I love the cooling temperatures and the falling leaves. I enjoy the fall sports and school activities of my kids. And, perhaps best of all, I get to enjoy the yearly high point for SQL Server professionals, the annual Community Summit put on by the Professional Association for SQL Server (www.sqlpass.org). For a technologist, the reasons to attend the annual conference of your profession should be self-evident. At the PASS 2012 Summit, there are nearly 200 technical sessions from beginner to advanced level over the duration of the week of November 5.

Posted October 23, 2012
