Database Management

Relational Database Management Systems (RDBMSs) continue to do the heavy lifting in data management, while newer database management systems are taking on prominent roles as well. NoSQL database systems such as Key-Value, Column Family, Graph, and Document databases are gaining acceptance due to their ability to handle unstructured and semi-structured data. MultiValue, sometimes called the fifth NoSQL database, is also a well-established database management technology that continues to evolve to address new enterprise requirements.



Database Management Articles

While the established data warehouse excels at core analytics, enterprises need to be more agile than ever before because of the Internet of Things. In a recent DBTA roundtable webcast, Joe Caserta, president and CEO of Caserta Concepts, and Wendy Lucas, program director for IBM Data Warehouse marketing at IBM, discussed the benefits of data warehousing approaches in the cloud.

Posted June 13, 2016

Microsoft is acquiring LinkedIn in an all-cash transaction valued at $26.2 billion that is expected to close this calendar year. The announcement of the definitive agreement for purchase was made by Microsoft in an article posted on the Microsoft News Center.

Posted June 13, 2016

Splice Machine has announced that it is releasing its database management system, a dual-engine RDBMS powered by Hadoop and Spark, as an open source platform. The community edition will be free for members of the open source community, allowing them to use and modify the source code, whereas the enterprise edition will include a set of proprietary tools focused on operational features that enable DBAs and DevOps teams to maintain, tune, and keep the platform secure while it's live.

Posted June 10, 2016

Cloudera is collaborating with Microsoft to build a new open source platform that will reduce the burden on application developers leveraging Spark. The two entities, together with other open source contributors, have built a new open source Apache licensed REST-based Spark Service, called Livy, which is still in early alpha development.
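
Because Livy exposes Spark through a REST interface, a client can drive a cluster with plain HTTP calls. The sketch below (not drawn from the article, and offered with the caveat that Livy was in early alpha at the time, so paths and payloads may differ) assumes a Livy server on localhost:8998 and uses the Python requests library:

```python
import json
import time

import requests  # third-party HTTP client; pip install requests

LIVY_URL = "http://localhost:8998"  # assumed Livy host and default port
HEADERS = {"Content-Type": "application/json"}

# Open an interactive PySpark session through Livy's REST API.
session = requests.post(LIVY_URL + "/sessions",
                        data=json.dumps({"kind": "pyspark"}),
                        headers=HEADERS).json()
session_url = "{}/sessions/{}".format(LIVY_URL, session["id"])

# Wait until the session is ready to accept statements.
while requests.get(session_url, headers=HEADERS).json()["state"] != "idle":
    time.sleep(2)

# Submit a Spark statement; Livy runs it on the cluster on the client's behalf.
stmt = requests.post(session_url + "/statements",
                     data=json.dumps({"code": "sc.parallelize(range(100)).count()"}),
                     headers=HEADERS).json()

# Poll until the statement result is available.
while True:
    result = requests.get("{}/statements/{}".format(session_url, stmt["id"]),
                          headers=HEADERS).json()
    if result["state"] == "available":
        print(result["output"])
        break
    time.sleep(2)
```

The point of the design is that the application developer never needs a Spark installation or cluster credentials on the client side; the REST service brokers everything.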

Posted June 09, 2016

While temporal data support has long existed in other database platforms, it is a newly available feature in the RTM version of SQL Server 2016. In case you haven't heard of temporal data values (or, for some, "bitemporal"), here is a brief explanation.
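
To make the idea concrete, here is a minimal sketch (not from the article) of querying a system-versioned temporal table as of a past point in time from Python; the connection string, database, and dbo.Customers table are hypothetical, and pyodbc plus a SQL Server ODBC driver are assumed to be installed:

```python
import pyodbc  # third-party; pip install pyodbc, plus a SQL Server ODBC driver

# Hypothetical connection details for illustration only.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 13 for SQL Server};"
    "SERVER=myserver;DATABASE=SalesDB;UID=app_user;PWD=secret"
)
cursor = conn.cursor()

# A system-versioned temporal table retains every prior version of each row,
# so FOR SYSTEM_TIME AS OF returns the data exactly as it stood at that instant.
cursor.execute("""
    SELECT CustomerID, CreditLimit
    FROM dbo.Customers
    FOR SYSTEM_TIME AS OF '2016-01-01T00:00:00'
""")
for row in cursor.fetchall():
    print(row.CustomerID, row.CreditLimit)
```

The same SELECT without the FOR SYSTEM_TIME clause returns only the current rows, which is what makes temporal tables useful for auditing and point-in-time analysis without application changes.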

Posted June 09, 2016

The data manager now sits in the center of a revolution swirling about enterprises. In today's up-and-down global economy, opportunities and threats are coming in from a number of directions. Business leaders recognize that the key to success in hyper-competitive markets is the ability to leverage data to draw insights that predict and provide prescriptive action to stay ahead of markets and customer preferences. For that, they need to keep up with the latest solutions and approaches in data management. Here are 12 of the key technologies turning heads—or potentially opening enterprise wallets—in today's data centers.

Posted June 09, 2016

Many of the NoSQL tools out there, such as MongoDB, Couchbase, Hadoop, and others, purport to be leading a revolution and breaking the bonds of servitude to the restrictive, inflexible, established, relational market. They claim users need more, users need better … and they are there to help. Of course, when speaking about those relational flaws, the comments always focus on problematic aspects of a DBMS' physical implementation.

Posted June 09, 2016

Keeping your DBMS software up-to-date can be a significant job. The typical release cycle for DBMS software is every 18 to 36 months for major releases, with constant bug fixes and maintenance updates delivered in between those major releases.

Posted June 09, 2016

Digital transformation is taking place at an accelerated rate in the world today. CIOs are struggling to maintain legacy systems, while the world around them continues to transform at a rapid pace. At the same time, clients are expecting IT professionals to support the latest technology that becomes available. As IT continues to lag behind the digital wave, clients are becoming impatient and moving toward cloud technologies such as PaaS, IaaS, SaaS, and DBaaS to address their business requirements.

Posted June 09, 2016

Almost all organizations have migrated at least some infrastructure to the cloud. In fact, just 9% of IT departments have not migrated anything. Furthermore, databases rank in the top three for both infrastructure already migrated to the cloud and infrastructure with the highest priority for future migration.

Posted June 09, 2016

When working with data governance practitioners, I often hear comments that indicate pockets of data governance excellence (the proverbial half-full glass) or silos of data governance (half-empty) as they work toward the common goal of enterprise data governance. This is often accompanied by an observation that "if we could just get everyone to follow the rules (the same rules), then we could truly and successfully govern at the enterprise level."

Posted June 09, 2016

Data Summit 2016 in New York City drew IT managers, data architects, application developers, data analysts, project managers, and business managers. Analytics, search, machine learning, and IoT were some of the key topics of discussion in educational presentations on industry trends and technologies, keynotes, and hands-on workshops.

Posted June 08, 2016

Data Dynamics, Inc., a provider of storage management solutions for unstructured data, has introduced new software modules as part of the StorageX platform.

Posted June 08, 2016

Progress is releasing a new package of platforms that will enable enterprises to tap into the full potential of digital business. Progress DigitalFactory is a new cloud-based platform that provides a holistic, extensible solution for businesses to create omni-channel digital experiences.

Posted June 08, 2016

The General Data Protection Regulation (GDPR) is a legal framework emanating from the EU with far-ranging implications for all producers, providers, and consumers of services delivered or maintained in the cloud. Though it has yet to go into effect, this system of regulations is sure to impact cloud-based infrastructure, products, services, and, most importantly, data in the years ahead.

Posted June 08, 2016

Cloud computing is gaining ground in the enterprise since it allows businesses to concentrate on their core competencies rather than on IT. As cloud becomes more popular, organizations are focusing on hybrid strategies that combine on-premise and cloud capabilities, industry research shows. However, data integration and security remain concerns.

Posted June 07, 2016

Using data visualization to support visual data storytelling is a craft, and one that takes practice, expertise, and a good bit of drafting and rewriting. Strong visual narratives that make data easier to understand, according to The Economist, "meld the skills of computer science, statistics, artistic design, and storytelling."

Posted June 07, 2016

This week at Spark Summit, data management companies are rolling out new Spark integrations and support to enable their users to take advantage of the open source data processing framework. In addition, Databricks, the company founded by the team that created Apache Spark, has announced that the Databricks Community Edition (DCE) is now generally available.

Posted June 07, 2016

A new integrated development environment for real-time, high performance analytics, available on IBM Cloud, called the Data Science Experience, is aimed at helping to blend emerging data technologies and machine learning into existing architectures.

Posted June 07, 2016

SIOS Technology Corp., a provider of software products for optimizing and protecting business-critical application environments, is rolling out a new release of SIOS iQ, its machine learning analytics software for VM environments. Providing a key new capability, SIOS has worked with SQL Sentry to integrate iQ version 3.7 with SQL Sentry Performance Advisor to bridge what it describes as a critical gap between IT infrastructure administrators and SQL Server administrators.

Posted June 07, 2016

Emerging and newer vendors can offer fresh, innovative ways of dealing with data management and analytics challenges. Here, DBTA looks at the 10 companies whose approaches we think are worth watching.

Posted June 06, 2016

The IT landscape is always shifting and being contoured by external market forces and internal industry initiatives. Against this changing backdrop, each year, DBTA presents a list of 100 companies that matter in data, compelling us to pause and reflect on the market changes taking place.

Posted June 06, 2016

At Spark Summit in San Francisco this week, Microsoft announced it is making a major commitment to Spark, which will power Microsoft's big data and analytics offerings, including Cortana Intelligence Suite, Power BI, and Microsoft R Server.

Posted June 06, 2016

NoSQL database technology vendor Couchbase has introduced a new Couchbase Spark Connector. According to Couchbase, the new Spark connector will enable businesses to gain insights faster and deliver better customer experiences through web, mobile, and IoT applications.

Posted June 06, 2016

Attunity Ltd. is partnering with Microsoft Corp. to enable faster and easier adoption of Microsoft SQL Server 2016. The partnership allows Microsoft to tap into Attunity Replicate's data replication and change data capture technologies to achieve "zero-downtime migrations" from systems such as Oracle to SQL Server 2016.

Posted June 06, 2016

In the wide world of Hadoop today, there are seven technology areas that have garnered a high level of interest. These key areas prove that Hadoop is not just a big data tool; it is a strong ecosystem in which new projects are assured of exposure and interoperability.

Posted June 03, 2016

In a new book, titled Next Generation Databases: NoSQL, NewSQL and Big Data, Guy Harrison explores and contrasts both new and established database technologies. Harrison, who leads the team at Dell that develops the Toad, Spotlight, and SharePlex product families, wrote the book to address the gap he sees in the conversation about the latest generation of databases.

Posted June 03, 2016

Cisco and IBM are partnering to provide instant Internet of Things (IoT) insights at the edge of the network. The new approach combines technology from both companies to allow businesses and organizations in remote and autonomous locations to tap the benefits of IBM's Watson IoT and business analytics technologies and Cisco's edge analytics capabilities.

Posted June 02, 2016

As the growing size and complexity of database environments strain resources at most IT departments, many are looking for ways to standardize infrastructure and automate routine tasks to free up assets. Joe McKendrick, lead research analyst at Unisphere Research, and Sam Lucido, director for technical applications marketing at EMC Corp., recently discussed strategies for Oracle users and IT, including the results of a new survey conducted in partnership with the IOUG, VMware, and EMC regarding obstacles data managers face while managing networks and delivering reliable information.

Posted June 01, 2016

Oracle is introducing version 4.0 of its NoSQL database. First introduced in 2011, the Oracle NoSQL Database is a key-value database that evolved from Berkeley DB Java Edition, a mature, high-performance embeddable database acquired by the company. Ashok Joshi, senior director of NoSQL, Berkeley Database, and Database Mobile Server at Oracle, outlined the key enhancements in the new release.

Posted June 01, 2016

Big data represents an enormous shift for IT, said Craig S. Mullins in a presentation at Data Summit 2016 in NYC that looked at what relational database professionals need to know about big data technologies. Mullins, a principal of Mullins Consulting and the author of the DBA Corner column for DBTA, provided an overview of the changes that have taken place in the data management arena in recent years and the key technologies that are having a high impact.

Posted June 01, 2016

The ability to stand up and voice your opinion about a solution or technology that will not solve a business problem is critical. At times, choices are made for budgetary reasons, and other times, it may be organizational pressure. But regardless of the reason, as IT professionals, we need to be courageous and do the right thing, whatever that is.

Posted June 01, 2016

CSC's board of directors has unanimously approved a plan to merge the company with the Enterprise Services segment of Hewlett Packard Enterprise (HPE). The strategic combination of the two businesses will create what the companies' executives called one of the world's largest pure-play IT services companies. The new company is expected to have annual revenues of $26 billion and more than 5,000 clients in 70 countries.

Posted May 31, 2016

Dynatrace, a management tools provider, has teamed up with Pivotal, a digital services tools vendor, to deploy its application monitoring solutions for the Pivotal Cloud Foundry (PCF) platform. Dynatrace Application Monitoring Service Broker Tile and Buildpack Extensions for Pivotal Cloud Foundry will provide actionable performance insights for businesses with cloud initiatives.

Posted May 31, 2016

For the first time, scientists at IBM Research have demonstrated reliably storing three bits of data per cell using a relatively new memory technology known as phase-change memory (PCM). The current memory landscape spans from venerable DRAM to hard disk drives to ubiquitous flash. But in the last several years, PCM has attracted the industry's attention as a potential universal memory technology based on its combination of read/write speed, endurance, non-volatility and density. For example, PCM doesn't lose data when powered off, unlike DRAM, and the technology can endure at least 10 million write cycles, compared to an average flash USB stick, which tops out at 3,000 write cycles.

Posted May 31, 2016

What's on the horizon for big data, analytics, and business intelligence as technology evolves faster and faster? In the closing keynote at Data Summit 2016, John O'Brien, principal analyst and CEO at Radiant Advisors, discussed how technology will evolve and grow in the future.

Posted May 26, 2016

Thanks to the cloud and other empowering technologies such as Hadoop and Apache Spark, we're at the tipping point for big data. These technologies now provide a path to big data success for companies that otherwise lack the specialized big data skills or heretofore proprietary (and expensive) infrastructure to do it themselves. As 2016 progresses, we'll see the broader market put big data capabilities to work, and the benefits of big data will, in turn, spread beyond the privileged few companies that were early big data adopters.

Posted May 25, 2016

COLLABORATE, the annual conference presented by the OAUG, IOUG, and Quest, provides the opportunity to reflect on key changes in the Oracle ecosystem and allows the user groups to engage with their constituents about the areas of greatest importance. With the COLLABORATE 16 conference now behind her, Dr. Patricia Dues, the new president of the OAUG, talked with DBTA about what OAUG members are concerned with now and how the OAUG is helping them address emerging challenges.

Posted May 25, 2016

Data lakes are an alluring option for users with enormous amounts of information, yet questions remain regarding data accuracy, security, and relevancy. Three experts in the big data space, Anne Buff, business solutions manager for SAS best practices at the SAS Institute; Abhik Roy, database solution engineer at Experion; and Tassos Sarbanes, data architect at Credit Suisse, participated in a roundtable discussion at Data Summit 2016 that focused on these questions and more regarding data lakes.

Posted May 25, 2016

With new big data sources continually exploding onto the scene, traditional data warehouses are being challenged. Satya Bhamidipati, director of business development of big data and advanced analytics at Oracle, discussed data mining and advanced analytics techniques that will enable the monetization of data during a session at Data Summit 2016.

Posted May 25, 2016

Database backups and restores are key to developing a secure environment for users' information. A trusted backup plan is a requirement for both on-premises and cloud instances.
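
As one hedged illustration of what "trusted" can mean in practice (the article is vendor-neutral; the PostgreSQL tooling, connection details, and restore target below are assumptions of this sketch), a scripted dump that fails loudly and is periodically restore-tested might look like this:

```python
import datetime
import subprocess

# Assumed connection details; pg_dump must be installed and on the PATH.
DB_NAME = "appdb"
DB_HOST = "db.example.com"
DB_USER = "backup_user"

# Custom-format dumps (-Fc) are compressed and can be restored selectively with pg_restore.
backup_file = "appdb_{}.dump".format(datetime.date.today().isoformat())
subprocess.run(
    ["pg_dump", "-h", DB_HOST, "-U", DB_USER, "-Fc", "-f", backup_file, DB_NAME],
    check=True,  # raise immediately if the dump fails instead of failing silently
)

# A backup is only trusted once it has been restored; verify periodically, e.g.:
#   pg_restore -h <host> -U <user> -d scratchdb appdb_<date>.dump
```

The restore test is the part most plans skip, and it is the part that distinguishes a backup file from a recovery capability.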

Posted May 25, 2016

EMC Corp.'s Enterprise Content Division (ECD) is releasing an upgraded version of its EMC InfoArchive platform, enhancing the ability to secure and leverage large amounts of critical data and content.

Posted May 25, 2016

Dell is updating its SharePlex database replication and near real-time data integration solution to enable users to replicate Oracle data directly to SAP HANA, Teradata, or EnterpriseDB Postgres.

Posted May 25, 2016

It seems every week there is another data breach in the news, which translates to millions and millions of personal records, credit card numbers, and other pieces of confidential information stolen each month. The victims of these breaches include important companies with professional IT staff. Now, you may be thinking: "Shouldn't the network guys be responsible for security?"

Posted May 25, 2016

Data Summit 2016 kicked off at the New York Hilton Midtown earlier this month with keynote presentations by Ben Wellington, the creator of I Quant NY, and Nicholas Chandra, vice president of Cloud Customer Success at Oracle.

Posted May 25, 2016

Data Summit 2016, held in May in NYC, brought together IT managers, data architects, application developers, data analysts, project managers, and business managers to hear industry-leading professionals deliver educational presentations on industry trends and technologies, network with their peers, and participate in hands-on workshops. Here are 10 key takeaways from Data Summit 2016:

Posted May 23, 2016

Companies are facing a growing problem: Data is everywhere, clogging up systems and preventing enterprises from gaining meaningful insights. Data virtualization is a way to reduce data proliferation and ensure that all consumers are working from a single source.

Posted May 23, 2016
