Data Center Management

Mainframes continue to represent the strong core of Data Center technology, while Virtualization and Cloud are also emerging as important technologies to support efficiency, scalability, and cost containment. Topics critical to Data Center Operations and Computing include hardware and software for Storage, Consolidation, High Availability, Backup & Recovery, and IT Optimization, as well as automated tools that help compensate for the growing Mainframe Skills Shortage.



Data Center Management Articles

As enterprises search for ways to support modern data applications and keep up with the pace of the revolving door of new technologies and solutions, bottlenecks become more frequent and put a stop to application development.

Posted May 12, 2016

Ensuring data is governed properly is a hot topic as more tools and capabilities to analyze and gain insights become available. At the same time, data discovery is becoming more imperative as analysts must be able to move through the process with as little friction as possible.

Posted May 11, 2016

Public cloud services vendors have had a number of notable outages that have taken down well-respected web-based companies, temporarily putting them out of business. In a session presented at Data Summit 2016, Michael J. Corey, president of Ntirety - A HOSTING Company, and Don Sullivan, system engineer database specialist at VMware, suggested that, even though cloud is no longer a cutting-edge, disruptive technology 10 years after it first came to the fore, in many ways the public cloud environment is still the Wild West.

Posted May 11, 2016

At Data Summit 2016, Robin A. Thottungal, the EPA's first chief data scientist, explained how the agency is becoming more data-driven and shared some of the challenges the agency has faced, and the innovative solutions it has adopted, in implementing real-time monitoring of environmental parameters about the current state of our ecosystem.

Posted May 10, 2016

At the center of the new big data movement is the Hadoop framework, which provides an efficient file system and related ecosystem of solutions to store and analyze big datasets. The Hadoop ecosystem was addressed from two points of view in a session at Data Summit 2016. James Casaletto, principal solutions architect, Professional Services at MapR, presented a talk titled "Harnessing the Hadoop Ecosystem," and Tassos Sarbanes, mathematician / data scientist, Investment Banking at Credit Suisse, covered the advantages of HBase in a talk titled "HBase Data Model - The Ultimate Model on Hadoop."

Posted May 10, 2016

Despite the increasing focus on offering more access to more users in organizations, ad hoc querying of big data remains a problem for most, according to Jair Aguirre, data scientist at Booz Allen Hamilton, who presented a session at Data Summit 2016 titled "De-Siloing Data Using Apache Drill."

Posted May 10, 2016

Big data represents an enormous shift for IT, said Craig S. Mullins in a presentation at Data Summit 2016 in NYC that looked at what relational database professionals need to know about big data technologies. Mullins, a principal of Mullins Consulting and the author of the DBA Corner column for DBTA, provided an overview of the changes that have taken place in the data management arena in recent years, and the key technologies that are having high impact.

Posted May 10, 2016

IT and businesses don't always see eye to eye when it comes to overall goals within an enterprise. To address this glaring issue, Anne Buff, business solutions manager and thought leader for SAS Best Practices, a thought leadership organization at SAS Institute, discussed aligning data strategy goals at Data Summit 2016.

Posted May 10, 2016

The pace of technology is moving faster than ever and more enterprises are struggling with how to keep up to date with their data architecture. John O'Brien, principal analyst and CEO, at Radiant Advisors, discussed how to design and implement data architecture for the ever evolving big data world during Data Summit 2016.

Posted May 10, 2016

Data Summit 2016 kicked off at the New York Hilton Midtown earlier this month with keynote presentations by Ben Wellington, the creator of I Quant NY, and Nicholas Chandra, vice president of Cloud Customer Success at Oracle.

Posted May 10, 2016

Dell is updating its SharePlex database replication and near real-time data integration solution to enable users to replicate Oracle data directly to SAP HANA, Teradata, or EnterpriseDB Postgres.

Posted May 10, 2016

EMC Corp.'s Enterprise Content Division (ECD) is releasing an upgraded version of its EMC InfoArchive platform, enhancing the ability to secure and leverage large amounts of critical data and content.

Posted May 09, 2016

The world's data has doubled in 18 months' time. The industry estimates that the global amount of storage will reach 40 ZB by 2020. Historically, storage architectures were built on solutions that could only scale vertically. This legacy approach to storage presents significant challenges to being able to store the tremendous quantities of data being created today in a way that is cost-effective and maintains high levels of performance. Today, most of the world's data centers are still using vertical scaling solutions for storage, which means that organizations are seeking alternatives that allow them to scale cheaply and efficiently in order to remain competitive. And now, with software defined storage moving forward, we see the use of more scale-out storage solutions in data centers.

Posted May 04, 2016

The elastic and distributed technologies used to run modern applications require a new approach to operations — one that learns about your infrastructure and assists IT operators with maintenance and problem-solving. The inter-dependencies between new applications are creating chaos in existing systems and surfacing the operational challenges of modern systems. Solutions like microservices architectures alleviate the scalability pains of centralized proprietary services, but at a tremendous cost in complexity.

Posted May 04, 2016

The latest release of Oracle Database (12.1.0.2) offers a unique set of features that promise gains in application workload performance, especially for analytics and data warehousing queries. This release debuts Oracle Database In-Memory, which provides a new columnar format - the In-Memory Column Store (IMCS) - for data that is likely to be accessed regularly for aggregation or analysis, as well as other features, such as In-Memory Aggregation and In-Memory Joins, that potentially offer several orders of magnitude of performance improvement. Finally, the new In-Memory Advisor makes short work of determining exactly which database objects are most likely to benefit from the IMCS.

Posted May 04, 2016

Being able to assess the effectiveness and performance of your database systems and applications is one of the most important things that a DBA must be able to do. This can include online transaction response time evaluation, sizing of the batch window and determining whether it is sufficient for the workload, end-to-end response time management of distributed workload, and more. But in order to accurately gauge the effectiveness of your current environment and setup, service level agreements, or SLAs, are needed.

Posted May 04, 2016
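
The gap between measured performance and an agreed service level can be checked mechanically. A minimal sketch in Python, assuming a 95th-percentile response-time SLA and hand-picked sample latencies (both purely illustrative):

```python
# Hypothetical response times (seconds) sampled from an online transaction
# workload; in practice these would come from a monitoring tool.
samples = [0.21, 0.34, 0.18, 0.95, 0.27, 0.41, 0.33, 1.20, 0.29, 0.38]

SLA_P95_SECONDS = 1.0  # assumed SLA: 95% of transactions under 1 second

def p95(values):
    """Return the 95th-percentile value of a sample (nearest-rank method)."""
    ordered = sorted(values)
    index = max(0, round(0.95 * len(ordered)) - 1)
    return ordered[index]

observed = p95(samples)
meets_sla = observed <= SLA_P95_SECONDS
print(f"p95 = {observed:.2f}s, SLA met: {meets_sla}")
```

The same comparison extends to batch-window sizing or end-to-end distributed response times; the essential step is always measuring the workload against an explicitly agreed target.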

What makes an IT organization static or dynamic? What triggers an organization to move from one to the other? The transformation is not easy and it certainly does not happen quickly. These questions can also be asked at a personal level. As an IT professional, are you more likely to be static or dynamic?

Posted May 04, 2016

Oracle database migration can pose a variety of learning curve challenges. However, a platform does exist that can make the transition easier. In a recent DBTA webinar, Bill Brunt, product manager of SharePlex at Dell, discussed how users can reduce downtime, migrate at speed, eliminate risk, and validate success by tapping into SharePlex.

Posted May 03, 2016

IBM announced an expansion of its flash storage portfolio with three new all-flash array products incorporating upgraded performance and a minimum latency of 250 microseconds.

Posted May 02, 2016

Coho Data, a provider of scale-out flash storage for the enterprise, announced support for OpenStack on DataStream and that the Coho Cinder driver is part of OpenStack Mitaka, released earlier this month.

Posted May 02, 2016

Dell Networking introduced new capabilities for campus and data center environments, including the launch of new cloud-managed wired and wireless solutions powered by Dell and Aerohive, Operating System 10 milestones and in-rack platforms for the data center.

Posted May 02, 2016

A lack of standards doesn't help when talking to non-experts, says Daniel Kusnetzky, noted industry analyst and formerly with IDC, who recounted a recent conversation about what constitutes a software-defined data center. A "software-defined" data center is a virtual environment whose "control and monitoring functions have been made available using a standard interface," he explained. That not only enables IT managers to stay on top of critical settings - such as memory, storage, processing, and network bandwidth - but also allows systems to adjust themselves, he added. Such are the advantages of being software-defined.

Posted May 02, 2016

The new name for Dell after it merges with EMC later in 2016 will be Dell Technologies. The new name was announced by Michael Dell, chairman and CEO of Dell Inc., at EMC World and in a letter to Dell team members.

Posted May 02, 2016

Enterprises are constantly searching for ways to capture, leverage, and analyze data effectively. However, bottlenecks can wreak havoc on the application development process.

Posted April 29, 2016

Magnitude Software, a provider of Enterprise Information Management (EIM) software, unveiled a new master data management offering designed to fuel business processes with accurate customer data for informed decision making.

Posted April 27, 2016

BackOffice Associates, a provider of information governance and data modernization solutions, is acquiring CompriseIT, a U.K. consulting firm specializing in helping enterprises adopt SAP Business Suite 4 SAP HANA (SAP S/4HANA). BackOffice Associates' acquisition of CompriseIT is the latest initiative in a move to strengthen its expertise in helping customers as they embark on their journey to implement SAP S/4HANA.

Posted April 27, 2016

Cisco is launching an appliance that includes the MapR Converged Data Platform for SAP HANA, making it easier and faster for users to take advantage of big data. The UCS Integrated Infrastructure for SAP HANA is easy to deploy, speeds time to market, and reduces operational expenses, while giving users the flexibility to choose a scale-up (on-premises) or scale-out (cloud) storage strategy.

Posted April 27, 2016

A new survey from Ponemon Institute put the average cost of an unplanned data center outage at $7,900 a minute, a 41% increase from 2010, when the cost per minute was $5,600. Typically, reported incidents lasted 86 minutes, with an average total cost of $690,200, and it took 119 minutes to bring the data centers back up.

Posted April 27, 2016
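
The survey's figures can be combined for a rough sanity check. Note that the per-minute cost and the average total cost are reported separately in the survey, so multiplying them yields only an approximation, not the survey's own total:

```python
# Averages reported by the Ponemon survey; the per-minute rate and the
# total cost are independent averages, so their product is an estimate only.
COST_PER_MINUTE = 7_900   # USD per minute of outage
OUTAGE_MINUTES = 86       # typical reported incident duration
RECOVERY_MINUTES = 119    # typical time to bring data centers back up

estimated_cost = COST_PER_MINUTE * OUTAGE_MINUTES
print(f"Estimated cost of an 86-minute outage: ${estimated_cost:,}")
```

At $7,900 a minute, an 86-minute incident comes to roughly $679,400, in the same ballpark as the survey's separately reported $690,200 average.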

Voting has opened for the 2016 DBTA Readers' Choice Awards. Cloud, in-memory, real-time, virtualization, SaaS, IoT - today, there are many opportunities for data-driven companies to take advantage of more data in more varieties flowing at greater velocity than ever before.

Posted April 27, 2016

Microsoft has been on a tear for the past couple of years. It has been pushing forward with a very steady stream of powerful new features and capabilities, even entire product lines, within its Data Platform business. But while Microsoft has been hard at work on this deluge of new technologies, it would be completely forgivable if you haven't noticed. The reason it's OK is that Microsoft is advancing on multiple fronts, both in the on-premises product line and even more dramatically with the Azure cloud-based products.

Posted April 27, 2016

Redis Labs, home of Redis, is releasing Redis on flash for standard x86 servers in the cloud, and is partnering with Samsung to improve database performance. By running a combination of Redis on flash and DRAM, data center managers will benefit from the high throughput and low latency characteristics of Redis while achieving substantial cost savings, according to the company.

Posted April 27, 2016

Cloudera, provider of a data management and analytics platform built on Apache Hadoop and open source technologies, has announced the general availability of Cloudera Enterprise 5.7. According to the vendor, the new release offers an average 3x improvement for data processing with added support of Hive-on-Spark, and an average 2x improvement for business intelligence analytics with updates to Apache Impala (incubating).

Posted April 26, 2016

Zscaler, a provider of a security as a service platform, is unveiling a new service that enables organizations to provide access to internal applications and tools while ensuring the security of their networks.

Posted April 26, 2016

Neo Technology, creator of Neo4j, is releasing an improved version of its signature platform, enhancing its scalability and introducing new language drivers and a host of other developer-friendly features.

Posted April 26, 2016

Along with an increasing flow of big data that needs to be captured and analyzed, IT departments today also have more solution choices than ever before. However, before making a solution selection, organizations need to understand their requirements and also evaluate the attributes of the possible tools.

Posted April 25, 2016

The greatest power in using IoT-derived insights is the ability to respond to opportunities or threats immediately. However, enterprises largely have focused on historical reporting and will need to significantly modernize their analytics capabilities—both in understanding current events and predicting future outcomes—to take advantage of the new insights that IoT data can bring.

Posted April 25, 2016
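
The shift from historical reporting to immediate response can be as simple as evaluating each reading as it arrives rather than batching it for later analysis. A minimal Python sketch, with the sensor names and threshold as illustrative assumptions:

```python
# Evaluate each incoming IoT reading immediately instead of storing it
# for after-the-fact reporting. Threshold and sensors are hypothetical.
TEMP_ALERT_THRESHOLD = 80.0  # assumed upper limit, in degrees

def handle_reading(sensor_id, value, threshold=TEMP_ALERT_THRESHOLD):
    """Return an alert string for an out-of-range reading, else None."""
    if value > threshold:
        return f"ALERT {sensor_id}: {value} exceeds {threshold}"
    return None

# A few simulated readings arriving in order.
stream = [("pump-1", 72.5), ("pump-2", 85.1), ("pump-1", 79.9)]
alerts = [a for a in (handle_reading(s, v) for s, v in stream) if a]
print(alerts)
```

A production pipeline would sit behind a message broker and feed a predictive model rather than a fixed threshold, but the structural change is the same: decisions move to the moment the data arrives.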

The need for data integration has never been more intense than it has been recently. The Internet of Things and its muscular sibling, the Industrial Internet of Things, are now being embraced as a way to better understand the status and working order of products, services, partners, and customers. Mobile technology is ubiquitous, pouring in a treasure trove of geolocation and usage data. Analytics has become the only way to compete, and with it comes a need for gigabytes—and terabytes—worth of data. The organization of 2016, in essence, has become a data machine, with an insatiable appetite for all the data that can be ingested.

Posted April 25, 2016

The core reason for implementing in-memory technology is to improve performance. To help accelerate adoption of in-memory technologies and provide a universal standard for columnar in-memory processing and interchange, the lead developers of 13 major open source big data projects have joined forces to create Apache Arrow, a new top level project within the Apache Software Foundation (ASF).

Posted April 24, 2016
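
The motivation for a columnar in-memory format can be seen in plain Python. This is an illustration of the layout only, not Arrow's actual API: aggregating one field in a columnar representation touches a single contiguous array instead of every row object.

```python
# Row-wise layout: each record holds every field, so an aggregation
# over one column must still walk all the row objects.
rows = [
    {"region": "east", "sales": 100},
    {"region": "west", "sales": 250},
    {"region": "east", "sales": 175},
]

# Columnar layout of the same table: one array per column. This is the
# property in-memory columnar engines such as Apache Arrow exploit for
# fast analytics and zero-copy interchange between systems.
columns = {
    "region": [r["region"] for r in rows],
    "sales": [r["sales"] for r in rows],
}

# Aggregating a single column scans one contiguous array.
total_sales = sum(columns["sales"])
print(total_sales)
```

Because every participating project agrees on the same in-memory column layout, data can move between engines without the serialize/deserialize step a row-oriented handoff would require.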

GridGain Systems, provider of enterprise-grade in-memory data fabric solutions based on Apache Ignite, is releasing a new version of its platform. GridGain Professional Edition includes the latest version of Apache Ignite plus LGPL libraries, along with a subscription that includes monthly maintenance releases with bug fixes that have been contributed to the Apache Ignite project but will be included only with the next quarterly Ignite release.

Posted April 20, 2016

Oracle released new versions of several of its engineered systems which provide tightly integrated hardware and software. The company has introduced the Exadata X6 Database Machine, as well as updates to the Zero Data Loss Recovery Appliance and SuperCluster M7.

Posted April 20, 2016

It seems every week there is another data breach in the news, which translates to millions and millions of personal records, credit card numbers, and other pieces of confidential information stolen each month. The victims of these breaches include important companies with professional IT staff. Now, you may be thinking: "Shouldn't the network guys be responsible for security?"

Posted April 20, 2016

To help organizations that are being held back from moving enterprise workloads to a public cloud because of business, legislative, or regulatory requirements that restrict where and how they handle data, Oracle has launched a new set of offerings. Introduced at Oracle CloudWorld in Washington, DC, by Thomas Kurian, president, Oracle, "Oracle Cloud at Customer" enables organizations to get the benefits of Oracle's cloud services but in their own data center.

Posted April 20, 2016

When users require access to multiple databases on multiple servers distributed across different physical locations, database security administration can become quite complicated. The commands must be repeated for each database, and there is no central repository for easily modifying and deleting user security settings on multiple databases simultaneously. At a high level, database security boils down to answering four questions.

Posted April 19, 2016
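
The repetition problem looks roughly like this sketch, in which the database list and the run_sql helper are hypothetical placeholders rather than a real driver API:

```python
# Without a central security repository, the same grant has to be issued
# against every database on every server. Names here are illustrative.
DATABASES = ["sales_ny", "sales_la", "sales_chi"]

def run_sql(database, statement):
    # Placeholder: a real implementation would open a connection to each
    # server and execute the statement through a database driver.
    return f"[{database}] {statement}"

def grant_everywhere(user, privilege, table):
    """Issue an identical GRANT against every known database."""
    stmt = f"GRANT {privilege} ON {table} TO {user};"
    return [run_sql(db, stmt) for db in DATABASES]

results = grant_everywhere("analyst1", "SELECT", "orders")
for line in results:
    print(line)
```

Revoking or modifying access means repeating the loop, which is exactly the administrative burden a central repository of user security settings removes.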

Compuware has added richer visualization to ISPW, its mainframe source code management and release automation solution, and to Topaz, its mainframe developer solution. "As an ever-growing number of non-mainframe applications make an ever-growing number of calls to the mainframe, the agility of large enterprises increasingly depends on their ability to quickly, safely, and efficiently modify mainframe code," said Compuware CEO Chris O'Malley.

Posted April 18, 2016

The OpenPOWER Foundation has introduced more than 50 new infrastructure and software innovations spanning the entire system stack, including systems, boards, cards, and accelerators. Building upon the 30 OpenPOWER-based solutions already in the marketplace, the new offerings add more than 10 new OpenPOWER servers for high-performance computing and cloud deployments, along with expanded services for high-performance computing and server virtualization.

Posted April 18, 2016

Serena Software, a provider of application development and release management solutions, is shipping a new version of its change, configuration, and release management solution for mainframes running z/OS. Version 8.1.1 of ChangeMan ZMF includes new capabilities to enable mainframe development.

Posted April 18, 2016

Sumo Logic, a provider of cloud-native, machine data analytics services, is unveiling a new platform that natively ingests, indexes, and analyzes structured metrics data, and unstructured log data together in real-time.

Posted April 18, 2016

Teradata, the big data analytics and marketing applications company, is making key investments in the Internet of Things (IoT) and the Analytics of Things (AoT), along with updating its signature platforms.

Posted April 18, 2016

Hortonworks is making several key updates to its platform and furthering its mission as a leading innovator of open and connected data solutions by enhancing partnerships with Pivotal and expanding upon established integrations with Syncsort.

Posted April 15, 2016
