Data Center Management

Mainframes continue to represent the strong core of Data Center technology, while Virtualization and Cloud are also emerging as important technologies to support efficiency, scalability, and cost containment. Topics critical to Data Center Operations and Computing include hardware and software for Storage, Consolidation, High Availability, Backup & Recovery, and IT Optimization, as well as automated tools that help compensate for the growing Mainframe Skills Shortage.



Data Center Management Articles

Enterprises are constantly searching for ways to capture, leverage, and analyze data effectively. However, bottlenecks can wreak havoc on the application development process.

Posted April 29, 2016

Magnitude Software, a provider of Enterprise Information Management (EIM) software, unveiled a new master data management offering designed to fuel business processes with accurate customer data for informed decision making.

Posted April 27, 2016

BackOffice Associates, a provider of information governance and data modernization solutions, is acquiring CompriseIT, a U.K. consulting firm specializing in helping enterprises adopt SAP Business Suite 4 SAP HANA (SAP S/4HANA). BackOffice Associates' acquisition of CompriseIT is the latest initiative in a move to strengthen its expertise in helping customers as they embark on their journey to implement SAP S/4HANA.

Posted April 27, 2016

Cisco is launching an appliance that includes the MapR Converged Data Platform for SAP HANA, making it easier and faster for users to take advantage of big data. The UCS Integrated Infrastructure for SAP HANA is easy to deploy, speeds time to market, and reduces operational expenses, while giving users the flexibility to choose a scale-up (on-premises) or scale-out (cloud) storage strategy.

Posted April 27, 2016

A new survey from Ponemon Institute put the average cost of an unplanned data center outage at $7,900 a minute, a 41% increase from 2010, when the cost per minute was $5,600. Typically, reported incidents lasted 86 minutes, for an average cost of $690,200 per incident. It took 119 minutes to bring the data centers back up.

Posted April 27, 2016

Voting has opened for the 2016 DBTA Readers' Choice Awards. Cloud, in-memory, real-time, virtualization, SaaS, IoT - today, there are many opportunities for data-driven companies to take advantage of more data in more varieties flowing at greater velocity than ever before.

Posted April 27, 2016

Microsoft has been on a tear for the past couple of years. It has been pushing forward with a very steady stream of powerful new features and capabilities, even entire product lines, within its Data Platform business. But while Microsoft has been hard at work on this deluge of new technologies, it would be completely forgivable if you haven't noticed. The reason it's OK is that Microsoft is advancing on multiple fronts, both in the on-premises product line and even more dramatically with the Azure cloud-based products.

Posted April 27, 2016

Redis Labs, home of Redis, is releasing Redis on Flash for standard x86 cloud servers and partnering with Samsung to improve database performance. By running a combination of Redis on Flash and DRAM, data center managers will benefit from leveraging the high throughput and low latency characteristics of Redis while achieving substantial cost savings, according to the company.

Posted April 27, 2016

Cloudera, provider of a data management and analytics platform built on Apache Hadoop and open source technologies, has announced the general availability of Cloudera Enterprise 5.7. According to the vendor, the new release offers an average 3x improvement for data processing with added support of Hive-on-Spark, and an average 2x improvement for business intelligence analytics with updates to Apache Impala (incubating).

Posted April 26, 2016

Zscaler, a provider of a security as a service platform, is unveiling a new service that enables organizations to provide access to internal applications and tools while ensuring the security of their networks.

Posted April 26, 2016

Neo Technology, creator of Neo4j, is releasing an improved version of its signature platform, enhancing its scalability, introducing new language drivers and a host of other developer friendly features.

Posted April 26, 2016

Along with an increasing flow of big data that needs to be captured and analyzed, IT departments today also have more solution choices than ever before. However, before making a solution selection, organizations need to understand their requirements and also evaluate the attributes of the possible tools.

Posted April 25, 2016

The greatest power in using IoT-derived insights is the ability to respond to opportunities or threats immediately. However, enterprises largely have focused on historical reporting and will need to significantly modernize their analytics capabilities—both in understanding current events and predicting future outcomes—to take advantage of the new insights that IoT data can bring.

Posted April 25, 2016

The need for data integration has never been more intense than it has been recently. The Internet of Things and its muscular sibling, the Industrial Internet of Things, are now being embraced as a way to better understand the status and working order of products, services, partners, and customers. Mobile technology is ubiquitous, pouring in a treasure trove of geolocation and usage data. Analytics has become the only way to compete, and with it comes a need for gigabytes, even terabytes, of data. The organization of 2016, in essence, has become a data machine, with an insatiable appetite for all the data that can be ingested.

Posted April 25, 2016

The core reason for implementing in-memory technology is to improve performance. To help accelerate adoption of in-memory technologies and provide a universal standard for columnar in-memory processing and interchange, the lead developers of 13 major open source big data projects have joined forces to create Apache Arrow, a new top level project within the Apache Software Foundation (ASF).

Posted April 24, 2016

GridGain Systems, provider of enterprise-grade in-memory data fabric solutions based on Apache Ignite, is releasing a new version of its platform. GridGain Professional Edition includes the latest version of Apache Ignite plus LGPL libraries, along with a subscription that includes monthly maintenance releases with bug fixes that have been contributed to the Apache Ignite project but will be included only with the next quarterly Ignite release.

Posted April 20, 2016

Oracle released new versions of several of its engineered systems which provide tightly integrated hardware and software. The company has introduced the Exadata X6 Database Machine, as well as updates to the Zero Data Loss Recovery Appliance and SuperCluster M7.

Posted April 20, 2016

It seems every week there is another data breach in the news, which translates to millions and millions of personal records, credit card numbers, and other pieces of confidential information stolen each month. The victims of these breaches include important companies with professional IT staff. Now, you may be thinking: "Shouldn't the network guys be responsible for security?"

Posted April 20, 2016

To help organizations that are being held back from moving enterprise workloads to a public cloud because of business, legislative, or regulatory requirements that restrict where and how they handle data, Oracle has launched a new set of offerings. Introduced at Oracle CloudWorld in Washington, DC, by Thomas Kurian, president of Oracle, "Oracle Cloud at Customer" enables organizations to get the benefits of Oracle's cloud services but in their own data center.

Posted April 20, 2016

When users require access to multiple databases on multiple servers distributed across different physical locations, database security administration can become quite complicated. The commands must be repeated for each database, and there is no central repository for easily modifying and deleting user security settings on multiple databases simultaneously. At a high level, database security boils down to answering four questions.
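The repetition described above is easy to picture: without a central repository, the same privilege change must be scripted separately against every database on every server. A minimal sketch of that pattern, assuming a hypothetical helper that simply generates the per-database GRANT statements (the user, table, and database names are illustrative):

```python
def grant_statements(user, privilege, table, databases):
    """Emit the same GRANT once per database -- the repetitive
    per-database administration that central security tooling avoids."""
    return [
        f"GRANT {privilege} ON {db}.{table} TO '{user}';"
        for db in databases
    ]

# The identical change must be issued for every database, and revoking
# or modifying it later means repeating the loop in reverse.
for stmt in grant_statements("analyst", "SELECT", "orders",
                             ["sales_us", "sales_eu", "sales_apac"]):
    print(stmt)
```

Multiply this by dozens of users and servers and the case for a central repository of security settings becomes clear.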

Posted April 19, 2016

Compuware has added richer visualization to ISPW, its mainframe source code management and release automation solution, and to Topaz, its mainframe developer solution. "As an ever-growing number of non-mainframe applications make an ever-growing number of calls to the mainframe, the agility of large enterprises increasingly depends on their ability to quickly, safely, and efficiently modify mainframe code," said Compuware CEO Chris O'Malley.

Posted April 18, 2016

The OpenPOWER Foundation has introduced more than 50 new infrastructure and software innovations spanning the entire system stack, including systems, boards, cards, and accelerators. Building upon the 30 OpenPOWER-based solutions already in the marketplace, the new offerings include more than 10 new OpenPOWER servers for high-performance computing and cloud deployments, along with expanded services for high-performance computing and server virtualization.

Posted April 18, 2016

Serena Software, a provider of application development and release management solutions, is shipping a new version of its change, configuration, and release management solution for mainframes running z/OS. Version 8.1.1 of ChangeMan ZMF includes new capabilities to enable mainframe development.

Posted April 18, 2016

Sumo Logic, a provider of cloud-native, machine data analytics services, is unveiling a new platform that natively ingests, indexes, and analyzes structured metrics data and unstructured log data together in real time.

Posted April 18, 2016

Teradata, the big data analytics and marketing applications company, is making key investments in the Internet of Things (IoT) and the Analytics of Things (AoT), along with updating its signature platforms.

Posted April 18, 2016

Hortonworks is making several key updates to its platform, furthering its mission as a leading innovator of open and connected data solutions by enhancing partnerships with Pivotal and expanding upon established integrations with Syncsort.

Posted April 15, 2016

Everyone within an enterprise agrees that data is an asset, but it is what to do with that data that causes division between business leaders and IT personnel.

Posted April 15, 2016

IDERA is releasing its SQL Inventory Check platform for free to allow database administrators (DBAs) to easily discover servers on the network and verify versions to keep them properly maintained or to prepare for migrations.
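The version-verification step that inventory tools of this kind automate can be pictured with a short sketch. The mapping below from major version numbers to SQL Server release names is a simplified, illustrative subset, not IDERA's actual logic:

```python
# Map major SQL Server version numbers to release names
# (illustrative subset only; real tools track builds and patch levels).
RELEASES = {
    10: "SQL Server 2008",
    11: "SQL Server 2012",
    12: "SQL Server 2014",
    13: "SQL Server 2016",
}

def identify_release(version_string):
    """Return the release name for a dotted version string,
    e.g. '12.0.4100.1' maps to 'SQL Server 2014'."""
    major = int(version_string.split(".")[0])
    return RELEASES.get(major, "unknown release")

print(identify_release("12.0.4100.1"))  # SQL Server 2014
```

In practice a DBA would collect the version string from each discovered server (for example, via `SELECT @@VERSION`) and flag anything unknown or out of support before planning migrations.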

Posted April 13, 2016

There are many different definitions of the term "big data," some of them reasonable, others not so much. However, the overriding issue for many data professionals, especially those who use more traditional data management tools, is confusion about what to do with big data and how to get the most out of it.

Posted April 08, 2016

SnapLogic is releasing its hybrid execution framework Snaplex on the Microsoft Azure Marketplace as Azureplex, giving users the ability to gain business insights faster with self-service data integration from a plethora of sources.

Posted April 07, 2016

Segment, which provides a customer data hub, has introduced a new platform that will give companies access to new types of cloud services data. The new platform, dubbed "Sources," allows users to now export, sync, and store data from CRM, marketing, and payments platforms and move it into Postgres or Amazon Redshift for analysis.

Posted April 07, 2016

To shed light on the right approaches for improving query performance while at the same time enabling non-technical business users to query large, disparate datasets, Jeremy Sokolic, vice president, product, at Sisense will present a talk at Data Summit titled "Innovating Analytics: From In-Memory to In-Chip."

Posted April 06, 2016

Delphix is making major updates to its data operations platform, delivering enhancements to strengthen secure application development in the data center and in the cloud. "One of the main bottlenecks that we hear about over and over is management of all of this data," said Dan Graves, vice president of product marketing. "That's where Delphix comes in. Our core value is unlocking that data in a secure way to allow businesses to have a fast, fresh, full production environment."

Posted April 06, 2016

Micro Focus, an enterprise infrastructure solutions provider, plans to acquire Serena Software, Inc., a provider of application lifecycle management software, in a transaction valued at $540 million. The acquisition is expected to close in early May 2016, subject to receipt of competition clearances in the U.S. and Germany.

Posted April 04, 2016

Dynatrace, a performance tools vendor, has formed a global partnership with HCL Technologies, an IT services company. HCL will leverage Dynatrace's digital performance management technologies within HCL's DryICE platform to deliver user experience and application monitoring to HCL customers.

Posted April 04, 2016

AppFormix has integrated Intel Resource Director Technology (Intel RDT) into its performance monitoring platform, delivering performance improvements that address the "noisy neighbor" problem common to multi-tenant cloud environments. Intel RDT technology is available in the recently announced Intel Xeon processor E5-2600 v4 product family.

Posted April 04, 2016

A new study of 150 federal government IT managers suggests data center modernization can save at least $10 billion a year, or about one-seventh of current IT spending. Just 11% of federal IT managers believe their data centers are ready to meet agency missions today.

Posted April 04, 2016

Impelsys, provider of publishing and learning technology solutions, and Ontotext, a provider of semantic technology, are releasing an integrated technology offering.

Posted April 04, 2016

IBM says it is making it easier and faster for organizations to access and analyze data in-place on the IBM z Systems mainframe with a new z/OS Platform for Apache Spark. The platform enables Spark to run natively on the z/OS mainframe operating system.

Posted April 04, 2016

The emergence of big data, characterized in terms of its four V's—volume, variety, velocity, and veracity—has created both opportunities and challenges for credit scoring.

Posted April 04, 2016

Databricks, the company behind Apache Spark, is releasing a new set of APIs that will enable enterprises to automate their Spark infrastructure to accelerate the deployment of production data-driven applications.

Posted April 01, 2016

ManageEngine is introducing a new application performance monitoring solution, enabling IT operations teams in enterprises to gain operational intelligence into big data platforms. Applications Manager enables performance monitoring of Hadoop clusters to minimize downtime and performance degradation. Additionally, the platform's monitoring support for Oracle Coherence provides insights into the health and performance of Coherence clusters and facilitates troubleshooting of issues.

Posted April 01, 2016

Hershey's LLC recently deployed the Infosys Information Platform on AWS to analyze retail store data.

Posted March 31, 2016

It's become almost a standard career path in Silicon Valley: A talented engineer creates a valuable piece of open source software inside a larger organization, then leaves that company to found a startup that commercializes the open source product. Indeed, this is virtually the plot line of the hilarious HBO comedy series, Silicon Valley. Jay Kreps, a well-known engineer at LinkedIn and creator of the NoSQL database system Voldemort, has such a story.

Posted March 31, 2016
