
Data Center Management

Mainframes continue to represent the strong core of Data Center technology, while Virtualization and Cloud are also emerging as important technologies to support efficiency, scalability, and cost containment. Topics critical to Data Center Operations and Computing include hardware and software for Storage, Consolidation, High Availability, Backup & Recovery, and IT Optimization, as well as automated tools that help compensate for the growing Mainframe Skills Shortage.



Data Center Management Articles

Cloudera, provider of a data management and analytics platform built on Apache Hadoop and open source technologies, has announced the general availability of Cloudera Enterprise 5.7. According to the vendor, the new release offers an average 3x improvement for data processing with added support of Hive-on-Spark, and an average 2x improvement for business intelligence analytics with updates to Apache Impala (incubating).

Posted April 26, 2016

Zscaler, a provider of a security as a service platform, is unveiling a new service that enables organizations to provide access to internal applications and tools while ensuring the security of their networks.

Posted April 26, 2016

Neo Technology, creator of Neo4j, is releasing an improved version of its signature platform, enhancing its scalability and introducing new language drivers and a host of other developer-friendly features.

Posted April 26, 2016

The greatest power in using IoT-derived insights is the ability to respond to opportunities or threats immediately. However, enterprises largely have focused on historical reporting and will need to significantly modernize their analytics capabilities—both in understanding current events and predicting future outcomes—to take advantage of the new insights that IoT data can bring.

Posted April 25, 2016

The need for data integration has never been more intense than it has been recently. The Internet of Things and its muscular sibling, the Industrial Internet of Things, are now being embraced as a way to better understand the status and working order of products, services, partners, and customers. Mobile technology is ubiquitous, pouring in a treasure trove of geolocation and usage data. Analytics has become the only way to compete, and with it comes a need for terabytes—even petabytes—worth of data. The organization of 2016, in essence, has become a data machine, with an insatiable appetite for all the data that can be ingested.

Posted April 25, 2016

The core reason for implementing in-memory technology is to improve performance. To help accelerate adoption of in-memory technologies and provide a universal standard for columnar in-memory processing and interchange, the lead developers of 13 major open source big data projects have joined forces to create Apache Arrow, a new top level project within the Apache Software Foundation (ASF).
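The columnar layout that Arrow standardizes can be illustrated in plain Python. This sketch uses no Arrow APIs at all—only built-in lists and dicts, with invented sample data—to show why storing each field contiguously favors the analytic scans that in-memory engines optimize for:

```python
# Row-oriented layout: each record is stored together, so a scan over a
# single field must still touch every record object.
rows = [
    {"id": 1, "price": 9.5},
    {"id": 2, "price": 3.0},
    {"id": 3, "price": 7.25},
]

# Columnar layout: each field is a dense, contiguous array, so a scan
# over "price" reads one array front to back (cache- and SIMD-friendly),
# and the arrays can be handed between processes without conversion.
columns = {
    "id": [1, 2, 3],
    "price": [9.5, 3.0, 7.25],
}

# An aggregation touches only the column it needs.
total_price = sum(columns["price"])
```

The interchange benefit follows from the same picture: if two engines agree on the column layout, one can hand its arrays to the other without serializing and re-parsing row by row.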

Posted April 24, 2016

GridGain Systems, provider of enterprise-grade in-memory data fabric solutions based on Apache Ignite, is releasing a new version of its platform. GridGain Professional Edition includes the latest version of Apache Ignite plus LGPL libraries, along with a subscription that includes monthly maintenance releases containing bug fixes that have been contributed to the Apache Ignite project but would otherwise be included only in the next quarterly Ignite release.

Posted April 20, 2016

It seems every week there is another data breach in the news, which translates to millions and millions of personal records, credit card numbers, and other pieces of confidential information stolen each month. The victims of these breaches include major companies with professional IT staffs. Now, you may be thinking: "Shouldn't the network guys be responsible for security?"

Posted April 20, 2016

To help organizations that are being held back from moving enterprise workloads to a public cloud because of business, legislative, or regulatory requirements that restrict where and how they handle data, Oracle has launched a new set of offerings. Introduced at Oracle CloudWorld in Washington, DC, by Thomas Kurian, president, Oracle, "Oracle Cloud at Customer" enables organizations to get the benefits of Oracle's cloud services but in their own data center.

Posted April 20, 2016

When users require access to multiple databases on multiple servers distributed across different physical locations, database security administration can become quite complicated. The commands must be repeated for each database, and there is no central repository for easily modifying and deleting user security settings on multiple databases simultaneously. At a high level, database security boils down to answering four questions.
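The administrative burden described above can be sketched in a few lines: without a central repository, the same privilege change must be replayed against every server. Everything in this snippet is hypothetical for illustration—the host names, the grant statement, and the `connect` callable, which stands in for whatever DB-API driver a given environment uses:

```python
# Hypothetical sketch: replaying one security change across many servers.
SERVERS = ["db1.example.com", "db2.example.com", "db3.example.com"]  # assumed hosts
GRANTS = ["GRANT SELECT ON sales.orders TO report_user"]             # assumed statement

def apply_grants(connect):
    """Replay every grant statement on every server.

    `connect(host)` must return a DB-API 2.0 style connection; the
    actual driver (psycopg2, cx_Oracle, etc.) is the caller's choice.
    """
    for host in SERVERS:
        conn = connect(host)
        try:
            cur = conn.cursor()
            for stmt in GRANTS:
                cur.execute(stmt)
            conn.commit()
        finally:
            conn.close()
```

The loop makes the scaling problem visible: every new server or new grant multiplies the statements to run, and revoking or auditing access means repeating the same traversal in reverse.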

Posted April 19, 2016

Compuware has added richer visualization to ISPW, its mainframe source code management and release automation solution, and to Topaz, its mainframe developer solution. "As an ever-growing number of non-mainframe applications make an ever-growing number of calls to the mainframe, the agility of large enterprises increasingly depends on their ability to quickly, safely, and efficiently modify mainframe code," said Compuware CEO Chris O'Malley.

Posted April 18, 2016

The OpenPOWER Foundation has introduced more than 50 new infrastructure and software innovations spanning the entire system stack, including systems, boards, cards, and accelerators. Building upon the 30 OpenPOWER-based solutions already in the marketplace, the new offerings include more than 10 new OpenPOWER servers for high-performance computing and cloud deployments, with expanded support for high-performance computing and server virtualization.

Posted April 18, 2016

Serena Software, a provider of application development and release management solutions, is shipping a new version of its change, configuration, and release management solution for mainframes running z/OS. Version 8.1.1 of ChangeMan ZMF includes new capabilities to enable mainframe development.

Posted April 18, 2016

A new survey from the Ponemon Institute put the average cost of an unplanned data center outage at $7,900 a minute, a 41% increase from 2010, when the cost per minute was $5,600. Reported incidents typically lasted 86 minutes, totaling an average of $690,200 in costs, and it took an average of 119 minutes to bring the data centers back up.

Posted April 18, 2016

Sumo Logic, a provider of cloud-native, machine data analytics services, is unveiling a new platform that natively ingests, indexes, and analyzes structured metrics data and unstructured log data together in real time.

Posted April 18, 2016

Teradata, the big data analytics and marketing applications company, is making key investments in the Internet of Things (IoT) and the Analytics of Things (AoT), along with updating its signature platforms.

Posted April 18, 2016

Hortonworks is making several key updates to its platform along with furthering its mission of being a leading innovator of open and connected data solutions, enhancing partnerships with Pivotal and expanding upon established integrations with Syncsort.

Posted April 15, 2016

IDERA is releasing its SQL Inventory Check platform for free to allow database administrators (DBAs) to easily discover servers on the network and verify versions to keep them properly maintained or to prepare for migrations.

Posted April 13, 2016

Magnitude Software, a provider of Enterprise Information Management (EIM) software, unveiled a new master data management offering designed to fuel business processes with accurate customer data for informed decision making.

Posted April 08, 2016

SnapLogic is releasing its hybrid execution framework Snaplex on the Microsoft Azure Marketplace as Azureplex, giving users the ability to gain business insights faster with self-service data integration from a plethora of sources.

Posted April 07, 2016

Segment, which provides a customer data hub, has introduced a new platform that will give companies access to new types of cloud services data. The new platform, dubbed "Sources," allows users to now export, sync, and store data from CRM, marketing, and payments platforms and move it into Postgres or Amazon Redshift for analysis.

Posted April 07, 2016

Delphix is making major updates to its data operations platform, delivering enhancements to strengthen secure application development in the data center and in the cloud. "One of the main bottlenecks that we hear about over and over is management of all of this data," said Dan Graves, vice president of product marketing. "That's where Delphix comes in. Our core value is unlocking that data in a secure way to allow businesses to have a fast, fresh, full production environment."

Posted April 06, 2016

Micro Focus, an enterprise infrastructure solutions provider, plans to acquire Serena Software, Inc., a provider of application lifecycle management software, in a transaction valued at $540 million. The acquisition is expected to close in early May 2016, subject to receipt of competition clearances in the U.S. and Germany.

Posted April 04, 2016

Dynatrace, a performance tools vendor, has formed a global partnership with HCL Technologies, an IT services company. HCL will leverage Dynatrace's digital performance management technologies within HCL's DryICE platform to deliver user experience and application monitoring to HCL customers.

Posted April 04, 2016

AppFormix has integrated Intel Resource Director Technology (Intel RDT) into its performance monitoring platform, delivering performance improvements that address the "noisy neighbor" problem common to multi-tenant cloud environments. Intel RDT technology is available in the recently announced Intel Xeon processor E5-2600 v4 product family.

Posted April 04, 2016

Impelsys, provider of publishing and learning technology solutions, and Ontotext, a provider of semantic technology, are releasing an integrated technology offering.

Posted April 04, 2016

IBM says it is making it easier and faster for organizations to access and analyze data in-place on the IBM z Systems mainframe with a new z/OS Platform for Apache Spark. The platform enables Spark to run natively on the z/OS mainframe operating system.

Posted April 04, 2016

The emergence of big data, characterized in terms of its four V's—volume, variety, velocity, and veracity—has created both opportunities and challenges for credit scoring.

Posted April 04, 2016

Databricks, the company behind Apache Spark, is releasing a new set of APIs that will enable enterprises to automate their Spark infrastructure to accelerate the deployment of production data-driven applications.

Posted April 01, 2016

ManageEngine is introducing a new application performance monitoring solution, enabling IT operations teams in enterprises to gain operational intelligence into big data platforms. Applications Manager enables performance monitoring of Hadoop clusters to minimize downtime and performance degradation. Additionally, the platform's monitoring support for Oracle Coherence provides insights into the health and performance of Coherence clusters and facilitates troubleshooting of issues.

Posted April 01, 2016

Hershey's LLC recently deployed the Infosys Information Platform on AWS to analyze retail store data.

Posted March 31, 2016

It's become almost a standard career path in Silicon Valley: A talented engineer creates a valuable piece of open source software inside a larger organization, then leaves that company to create a new startup to commercialize the open source product. Indeed, this is virtually the plot line of the HBO comedy series Silicon Valley. Jay Kreps, a well-known engineer at LinkedIn and creator of the NoSQL database system Voldemort, has such a story.

Posted March 31, 2016

The traditional information engineering approach advocates the placement of as much business logic as possible inside the database management system (DBMS). But, more recently, under the umbrella of service-oriented architecture (SOA), folks are arguing for placement of that business logic in a layer of code outside the DBMS. Occasionally, those who favor locating business logic outside the DBMS have even gone so far as to say that this logic "naturally" belongs in a non-DBMS-supported layer.
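The two placements can be contrasted with a minimal SQLite sketch; the table and the rule (`qty > 0`) are invented for illustration. The same business rule lives either inside the DBMS as a CHECK constraint, where the engine enforces it for every caller, or outside the DBMS in a service layer, where every caller must remember to go through the validating code:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Inside the DBMS: the engine itself enforces the business rule, so
# every application path through the database obeys it automatically.
conn.execute("CREATE TABLE orders (id INTEGER, qty INTEGER CHECK (qty > 0))")

def insert_order(order_id, qty):
    """Attempt an insert; the DBMS rejects rows violating the rule."""
    try:
        conn.execute("INSERT INTO orders VALUES (?, ?)", (order_id, qty))
        return True
    except sqlite3.IntegrityError:
        return False

# Outside the DBMS: the same rule in application code. The database no
# longer guards it, so the rule holds only if all callers use this path.
def valid_order(qty):
    return qty > 0
```

The sketch makes the trade-off concrete: the in-DBMS rule cannot be bypassed but couples the logic to one database engine, while the application-layer rule is portable across stores but depends on disciplined callers.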

Posted March 31, 2016

Microsoft has been on a tear for the past couple of years. It has been pushing forward with a very steady stream of powerful new features and capabilities, even entire product lines, within its Data Platform business. But while Microsoft has been hard at work on this deluge of new technologies, it would be completely forgivable if you haven't noticed. The reason it's OK is that Microsoft is advancing on multiple fronts, both in the on-premises product line and even more dramatically with the Azure cloud-based products.

Posted March 31, 2016

What makes an IT organization static or dynamic? What triggers an organization to move from one to the other? The transformation is not easy and it certainly does not happen quickly. These questions can also be asked at a personal level. As an IT professional, are you more likely to be static or dynamic?

Posted March 31, 2016

Over the past two years, there have been big announcements from all of the major car manufacturers about their connected car initiatives, lots of M&A activity in the technology industry as companies race to supply the revolution, and major global alliances of telecom providers formed to provide the underlying connectivity and infrastructure. But, most of all, we are actually starting to see some of the promised transformational benefits of the Internet of Things becoming a reality.

Posted March 31, 2016

The pervasive corporate mindset to transition all levels of infrastructure to some cloud, somewhere, is accelerating the growth of the cloud industry with a rapidity so far unseen in the history of computing. This phenomenon has resulted in weighty pressure on CIOs to develop and deploy an effective and comprehensive cloud strategy or risk their organization falling behind this undeniable trend. The internet changed the information technology game, but now the cloud constitutes an entirely different league.

Posted March 31, 2016

Denodo, a provider of data virtualization software, is releasing Denodo Platform 6.0, further accelerating its "fast data" strategy. "It's a major release for us," said Ravi Shankar, Denodo CMO. There are three important areas that nobody else is focusing on in the industry, he noted. "This, we hope, will change how data virtualization, and in a broader sense, data integration will shape up this year."

Posted March 31, 2016

AtScale is unveiling an upgraded platform that introduces new innovations to enterprise security and performance.

Posted March 30, 2016

NoSQL databases were born out of the need to scale transactional persistence stores more efficiently. In a world where the relational database management system (RDBMS) was king, this was easier said than done.

Posted March 29, 2016

MapR is now available as part of Bigstep's big data platform-as-a-service, supporting a wide range of Hadoop applications.

Posted March 29, 2016

Ryft, a provider of big data analytics solutions, has introduced the Ryft ONE Cluster, which builds on the speed and efficiency of the Ryft ONE appliance to enable service providers to deliver the performance and availability capabilities their customers require for analytics.

Posted March 29, 2016

Reltio is releasing an enhanced version of Reltio Cloud 2016.1, adding new analytics integration, collaboration, and recommendation capabilities to help companies be right faster.

Posted March 29, 2016

Teradata has introduced a new "design pattern" approach for data lake deployment. The company says its concept of a data lake pattern leverages IP from its client engagements, as well as services and technology to help organizations more quickly and securely get to successful data lake deployment.

Posted March 28, 2016

The digital economy promises to redefine nearly every aspect of a company's operations—from raw material procurement through post-sale services. Yet, some of the most dramatic changes will be seen in how companies evolve their product portfolios and leverage digital capabilities.

Posted March 25, 2016

One of the most common challenges organizations face when developing an enterprise data governance program is the presence of data silos, or pockets of data initiatives scattered throughout the company with little to no coordination or collaboration between them. While many describe their data systems as "siloed," the disjunction is less about the underlying technology and far more about the divisions created between lines of business.

Posted March 25, 2016
