
Data Center Management

Mainframes continue to represent the strong core of Data Center technology, while Virtualization and Cloud are also emerging as important technologies to support efficiency, scalability, and cost containment. Topics critical to Data Center Operations and Computing include hardware and software for Storage, Consolidation, High Availability, Backup & Recovery, and IT Optimization, as well as automated tools that help compensate for the growing Mainframe Skills Shortage.



Data Center Management Articles

Scribe Software, a global data integration leader, has opened a new data center in Europe, designed to meet significant new demand for integration-platform-as-a-service (iPaaS) throughout Europe, the Middle East, and Africa (EMEA). Located in Frankfurt, Germany, Scribe's European data center is aimed at helping global organizations maintain compliance with all applicable regulations for storing data, including the European Union's General Data Protection Regulation (GDPR).

Posted June 20, 2017

Quest Software, a systems management and security software provider, has announced general availability of SharePlex v9, the latest release of the company's replication and real-time data integration solution. With version 9, SharePlex enables customers to replicate data from both Oracle and SQL Server to more target databases both on-premises and in the cloud.

Posted June 20, 2017

Syncsort, a provider of data integrity and integration solutions for next-generation analytics, has announced new capabilities in its mainframe data access and integration solution that populates Hadoop data lakes with changes in mainframe data.

Posted June 20, 2017

IBM has a new service, delivered via the IBM Cloud, that is designed to assist users in gaining access to preferred business applications. IBM Cloud Identity Connect is an Identity-as-a-Service (IDaaS), which helps provide users with rapid access to thousands of popular cloud apps while enabling single sign-on to their applications, whether from the cloud or on-premises.

Posted June 19, 2017

BMC, a provider of IT service solutions, has rolled out Control-M Workbench, a no-cost, self-service, standalone development environment which builds on the Control-M Automation API that was introduced in 2016.

Posted June 19, 2017

Pure Storage, an all-flash data platform vendor, has introduced new software and hardware innovations for the FlashArray product line that addresses cloud workload integration.

Posted June 19, 2017

Talend, a provider of cloud and big data integration solutions, has unveiled a new version of the Talend Data Fabric platform, optimized to manage cloud and multi-cloud enterprise IT environments. Talend Summer '17 is intended to help manage information across Amazon Web Services, Cloudera Altus, Google Cloud Platform, Microsoft Azure, and Snowflake platforms, enabling customers to integrate, cleanse and analyze data.

Posted June 19, 2017

"Platforms" are all the rage in software positioning and messaging. And recently, a new platform has become the "platform du jour": the "data platform," driven by the urgency felt by enterprises as they struggle to manage an increasing amount of data and an increasing number of data formats, generated by an increasing number of applications running on an increasingly diverse mix of infrastructure.

Posted June 16, 2017

New and emerging vendors offer fresh ways of dealing with data management and analytics challenges in areas such as data as a service, security as a service, cloud in a box, and data visualization. Here, DBTA looks at the 10 companies whose approaches we think are worth watching.

Posted June 16, 2017

Software audits are becoming a major risk to organizations. Microsoft, Oracle, SAP and other leading software vendors keep close tabs on their customers for potential license violations and true-up costs. A common occurrence is deploying more copies of software than the license agreement allows. But taking software inventory is a time-consuming and laborious process. A better approach is an integrated ITAM and asset information source.

Posted June 16, 2017

The Apache Arrow project is a standard for representing data for in-memory processing. Hardware evolves rapidly. Because Apache Arrow was designed to benefit many different types of software in a wide range of hardware environments, the project team focused on making the work "future-proof," which meant anticipating changes to hardware over the next decade.
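The core idea behind Arrow's standard is a columnar in-memory layout: values of the same field are stored contiguously, which suits vectorized processing on modern hardware. The following plain-Python sketch (not the Arrow API itself; field names are hypothetical) illustrates the regrouping of row records into columns:

```python
# Hypothetical sketch of the columnar idea behind Apache Arrow:
# row-oriented records are pivoted so each field is stored contiguously.

def to_columnar(records):
    """Pivot a list of row dicts into a dict of column lists."""
    columns = {}
    for record in records:
        for field, value in record.items():
            columns.setdefault(field, []).append(value)
    return columns

rows = [
    {"account": 101, "balance": 250.0},
    {"account": 102, "balance": 75.5},
    {"account": 103, "balance": 1200.25},
]

cols = to_columnar(rows)
print(cols["account"])  # [101, 102, 103]
print(cols["balance"])  # [250.0, 75.5, 1200.25]
```

In Arrow itself the column buffers are typed, contiguous memory regions that can be shared between processes and languages without copying, which is what makes the format attractive as a cross-software standard.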

Posted June 14, 2017

IBM and Hortonworks are expanding their partnership focused on extending data science and machine learning to more developers and across the Apache Hadoop ecosystem. The companies are combining the Hortonworks Data Platform (HDP) with the IBM Data Science Experience and IBM Big SQL into new integrated solutions designed to help users better analyze and manage data for better decision making.

Posted June 13, 2017

Melissa, a provider of global data quality and identity verification solutions, has achieved certification in the EU-U.S. Privacy Shield Framework, which establishes principles to ensure privacy of customer data shared in the process of transatlantic commerce. According to Melissa, this certification supports Melissa's ongoing commitment to the security, privacy, and availability of customer data for a worldwide clientele.

Posted June 13, 2017

Attunity Ltd., a provider of data integration and big data management software solutions, is launching a new solution, Attunity Compose for Hive, which automates the process of creation and continuous loading of operational and historical data stores in a data lake.

Posted June 13, 2017

Addressing the rise of hybrid deployments, Hortonworks has introduced a new software support subscription to provide seamless support to organizations as they transition from on-premises to cloud. Separately, Hortonworks also announced the general availability of Hortonworks Dataflow (HDF) 3.0, a new release of its open source data-in-motion platform, which enables customers to collect, curate, analyze, and act on all data in real time, across the data center and cloud.

Posted June 12, 2017

TIBCO Software Inc., a provider of software for integration, API management, and analytics, has announced the new TIBCO Cloud platform, featuring the new TIBCO Cloud Live Apps offering alongside the recognized TIBCO Cloud Integration.

Posted June 06, 2017

Databricks has introduced a new offering to simplify the management of Apache Spark workloads in the cloud. "Databricks Serverless" is a managed computing platform for Apache Spark that allows teams to share a pool of computing resources and automatically isolates users and manages costs. The new offering aims to remove the complexity and cost of users managing their own Spark clusters.

Posted June 06, 2017

Syncsort, a provider of data integration solutions for next-generation analytics, has announced new solutions that bring together its industry-leading big data integration and recently acquired Trillium data quality software to address data governance and customer 360 initiatives within data lakes.

Posted June 06, 2017

PremiumSoft, which develops the Navicat database management and development software, has announced the release of Navicat version 12 with a brand-new interface for its Navicat family of products. Its flagship product, Navicat Premium, allows users to access up to six databases in a single application, including MySQL, MariaDB, SQL Server, SQLite, Oracle, and PostgreSQL, minimizing workflow disruption and increasing productivity and efficiency.

Posted June 06, 2017

Unravel Data, which provides an APM platform designed for big data, has added integrated support for Cloudera Impala and Apache Kafka into its platform.

Posted June 06, 2017

Information Builders, a provider of business intelligence (BI) and analytics, data integrity, and integration solutions, has announced five key enhancements to help customers accelerate data-driven decision making.

Posted June 06, 2017

Qubole, a data platform provider, is building an autonomous data platform designed to automate and analyze platform usage to make data teams more effective. The next-generation data platform includes three new products: Qubole Data Service (QDS) Community Edition, QDS Enterprise Edition, and QDS Cloud Agents.

Posted June 05, 2017

Software Diversified Services (SDS), a provider of mainframe software, has acquired the SMA_RT software product from Type80 Security Software, Inc. SMA_RT is a real-time mainframe intrusion detection, SIEM agent, and log-event processing product for z/OS. Terms of the transaction were not disclosed.

Posted June 05, 2017

Seagate Government Solutions, a U.S.-based subsidiary of Seagate, a provider of storage solutions, and Carahsoft Technology Corp., a government IT solutions provider, have formed a partnership to distribute Seagate Secure Self-Encrypting Drives (SEDs) to government agencies. Under the agreement, Carahsoft will serve as a public sector distributor for Trade Agreements Act (TAA) compliant Seagate Secure Self-Encrypting Drives (SEDs) with FIPS 140-2 Level 2 certification.

Posted June 05, 2017

IBM Security and Cisco announced they are working together to address the growing global threat of cybercrime. In a new collaboration, Cisco and IBM Security will work together across products, services and threat intelligence for the benefit of customers.

Posted June 05, 2017

In the last few years, a frequent topic of conversation within some of the largest corporations in the world has been the move to the cloud—how to prepare for it, how to address it, and how to benefit from it. Yet over the past several months, some are also talking about a more ambitious goal: to be cloud-only by 2025.

Posted June 01, 2017

There are two types of businesses in the world today: those that run on data and those that will run on data. Data security now sits at the top of nearly every organization's priority list. But with such a high volume of data coming into most businesses every day, how can information security professionals quickly identify which data is the highest priority for protection? After all, security costs time and money, and not all types of data are as sensitive or vulnerable as others.

Posted June 01, 2017

Often, technical people get excited over the latest and greatest piece of software or hardware. They try to get their management excited too, but it seems as if their management just doesn't "get" how good it is.

Posted June 01, 2017

Although Java and JavaScript are the most popular all-around programming languages today, the C programming language remains the language of choice for high performance computing after almost 45 years of mainstream use. However, where runtime performance considerations are paramount, Go and Rust are emerging as valid successors to C.

Posted June 01, 2017

Resources used to be expensive. Resources used to be scarce. Resources used to take a long time to provision. As such, it made sense to put resource consumption at the top of the list when talking about database performance. Those days are gone. With more than 80% of databases running in virtual environments, where hardware is more commoditized every day, access to physical resources—CPU, memory, network, and disk—whenever needed is much easier. In fact, Moore's Law predicts that transistor density will double roughly every 2 years. Well, most physical resources are certainly on pace with that, or better.

Posted June 01, 2017

Cloudera has launched Altus, a new PaaS offering aimed at making it easier to run large-scale data processing applications on public cloud. The initial Altus service helps data engineers use on-demand infrastructure to speed the creation and operation of elastic data pipelines that power sophisticated, data-driven applications.

Posted May 31, 2017

Pythian, a technology services provider, is launching a customized analytics solution that integrates multiple data types from both internal and external sources. The new solution, "Kick Analytics As A Service" (Kick AaaS), gathers multi-source, multi-format data together in the cloud, and adds advanced analytics, machine learning, and visualizations to ensure business users and business systems get the insights they need when they need them.

Posted May 31, 2017

Software AG has announced its GDPR Framework to help organizations address the impending General Data Protection Regulation (GDPR) which takes effect on May 25, 2018. The European Union (EU) GDPR regulation mandates that all companies and institutions are legally bound to rules aimed at protecting personal data and for upholding the data privacy rights of individuals residing in the EU.

Posted May 30, 2017

Hewlett Packard Enterprise (HPE) has released a flash portfolio update with new products and data protection solutions. According to HPE, the adoption of flash storage continues to gain pace, with 51% of customers predicting that they will have an All-Flash Data Center within 5 years, while IT teams seek deeper integration across servers, storage, networks, and automation tools to maximize value from investments.

Posted May 25, 2017

Businesses need to act now to prepare for the arrival of EU GDPR compliance regulations, or risk being among the first to be penalized when the regulations take effect in 12 months' time, according to Commvault, which contends that corporate complacency is one of the biggest barriers to GDPR compliance, with many organizations yet to implement suitable processes or technology.

Posted May 24, 2017

IBM has introduced a toolkit on Power Systems optimized for open source databases, including MongoDB, PostgreSQL, MySQL, MariaDB, Redis, Neo4j, and Apache Cassandra, to help deliver more speed, control, and efficiency for enterprise developers and IT departments.

Posted May 24, 2017

Quest Software, a global systems management and security software provider, is announcing enhancements to its portfolio of SQL Server database management and performance monitoring solutions. "We are announcing multiple enhancements to address the Microsoft SQL Server community and customer base," said Greg Davoll, software products leader, Quest Software. Quest as a company is focused on three sets of capabilities for customers: database performance monitoring; database replication; and database development and management tooling, and these new product enhancements reflect that focus, he added.

Posted May 24, 2017

Informatica, a provider of solutions for enterprise data management, has unveiled a metadata-driven artificial intelligence technology called "CLAIRE" with the latest release of the Informatica Intelligent Data Platform.

Posted May 23, 2017

SignalFx, a provider of monitoring and operational intelligence solutions for the cloud, has announced the general availability of its latest release featuring new alerting capabilities that enable cloud operations teams to better monitor and manage cloud infrastructure, containers, and applications.

Posted May 22, 2017

SolarWinds, a provider of IT management software, has completed the acquisition of Scout Server Monitoring.

Posted May 22, 2017

CA Technologies has added new data protection enhancements to CA Data Content Discovery and CA Compliance Event Manager that are designed to simplify security management across the enterprise and enable end-to-end protection for data-in-motion from mobile to mainframe.

Posted May 22, 2017

IBM and Nutanix have formed a multi-year initiative to bring new workloads to hyperconverged deployments. The integrated offering aims to combine Nutanix's Enterprise Cloud Platform software with IBM Power Systems, to deliver a turnkey hyperconverged solution targeting critical workloads in large enterprises. The partnership is aimed at delivering a full-stack combination with built-in AHV virtualization for a simple experience within the data center.

Posted May 22, 2017

Syncsort, an analytics platform provider, has announced new integration between its Big Iron to Big Data powerhouse Ironstream, and Compuware's Application Audit software that delivers real-time machine data to Splunk Enterprise Security for Security Information and Event Management. The new integration is intended to help organizations detect threats against mainframe data, correlate them with related information and events, and satisfy compliance requirements.

Posted May 22, 2017

Continuing its Cloud Platform expansion, Oracle is adding enhancements to make it easier for organizations to move enterprise database applications to the cloud.

Posted May 17, 2017

As DBAs, we can get mired in the depths of performance tuning parameters and scripts, sometimes getting lost in all the details. It is a good idea to always have a set of goals and philosophies that you can lean on to keep you focused and working on the appropriate things. That is what I want to talk about in this month's DBA Corner column: some high-level rules of thumb for achieving your DBMS-related performance goals and maintaining your sanity.

Posted May 17, 2017

In October of 2008, Congress enacted the Emergency Economic Stabilization Act, more commonly known as the bailout of the financial system. The understanding was that catastrophic financial consequences would result from the failure of these entities and that those aggregate failures could devastate the U.S. The recent major outages in public cloud services inevitably lead to the same issue being considered with regard to this new industry.

Posted May 15, 2017

When people talk about the next generation of applications or infrastructure, what is often echoed throughout the industry is the cloud. On the application side, the concept of "serverless" is becoming less of a pipe dream and more of a reality. The infrastructure side has already proven that it is possible to deliver the ability to pay for compute on an hourly or more granular basis.

Posted May 15, 2017

You will often hear experienced practitioners and consultants suggest that there is both an art and a science to effective data governance. The art is in the details of fine-tuning a data governance program to fit your culture and address specific business needs. But the fundamental principles of data governance are best understood and executed through science.

Posted May 15, 2017

The tide is changing for analytics architectures. Traditional approaches, from the data warehouse to the data lake, implicitly assume that all relevant data can be stored in a single, centralized repository. But this approach is slow and expensive, and sometimes not even feasible, because some data sources are too big to be replicated, and data is often too distributed to make a "full centralization" strategy successful.

Posted May 15, 2017

Data—now universally understood to be the lifeblood of businesses—is at risk like never before in the form of both malicious attacks and innocent indiscretions. Recently, Steve Grobman, CTO for McAfee, discussed the range of threats to data security and what companies must do to defend themselves.

Posted May 15, 2017
