
Data Center Management

Mainframes continue to represent the strong core of Data Center technology, while Virtualization and Cloud are also emerging as important technologies to support efficiency, scalability, and cost containment. Topics critical to Data Center Operations and Computing include hardware and software for Storage, Consolidation, High Availability, Backup & Recovery, and IT Optimization, as well as automated tools that help compensate for the growing Mainframe Skills Shortage.



Data Center Management Articles

Time series databases are optimized for collecting, storing, retrieving, and processing time series data. It's critical that businesses use a time series database for time series data and not one of the traditional data stores. DBTA recently held a webinar with Daniella Pontes, product manager, InfluxData, who discussed how Time Series Databases are built with specific workloads and requirements in mind, including the ability to ingest millions of data points per second.
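To make that concrete, here is a toy sketch (plain Python, not InfluxData's API) of one access pattern purpose-built time series databases optimize: keeping points ordered by timestamp so time-range queries use binary search instead of full scans. All names and data below are invented for illustration.

```python
import bisect
from typing import List, Tuple

class TinyTimeSeries:
    """A toy append-mostly time series store. Points are kept sorted
    by timestamp so range queries can use binary search -- one of the
    access patterns purpose-built TSDBs are engineered around."""

    def __init__(self) -> None:
        self._ts: List[float] = []    # sorted timestamps
        self._vals: List[float] = []  # values aligned with self._ts

    def insert(self, ts: float, value: float) -> None:
        # Insert while preserving timestamp order.
        i = bisect.bisect_right(self._ts, ts)
        self._ts.insert(i, ts)
        self._vals.insert(i, value)

    def range(self, start: float, end: float) -> List[Tuple[float, float]]:
        # Return all (timestamp, value) pairs with start <= ts <= end.
        lo = bisect.bisect_left(self._ts, start)
        hi = bisect.bisect_right(self._ts, end)
        return list(zip(self._ts[lo:hi], self._vals[lo:hi]))

series = TinyTimeSeries()
for t, v in [(1.0, 10.0), (2.0, 12.0), (3.0, 11.0), (5.0, 15.0)]:
    series.insert(t, v)
window = series.range(2.0, 4.0)  # -> [(2.0, 12.0), (3.0, 11.0)]
```

A real TSDB adds compression, retention policies, and ingest paths tuned for millions of points per second; a general-purpose store offers none of these by default, which is the webinar's point.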

Posted June 17, 2019

IBM has announced Rapid Move for SAP S/4HANA, an approach designed to help accelerate the process of migrating existing SAP systems to SAP S/4HANA.

Posted June 12, 2019

Volkswagen joined an open industry collaboration for the responsible sourcing of strategic minerals that will use blockchain technology to increase efficiency, sustainability and transparency in global mineral supply chains. The collaboration will enable the Volkswagen Group to gain greater insight into the provenance of cobalt used in lithium-ion batteries for electric vehicles and other types of minerals used elsewhere in the production of vehicles.

Posted June 12, 2019

MariaDB is releasing MariaDB Enterprise Server 10.4, code-named "Restful Nights" for the peace of mind it brings enterprise customers. The new MariaDB Enterprise Server includes added functionality for enterprises running MariaDB at scale in production environments, involves new levels of testing, and is shipped in a configuration that is secure by default.

Posted June 11, 2019

Databricks, a provider of analytics software founded by the original creators of Apache Spark, and Booz Allen Hamilton, a provider of machine learning services to the U.S. Federal Government, are helping federal agencies scale their data analytics capabilities and accelerate AI initiatives across on-premise, hybrid, and cloud environments.

Posted June 11, 2019

WANdisco, the LiveData company, is launching LiveMigrator, a non-blocking technology that migrates petabytes of unstructured data from on-premise data centers to any cloud vendor in one pass. Rather than implementing costly and risky multiple passes to migrate data, which can take three to six months and block users from making changes, applications can continue to access the on-prem environment - even as data moves to the cloud - with users directing new workloads or queries at cloud assets.

Posted June 11, 2019

AI, machine learning, and predictive analytics are used synonymously by even the most data-intensive organizations, but there are subtle, yet important, differences between them. Machine learning is a type of AI that enables machines to process data and learn on their own, without constant human supervision. Predictive analytics uses collected data to predict future outcomes based on historical data.
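As a minimal illustration of the predictive analytics side of that distinction, the sketch below fits a least-squares trend line to historical figures and projects the next period. The data and function names are invented for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical historical data: sales by month.
months = [1, 2, 3, 4, 5]
sales = [100.0, 110.0, 120.0, 130.0, 140.0]

slope, intercept = fit_line(months, sales)
forecast = slope * 6 + intercept  # predicted outcome for month 6
```

The key point of the distinction survives even in this toy: the model is built entirely from collected historical data and used only to estimate a future outcome, with no self-directed learning loop.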

Posted June 10, 2019

At times, there is a need to have security within the database be a bit more sophisticated than what is available. On specific tables, there may be a need to limit access to a subset of rows, or a subset of columns, for specific users. Yes indeed, views have always existed, and yes indeed, views can be established limiting the rows or columns displayed. However, views can only go so far.
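For readers who want the baseline, here is a small sketch of the view-based approach the column describes, using an in-memory SQLite database; the table, column, and view names are invented. In a DBMS that supports GRANT, access would be granted on the view rather than the base table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE employee (
    id INTEGER PRIMARY KEY, name TEXT, dept TEXT, salary REAL)""")
conn.executemany("INSERT INTO employee VALUES (?, ?, ?, ?)", [
    (1, "Ada",   "ENG", 95000.0),
    (2, "Grace", "ENG", 105000.0),
    (3, "Edgar", "HR",  70000.0),
])

# The view hides the salary column and restricts rows to one
# department -- users querying the view never see the rest.
conn.execute("""CREATE VIEW eng_roster AS
    SELECT id, name FROM employee WHERE dept = 'ENG'""")

rows = conn.execute("SELECT name FROM eng_roster ORDER BY id").fetchall()
```

The limitation the column alludes to is visible here too: a view is a fixed filter, so per-user row-level rules require either one view per audience or a more sophisticated mechanism built into the database itself.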

Posted June 10, 2019

What are the practices and procedures that you have found to be most helpful to automate in administering your databases? Yes, I know that automation has been a standard claim for most DBMS vendors, as well as third-party DBA tool vendors, for many years. But are you really any closer to an "on demand," "lights-out," "24/7" database environment yet?

Posted June 10, 2019

Quantum computing will bring unprecedented advances in medicine, science, and mathematics—knowledge currently out of reach. Many secrets of the universe are on the verge of discovery. But, are we ready for everything to be unlocked? Are we prepared to manage what comes with quantum computing's limitless architecture?

Posted June 10, 2019

IT executives and their business counterparts understand the importance of a strong data strategy and its value to their businesses, and most are starting to get the key pieces in place to drive transformation through next-generation technologies and processes. One in four enterprises now regards real-time data as critical to their ongoing operations—and another one in four is actively preparing to introduce real-time data capabilities into their infrastructures, according to a new Unisphere Research survey.

Posted June 10, 2019

Red Hat Enterprise Linux 7.7 beta is now available. The beta release was announced in a Red Hat blog post by Chris Baker. The latest update to the stable and more secure Red Hat Enterprise Linux 7 platform marks the final release in the Full Support Phase (formerly known as "Production Phase 1") of the RHEL 7 lifecycle as described in the Red Hat Enterprise Linux Lifecycle.

Posted June 10, 2019

The Open Mainframe Project (OMP), an open source initiative that enables collaboration across the mainframe community to develop shared tool sets and resources, announced the launch of this year's internship program with nine students from around the globe. Each intern will be paired with mentors from member organizations such as Red Hat, IBM, Sine Nomine Associates, and SUSE, who have designed projects to address specific mainframe development or research challenges.

Posted June 10, 2019

SnapLogic, provider of an Intelligent Integration Platform, is now a certified CoupaLink Technology Partner after successfully completing the certification to integrate with the Coupa Business Spend Management (BSM) Platform. SnapLogic's new Coupa connector uses Coupa's REST-based APIs to enable seamless data flow to and from the Coupa BSM Platform.

Posted June 07, 2019

Data lake adoption is on the rise. Right now, 38% of DBTA subscribers have data lakes deployed to support data science, data discovery and real-time analytics initiatives, and another 20% are considering adoption. Today, most data lakes are on-premises. However, the cloud is becoming an increasingly attractive location as well. While data lakes have evolved and matured over the past few years of enterprise use, many challenges still exist.

Posted June 06, 2019

Arcserve, a data protection provider, is updating the Arcserve Replication and High Availability (RHA), adding full system high availability for Linux. With this release the platform also extends its full system support of Windows and Linux workloads to Azure, and provides a number of enhancements including performance and usability improvements and new platform certifications.  

Posted June 06, 2019

DBTA's next Data Summit conference will be held May 19-20, 2020, with pre-conference workshops on Monday, May 18. The conference will return to the Hyatt Regency Boston.

Posted June 06, 2019

AtScale, the data warehouse virtualization company, is releasing AtScale 2019.1, featuring updates such as native multi-dimensional support for Microsoft Power BI, data warehouse platform support for Teradata and PostgreSQL, and more. Following recent innovations announced in Q1 2019, including AtScale's integration with Snowflake's cloud-built data warehouse, AtScale 2019.1 further enables enterprises to modernize operational analytics and business intelligence data across on-premise, hybrid-cloud, and multi-cloud deployments.

Posted June 06, 2019

Many enterprises running Oracle as their database of record are looking to migrate their implementations to the public cloud, to escape from the age-old problems with in-house system deployment: large capital outlays, time consuming infrastructure deployment and management, and lack of elasticity and adaptability. The benefits of migrating to a cloud environment are compelling, and the message is getting through to enterprises: According to a recent cloud usage survey conducted by data virtualization company Denodo, 36% of organizations are currently in the process of migrating their data infrastructure to the cloud, while nearly 20% are in advanced stages of implementation.

Posted June 05, 2019

Snowflake, the data warehouse built for the cloud, is introducing the Snowflake Data Exchange, an all-new shared data experience. The Data Exchange is a free-to-join marketplace that enables Snowflake customers to connect with data from providers to seamlessly discover, access, and generate insights from data.

Posted June 05, 2019

Microsoft and Oracle have announced a cloud interoperability partnership to enable customers to migrate and run mission-critical enterprise workloads across Microsoft Azure and Oracle Cloud. According to Don Johnson, executive vice president, Oracle Cloud Infrastructure, with this partnership, joint Oracle and Microsoft customers will be able to migrate their entire set of existing applications to the cloud without having to re-architect anything, and have the ability to preserve the large investments they have already made.

Posted June 05, 2019

Opengear, a provider of solutions to critical IT infrastructure, will be unveiling several new product features that empower customers to improve the reliability of their network. Opengear continues to focus on network resilience—keeping the network running at the core and out to the edge of the infrastructure, with no disruption to the customer experience in the event of network issues.

Posted June 04, 2019

Trifacta, a leader in data preparation, is releasing a new native integration for Snowflake, the data warehouse built for the cloud. Trifacta's visual and machine learning guided interface empowers data teams to collaboratively explore, clean, structure and enrich data at the scale and agility provided by Snowflake.

Posted June 04, 2019

Syncsort is expanding its geographic presence with the opening of offices in Pune and Bengaluru, India. Research, development, and support teams at these locations will initially focus on supporting Syncsort's strategic partnership in B2B integration software, and will also play a central role in accelerating momentum with customers in Asia Pacific and advancing overall company strategy.

Posted June 04, 2019

SolarWinds, a provider of IT management software, is adding a broad refresh to its network management portfolio, boosting network security features and more. The addition of Network Insight support for Palo Alto Networks and key enhancements to the SolarWinds Orion Platform bring greater data visibility and scalability to IT pros.

Posted June 04, 2019

Fivetran, which builds automated technology to help analysts replicate data into cloud warehouses, is revealing its new in-warehouse transformation product, Fivetran Transformations. Designed to bring simplified, fail-safe data transformations to Fivetran's automated data pipeline solution, the agile end-to-end tool enables data teams to execute SQL when new data arrives or on a schedule.

Posted June 03, 2019

Datical, a provider of database release automation solutions, has been granted a patent for its Database Change Management Simulator. The patented technology proactively forecasts and determines the impact of database schema and stored procedure changes before they are deployed.

Posted May 31, 2019

OwnBackup, a backup and recovery vendor, is closing a $23.25 million Series C round of financing that will be used to meet escalating market demand for the company's solutions. This round is being co-led by Insight Venture Partners and Vertex Ventures. Existing investors Innovation Endeavors, Oryzn Capital and Salesforce Ventures also participated in the round.

Posted May 30, 2019

Pivotal Software, Inc., a cloud-native platform provider, is releasing Pivotal Spring Runtime, a comprehensive support package for Java environments. Pivotal Spring Runtime supports Java workloads running on Linux and Windows server environments including server-side apps running on bare metal, VMs, containers, or Kubernetes. Pivotal Application Service includes Pivotal Spring Runtime, so Java workloads running on the app platform are already covered.

Posted May 28, 2019

JetStream Software Inc., a provider of cloud data protection, is receiving Series A funding of $7.7 million to advance the company's unique technologies. The company's platform enables managed service providers (MSPs) and cloud service providers (CSPs) to deliver disaster recovery as a service (DRaaS) and continuous data protection to enterprise customers, government agencies, and research and educational organizations.

Posted May 28, 2019

Broadcom has introduced a value-based software licensing model designed to provide increased clarity regarding software consumption on z/OS mainframe systems. The new Mainframe Consumption Licensing (MCL) model is intended to give mainframe customers better visibility and predictability into their software spending while enabling them to maximize the value created for their end customers.

Posted May 28, 2019

Following up on IBM's announcement of Tailored Fit Pricing for IBM Z, John McKenny, VP of Strategy for ZSolutions Optimization at BMC, commented on the significance of the new offerings in a BMC blog post. BMC supports the new Tailored Fit Pricing model because it will help an increasing number of enterprises continue to grow and build new services on top of their mainframes as well as re-ignite interest in using the platform as an integral part of today's hybrid cloud strategies, McKenny noted.

Posted May 28, 2019

IBM has unveiled new services and capabilities for IBM Z, which, the company says, will further position it as a center point of a secured hybrid cloud strategy. According to Ross Mauri, general manager for IBM Z, who made the announcement in an IBM Infrastructure blog post, the new offerings include three major elements. IBM sees "secured hybrid and multicloud as the future of enterprise IT, and IBM Z is at the center," said Mauri.

Posted May 28, 2019

NetApp, a provider of hybrid cloud data services, is releasing NetApp ONTAP 9.6, the new midrange, end-to-end NVMe AFF A320 storage system and an expanded portfolio of services.

Posted May 28, 2019

Dynatrace, a software intelligence company, is providing support for Red Hat OpenShift 4, the next generation of Red Hat's enterprise Kubernetes platform. Dynatrace's Software Intelligence platform automatically monitors and analyzes containers and the microservices running inside of them across the entire Red Hat OpenShift 4 Kubernetes environment and underlying multi-cloud infrastructure with no blind spots.

Posted May 28, 2019

Red Hat is contributing to Microsoft KEDA, a new open source project aimed at providing an event-driven scale capability for any container workload. Using KEDA, Red Hat puts Azure Functions on top of its OpenShift Container Platform in Developer Preview. It is designed to behave the same way it does when running on Azure as a managed service, but now running anywhere OpenShift runs, which means on the hybrid cloud and on-premises.

Posted May 28, 2019

It can be challenging for IT architects and executives to keep up with today's modern IT infrastructure. Homogeneous systems, common in the early days of computing, are almost non-existent today in the age of heterogeneous systems. It is de rigueur for Linux, Unix and Windows servers to be deployed throughout a modern IT infrastructure. And for larger shops, add in mainframes, too.

Posted May 28, 2019

Due to exponentially growing data stores, organizations today are facing slowdowns and bottlenecks at peak processing times, with queries taking hours or days. Some complex queries simply cannot be executed. Data often requires tedious and time-consuming preparation before queries can be run. David Leichner, CMO, SQream, demonstrated how the power of GPUs can help conquer these challenges with his presentation, "Accelerating Analytics in a New Era of Data," during Data Summit 2019.

Posted May 24, 2019

DataOps is still a relatively new concept; it combines people, processes, and technology to build and enhance data analytics. At Data Summit 2019, Kevin Petrie, senior director of marketing, Attunity, Inc., explored the five key steps necessary to be successful with DataOps, including the process and cultural shift required, during his session, "Five Key Requirements for DataOps Success."

Posted May 24, 2019

As Data Summit 2019 comes to a close, John O'Brien, principal advisor and CEO, Radiant Advisors, looked at lessons learned from this year's conference during his closing keynote. Companies are not lacking in technology options; in most cases, more advanced technologies exist than can be absorbed into the organization all at once.

Posted May 23, 2019

The ability for knowledge graphs to amass information and relationships and connect facts is showing potential for a range of use cases. Bob Kasenchak, director of business development, Access Innovations, Inc., USA, discussed the rise of knowledge graphs in his presentation, "From Structured Text to Knowledge Graphs: Creating RDF Triples From Published Scholarly Data" at Data Summit 2019.

Posted May 23, 2019

Syncsort, the provider of Big Iron to Big Data software, is introducing Connect CDC, a new real-time change data capture and data replication product that enables organizations to stream application data. By making all mission-critical application data accessible in real time, organizations can empower business users across the enterprise to make decisions based on information when it's most relevant and valuable.

Posted May 23, 2019

Accenture and SAP are co-developing and co-innovating to accelerate the development of the SAP C/4HANA platform. The initiative, called Project Elevate, includes the formation of industry consortia with key market leaders among automotive original equipment manufacturers (OEMs), business-to-business component manufacturers, and utilities to help define and design these industry-specific experiences.

Posted May 22, 2019

SAP is introducing new SAP HANA Cloud services, aiming to bring the power and performance of the SAP HANA database to the cloud. SAP HANA Cloud Services and SAP Cloud Platform aim to provide customers access to all SAP and third-party application data, reduce data duplication and offer a single point for security and governance.

Posted May 22, 2019

Mindtree, a global technology services and digital transformation company, is launching QuikDeploy, an IP-driven approach to helping customers maximize their use of the SAP Solution Manager. This flexible industry accelerator is tailor-made to rapidly deploy SAP S/4HANA into Microsoft Azure cloud.

Posted May 22, 2019

When it comes to cloud technology, more and more businesses are realizing the benefits that cloud can provide them and are beginning to seek more cloud computing options to conduct their business activities. And obviously, Amazon, Microsoft, Google, Alibaba, IBM, and Oracle plan to capture this spend by providing a dizzying array of IaaS and PaaS offerings to help enterprises build and run their services.

Posted May 22, 2019

As we move into a new computing era, technology has outstripped society in its ability to up-end labor markets and industries, as well as personal lives. In a panel discussion at Data Summit 2019, Sue Feldman, president, Synthexis, David Bayer, Executive Director, Cognitive Computing Consortium, Tom Wilde, CEO, Indico, and Steven Cohen, COO, co-founder, Basis Technology, explored the ethical and legal issues that computing advances have raised.

Posted May 22, 2019

New technologies are contributing to the speed and scale of the modern data platform. But as data size and complexity increase with big data, data quality and data integration issues must still be addressed. At Data Summit 2019, Prakriteswar Santikary, VP & global chief data officer, ERT, discussed how to create the architecture of a modern, cloud-based, real-time data integration and analytics platform that ingests any type of clinical data (structured, unstructured, binary, lab values, etc.) at scale from any data source, during his presentation, "Designing a Fast, Scalable Data Platform."

Posted May 22, 2019
