Data Center Management

Mainframes continue to represent the strong core of Data Center technology, while Virtualization and Cloud are also emerging as important technologies to support efficiency, scalability, and cost containment. Topics critical to Data Center Operations and Computing include hardware and software for Storage, Consolidation, High Availability, Backup & Recovery, and IT Optimization, as well as automated tools that help compensate for the growing Mainframe Skills Shortage.



Data Center Management Articles

VMware Virtual Volumes has gained increasing adoption from storage vendors since its release. Virtual Volumes provides distinct capabilities for virtualized Tier 1, business-critical Oracle workloads that have not been available with traditional storage.

Posted January 08, 2020

erwin, the data governance company, is releasing an updated version of erwin Data Modeler (erwin DM), introducing new innovations across its enterprise modeling and data governance suites.

Posted January 08, 2020

Accenture has agreed to acquire Symantec's Cyber Security Services business from Broadcom. The acquisition is subject to customary closing conditions and is expected to close in March 2020.

Posted January 07, 2020

vXchnge, a data-center-as-a-service (DCaaS) provider, announced that Tier4 Advisors is joining its channel program. Tier4 is a vendor agnostic IT sourcing, architecture, and strategy firm. Tier4 will offer vXchnge colocation and other DCaaS solutions throughout the company's service footprint. The partnership allows Tier4 to expand its service portfolio of integrated solutions to customers down to the platform level.

Posted January 07, 2020

There are many scripting languages out there, with the latest being Flux by InfluxData. Flux is a functional data scripting language designed to accommodate a wide array of data processing and analytical operations. DBTA recently held a webinar with Scott Anderson, technical writer, InfluxData, who discussed the origins of Flux, key concepts, basic Flux syntax and more.
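Flux's signature feature is its pipe-forward operator (`|>`), which chains each stage's output into the next. Below is a minimal pure-Python sketch of that idea; the records and stage functions are hypothetical illustrations, not InfluxData's API, and the Flux snippet in the comment is only an approximation of typical syntax.

```python
# Hypothetical records, standing in for rows returned from an InfluxDB bucket.
records = [
    {"_measurement": "cpu", "_value": 50.0},
    {"_measurement": "cpu", "_value": 70.0},
    {"_measurement": "mem", "_value": 90.0},
]

def pipe(data, *stages):
    """Apply each stage left to right, mimicking Flux's |> operator."""
    for stage in stages:
        data = stage(data)
    return data

cpu_mean = pipe(
    records,
    lambda rs: [r for r in rs if r["_measurement"] == "cpu"],  # like filter()
    lambda rs: sum(r["_value"] for r in rs) / len(rs),         # like mean()
)
print(cpu_mean)  # 60.0

# A comparable Flux query might read, roughly:
#   from(bucket: "example-bucket")
#     |> range(start: -1h)
#     |> filter(fn: (r) => r._measurement == "cpu")
#     |> mean()
```

The sketch shows why the functional style suits analytical pipelines: each stage is a small, composable transformation over the stream of records.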

Posted January 06, 2020

From data lakes and the cloud, to machine learning and artificial intelligence, the world of big data and analytics continues to evolve. At the same time, the need for improved governance and security practices is also intensifying as data privacy concerns and new regulations like GDPR and CCPA require new approaches for responsible use of data.

Posted January 03, 2020

The big data space is constantly in flux. New and emerging technologies plan to disrupt reliable legacy solutions while others hope to work in harmony with tools in the market. What are the next explosive trends for 2020?

Posted January 03, 2020

When implementation efforts are not efficient, more often than not the inefficiencies are due to the interference of an imp known as "churn," i.e., implementation wheels spinning away without actually making progress. Churn is bad. Churn is one of the most destructive circumstances for any IT project, and it may raise its ugly head at any point where a project requirement or need is left unclear.

Posted January 02, 2020

One of the great challenges with catastrophic events is that they can come from any number of sources, so business owners must make sure they're prepared for all types of disasters. Preparation is the only way to avert disasters and ensure your operations will continue without significant disruption.

Posted January 02, 2020

With the closing of one year comes another and this time a whole new decade awaits. Every year Database Trends and Applications magazine looks for offerings that promise to help organizations derive greater benefit from their data, make decisions faster, and do so with higher levels of security. Key data management trends have emerged that are shaping the capabilities of IT products and services for 2020 and beyond.

Posted January 02, 2020

2019 was a banner year for cybersecurity crime, with hackers targeting consumers, government agencies, and private corporations alike. According to the Ponemon Institute, the average total cost of a data breach is $3.86 million, and 80% of U.S. businesses expect to have suffered a critical breach this year. These numbers are not only sizable; they're alarming.

Posted December 30, 2019

Many organizations are experimenting with AI programs, but most of them face a significant and seemingly intractable problem. Although proof-of-concept (POC) projects and minimum viable products (MVPs) may show value and demonstrate a potential capability, frequently, they are difficult to scale.

Posted December 30, 2019

The cloud was on everyone's mind this past year, with questions ranging from how to secure cloud environments to what type of cloud is best for the organization. Cloud computing has revealed countless new dimensions to IT. There are public clouds, private clouds, distributed clouds, and hybrid, multi-cloud architectures.

Posted December 24, 2019

The Internet of Things has connected different facets of the enterprise in ways that were previously only imagined. Now companies and customers demand to see results and understand those results in real time or near-real time. IoT can connect enterprise assets to gain real-time insights, which improve decision making, drive efficiency, empower employees, and create better customer experiences. Here, executives of leading companies offer 6 predictions for what's ahead in 2020 for the IoT space.

Posted December 24, 2019

For many, governance remains a dirty word: It's bureaucratic, restrictive, and slows things down. This perception is diametrically opposed to data governance's true objective, which is to enable rapid yet appropriate exploitation of enterprise data assets.

Posted December 23, 2019

There's no doubt that AI has taken center stage in the enterprise data and analytics world, as evidenced by the mass quantities of related headlines, conferences, and vendor marketing. But hype aside, business executives are now discovering how to leverage AI for improved decision making through augmented or assistive intelligence solutions, and for competitive advantage in new products and services. AI is proving its viability in the real world, including in enterprise data and analytics.

Posted December 23, 2019

In 2019, artificial intelligence and machine learning continued their upward trajectory in the market, promising to change the future as we know it. To help support data management processes and decision making, artificial and augmented intelligence is being infused into products and services. Machine learning sits at the center of all AI conversations, as combining machine learning with AI and cognitive technologies can make it even more effective in processing large volumes of information. Both technologies can lead to automation of tasks inside and outside the enterprise, another subject that promises to make waves in the future. Here, executives of leading companies offer 10 predictions for what's ahead in 2020.

Posted December 20, 2019

Dotscience, a provider of DevOps for Machine Learning (MLOps) solutions, is forming partnerships with GitLab and Grafana Labs, along with strengthening integrations with several platforms and cloud providers. The company is deepening integrations to include Scikit-learn, H2O.ai and TensorFlow; expanding multi-cloud support with Amazon Web Services (AWS) and Microsoft Azure; and entering a joint collaboration with global enterprises to develop an industry benchmark for helping enterprises get maximum ROI out of their AI initiatives.

Posted December 18, 2019

Unravel Data, a data operations platform providing full-stack visibility and AI-powered recommendations, is joining the Amazon Web Services (AWS) Partner Network (APN) Global Startup Program. The Unravel platform is designed to accelerate the adoption of big data workloads in AWS. By supporting Amazon EMR, Unravel allows users to connect to a new or existing Amazon EMR cluster with just one click.

Posted December 18, 2019

Magnitude Software, a provider of unified application data management solutions, is updating Magnitude SourceConnect to help accelerate implementations of SAP S/4HANA for central finance foundation. Leveraging Magnitude's pre-built solution with SAP S/4HANA for central finance foundation, businesses may be able to achieve full finance transformations faster than ever before.

Posted December 18, 2019

SAP is forming a partnership with project44, a global leader in advanced visibility for shippers and logistics service providers, to change the way shippers manage the delivery process. The companies plan to offer a joint solution for key transportation processes, such as receiving and tracking, that will connect, automate, and provide visibility data and insight from project44 within SAP Logistics Business Network.

Posted December 18, 2019

The next decade is just around the corner and enterprises in and around the big data space are preparing to pounce upon the next set of trends the new year will bring. The cloud is primed to continue making waves, along with other digital disruptions to improve user experiences. Several industry experts from SAP have offered up what they see as the top trends for 2020. 

Posted December 18, 2019

Syniti, a global data management solution provider, is releasing the Syniti Data Replication 9.6 software solution, a flexible platform that provides fast up-to-the-minute data replication with real-time change data capture. Available on-premise or in the cloud, Syniti Data Replication enables successful management of data growth, faster synchronization and integration, and analysis of data. 

Posted December 18, 2019

SolarWinds, a provider of IT management software, has acquired VividCortex, a provider of SaaS-delivered database performance management with an emphasis on databases commonly used in cloud-native applications. SolarWinds plans to add the VividCortex product to its IT operations management portfolio beginning in Q4 2019. The SaaS-based offering will complement SolarWinds' Database Performance Analyzer (DPA), an on-premise and cloud-deployed product.

Posted December 18, 2019

Oracle business strategist Lee Levitt laments the lack of data-driven organizations in today's business world in this clip from his keynote at Data Summit 2019.

Posted December 18, 2019

AppNeta, a provider of network performance monitoring, has added product updates. The latest round of improvements provides multi-path visibility into the largest, most complex, and most dynamic networks in the world, along with new capabilities for API monitoring, easier integration, advanced configuration, and enhanced route alerting, delivering faster time to value across the entire customer base.

Posted December 17, 2019

Ontotext is updating its signature platform with new GraphQL interfaces to make it easier for application developers to access knowledge graphs without tedious development of back-end APIs or complex SPARQL. The underlying Semantic Object service implements an efficient GraphQL to SPARQL translation optimized for GraphDB, as well as a generic configurable security model.

Posted December 17, 2019

It is said that form follows function in architecture, and we are seeing something similar in IT, where job titles follow current trends. So move over data scientist. The "contextualist" is here. We have a lot of shifts in IT, and today we are at the dawn of Industrial Revolution Number Four. This revolution started in the communications industry about a decade ago, with advances in mobile technology that helped disrupt the traditional media world by offering online news and entertainment.

Posted December 16, 2019

The big data ecosystem has changed. No longer is it true that people just want to store massive quantities of data. Action is not only needed but must be taken to sustain the viability of an organization. While some say big data is dead, the concept isn't going anywhere. Instead, it is the notion of inaction on big data that is dead. In addition, the technologies that were built to store and process big data are not loved by the industry; they are merely tolerated and often maligned. They have been difficult to put into production, maintain, and manage, and it is hard to find people with the skills to do the work.

Posted December 16, 2019

Accenture is acquiring Clarity Insights, a U.S.-based data consultancy with deep data science, artificial intelligence (AI), and machine learning (ML) expertise. The acquisition will add nearly 350 employees, along with a strong portfolio of accelerators, which can help organizations more quickly realize value from their data, to Accenture's Applied Intelligence business. These additions will further equip clients with leading capabilities to meet the growing demand for enterprise-scale AI, analytics, and automation solutions.

Posted December 13, 2019

As the urgency to compete on analytics continues to revolutionize the business world, more and more organizations are moving their data to the cloud to reduce infrastructure costs, increase efficiencies and improve time-to-value. At the same time, there are many success factors to consider, from the strengths and weaknesses of different cloud providers, to integration hurdles, data latency challenges and governance problems.

Posted December 12, 2019

Dynatrace announced the company has created Keptn, an open source pluggable control plane to advance the industry movement toward autonomous clouds. Keptn provides the automation and orchestration of the processes and tools needed for continuous delivery and automated operations for cloud native environments.

Posted December 12, 2019

dotData, focused on delivering full-cycle data science automation and operationalization for the enterprise, has achieved Advanced Technology Partner status in the Amazon Web Services (AWS) Partner Network (APN). Achieving APN Advanced Technology Partner status is recognition of dotData's ability to deliver data science automation and machine learning (ML) automation on AWS.

Posted December 12, 2019

Google Cloud has developed a new software service that helps organizations accomplish large-scale, online data transfers. The goal is to help take the complexity out of data transfers and move data faster than existing online tools like gsutil. "In migrating to the cloud, enterprises can realize the full value of their data—building new apps and experiences faster and accessing analytics and machine learning solutions to generate new insights," Ash Ahluwalia, product manager at Google Cloud, told 5 Minute Briefing.

Posted December 12, 2019

GigaSpaces, the provider of the InsightEdge Platform, is releasing GigaSpaces Version 15.0, including updates to the InsightEdge Platform and XAP, to operationalize and optimize machine learning. GigaSpaces Version 15.0 powers machine learning operations (MLOps) initiatives, helping enterprises maximize the business value derived from big data.

Posted December 11, 2019

Oracle OpenWorld, which has for many years been held in San Francisco during the month of September or October, is moving to Las Vegas. "Oracle is excited to offer a modern, state-of-the-art experience for attendees at Oracle OpenWorld and Code One 2020 in Las Vegas," an Oracle spokesperson said in a written statement. "The city and its vast amenities are tailor-made for hosting large-scale events, and we look forward to bringing the industry's most comprehensive technology and developer conference to America's premier hospitality destination."

Posted December 11, 2019

A.M. Turing Award Laureate and database technology pioneer Michael Stonebraker delivered a welcome keynote at Data Summit 2019. He discussed the "fly in the ointment to the data warehouse crowd."

Posted December 11, 2019

Processing big data in real-time for artificial intelligence, machine learning, and the Internet of Things poses significant infrastructure challenges. Whether it is for autonomous vehicles, connected devices, or scientific research, legacy NoSQL solutions often struggle at hyperscale. They have been built on top of existing RDBMSs and tend to strain when analyzing and acting upon data at hyperscale: petabytes and beyond.

Posted December 09, 2019

Prior to 2008, whatever your database question, the answer was always "Oracle" (or sometimes "MySQL" or "SQL Server"). The relational database management system (RDBMS) model, really a triumvirate of technologies combining Codd's relational data model, the ACID transaction model, and the SQL language, dominated database management systems completely.
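The ACID leg of that triumvirate is easy to demonstrate. The sketch below uses Python's built-in `sqlite3` module (a convenient stand-in for any RDBMS; the accounts schema is hypothetical) to show atomicity: a failed multi-step transfer rolls back entirely rather than leaving a half-applied debit.

```python
import sqlite3

# In-memory database standing in for a production RDBMS (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")
conn.commit()

try:
    with conn:  # opens a transaction; commits on success, rolls back on error
        conn.execute(
            "UPDATE accounts SET balance = balance - 80 WHERE name = 'alice'"
        )
        raise RuntimeError("simulated failure before the credit step")
except RuntimeError:
    pass  # the debit above was rolled back, not half-applied

alice_balance = conn.execute(
    "SELECT balance FROM accounts WHERE name = 'alice'"
).fetchone()[0]
print(alice_balance)  # 100: atomicity left the account untouched
```

This all-or-nothing guarantee, combined with the relational model and SQL, is much of what made the RDBMS answer so dependable for so long.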

Posted December 09, 2019

Compuware has announced the findings of a global survey of 400 IT leaders, which reveals that manual testing practices, still in widespread use, are one of the biggest challenges large organizations face as they attempt to accelerate digital innovation. The survey examined the processes that organizations have in place to deliver innovation on the mainframe as quickly as in their distributed environments—which are highly reliant on the mainframe.

Posted December 09, 2019

IBM mainframe customers would prefer a single unified authentication system that lets users securely access both mainframe and non-mainframe applications, a new survey suggests. The research finds that this would encourage more mainframe customers to move away from relying solely on passwords in favor of stronger protection such as multi-factor authentication (MFA). Mainframe professionals also recognize that it is easier and more cost effective to deploy the same MFA solution across mainframe and other environments such as Microsoft Windows.

Posted December 09, 2019

Syncsort has completed the acquisition of the Pitney Bowes software and data business, creating a data management software company with more than 11,000 enterprise customers, $600 million in revenue and 2,000 employees worldwide. The combined portfolio brings together capabilities in location intelligence, data enrichment, customer information management and engagement solutions with data integration and optimization software.

Posted December 09, 2019

Sixgill, LLC, a provider of data automation and authenticity products and services, is offering Sixgill Integrity 1.0 for blockchain-enforced data authenticity. Sixgill Integrity fulfills the critical enterprise need for end-to-end, real-time data authenticity assurance with robust capabilities to monitor and guarantee the veracity of any data stream, including time-series data emitted by today's sensors in any form.

Posted December 09, 2019

Oracle has announced that Oracle Cloud has joined the IO500. "Our HPC file server solution opens up tremendous opportunities for manufacturing, energy, and other R&D-heavy customers," said Vinay Kumar, vice president, product management, Oracle Cloud Infrastructure. "Customers can tackle their toughest workloads at some of the best price-performance ratios in the market."

Posted December 04, 2019

Banzai Cloud, a cloud software startup, is releasing a secure application platform, Pipeline 2.0, now integrated with Cloud Fusion. Pipeline 2.0 is a hybrid multi-cloud application platform based on Cloud Native technologies that provides a productive foundation for containerized application deployment as well as integrated solutions for Day One and Day Two operations.

Posted December 04, 2019

KALEAO is rebranding itself as Bamboo Systems, a provider of transformative ARM server architecture designed to power the next generation of sustainable data centers. In addition to the rebrand, the company has closed a $4.5 million pre-Series A funding round led by Seraphim Capital. The new funding will be used to bring to market the only server designed to give hyperscale performance with ten times the density of today's Intel-based servers.

Posted December 03, 2019
