Data Center Management

Mainframes continue to represent the strong core of Data Center technology, while Virtualization and Cloud are also emerging as important technologies to support efficiency, scalability, and cost containment. Topics critical to Data Center Operations and Computing include hardware and software for Storage, Consolidation, High Availability, Backup & Recovery, and IT Optimization, as well as automated tools that help compensate for the growing Mainframe Skills Shortage.



Data Center Management Articles

Think administering z/VM is difficult? Think again. With IBM Wave for z/VM (IBM Wave), IT organizations can unleash the power of z/VM virtualization, improve productivity, simplify management, and accelerate the cloud journey with innovative technology that helps reduce the barriers to managing virtualized environments.

Posted May 21, 2015

IBM Research and SCA are partnering to create a shared service cloud for the municipalities of New York that is predicted to eliminate 25% of the government's IT budget by streamlining applications and connecting siloed municipalities. The IBM mainframe is the platform that New York trusts to host that cloud. In addition to its time-tested scalability, reliability and security, it offers the lowest total cost of ownership — supporting the state as it strives for reduced spending and a smarter, future-ready IT infrastructure.

Posted May 21, 2015

Challenged to do more with less, Dundee City Council identified IT as an area in which it could cut costs. The council consolidated its Oracle Database environment supporting critical services from distributed servers to an IBM zEnterprise BC12 platform, dramatically improving efficiency.

Posted May 21, 2015

IBM Business Partner L3C LLP focuses on three ways to help businesses across the UK increase profitability and reduce costs: consultancy, managed services and complementary IT resources. L3C deployed IBM System z servers running Linux to provide companies of any size — including small, midsized and very large enterprises — with scalable, cost-effective, high-performance cloud services.

Posted May 21, 2015

SAP has announced the cloud edition of SAP Business Suite 4 SAP HANA (SAP S/4HANA), which adds new simplifications and innovations across core business functions. In addition, the new release gives customers the opportunity to deploy real hybrid scenarios — combining on-premise and cloud solutions.

Posted May 20, 2015

Chartio, a cloud business intelligence service, is introducing a new solution called Data Stores for transforming and storing data for business intelligence in a more agile way. Data Stores will enable administrators to use Chartio's Data Pipeline to rapidly transform data and store it in the cloud, making it more useful and accessible for end users.

Posted May 20, 2015

In a recent DBTA webcast, Anil Kumar, senior product manager at Couchbase, tackled the topic of multi-dimensional scaling (MDS) within NoSQL database clusters. This method of scaling enables organizations to isolate specific services of their database cluster and scale each independently.

Posted May 20, 2015

As the technology behind data storage evolves at a rapid pace, more companies are looking to take advantage of the cloud. However, moving big data to the cloud is not without its share of problems.

Posted May 20, 2015

Oracle is collaborating with Mirantis to enable Oracle Solaris and Mirantis OpenStack users to accelerate application and database provisioning in private cloud environments via Murano, the application project in the OpenStack ecosystem.

Posted May 20, 2015

MapR Technologies, Inc., a provider of a distribution for Apache Hadoop, is including Apache Drill 1.0 in the MapR Distribution.

Posted May 19, 2015

Time is running out to vote for the 2015 DBTA Readers' Choice Awards. This year, there are more than 300 nominees across 29 categories.

Posted May 19, 2015

The shortage of skilled talent and data scientists in Western Europe and the U.S. has triggered the question of whether to outsource analytical activities. This need is further amplified by competitive pressure to reduce time to market and lower costs.

Posted May 19, 2015

As the excitement and opportunity provided by big data tools develop, many organizations find their big data initiatives originating outside existing data management policies. As a result, many concepts of formal data governance are either intentionally or unintentionally omitted as these enterprises race to ingest huge new data streams at a feverish pace in the hope of increased insight and new analytic value.

Posted May 19, 2015

Similar to the dot-com revolution, the Internet of Things is the culmination of radical advances in four core technology pillars.

Posted May 19, 2015

Business pressures, including cost reduction, scalability, and "just-in-time" application software implementation, are just some of the requirements prompting businesses to "cloudify" at least some aspect of their IT infrastructure.

Posted May 19, 2015

Data preparation is gaining considerable visibility as a distinct aspect of data management and analytics work.

Posted May 19, 2015

Hadoop is contributing to the success of data analytics. Anand Rai, IT manager at Verizon Wireless, examined the differences between traditional and big data approaches at Data Summit 2015 in a session titled "Analytics: Traditional Versus Big Data." The presentation, which was part of the IOUG track moderated by Alexis Bauer Kolak, education manager at the IOUG, showed how big data technologies are helping data discovery and improving the transformation of information and knowledge into wisdom.

Posted May 14, 2015

With the influx of big data solutions and technologies comes a bevy of new problems, according to Data Summit 2015 panelists Miles Kehoe, search evangelist at Avalon Consulting, and Anne Buff, business solutions manager for SAS best practices at the SAS Institute. Kehoe and Buff opened the second day of Data Summit with a keynote discussion focusing on resolving data conundrums.

Posted May 14, 2015

These days, managing a data center can be like working inside a pressure cooker. Virtualization, dynamic computing, cloud computing, big data, the Internet of Things—each major development turns up the heat, but budgets, staff, and skills often lag behind explosive growth in data center scale and complexity.

Posted May 14, 2015

Every now and then, somebody will raise the age-old question "How can I measure the effectiveness and quality of my DBA staff?" This can be a difficult question to answer. And it almost always hides the actual question that is begging to be asked, which is "How many DBAs do we need?"

Posted May 14, 2015

With all the cheerleading and the steady drumbeat of new features being released to Azure, it's easy to lose track of the many cool and valuable new features released in the on-premises version of SQL Server. One of the crown jewels of SQL Server, the cardinality estimator (CE), underwent a large redesign for SQL Server 2014 to improve performance. Cardinality estimates are an extremely important part of query processing. In a nutshell, cardinality estimates are what the relational engine predicts for the number of rows affected by a given operation, including intermediate row sets like those created by filters, aggregations, joins, and spools.
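As a rough illustration of the idea, a cardinality estimator combines table statistics (row counts, numbers of distinct values) with per-operator selectivity rules. The Python sketch below uses generic textbook formulas and made-up table sizes; it is not SQL Server's actual CE algorithm, and all names and numbers are hypothetical.

```python
# Toy cardinality estimation from simple column statistics.
# Assumes uniform value distributions, as basic optimizer models do.

def estimate_filter(table_rows, distinct_values):
    """Equality predicate: selectivity = 1 / distinct values,
    so the estimate is table_rows / distinct_values."""
    return table_rows / distinct_values

def estimate_join(rows_a, rows_b, distinct_a, distinct_b):
    """Equi-join estimate: |A| * |B| / max(distinct join keys)."""
    return rows_a * rows_b / max(distinct_a, distinct_b)

orders = 1_000_000      # hypothetical Orders row count
customers = 50_000      # hypothetical Customers row count

# Filter: Orders WHERE status = 'shipped', with 5 distinct statuses
filtered = estimate_filter(orders, 5)                         # 200000.0

# Join the filtered Orders to Customers on customer_id
joined = estimate_join(filtered, customers, 50_000, 50_000)   # 200000.0
```

Estimates like these feed directly into plan choices (join order, hash vs. nested loops), which is why a redesigned estimator can change query performance so noticeably.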

Posted May 14, 2015

RevCon, Revelation Software's users' conference, covers a wide array of topics on the company's products. This year's event focused on Revelation Software's flagship product, OpenInsight, and the new capabilities of version 10.0.

Posted May 14, 2015

At Data Summit 2015 in New York, James Casaletto of MapR and David Teplow of Integra provided a deep dive into the world of Hadoop: past, present, and future.

Posted May 12, 2015

Capgemini is extending its long-standing strategic partnership with SAP, allowing Capgemini to act as a single point of contact for customers globally, and delivering SAP products and support services through one consolidated framework. By signing a global value-added reseller (VAR) agreement with SAP, Capgemini is among a select group of global SAP partners that are part of the global program, which has specific entry requirements that include global reach, reseller capabilities and revenue targets.

Posted May 12, 2015

Splice Machine is partnering with Talend to enable customers to simplify data integration and streamline data workflows on Hadoop. Through this partnership, organizations building operational data lakes with Splice Machine can take advantage of Talend's data integration technology and its data quality capabilities.

Posted May 12, 2015

Pentaho users will now be able to use Apache Spark within Pentaho Data Integration (PDI) thanks to a new native integration, an effort initiated by Pentaho Labs, that will enable the orchestration of all Spark jobs. The integration will enable customers to increase productivity, reduce maintenance costs, and dramatically lower the skill sets required as Spark is incorporated into big data projects.

Posted May 12, 2015

IBM today announced a combined package of new servers, storage software, and solutions in a move to accelerate the development of hybrid cloud computing. The company also revealed flexible software licensing of its middleware to help clients speed up their adoption of hybrid cloud environments.

Posted May 11, 2015

Red Hat, Inc., a provider of enterprise open source solutions, announced Red Hat JBoss Enterprise Application Platform (JBoss EAP) 6.4 and expanded benefits for JBoss EAP subscribers deploying their Java applications in hybrid cloud environments.

Posted May 11, 2015

HP has made multiple contributions to the OpenStack Kilo release, including new converged storage management automation and new flash storage technologies to support flexible, enterprise-class clouds. HP's storage contributions to the OpenStack Kilo release focus on two strategic goals.

Posted May 11, 2015

Radware, a provider of cybersecurity and application delivery solutions, released a new managed, cloud-based Web Application Firewall (WAF) service designed to protect against web-based cyberattacks.

Posted May 11, 2015

As evidenced in the latest spate of leading vendor announcements, all major enterprise computing initiatives are intended to more effectively blend on-premises platforms and applications with those in the cloud - whether it's private or public clouds, or a combination of both. We are reaching the point at which the term "cloud" may not even be important anymore.

Posted May 11, 2015

Teradata has made enhancements to the Teradata Database's hybrid row and column capabilities to provide quicker access to data stored on columnar tables and drive faster query performance. Other relational database management systems store data tables in rows or columns, and each method offers benefits, depending on the application and type of data. However, they have been mutually exclusive. Teradata's new hybrid row and column capabilities allow the best of both worlds.
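The row-versus-column tradeoff the announcement refers to can be illustrated with a toy sketch. This is hypothetical data and a deliberately simplified model, not Teradata's implementation: a row store keeps each record's fields together (good for fetching whole records), while a column store keeps each column contiguous (good for scans and aggregates over one column).

```python
# Toy comparison of row-oriented vs. column-oriented layouts.
# Hypothetical table of 6 sales records.

rows = [
    {"id": i, "region": "EU" if i % 2 else "US", "amount": i * 10}
    for i in range(6)
]

# Row store: fields of each record live together, so summing one
# column still walks every full record.
row_total = sum(r["amount"] for r in rows)

# Column store: each column is its own contiguous array, so an
# aggregate touches only the column it needs.
columns = {
    "id":     [r["id"] for r in rows],
    "region": [r["region"] for r in rows],
    "amount": [r["amount"] for r in rows],
}
col_total = sum(columns["amount"])

assert row_total == col_total  # same answer, different access patterns
```

A hybrid engine that stores some tables (or partitions) each way lets one system serve both record-at-a-time workloads and analytic scans, which is the "best of both worlds" claim being made here.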

Posted May 08, 2015

Rocket Software has released Rocket Discover, a new self-service business intelligence solution that combines visual data discovery with data preparation. The new product also pares the operations that can be performed on the data down to the fundamental ones that business users most often need, such as joining, filtering, and appending data.

Posted May 07, 2015

Software AG has made updates to its Terracotta In-Memory Data Management platform. New improvements in Terracotta Open Source Kit 4.3 include distributed storage and off-heap storage. The platform is used to boost performance and scalability and to build real-time applications. Additionally, Terracotta helps developers leverage in-memory storage for current and emerging data workloads.

Posted May 07, 2015

Oracle has introduced Oracle Secure Global Desktop (SGD) Release 5.2, the latest version of the company's secure remote access solution for cloud-hosted enterprise applications and hosted desktops running on not only Microsoft Windows, but also Linux, Solaris and mainframe servers. SGD allows users to work securely from any device and from any location, while providing administrators controlled access to applications and desktop environments resident in the data center, providing a key benefit to large companies in a BYOD world.

Posted May 06, 2015

When databases are built from a well-designed data model, the resulting structures provide increased value to the organization. The value derived from the data model exhibits itself in the form of minimized redundancy, maximized data integrity, increased stability, better data sharing, increased consistency, more timely access to data, and better usability.

Posted May 06, 2015

Pivotal has made updates to its big data suite that include upgrades to the Pivotal HD enterprise-grade Apache Hadoop distribution, which is now based on the Open Data Platform core, and performance improvements for Pivotal Greenplum Database.

Posted May 05, 2015

The Pure Storage FlashArray 400 Series is now certified by SAP as an enterprise storage solution for the SAP HANA platform. The certification enables Pure Storage to participate in SAP's program for SAP HANA tailored data center integration using its certified solution.

Posted May 04, 2015

Cosentry, a provider of IT solutions in the Midwest, has extended its managed cloud to provide expertise in connecting to Microsoft Azure, with a focus on utilizing Azure for SQL Server testing and development, SQL Server backup and SQL Server disaster recovery environments.

Posted May 01, 2015

CA Workload Automation Advanced Integration 1.0 for SAP Business Warehouse has received SAP certification. Specifically, the SAP Integration and Certification Center has certified that CA Workload Automation Advanced Integration 1.0 integrates with SAP Business Warehouse to provide a unified view for jobs running in both SAP and non-SAP applications.

Posted April 30, 2015

Splice Machine, a provider of a Hadoop RDBMS, announced that it is partnering with mrc (michaels, ross & cole ltd) to allow Splice Machine's Hadoop RDBMS to be certified and integrated with mrc's m-Power platform. "Our partnership with mrc gives businesses a solution that can speed real-time application deployment on Hadoop with the staff and tools they currently have, while also offering affordable scale-out on commodity hardware for future growth," said Monte Zweben, co-founder and CEO, Splice Machine.

Posted April 28, 2015

Cloud is changing the game when it comes to data storage and processing, according to Greg Rahn, director of product management at Snowflake Computing. Rahn will be presenting a session at Data Summit 2015 demystifying the differences in cloud architecture and implementation that are critical to understanding and evaluating the different options available.

Posted April 28, 2015

Nimble Storage grew its business by providing its customers with insights and recommendations for optimizing their storage infrastructure and simplifying its day-to-day operations. Join this DBTA webcast on Thursday, April 30, at 2 pm ET / 11 am PT to find out how, using HP Vertica, Nimble Storage achieved an average of more than $2.3 million per year in benefits, an ROI of 447%, and a payback period of only 6.8 months.

Posted April 28, 2015

IBM continues to experience tough quarters, but its mainframe business stands out like a bright beacon. For a platform that is consistently pronounced to be on the verge of obsolescence, it has proven again and again that it provides more value to businesses than farms of commodity servers. Able to run clouds, and with the best security of any platform, mainframes continue to prove their worth.

Posted April 27, 2015

System z revenues - which more than doubled in the first quarter of 2015 - proved to be one of the few positive line items in IBM's latest spate of downward numbers. Revenues from System z mainframe server products increased 118% compared with the year-ago period. Total delivery of System z computing power, as measured in MIPS (millions of instructions per second), increased 95%.

Posted April 27, 2015

BMC announced an operations console that monitors complex IT environments and analyzes diverse data sets to deliver IT insights that help solve business issues. The new release, BMC TrueSight Operations Management 10, provides a single view of applications and infrastructure from any device across physical, virtual, and cloud environments.

Posted April 27, 2015
