Data Center Management

Mainframes continue to represent the strong core of Data Center technology, while Virtualization and Cloud are also emerging as important technologies to support efficiency, scalability, and cost containment. Topics critical to Data Center Operations and Computing include hardware and software for Storage, Consolidation, High Availability, Backup & Recovery, and IT Optimization, as well as automated tools that help compensate for the growing Mainframe Skills Shortage.



Data Center Management Articles

The elastic and distributed technologies that run modern applications require a new approach to operations — one that learns about your infrastructure and assists IT operators with maintenance and problem-solving. The interdependencies between new applications are creating chaos in existing systems and surfacing the operational challenges of modern systems. Solutions such as microservices architectures alleviate the scalability pains of centralized proprietary services, but at a tremendous cost in complexity.

Posted August 25, 2016

Teradata has introduced four accelerators to help companies derive value from Internet of Things (IoT) data more quickly. Teradata's "Analytics of Things Accelerators" (AoTAs) consist of technology-agnostic intellectual property (IP) and professional services, derived from field engagements in manufacturing, transportation, mining, energy and utilities.

Posted August 25, 2016

Logz.io, a company that offers ELK (Elasticsearch, Logstash, and Kibana) as a cloud service, is introducing a new analytics solution that combines machine learning technology with human experience. The new solution, called Cognitive Insights, aims to help companies give their customers a better experience by uncovering data issues before they become problems.

Posted August 25, 2016

Being able to deliver a rapid response to problems is a core need for the modern enterprise. While many network "issues" can suffer a delay in response time, true problems such as security breaches and network outages cannot. Let's look at two situations that require a rapid response, five questions that IT never wants to hear, and one example of a solution that IT can implement.

Posted August 24, 2016

SolarWinds, a provider of IT management software, has announced significant updates to the SolarWinds Database Performance Analyzer (DPA). Version 10.2 supports MySQL as a back-end database in addition to SQL Server and Oracle Database, includes visualization and analysis tools to help DBAs optimize for and resolve blocking, locking, and deadlock issues, and adds support for Microsoft SQL Server 2016.

Posted August 24, 2016

EMC has unveiled new products and support to optimize the protection of VMware workloads and enable data protection everywhere. The enhanced support spans VMware Virtual SAN, VMware vSphere, and expanded data protection options for VCE VxRail Appliances.

Posted August 23, 2016

Syncsort, a provider of big data and mainframe software, has completed the acquisition of UK-based Cogito Ltd., a maker of specialized software that helps improve the performance of IBM DB2 and CA IDMS database management systems on IBM z/OS, and lowers mainframe costs. With more than 80% of corporate data residing on or originating on mainframes today, this data is a critical ingredient for next-generation machine learning and big data systems, according to Syncsort.

Posted August 23, 2016

Hazelcast, a provider of an open source in-memory data grid, has announced the general availability of Hazelcast 3.7. According to the company, the latest release is 30% faster than previous versions and is the first fully modularized version of Hazelcast. Each client/language and plugin is now available as a module, speeding up the development process for open source contributors, with new features and bug fixes released as modules alongside Hazelcast 3.7.

Posted August 23, 2016

Rocana, a company that provides software for operational visibility, has released Rocana Ops 1.6 introducing role-based access controls (RBAC) and support for new application performance metrics (StatsD) to enable deeper visibility with greater control across the enterprise.

Posted August 23, 2016

Perhaps the biggest and most overlooked challenge is how to create accurate test data. You're implementing a new system in order to deal with a massive amount of data, and perhaps your relational database can't handle the volume, so it's vitally important to properly test this new system and ensure that it doesn't fall over as soon as the data floods in.

Posted August 23, 2016

SHARE recently wrapped up its summer conference in Atlanta. James Vincent, immediate past president of SHARE, reflected on the changes that have taken place in the IT industry during his tenure and the key takeaways from the event which took place July 31-August 5. "One takeaway is that SHARE is on the right track when it comes to its focus on the new IT generation, what we call zNextGen," said Vincent.

Posted August 22, 2016

IBM has announced that Workday, a provider of enterprise cloud applications for finance and human resources, has adopted the IBM Cloud as part of a multi-year strategic partnership. IBM Cloud will become the foundation for Workday's development and testing environment, providing capacity expansion in support of Workday's development and testing requirements. "Workday will use IBM Cloud to continue accelerating Workday's internal development and testing efforts to support our ongoing global expansion," said Aneel Bhusri, co-founder and CEO of Workday.

Posted August 22, 2016

Coho Data, a provider of scale-out all-flash storage architecture and infrastructure solutions for private clouds, has introduced DataStream 2.8, helping to make the Software-Defined Data Center (SDDC) a reality by bringing public cloud capabilities to the enterprise private cloud.

Posted August 22, 2016

Seagate Technology plans to ship two new flash innovations that push the limits of storage performance in enterprise data centers to new levels. The new products include a 60 terabyte (TB) Serial Attached SCSI (SAS) solid-state drive (SSD) and the 8TB Nytro XP7200 NVMe SSD.

Posted August 22, 2016

HyTrust Inc. has added enhanced capabilities for its workload security platform to support organizations with virtualized, public, private or multi-cloud environments for popular cloud technologies including Amazon Web Services, IBM, EMC, Intel, Microsoft and VMware. The releases, HyTrust DataControl 3.2 and HyTrust CloudControl 5.0, are intended to address data center complexity.

Posted August 22, 2016

Nervana has announced its planned acquisition by Intel, a move it says indicates that Intel is formally committing to pushing the forefront of AI technologies. The company will continue to operate out of its San Diego headquarters. Terms of the deal were not disclosed, but industry estimates placed it at more than $400 million.

Posted August 16, 2016

The size and complexity of database environments are pushing IT resources at most organizations to the limit. This reduces agility and increases the costs and challenges associated with maintaining the performance and availability of these on-demand services. To address these concerns, many IT departments are looking for ways to automate routine tasks and consolidate databases.

Posted August 16, 2016

As IT decision making moves out of the IT department and into the functional areas of organizations, partnerships and collaboration become even more critical. According to an article in strategy+business on why CEOs must become more technology savvy, "the majority of technology spending (68%) is now coming from budgets outside of IT, a significant increase from 47% in 2014." What this means is that many critical technology decisions are being made without the consultation of IT professionals.

Posted August 16, 2016

Redis Labs and Intel announced they have collaboratively benchmarked a throughput of 3 million database operations/second at under 1 millisecond of latency, while generating over 1GB NVMe throughput, on a single server with Redis on Flash and Intel NVMe-based SSDs.

Posted August 12, 2016

Informatica is releasing five new Informatica Cloud offerings in Amazon Web Services Marketplace (AWS Marketplace) to help organizations jumpstart data management projects in the cloud.

Posted August 12, 2016

Nimbus Data is releasing a new all-flash platform for cloud, big data, virtualization, and massive digital content that will offer unprecedented scale and efficiency.

Posted August 09, 2016

Datameer is launching a newly redesigned Global Partner Program to recruit and enable a dynamic ecosystem of technology, services, and reseller partners. Datameer's new program introduces a rewards system that recognizes partners and helps form a trusted and easily accessible network for customers.

Posted August 09, 2016

Blue Medora, a provider of cloud and data center management solutions, is partnering with New Relic to develop a new application performance monitoring tool. The new REST API-based, agentless plugins are aimed at improving enterprise IT operations' visibility between Blue Medora and the other systems that companies monitor with New Relic.

Posted August 08, 2016

The Independent Oracle Users Group (IOUG) is excited to be joining the Oracle technology community in San Francisco once again at Oracle OpenWorld 2016, September 18-22. IOUG's 30,000+ member community comprises the top Oracle technology experts from around the globe, several of whom will be presenting sessions on hot topics like Data Intelligence, IoT, Data Security, and Cloud migrations.

Posted August 04, 2016

Thousands of members of the Oracle Applications Users Group (OAUG) get the answers they need by sharing best practices, case studies and lessons learned. As the world's largest education, networking and advocacy forum for users of Oracle Applications, the OAUG helps members connect to find the solutions they need to do their jobs better and to improve their organizations' return on investment in Oracle Applications.

Posted August 04, 2016

Anyone who has ever attended Oracle OpenWorld knows that you must plan ahead. The conference held in San Francisco each fall is vast, and the upcoming conference, scheduled for September 18-22, 2016, promises to be equally expansive. Just as in years before, tens of thousands of attendees from well over 100 countries can be expected to converge to learn more about Oracle's ever-expanding ecosystem of technologies, products and services during thousands of sessions held at the Moscone Center and multiple additional venues in downtown San Francisco. Here, Database Trends and Applications presents the annual Who to See @ Oracle OpenWorld special section.

Posted August 04, 2016

I had the pleasure to spend some time with my old friend Mark Souza, a general manager in the Data Platform team at Microsoft, while speaking at the SQL Saturday event in Dublin, Ireland. Now keep in mind that Mark and I have known each other since the 1990s when SQL Server was just being ported to a brand new operating system called Windows NT. Mark and I were having a laugh and more than a twinge of nostalgia about how much SQL Server has improved over the decades and now sits atop the heap on most analysts' "best database" reports. This isn't just two old-timers sharing a few war stories though. This is a living, breathing transformation that is still in process.

Posted August 04, 2016

Can Oracle and its partners keep up with the increasing demands of customers for real-time digital capabilities? Is the Oracle constellation of solutions—from data analytics to enterprise applications—ready for the burgeoning requirements of the Internet of Things (IoT) and data-driven businesses? For Oracle—along with its far-flung network of software vendors, integrators, and partners—times have never been so challenging.

Posted August 04, 2016

The month of June heralded the long-awaited release of Microsoft's SQL Server 2016. The 2016 edition includes unique security functionality with Always Encrypted that protects data both at rest and in motion, ground-breaking performance and scale as evidenced by the number-one performance benchmarks, and accelerated hybrid cloud scenarios and Stretch Database functionality that supports historical data on the cloud.

Posted August 04, 2016

Organizations have directed a lot of attention recently to consolidation, automation, and cloud efforts in their data management environments. This will purportedly result in decreased demand for data managers and the need for fewer DBAs per group of databases. However, the opposite seems to be occurring. In actuality, there is a growing need for more talent, as well as expertise to manage through growing complexity. A new survey, sponsored by Idera and conducted by Unisphere Research among more than 300 data executives, managers, and professionals, finds that a more challenging data environment is arising due to a confluence of factors.

Posted August 03, 2016

To manage growing data volumes and pressing SLAs, many companies are leveraging Apache™ Kafka and award-winning Attunity Replicate with next-generation change data capture (CDC) for streaming data ingest and processing.

Posted August 03, 2016
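As a rough illustration of the streaming ingest pattern described in the item above, the following is a minimal, hypothetical Python sketch that publishes a single database change event to a Kafka topic with the kafka-python client. The broker address, topic name, and event structure are assumptions made for illustration only; in practice a CDC tool such as Attunity Replicate generates such events from the source database's logs rather than from application code.

# Hypothetical sketch: publishing one change event to Apache Kafka for streaming ingest.
# Requires a reachable broker and the kafka-python client (pip install kafka-python).
# The topic name and event shape below are illustrative, not taken from any product.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    key_serializer=str.encode,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# A change event as a CDC process might emit it: the table, the operation,
# the row key, and the row's new state.
change_event = {
    "table": "orders",
    "op": "UPDATE",
    "key": {"order_id": 42},
    "after": {"order_id": 42, "status": "shipped"},
}

# Send the event to the ingest topic and block until the broker acknowledges it.
producer.send("cdc.orders", key="order-42", value=change_event).get(timeout=10)
producer.flush()

Using the row key as the Kafka message key keeps all changes for a given row in one partition, so downstream consumers see those changes in the order they occurred.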

The potential of your business intelligence or data processing application is limited without a comprehensive data connectivity solution. Staying competitive and relevant requires a breadth of data connectivity options.

Posted August 03, 2016

Our goal at Amazon Web Services (AWS) is to enable our customers to do things that were previously not possible, and make things that our customers can already do simpler and better at a much lower cost.

Posted August 03, 2016

It is a great honor to accept DBTA's Readers' Choice Award for Best Data Governance Solution for the second year in a row. Our mission at BackOffice Associates is to help Global 2000 organizations utilize data stewardship and holistic information governance solutions to set, execute and enforce key data policies across their entire data and systems landscape.

Posted August 03, 2016

It is both exciting and validating to be selected as the number 1 data modeling solution by DBTA's discerning readers for the third consecutive year!

Posted August 03, 2016

Data replication advances a number of enterprise goals, supporting scenarios such as the distribution of information as part of business intelligence and reporting initiatives, high availability and disaster recovery, and no-downtime migrations.

Posted August 03, 2016

Everyone knows the three Vs of big data (volume, velocity, and variety), but what's required is a solution that can extract valuable insights from new sources such as social networks, email, sensors and connected devices, the web, and smartphones.

Posted August 03, 2016

The cloud continues to transform computing, making it less expensive, easier and faster to create, deploy, and run applications as well as store enormous quantities of data.

Posted August 03, 2016

Query and reporting solutions are part of a comprehensive business intelligence approach in every organization. As long as enterprises need to gather data, BI groups look to query and reporting programs as primary applications that produce output from information systems.

Posted August 03, 2016

As data grows, organizations are looking for ways to dig up insights from underneath layers of information. Data mining solutions provide the tools that enable them to view those hidden gems and facilitate better understanding of new business opportunities, competitive situations, and complex challenges.

Posted August 03, 2016

Business intelligence encompasses a variety of tools that enable organizations to collect data from internal systems and external sources, prepare it for analysis, develop and run queries against the data, and create reports, dashboards, and data visualizations.

Posted August 03, 2016

To evaluate data quickly and enable predictive analytics based on sources of rapidly changing data, including social media, sensors, and financial services data, streaming data solutions come to the rescue, providing the data when it is needed: now.

Posted August 03, 2016

Many enterprises are finding themselves with different options when it comes to moving, working with, and storing data. While one tool may be right for one organization, another tool, or a combination of tools, may be just what the doctor ordered.

Posted August 03, 2016

Data virtualization allows the business and IT sides of an organization to work together more closely and in a much more agile fashion, and helps to reduce complexity.

Posted August 03, 2016

A key component of data integration best practices, change data capture (CDC) is based on the identification, capture, and delivery of the changes made to enterprise data sources. CDC helps minimize access to both source and target systems, supports the ability to keep a record of changes for compliance, and is a key part of many data processes.

Posted August 03, 2016
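To make the change data capture pattern described in the item above concrete, here is a minimal, hypothetical Python sketch of polling-based CDC driven by a last-modified watermark. The orders table, its columns, and the apply_to_target stand-in are assumptions made for illustration; commercial CDC tools generally read the database transaction log instead of polling, which places even less load on the source system.

# Hypothetical sketch: polling-based change data capture with a watermark.
# Uses SQLite from the standard library so the example is self-contained;
# the table, columns, and target stand-in are illustrative assumptions.
import sqlite3

def fetch_changes(conn, since):
    # Identify and capture rows modified after the current watermark.
    cur = conn.execute(
        "SELECT id, status, last_modified FROM orders WHERE last_modified > ?",
        (since,),
    )
    return cur.fetchall()

def apply_to_target(rows):
    # Stand-in for delivering changes to a target system
    # (a warehouse, a replica, or a message topic).
    for row in rows:
        print("change:", row)

def run_once(conn, watermark):
    rows = fetch_changes(conn, watermark)
    apply_to_target(rows)
    # Advance the watermark so already-delivered changes are not re-read,
    # keeping repeated access to the source system to a minimum.
    return max((row[2] for row in rows), default=watermark)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, status TEXT, last_modified TEXT)")
    conn.execute("INSERT INTO orders VALUES (1, 'new', '2016-08-03T10:00:00')")
    print("new watermark:", run_once(conn, "1970-01-01T00:00:00"))

Persisting the delivered changes, rather than just printing them, is also what provides the record of changes for compliance mentioned above.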
