Cisco Launches Tetration Data Center Analytics Platform with Open API for Extensibility

Cisco has unveiled Cisco Tetration Analytics, a platform designed to help customers gain wider visibility across all aspects of their data center in real time.

Cisco Tetration Analytics gathers telemetry from hardware and software sensors, and then analyzes the information using advanced machine learning techniques to address critical data center operations such as policy compliance, application forensics, and the move to a whitelist security model.

The platform can monitor every activity inside the data center in real time, enables rich searches using natural language processing, and provides actionable insights, said Yogesh Kaushik, senior director of product management, Tetration, Cisco.

Designed to work with any data center, whether or not it runs Cisco technologies, the platform is applicable to a range of industries such as financial services, healthcare, government, and service providers, said Kaushik.

Available in July 2016, Tetration will be offered in an appliance form factor and comes “racked and stacked, fully wired, one-touch deployable,” he noted.

The first Tetration platform will be a full rack appliance that can be deployed on-premise at the customer’s data center.

The platform will also provide an open API to give ecosystem partners the opportunity to develop applications.
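
As a rough sketch of how a partner application might authenticate against such an API: the endpoint path, request fields, and signing scheme below are illustrative assumptions, not Cisco's documented contract.

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

def sign_request(secret: str, method: str, uri: str, body: str, timestamp: str) -> str:
    """Build an HMAC-SHA256 signature over the request fields.

    The field order and auth scheme here are illustrative assumptions,
    not the documented Tetration API contract."""
    checksum = hashlib.sha256(body.encode()).hexdigest()
    canonical = "\n".join([method, uri, checksum, "application/json", timestamp])
    digest = hmac.new(secret.encode(), canonical.encode(), hashlib.sha256).digest()
    return digest.hex()

# Hypothetical flow-search request body (the endpoint path is an assumption).
body = json.dumps({"t0": "2016-07-01T10:00:00Z", "t1": "2016-07-01T10:02:00Z",
                   "filter": {"type": "eq", "field": "dst_port", "value": 443}})
ts = datetime(2016, 7, 1, 12, 0, 0, tzinfo=timezone.utc).isoformat()
sig = sign_request("demo-secret", "POST", "/openapi/v1/flowsearch", body, ts)
print(sig)
```

Because the signature is computed deterministically from the request fields, the server can recompute and compare it to verify that the caller holds the shared secret.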

The Problem Tetration Analytics Addresses

According to Cisco, IT managers are hampered by a lack of visibility because no single tool currently collects consistent telemetry across the entire data center and analyzes large volumes of data in real time and at scale. As a result, organizations have performed fragmented tasks without the correlation necessary to address operational issues comprehensively.

Tetration is designed to help enable visibility across the data center using either server software sensors that require low overhead, network hardware sensors that monitor every packet at line rate, or both combined for a more complete solution. It executes advanced data center analytics in real time and presents actionable analysis with easy-to-understand visuals, delivering information such as application insights, automated whitelist policy recommendations, policy simulation and impact analysis, compliance management, and network flow forensics.
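
The whitelist model the platform recommends is a default-deny posture: traffic is blocked unless an explicit rule permits it. A minimal sketch of that idea, with flow and rule shapes invented here for illustration:

```python
# Minimal sketch of a whitelist (default-deny) policy check.
# The Flow shape and the rules are invented for illustration only.
from typing import NamedTuple

class Flow(NamedTuple):
    src: str
    dst: str
    port: int

WHITELIST = [
    ("app tier -> db", lambda f: f.dst.startswith("10.0.2.") and f.port == 3306),
    ("web ingress",    lambda f: f.dst.startswith("10.0.1.") and f.port == 443),
]

def is_permitted(flow: Flow) -> bool:
    # Default deny: only flows matching an explicit rule pass.
    return any(rule(flow) for _, rule in WHITELIST)

print(is_permitted(Flow("203.0.113.9", "10.0.1.5", 443)))  # matches "web ingress"
print(is_permitted(Flow("203.0.113.9", "10.0.2.7", 22)))   # no rule -> denied
```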

“Think of it as a time machine for your data center,” said Kaushik, allowing organizations to search through billions of records in seconds, sliced along many different dimensions. “You can literally say ‘I want to know what happened 3 days ago in this 2-minute window’ or ‘how much traffic came from North Korea, and which server did it come to and which user processed it.’”
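
The "2-minute window" query Kaushik describes amounts to filtering a retained flow log by time range and then by attributes. A toy sketch of that shape (the record fields are invented here, not Tetration's schema):

```python
from datetime import datetime, timedelta

# Toy flow log: (timestamp, source country code, destination server, process).
flows = [
    (datetime(2016, 6, 28, 9, 0, 30), "KP", "web-01", "nginx"),
    (datetime(2016, 6, 28, 9, 1, 10), "US", "db-01", "mysqld"),
    (datetime(2016, 6, 28, 9, 5, 0),  "KP", "web-02", "nginx"),
]

def window(flows, start, minutes):
    """Keep only the flows inside a time window of the given length."""
    end = start + timedelta(minutes=minutes)
    return [f for f in flows if start <= f[0] < end]

def from_country(flows, cc):
    """Which servers and processes received traffic from a given country?"""
    return [(dst, proc) for _, c, dst, proc in flows if c == cc]

two_min = window(flows, datetime(2016, 6, 28, 9, 0), 2)
print(from_country(two_min, "KP"))  # [('web-01', 'nginx')]
```

At Tetration's scale the same filters would run over billions of stored records rather than an in-memory list, but the query shape is the same.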

Similarly, it allows organizations to plan for the future: they can see what will happen if they make changes before executing them, understand the impact on applications, and get the insights they need before taking action, he noted. Organizations can also validate that policy changes have actually been applied and taken full effect, and run real-time and historical policy simulations, replaying what happened in the network at any time thanks to long-term data storage capabilities.
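
Policy simulation of this kind boils down to replaying recorded flows against a proposed rule set and tallying what would have been allowed or blocked. A hedged sketch of that replay loop (the policy and flow shapes are assumptions for illustration):

```python
def simulate(policy, historical_flows):
    """Replay recorded flows against a proposed policy before deploying it."""
    allowed, blocked = [], []
    for flow in historical_flows:
        (allowed if policy(flow) else blocked).append(flow)
    return allowed, blocked

# Proposed change: only permit traffic to port 443.
policy = lambda f: f["port"] == 443
history = [{"src": "10.0.0.1", "port": 443},
           {"src": "10.0.0.2", "port": 8080}]
allowed, blocked = simulate(policy, history)
print(len(allowed), len(blocked))  # 1 1
```

The value of long-term storage is exactly this: the larger the retained history, the more confident an operator can be that the proposed policy will not break a legitimate application flow.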

Data Center Requirements

With Tetration, software sensors are installed on end hosts: either virtual machine or bare metal servers. In the first Tetration release, software sensors support Linux and Windows server hosts, while hardware sensors are embedded in the ASICs of Cisco Nexus 9200-X and Nexus 9300-EX network switches to collect flow data at line rate from all the ports.  According to Cisco, a single Tetration appliance will monitor up to one million unique flows per second. Both software and hardware sensors communicate the flow information in real time to the Tetration Analytics platform. The platform can be installed in any data center with any servers and any network switches.
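
Counting "unique flows per second," as in Cisco's one-million-flows figure, typically means bucketing packets by timestamp and deduplicating on the flow key (conventionally the 5-tuple of source IP, destination IP, protocol, and ports). A small sketch of that aggregation, with invented sample records:

```python
from collections import defaultdict

# Toy sensor records: (epoch second, 5-tuple flow key).
records = [
    (100, ("10.0.0.1", "10.0.1.5", 6, 51000, 443)),
    (100, ("10.0.0.1", "10.0.1.5", 6, 51000, 443)),  # same flow, second packet
    (100, ("10.0.0.2", "10.0.1.5", 6, 51001, 443)),
    (101, ("10.0.0.3", "10.0.1.6", 17, 53, 5353)),
]

def unique_flows_per_second(records):
    """Group records by second and count distinct 5-tuples in each bucket."""
    buckets = defaultdict(set)
    for sec, five_tuple in records:
        buckets[sec].add(five_tuple)
    return {sec: len(flows) for sec, flows in buckets.items()}

print(unique_flows_per_second(records))  # {100: 2, 101: 1}
```

A hardware sensor doing this at line rate would use fixed-size sketch structures rather than Python sets, but the dedup-then-count logic is the same.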

The Tetration platform is a “one-touch” appliance: the servers and switches are prewired and the software is pre-installed, so no special big data expertise is needed to deploy or operate Tetration.

According to Kaushik, Tetration can be deployed in any data center and does not require Cisco technologies. An organization could have a legacy network based on old Cisco switches, a competitor’s network, or even a public cloud where the customer doesn’t know what network underlies it. “Customers can address 85-90% of use cases just from the server sensors,” said Kaushik.

Relying on Ecosystem Partners

“The big gap in the market is a platform that can scale out - that is our intellectual property,” said Kaushik.

To expand the applicability of the platform, he added, “We are building a very open platform, and the way that we envision scaling this out to address more use cases for customers is through ecosystem partners. We are working with a handful of partners today who do orchestration, including policy orchestration, security orchestration, and forensics players who can write more applications on top of this platform, using the rich data and real-time analysis that Cisco provides to get more high-level forensics for compliance,” said Kaushik.

“As we go forward, we are going to start getting deeper into applications in three broad areas: IT operations, application performance monitoring and management, and security. We are going to do a handful of things ourselves and then rely on ecosystem partners to scale it out for us,” Kaushik said.

“Think of this as the Apple iOS or Google Android platform. They built the platform but they don’t write every single application. If they had kept it closed and didn’t let others write applications, it would not be useful today. We built the platform and we will do some applications that we know how to do very well, and we are going to have our partners write the rest of it so that it becomes richer for our customers - and that is why we have the open APIs.”

Use Cases

According to Kaushik, the platform has broad applicability.

In financial services, he said, discussions mainly center on monitoring the performance of financial applications, especially on the trading floor. “We can show the latency tracking every single event, every single packet, and we can pinpoint where the performance delays were so they can do something about it.”

“In healthcare, they want to understand that whatever policy and whatever access control they define is actually being met – if they get audited, they want to show proof that their policy was never violated and, if it was, that they took remedial action.”

Public sector customers, said Kaushik, don’t want to throw out any data, so it is important that the platform can scale out and store billions of records; they want to be able to understand what happened at any precise moment and in any specific context.

For service providers, he added, the big concern is managing applications without any impact on their tenants, because they have to continue to meet SLAs.

For more information, go to