Adaptigent has announced the launch of its newest product, Intelligent Caching. According to the company, the Intelligent Caching engine is a distributed, in-memory data cache designed to cut mainframe integration load and costs while reducing API response times by up to 50X for mission-critical data and transaction calls.
Adaptigent is currently testing Intelligent Caching with a group of early adopters and will move it to general availability later in the year.
The company says the product was developed in response to customers' concerns over rising mainframe activity and costs. The patented technology can pre-load highly cacheable data during periods when mainframe demand is relatively low, allowing customers to offload mainframe processing when demand is high. The Intelligent Caching engine also supports partial caching, in which a single API request contains both cacheable and non-cacheable data elements. The engine can pull data from the cache and combine it with live data to fulfill the API request. Users can expect to see a 10X-50X improvement in the response times of APIs that call cached data.
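The partial-caching idea described above can be illustrated with a minimal sketch: split a request into cacheable and non-cacheable fields, serve the cacheable ones from an in-memory store pre-loaded off-peak, fetch the rest live, and merge. All names and functions here are illustrative assumptions, not Adaptigent's actual API.

```python
# Minimal partial-caching sketch (hypothetical names, not Adaptigent's API).
cache = {}  # stand-in for a distributed in-memory cache

# Fields whose values change rarely and can be pre-loaded off-peak.
CACHEABLE_FIELDS = {"customer_name", "account_type"}

def fetch_live(fields):
    """Stand-in for a live mainframe transaction call."""
    return {f: f"live:{f}" for f in fields}

def preload(record_id, data):
    """Pre-load cacheable fields, e.g. while mainframe demand is low."""
    cache[record_id] = {f: v for f, v in data.items() if f in CACHEABLE_FIELDS}

def handle_request(record_id, requested_fields):
    """Serve cached fields from memory, fetch the rest live, and merge."""
    cached = cache.get(record_id, {})
    hits = {f: cached[f] for f in requested_fields if f in cached}
    misses = [f for f in requested_fields if f not in hits]
    live = fetch_live(misses) if misses else {}
    return {**hits, **live}  # single merged response: cached + live data
```

In this sketch, a request for a mix of fields never touches the backend for the cached portion, which is the mechanism behind the claimed response-time improvement for cache-heavy calls.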
"Traditional caching engines use a naïve proxy cache that only caches the entire transaction, or none of it," said Alex Heublein, president of Adaptigent. "Our caching engine uses workflows that allows users to set caching policies for individual mainframe transactions. This adds a layer of intelligence that we haven't seen yet in the marketplace."
For more information, go to www.adaptigent.com.