Stream processing is gaining prominence within organizations: it unifies applications and analytics by processing data as it arrives, in real time, and detects conditions within a short period of when the data is received.
The key strength of stream processing is that it can provide insights faster, often within milliseconds to seconds. As a result, stream processing is a natural fit for time series data, since most continuous data series are time series.
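To make the idea of processing data as it arrives concrete, here is a minimal, illustrative Python sketch (not from the webinar) of one common stream-processing pattern: aggregating a time series over fixed tumbling windows as events flow in. The function name and the sample readings are hypothetical.

```python
from itertools import groupby

def tumbling_window_avg(events, window_seconds):
    """Group (timestamp, value) events into fixed windows of
    window_seconds and yield (window_start, average) as each
    window completes. Assumes events arrive in timestamp order."""
    for window_start, group in groupby(
        events, key=lambda e: e[0] - e[0] % window_seconds
    ):
        values = [v for _, v in group]
        yield window_start, sum(values) / len(values)

# Simulated sensor stream: (timestamp in seconds, temperature)
readings = [(0, 20.0), (2, 21.0), (5, 23.0), (7, 25.0), (11, 22.0)]
for start, avg in tumbling_window_avg(readings, 5):
    print(start, avg)
```

Because the generator yields each window's result as soon as the next window begins, downstream consumers see insights within one window length of the data arriving, rather than waiting for a batch job over stored data.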
DBTA recently held a webinar with Riccardo Tommasini, assistant professor at the University of Tartu, who discussed how stream processing, from an information-need perspective, requires considering order, context, and responsiveness.
According to Tommasini, the eight requirements of real-time stream processing are: keep the data moving, offer declarative access, handle imperfections, provide predictable outcomes, integrate stored and streaming data, guarantee data safety and availability, support partitioning and scaling, and respond instantaneously.
InfluxDB's scripting and query language, Flux, can help implement stream processing. The language is Turing-complete, so it can express context-aware, continuous information needs. Flux is functional, which makes it inherently declarative, and it is a scripting language: task-driven, portable, and testable.
The platform takes historical data into account, offers tags that enable context awareness, provides low latency, and more, Tommasini said.
An archived on-demand replay of this webinar is available here.