Event streaming

If a system handles only a few events, there is little need for event streaming; it provides real value in big data scenarios, where data arrives in high volumes and in many different formats.

Event streaming refers to services that accept data as and when it arises, rather than ingesting it periodically. For example, an event stream should be able to accept temperature readings from devices the moment they are sent, rather than leaving the data to wait in a queue or a staging environment.
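As a concrete illustration, the following sketch publishes temperature readings the moment a device produces them. It assumes a Kafka-compatible broker as one example of an event streaming service and uses the kafka-python client; the broker address, topic name, and device identifier are illustrative placeholders, not part of the original text.

```python
import json
import random
import time

from kafka import KafkaProducer  # kafka-python client, one possible event streaming producer

# Assumption: a Kafka-compatible broker is reachable at this address,
# and a topic named "temperature-readings" already exists.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Send each reading as soon as the device produces it,
# instead of batching readings into a periodic upload.
for _ in range(10):
    reading = {
        "device_id": "sensor-01",  # hypothetical device identifier
        "temperature_c": round(random.uniform(18.0, 30.0), 1),
        "timestamp": time.time(),
    }
    producer.send("temperature-readings", value=reading)
    time.sleep(1)  # simulate a device emitting one reading per second

producer.flush()
```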

Event streaming also makes it possible to query data while it is in transit. This data is temporal: it is not stored, and queries run on the data as it moves. Other data platforms do not offer this capability; they can query only data that has already been stored, not data that has just been ingested and is still in motion.
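To make that distinction concrete, the sketch below runs a query over events as they flow through, computing a per-window average temperature without ever persisting the raw events. It is a minimal, product-agnostic illustration in plain Python; the window length, event schema, and the assumption of roughly time-ordered events are all simplifications for the example.

```python
import time
from collections import defaultdict
from typing import Iterable, Iterator


def tumbling_window_average(
    events: Iterable[dict],
    window_seconds: int = 10,
) -> Iterator[dict]:
    """Query events in motion: group readings into fixed time windows and
    emit an average per window, without storing the raw events."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for event in events:
        # Assign the event to a window based on its timestamp
        # (assumes events arrive roughly in time order).
        window = int(event["timestamp"]) // window_seconds
        sums[window] += event["temperature_c"]
        counts[window] += 1
        # Emit and discard any window older than the current one;
        # the individual events are never written to storage.
        for finished in [w for w in sums if w < window]:
            yield {
                "window_start": finished * window_seconds,
                "avg_temperature_c": sums[finished] / counts[finished],
            }
            del sums[finished], counts[finished]
    # Flush whatever windows remain once the stream ends.
    for w in sorted(sums):
        yield {
            "window_start": w * window_seconds,
            "avg_temperature_c": sums[w] / counts[w],
        }


if __name__ == "__main__":
    # Feed the query a small simulated stream of readings.
    simulated = (
        {"temperature_c": 20.0 + i % 5, "timestamp": time.time() + i}
        for i in range(30)
    )
    for result in tumbling_window_average(simulated):
        print(result)
```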

Event streaming services should scale easily to accept millions or even billions of events, and they should be highly available so that sources can send events and data to them at any time. Ingesting data in real time and working on that data directly, rather than on data stored in a different location, is the key to event streaming.

A curious mind might ask: if we already have so many data platforms with advanced query execution capabilities, why do we need event streaming? Let me provide some scenarios in which working on incoming data is quite important, and which existing data platforms cannot solve effectively and efficiently:

There are many possibilities for applying event streaming within an enterprise, and its importance cannot be stressed enough.