Data flow

As shown in the analytics orchestration architecture diagram in the previous section, input data is streamed via Kafka to the legacy analytics, which run in containers (for example, Docker containers) so that they can scale with data volume. Each container can embed a messaging consumer and publisher channel to exchange data with the modern data infrastructure, reading the input and writing back the results. Alternatively, depending on business requirements, the analytics can read from and write results to the respective data stores directly.
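The consume-process-publish loop each container runs can be sketched as follows. This is a minimal illustration of the pattern only: in-memory queues stand in for the Kafka input and output topics, and `run_analytics` is a hypothetical placeholder for the legacy analytics logic (a real deployment would use a Kafka client library and real topic names).

```python
from queue import Queue

# In-memory stand-ins for the Kafka topics (hypothetical names; a real
# container would use a Kafka consumer/producer client instead).
raw_events = Queue()   # input topic consumed by the legacy analytics
results = Queue()      # output topic carrying the analytics results

def run_analytics(event):
    # Placeholder for the legacy analytics running inside the container.
    return {"source": event["id"], "score": event["value"] * 2}

def consume_process_publish():
    # Each container instance runs this loop: consume an event from the
    # input channel, run the analytics, and publish the result back to
    # the messaging layer (or, alternatively, to a data store).
    while not raw_events.empty():
        event = raw_events.get()
        results.put(run_analytics(event))

# Feed a few sample events and drain the pipeline.
for i in range(3):
    raw_events.put({"id": i, "value": i * 10})
consume_process_publish()
```

Because each container holds no state between events, additional container instances can be started against the same input topic to scale with data volume, as the paragraph above describes.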
