Thanks Dormain. Hi everyone! Streaming and batch processing play a crucial role in modernizing data-driven applications.
In the previous keynote, my friend Oleg mentioned how one can bind a simple function's input and output to a messaging system using the Spring Cloud Stream framework.
In a way, that single function would look like this: simple and straightforward to deploy and manage. A single car on an open road with no turns.
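As a minimal sketch of the kind of function Oleg described (plain Java here, without the Spring Cloud Stream binder wiring; the class and method names are illustrative), the business logic is just a `java.util.function.Function`. In a Spring Cloud Stream application you would expose it as a `@Bean`, and the framework binds its input and output to messaging destinations:

```java
import java.util.function.Function;

public class UppercaseFunction {

    // In a Spring Cloud Stream app this would be a @Bean; the framework
    // binds the function's input and output to broker destinations.
    public static Function<String, String> uppercase() {
        return payload -> payload.toUpperCase();
    }

    public static void main(String[] args) {
        // Invoked directly here; in production, messages from the
        // broker drive the function instead.
        System.out.println(uppercase().apply("hello"));
    }
}
```

The point of the "single car on an open road" analogy is that this one function carries no deployment or wiring concerns of its own; the binder supplies them.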
But in reality, what you end up deploying and managing is not just one function or one application.
It involves multiple applications forming a streaming pipeline and all the operational concerns around them.
In other words, your production deployment looks more like this complex interchange during rush hour.
This is where Spring Cloud Data Flow comes into play.
Spring Cloud Data Flow gives us a tool to:
- FIRST, construct a streaming pipeline (using individual applications) or compose batch applications
- THEN, automate the deployment configuration (for instance, if you are deploying your streaming pipeline into Kubernetes, SCDF takes care of generating K8s manifest files based on the stream configuration)
- NEXT, SCDF provides continuous delivery of the individual applications in the stream, as well as the batch applications
- FINALLY, SCDF lets you securely manage all your streaming pipelines and batch applications, while inheriting the metrics and monitoring features from the monitoring systems that the Micrometer library supports.
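As a hedged sketch of what the steps above look like in practice (the stream name is illustrative, and exact shell flags can vary by SCDF version), the SCDF shell lets you define a pipeline with a pipe-delimited DSL and deploy it in two commands:

```shell
# Define a streaming pipeline from prebuilt applications
# (http source | transform processor | log sink).
stream create --name demo-pipeline --definition "http | transform | log"

# Deploy it; when targeting Kubernetes, SCDF generates the
# K8s manifests for each application from this definition.
stream deploy --name demo-pipeline
```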
With this, let’s see a demo.
Similarly, you can compose, schedule, and launch batch applications using Spring Cloud Data Flow.
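A similarly hedged sketch for the batch side (the task name and the `timestamp` sample application are illustrative, and flags may differ across SCDF versions): SCDF registers a Spring Batch/Task application as a task definition and then launches it on demand, or on a schedule on supported platforms:

```shell
# Register a task definition from a batch/task application.
task create --name demo-batch --definition "timestamp"

# Launch it on demand; scheduling via the platform's scheduler
# is also available from the SCDF shell.
task launch demo-batch
```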
We have a dedicated breakout session on Spring Cloud Data Flow.
My friend Glenn and I will walk through various features of SCDF.
Stay tuned. Thank you!