Data integration is a hard challenge in every enterprise. Batch processing and Reverse ETL are common practices in data warehouses, data lakes, and lakehouses, but they come at a price: data inconsistency, high compute cost, and stale information.
This video introduces a design pattern that solves these problems: the Shift Left Architecture enables a data mesh with real-time data products, unifying transactional and analytical workloads with Apache Kafka, Flink, and Iceberg.
Consistent information is handled with stream processing or ingested into Snowflake, Databricks, Google BigQuery, or any other analytics or AI platform. This increases flexibility, reduces cost, and enables a data-driven company culture with faster time to market for innovative software applications.
Table of Contents:
00:26 - Data Products Business Value
02:14 - Batch ETL and ELT in the Lakehouse
04:36 - Shift Left Architecture
07:24 - Shift Left with Kafka, Flink and Iceberg
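The core shift-left idea covered in the video (validate and curate data once, close to the source, then serve the same real-time data product to every consumer) can be illustrated with a minimal, self-contained Python sketch. All names here (`RawEvent`, `curate`, `shift_left_pipeline`, the two sinks) are hypothetical stand-ins for illustration, not part of any Kafka, Flink, or Iceberg API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RawEvent:
    """A raw event as emitted by an operational system (hypothetical)."""
    order_id: str
    amount: str  # arrives as a string and needs cleansing

def curate(event: RawEvent) -> Optional[dict]:
    """Shift-left step: validate and normalize ONCE, at ingestion time,
    so no downstream consumer has to re-clean the raw data."""
    try:
        amount = float(event.amount)
    except ValueError:
        return None  # would go to a dead-letter queue in a real pipeline
    if amount < 0:
        return None  # reject invalid business data early
    return {"order_id": event.order_id, "amount": amount}

def shift_left_pipeline(stream, consumers):
    """Fan each curated record out to every consumer, operational and
    analytical alike, from one shared real-time data product."""
    for raw in stream:
        record = curate(raw)
        if record is None:
            continue
        for consume in consumers:
            consume(record)

# Two consumers sharing one data product: a real-time alerting sink and
# an analytical sink (stand-ins for a Kafka consumer and an Iceberg
# table ingest).
alerts, table = [], []
shift_left_pipeline(
    [RawEvent("o1", "10.5"), RawEvent("o2", "oops"), RawEvent("o3", "-3")],
    [lambda r: alerts.append(r) if r["amount"] > 5 else None,
     table.append],
)
```

In the batch ELT approach contrasted in the video, each consumer would instead re-run its own cleansing logic over raw data, which is where the inconsistency and duplicate compute come from.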
More details about the Shift Left Architecture:
[ Link ]