Join us for Module 3: SQL and the Transaction Log - Tuesday, June 14

- Delta Lake SQL
- Time Travel
- Transaction Log Fundamentals

This 3-part workshop teaches you what Delta Lake is and how to use Apache Spark™ and Delta Lake in your data architectures to build reliable, large-scale distributed data pipelines. The course covers the features of Delta Lake that, together with Spark SQL and Spark Structured Streaming, bring ACID transactions and time travel (data versioning) to your batch and streaming ETL workloads. Slides, demos, exercises, and Q&A sessions will help you understand the concepts behind the modern data lakehouse architecture.

Requirements

- Sign up for Databricks Community Edition
- Participants should have prior experience with Apache Spark SQL and Python (PySpark)
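To give a flavor of what the "Delta Lake SQL" and "Time Travel" topics look like in practice, here is a minimal sketch using Delta Lake's documented SQL syntax. The table name `events` and the version/timestamp values are hypothetical placeholders, not part of the workshop materials:

```sql
-- Create a Delta table and append some rows (table name is illustrative)
CREATE TABLE events (id INT, action STRING) USING DELTA;
INSERT INTO events VALUES (1, 'click'), (2, 'view');

-- Every commit is recorded in the transaction log; inspect it:
DESCRIBE HISTORY events;

-- Time travel: query an earlier snapshot by version number...
SELECT * FROM events VERSION AS OF 0;

-- ...or by timestamp (value shown is a placeholder)
SELECT * FROM events TIMESTAMP AS OF '2022-06-14 00:00:00';
```

The `DESCRIBE HISTORY`, `VERSION AS OF`, and `TIMESTAMP AS OF` constructs shown here are standard Delta Lake SQL; the workshop demos walk through them (and the underlying `_delta_log` JSON commit files) in detail.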