Azure Databricks Delta Live Tables is a framework for building reliable, maintainable, and testable data processing pipelines. You define the transformations to perform on your data, and Delta Live Tables manages task orchestration, cluster management, monitoring, data quality, and error handling for you. Instead of defining your data pipelines as a series of separate Apache Spark tasks, Delta Live Tables manages how your data is transformed based on a target schema you define for each processing step. You can also enforce data quality with a feature called "expectations": expectations let you define the data quality you expect and specify how to handle records that fail those expectations. In this session, we will learn how to develop a data processing pipeline with Azure Databricks Delta Live Tables.

Who is the target attendee? Data architects, ETL developers, and consultants.

Why would that person want to attend your session?
- To learn the modern way to develop data processing pipelines.

What can the attendee walk away with?
- What is an Azure Databricks Delta Live Table?
- Delta Live Tables concepts.
- How to develop data processing pipelines with Delta Live Tables.
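To give a feel for the expectations idea described above, here is a minimal, framework-free Python sketch. It deliberately does not use the real Delta Live Tables `dlt` module (which is only available inside a Databricks pipeline, where you attach rules with decorators such as `@dlt.expect_or_drop`); the `Expectation` class, rule names, and sample records are illustrative assumptions modeled on DLT's documented warn / drop / fail policies.

```python
# Stand-alone sketch of Delta Live Tables "expectations" semantics on plain
# Python dicts. In real DLT, rules are SQL predicates attached to a table
# function via decorators; this mimics only the three violation policies.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Expectation:
    name: str
    predicate: Callable[[dict], bool]  # True means the record passes
    on_violation: str = "warn"         # "warn" | "drop" | "fail"

@dataclass
class ExpectationResult:
    kept: List[dict] = field(default_factory=list)
    metrics: Dict[str, int] = field(default_factory=dict)  # per-rule failures

def apply_expectations(records: List[dict],
                       expectations: List[Expectation]) -> ExpectationResult:
    """Apply each expectation to every record, mimicking DLT policies:
    warn -> keep failing records but count them in metrics;
    drop -> remove failing records from the output;
    fail -> raise, aborting the (sketch) pipeline update."""
    result = ExpectationResult()
    for rec in records:
        keep = True
        for exp in expectations:
            if exp.predicate(rec):
                continue
            result.metrics[exp.name] = result.metrics.get(exp.name, 0) + 1
            if exp.on_violation == "fail":
                raise ValueError(f"Expectation {exp.name!r} failed: {rec}")
            if exp.on_violation == "drop":
                keep = False
        if keep:
            result.kept.append(rec)
    return result

# Hypothetical sample data and rules, for illustration only.
orders = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": None, "amount": 35.0},   # fails valid_order_id -> dropped
    {"order_id": 3, "amount": -5.0},      # fails positive_amount -> kept, counted
]
rules = [
    Expectation("valid_order_id", lambda r: r["order_id"] is not None, "drop"),
    Expectation("positive_amount", lambda r: r["amount"] > 0, "warn"),
]
res = apply_expectations(orders, rules)
print(len(res.kept), res.metrics)  # 2 {'valid_order_id': 1, 'positive_amount': 1}
```

The session will show the equivalent rules written as real DLT expectations inside a Databricks pipeline, where the framework also records these metrics in the pipeline event log.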
Rajaniesh Kaushikk is a TOGAF-certified Enterprise Architect with 20 years of extensive experience successfully delivering complex software application architectures and solutions for Fortune 500 companies. He is an expert in digital transformation and in architecting and developing modern applications with Azure, IoT, Edge, microservices, and bots. Rajaniesh is a mentor, a speaker, and the man behind http://rajanieshkaushikk.com. He is a Microsoft Certified Trainer and a Microsoft Certified Architect Expert. He loves writing technical blogs, contributes to the community through various forums, and has won several recognitions. Rajaniesh can be reached via his blog and YouTube channel: https://rajanieshkaushikk.com https://www.youtube.com/channel/UCyRuVBsflAelmCSzsiHr99Q
We seek to provide a respectful, friendly, professional experience for everyone, regardless of gender, sexual orientation, physical appearance, disability, age, race or religion. We do not tolerate any behavior that is harassing or degrading to any individual, in any form. The Code of Conduct will be enforced.
All live stream organizers using the Global Azure brand and Global Azure speakers are responsible for knowing and abiding by these standards. Each speaker who wishes to submit through our Call for Presentations needs to read and accept the Code of Conduct. We encourage every organizer and attendee to assist in creating a welcoming and safe environment. Live stream organizers are required to inform and enforce the Code of Conduct if they accept community content to their stream.
If you are being harassed, notice that someone else is being harassed, or have any other concerns, report it. Please report any concerns, suspicious or disruptive activity or behavior directly to any of the live stream organizers, or directly to the Global Azure admins at firstname.lastname@example.org. All reports to the Global admin team will remain confidential.
We expect local organizers to set up and enforce a Code of Conduct for all Global Azure live streams.
A good template can be found at https://confcodeofconduct.com/, including internationalized versions at https://github.com/confcodeofconduct/confcodeofconduct.com. An excellent version of a Code of Conduct, not a template, is built by the DDD Europe conference at https://dddeurope.com/2020/coc/.