The Complete Hands-On Introduction to Apache Airflow Udemy Free Download
What you'll learn:
- Create plugins to add functionality to Apache Airflow.
- Use Docker with Airflow and different executors.
- Master core concepts such as DAGs, Operators, Tasks, and Workflows.
- Understand and apply advanced concepts of Apache Airflow such as XComs, Branching, and SubDAGs (a short sketch illustrating XComs and branching follows this list).
- Understand the differences between the Sequential, Local, and Celery Executors, how they work, and when to use each.
- Use Apache Airflow in a Big Data ecosystem with Hive, PostgreSQL, Elasticsearch, and more.
- Install and configure Apache Airflow.
- Think through real data processing problems and implement solutions with Airflow.
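To give a feel for the XCom and branching concepts listed above, here is a minimal sketch. It is not taken from the course: the DAG id, task names, and values are made up, and the import paths assume the classic Airflow 2.x API. One task pushes a value to XCom, and a BranchPythonOperator pulls it back and returns the task_id of the branch to run; the other branch is skipped.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import BranchPythonOperator, PythonOperator


    def _extract(ti):
        # Push a small value to XCom so downstream tasks can read it.
        ti.xcom_push(key="row_count", value=42)


    def _choose_path(ti):
        # Pull the value back and return the task_id of the branch to follow.
        row_count = ti.xcom_pull(task_ids="extract", key="row_count")
        return "process" if row_count > 0 else "skip"


    with DAG(
        dag_id="xcom_branching_sketch",
        start_date=datetime(2024, 1, 1),
        schedule_interval=None,  # no schedule; trigger the DAG manually
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=_extract)
        choose_path = BranchPythonOperator(task_id="choose_path", python_callable=_choose_path)
        process = BashOperator(task_id="process", bash_command="echo processing")
        skip = BashOperator(task_id="skip", bash_command="echo nothing to do")

        extract >> choose_path >> [process, skip]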
Requirements:
- VirtualBox must be installed; a 3 GB VM will have to be downloaded.
- At least 8 GB of RAM.
- Some prior programming or scripting experience. Python experience will help you a lot, but since Python is an easy language to learn, you should be fine even if you are not familiar with it.
Description:
Apache Airflow is an open-source platform to programmatically author, schedule, and monitor workflows. If you have many ETL pipelines to manage, Airflow is a must-have.
In this course you are going to learn everything you need to start using Apache Airflow, through theory and practical videos. Starting from very basic notions such as what Airflow is and how it works, we will dive into advanced concepts such as how to create plugins and build truly dynamic pipelines.
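To make "programmatically author" and "dynamic pipelines" concrete, here is a hedged sketch (again not from the course; the table names and commands are invented, and the style shown is the classic Airflow 2.x API). A plain Python loop generates one task per table, so adding an entry to the list adds a task to the DAG without writing any new task code.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="dynamic_pipeline_sketch",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",  # run once per day
        catchup=False,               # do not backfill past runs
    ) as dag:
        previous = None
        for table in ["users", "orders", "payments"]:  # hypothetical tables
            load = BashOperator(
                task_id=f"load_{table}",
                bash_command=f"echo loading {table}",
            )
            if previous:
                previous >> load  # chain the loads sequentially
            previous = load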
Who this course is for:
- People curious about data engineering.
- People who want to learn basic and advanced concepts of Apache Airflow.
- People who like a hands-on approach.
Course Details:
- 5.5 hours on-demand video
- 15 articles
- 5 downloadable resources
- Full lifetime access
- Access on mobile and TV
- Certificate of completion