Designing Large-Scale Data Processing Patterns Through Data Workflow Management, with Toss Developers
Learn how to build data pipelines with Apache Airflow, from the basics through practical application. Understand Airflow's core concepts and architecture, and master advanced design patterns used frequently in practice, such as dynamic DAGs, parallel processing, distributed processing, and custom Operators, through hands-on exercises. Set up a practice environment with Python and Docker, and build the practical skills needed to design and operate real workflows.
Rating 4.8 (15 reviews)
172 learners
Level Basic
Course period Unlimited
Big Data
Docker
docker-compose
airflow
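
As a taste of the hands-on exercises described above, the sketch below shows a minimal Airflow DAG with two dependent PythonOperator tasks. The DAG id, schedule, and task callables are illustrative placeholders, and Airflow 2.4 or later is assumed for the schedule argument.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step; a real pipeline would read from a source system.
    # The returned value is pushed to XCom automatically.
    return {"rows": 42}


def load(ti):
    # Pull the upstream result from XCom and "load" it (here, just log it).
    data = ti.xcom_pull(task_ids="extract")
    print(f"Loaded {data['rows']} rows")


with DAG(
    dag_id="example_etl",            # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; older 2.x uses schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # extract must finish before load runs
    extract_task >> load_task
```

The course builds on pipelines of this shape, extending them with dynamic DAG generation, parallel and distributed execution, and custom Operators.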





