Designing Large-Scale Data Processing Patterns Based on Data Workflow Management with Toss Developers
Learn the process of building data pipelines using Apache Airflow from basics to practical application. Understand Airflow's core concepts and architecture, and master advanced design patterns frequently used in practice such as dynamic DAGs, parallel processing, distributed processing, and Custom Operators through hands-on exercises. Set up a practice environment with Python and Docker, and develop practical skills to design and operate real workflows.
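One of the patterns listed above, Custom Operators, can be sketched briefly. This is a hedged illustration, not the course's actual code: `GreetOperator` and its `name` parameter are invented for the example, and `BaseOperator` is a minimal stand-in so the snippet runs without Airflow installed (with Airflow you would use `from airflow.models import BaseOperator` instead).

```python
class BaseOperator:
    """Minimal stand-in for airflow.models.BaseOperator (assumption:
    real Airflow is not installed in this environment)."""
    def __init__(self, task_id, **kwargs):
        self.task_id = task_id


class GreetOperator(BaseOperator):
    """A custom operator puts all of its work in execute(), which
    Airflow calls when the task instance actually runs."""
    def __init__(self, name, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    def execute(self, context):
        message = f"Hello, {self.name}!"
        print(message)
        return message  # real Airflow pushes this return value to XCom


# Usage: instantiate like any other task and (here) call execute() directly.
task = GreetOperator(task_id="greet", name="Airflow")
result = task.execute(context={})  # → "Hello, Airflow!"
```

The design point is that the operator's constructor only stores configuration; side effects belong in `execute()`, so the scheduler can build the DAG cheaply without running any task logic.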
188 learners
Level: Basic
Course period: Unlimited

- Resolved
Since the root.py file used in the lecture isn't visible, I'm sharing the text I put together while doing the exercises
dynamicDag: from datetime import datetime, timedelta from airflow import DAG fr
Tags: Big Data · docker · docker-compose · airflow
dellahong ・ 3 months ago
0 · 101 · 7
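The truncated preview in the thread above begins a dynamic DAG definition. As a hedged sketch of that pattern (the loop over source names is an assumption, and `DAG` is a minimal stand-in so the snippet runs without Airflow; with Airflow installed you would use `from airflow import DAG`): one Python file registers several DAG objects by assigning them into `globals()`, since Airflow's DagBag picks up any module-level DAG it finds.

```python
from datetime import datetime, timedelta


class DAG:
    """Minimal stand-in for airflow.DAG (assumption: Airflow is not
    installed here; the registration pattern is what matters)."""
    def __init__(self, dag_id, schedule=None, start_date=None, default_args=None):
        self.dag_id = dag_id
        self.schedule = schedule


default_args = {"retries": 1, "retry_delay": timedelta(minutes=5)}

# One DAG per data source: register each under its dag_id at module
# level so the scheduler discovers all of them from this single file.
for source in ["orders", "users", "payments"]:
    dag_id = f"ingest_{source}"
    globals()[dag_id] = DAG(
        dag_id=dag_id,
        schedule="@daily",
        start_date=datetime(2024, 1, 1),
        default_args=default_args,
    )

dag_ids = sorted(d for d in list(globals()) if d.startswith("ingest_"))
print(dag_ids)  # → ['ingest_orders', 'ingest_payments', 'ingest_users']
```

The trade-off of this pattern is that the DAG list is only as fresh as the file's last parse; changing the source list changes which DAGs exist on the next scheduler scan.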
- Resolved
I have a question about the practice environment!
Hello, while listening to the Section 6 lectures a question came up, so I'm writing this post. First of all, I'm really enjoying the course
Tags: Big Data · docker · docker-compose · airflow
jomseni0393 ・ 5 months ago
0 · 58 · 2

