
Ansible with a Silicon Valley Engineer
미쿡엔지니어 (American Engineer)
A practical Ansible course taught by a Silicon Valley engineer. Ansible is a powerful tool for automated infrastructure management and provisioning, helping you save time and effort while building consistent environments.
Beginner
Ansible, DevOps, infrastructure
You will learn Apache Airflow, the most widely used orchestrator for building data pipelines.
551 learners
Build a data pipeline
Apache Airflow
Introduction to Apache Spark
Introducing MinIO
TaskFlow API
Slack Integration
Airflow, an essential tool for data pipelines
Learn from Silicon Valley developers! 🔥
Learn the simple setup and usage of Airflow. We'll guide you through creating your first data workflow.
As data analysis and processing tasks become more complex, issues such as job scheduling, dependency management, and error handling become more important. To address these challenges effectively, Airflow has become one of the core tools of choice for many organizations.
This course is designed for people who are new to Airflow. I will help you get started with Apache Airflow, a tool widely used in Silicon Valley, in a simple and approachable way.
Automated workflow management
Airflow provides powerful scheduling capabilities to automatically run and manage tasks over time. You can plan and execute data processing tasks more efficiently.
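For instance, a minimal sketch of a daily-scheduled DAG might look like the following. The DAG id, task, and command are illustrative, not taken from the course, and the `schedule` argument is called `schedule_interval` in Airflow versions before 2.4:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A hypothetical DAG that the Airflow scheduler triggers once per day.
with DAG(
    dag_id="daily_report",            # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # run once a day
    catchup=False,                    # do not backfill missed past runs
) as dag:
    generate_report = BashOperator(
        task_id="generate_report",
        bash_command="echo 'building the daily report'",
    )
```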
Dependency Management
Complex data workflows require precise management of dependencies between tasks. Airflow provides the ability to clearly define dependencies between tasks and specify the order of tasks.
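As a rough sketch (task names are illustrative), dependencies are typically expressed with the `>>` operator so that a task runs only after its upstream tasks succeed:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator  # DummyOperator in Airflow < 2.3

with DAG(dag_id="etl_ordering", start_date=datetime(2024, 1, 1), schedule=None) as dag:
    extract = EmptyOperator(task_id="extract")
    transform = EmptyOperator(task_id="transform")
    load = EmptyOperator(task_id="load")

    # transform waits for extract; load runs only after transform succeeds
    extract >> transform >> load
```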
Powerful monitoring and notifications
You can monitor the progress of running jobs through the Airflow dashboard, and set up alerts so you can respond quickly when a job fails or runs into issues.
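One common pattern is a failure callback in the DAG's default arguments. The sketch below only logs the failed task id; in practice the callback could post to Slack or send an email (all names here are illustrative):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator


def notify_failure(context):
    # Illustrative: a real callback would typically post to Slack or send email.
    print(f"Task failed: {context['task_instance'].task_id}")


with DAG(
    dag_id="monitored_job",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
    default_args={"on_failure_callback": notify_failure},
) as dag:
    flaky_step = BashOperator(task_id="flaky_step", bash_command="exit 1")
```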
Scalability and flexibility
Airflow supports a wide range of plugins and provider packages, so it can integrate with various data stores, job execution environments, and notification mechanisms, letting you build custom workflows to fit your needs.
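For example, installing the apache-airflow-providers-apache-spark provider package adds a SparkSubmitOperator. The sketch below assumes a Spark connection named `spark_default` and a hypothetical PySpark script path:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(dag_id="spark_integration", start_date=datetime(2024, 1, 1), schedule=None) as dag:
    run_spark_job = SparkSubmitOperator(
        task_id="run_spark_job",
        application="/opt/jobs/aggregate.py",  # hypothetical PySpark script
        conn_id="spark_default",               # assumed Spark connection id
    )
```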
Community and Ecosystem
Airflow has a vibrant community and rich ecosystem, so there are a lot of great resources to help you troubleshoot.
✅ Easy to understand without difficult concepts! Explains what Airflow is and why it is needed through analogies and examples.
✅ A hands-on course that walks through real Airflow usage and builds a simple data workflow!
✅ If you have any questions or anything is unclear during the course, feel free to ask at any time. Learn through Q&A!
I will share the know-how of a working Silicon Valley engineer!
I am a software engineer who runs the YouTube channel "American Engineer" and the Brunch blog "Silicon Valley News and Life." I graduated from the University of California, Berkeley in EECS and currently work on big data at the headquarters of a global big tech company in Silicon Valley. I would like to share the know-how I have gained from my day-to-day work with as many people as possible. 🙂
💡
Data Engineer
Automate and schedule data workflows to maintain data quality and consistency.
💡
Data Analyst
Handle tasks such as regular data updates or model retraining.
💡
Data Scientist
Efficiently manage your data science process by automating model training, evaluation, batch predictions, and more.
💡
System Administrator
Increase the transparency and reliability of job execution.
💡
Data Engineering and Development Team
You can implement various automated tasks such as ETL (Extract, Transform, Load) jobs and API calls (see the TaskFlow sketch after this list).
💡
Project Manager
You can effectively adjust your project schedule by setting task dependencies, priorities, expected execution times, etc.
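As referenced above, here is a minimal sketch of an ETL-style pipeline written with the TaskFlow API covered in the curriculum; the data and transformation are purely illustrative:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(start_date=datetime(2024, 1, 1), schedule="@daily", catchup=False)
def simple_etl():
    @task
    def extract():
        return [1, 2, 3]                  # pretend this came from an API call

    @task
    def transform(values):
        return [v * 10 for v in values]   # a trivial transformation step

    @task
    def load(values):
        print(f"loading {values}")        # pretend this writes to a data store

    load(transform(extract()))


simple_etl()
```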
Q. Why should I learn Apache Airflow?
Apache Airflow is a data workflow management tool used to automate, schedule, and monitor data pipelines. It can efficiently manage data workflows for various roles such as data engineers, data scientists, and system administrators.
In the latter half of the lecture, you will learn how to integrate with big data technology (Apache Spark), which will be of great help to data engineers who manage many pipelines.
Q. Can I take this course without a computer science background?
If you don't come from a CS background but know the basics of Python and want to make your data or task-scheduling workflows more efficient, this course will be a great help.
If you are new to Python, learn the basics of Python through YouTube or take the lecture below first! Even if you only watch the basics, you will have no trouble following the entire lecture.
Q. Is there anything I need to prepare before attending the lecture?
Since the code is written in Python, basic Python itself is not covered. There is also hands-on practice using Docker, so it will be easier to follow if you have basic knowledge of Docker.
Who is this course right for?
Data Engineer
If you want to become a data engineer
Need to know before starting?
Python
10,547 Learners ∙ 716 Reviews ∙ 307 Answers ∙ 4.8 Rating ∙ 25 Courses
Going to stop at Korea? Break into the global market with English! 🌍🚀
Hello. I majored in 💻 computer science (EECS) at UC Berkeley, have worked as a software engineer in Silicon Valley for over 15 years, and am currently a Staff Software Engineer working on big data and DevOps at the headquarters of a Silicon Valley big tech company.
🧭 Through online courses, I now want to share with you the skills and know-how I have learned firsthand at the forefront of innovation in Silicon Valley.
🚀 Join me, someone who has learned and grown on the front lines of technological innovation, and build the skills to compete on the global stage!
🫡 I'm not the smartest person, but I want to tell you that you can achieve anything if you keep at it and never give up. I will always be by your side with good materials to help.
32 lectures ∙ (4hr 8min)
43 reviews
$51.70