Shin Kyung-sik's Deep Learning - Gradients and PyTorch's Autograd
This is a course where you learn the basic calculus and PyTorch's Autograd functionality needed to get started with deep learning.
65 learners
Level Basic
Course period Unlimited
News
4 articles
Hello students 😃
From basic to professional level, systematically covering deep learning - [Shin Kyung-sik's Deep Learning] fourth lecture [Gradient-based Linear Regression (2)] is now open!
(Course Link: https://inf.run/ymv1P)
This lecture teaches you how to take the code implemented from scratch in [Gradient-based Linear Regression (1)] and rewrite it as practical code using PyTorch's built-in features. It also covers the theory of data preprocessing, why it is necessary, and its impact on training, and then implements it in practical code.
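The lecture's actual code is not reproduced in this announcement, but a minimal sketch of linear regression using PyTorch's built-in features (nn.Linear, an SGD optimizer, and an MSE loss module; the synthetic data here is made up purely for illustration) might look like:

```python
import torch
import torch.nn as nn

# Illustrative synthetic data: y ≈ 2x + 1 (not the lecture's actual dataset)
x = torch.linspace(0, 1, 100).unsqueeze(1)
y = 2 * x + 1 + 0.01 * torch.randn(100, 1)

model = nn.Linear(1, 1)           # weight and bias managed by PyTorch
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for _ in range(1000):
    optimizer.zero_grad()         # clear gradients from the previous step
    loss = loss_fn(model(x), y)   # mean squared error on the whole dataset
    loss.backward()               # autograd computes parameter gradients
    optimizer.step()              # one gradient descent update

# model.weight and model.bias should approach 2 and 1
```

The point of the rewrite is that the hand-derived gradient and update rule from part (1) are replaced by `loss.backward()` and `optimizer.step()`.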
If you are following my All About AI curriculum, be sure to take advantage of the opening discount period😃
Additionally, the follow-up lecture [Gradient-based Linear Regression (3)] is scheduled to open sometime next week.
I will always do my best so that we can work together diligently to create a deep learning world!
Thank you.
Best regards, Shin Kyung-sik
Hello students 😃
From basic to professional level, systematically covering deep learning - [Shin Kyung-sik's Deep Learning] third lecture [Gradient-based Linear Regression (1)] is now open!
(Course Link: https://inf.run/KgQoQ)
This lecture covers the process of training the simplest model on data, based on the previously opened [Gradient Descent].
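The lecture's code is not shown here, but a from-scratch sketch of training the simplest model (a single-weight line) by gradient descent, with a hand-derived gradient and made-up data, might look like:

```python
import torch

# Illustrative data: y = 3x (not the lecture's actual example)
x = torch.linspace(0, 1, 50)
y = 3 * x

w = torch.tensor(0.0)   # single learnable weight
lr = 0.1                # learning rate

for _ in range(200):
    pred = w * x
    # d(MSE)/dw derived by hand: mean of 2 * (pred - y) * x
    grad = (2 * (pred - y) * x).mean()
    w = w - lr * grad   # gradient descent step

# w should approach the true slope 3
```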
We're currently offering an opening commemoration discount, so those following my curriculum should take advantage of this period to register for the course😃
Additionally, the follow-up lecture [Gradient-based Linear Regression (2)] is scheduled to open this week.
I will always do my best so that we can work together diligently to create a deep learning world!
Thank you.
Best regards, Shin Kyung-sik
Hello students 😃
From basic to professional level, systematically covering deep learning - [Shin Kyung-sik's Deep Learning] second lecture [Gradient Descent] is now open!
(Lecture Link: https://inf.run/bK5xe)
This lecture builds on the previously opened [Gradients and PyTorch's Autograd] and focuses intensively on gradient descent, the most fundamental learning algorithm in deep learning.
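As a one-glance reminder of the idea the lecture covers (this is not the lecture's code, just a minimal illustration), gradient descent repeatedly steps against the derivative to minimize a function, here f(x) = (x - 2)²:

```python
# Minimize f(x) = (x - 2)**2 with plain gradient descent.
x = 10.0                 # arbitrary starting point
lr = 0.1                 # learning rate (step size)

for _ in range(100):
    grad = 2 * (x - 2)   # analytic derivative f'(x)
    x -= lr * grad       # step in the direction that decreases f

# x converges toward the minimizer, 2
```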
We're currently offering an opening commemoration discount, so those following my curriculum should take advantage of this period to register for the course😃
Additionally, the next lecture [Gradient-based Linear Regression (1)] is scheduled to open this weekend.
I will always do my best so that we can work together diligently to create a deep learning world!
Thank you.
Best regards, Shin Kyung-sik
Hello, students!
Starting today, we're launching the full-scale deep learning course curriculum, and this post is here to introduce it😃
The [Shin's Deep Learning (ShinDL)] curriculum, which I create personally, covers deep learning systematically from the basics all the way up to lectures at the level of actual papers.
Its goal is to understand deep learning technology completely by implementing every technique yourself!
Additionally, since the field of deep learning is so broad, the course is structured as modularized lectures covering specific topics rather than large-volume lectures.
This newly opened course, [Gradients and PyTorch's Autograd], is the first lecture in the [Shin Kyung-sik's Deep Learning] curriculum.
It teaches 'differentiation', the essential mathematical foundation for properly understanding deep learning, together with the PyTorch framework's autograd functionality.
(Course Link: https://inf.run/wZoxE)
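To give a taste of the autograd functionality the course teaches (a minimal example, not the course's actual code), PyTorch can compute a derivative automatically and you can check it against the hand-derived one:

```python
import torch

# Autograd computes dy/dx for y = x**3 at x = 2.
# Hand check via basic calculus: dy/dx = 3 * x**2 = 12.
x = torch.tensor(2.0, requires_grad=True)
y = x ** 3
y.backward()     # reverse-mode autodiff populates x.grad
print(x.grad)    # tensor(12.)
```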
Going forward, we plan to cover not only basic deep learning techniques but also conduct practical projects based on deep learning papers.
If you are preparing to study deep learning, please follow the curriculum step by step starting from this lecture😃
Additionally, please note that the second deep learning lecture [Gradient Descent] is scheduled to open next week!
I will do my best to provide even better lectures in the future!
Thank you.
Best regards, Shin Kyung-sik