๊ฐ•์˜

๋ฉ˜ํ† ๋ง

์ปค๋ฎค๋‹ˆํ‹ฐ

BEST
AI Technology

/

Deep Learning & Machine Learning

<From Scratch: Building and Learning LLMs> Commentary Lecture

This course covers the GitHub notebooks and bonus content from <Build a Large Language Model from Scratch> (Gilbut, 2025). GitHub: https://github.com/rickiepark/llm-from-scratch/ <Build a Large Language Model from Scratch> is the Korean translation of Sebastian Raschka's bestseller <Build a Large Language Model (From Scratch)> (Manning, 2024). The book offers a way to learn the operating principles of large language models, and to put them to use, by building a complete model from scratch based on OpenAI's GPT-2.

Rating: 4.8 (21 reviews)

527 learners

Level: Basic

Course period: Unlimited

  • haesunpark

Tags: PyTorch, gpt-2, transformer, LLM, Fine-Tuning

Announcing new lecture uploads for the From Scratch LLM course.

Hello, this is Haesun Park.

Finally, the lectures covering the main content and code from chapters 1 to 7 are complete! 😀 The <Build a Large Language Model (From Scratch)> course includes concept summaries (YouTube videos) and source code walkthroughs, covering everything up to the final instruction fine-tuning.

But that's not the end! The original book's GitHub repository is packed with bonus content. Starting with the book's appendix, I'll be creating and uploading lectures that cover these hidden-gem bonus materials.

The first is Appendix A, Introduction to PyTorch. Many readers have said they were disappointed that there were no video lectures covering PyTorch, so I'm glad to have this opportunity to cover it. The lecture is scheduled to be uploaded starting in the fourth week of November. Please look forward to it! 😄

Thank you!

P.S.: The price of this course will increase soon, so don't be surprised if you see a higher price. Of course, existing students are completely unaffected by the change and will continue to have access to all future additional content, just as they do now.
