
Understanding the Fundamental Principles of Large Language Models (LLMs)

This course explains the foundational principles of large language models such as ChatGPT, with a focus on theory.

(4.0) 3 reviews

86 learners

Level Intermediate

Course period Unlimited

  • Instructor: arigaram

Tags: llm, LLM performance evaluation and tuning, chatgpt, generative AI, NLP, gpt, AI, ChatGPT, LLM

What you will gain after the course

  • Fundamental Principles of Large Language Models (LLMs)

  • LLM Production Process

🧭 Important Notes

This course is still in production. Please note that it may take a long time for the course to be fully completed (though updates will be added regularly). Please take this into account when making your purchase decision.

🧭 Change History

  • January 8, 2026

    • Previously, each lesson number used a chapter-section-subsection scheme that did not match the section numbers, which caused some confusion. To make the table of contents easier to follow, lesson numbers are now tied to section numbers (e.g., lesson 1-1 is the first lesson of the first section). However, please understand that it may take considerable time to update the slide numbers in each lesson and the lesson numbers in each attached file.

  • December 10, 2025

    • I've added beginner, intermediate, and advanced sections covering the topic "Complete Guide to Tokenization for LLMs."

  • September 27, 2025

    • "Section 17. 'Understanding the Complete LLM Development Process' Advanced", "

      We have significantly expanded and reorganized the lesson outline for "Section 18. 'Understanding the Complete LLM Creation Process' Hands-on (Python + Google Colab)". We are preparing lecture content aligned with the new outline.

  • September 18, 2025

    • I added precautions to the detailed introduction page.

    • The table of contents for "Section 10, 'Transformer Architecture' Practice" has been revised. We are preparing lecture content aligned with the new table of contents.

    • The table of contents for "Section 16, Understanding the Complete LLM Development Process" has been revised. Accordingly, the existing lectures have been deleted, and new lecture content aligned with the updated table of contents is being prepared.

  • September 1, 2025

    • All lessons have been categorized with prefixes: [Basic], [Advanced], and [Practice]. Existing [Supplementary] lessons correspond to [Advanced] lessons, so they have been labeled with the '[Advanced]' prefix.

    • To reduce confusion and make the learning process easier to understand, all sections have been divided into general sections (sections containing [Basic] or [Advanced] lessons), advanced sections (sections containing only [Advanced] lessons), and practice sections (sections containing only [Practice] lessons).

    • Having reduced the potential for confusion in this way, we have made all of the lessons that were set to private on August 22, 2025 public again.


  • August 31, 2025

    • The practice lesson tables of contents for Sections 1 through 10 have been made public. The content will be released gradually over time.

    • The tables of contents for the [Supplementary] and [Advanced] lessons in Sections 1 through 10 have also been made public again, to help students see how they connect to the practice lessons.


  • August 22, 2025

    • I have set the not-yet-completed [Advanced] and [Supplementary] lessons to private. They will be made public section by section as they are completed. This measure is intended to reduce confusion for students, and I appreciate your understanding.

  • August 17, 2025

    • We are currently adding advanced course lessons and splitting longer lectures. Therefore, the section numbers in the course materials may differ from the section numbers shown in the table of contents.


🧠 Understanding the Fundamental Principles of Large Language Models (LLM): From Practical Applications of Generative AI to Cutting-Edge Research Trends

A foundational course for becoming a full-stack practical AI expert to understand and apply the latest LLMs such as GPT, Claude, and LLaMA

👥 Recommended for

  • Engineers/Data Scientists who want to develop and deploy AI models

  • Startup/corporate professionals planning new services based on generative AI

  • Policy planners and legal professionals considering AI ethics and legal risks

  • Researchers and master's/doctoral students who want to stay updated on the latest AI trends

  • Developers who want to learn prompt engineering and LangChain

  • Anyone interested in LLM, NLP, GPT, ChatGPT, generative artificial intelligence (AI), etc.

🔥 Course Features

  • "Today's learning becomes tomorrow's competitive edge! The most practical course to build AI expertise that will shine even 10 years from now."

  • "Worth more than 100,000 won? No. It's an investment in AI capabilities that will protect your career even 10 years from now."


  • "No more superficial knowledge! Through bonus lectures, you can learn the depths of LLM technology."

  • "This is different from other courses. It covers everything from the latest research trends to future AI."

  • "Grow as an AI expert while developing responsible AI capabilities! Learn ethics, regulations, and safety all at once."

🧑‍💻 Teaching Method

  • I take notes on the key content and explain it with a theory-focused approach.

  • [Added September 1, 2025] However, we have added practical exercises using Python code to aid understanding.
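
As a concrete illustration of the kind of Python exercise mentioned above, the short sketch below loads a small pre-trained language model and lets it continue a prompt. It assumes the Hugging Face transformers library (with PyTorch) and the public gpt2 checkpoint; the course itself does not prescribe these tools.

    # A minimal sketch of a practice-style exercise: load a small pre-trained
    # causal language model and generate a continuation of a prompt.
    # Assumptions (not from the course): Hugging Face transformers + PyTorch
    # and the public "gpt2" checkpoint.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    inputs = tokenizer("Large language models are", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True,
                             top_p=0.9, pad_token_id=tokenizer.eos_token_id)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))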

[Screenshot] A scene explaining how to select an appropriate LLM

[Screenshot] A scene explaining RLHF (Reinforcement Learning from Human Feedback) in detail

[Screenshot] A scene explaining neural network quantization methods

After completing the course

  • Based on a deep understanding of the definition and characteristics of generative AI and the principles of language models, you will be able to clearly explain the fundamentals of the technology.

  • You will be able to understand the entire LLM creation process, from data collection to preprocessing, model selection, training, evaluation, and maintenance.

  • You will be able to understand the theoretical process of creating language models that can solve specific problems using pre-training, transfer learning, fine-tuning, and RLHF (Reinforcement Learning from Human Feedback) techniques.
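
As a rough, illustrative sketch of the stages listed above (data collection and preprocessing, model selection, training, evaluation), the skeleton below fine-tunes a small pre-trained model on a toy corpus. It assumes the Hugging Face transformers and datasets libraries and the public gpt2 checkpoint, none of which are prescribed by the course; RLHF would follow as a separate alignment stage.

    # Illustrative sketch of the LLM creation stages described above.
    # Assumptions (not from the course): Hugging Face transformers + datasets,
    # the public "gpt2" checkpoint, and a toy two-sentence corpus.
    from datasets import Dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)

    # 1) Data collection / preprocessing: tokenize a (toy) text corpus.
    corpus = ["Large language models predict the next token.",
              "Fine-tuning adapts a pre-trained model to a narrower task."]
    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 defines no pad token
    dataset = Dataset.from_dict({"text": corpus}).map(
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=64),
        batched=True, remove_columns=["text"])

    # 2) Model selection: start from a pre-trained checkpoint (transfer learning).
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    # 3) Training: supervised fine-tuning. RLHF would be a separate later stage.
    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="toy-finetune", num_train_epochs=1,
                               per_device_train_batch_size=2, report_to=[]),
        train_dataset=dataset,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False))
    trainer.train()

    # 4) Evaluation / maintenance: in practice, track loss, perplexity, and model
    #    behaviour over time; omitted in this toy sketch.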


Notes Before Taking the Course

Practice Environment

  • Since this is a theory-focused lecture, no separate practice environment is required.

  • [Added content] However, if you want to try the added practice lessons on your own, you can use Google Colab. Colab is free to use right away with a Google account (although a few of the exercises may require the higher-performance runtimes available only on paid plans).
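
If you do use Colab, the short check below (a minimal sketch assuming PyTorch, which Colab preinstalls; it is not a required step in this course) shows whether your notebook was given a GPU runtime.

    # Minimal Colab runtime check (assumes PyTorch, which Colab preinstalls;
    # not a required step in this course).
    import torch

    if torch.cuda.is_available():
        print("GPU runtime:", torch.cuda.get_device_name(0))
    else:
        print("CPU-only runtime; switch it under Runtime > Change runtime type.")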


Learning Materials

  • The lecture materials are attached in PDF file format.

Prerequisites and Important Notes

  • Having background knowledge in natural language processing, artificial intelligence, deep learning, and reinforcement learning will help you better understand the content.

  • [Added content] If you want to try the added practice lessons on your own, it will be very helpful to be familiar with Python and with machine learning/deep learning programming.

🧭 Now is the time to start

In the era of LLM-centered artificial intelligence, properly understanding LLMs and applying them in practice is an essential competency for next-generation AI experts.
This course goes beyond simple knowledge transfer and provides the in-depth knowledge needed to truly handle and build LLMs.


Who is this course right for?

  • People who want to learn the principles of large language models with a theory-focused approach

  • People who want to understand the LLM creation process

What you need to know before starting

  • Deep Learning

  • Reinforcement Learning

  • Natural Language Processing

Hello, this is arigaram.

611 learners ∙ 31 reviews ∙ 2 answers ∙ 4.5 rating ∙ 18 courses
IT is both my hobby and my profession.

I have a broad range of experience in writing, translation, consulting, development, and teaching.

Curriculum

233 lectures ∙ 50hr 5min

Course materials: lecture resources

Reviews

3 reviews ∙ average rating 4.0

  • wj08286955 (1 review, average rating 5.0): rated 5, 60% enrolled

    • arigaram (Instructor): Thank you.

  • khkwon (3 reviews, average rating 4.7): rated 5, 61% enrolled

  • dbdusgur95 (1 review, average rating 2.0): rated 2, 100% enrolled, edited

Price: $77.00
