
Learning Transformer Through Implementation

From Multi-Head Attention to the Original Transformer model, BERT, and the Encoder-Decoder based MarianMT translation model, you'll learn Transformers inside and out by implementing them directly in code.

(5.0) 6 reviews ∙ 200 learners

Instructor: dooleyz3525

Tags: theory & practice, transformer, NLP, from-scratch implementation, Deep Learning (DL), PyTorch, encoder-decoder, bert

What you will gain after the course

  • Hands-on Implementation and Mastery of the Transformer's Self, Causal, and Cross Attention Mechanisms

  • Learn the Original Transformer Model Architecture by Implementing Positional Encoding, Feed Forward, Encoder, Decoder, and More

  • Tokenization and Embedding NLP Foundations, and RNN Models: Prerequisite Knowledge for Transformers

  • Implementing the BERT Model Directly and Training Sentence Classification with Your Implemented BERT

  • Directly Implementing the Encoder-Decoder Based MarianMT Translation Model

  • Understanding and Utilizing Hugging Face Dataset, Tokenizer, and DataCollator

  • Training the Encoder-Decoder MarianMT Model and Running Greedy vs. Beam Search Inference

  • Implementing Vision Transformer (ViT) from scratch and training an image classification model with custom data

Completely master Transformers with this one course!

In this course, you'll learn the Transformer, the core of today's AI, by implementing it directly yourself.

I've structured this course as a complete Transformer course that covers everything from Multi-Head Attention, the core mechanism of Transformers, to the Original Transformer model, BERT model, Encoder-Decoder MarianMT translation model, and Vision Transformer, all implemented and understood through code.

Features of this course

💡 Learn Transformers by Implementing Them from Scratch in Code

From Multi-Head Attention, the core mechanism of the Transformer, to the Original Transformer model, BERT, and the Encoder-Decoder translation model MarianMT, you will learn Transformers inside and out by implementing them directly in code.

💡 Step-by-step learning from NLP fundamentals to core Transformer models

To understand Transformers, it's important to first understand the fundamentals of NLP.

Starting from tokenization, embedding, and the RNN models that preceded the Transformer, the course progresses through Attention -> Transformer -> BERT -> MarianMT translation model -> Vision Transformer in one continuous flow, enabling step-by-step learning from solid NLP fundamentals to the core Transformer models.

💡 A Balance of Theory and Implementation

We don't just focus on implementation. We spent a great deal of time designing this course so that the core mechanisms of the Transformer are easy to understand and stick in your head. From accessible, detailed theory to actual code implementation, this course will dramatically improve your ability to apply Transformers.

💡 Presenting the Core NLP Problem-Solving Process

We cover specific issues encountered in real research and practice, such as embedding, padding masking, the various types of Attention, loss calculation for padded labels, and dynamic padding, and we walk through how to solve each of them.
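For a small taste, here is a minimal PyTorch sketch of padding masking and padded-label loss handling; the tensor values and pad id (0) below are illustrative, not the course's exact code.

```python
import torch
import torch.nn.functional as F

# Hypothetical mini-batch: two sequences padded to length 5 (pad id = 0).
input_ids = torch.tensor([[7, 2, 9, 0, 0],
                          [4, 8, 3, 6, 1]])

# Padding mask: True at real tokens, False at pad positions. Attention
# scores at masked key positions are set to -inf before softmax, so pad
# tokens receive zero attention weight.
pad_mask = input_ids.ne(0)                       # (batch, seq_len)
scores = torch.randn(2, 5, 5)                    # toy attention scores
scores = scores.masked_fill(~pad_mask[:, None, :], float("-inf"))
weights = F.softmax(scores, dim=-1)

# Loss on padded labels: mark pad positions with -100 so cross_entropy
# skips them (ignore_index=-100 is PyTorch's default).
logits = torch.randn(2, 5, 100)                  # (batch, seq_len, vocab)
labels = input_ids.masked_fill(~pad_mask, -100)
loss = F.cross_entropy(logits.view(-1, 100), labels.view(-1))
```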


💡 Utilizing Key Hugging Face Libraries

Hugging Face is an essential library for working with Transformers. In this course, we use Hugging Face's Tokenizer, Dataset, DataCollator, and related tools to handle the data processing needed for Transformer model training in an easy and convenient way: preprocessing, tokenization, dynamic padding, and transforming labels and decoder inputs. The course walks you through each of these steps in detail.
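As an illustration of dynamic padding, the sketch below tokenizes two sentences and lets DataCollatorWithPadding pad each batch only to the longest sequence in that batch; the checkpoint name is just an example, not necessarily the one used in the course.

```python
from transformers import AutoTokenizer, DataCollatorWithPadding

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

sentences = ["Transformers are powerful.", "Attention is all you need."]
features = [tokenizer(s) for s in sentences]     # variable-length encodings

# Pads to the longest sequence in this batch, not to a fixed max length.
collator = DataCollatorWithPadding(tokenizer=tokenizer)
batch = collator(features)
print(batch["input_ids"].shape, batch["attention_mask"].shape)
```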

You'll learn this kind of content

NLP Fundamentals and RNNs: Prerequisites for Transformers

We will summarize and explain the prerequisite knowledge needed to learn Transformers, including tokenization, embedding, RNN and Seq2Seq models, and the basics of Attention.
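For example, tokenization followed by an embedding lookup can be sketched in a few lines of PyTorch; the toy vocabulary and whitespace split below are purely illustrative, while real subword tokenizers are covered in the course.

```python
import torch
import torch.nn as nn

# Toy vocabulary and a naive whitespace "tokenizer", for illustration only.
vocab = {"<pad>": 0, "i": 1, "love": 2, "transformers": 3}
ids = torch.tensor([[vocab[t] for t in "i love transformers".split()]])

# An embedding table maps each token id to a dense, trainable vector.
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)
vectors = embedding(ids)                         # (1, 3, 8)
print(vectors.shape)
```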

Transformer Core Mechanisms and Key Modules

Through detailed theory and hands-on practice, you will clearly understand the core Attention mechanisms (Self Attention, Causal Attention, and Cross Attention) and the key Transformer modules, including Positional Encoding, Layer Normalization, Feed Forward, and the Encoder/Decoder layers.
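As a preview, here is a minimal single-head sketch of scaled dot-product attention with an optional causal mask; it is a simplified stand-in for the full Multi-Head Attention built in the course.

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, causal=False):
    """Minimal single-head sketch: softmax(QK^T / sqrt(d_k)) V."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)   # (..., L_q, L_k)
    if causal:
        # Causal Attention: position i may only attend to positions <= i.
        L_q, L_k = scores.shape[-2:]
        mask = torch.triu(torch.ones(L_q, L_k, dtype=torch.bool), diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

x = torch.randn(2, 6, 16)                    # (batch, seq_len, d_model)
out_self = scaled_dot_product_attention(x, x, x)            # Self Attention
out_causal = scaled_dot_product_attention(x, x, x, causal=True)
# Cross Attention simply takes q from the decoder and k, v from the encoder.
```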

Using Hugging Face Tokenizer, Dataset, DataCollator

We explain the features, advantages, and usage of Hugging Face's Dataset, Tokenizer, and DataCollator in detail. Through various hands-on exercises and examples, you will master how to combine these components into an effective data pipeline for Transformer NLP models.
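For instance, a tokenization pipeline over a Hugging Face Dataset can be sketched as below; the toy data and checkpoint name are illustrative.

```python
from datasets import Dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

# A toy in-memory dataset; the course works with real corpora.
raw = Dataset.from_dict({"text": ["hello world", "attention is all you need"],
                         "label": [0, 1]})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

# map(batched=True) tokenizes many rows per call and adds the new
# columns (input_ids, attention_mask, ...) to the dataset.
tokenized = raw.map(tokenize, batched=True)
print(tokenized.column_names)
```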

Implementation and Application of BERT Models

Learn BERT by directly implementing the key components of the BERT model. You will then use your implemented BERT, together with various Hugging Face features, to train and run inference for sentence classification.
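The core idea can be sketched as a classification head that pools the [CLS] hidden state from a BERT-style encoder and projects it to class logits, as in the original BERT paper; the encoder interface assumed below is hypothetical, standing in for the BERT you implement in the course.

```python
import torch.nn as nn

class SentenceClassifier(nn.Module):
    """Sketch: classification head on top of a BERT-style encoder.

    `encoder` is assumed to return hidden states of shape
    (batch, seq_len, hidden); the [CLS] vector at position 0 is
    pooled and projected to class logits.
    """
    def __init__(self, encoder, hidden_size=768, num_classes=2):
        super().__init__()
        self.encoder = encoder
        self.dropout = nn.Dropout(0.1)
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, input_ids, attention_mask=None):
        hidden = self.encoder(input_ids, attention_mask)  # (B, L, H)
        cls = hidden[:, 0]                                # [CLS] token
        return self.classifier(self.dropout(cls))
```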

Implementation and Application of Encoder-Decoder Based MarianMT Translation Model

You will directly implement MarianMT, an Encoder-Decoder based Korean-English translation model; learn the data preprocessing methods and techniques required to train Encoder-Decoder models; and implement and apply autoregressive Greedy Search and Beam Search.
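To preview the inference side, here is a minimal Greedy Search sketch; the model(src_ids, tgt_ids) signature is an assumption standing in for the implemented MarianMT model.

```python
import torch

@torch.no_grad()
def greedy_decode(model, src_ids, bos_id, eos_id, max_len=64):
    """Sketch of autoregressive Greedy Search for an encoder-decoder model.

    At each step the single highest-probability token is appended.
    Beam Search instead keeps the top-k partial hypotheses per step
    and returns the best finished one.
    """
    tgt = torch.full((src_ids.size(0), 1), bos_id, dtype=torch.long)
    for _ in range(max_len):
        logits = model(src_ids, tgt)                 # (B, T, vocab) assumed
        next_token = logits[:, -1].argmax(dim=-1, keepdim=True)
        tgt = torch.cat([tgt, next_token], dim=1)
        if (next_token == eos_id).all():
            break
    return tgt
```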

Implementation and Training of Vision Transformer Models

We will directly implement Vision Transformer (ViT), the model that established the Transformer as a rival to CNNs in the vision domain, and train it on a custom dataset. By implementing ViT's main modules yourself, you can easily understand its characteristics and mechanisms.
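As a preview of one ViT building block, the patch embedding can be sketched with a strided Conv2d, a common implementation trick; the sizes below follow the standard ViT-Base configuration.

```python
import torch
import torch.nn as nn

# Split the image into 16x16 patches and project each to d_model.
# A Conv2d with kernel_size = stride = patch_size does both in one op.
patch_size, d_model = 16, 768
patch_embed = nn.Conv2d(3, d_model, kernel_size=patch_size, stride=patch_size)

images = torch.randn(8, 3, 224, 224)             # (B, C, H, W)
patches = patch_embed(images)                    # (B, 768, 14, 14)
tokens = patches.flatten(2).transpose(1, 2)      # (B, 196, 768)
print(tokens.shape)  # 196 patch tokens, ready for a [CLS] token + encoder
```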

Pre-enrollment Reference Information

Practice Environment 💾

Hands-on practice uses the notebook kernels provided by Kaggle. After signing up for Kaggle and opening the Code menu, you can use a P100 GPU free of charge for 30 hours per week in a Jupyter Notebook environment similar to Colab.
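Before training, you can confirm the GPU accelerator is visible to PyTorch with a quick check in the notebook, for example:

```python
import torch

# Quick sanity check inside a Kaggle notebook: is a GPU (e.g. P100) visible?
print(torch.cuda.is_available())
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```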


A 160-page lecture textbook is also provided.

Who is this course right for?

  • Deep learning NLP beginners who want to solidify their foundation by directly implementing everything from tokenization to RNN and Transformer with code

  • Someone who wants to deeply understand the Transformer architecture by directly implementing the internal mechanisms rather than simply using the Transformer library

  • Those who want to understand the core mechanisms of Transformers more easily through a balanced approach of theory and practice

  • Developers who want to solidly build foundational skills in Attention or Transformer when developing AI services

  • Those who want a complete End-to-End practical project experience from Transformer fundamentals to text classification and translation models

What should you know before starting?

  • Deep Learning CNN Complete Guide - PyTorch Version

Hello, this is dooleyz3525.

26,945 learners ∙ 1,369 reviews ∙ 4,011 answers ∙ 4.9 rating ∙ 14 courses

(Former) En-core Consulting

(Former) Oracle Korea

Freelance AI consultant

Author of Python Machine Learning Perfect Guide (파이썬 머신러닝 완벽 가이드)

Curriculum

145 lectures ∙ 28hr 9min

Course materials: lecture resources

Reviews

5.0 ∙ 6 reviews

  • aboutexo046263 (13 reviews, average rating 4.8): ★5, 60% enrolled

  • ugowego16567 (1 review, average rating 5.0): ★5, 30% enrolled

    "The explanations are clear and easy to follow."

    dooleyz3525 (Instructor): "Thank you for taking the time to write such a kind review."

  • wannabmc (8 reviews, average rating 5.0): ★5, 30% enrolled

    dooleyz3525 (Instructor): "Thank you for the kind review."

  • shinedngus225 (2 reviews, average rating 5.0): ★5, 7% enrolled

    dooleyz3525 (Instructor): "Thank you for the kind course review ^*^"

  • jaeuk (5 reviews, average rating 5.0): ★5, 31% enrolled

    dooleyz3525 (Instructor): "Thank you for the kind review ^^"

