
Building the Basics of R Programming
coco
This course covers the basics of R programming for learners with no prior R experience.
Beginner
R
Learn how to collect and manage every stock listed on the Korean stock market. Build a Shiny dashboard that automatically collects new stock prices every day and lets you track stock trends by industry.

Reviews from Early Learners
5.0
DT로
It's great for practical assignments.
5.0
hakjuknu
Good!
5.0
조정태
I am very satisfied with the lecture content. I will continue to review it so that I can fully understand it. Thank you.
Collection of all KOSPI/KOSDAQ stocks
Industry-specific stock data management
Understanding industry-specific stock trends
🙆🏻‍♀️ Automate the collection and management of all stock data, including industry-by-industry stock management 🙆🏻‍♂️
Would you like to analyze a stock you are interested in or all stocks listed on KOSPI/KOSDAQ?
To analyze anything, you first need data.
This course collects and manages data for every stock listed on the Korean stock market.
Due to time constraints, the lecture collects three years of data for every stock.
If you change the 3 to a 10, you can just as easily collect ten years' worth.
Beyond the historical data, we also collect new data as it is generated, that is, each new trading day's data.
The automation collects each day's trading data around 4 p.m., when the stock market closes, and updates every stock daily.
You will build a Shiny dashboard like the one at the address below.
https://leegt.shinyapps.io/shiny/
(The demo may become unavailable if too many people connect at once.)
All companies (stocks) listed on the stock market have their own unique code.
The URL to crawl changes depending on this code.
So we first collect each company's unique code.
We then preprocess the codes so they can be used to query Naver Finance.
After building the Naver Finance URL for each stock, three years of data are collected for every stock.
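As a rough sketch of those two steps (the zero-padding and the Naver Finance URL pattern here are assumptions for illustration, not the course's exact code):

```r
# KRX ticker codes are six digits, so shorter codes are zero-padded
# before being substituted into the Naver Finance daily-price URL.
pad_code <- function(code) sprintf("%06d", as.integer(code))

naver_url <- function(code, page = 1) {
  paste0("https://finance.naver.com/item/sise_day.naver?code=",
         pad_code(code), "&page=", page)
}

pad_code("5930")   # "005930"
naver_url("5930")  # daily-price URL, page 1, for code 005930
```

Looping `naver_url` over every collected code and page yields the full set of crawl targets.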
It took about 4 hours to collect 3 years' worth, so I think 10 years' worth could be collected in about 12 hours.
After collecting daily data for each stock, we create a folder per stock and save the data there.
Exception handling is also included so that an error in one stock does not stop the run.
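A minimal sketch of that save step, assuming a per-stock folder layout with CSV files (the function and folder names are illustrative, not the course's actual code):

```r
# Illustrative: one folder per stock under a root directory, with
# tryCatch so one failing stock doesn't abort the whole collection run.
save_stock <- function(code, data, root = "stock_data") {
  dir <- file.path(root, code)
  dir.create(dir, recursive = TRUE, showWarnings = FALSE)
  write.csv(data, file.path(dir, paste0(code, ".csv")), row.names = FALSE)
}

collect_safely <- function(code, fetch) {
  tryCatch(
    save_stock(code, fetch(code)),
    error = function(e) message("skipped ", code, ": ", conditionMessage(e))
  )
}
```

With this wrapper, a network error for one ticker only logs a message and the loop moves on to the next stock.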
Re-scraping ten years of data every day would be highly inefficient.
Instead, once today's trading is finished, the automation collects only today's data and merges it with the previously stored history.
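The merge itself can be as simple as binding today's rows onto the stored history and dropping duplicate dates; a minimal sketch (the column names are assumptions):

```r
# Minimal daily-merge sketch: append today's data to the stored history,
# keeping the most recent row when a date appears twice (e.g. a re-crawl).
merge_daily <- function(history, today) {
  combined <- rbind(history, today)
  combined <- combined[!duplicated(combined$date, fromLast = TRUE), ]
  combined[order(combined$date), ]
}

history <- data.frame(date = as.Date(c("2024-01-02", "2024-01-03")),
                      close = c(100, 102))
today   <- data.frame(date = as.Date("2024-01-04"), close = 105)
merge_daily(history, today)  # three rows, one per trading day
```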
Now all stock data is updated automatically at 4 PM every day.
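On Linux or macOS, one way to trigger such an update at 4 PM is the cronR package (a sketch under assumptions: the script name is a placeholder, and Windows users would use taskscheduleR instead):

```r
# Schedule update_stocks.R to run every day at 16:00 via cron.
library(cronR)

cmd <- cron_rscript("update_stocks.R")                # path is a placeholder
cron_add(cmd, frequency = "daily", at = "16:00", id = "stock-update")
```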
From a mid- to long-term stock investment perspective, it is important to understand industry/theme trends.
We collect stock codes by industry, retrieve data on these stocks, identify trends, and visualize them.

After the stock market closes each day, the additional daily data is collected, and the entire process, from managing stocks by industry to visualizing them, is automated.
This lecture assumes basic knowledge of the R language and of web crawling.
Who is this course right for?
Someone who knows the basics of R
Anyone who needs stock data
Anyone who wants to build up basic data for investing
8,388
Learners
509
Reviews
136
Answers
4.4
Rating
20
Courses
I am an unemployed scholar who majored in statistics as an undergraduate, earned a PhD in industrial engineering (artificial intelligence), and is still studying.
Awards
ㆍ 6th Big Contest: Game User Churn Prediction Algorithm Development / NCSOFT Award (2018)
ㆍ 5th Big Contest: Loan Delinquency Prediction Algorithm Development / Korea Association for ICT Promotion (KAIT) Award (2017)
ㆍ 2016 Weather Big Data Contest / Korea Institute of Geoscience and Mineral Resources President's Award (2016)
ㆍ 4th Big Contest: Insurance Fraud Prediction Algorithm Development / Finalist (2016)
ㆍ 3rd Big Contest: Baseball Game Prediction Algorithm Development / Minister of Science, ICT and Future Planning Award (2015)
* blog : https://bluediary8.tistory.com
My primary research areas are data science, reinforcement learning, and deep learning.
I am currently doing crawling and text mining as a hobby :)
I developed an app called Marong that crawls online communities and surfaces only the popular posts,
and I also created a restaurant recommendation app by collecting lists of famous restaurants and blog posts from across the country :) (it failed miserably...)
I am currently a PhD student researching artificial intelligence.
23 lectures ∙ (3hr 55min)
Course Materials:
1. Orientation
01:47
8 reviews ∙ Average rating 4.9
$42.90
Check out other courses by the instructor!
Explore other courses in the same field!