Curious about the questions other students frequently ask?
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Question about the last code in "Image Classification - Convolutional Neural Network (CNN)"
Hello, 딥러닝호형! I found you on YouTube and have been enjoying the course here on Inflearn. Deep learning is really hard, but you explain it well, so first of all, thank you! My question is about section 5.6, "Computing model accuracy," at the end of lecture 5-1 on CNNs. The comment there says "accumulate (add 1 if correct, 0 if wrong)", but when I print the value it looks like label values are being added up. Am I mistyping the code, or am I misreading the comment? Thank you.

```python
_, predicted = torch.max(outputs.data, 1)
# torch.max() returns (values=tensor([1.111, 2.222, ...], device='cuda:0'),
#                      indices=tensor([7, 7, 4, 5, ...], device='cuda:0'))
print('여기야 =', (predicted == labels).sum().item())
total += labels.size(0)                        # accumulate count (total)
correct += (predicted == labels).sum().item()  # accumulate (add 1 if correct, 0 if wrong)
print('correct=', correct)
```

Output:
```
correct= 6166
여기야 = 4
correct= 6170
여기야 = 5
correct= 6175
여기야 = 6
correct= 6181
```
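For reference, a minimal standalone sketch (not from the course) of what that expression actually sums: the element-wise comparison yields booleans, so the sum is the number of correct predictions in the batch, not a sum of label values.

```python
import torch

labels = torch.tensor([7, 7, 4, 5])
predicted = torch.tensor([7, 2, 4, 5])

matches = (predicted == labels)   # tensor([ True, False,  True,  True])
print(matches.sum().item())       # 3 -> count of correct predictions, not label values
```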
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
I'm leaving a post because I have a question!
I'm taking this course while working on a project. No matter how long I train, the test accuracy never gets past 0.68–0.69 and the loss stays around 2.8. Should I train longer, or do I need to fix the code? I'm attaching a rough version of the code!
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Where can I get the code?
Where can I download the code used in the lectures?
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Can an autoencoder be applied to stock price prediction?
Hello. Would it be possible to apply an autoencoder on top of the LSTM stock-prediction model? I haven't tried it yet, but it seems like I could reuse the lecture code with only the data preprocessing changed, so I wanted to ask.
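Not the course's code, but a minimal sketch of what an LSTM autoencoder over windowed sequences might look like, assuming the same (batch, seq_len, features) windows as the lecture's preprocessing; all layer names and sizes here are illustrative:

```python
import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    """Compress a (batch, seq_len, n_features) window and reconstruct it."""
    def __init__(self, n_features, hidden_size):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.decoder = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, n_features)

    def forward(self, x):
        _, (h, _) = self.encoder(x)                     # h: (1, batch, hidden)
        # repeat the latent state across the sequence length for decoding
        z = h[-1].unsqueeze(1).repeat(1, x.size(1), 1)  # (batch, seq_len, hidden)
        dec, _ = self.decoder(z)
        return self.out(dec)                            # reconstruction of x

model = LSTMAutoencoder(n_features=1, hidden_size=16)
x = torch.randn(8, 5, 1)                # e.g. 5-day windows of one feature
loss = nn.MSELoss()(model(x), x)        # trained to reconstruct its input
```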
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Question about the LSTM fc code
Hello. When building the LSTM model, I'm curious why hidden_size and sequence_length are multiplied in this part: self.fc = nn.Linear(hidden_size*sequence_length, 1).
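A small runnable sketch of the shapes involved may help: with batch_first=True the recurrent layer returns one hidden vector per time step, and the course's forward pass flattens all of them into a single vector before the linear layer, so that layer's input dimension is hidden_size * sequence_length.

```python
import torch
import torch.nn as nn

batch, seq_len, input_size, hidden_size = 4, 5, 3, 8
lstm = nn.LSTM(input_size, hidden_size, batch_first=True)

x = torch.randn(batch, seq_len, input_size)
out, _ = lstm(x)
print(out.shape)                     # torch.Size([4, 5, 8]) -> one hidden vector per time step

flat = out.reshape(out.size(0), -1)  # (4, 40): all 5 time steps concatenated
fc = nn.Linear(hidden_size * seq_len, 1)
print(fc(flat).shape)                # torch.Size([4, 1])
```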
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Lecture 4: neural networks
When building the trainset into batches with the DataLoader, shuffle is set to True, but the testloader uses shuffle=False. Why is that? If shuffle means mixing the data, shouldn't both train and test be shuffled? I'm confused, so I'm asking.
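For reference, a small sketch (with dummy datasets) of the usual convention and the reasoning behind it:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

trainset = TensorDataset(torch.randn(100, 4), torch.randint(0, 2, (100,)))
testset = TensorDataset(torch.randn(20, 4), torch.randint(0, 2, (20,)))

# Shuffling the training set re-orders samples every epoch, so each batch is a
# fresh mix -- this is what helps stochastic gradient descent.
trainloader = DataLoader(trainset, batch_size=32, shuffle=True)

# The test set is only read, never used for gradient updates, and accuracy/RMSE
# are averages over all samples, so order does not change the result;
# shuffle=False keeps evaluation deterministic and aligned with the data order.
testloader = DataLoader(testset, batch_size=32, shuffle=False)
```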
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Teacher! I can't quite understand the code below.

```python
def cluster_acc(y_true, y_pred):
    y_true = np.array(y_true)
    y_pred = np.array(y_pred)
    D = max(y_pred.max(), y_true.max()) + 1
    w = np.zeros((D, D), dtype=np.int64)
    for i in range(y_pred.size):
        w[y_pred[i], y_true[i]] += 1
    ind = linear_assignment(w.max() - w)
    return sum([w[i, j] for i, j in zip(ind[0], ind[1])]) * 1.0 / y_pred.size
```

I can't intuitively see how the code above computes the clusters' accuracy. As I understand it, y_true takes each image's label per batch, so y_true would be a (120,) tensor, and y_pred would be the values coming out of k-means. Something like this:

y_true = [0, 1, 1, 2, 3, 0, ...]
y_pred = [3, 4, 4, 5, 1, 3, ...]

Computing from that, D would be 10 and w would be a 10x10 tensor. Then, assuming only the six labels above came in during for i in range(120):, it would be filled like

W = [[0, 0, 2, 0, 0, 0, 0, 0, 0, 0],
     [0, 0, 0, 2, 0, 0, 0, 0, 0, 0],
     [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
     [0, 1, 0, 0, 0, 0, 0, 0, 0, 0],
     ...]

Then W.max() gives 2, and w.max() - w becomes

[[2, 2, 0, 2, 2, ...],
 [2, 2, 2, 0, 2, ...],
 [2, 2, 2, 2, 1, ...],
 [2, 1, 2, 2, 2, ...],
 ...]

and passing this through linear_assignment is supposed to give the accuracy for one batch. I know that linear_assignment processes information on a bipartite graph via the Hungarian algorithm, but I don't understand by what concrete formula it does so. Thank you as always.
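For intuition, a small self-contained example, assuming linear_assignment is scipy's linear_sum_assignment (the Hungarian algorithm): it picks a one-to-one mapping from predicted clusters to true labels that minimizes total cost, and since the cost is w.max() - w, minimizing cost is the same as maximizing the number of correctly matched samples.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment  # Hungarian algorithm

y_true = np.array([0, 1, 1, 2, 3, 0])
y_pred = np.array([3, 4, 4, 5, 1, 3])

D = max(y_pred.max(), y_true.max()) + 1
w = np.zeros((D, D), dtype=np.int64)
for p, t in zip(y_pred, y_true):
    w[p, t] += 1                   # w[p, t] = how often cluster p received label t

# Minimizing (w.max() - w) over a one-to-one cluster -> label mapping is
# equivalent to maximizing the total count of matched pairs.
rows, cols = linear_sum_assignment(w.max() - w)
acc = w[rows, cols].sum() / y_pred.size
print(acc)                         # 1.0 -> clusters 3, 4, 5, 1 map to labels 0, 1, 2, 3
```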
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
4. Evaluating the artificial neural network model
When computing the RMSE I get an error, and printing the sizes shows that they differ. Training itself went fine; is there something I need to fix? I built the trainloader as shown, with the same data as the course material, and I keep searching and reading up on the error but haven't been able to pin it down yet.
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Question about the RMSE performance evaluation
Hello. mean_squared_error(predictions, actual) returns the same value even when I swap the order of predictions and actual. Is it fine to use them in either order?
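The symmetry is expected: the squared error (a - b)^2 equals (b - a)^2 term by term, so MSE and RMSE do not depend on the argument order. scikit-learn documents the signature as mean_squared_error(y_true, y_pred), though, and some other metrics are not symmetric, so keeping the documented order is a good habit. A quick check:

```python
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score

actual = np.array([1.0, 2.0, 3.0])
predictions = np.array([1.5, 1.5, 2.0])

# Squared error is symmetric, so MSE is order-independent.
print(mean_squared_error(actual, predictions))   # 0.5
print(mean_squared_error(predictions, actual))   # 0.5

# But not every metric is symmetric -- R^2 changes when the arguments are
# swapped, so following the documented (y_true, y_pred) order is safer.
print(r2_score(actual, predictions))             # != r2_score(predictions, actual)
```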
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
The loss doesn't decrease... could there be a problem in my code?
Hello! After finishing the lectures I implemented everything myself and ran it on Colab, but the loss stays fixed and never decreases. I must have done something wrong, but no matter how hard I look I can't find where... Could this part be the problem: inputs = data[0].to(device)? Thank you.
- Resolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
How to modify the RNN output (many-to-one)
After the lectures I tried implementing an RNN myself. I photographed one object from different positions and extracted the object's coordinates (x, y) from each photo. Using those coordinates, I want to predict the actual 3D coordinates (x, y, z). My intended model structure is as follows. However, when I implement the model, only one coordinate value comes out, and I can't tell what the output value is predicting. Where do I need to make changes so that x, y, z values are output? Below are the source code used and part of the data.

```python
import numpy as np
import pandas as pd
import torch
from sklearn.model_selection import train_test_split
import torch.nn as nn
import torch.optim as optim
import matplotlib.pyplot as plt

# Load the data
path = r"redball_Data.csv"
df = pd.read_csv(path)
df_suffled = df.sample(frac=1).reset_index()

# Build the tensor data
x = df_suffled[['x1','y1','x2','y2','x3','y3','x4','y4',
                'x5','y5','x6','y6','x7','y7','x8','y8']].values
y = df_suffled[['x','y','z']].values
print(x.shape)
x = x.reshape(len(x), 8, 2)  # 2d to 3d
print(x.shape)
print(y.shape)

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
xt = torch.FloatTensor(x).to(device)
yt = torch.FloatTensor(y).to(device)

x_train, x_test, y_train, y_test = train_test_split(xt, yt, test_size=0.2)
print(f"x_train {x_train.shape} | x_test {x_test.shape} | "
      f"y_train {y_train.shape} | y_test {y_test.shape}")

train = torch.utils.data.TensorDataset(x_train, y_train)
test = torch.utils.data.TensorDataset(x_test, y_test)

batch_size = 1000
train_loader = torch.utils.data.DataLoader(dataset=train, batch_size=batch_size, shuffle=False)
test_loader = torch.utils.data.DataLoader(dataset=test, batch_size=batch_size, shuffle=False)

# Hyperparameter setting (RNN)
print(f"input_size : {xt.size(2)}")
input_size = xt.size(2)
num_layers = 2
hidden_size = 8
sequence_length = xt.size(1)

# Create the model
class VanillaRNN(nn.Module):
    def __init__(self, input_size, hidden_size, sequence_length, num_layers, device):
        super(VanillaRNN, self).__init__()
        self.device = device
        self.hidden_size = hidden_size
        self.num_layers = num_layers
        self.rnn = nn.RNN(input_size, hidden_size, num_layers, batch_first=True)
        self.fc = nn.Sequential(nn.Linear(hidden_size * sequence_length, 3),
                                nn.ReLU())  # changed 1 => 3

    def forward(self, x):
        # initial hidden state
        h0 = torch.zeros(self.num_layers, x.size()[0], self.hidden_size).to(self.device)
        out, _ = self.rnn(x, h0)  # out: output features from the last RNN layer; hn: hidden state
        out = out.reshape(out.shape[0], -1)  # many-to-many strategy
        out = self.fc(out)
        return out

model = VanillaRNN(input_size=input_size, hidden_size=hidden_size,
                   sequence_length=sequence_length, num_layers=num_layers,
                   device=device).to(device)

criterion = nn.MSELoss()
lr = 1e-3
num_epochs = 800
optimizer = optim.Adam(model.parameters(), lr=lr)

# Train the model
loss_graph = []
n = len(train_loader)
for epoch in range(num_epochs):
    running_loss = 0.0
    for data in train_loader:
        seq, target = data  # batch data
        out = model(seq)
        loss = criterion(out, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        running_loss += loss.item()
    loss_graph.append(running_loss / n)
    if epoch % 50 == 0:
        print('[epoch: %d] loss: %.4f' % (epoch, running_loss / n))

plt.figure(figsize=(20, 10))
plt.plot(loss_graph)
plt.show()

def plotting(train_loader, test_loader, actual):
    with torch.no_grad():
        train_pred = []
        test_pred = []
        for data in train_loader:
            seq, target = data  # batch data
            out = model(seq)
            train_pred += out.cpu().numpy().tolist()
        for data in test_loader:
            seq, target = data  # batch data
            out = model(seq)
            test_pred += out.cpu().numpy().tolist()
    total = train_pred + test_pred
    plt.figure(figsize=(20, 10))
    plt.plot(np.ones(100) * len(train_pred), np.linspace(0, 1, 100), '--', linewidth=0.6)
    plt.plot(actual, '--')
    plt.plot(total, 'b', linewidth=0.6)
    plt.legend(['train boundary', 'actual', 'prediction'])
    plt.show()

plotting(train_loader, test_loader, y[:500, 0])
```

Part of redball_Data.csv:

```
x1,y1,x2,y2,x3,y3,x4,y4,x5,y5,x6,y6,x7,y7,x8,y8,x,z,y
0,0,1554.5,639.5,1022,561.5,0,0,0,0,0,0,516,580,291.5,778,2,1.5,-8.5
0,0,1478.5,630.5,975,567.5,0,0,0,0,0,0,465.5,591,303.5,811.5,3,1.5,-8.5
0,0,1407,621.5,926,573,0,0,0,0,0,0,410.5,603,317.5,849,4,1.5,-8.5
0,0,1338.5,614,874.5,579,0,0,0,0,0,0,351,615.5,333.5,893,5,1.5,-8.5
(several hundred further rows omitted)
```
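One observation, sketched below as a hypothetical variant rather than the instructor's answer: with the head changed to nn.Linear(..., 3), the model does emit one (x, y, z) triple per sample, but the trailing nn.ReLU() clamps every output to be non-negative, while the targets above include negative coordinates (e.g. y = -8.5).

```python
import torch
import torch.nn as nn

hidden_size, sequence_length = 8, 8

# Hypothetical regression head: for coordinates that can be negative, end with
# a bare Linear -- a trailing nn.ReLU() would force every prediction >= 0.
fc = nn.Linear(hidden_size * sequence_length, 3)

flat = torch.randn(4, hidden_size * sequence_length)  # flattened RNN outputs
pred = fc(flat)
print(pred.shape)  # torch.Size([4, 3]) -> one (x, y, z) prediction per sample
```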
- Resolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
input_size
The code sets input_size = x_seq.size(2), but since five days are used as the input, shouldn't it be input_size = x_seq.size(1)?
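A shape check may clarify the convention: with batch_first tensors the layout is (batch, sequence_length, input_size), so the five input days live in dimension 1, while dimension 2 counts the features describing each day.

```python
import torch

# e.g. 200 windows, each 5 days long, each day described by 4 features
x_seq = torch.randn(200, 5, 4)   # (batch, sequence_length, input_size)

print(x_seq.size(1))             # 5 -> sequence_length (the 5 input days)
print(x_seq.size(2))             # 4 -> input_size (features per day)
```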
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Question about the tensor-data building code
Hello. The sequence data below uses a sequence_length of 5 days to predict the next 1 day. If I instead want to use 5 days to predict 3 days ahead, can I change it as follows?

Before: y_seq.append(y[i+sequence_length])
After:  y_seq.append(y[i+sequence_length+2])

[Instructor's code]
```python
def seq_data(x, y, sequence_length):
    x_seq = []
    y_seq = []
    for i in range(len(x) - sequence_length):
        x_seq.append(x[i:i+sequence_length])
        y_seq.append(y[i+sequence_length])
    return torch.FloatTensor(x_seq).to(device), torch.FloatTensor(y_seq).to(device).view([-1, 1])

split = 200
sequence_length = 5

x_seq, y_seq = seq_data(X, y, sequence_length)
x_train_seq = x_seq[:split]
y_train_seq = y_seq[:split]
x_test_seq = x_seq[split:]
y_test_seq = y_seq[split:]
print(x_train_seq.size(), y_train_seq.size())
print(x_test_seq.size(), y_test_seq.size())
```
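Two readings of "predict 3 days" lead to different changes, sketched here as a hypothetical variant rather than the course's code. If only the 3rd day ahead is wanted, y[i + sequence_length + 2] is right in spirit, but the loop range must also shrink to range(len(x) - sequence_length - 2), or the last two windows index past the end of y. If all of the next 3 days are wanted, the targets become slices:

```python
import torch

def seq_data_multi(x, y, sequence_length, horizon=3):
    """Hypothetical variant: sequence_length input days -> next `horizon` days.

    Note the loop range shrinks by (horizon - 1) relative to the original,
    so the last target slice never runs past the end of y.
    """
    x_seq, y_seq = [], []
    for i in range(len(x) - sequence_length - horizon + 1):
        x_seq.append(x[i:i + sequence_length])
        y_seq.append(y[i + sequence_length:i + sequence_length + horizon])
    return (torch.FloatTensor(x_seq),
            torch.FloatTensor(y_seq).view([-1, horizon]))
```

The model's final linear layer would then also need `horizon` outputs instead of 1.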
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Question about code for getting actual predictions from the model
Hello. I think I've fully understood the RNN and LSTM modeling from the lectures. Going further, I've been googling ways to get actual predicted values from the model, but I keep hitting errors. Referring to the prediction code at the URL below, I tried to produce future prices with the code underneath, based on the lecture's model, but all 14 predictions come out identical, and inverse_transform doesn't seem to take effect either, so the scaled values are returned as-is. Can you guess what the problem might be?

https://data-panic.tistory.com/33

```python
DAYS_TO_PREDICT = 14

with torch.no_grad():
    test_seq = X_all[:1]
    preds = []
    for _ in range(DAYS_TO_PREDICT):
        y_test_pred = model(test_seq)
        pred = torch.flatten(y_test_pred).item()
        preds.append(pred)
        new_seq = test_seq.numpy().flatten()
        new_seq = np.append(new_seq, [pred])
        new_seq = new_seq[1:]
        #test_seq = torch.as_tensor(new_seq).view(1, seq_length, 1).float()

# Invert the scaling of the predicted values
pred_values = yscaler.inverse_transform(np.array(preds).reshape(-1, 1))

# Round the predictions
import math
pred_values_ceiled = list(pred_values.flatten())
predicted_cases = pred_values_ceiled
predicted_cases
```
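Worth noting: in the code above, the line that rebuilds test_seq is commented out, so the model sees the same window on every iteration. A minimal runnable sketch of a rolling forecast (with a stand-in linear model; `seq_length` and the window are illustrative assumptions) where the input is actually updated each step:

```python
import numpy as np
import torch
import torch.nn as nn

seq_length = 5
model = nn.Sequential(nn.Flatten(), nn.Linear(seq_length, 1))  # stand-in for the trained model
test_seq = torch.randn(1, seq_length, 1)                       # last known scaled window

preds = []
with torch.no_grad():
    for _ in range(14):
        pred = torch.flatten(model(test_seq)).item()
        preds.append(pred)
        window = np.append(test_seq.numpy().flatten(), pred)[1:]  # slide the window one day
        # Rebuild the model input from the updated window. If this step is
        # skipped (it is commented out above), every iteration sees the same
        # window and therefore returns the same prediction 14 times.
        test_seq = torch.as_tensor(window).view(1, seq_length, 1).float()

print(len(set(round(p, 6) for p in preds)) > 1)  # predictions now differ step to step
```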
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
About the RMSE evaluation code after running a Bi-LSTM on time-series data
I'm trying to compute the RMSE for a Bi-LSTM, but I get the message "y_true and y_pred have different number of output (10!=1)". I reused the RMSE code from the ANN section. Could you tell me what might cause this error?

```python
def evaluation(dataloader):
    predictions = torch.tensor([], dtype=torch.float)  # tensor accumulating the predictions
    actual = torch.tensor([], dtype=torch.float)       # tensor accumulating the actual values
    with torch.no_grad():
        model.eval()  # .eval() must always be used during evaluation
        for data in dataloader:
            inputs, values = data
            outputs = model(inputs)
            predictions = torch.cat((predictions, outputs), 0)  # accumulate predictions via cat
            actual = torch.cat((actual, values), 0)             # accumulate actual values via cat
    rmse = np.sqrt(mean_squared_error(predictions, actual))     # RMSE via sklearn
    return rmse
```
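The message is scikit-learn's shape check: it fires when one array has 10 columns and the other 1, e.g. model outputs of shape (batch, 10) (one value per time step) against targets of shape (batch, 1). A runnable reproduction and one hedged way to align the shapes, assuming the model is meant to be many-to-one:

```python
import numpy as np
import torch
from sklearn.metrics import mean_squared_error

# Reproducing the mismatch: (batch, 10) predictions vs (batch, 1) targets.
predictions = torch.randn(32, 10)   # e.g. one output per time step
actual = torch.randn(32, 1)
print(predictions.shape, actual.shape)   # inspect before calling sklearn

# For a many-to-one model, keep a single column -- for example the last time
# step -- so both arrays have one output each.
rmse = np.sqrt(mean_squared_error(actual, predictions[:, -1:]))
print(rmse)
```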
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Difference in the fully connected layer across recurrent models
Hello. In the RNN, the forward pass feeds the nn.Linear output of the fully connected layer into a sigmoid, but the LSTM and GRU don't. I'm curious why.

For the RNN:
```python
self.fc = nn.Sequential(nn.Linear(hidden_size*sequence_length, 1), nn.Sigmoid())
```

For the LSTM and GRU:
```python
self.fc = nn.Linear(hidden_size*sequence_length, 1)
```
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
LSTM evaluation code
To evaluate the LSTM I pasted in the evaluation code from 4-1 Neural Networks in the course material (below), and running it produces the error [can't convert cuda:0 device type to numpy. Use Tensor.cpu() to copy the tensor to host memory.]. How can I fix this?

```python
def evaluation(dataloader):
    predictions = torch.tensor([], dtype=torch.float)  # tensor accumulating the predictions
    actual = torch.tensor([], dtype=torch.float)       # tensor accumulating the actual values
    with torch.no_grad():
        model.eval()  # .eval() must always be used during evaluation
        for data in dataloader:
            inputs, values = data
            outputs = model(inputs)
            predictions = torch.cat((predictions, outputs), 0)  # accumulate predictions via cat
            actual = torch.cat((actual, values), 0)             # accumulate actual values via cat
    rmse = np.sqrt(mean_squared_error(predictions, actual))     # RMSE via sklearn
    return rmse

# Why .eval() must be used during evaluation:
# evaluation must run on the intact model, and with .train() instead of .eval()
# dropout stays active; so techniques used only during training, such as
# dropout and batch normalization, must be disabled at evaluation time.

train_rmse = evaluation(trainloader)  # RMSE on the training data
test_rmse = evaluation(testloader)    # RMSE on the test data

print("Train RMSE: ", train_rmse)
print("Test RMSE: ", test_rmse)

# This is a simple comparison for illustration; real research requires a more
# detailed comparison.
# The reported number is the average over 20 evaluation runs.
# Because the data split is random and the model starts from random initial
# parameters, results can differ on every training run.
# In this lecture, focus on the flow of training (the for loop) and the model
# (Regressor).
```
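The error message itself names the fix: scikit-learn/NumPy cannot read CUDA tensors, so they must be copied to host memory with .cpu() first. A small runnable check (stand-in tensors, works on both CPU and GPU):

```python
import numpy as np
import torch
from sklearn.metrics import mean_squared_error

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

outputs = torch.randn(8, 1, device=device)  # stand-ins for model outputs/targets
values = torch.randn(8, 1, device=device)

# .cpu() copies the tensors to host memory before NumPy touches them,
# exactly as the error message suggests.
rmse = np.sqrt(mean_squared_error(values.cpu(), outputs.cpu()))
print(rmse)
```

In the evaluation function above, this means accumulating outputs.cpu() and values.cpu() instead of the raw CUDA tensors.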
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Choosing weights
Hello, instructor! I have a question. If I repeat training and saving weights about 30 times for the same model, each run will produce different weights and results.
1. If I were putting this into a service, should I use the weights that returned the best result?
2. What is the more appropriate way to describe this model: 1) a model that can deliver the maximum evaluation score, or 2) a model that delivers the mean (or median) score?
- Resolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Question about dropout during validation and testing
You said dropout should only be used during training. Does calling model.eval() during validation and testing automatically disable dropout, or does it have to be turned off separately?
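For reference, model.eval() sets every submodule, including nn.Dropout, to evaluation mode, where dropout becomes the identity; no separate switch is needed. A quick demonstration:

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = torch.ones(8)

drop.train()    # training mode: ~half the inputs zeroed, the rest scaled by 1/(1-p)
print(drop(x))  # e.g. tensor([2., 0., 2., 0., 2., 2., 0., 0.])

drop.eval()     # evaluation mode: dropout is the identity
print(drop(x))  # tensor([1., 1., 1., 1., 1., 1., 1., 1.])
```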
- Unresolved · [파이토치] 실전 인공지능으로 이어지는 딥러닝 - 기초부터 논문 구현까지
Boston data error in the house-price prediction lecture
In the house-price prediction lecture, the Boston dataset is reportedly no longer supported in PyTorch. For beginners, it would make the course even better if the lecture were redone with a different dataset. (Alternatively, show another way to obtain the old Boston data and state that, because of the ethical issues, it is to be used for study purposes only.)
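As one possible substitute (an assumption on my part, not the course's official fix): scikit-learn removed load_boston in version 1.2 over ethical concerns, and its California housing dataset is a similar tabular regression task that still ships:

```python
# California housing: a drop-in regression dataset for the house-price lecture.
from sklearn.datasets import fetch_california_housing

data = fetch_california_housing()   # downloads on first use
X, y = data.data, data.target       # X: (20640, 8) features, y: median house values
print(data.feature_names)
```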