[Python Machine Learning] Experiment 13: Regression and Classification with Neural Networks
Table of Contents
- Neural Networks
- Example 1: Regression with a Neural Network (a simple example)
- 1.1 Import packages
- 1.2 Construct the dataset (randomly generated)
- 1.3 Construct the training and test sets
- 1.4 Build the neural network model
- 1.5 Train the neural network model on the training data
- Experiment: Classification with a Neural Network (the iris dataset)
- 1. Import packages
- 2. Construct the dataset
- 3. Construct the training and test sets
- 4. Build the neural network model
- 5. Train the neural network model on the training data
Neural Networks
Example 1: Regression with a Neural Network (a simple example)
1.1 Import packages
import torch
import numpy as np
from torch import nn
from sklearn.model_selection import train_test_split
1.2 Construct the dataset (randomly generated)
from torch.autograd import Variable

# 100 samples, 1000 input features, one 100-unit hidden layer, 10 outputs
batch_n = 100
hidden_layer = 100
input_data = 1000
output_data = 10

x = Variable(torch.randn(batch_n, input_data), requires_grad=True)
y = Variable(torch.randn(batch_n, output_data), requires_grad=True)
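Variable has been deprecated since PyTorch 0.4; as a side note, a minimal equivalent sketch with plain tensors (which carry autograd state directly) would be:

# No Variable wrapper is needed in modern PyTorch.
# (Regression targets normally do not need gradients; requires_grad=True on y
# is kept here only to mirror the original code.)
x = torch.randn(batch_n, input_data, requires_grad=True)
y = torch.randn(batch_n, output_data, requires_grad=True)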
1.3 Construct the training and test sets
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=0)
x_train.shape, x_test.shape, y_train.shape, y_test.shape
(torch.Size([80, 1000]), torch.Size([20, 1000]), torch.Size([80, 10]), torch.Size([20, 10]))
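A quick sanity check that torch.Tensor converts a NumPy array into a float tensor: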
torch.Tensor(np.array([1,2]))
tensor([1., 2.])
y_test
tensor([[-0.1810,  0.2906,  0.4490,  1.3190, -1.1832, -0.0035,  0.5440, -0.8954,
          0.7686,  1.3758],
        [ 1.1767, -0.6170, -0.7946, -1.2191,  0.5998, -0.8591, -2.7796, -0.7918,
         -0.1282,  0.2730],
        [ 1.8079,  0.9862, -1.7850, -0.4031,  1.5472,  0.1663, -0.5043,  1.2402,
         -2.2270,  1.9437],
        [-0.0478,  0.1177, -0.4014,  0.6531, -2.0040,  1.5664,  2.0697, -0.5635,
         -0.4687,  1.5910],
        [ 1.5076,  1.0444, -1.7943,  0.7268,  1.1636,  0.1772, -1.0183, -1.0916,
          0.5012,  2.0798],
        [ 0.7027, -0.0999, -0.0670, -0.1838,  0.6959,  1.5484,  0.1950, -0.5757,
          1.4192, -0.6865],
        [ 1.7699, -1.9956,  0.1742, -0.6788, -2.0619,  0.8384,  2.1277, -1.2390,
         -1.0382,  0.5834],
        [ 0.8416,  1.6485, -0.0215,  0.0048, -1.7932,  0.1007, -2.4015,  0.3087,
         -0.7603,  0.9714],
        [-0.6723, -1.3535, -0.8598, -0.4294, -1.6416,  0.3986, -0.3160,  0.9952,
          0.6939, -1.2953],
        [ 0.1403,  0.2171, -1.0277, -0.6372,  0.2468,  1.6663,  0.3363,  0.5068,
         -0.0259, -0.8080],
        [ 0.9330,  0.8476, -0.3819,  0.8394,  1.1713, -0.6932, -0.0453, -1.3850,
          0.6089, -0.7219],
        [-0.1061, -2.8115, -1.7533, -0.3561,  0.5066,  0.5846,  0.2225,  0.7907,
          0.6693,  0.1164],
        [ 1.4511, -0.7063, -0.2785,  1.1644, -0.4726, -0.9858,  0.1105,  2.6274,
          0.8037,  0.1488],
        [ 0.9054, -0.1386,  0.6521, -2.7186, -1.1272, -0.7584, -1.1367, -0.0416,
         -0.0663,  0.6517],
        [-0.9568, -0.0174, -0.8611,  0.5748, -0.9300,  1.1043, -1.6796,  0.9629,
         -1.1011,  0.6005],
        [ 0.9963,  0.5226,  0.5209,  1.0107,  0.6931,  1.6149, -0.3450,  0.5082,
          1.2774, -0.1767],
        [ 0.3884, -1.8515, -0.6365, -0.1225,  1.2765, -0.1700,  0.4384,  0.0291,
          0.4540,  0.7085],
        [ 0.9688,  1.4026,  1.1516, -0.1575,  0.6101, -0.5406,  1.9612,  0.1654,
         -0.8425, -0.0459],
        [-1.5699,  0.0486, -1.7415,  1.5327,  0.0225, -1.1386, -0.6188,  0.3958,
          0.5564, -1.1593],
        [ 0.5734,  0.8675,  0.0328, -0.2371, -0.5879,  0.7541,  0.5935,  0.9097,
          0.9884,  0.6365]], grad_fn=<IndexBackward0>)
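The grad_fn=<IndexBackward0> tag shows that the split tensors keep the autograd history of the indexing applied to x and y; this is also why the training loop below has to call loss.backward(retain_graph=True).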
1.4 Build the neural network model
class Neural_Network(nn.Module):
    def __init__(self):
        super().__init__()
        self.hidden1 = nn.Linear(input_data, hidden_layer)
        self.output = nn.Linear(hidden_layer, output_data)
        self.relu = nn.ReLU()
        # Note: a Softmax output layer is unusual for regression (it constrains
        # each output row to sum to 1); it is kept because the results below
        # were produced with it.
        self.softmax = nn.Softmax(dim=1)

    def forward(self, x):
        x = self.hidden1(x)
        x = self.relu(x)
        x = self.output(x)
        x = self.softmax(x)
        return x
import torch.optim as optim
model = Neural_Network()
model
Neural_Network(
  (hidden1): Linear(in_features=1000, out_features=100, bias=True)
  (output): Linear(in_features=100, out_features=10, bias=True)
  (relu): ReLU()
  (softmax): Softmax(dim=1)
)
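As a quick check (a sketch, not part of the original run), the parameter count matches the two linear layers:

sum(p.numel() for p in model.parameters())  # (1000*100 + 100) + (100*10 + 10) = 101110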
1.5 Train the neural network model on the training data
epochs = 1000
learning_rate = 0.003
criterion = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=learning_rate)

for i in range(epochs):
    outputs = model(x_train)
    loss = criterion(outputs, y_train)
    print("Epoch:{},Loss:{:4f}".format(i, loss))
    optimizer.zero_grad()
    # retain_graph=True is required because x_train/y_train carry the autograd
    # history of the train/test split, and that part of the graph is reused
    # on every epoch.
    loss.backward(retain_graph=True)
    optimizer.step()
Epoch:0,Loss:0.948208
Epoch:1,Loss:0.896322
Epoch:2,Loss:0.855293
Epoch:3,Loss:0.819206
Epoch:4,Loss:0.790216
Epoch:5,Loss:0.769548
Epoch:6,Loss:0.755935
Epoch:7,Loss:0.747829
Epoch:8,Loss:0.743429
Epoch:9,Loss:0.741071
...
Epoch:997,Loss:0.727957
Epoch:998,Loss:0.727958
Epoch:999,Loss:0.727956
loss = criterion(model(x_test), y_test)
loss
tensor(1.0953, grad_fn=<MseLossBackward0>)
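For a cleaner evaluation (a sketch, not part of the original run), inference can be wrapped in torch.no_grad() so that no computation graph is built:

with torch.no_grad():
    test_loss = criterion(model(x_test), y_test)
print(test_loss.item())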
Experiment: Classification with a Neural Network (the iris dataset)
1. Use the iris dataset (four features per sample, three classes)
2. Encode the output labels as one-hot vectors
3. Map the model's output to the interval (0, 1)
4. Change the loss function to cross-entropy
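In the code below, step 3 is handled by the Softmax/Sigmoid layers at the output, and step 4 by nn.BCELoss, which computes the (binary) cross-entropy between those outputs and the one-hot targets.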
1. Import packages
import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader
from sklearn.datasets import load_iris
from sklearn.preprocessing import OneHotEncoder
from sklearn.model_selection import train_test_split
2. Construct the dataset
iris = load_iris()
X, y = iris.data, iris.target
# Encode the integer class labels (0, 1, 2) as one-hot rows.
# Note: scikit-learn >= 1.2 renames this argument to sparse_output=False.
one_hot_vector = OneHotEncoder(sparse=False)
y = one_hot_vector.fit_transform(y.reshape(-1, 1))
3. Construct the training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
X_train = torch.Tensor(X_train)
X_test = torch.Tensor(X_test)
y_train = torch.Tensor(y_train)
y_test = torch.Tensor(y_test)
X_train.shape, X_test.shape, y_train.shape, y_test.shape
(torch.Size([120, 4]), torch.Size([30, 4]), torch.Size([120, 3]), torch.Size([30, 3]))
4. Build the neural network model
class Neural_Network(nn.Module):
    def __init__(self):
        super().__init__()
        self.output = nn.Linear(X_train.shape[1], y_train.shape[1])
        self.sigmoid = nn.Sigmoid()
        self.softmax = nn.Softmax(dim=1)

    def forward(self, x):
        x = self.output(x)
        # Note: applying Sigmoid after Softmax is redundant (Softmax already
        # maps each row into (0, 1)); it is kept because the results below
        # were produced with it.
        x = self.softmax(x)
        x = self.sigmoid(x)
        return x
model = Neural_Network()
model
Neural_Network(
  (output): Linear(in_features=4, out_features=3, bias=True)
  (sigmoid): Sigmoid()
  (softmax): Softmax(dim=1)
)
5. Train the neural network model on the training data
epochs = 1000
learning_rate = 0.003
criterion = nn.BCELoss()
optimizer = optim.Adam(model.parameters(), lr=learning_rate)

for i in range(epochs):
    outputs = model(X_train)
    loss = criterion(outputs, y_train)
    print("Epoch:{},Loss:{:4f}".format(i, loss))
    optimizer.zero_grad()
    loss.backward(retain_graph=True)
    optimizer.step()
Epoch:0,Loss:0.788641
Epoch:1,Loss:0.787205
Epoch:2,Loss:0.785736
Epoch:3,Loss:0.784244
Epoch:4,Loss:0.782739
Epoch:5,Loss:0.781233
Epoch:6,Loss:0.779737
Epoch:7,Loss:0.778262
Epoch:8,Loss:0.776822
Epoch:9,Loss:0.775428
...
Epoch:997,Loss:0.609841
Epoch:998,Loss:0.609802
Epoch:999,Loss:0.609763
loss = criterion(model(X_test), y_test)
loss
tensor(0.6142, grad_fn=<BinaryCrossEntropyBackward0>)
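The BCE loss alone says little about classification quality; a minimal sketch (assuming the variables above, not part of the original run) computes test accuracy by comparing class indices recovered from the one-hot labels:

with torch.no_grad():
    preds = model(X_test).argmax(dim=1)    # predicted class index
    labels = y_test.argmax(dim=1)          # true class index from the one-hot rows
    accuracy = (preds == labels).float().mean().item()
print(accuracy)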