Predicting Boston Housing Prices with a Multilayer Neural Network in PyTorch
About the Boston Housing Dataset
The Boston housing dataset is a classic machine learning dataset used to predict the median price of homes in the Boston area. It contains 506 samples, each with 13 features capturing town-level indicators such as the crime rate, the proportion of residential land, and the proportion of non-retail business acres per town. The target variable is the median home price, in thousands of dollars.
The dataset's features are:
CRIM: per-capita crime rate by town
ZN: proportion of residential land zoned for lots over 25,000 sq. ft.
INDUS: proportion of non-retail business acres per town
CHAS: Charles River dummy variable (1 if the tract bounds the river; 0 otherwise)
NOX: nitric oxide concentration (parts per 10 million)
RM: average number of rooms per dwelling
AGE: proportion of owner-occupied units built before 1940
DIS: weighted distances to five Boston employment centers
RAD: index of accessibility to radial highways
TAX: full-value property tax rate per $10,000
PTRATIO: pupil-teacher ratio by town
B: 1000(Bk - 0.63)^2, where Bk is the proportion of Black residents by town
LSTAT: percentage of lower-status population
The Boston housing dataset is typically used for training and testing regression models that predict the median home price. It is widely used in machine learning and data science teaching to benchmark the performance of different algorithms and models.
1. Import the required libraries and modules
import torch
import torch.nn as nn
import torch.optim as optim
import numpy as np
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
2. Prepare the dataset
# Load the Boston housing dataset
data = load_boston()
X, y = data['data'], data['target']
C:\Users\Admin\AppData\Roaming\Python\Python37\site-packages\sklearn\utils\deprecation.py:87: FutureWarning: Function load_boston is deprecated; `load_boston` is deprecated in 1.0 and will be removed in 1.2.

    The Boston housing prices dataset has an ethical problem. You can refer to
    the documentation of this function for further details.

    The scikit-learn maintainers therefore strongly discourage the use of this
    dataset unless the purpose of the code is to study and educate about
    ethical issues in data science and machine learning.

    In this special case, you can fetch the dataset from the original
    source::

        import pandas as pd
        import numpy as np

        data_url = "http://lib.stat.cmu.edu/datasets/boston"
        raw_df = pd.read_csv(data_url, sep="\s+", skiprows=22, header=None)
        data = np.hstack([raw_df.values[::2, :], raw_df.values[1::2, :2]])
        target = raw_df.values[1::2, 2]

    Alternative datasets include the California housing dataset (i.e.
    :func:`~sklearn.datasets.fetch_california_housing`) and the Ames housing
    dataset. You can load the datasets as follows::

        from sklearn.datasets import fetch_california_housing
        housing = fetch_california_housing()

    for the California housing dataset and::

        from sklearn.datasets import fetch_openml
        housing = fetch_openml(name="house_prices", as_frame=True)

    for the Ames housing dataset.

  warnings.warn(msg, category=FutureWarning)
This warning can safely be ignored, or you can follow the instructions inside it and load the data from the original source instead.
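For reference, here is a sketch of that warning-free alternative, adapted from the snippet embedded in the warning itself; the resulting X and y arrays can be used in place of data['data'] and data['target'] above:

import pandas as pd
import numpy as np

# Fetch the raw Boston data from the original source, as the warning suggests
data_url = "http://lib.stat.cmu.edu/datasets/boston"
raw_df = pd.read_csv(data_url, sep=r"\s+", skiprows=22, header=None)

# Each sample spans two physical rows in this file; stitch them back together
X = np.hstack([raw_df.values[::2, :], raw_df.values[1::2, :2]])
y = raw_df.values[1::2, 2]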
3. Split into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
4. Standardize the features
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)
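As an optional sanity check, the standardized training features should now have a per-column mean of roughly 0 and a standard deviation of roughly 1; note that the scaler is fit on the training set only, so no test-set statistics leak into training:

# Each feature column should be approximately zero-mean, unit-variance
print(X_train.mean(axis=0).round(3))
print(X_train.std(axis=0).round(3))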
5. Convert to PyTorch tensors
X_train = torch.tensor(X_train, dtype=torch.float32)
X_test = torch.tensor(X_test, dtype=torch.float32)
y_train = torch.tensor(y_train, dtype=torch.float32).view(-1, 1)
y_test = torch.tensor(y_test, dtype=torch.float32).view(-1, 1)
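The .view(-1, 1) calls reshape the targets from shape (N,) to (N, 1) so they match the model's output shape; without them, nn.MSELoss would broadcast the (N, 1) predictions against the (N,) targets and quietly compute a loss over an N x N difference matrix. A small illustration with hypothetical toy tensors (not the housing data):

t = torch.tensor([1.0, 2.0, 3.0])        # targets, shape (3,)
p = torch.tensor([[1.0], [2.0], [3.0]])  # predictions, shape (3, 1)
mse = nn.MSELoss()
print(mse(p, t.view(-1, 1)).item())  # 0.0 -- shapes agree, loss is correct
# mse(p, t) would broadcast to a (3, 3) matrix and emit a UserWarning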
6. Define the neural network model
class FeedforwardNN(nn.Module):
    def __init__(self, input_dim, hidden_dim, output_dim):
        super(FeedforwardNN, self).__init__()
        self.fc1 = nn.Linear(input_dim, hidden_dim)   # input layer -> hidden layer
        self.relu = nn.ReLU()                         # non-linear activation
        self.fc2 = nn.Linear(hidden_dim, output_dim)  # hidden layer -> output layer

    def forward(self, x):
        x = self.fc1(x)
        x = self.relu(x)
        x = self.fc2(x)
        return x
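For a network this small, an equivalent model can also be written with nn.Sequential; a sketch using the layer sizes set later in this post (13 input features, 64 hidden units, 1 output):

# Equivalent two-layer model built with nn.Sequential (same forward pass)
model_seq = nn.Sequential(
    nn.Linear(13, 64),  # input_dim -> hidden_dim
    nn.ReLU(),
    nn.Linear(64, 1),   # hidden_dim -> output_dim
)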
7. Define the training and evaluation functions
def train(model, criterion, optimizer, X, y, num_epochs=100, batch_size=32):
    model.train()
    num_samples = X.shape[0]
    num_batches = num_samples // batch_size
    for epoch in range(num_epochs):
        total_loss = 0
        for batch_idx in range(num_batches):
            # Slice out the current mini-batch
            start_idx = batch_idx * batch_size
            end_idx = start_idx + batch_size
            batch_X = X[start_idx:end_idx]
            batch_y = y[start_idx:end_idx]
            # Standard PyTorch update: clear gradients, forward, backward, step
            optimizer.zero_grad()
            outputs = model(batch_X)
            loss = criterion(outputs, batch_y)
            loss.backward()
            optimizer.step()
            total_loss += loss.item()
        print(f"Epoch {epoch + 1}/{num_epochs}, Loss: {total_loss / num_batches:.4f}")

def evaluate(model, criterion, X, y):
    model.eval()
    with torch.no_grad():
        outputs = model(X)
        loss = criterion(outputs, y)
        rmse = torch.sqrt(loss)
        mae = torch.mean(torch.abs(outputs - y))
    return loss.item(), rmse.item(), mae.item()
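Note that this loop visits the batches in a fixed order and drops the final partial batch (num_samples // batch_size). A common variation, sketched below using PyTorch's built-in TensorDataset and DataLoader, reshuffles the data every epoch and keeps the last partial batch:

from torch.utils.data import TensorDataset, DataLoader

def train_shuffled(model, criterion, optimizer, X, y, num_epochs=100, batch_size=32):
    # shuffle=True draws a fresh random batch order each epoch
    loader = DataLoader(TensorDataset(X, y), batch_size=batch_size, shuffle=True)
    model.train()
    for epoch in range(num_epochs):
        total_loss = 0.0
        for batch_X, batch_y in loader:
            optimizer.zero_grad()
            loss = criterion(model(batch_X), batch_y)
            loss.backward()
            optimizer.step()
            total_loss += loss.item()
        print(f"Epoch {epoch + 1}/{num_epochs}, Loss: {total_loss / len(loader):.4f}")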
8. Run training and evaluation
# Set the model hyperparameters
input_dim = X_train.shape[1]
hidden_dim = 64
output_dim = 1
9. Initialize the model
model = FeedforwardNN(input_dim, hidden_dim, output_dim)
10. Define the loss function and optimizer
criterion = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=0.001)
11. Train the model
train(model, criterion, optimizer, X_train, y_train, num_epochs=500, batch_size=32)
Epoch 1/500, Loss: 7.6713
Epoch 2/500, Loss: 7.6533
Epoch 3/500, Loss: 7.6367
Epoch 4/500, Loss: 7.6191
Epoch 5/500, Loss: 7.6038
Epoch 6/500, Loss: 7.5861
Epoch 7/500, Loss: 7.5701
Epoch 8/500, Loss: 7.5533
Epoch 9/500, Loss: 7.5396
Epoch 10/500, Loss: 7.5219
... (epochs 11-491 omitted; the loss decreases steadily) ...
Epoch 492/500, Loss: 4.3911
Epoch 493/500, Loss: 4.3900
Epoch 494/500, Loss: 4.3859
Epoch 495/500, Loss: 4.3812
Epoch 496/500, Loss: 4.3805
Epoch 497/500, Loss: 4.3768
Epoch 498/500, Loss: 4.3754
Epoch 499/500, Loss: 4.3711
Epoch 500/500, Loss: 4.3695
12. Evaluate the model
test_loss, test_rmse, test_mae = evaluate(model, criterion, X_test, y_test)
print(f"Test Loss: {test_loss:.4f}, Test RMSE: {test_rmse:.4f}, Test MAE: {test_mae:.4f}")
Test Loss: 12.0768, Test RMSE: 3.4752, Test MAE: 2.2279
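Since prices are in thousands of dollars, the test MAE of about 2.23 corresponds to an average absolute error of roughly $2,230 per house. To inspect individual predictions against the true prices, a minimal sketch:

# Compare a few test-set predictions with the actual median prices ($1000s)
model.eval()
with torch.no_grad():
    preds = model(X_test[:5])
for pred, actual in zip(preds.view(-1).tolist(), y_test[:5].view(-1).tolist()):
    print(f"predicted: {pred:.1f}, actual: {actual:.1f}")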