In PyTorch, the learning rate can be adjusted in several ways:
1. Use a learning-rate scheduler such as StepLR, which multiplies the learning rate by gamma every step_size epochs:

import torch.optim as optim
from torch.optim.lr_scheduler import StepLR

optimizer = optim.SGD(model.parameters(), lr=0.1)
# Decay the learning rate by a factor of 0.1 every 30 epochs
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(num_epochs):
    # Train the model
    ...
    # Update the learning rate at the end of each epoch
    scheduler.step()
2. Manually set the learning rate once a given epoch is reached:

optimizer = optim.SGD(model.parameters(), lr=0.1)

for epoch in range(num_epochs):
    # Train the model
    ...
    # Drop the learning rate to 0.01 at epoch 30
    if epoch == 30:
        for param_group in optimizer.param_groups:
            param_group['lr'] = 0.01
3. Decay the learning rate by a fixed factor at regular intervals:

optimizer = optim.SGD(model.parameters(), lr=0.1)

for epoch in range(num_epochs):
    # Train the model
    ...
    # Multiply the learning rate by 0.1 every 10 epochs (skip epoch 0 so the
    # initial learning rate is not reduced before training starts)
    if epoch > 0 and epoch % 10 == 0:
        for param_group in optimizer.param_groups:
            param_group['lr'] *= 0.1
These are a few common ways to adjust the learning rate; choose the one that best fits your training setup.
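If you would rather adjust the learning rate based on validation performance instead of a fixed schedule, PyTorch also provides ReduceLROnPlateau. Below is a minimal sketch; the `val_loss` placeholder and the `factor`/`patience` values are assumptions for illustration, not part of the original example:

import torch.optim as optim
from torch.optim.lr_scheduler import ReduceLROnPlateau

optimizer = optim.SGD(model.parameters(), lr=0.1)
# Reduce the learning rate by a factor of 0.1 when the monitored metric
# has not improved for 5 consecutive epochs
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.1, patience=5)

for epoch in range(num_epochs):
    # Train the model and evaluate on the validation set
    ...
    val_loss = ...  # validation loss computed this epoch (placeholder)
    # Pass the monitored metric so the scheduler can decide whether to decay
    scheduler.step(val_loss)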