Deep Learning Optimizers

An optimizer iteratively adjusts a model's parameters to reduce the loss. Each training step computes gradients with loss.backward(), and optimizer.step() then applies the optimizer's update rule to every parameter (for plain SGD, subtracting the learning rate times the gradient); optimizer.zero_grad() clears the accumulated gradients before the next step.
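To make that update rule concrete, here is a minimal, self-contained sketch (illustrative, not from the original post; the tiny linear model and random data are stand-ins) of what one plain SGD step does by hand:

import torch
from torch import nn

model = nn.Linear(4, 2)                  # tiny stand-in model
x, y = torch.randn(8, 4), torch.randn(8, 2)
loss = nn.functional.mse_loss(model(x), y)
loss.backward()                          # fills p.grad for every parameter

lr = 0.01
with torch.no_grad():                    # parameter updates must not be tracked by autograd
    for p in model.parameters():
        p -= lr * p.grad                 # w <- w - lr * dL/dw
        p.grad.zero_()                   # reset gradients for the next step

In practice you never write this loop yourself; torch.optim packages it (and more sophisticated rules) behind zero_grad() and step(), as the next example shows.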

Example, the typical optimizer usage pattern:

for input, target in dataset:
    optimizer.zero_grad()           # clear gradients accumulated from the previous step
    output = model(input)           # forward pass
    loss = loss_fn(output, target)  # compute the loss
    loss.backward()                 # backpropagate to fill each parameter's .grad
    optimizer.step()                # apply the update rule to the parameters
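Before this loop runs, the optimizer must be bound to the model's parameters. A minimal sketch; the lr and momentum values here are illustrative assumptions, and the stand-in model replaces whatever model the loop above uses:

import torch
from torch import nn

model = nn.Linear(4, 2)   # stand-in for the model trained in the loop above
# lr and momentum are illustrative choices, not prescribed by the post.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)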
The same pattern in a complete script, training a small convolutional network on CIFAR10:

import torch
import torchvision
from torch import nn
from torch.nn import Sequential, Conv2d, MaxPool2d, Flatten, Linear
from torch.utils.data import DataLoader

# CIFAR10 test split, converted to tensors
dataset = torchvision.datasets.CIFAR10("../data", train=False,
                                       transform=torchvision.transforms.ToTensor(),
                                       download=True)
dataloader = DataLoader(dataset, batch_size=1)  # one image per batch
class XuZhenyu(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        # For 32x32 CIFAR10 inputs, three stride-2 poolings leave 4x4 feature
        # maps, so the flattened size is 64 * 4 * 4 = 1024.
        self.model1 = Sequential(
            Conv2d(3, 32, 5, padding=2),
            MaxPool2d(2),
            Conv2d(32, 32, 5, padding=2),
            MaxPool2d(2),
            Conv2d(32, 64, 5, padding=2),
            MaxPool2d(2),
            Flatten(),
            Linear(1024, 64),
            Linear(64, 10),
        )

    def forward(self, x):
        x = self.model1(x)
        return x

loss = nn.CrossEntropyLoss()
xzy = XuZhenyu()
optim = torch.optim.SGD(xzy.parameters(), lr=0.01)
for epoch in range(20):
    running_loss = 0.0
    for data in dataloader:
        imgs, targets = data
        outputs = xzy(imgs)
        result_loss = loss(outputs, targets)
        optim.zero_grad()        # clear gradients from the previous step
        result_loss.backward()   # backpropagate
        optim.step()             # update the parameters
        # .item() converts the loss tensor to a plain float for accumulation
        running_loss = running_loss + result_loss.item()
    print(running_loss)
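To try a different update rule, only the optimizer construction needs to change. For instance, a sketch (not in the original) that swaps SGD for Adam, with lr=1e-3 as an assumed, commonly used default:

# Illustrative swap: Adam instead of SGD; lr=1e-3 is an assumed value.
optim = torch.optim.Adam(xzy.parameters(), lr=1e-3)

Adam keeps per-parameter running estimates of the first and second gradient moments, so it adapts the effective step size for each parameter and often needs less learning-rate tuning than plain SGD.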
