Daily Internship: Xiaomi Computer Vision Algorithm Role Interview Experience

Author: admin · Published: 2024-06-10

Table of Contents

  • Process
  • Questions
    • Please write out the model code used in your project: ResNet50
      • (1) Network degradation: making the network deeper actually makes it perform worse
      • (2) Overfitting: great performance on the training set, poor on the test set
    • Can the model in your work be replaced with ViT?
    • Have you looked into Stable Diffusion and Transformer?
  • Summary

Process

  • Self-introduction
  • Walk through your projects
  • Walk through your paper
  • Write code

Questions

Please write out the model code used in your project: ResNet50

Interviewer: writing out just one unit (one block) is enough.
In the actual interview, writing pseudocode was fine.
Source code: vision/torchvision/models/resnet.py

from typing import Callable, Optional

import torch.nn as nn
from torch import Tensor


def conv3x3(in_planes: int, out_planes: int, stride: int = 1, groups: int = 1, dilation: int = 1) -> nn.Conv2d:
    """3x3 convolution with padding (helper defined in the same torchvision file)."""
    return nn.Conv2d(
        in_planes,
        out_planes,
        kernel_size=3,
        stride=stride,
        padding=dilation,
        groups=groups,
        dilation=dilation,
        bias=False,
    )


class BasicBlock(nn.Module):
    expansion: int = 1

    def __init__(
        self,
        inplanes: int,
        planes: int,
        stride: int = 1,
        downsample: Optional[nn.Module] = None,
        groups: int = 1,
        base_width: int = 64,
        dilation: int = 1,
        norm_layer: Optional[Callable[..., nn.Module]] = None,
    ) -> None:
        super().__init__()
        if norm_layer is None:
            norm_layer = nn.BatchNorm2d
        if groups != 1 or base_width != 64:
            raise ValueError("BasicBlock only supports groups=1 and base_width=64")
        if dilation > 1:
            raise NotImplementedError("Dilation > 1 not supported in BasicBlock")
        # Both self.conv1 and self.downsample layers downsample the input when stride != 1
        self.conv1 = conv3x3(inplanes, planes, stride)
        self.bn1 = norm_layer(planes)
        self.relu = nn.ReLU(inplace=True)
        self.conv2 = conv3x3(planes, planes)
        self.bn2 = norm_layer(planes)
        self.downsample = downsample
        self.stride = stride

    def forward(self, x: Tensor) -> Tensor:
        identity = x

        out = self.conv1(x)
        out = self.bn1(out)
        out = self.relu(out)

        out = self.conv2(out)
        out = self.bn2(out)

        if self.downsample is not None:
            identity = self.downsample(x)

        out += identity
        out = self.relu(out)

        return out

The essential part of it is as follows (I skipped around when writing it):

def __init__():
	self.conv1 = conv3x3(inplanes, planes, stride)
	self.bn1 = norm_layer(planes)
	self.relu = nn.ReLU(inplace = True)
	self.conv2 = conv3x3(planes, planes)
	self.bn2 = norm_layer(planes)
	self.downsample = downsample
	self.stride = stride

def forward(self, x: Tensor) -> Tensor:
	identity = x
	
	out = self.conv1(x)
	out = self.bn1(out)
	out = self.relu(out)
	
	out = self.conv2(out)
	out = self.bn2(out)
	
	if self.downsample is not None:
		identity = self.downsample(x)
	
	out += identity
	out = self.relu(out)

	return out

With hints from the interviewer, I wrote pseudocode roughly like this:

def forward(x):
	identity = x
	out = conv2d(x)
	out = batchnorm(out)
	out = relu(out)
	
	out = conv2d(out)
	out = batchnorm(out)
	
	out += identity
	out = relu(out)
	
	return out

The interview result isn't out yet, so I can't guarantee that what I wrote is correct!
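
One note from reviewing this afterwards: BasicBlock is the unit used in ResNet18/34, while ResNet50 itself is built by stacking Bottleneck blocks (1x1 reduce, 3x3, 1x1 expand, with a 4x channel expansion). A simplified sketch of that unit, not the exact torchvision code (groups, dilation and the norm_layer argument are left out):

from typing import Optional

import torch.nn as nn
from torch import Tensor


class Bottleneck(nn.Module):
    """Simplified ResNet50 unit: 1x1 reduce -> 3x3 -> 1x1 expand, plus the shortcut."""

    expansion: int = 4

    def __init__(self, inplanes: int, planes: int, stride: int = 1,
                 downsample: Optional[nn.Module] = None) -> None:
        super().__init__()
        self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=1, bias=False)
        self.bn1 = nn.BatchNorm2d(planes)
        self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, stride=stride,
                               padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(planes)
        self.conv3 = nn.Conv2d(planes, planes * self.expansion, kernel_size=1, bias=False)
        self.bn3 = nn.BatchNorm2d(planes * self.expansion)
        self.relu = nn.ReLU(inplace=True)
        self.downsample = downsample  # matches shapes when stride != 1 or channels change

    def forward(self, x: Tensor) -> Tensor:
        identity = x
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.relu(self.bn2(self.conv2(out)))
        out = self.bn3(self.conv3(out))
        if self.downsample is not None:
            identity = self.downsample(x)
        return self.relu(out + identity)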

To dig deeper into ResNet50, see 同济子豪兄's Bilibili video 【精读AI论文】ResNet深度残差网络 (an in-depth reading of the ResNet paper).

  • There are a few failure modes to be aware of:

(1) Network degradation: making the network deeper actually makes it perform worse

In plain terms: a kid signs up for an after-school tutoring class and ends up doing worse on both homework and exams (more studying actually led to worse results).
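
This is exactly what the residual connection targets: since the block computes relu(F(x) + x), the added layers only need to learn F(x) = 0 to act as an identity mapping, so making the network deeper should at least not make it worse. A minimal sketch of my own (not from the video) to illustrate:

import torch
from torch import nn

# Residual branch F: conv + batchnorm, zero-initialized to mimic "F(x) = 0".
branch = nn.Sequential(
    nn.Conv2d(64, 64, kernel_size=3, padding=1, bias=False),
    nn.BatchNorm2d(64),
)
for p in branch.parameters():
    nn.init.zeros_(p)
branch.eval()  # use running stats so BatchNorm maps zeros to zeros

x = torch.relu(torch.randn(1, 64, 8, 8))  # a feature map that already passed a ReLU
out = torch.relu(branch(x) + x)           # the residual block: relu(F(x) + x)
print(torch.allclose(out, x))             # True: the block acts as an identity mapping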

(2) Overfitting: great performance on the training set, poor on the test set

In plain terms: a kid does great on the homework but falls apart as soon as they sit the exam.
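
In practice this shows up as a widening gap between training and validation metrics, and regularization (weight decay, dropout, data augmentation) is the usual remedy. A minimal sketch with placeholder names (x_train, y_train, x_val, y_val are not real variables here):

import torch
from torch import nn

# A small classifier with two common regularizers: dropout and weight decay.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # dropout regularization
    nn.Linear(256, 10),
)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)  # weight decay

def accuracy(logits: torch.Tensor, labels: torch.Tensor) -> float:
    return (logits.argmax(dim=1) == labels).float().mean().item()

# Inside the training loop, log both numbers each epoch, e.g.:
#   train_acc = accuracy(model(x_train), y_train)
#   val_acc   = accuracy(model(x_val), y_val)
# A train_acc that keeps climbing while val_acc stalls or drops means overfitting.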

Can the model in your work be replaced with ViT?
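
For reference, a minimal sketch of what the swap could look like with torchvision models (num_classes is a placeholder, not a detail from the interview):

import torch
from torch import nn
from torchvision import models

num_classes = 10  # placeholder for whatever the task needs

# ResNet50 backbone (no pretrained weights loaded here): replace the final fully connected layer.
resnet = models.resnet50()
resnet.fc = nn.Linear(resnet.fc.in_features, num_classes)

# ViT-B/16 backbone: replace the classification head instead.
vit = models.vit_b_16()
vit.heads.head = nn.Linear(vit.heads.head.in_features, num_classes)

x = torch.randn(1, 3, 224, 224)  # torchvision's ViT expects a fixed 224x224 input
print(resnet(x).shape, vit(x).shape)  # both: torch.Size([1, 10])

The main caveat is data: without large-scale pretraining, a ViT usually needs much more training data than a ResNet to reach comparable accuracy.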

Have you looked into Stable Diffusion and Transformer?

A little bit.
Bilibili video: 【渣渣讲课】试图做一个正常讲解Latent / Stable Diffusion的成年人
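
For later review, a minimal text-to-image sketch using the Hugging Face diffusers library (the model id is just a commonly used example, nothing from the interview):

import torch
from diffusers import StableDiffusionPipeline

# Load a latent diffusion pipeline (text encoder + U-Net + VAE) and run one prompt.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")

image = pipe("a photo of an astronaut riding a horse").images[0]
image.save("astronaut.png")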

Summary

The first round focuses heavily on the papers and projects written on your resume, and also tests some cutting-edge knowledge related to the role. You will absolutely, absolutely not get a feel for any of this by just sitting in the lab! Be brave and take the first step.


Two days later, the official website showed the process had been terminated ╥﹏╥…
