Deep Learning 1 — Getting Started

Author: admin · Published: 2024-06-17


I've noticed this stuff being used more and more, so I want to learn it. I haven't decided exactly how yet, and I don't have a study plan. I bought two books: one is Deep Learning from Scratch, by a Japanese author; the other is the "flower book" (Goodfellow et al.'s Deep Learning). And I'm also going back to Gatech to take the online course, whose syllabus is roughly as follows.

https://omscs.gatech.edu/sites/default/files/documents/2024/Syllabi-CS%207643%202024-1.pdf

Week1:

Module 1: Introduction to Neural Networks
Go through Welcome/Getting Started
Lesson 1: Linear Classifiers and Gradient Descent
Readings:
- DL book: Linear Algebra background
- DL book: Probability background
- DL book: ML Background
- LeCun et al., Nature ’15
- Shannon, 1956
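
Lesson 1's two building blocks, a linear classifier and gradient descent, fit in a few lines. Here is a minimal sketch of logistic regression trained by batch gradient descent (pure NumPy, with toy data of my own; this is not the course's code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic(X, y, lr=0.5, steps=500):
    """Batch gradient descent on the binary cross-entropy loss."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)            # predicted P(y = 1 | x)
        w -= lr * X.T @ (p - y) / len(y)  # dL/dw for the logistic loss
        b -= lr * np.mean(p - y)          # dL/db
    return w, b

# Toy data: label 1 iff x0 + x1 >= 2 (linearly separable)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [0.2, 0.3], [1., 1.], [2., 1.]])
y = np.array([0., 0., 0., 0., 1., 1.])
w, b = train_logistic(X, y)
preds = (sigmoid(X @ w + b) > 0.5).astype(float)
print(preds)  # should match y after training
```

Because the logistic loss is convex, plain batch gradient descent is enough here; the lesson's point is that the same loop generalizes to deep networks once the gradient comes from backpropagation.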

Week2:

Lesson 2: Neural Networks
Readings:
- DL book: Deep Feedforward Nets
- Matrix calculus for deep learning
- Automatic Differentiation Survey, Baydin et al.
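
The Baydin et al. survey in this week's readings covers reverse-mode automatic differentiation, which is what backpropagation is. A minimal sketch of the idea (the `Value` class and its API are my own invention for illustration, not the course's code):

```python
import math

class Value:
    """A scalar that records how it was computed, so gradients can flow back."""
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._local_grads = local_grads  # d(self)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other), (other.data, self.data))

    def tanh(self):
        t = math.tanh(self.data)
        return Value(t, (self,), (1.0 - t * t,))

    def backward(self):
        # Topological order, then apply the chain rule node by node.
        order, seen = [], set()
        def build(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v._parents:
                    build(p)
                order.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, g in zip(v._parents, v._local_grads):
                p.grad += g * v.grad

# One neuron: y = tanh(w*x + b); at w*x + b = 0, dy/dw = (1 - tanh(0)^2) * x = x
x, w, b = Value(0.5), Value(2.0), Value(-1.0)
y = (w * x + b).tanh()
y.backward()
print(y.data, w.grad, b.grad)
```

PyTorch's autograd (covered later in the course) does exactly this, just over tensors instead of scalars.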

Week3:

Lesson 3: Optimization of Deep Neural Networks
Readings:
- DL book: Regularization for DL
- DL book: Optimization for Training Deep Models
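
Lesson 3 is about optimizers. One classic improvement over plain gradient descent is momentum, which damps oscillation on ill-conditioned objectives; a sketch on a toy quadratic (my own example, not from the course):

```python
import numpy as np

# f(x) = 0.5 x^T A x, an ill-conditioned quadratic (condition number 25).
A = np.diag([1.0, 25.0])

def run(lr, momentum, steps=100):
    x, v = np.array([1.0, 1.0]), np.zeros(2)
    for _ in range(steps):
        v = momentum * v - lr * (A @ x)  # velocity accumulates past gradients
        x = x + v
    return x

plain = run(lr=0.02, momentum=0.0)   # plain gradient descent
heavy = run(lr=0.02, momentum=0.9)   # heavy-ball momentum
print(np.linalg.norm(plain), np.linalg.norm(heavy))
```

With the same learning rate and step budget, the momentum run ends much closer to the minimum at the origin: the accumulated velocity speeds up the flat direction while the stiff direction's oscillations partially cancel.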

Week4:

Module 2: Convolutional Neural Networks
(OPTIONAL) Lesson 4: Data Wrangling
Lesson 5: Convolution and Pooling Layers
Readings:
- Preprocessing for deep learning: from covariance matrix to image whitening
- cs231n on preprocessing
- DL book: Convolutional Networks
- Optional: Khetarpal, Khimya, et al. “Re-evaluate: Reproducibility in evaluating reinforcement learning algorithms.” (2018). See related blog post
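
Lesson 5's two building blocks are easy to write out by hand. A sketch of "valid" convolution (really cross-correlation, which is what deep learning libraries implement) and non-overlapping max pooling in NumPy, on a toy image of my own:

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' cross-correlation: slide the kernel, sum the elementwise products."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool2d(x, size=2):
    """Non-overlapping max pooling with stride == size."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h*size, :w*size].reshape(h, size, w, size).max(axis=(1, 3))

# A vertical-edge detector on a tiny image: left half dark, right half bright.
img = np.array([[0., 0., 1., 1.],
                [0., 0., 1., 1.],
                [0., 0., 1., 1.],
                [0., 0., 1., 1.]])
edge = np.array([[-1., 1.],
                 [-1., 1.]])
fmap = conv2d(img, edge)   # strongest response at the dark/bright boundary
pooled = max_pool2d(fmap)  # pooling keeps the strong response, shrinks the map
print(fmap)
print(pooled)
```

The feature map fires only on the column where dark meets bright, which is the whole point of a learned convolutional filter; pooling then summarizes it while discarding exact position.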

Week5:

Lesson 6: Convolutional Neural Network Architectures

Week6:

Lesson 7: Visualization
Lesson 8: PyTorch and Scalable Training
Readings:
- Understanding Neural Networks Through Deep Visualization
- Grad-CAM: Visual Explanations from Deep Networks via Gradient-based Localization

Week7:

Lesson 9: Advanced Computer Vision Architectures
Lesson 10: Bias and Fairness
Readings:
- Fully Convolutional Networks for Semantic Segmentation

Week8:

Module 3: Structured Neural Representations
Lesson 11: Introduction to Structured Representations
Lesson 12: Language Models
Readings:
- DL book: Sequence Modeling and Recurrent Neural Networks (RNNs)

Week9:

Lesson 13: Embeddings
Readings:
- word2vec tutorial
- word2vec paper
- StarSpace paper
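
The point of embeddings like word2vec is that similar words get nearby vectors, usually compared by cosine similarity. A sketch with made-up 3-d vectors (real word2vec embeddings are typically 100 to 300 dimensions; these numbers are invented for illustration):

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity: dot product of the L2-normalized vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Made-up toy embeddings, not real word2vec output.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.7, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}
print(cosine(emb["king"], emb["queen"]))  # near 1: related words
print(cosine(emb["king"], emb["apple"]))  # much smaller: unrelated words
```

word2vec's training objective (predicting context words) is what pushes related words' vectors together; cosine similarity is just how you read the result back out.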

Week10:

Lesson 14: Neural Attention Models
Readings:
- Attention Is All You Need
- BERT paper
- The Illustrated Transformer
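
The core equation of "Attention Is All You Need" is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A NumPy sketch (the toy Q, K, V below are mine, chosen so each query attends to its matching key):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # numerically stable
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # similarity of each query to each key
    weights = softmax(scores, axis=-1)  # rows sum to 1
    return weights @ V, weights

# 3 queries/keys of dim 4, built so query i matches key i sharply.
rng = np.random.default_rng(0)
Q = np.eye(3, 4) * 5.0
K = np.eye(3, 4) * 5.0
V = rng.standard_normal((3, 4))
out, attn = attention(Q, K, V)
print(np.round(attn, 3))  # attention weights close to the identity matrix
```

With near-one-hot weights, each output row is essentially the matching row of V; softer weights would mix value rows, which is how attention aggregates context.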

Week11:

Lesson 15: Neural Machine Translation
Lesson 16: Automated Speech Recognition (ASR)

Week12:

Module 4: Advanced Topics
Lesson 17: Deep Reinforcement Learning
Readings:
- MDP Notes (courtesy Byron Boots)
- Notes on Q-learning (courtesy Byron Boots)
- Policy iteration notes (courtesy Byron Boots)
- Policy gradient notes (courtesy Byron Boots)
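
The Q-learning notes boil down to the tabular update Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)). A sketch on a toy 5-state chain (my own toy MDP, not the one in the course notes):

```python
import random

# Toy chain MDP: states 0..4, actions move left/right, reward 1.0 on
# reaching state 4 (terminal). Everything here is a made-up example.
N_STATES = 5
ACTIONS = (0, 1)                        # 0 = left, 1 = right
ALPHA, GAMMA = 0.5, 0.9
GOAL = N_STATES - 1

def step(s, a):
    s2 = max(0, s - 1) if a == 0 else min(GOAL, s + 1)
    return s2, (1.0 if s2 == GOAL else 0.0), s2 == GOAL

random.seed(0)
Q = [[0.0, 0.0] for _ in range(N_STATES)]
for _ in range(500):                    # episodes
    s = random.randrange(GOAL)          # random non-terminal start state
    for _ in range(50):                 # step limit per episode
        a = random.choice(ACTIONS)      # Q-learning is off-policy, so even a
                                        # purely random behavior policy works
        s2, r, done = step(s, a)
        # Bellman backup toward r + gamma * max_a' Q(s', a')
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
        s = s2
        if done:
            break

policy = [max(ACTIONS, key=lambda a: Q[s][a]) for s in range(N_STATES)]
print(policy[:4])  # greedy policy: move right in every non-terminal state
```

Because updates bootstrap from max over the next state's Q-values rather than from the action actually taken, the greedy policy extracted at the end is optimal even though the data came from random behavior.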

Week13:

Lesson 18: Unsupervised and Semi-Supervised Learning

Week14:

Lesson 19: Generative Models
Readings:
- Tutorial on Variational Autoencoder
- NIPS 2016 Tutorial: Generative Adversarial Networks

Looking at the syllabus, the core is still neural networks.

Then there are the main network architectures:

- Convolutional Neural Networks (CNNs): mainly for image processing and computer vision tasks.
- Recurrent Neural Networks (RNNs) and their variants (LSTM, GRU): mainly for sequence data, such as time-series analysis and natural language processing.
- Generative Adversarial Networks (GANs): for generating realistic data samples, such as images.
- Autoencoders: for unsupervised learning and feature extraction.

That's about it. Doesn't look like too much...


To be continued...
