- [Q4: PyTorch on CIFAR-10](#q4-pytorch-on-cifar-10)
- [Q5: Image Captioning with Vanilla RNNs](#q5-image-captioning-with-vanilla-rnns)
- [Submitting your work](#submitting-your-work)
### Setup
Once you have completed all Colab notebooks **except `collect_submission.ipynb`**, proceed to the submission instructions.
In this assignment you will practice writing backpropagation code and training Neural Networks and Convolutional Neural Networks. The goals of this assignment are as follows:

- Understand **Neural Networks** and how they are arranged in layered architectures.
- Understand and be able to implement (vectorized) **backpropagation**.
- Implement various **update rules** used to optimize Neural Networks.
- Implement **Batch Normalization** and **Layer Normalization** for training deep networks.
- Implement **Dropout** to regularize networks.
- Understand the architecture of **Convolutional Neural Networks** and get practice with training them.
- Gain experience with a major deep learning framework, **PyTorch**.
- Understand and implement **Recurrent Neural Networks (RNNs)**, and combine them with CNNs for image captioning.
The notebook `FullyConnectedNets.ipynb` will have you implement fully connected networks of arbitrary depth. To optimize these models you will implement several popular update rules.
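These update rules share a common shape: each takes a weight array, its gradient, and a dict of hyperparameters, and returns the updated weight. As one illustration, here is a minimal NumPy sketch of SGD with momentum — the function name, `config` dict, and default values are illustrative conventions, not the assignment's exact interface:

```python
import numpy as np

def sgd_momentum(w, dw, config=None):
    """One SGD-with-momentum update (illustrative sketch only).

    Keeps a running "velocity" that accumulates past gradients,
    which smooths the descent direction across iterations.
    """
    if config is None:
        config = {}
    config.setdefault("learning_rate", 1e-2)  # assumed default
    config.setdefault("momentum", 0.9)        # assumed default
    v = config.get("velocity", np.zeros_like(w))

    # Decay the old velocity, add the new (scaled) gradient, then step.
    v = config["momentum"] * v - config["learning_rate"] * dw
    next_w = w + v
    config["velocity"] = v
    return next_w, config
```

Calling it repeatedly with the returned `config` lets the velocity build up across steps, which is the whole point of momentum.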
### Q1: Batch Normalization
In the notebook `BatchNormalization.ipynb` you will implement batch normalization and use it to train deep fully connected networks.
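The core training-time idea is to normalize each feature to zero mean and unit variance over the minibatch, then apply a learnable scale and shift. A minimal NumPy sketch (the notebook's version also maintains running statistics for test time and returns a cache for the backward pass, both omitted here):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Training-time batch normalization over a (N, D) minibatch.

    Minimal sketch: normalizes per feature, then scales and shifts.
    """
    mu = x.mean(axis=0)                     # per-feature mean over the batch
    var = x.var(axis=0)                     # per-feature (biased) variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit variance
    out = gamma * x_hat + beta              # learnable scale and shift
    return out
```

With `gamma = 1` and `beta = 0` the output columns have (approximately) zero mean and unit standard deviation regardless of the input distribution.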
### Q2: Dropout
The notebook `Dropout.ipynb` will help you implement dropout and explore its effects on model generalization.
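The usual formulation is "inverted" dropout: at train time each unit is zeroed with some probability and the survivors are rescaled, so test time needs no correction. A sketch under the assumption that `p` is the drop probability (the notebook fixes its own convention for `p`):

```python
import numpy as np

def dropout_forward(x, p, train=True):
    """Inverted dropout sketch: drop each unit with probability p at
    train time and rescale survivors by 1/(1-p), so the expected
    activation is unchanged and test time is a no-op."""
    if not train:
        return x
    mask = (np.random.rand(*x.shape) >= p) / (1.0 - p)  # 0 or 1/(1-p)
    return x * mask
```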
### Q3: Convolutional Neural Networks
In the notebook `ConvolutionalNetworks.ipynb` you will implement several new layers that are commonly used in convolutional networks.
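The heart of the convolution layer is sliding a small filter over the input and taking a dot product at each position. A deliberately naive single-channel sketch (the notebook's layer additionally handles batches, channels, stride, and padding, and will need to be vectorized):

```python
import numpy as np

def conv_forward_naive(x, w):
    """Naive 'valid' 2D convolution (cross-correlation, as is standard
    in deep learning) of one single-channel image x with filter w."""
    H, W = x.shape
    HH, WW = w.shape
    out = np.zeros((H - HH + 1, W - WW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Dot product of the filter with the patch under it.
            out[i, j] = np.sum(x[i:i + HH, j:j + WW] * w)
    return out
```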
### Q4: PyTorch on CIFAR-10
For this part, you will be working with PyTorch, a popular and powerful deep learning framework.
Open up `PyTorch.ipynb`. There, you will learn how the framework works, culminating in training a convolutional network of your own design on CIFAR-10 to get the best performance you can.
### Q5: Image Captioning with Vanilla RNNs
The notebook `RNN_Captioning_pytorch.ipynb` will walk you through implementing vanilla recurrent neural networks and applying them to image captioning on COCO.
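A vanilla RNN processes a sequence one timestep at a time, mixing the current input with the previous hidden state through a tanh nonlinearity. A single-step NumPy sketch with assumed shapes (the notebook builds the full captioning model in PyTorch; names here are illustrative):

```python
import numpy as np

def rnn_step_forward(x, prev_h, Wx, Wh, b):
    """One vanilla RNN timestep: next_h = tanh(x Wx + prev_h Wh + b).

    Assumed shapes: x (N, D), prev_h (N, H), Wx (D, H), Wh (H, H), b (H,).
    """
    next_h = np.tanh(x @ Wx + prev_h @ Wh + b)
    return next_h
```

For captioning, the image feature (from a CNN) typically initializes the hidden state, and the step above is then applied once per word of the caption.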
### Submitting your work
**Important**. Please make sure that the submitted notebooks have been run and the cell outputs are visible.
If your submission for this step was successful, you should see a confirmation message.
**_Note: When you have completed all notebooks, please ensure that your most recent kernel execution order is chronological, as this can otherwise cause issues for the Gradescope autograder. If this isn't the case, you should restart the kernel for that notebook and rerun all cells using the Runtime menu option "Restart and Run All"._**
**2.** Submit the PDF and the zip file to [Gradescope](https://www.gradescope.com/courses/1012166).
Remember to download `a2_code_submission.zip` and `a2_inline_submission.pdf` locally before submitting to Gradescope.