ITP-NYU :: 4/21/2016
Recurrent neural networks
- Review feedforward neural networks (2:19)
- Feedforward vs. recurrence (7:38)
- How recurrent neural nets work (9:06)
- Training RNNs on text (character sequences) (11:32)
- RNNs and sequence-to-sequence (20:05)
- Image captioning (22:25)
- Advanced architectures and applications (28:12)
- Tutorial: text generation via torch-rnn (34:20)
- Tutorial: style transfer via neural-style (1:03:31)
Class notes
News / admin
- take Patrick's class next semester
- setting up terminal instances (offline)
- Real-time style transfer! Chainer implementation by Yusuke Tomoto
Review feedforward neural networks
- "Static" internal state
- Weights, activations, and applications
- Limitations of feedforward neural nets
- Fixed input and output size
- Internal state is static
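A minimal sketch (not from the class materials; numpy, with hypothetical layer sizes) of the feedforward forward pass reviewed above: weighted sums followed by activations. The weight shapes fix the input and output dimensions, and nothing persists between calls, which is the "static internal state" limitation.
```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 4, 8, 3           # hypothetical sizes, fixed once chosen
W1 = rng.normal(size=(n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_hidden, n_out))
b2 = np.zeros(n_out)

def forward(x):
    """One forward pass: weighted sum -> activation -> weighted sum."""
    h = np.tanh(x @ W1 + b1)              # hidden activations
    return h @ W2 + b2                    # output scores

y = forward(rng.normal(size=n_in))        # input must have exactly n_in values
print(y.shape)                            # always (n_out,); no state carries over
```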
Recurrent neural networks
- How RNNs work
- Hidden state and time steps (sketch below)
- Operating on sequences as inputs, outputs, or both
- Variety of architectures and corresponding use cases
- LSTMs and GRUs
- The Unreasonable Effectiveness of Recurrent Neural Networks (@karpathy)
- Understanding LSTM Networks (@colah)
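As a complement to the readings above, a minimal sketch (hypothetical sizes, untrained random weights) of the vanilla RNN update: at every time step the same weights combine the current input with the previous hidden state, so the hidden state carries information across the sequence. LSTMs and GRUs replace this single tanh update with gated updates, but the recurrence is the same idea.
```python
import numpy as np

rng = np.random.default_rng(1)

n_in, n_hidden = 5, 16                    # hypothetical sizes
W_xh = rng.normal(scale=0.1, size=(n_in, n_hidden))
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
b_h = np.zeros(n_hidden)

def step(x_t, h_prev):
    """One time step: h_t = tanh(x_t @ W_xh + h_prev @ W_hh + b_h)."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

sequence = rng.normal(size=(10, n_in))    # 10 time steps of n_in features each
h = np.zeros(n_hidden)                    # initial hidden state
for x_t in sequence:                      # the same weights are reused every step
    h = step(x_t, h)
print(h.shape)                            # (n_hidden,): a running summary of the sequence
```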
- Applications of RNNs
- Sampling text one character at a time (see the sketch after this list)
- Image captioning
- Sequence-to-one prediction
- e.g. sentiment analysis from audio or text frames
- Generating images from captions
- More applications
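The character-sampling idea above, sketched in numpy. `next_char_logits` is a hypothetical stand-in for a trained character-level RNN (torch-rnn's sampler follows the same loop): softmax the model's scores over the vocabulary, draw one character, feed it back in as the next input, and repeat. Temperature rescales the scores before the softmax; lower values give more conservative text.
```python
import numpy as np

rng = np.random.default_rng(2)
vocab = list("helo wrd")                  # toy character vocabulary

def next_char_logits(h, char_idx):
    """Hypothetical model step: returns (new hidden state, scores over vocab)."""
    h = np.tanh(h + 0.1 * char_idx)       # placeholder dynamics, not a real RNN
    return h, rng.normal(size=len(vocab))

def sample(seed_char, length=20, temperature=0.7):
    h = np.zeros(8)
    idx = vocab.index(seed_char)
    out = [seed_char]
    for _ in range(length):
        h, logits = next_char_logits(h, idx)
        p = np.exp(logits / temperature)
        p /= p.sum()                      # softmax over the vocabulary
        idx = rng.choice(len(vocab), p=p) # draw one character...
        out.append(vocab[idx])            # ...and feed it back in next step
    return "".join(out)

print(sample("h"))
```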
Application tutorials
- Using terminal.com
- Sampling text with torch-rnn
- Style transfer with neural-style
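For orientation on what neural-style is optimizing, a conceptual sketch (random stand-in "activations", hypothetical shapes and weights) of the two losses from Gatys et al.: a content loss that compares convnet feature activations directly, and a style loss that compares Gram matrices (feature correlations) of those activations.
```python
import numpy as np

rng = np.random.default_rng(3)

def gram(features):
    """Gram matrix of a (channels, height*width) activation map."""
    c, hw = features.shape
    return features @ features.T / (c * hw)

# stand-ins for convnet activations of the content, style, and generated images
content_feat = rng.normal(size=(64, 32 * 32))
style_feat = rng.normal(size=(64, 32 * 32))
generated_feat = rng.normal(size=(64, 32 * 32))

content_loss = np.mean((generated_feat - content_feat) ** 2)
style_loss = np.mean((gram(generated_feat) - gram(style_feat)) ** 2)

# the generated image is optimized to minimize a weighted sum of the two
total_loss = content_loss + 1e3 * style_loss  # hypothetical style weight
print(content_loss, style_loss, total_loss)
```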