RNNs are neural networks that handle sequence data by incorporating a time component: they learn from past sequences to predict future states in new ones. The document covers the RNN architecture, in which a hidden layer receives both the current input and the previous hidden state, and backpropagation through time (BPTT) for training RNNs on sequence data. Examples are provided that implement an RNN from scratch using TensorFlow and Keras to predict a noisy sine-wave time series.
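The following is a minimal sketch of that kind of setup, not the document's actual code: it generates a noisy sine wave, slices it into sliding windows, and fits a small Keras SimpleRNN to predict the next value. Names such as `window_size` and the layer sizes are illustrative assumptions.

```python
import numpy as np
from tensorflow import keras

# Noisy sine-wave time series (assumed synthetic data, not the document's exact series).
t = np.linspace(0, 50, 1000)
series = np.sin(t) + 0.1 * np.random.randn(len(t))

# Slide a window over the series: each window of past values predicts the next value.
window_size = 20  # assumed length of the input window
X = np.array([series[i:i + window_size] for i in range(len(series) - window_size)])
y = series[window_size:]
X = X[..., np.newaxis]  # shape: (samples, time steps, features=1)

# One recurrent layer whose hidden state is carried across time steps,
# followed by a dense layer mapping the final hidden state to a prediction.
model = keras.Sequential([
    keras.layers.Input(shape=(window_size, 1)),
    keras.layers.SimpleRNN(32),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=32, verbose=0)

# Predict the value that follows the most recent observed window.
next_value = model.predict(X[-1:], verbose=0)
```

The 3-D input shape (samples, time steps, features) is what recurrent layers in Keras expect; here each sample is one window of the series with a single feature per time step.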