.. _rnnrbm:

Modeling and generating sequences of polyphonic music with the RNN-RBM
======================================================================

.. note::
  This tutorial demonstrates a basic implementation of the RNN-RBM as
  described in [BoulangerLewandowski12]_ (`pdf `_). We assume the reader is
  familiar with `recurrent neural networks using the scan op `_ and
  `restricted Boltzmann machines (RBM) `_.

.. note::
  The code for this section is available for download here: `rnnrbm.py `_.

  You will need the modified `Python MIDI package (GPL license) `_ in your
  ``$PYTHONPATH`` or in the working directory in order to convert MIDI files
  to and from piano-rolls. The script also assumes that the content of the
  `Nottingham Database of folk tunes `_ has been extracted in the
  ``../data`` directory. Alternative MIDI datasets are available `here `_.

  Note that both dependencies above can be set up automatically by running
  the ``download.sh`` script in the ``../data`` directory.

.. caution::
  This tutorial requires Theano 0.6 or more recent.


The RNN-RBM
+++++++++++

The RNN-RBM is an energy-based model for density estimation of temporal
sequences, where the feature vector :math:`v^{(t)}` at time step :math:`t`
may be high-dimensional. It can describe multimodal conditional
distributions of :math:`v^{(t)}|\mathcal A^{(t)}`, where
:math:`\mathcal A^{(t)}\equiv \{v_\tau|\tau<t\}` denotes the sequence
history at time :math:`t`.
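The time-indexed notation above can be illustrated with a toy piano-roll in
NumPy. This is a minimal sketch, not code from the tutorial: the array sizes
and the random data are purely illustrative, and it only shows how the
observation :math:`v^{(t)}` and the history :math:`\mathcal A^{(t)}` relate
to rows of the piano-roll matrix.

.. code-block:: python

  import numpy as np

  # A piano-roll is a binary matrix: row t is the feature vector v^(t),
  # indicating which notes sound at time step t.
  n_steps, n_notes = 8, 5  # illustrative sizes only
  rng = np.random.default_rng(0)
  piano_roll = (rng.random((n_steps, n_notes)) < 0.3).astype(np.int8)

  t = 4
  v_t = piano_roll[t]       # the observation v^(t), shape (n_notes,)
  history = piano_roll[:t]  # A^(t) = {v_tau | tau < t}, shape (t, n_notes)

  print(v_t.shape, history.shape)

The model's task at each step is to assign a conditional density to ``v_t``
given ``history``, which may be multimodal (several chords can plausibly
follow the same context).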