Seminar: Generating Sequences with Recurrent Neural Networks

Speaker: Alex Graves
Affiliation: Google DeepMind
Date: Friday, 25 Apr 2014
Time: 13:00 - 14:00
Location: Malet Place Engineering Building 1.02
Event series: DeepMind CSML Seminar Series

Generating sequential data is the closest computers get to dreaming. Digital dreams are likely to play a crucial role in the future of AI, by helping agents to simulate, predict and interpret their surroundings. This talk shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with large-scale structure, simply by predicting one step at a time. The method is demonstrated for character-level language modelling (where the data are discrete) and speech and handwriting generation (where the data are real-valued). A novel extension allows the network to condition its predictions on an auxiliary input sequence, making it possible to speak or write specific texts.
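The one-step-at-a-time generation loop described in the abstract can be sketched in miniature. The snippet below is purely illustrative: a toy bigram table stands in for the trained LSTM's predictive distribution (the names `BIGRAMS`, `sample_next`, and `generate` are invented for this sketch, not from the talk), but the sampling loop itself — predict a distribution over the next symbol, sample from it, feed the sample back in — is the same mechanism used in character-level generation.

```python
import random

# Hypothetical stand-in for the trained network: a bigram table mapping
# each character to a probability distribution over the next character.
# A real LSTM would compute this distribution from its hidden state.
BIGRAMS = {
    "h": {"e": 1.0},
    "e": {"l": 1.0},
    "l": {"l": 0.5, "o": 0.5},
    "o": {" ": 1.0},
    " ": {"h": 1.0},
}

def sample_next(ch, rng):
    """Sample the next character from the model's predicted distribution."""
    chars, probs = zip(*BIGRAMS[ch].items())
    return rng.choices(chars, weights=probs, k=1)[0]

def generate(seed, length, rng=None):
    """Generate a sequence one step at a time, feeding each sample back in."""
    rng = rng or random.Random(0)
    out = seed
    for _ in range(length):
        out += sample_next(out[-1], rng)
    return out
```

Conditioning on an auxiliary input sequence, as in the extension mentioned above, would replace the fixed table with a distribution that also depends on the text to be spoken or written.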
