246 Hesburgh Library, Navari Family Center for Digital Scholarship
Learn how to apply recurrent neural networks to text-based applications with PyTorch.
Those who study words as a stream of rapid sound, as objects that fulfill roles in a larger discourse structure, or as vessels for abstract ideas know that, in all these cases, language is far from simple. Yet neural networks manage to capture elements of each, sometimes matching or even surpassing human performance on language-based tasks.
Recurrent neural networks have played a critical role in this process; up until recently, they were the de facto standard in the field of natural language processing. This workshop aims to provide participants with an in-depth exploration of recurrent neural networks.
On the one hand, it provides a full, hands-on introduction to the PyTorch code that defines and runs a recurrent neural network, covering everything from loading data to evaluating a trained model. On the other hand, it pairs the abstract architecture and its programmatic form with linguistic examples to build stronger intuition about what these networks can do.
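To give a sense of what that hands-on portion looks like, here is a minimal sketch, not the workshop's actual code, of a recurrent network that maps batches of token-ID sequences to class scores in PyTorch. The class name `TextRNN` and all sizes (vocabulary, embedding, hidden, and class dimensions) are placeholder assumptions for illustration.

```python
import torch
import torch.nn as nn

class TextRNN(nn.Module):
    """Illustrative RNN text classifier: embed tokens, run an RNN, classify from the final hidden state."""
    def __init__(self, vocab_size=10_000, embed_dim=128, hidden_dim=256, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor of vocabulary indices
        embedded = self.embedding(token_ids)       # (batch, seq_len, embed_dim)
        _, hidden = self.rnn(embedded)             # hidden: (1, batch, hidden_dim)
        return self.classifier(hidden.squeeze(0))  # (batch, num_classes)

# Example: score a batch of two 12-token sequences of random token IDs
model = TextRNN()
dummy_batch = torch.randint(0, 10_000, (2, 12))
print(model(dummy_batch).shape)  # torch.Size([2, 2])
```

In a full pipeline like the one covered in the workshop, the random token IDs above would be replaced by real text converted to indices, and the model would be trained with a loss function and optimizer before being evaluated on held-out data.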
After completing this workshop, participants will:
Prerequisites:
This session will be presented by one of the NFCDS Pedagogy Fellows. Visit this page for more information about this fellowship program.