Mathematics of Deep Learning
DEADLINE: term papers (and code, if any) are due on 20/09/2019. Reports must be no longer than 10 pages, 11pt font. There is no minimum length.
Two kinds of term papers are possible.
In both cases, the chosen topic must be discussed in advance with, and validated by, Dr. Venant.
These are only suggestions; you are more than encouraged to propose your own.
Reproduce the experiments of Nangia & Bowman (2018): “Listops: A Diagnostic Dataset for Latent Tree Learning”, using their dataset, and/or try different algebraic operations, possibly using their code (Experimental; a data-generation sketch is given after this list).
Try some of the LSTM simplifications tested in Levy et al. (2018): “Long Short-Term Memory as a Dynamically Computed Element-wise Weighted Sum” on a task of your choice using Tree-LSTMs (Experimental; a simplified-cell sketch is given after this list).
Build a toy neural recognizer or parser and train it on artificial data generated from context-free or non-context-free grammars (Experimental; a grammar-based data sketch is given after this list).
Report on Shi et al. (2018): “On Tree-Based Neural Sentence Modeling”, in the light of the seminar’s discussions (Report).
Report on Tran et al. (2018) “The Importance of Being Recurrent for Modeling Hierarchical Structure”, in the light of the seminar’s discussions (Report).
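
For the ListOps topic, varying the algebraic operations requires a generator for nested prefix expressions. Below is a minimal Python sketch; the operator set, value range, and bracketed prefix format are assumptions modelled on the paper's description and should be checked against Nangia & Bowman's released dataset and code.

    import random

    # Candidate operations in the style of ListOps (assumed set; the official
    # dataset's operators and formatting may differ).
    OPS = {
        "MIN": min,
        "MAX": max,
        "MED": lambda xs: sorted(xs)[len(xs) // 2],
        "SM":  lambda xs: sum(xs) % 10,   # sum modulo 10
    }

    def gen_expr(depth):
        """Recursively generate a nested prefix expression and its value."""
        if depth == 0 or random.random() < 0.3:
            v = random.randint(0, 9)
            return str(v), v
        op = random.choice(list(OPS))
        parts, vals = [], []
        for _ in range(random.randint(2, 5)):
            s, v = gen_expr(depth - 1)
            parts.append(s)
            vals.append(v)
        return "[%s %s ]" % (op, " ".join(parts)), OPS[op](vals)

    if __name__ == "__main__":
        random.seed(0)
        for _ in range(3):
            expr, val = gen_expr(depth=3)
            print(val, "\t", expr)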
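For the Levy et al. topic, one of the simplifications discussed in that line of work removes the non-linear recurrence from the content layer, so the cell state becomes an element-wise weighted sum of input projections. The sketch below (PyTorch, sequential case) is an assumed reading of such a variant, not the authors' exact formulation; the analogous change would be made inside a Tree-LSTM cell.

    import torch
    import torch.nn as nn

    class SimplifiedLSTMCell(nn.Module):
        """Assumed simplified variant: content is a linear map of the current
        input only, with no output gate or output non-linearity, so the cell
        state is a dynamically weighted element-wise running sum."""

        def __init__(self, input_size, hidden_size):
            super().__init__()
            # Input and forget gates still see the input and previous hidden state.
            self.gates = nn.Linear(input_size + hidden_size, 2 * hidden_size)
            # Content layer depends on the current input only.
            self.content = nn.Linear(input_size, hidden_size)

        def forward(self, x, state):
            h, c = state
            i, f = self.gates(torch.cat([x, h], dim=-1)).chunk(2, dim=-1)
            i, f = torch.sigmoid(i), torch.sigmoid(f)
            c = f * c + i * self.content(x)   # element-wise weighted sum
            h = c                             # no output gate, no tanh
            return h, (h, c)

    if __name__ == "__main__":
        cell = SimplifiedLSTMCell(8, 16)
        x = torch.randn(4, 8)
        state = (torch.zeros(4, 16), torch.zeros(4, 16))
        h, state = cell(x, state)
        print(h.shape)   # torch.Size([4, 16])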
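For the toy recognizer topic, the data-generation side can be as simple as the sketch below, which labels strings as members or non-members of a target language; a^n b^n (context-free) and a^n b^n c^n (not context-free) are assumed example languages, and the recognizer itself is left to you.

    import random

    def anbn(n):
        return "a" * n + "b" * n              # context-free language a^n b^n

    def anbncn(n):
        return "a" * n + "b" * n + "c" * n    # non-context-free a^n b^n c^n

    def make_dataset(positive_fn, alphabet, size=1000, max_n=10):
        """Mix positive examples with random strings over the same alphabet,
        labelling each string 1 if it belongs to the language, else 0."""
        data = []
        for _ in range(size):
            if random.random() < 0.5:
                data.append((positive_fn(random.randint(1, max_n)), 1))
            else:
                length = random.randint(2, 3 * max_n)
                s = "".join(random.choice(alphabet) for _ in range(length))
                label = int(any(s == positive_fn(n) for n in range(1, max_n + 1)))
                data.append((s, label))
        return data

    if __name__ == "__main__":
        random.seed(0)
        print(make_dataset(anbn, "ab", size=5))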