Location: Helsinki, Finland
Remote: Yes
Willing to relocate: Temporarily. Immigration would be easier to the US (TN), Canada, or Australia. Not an EU national (yet).
Technologies: Deep learning, though I'm looking for something very specific; see below.
Résumé/CV: https://sspilsbury.com/hire
Email: s@polysquare.org
I'm a second-year Machine Learning PhD student looking for an internship working on something related to my thesis. The thesis is on compositional generalization problems in deep learning, currently with a particular focus on NLP and interactive environments. Aside from that, I have a broad background from a previous life doing research/graphics/UI/web/backend/ETL work at various other places.
Some open research problems which might be relevant to what you are working on:
- Much improved sample efficiency for language-interactive robots or agents (so that you don't need to train on every single combination of task aspects; see the small sketch after this list)
- Compositional path synthesis
- Compositional particle or protein synthesis (e.g., composing two separate subgraphs into a single graph by following learned rules)
- Understanding why transformers can and can't do different kinds of compositional generalization
- Improved sample efficiency and generalization across many different language tasks and domains (for example, grammatical error correction, machine translation, semantic parsing, entity and relationship extraction).
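For concreteness, here is a tiny illustrative sketch (toy vocabulary and task, not taken from my papers) of the kind of compositional split I mean in the first point above: the model sees every color and every object during training, but some specific color-object combinations only appear at test time.

    # Toy compositional split: hold out specific attribute/object combinations
    # from training and require generalization to them at test time.
    import itertools
    import random

    colors = ["red", "blue", "green", "yellow"]
    shapes = ["ball", "box", "key", "door"]

    all_pairs = list(itertools.product(colors, shapes))

    # Combinations the agent never sees during training, even though it has
    # seen each color and each shape separately in other contexts.
    held_out = {("red", "key"), ("blue", "door"), ("green", "ball")}

    train_instructions = [f"go to the {c} {s}" for c, s in all_pairs if (c, s) not in held_out]
    test_instructions = [f"go to the {c} {s}" for c, s in held_out]

    print(len(train_instructions), "training instructions, e.g.", random.choice(train_instructions))
    print(len(test_instructions), "held-out test instructions, e.g.", random.choice(test_instructions))

The research question is how to get good test performance on the held-out combinations without simply collecting more data that covers them.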
One hard requirement is that the research results of the internship must be publishable so that they can be included in my thesis.
Some first-author papers of mine:
- Compositional Generalization in Grounded Language Learning via Induced Model Sparsity (https://aclanthology.org/2022.naacl-srw.19/)
- Meta-learning from demonstrations improves Compositional Generalization (https://openreview.net/forum?id=hb3Et9tJSC9)