Dr. Nicholas Guttenberg, Cross Labs, Cross Compass
Our very own Nicholas Guttenberg presented "Metalearning, Communication, and Expressability" at Cross Roads #9.
Meta-learning replaces the goal of learning to do a single task with the goal of learning to quickly learn a task when presented with it. In some methods, this is achieved by learning a representation of tasks, so that the problem of learning what to do is replaced by the problem of inferring the proper representation (which then indexes the corresponding behavior). This is similar to the way a teacher can use language to communicate their knowledge of a particular skill to a student, and so it seems possible that there is a deep connection between the emergence of language and the ability to learn quickly. However, the space of learned task representations does not necessarily make it possible to express new tasks that are distant from the training distribution, a limitation exacerbated by any attempt to optimize communication bandwidth. I'd like to discuss making models and representations expressive as a goal in itself (separate from the usual optimization of performance or accuracy on a task), as well as possible hints of how this gap might be bridged in the context of meta-learning and AI-invented communication protocols.
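To make the "inferring the proper representation" framing concrete, here is a minimal toy sketch (not from the talk, and deliberately simplified): each task is a linear function, the task representation is its pair of coefficients, and "adapting" to a new task means inferring those coefficients from a few support examples rather than learning from scratch. All names here are illustrative assumptions.

```python
import random

random.seed(0)

def sample_task():
    """A 'task' drawn from the training distribution: y = a*x + b."""
    return random.uniform(-2, 2), random.uniform(-2, 2)

def infer_representation(xs, ys):
    """'Learning what to do' becomes inferring the task representation
    (a, b), here by closed-form least squares over the support set."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def predict(rep, x):
    """The representation indexes the behavior: given (a, b), act."""
    a, b = rep
    return a * x + b

# Meta-test on a new task: adapt from 5 support examples, then predict.
a_true, b_true = sample_task()
xs = [random.uniform(-1, 1) for _ in range(5)]
ys = [a_true * x + b_true for x in xs]
rep = infer_representation(xs, ys)
print(abs(predict(rep, 0.5) - (a_true * 0.5 + b_true)) < 1e-9)  # True
```

The expressibility problem the abstract raises shows up even here: a representation space of linear coefficients can never express, say, a quadratic task, no matter how many support examples are given.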
Thanks to Nicholas for an engaging presentation, to our host Professor Nathanael Aubert-Kato, to everyone who came out for a lively discussion, to Cross Compass for sponsoring the event, and to Ochanomizu University for graciously hosting us.