Continuous Meta-Learning without Tasks
NeurIPS 2020
Last updated 10-9-2020
Meta-learning is a promising strategy for learning to efficiently learn within new tasks, using data gathered from a distribution of tasks.
However, the meta-learning literature has thus far focused on the task-segmented setting, where at train time, offline data is assumed to be split according to the underlying task, and at test time, the algorithms are optimized to learn within a single task.
In this work, we enable the application of generic meta-learning algorithms to settings where this task segmentation is unavailable, such as continual online learning with a time-varying task.
We present meta-learning via online changepoint analysis (MOCA), an approach that augments a meta-learning algorithm with a differentiable Bayesian changepoint detection scheme.
We assume access to a representative time series generated in the same manner, from the same distribution of tasks, and use this time series for optimization in an offline meta-training phase.
Critically, however, in stark contrast to standard meta-learning approaches, we do not assume access to task segmentation.
Moreover, we consider the case in which individual data points are provided sequentially, in contrast to the common “k-shot, n-way” problem setting prevalent in few-shot learning (especially classification).
We build on Bayesian online changepoint detection (BOCPD; Adams & MacKay, 2007), an approach for detecting changepoints (i.e. task switches) originally presented in a streaming unconditional density estimation context.
BOCPD operates by maintaining a belief distribution over run lengths, i.e. how many of the past data points correspond to the current task.
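To make this recursion concrete, the following is a minimal NumPy sketch of a single run-length belief update under a constant hazard rate (the per-step probability of a task switch); the function and argument names are illustrative, not taken from the paper.

```python
import numpy as np

def bocpd_update(run_length_belief, pred_probs, hazard=0.05):
    """One BOCPD run-length belief update (a minimal sketch).

    run_length_belief: array of length T, belief over run lengths 0..T-1
        after the previous observation.
    pred_probs: array of length T, predictive probability of the newest
        data point under each run-length hypothesis (i.e. conditioning
        the model on only the last r data points).
    hazard: assumed constant per-step probability of a task switch.
    """
    # Growth: no changepoint occurred, so each run length advances by one.
    growth = run_length_belief * pred_probs * (1.0 - hazard)
    # Changepoint: mass from every run length collapses to run length 0.
    changepoint = np.sum(run_length_belief * pred_probs * hazard)
    # New (unnormalized) belief over run lengths 0..T, then normalize.
    new_belief = np.concatenate(([changepoint], growth))
    return new_belief / np.sum(new_belief)
```

Repeating this update at every time step yields a filtering distribution over how long the current task has been active, which in turn determines how much recent data the model should condition on.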
In this work, we extend the approach of Adams & MacKay (2007) beyond Bayesian unconditional density estimation to general meta-learning models operating in the conditional density estimation setting.
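As an illustration of the conditional density models this enables, the sketch below implements an ALPaCA-style Bayesian linear regression head over learned features, maintaining a recursively updated weight posterior and a closed-form Gaussian posterior predictive p(y | x, data); the class name, the fixed noise variance, and the isotropic prior are simplifying assumptions for this sketch, not the paper's exact model.

```python
import numpy as np

class BayesLinearHead:
    """Bayesian last layer over learned features phi(x): a hedged sketch
    of a per-run-length conditional density model in the style of ALPaCA."""

    def __init__(self, feature_fn, dim, noise_var=0.1):
        self.phi = feature_fn        # learned feature map, e.g. a network
        self.Lam = np.eye(dim)       # weight-posterior precision (prior: I)
        self.q = np.zeros(dim)       # precision-weighted posterior mean
        self.noise_var = noise_var   # assumed known observation noise

    def predictive_likelihood(self, x, y):
        """Gaussian posterior predictive density p(y | x, data so far)."""
        f = self.phi(x)
        Sigma = np.linalg.inv(self.Lam)
        mean = f @ Sigma @ self.q
        var = self.noise_var + f @ Sigma @ f
        return np.exp(-0.5 * (y - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

    def update(self, x, y):
        """Recursive Bayesian posterior update after observing (x, y)."""
        f = self.phi(x)
        self.Lam = self.Lam + np.outer(f, f) / self.noise_var
        self.q = self.q + f * y / self.noise_var

# Hypothetical usage with a fixed polynomial feature map:
# head = BayesLinearHead(lambda x: np.array([1.0, x, x**2]), dim=3)
# head.update(0.5, 1.2)
# p = head.predictive_likelihood(0.6, 1.1)
```

In this scheme, one such posterior is maintained per run-length hypothesis, and its predictive likelihoods supply the `pred_probs` term in the BOCPD update sketched above; because both the posterior update and the predictive are differentiable, the feature map can be meta-trained end to end.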
Further details can be found at: http://gregorygundersen.com/blog/2019/08/13/bocd/
Latest version of the paper: https://arxiv.org/pdf/1912.08866.pdf