Learning-to-learn to guide random search: Derivative-free meta blackbox optimization on manifold
Solving a sequence of high-dimensional, nonconvex, yet similar optimization problems is common in engineering. We propose a meta-learning framework that exploits the shared structure across tasks to improve the computational efficiency and sample complexity of derivative-free optimization. Assuming that practical high-dimensional objectives lie on a shared low-dimensional manifold, we jointly learn a meta-initialization and a meta-manifold. We establish the theoretical benefits of our framework and demonstrate its effectiveness on two high-dimensional reinforcement-learning tasks.
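To make the idea concrete, the following is a minimal sketch of derivative-free random search warmed-started from a meta-initialization and restricted to a learned low-dimensional structure. All names here (`subspace_random_search`, the toy quadratic task) are illustrative, and a fixed linear subspace `U` stands in for the learned meta-manifold; the paper's actual method jointly learns both the initialization and the manifold across tasks.

```python
import numpy as np

def subspace_random_search(f, x0, U, sigma=0.1, lr=0.05, iters=300, samples=8, seed=0):
    """Zeroth-order random search on f: R^D -> R, restricted to span(U).

    f  : objective to minimize, queried only through function evaluations
    x0 : meta-initialization in R^D (shared warm start across tasks)
    U  : D x d matrix whose orthonormal columns stand in for the meta-manifold
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(iters):
        # Sample perturbations in the d-dim latent space, lift them to R^D.
        z = rng.standard_normal((samples, U.shape[1]))
        deltas = z @ U.T
        # Antithetic finite-difference estimate of a descent direction.
        g = np.zeros_like(x)
        for delta in deltas:
            g += (f(x + sigma * delta) - f(x - sigma * delta)) * delta
        g /= 2.0 * sigma * samples
        x -= lr * g
    return x

# Toy task: a D=50 quadratic whose minimizer lies in a d=3 subspace,
# mimicking the assumed shared low-dimensional structure.
rng = np.random.default_rng(0)
D, d = 50, 3
U, _ = np.linalg.qr(rng.standard_normal((D, d)))  # stand-in "meta-manifold"
target = U @ rng.standard_normal(d)
f = lambda x: float(np.sum((x - target) ** 2))
x_star = subspace_random_search(f, x0=np.zeros(D), U=U)
```

Because all perturbations live in a d-dimensional latent space, each update needs only O(d) effective exploration directions rather than O(D), which is the source of the sample-complexity gain when the manifold assumption holds.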