
Hyperparameter tuner HEBO, compositional optimizer CompBO: Huawei Noah open-sources a Bayesian optimization library

Report from Heart of the Machine

Editors: Chen Ping, Du Wei

Huawei Noah's Ark Lab has open-sourced a Bayesian optimization library with three components: HEBO, T-LBO, and CompBO.

Bayesian optimization is a black-box optimization algorithm for finding the extrema of a function whose analytical expression is unknown. Thanks to its strong sample efficiency, it has been widely adopted in recent years: researchers need only a small number of iterations to reach a good result, which makes it well suited to tuning the hyperparameters of machine learning models.
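The loop that such an optimizer runs can be illustrated with a minimal, self-contained sketch. Everything below is hypothetical toy code, not the library's API: the probabilistic surrogate and acquisition function of real Bayesian optimization are replaced by a naive propose-near-the-incumbent heuristic, purely to show the suggest-and-observe workflow that libraries like HEBO are built around.

```python
import random

random.seed(0)

def black_box(x):
    # Stand-in for an expensive function whose expression is unknown
    # (e.g. validation loss as a function of a hyperparameter).
    return (x - 0.7) ** 2

class SimpleOptimizer:
    """Toy ask/tell optimizer: propose points near the best observation
    with a shrinking radius, plus occasional uniform exploration. Real
    Bayesian optimization replaces this heuristic with a probabilistic
    surrogate model and an acquisition function."""

    def __init__(self, lb, ub):
        self.lb, self.ub = lb, ub
        self.history = []  # list of (x, y) observations

    def suggest(self):
        if not self.history or random.random() < 0.2:
            return random.uniform(self.lb, self.ub)      # explore
        best_x, _ = min(self.history, key=lambda p: p[1])
        radius = (self.ub - self.lb) / (1 + len(self.history))
        x = best_x + random.uniform(-radius, radius)
        return min(max(x, self.lb), self.ub)             # exploit near incumbent

    def observe(self, x, y):
        self.history.append((x, y))

opt = SimpleOptimizer(lb=-3.0, ub=3.0)
for _ in range(40):
    x = opt.suggest()
    opt.observe(x, black_box(x))

best_x, best_y = min(opt.history, key=lambda p: p[1])
print(f"best x={best_x:.3f}, f(x)={best_y:.4f}")
```

The key point is the interface, not the heuristic: the optimizer repeatedly suggests a candidate, the user evaluates the black box and reports the result back, and only a few dozen evaluations are spent.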

Recently, Huawei Noah's Ark Lab open-sourced a new Bayesian optimization library that covers both low-dimensional and high-dimensional settings. It mainly includes:

Heteroscedastic Evolutionary Bayesian Optimization (HEBO): can be used for hyperparameter tuning; Huawei Noah won the NeurIPS 2020 black-box optimization (BBO) competition with it;

T-LBO: an algorithm that combines deep metric learning with latent-space Bayesian optimization, achieving high-dimensional optimization while reducing labeled-data requirements by 97%;

CompBO: Bayesian optimization using compositional optimizers.

Project Address: https://github.com/huawei-noah/HEBO

HEBO


HEBO is a Bayesian optimization library developed by the Decision Making and Reasoning (DMnR) team at Huawei Noah's Ark Lab. The algorithm beat entries from NVIDIA, IBM, JetBrains, and others, winning the black-box optimization challenge at NeurIPS 2020 with a score of 93.519.

Among the top five entries, HEBO was the algorithm that differed most from its competitors, and it won by a clear margin.

Full leaderboard: https://bbochallenge.com/leaderboard/

T-LBO algorithm

The algorithm comes from the 42-page paper "High-Dimensional Bayesian Optimisation with Variational Autoencoders and Deep Metric Learning" from Huawei Noah's Ark Lab.


Address: https://arxiv.org/pdf/2106.03609.pdf

The researchers propose a method based on deep metric learning for performing Bayesian optimization over high-dimensional, structured input spaces using variational autoencoders (VAEs). By extending ideas from supervised deep metric learning, they address a long-standing problem in high-dimensional VAE-based Bayesian optimization: how to enforce a discriminative latent space as an inductive bias. Importantly, the researchers achieve this inductive bias using only 1% of the labeled data required by previous work, demonstrating the sample efficiency of the proposed method.

In experiments, the researchers demonstrate state-of-the-art results on real-world high-dimensional black-box optimization problems, including property-guided molecule generation. They hope these results can serve as a guide for achieving efficient high-dimensional Bayesian optimization.
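The core idea of latent-space optimization can be made concrete with a deliberately simplified sketch. Here a fixed linear "decoder" stands in for T-LBO's trained VAE decoder, and plain random search stands in for Bayesian optimization; the `decode` function and the toy objective are illustrative assumptions, not the paper's implementation. The point is that searching a low-dimensional latent space that maps onto good regions of the high-dimensional space is far easier than searching the high-dimensional space directly.

```python
import random

random.seed(1)
DIM = 8

# High-dimensional black-box objective: squared distance to a hidden target.
target = [0.5] * DIM

def objective(x):
    return sum((xi - ti) ** 2 for xi, ti in zip(x, target))

# A fixed "decoder" standing in for a trained VAE decoder: it maps a 1-D
# latent code z onto a curve through the 8-D space that passes near the
# target. T-LBO *learns* such a decoder from data, with deep metric
# learning shaping the latent space.
def decode(z):
    return [z] * DIM

def random_search(sample, budget):
    best = None
    for _ in range(budget):
        x = sample()
        y = objective(x)
        if best is None or y < best[1]:
            best = (x, y)
    return best

# Same evaluation budget: search the 8-D space directly vs. via the latent space.
direct = random_search(lambda: [random.uniform(-2, 2) for _ in range(DIM)], 50)
latent = random_search(lambda: decode(random.uniform(-2, 2)), 50)
print(f"direct search best: {direct[1]:.3f}")
print(f"latent search best: {latent[1]:.3f}")
```

With the same budget of 50 evaluations, the latent search reliably finds much better points, which is the effect a well-shaped VAE latent space buys in the real method.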


Bayesian optimization with compositional optimizers (CompBO)

This work comes from a 78-page paper published in the Journal of Machine Learning Research (JMLR) in 2021, titled "Are We Forgetting about Compositional Optimisers in Bayesian Optimisation?". The researchers are from Huawei's UK R&D center.


Address of the paper: https://www.jmlr.org/papers/volume22/20-1422/20-1422.pdf

Project address: https://github.com/huawei-noah/noah-research/tree/CompBO/BO/HEBO/CompBO

Bayesian optimization provides a sample-efficient approach to global optimization. Within this framework, maximizing the acquisition function is a key determinant of performance. However, acquisition functions tend to be non-convex, which makes their maximization far from straightforward.

The Huawei paper conducts a comprehensive empirical study of methods for maximizing the acquisition function. In addition, by deriving novel but mathematically equivalent compositional forms of popular acquisition functions, the researchers recast acquisition-function maximization as a compositional optimization problem, allowing it to benefit from the extensive literature in that field. In particular, they highlight the empirical advantages of the compositional approach to acquisition-function maximization across 3,958 individual experiments, including combinatorial optimization tasks and Bayesian optimization tasks.
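The compositional reformulation can be illustrated on a toy problem. A compositional objective has the nested form F(x) = f(E[g(x, ξ)]), which is also the shape of Monte-Carlo acquisition functions. The sketch below applies an SCGD-style (stochastic compositional gradient descent) two-timescale update, one family of compositional optimizers, to a synthetic objective; it is a conceptual sketch under stated assumptions, not code from the paper or the library.

```python
import random

random.seed(2)

# Toy compositional objective F(x) = f(E[g(x, xi)]) with
#   g(x, xi) = x + xi,  xi ~ N(0, 1),  f(u) = u**2,
# so F(x) = x**2 and the minimizer is x = 0. Only noisy samples of g
# are available, mirroring Monte-Carlo acquisition functions.
def g(x):
    return x + random.gauss(0.0, 1.0)

# SCGD-style two-timescale update: a fast-moving tracker y estimates the
# inner expectation E[g(x, xi)], while x follows the chain-rule gradient
# f'(y) * dg/dx evaluated at the tracked value instead of a fresh
# (and biased) single-sample estimate of the inner expectation.
x, y = 3.0, 0.0
alpha, beta = 0.05, 0.1            # outer (x) and inner (y) step sizes
for t in range(2000):
    y = (1 - beta) * y + beta * g(x)   # track the inner expectation
    x = x - alpha * (2 * y) * 1.0      # f'(y) = 2y, dg/dx = 1
print(f"x after SCGD: {x:.3f}")
```

The two-timescale structure is the design point: plugging a single noisy sample of g directly into f' would bias the gradient, whereas the running estimate y keeps the update consistent with the true compositional gradient.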

Given how widely acquisition-function maximization is used, the researchers believe compositional optimizers have the potential to deliver performance gains across all areas in which Bayesian optimization is currently applied.

