I2I Translation


Updated 2022-08-09

FlexiBO: A Decoupled Cost-Aware Multi-Objective Optimization Approach for Deep Neural Networks

Authors: Md Shahriar Iqbal, Jianhai Su, Lars Kotthoff, Pooyan Jamshidi

The design of machine learning systems often requires trading off different objectives, for example, prediction error and energy consumption for deep neural networks (DNNs). Typically, no single design performs well on all objectives; therefore, finding Pareto-optimal designs is of interest. Measuring different objectives often incurs different costs; for example, measuring the prediction error of a DNN is orders of magnitude more expensive than measuring the energy consumption of a pre-trained DNN, because it requires re-training the DNN. Current state-of-the-art methods do not take this difference in evaluation cost into account, potentially wasting expensive objective-function evaluations for little information gain. In this paper, we develop a novel decoupled cost-aware approach, called Flexible Multi-Objective Bayesian Optimization (FlexiBO), to address this issue. FlexiBO weights the improvement in the hypervolume of the Pareto region by the measurement cost of each objective. This balances the expense of collecting new information against the knowledge gained through objective evaluations, preventing expensive measurements that yield little to no gain. We evaluate FlexiBO on seven state-of-the-art DNNs for image recognition, natural language processing (NLP), and speech-to-text translation. Our results show that, for the same total experimental budget, FlexiBO discovers designs with 4.8% to 12.4% lower hypervolume error than the next-best state-of-the-art multi-objective optimization method, depending on the DNN architecture.
PDF

Click here to view paper screenshots
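
The core selection rule described in the abstract, weighting the predicted hypervolume gain by each objective's measurement cost, can be illustrated with a small sketch. The following is a minimal, hypothetical Python illustration, not the authors' implementation: all names (`select_next`, `predict`, `costs`), the optimistic-bound proxy for per-objective gain, and the simple 2-D hypervolume routine are assumptions made for the example.

```python
# A minimal, hypothetical sketch of FlexiBO-style decoupled, cost-aware
# selection (two objectives, both minimized). All names and the 2-D
# hypervolume proxy are illustrative assumptions, not the paper's code.
import numpy as np

def hypervolume_2d(points, ref):
    """Hypervolume dominated by a 2-D point set w.r.t. a reference point
    (both objectives minimized)."""
    hv, prev_y = 0.0, ref[1]
    for x, y in sorted(points):
        if y < prev_y:
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv

def select_next(candidates, front, predict, costs, ref):
    """Pick the (design, objective) pair maximizing predicted hypervolume
    gain per unit measurement cost. This is the decoupled step: only one
    objective of one design is measured per iteration."""
    base = hypervolume_2d(front, ref)
    best, best_score = None, -np.inf
    for x in candidates:
        mu, sigma = predict(x)  # per-objective posterior mean and std
        for j, cost in enumerate(costs):
            # Optimistic outcome if objective j were measured for x:
            # collapse its uncertainty, keep the mean for the other one.
            opt = list(mu)
            opt[j] = mu[j] - sigma[j]
            gain = max(0.0, hypervolume_2d(front + [tuple(opt)], ref) - base)
            if gain / cost > best_score:
                best_score, best = gain / cost, (x, j)
    return best

# Toy usage: two designs, two objectives (error, energy), unequal costs.
if __name__ == "__main__":
    front = [(0.3, 0.5), (0.5, 0.2)]           # current Pareto front
    preds = {"A": ([0.25, 0.4], [0.05, 0.1]),  # fake surrogate posteriors
             "B": ([0.4, 0.15], [0.02, 0.02])}
    costs = [100.0, 1.0]  # e.g. error needs re-training, energy is cheap
    print(select_next(["A", "B"], front, lambda x: preds[x], costs, (1.0, 1.0)))
```

Note how dividing the gain by the per-objective cost steers the toy example toward the cheap energy measurement unless the expensive error measurement promises a much larger hypervolume improvement, which is the trade-off the abstract describes.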
