[Source Code] Fast Optimization Methods for L1 Regularization: A Comparative Study and Two New Approaches
L1 regularization is effective for feature selection, but the resulting optimization is challenging due to the non-differentiability of the 1-norm.
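For concreteness, the optimization problem in question has the standard L1-regularized form (generic notation, not taken verbatim from the paper):

\[
\min_{w \in \mathbb{R}^d} \; L(w) + \lambda \|w\|_1,
\]

where L is a differentiable loss (for example, logistic or squared loss) and λ > 0 controls the sparsity of the solution. The term \(\|w\|_1 = \sum_j |w_j|\) is non-differentiable wherever any coordinate w_j equals zero, which is exactly where sparse solutions live.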
In this paper we compare state-of-the-art optimization techniques for solving this problem across several loss functions.
Furthermore, we propose two new techniques.
The first is based on a smooth (differentiable) convex approximation of the L1 regularizer that does not rely on any assumptions about the loss function used.
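As a rough illustration of this idea, here is a minimal Python sketch. The surrogate shown, sqrt(w_j^2 + eps), is a common smoothing choice and is not necessarily the exact approximation used in the paper; loss_grad_fn is a hypothetical user-supplied function returning the loss value and its gradient.

import numpy as np

def smooth_l1(w, eps=1e-6):
    # Differentiable surrogate for ||w||_1: sum_j sqrt(w_j^2 + eps).
    return np.sum(np.sqrt(w ** 2 + eps))

def smooth_l1_grad(w, eps=1e-6):
    # Gradient of the surrogate; well defined even at w_j = 0.
    return w / np.sqrt(w ** 2 + eps)

def smoothed_objective(w, loss_grad_fn, lam, eps=1e-6):
    # loss_grad_fn(w) is assumed to return (loss_value, loss_gradient)
    # for any differentiable loss (logistic, squared, etc.).
    loss, grad = loss_grad_fn(w)
    value = loss + lam * smooth_l1(w, eps)
    gradient = grad + lam * smooth_l1_grad(w, eps)
    return value, gradient

Because the smoothed objective and its gradient are defined everywhere, they can be handed to any generic gradient-based optimizer, for example a quasi-Newton routine such as scipy.optimize.minimize with jac=True.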
The other technique is a new strategy that addresses the non-differentiability of the L1 regularizer by casting the problem as a constrained optimization problem, which is then solved with a specialized gradient projection method.
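A standard way to obtain such a constrained reformulation is the non-negative split w = u - v with u, v >= 0, under which the L1 penalty becomes the linear term lam * sum(u + v) and the only constraints are simple bounds. The sketch below illustrates that reformulation with a plain projected-gradient loop (fixed step size, hypothetical loss_grad_fn as above); the paper's specialized gradient projection method is more elaborate than this.

import numpy as np

def projected_gradient_l1(loss_grad_fn, w0, lam, step=1e-3, iters=500):
    # Minimize L(w) + lam * ||w||_1 via the split w = u - v, u >= 0, v >= 0.
    # The L1 term becomes the linear term lam * sum(u + v), so the only
    # constraints are non-negativity, enforced by projection (clipping at 0).
    u = np.maximum(w0, 0.0)
    v = np.maximum(-w0, 0.0)
    for _ in range(iters):
        _, g = loss_grad_fn(u - v)                   # gradient of the loss at w = u - v
        u = np.maximum(u - step * (g + lam), 0.0)    # gradient step + projection for u
        v = np.maximum(v - step * (-g + lam), 0.0)   # gradient step + projection for v
    return u - v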
Extensive comparisons show that, measured by the number of function evaluations required, our newly proposed approaches are consistently among the best in terms of convergence speed and efficiency.
Source code and related reference websites:
https://www.cs.ubc.ca/~schmidtm/Software/L1General.html
https://www.cs.ubc.ca/~schmidtm/Software/L1General/examples.html
Download link for the original English paper:
http://page2.dfpan.com/fs/5lcaj2021d290168bc1/