[Source Code] Convergence Rates of Inexact Proximal-Gradient Methods for Convex Optimization



We consider the problem of optimizing the sum of a smooth convex function and a non-smooth convex function using proximal-gradient methods, where an error is present in the calculation of the gradient of the smooth term or in the proximity operator with respect to the non-smooth term.
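The setup above, minimizing a smooth term plus a non-smooth term via alternating gradient and proximity-operator steps, can be sketched for ℓ1-regularized least squares. This is an illustrative instance, not code from the paper; the function names and the choice of ℓ1 as the non-smooth term are assumptions:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t * ||.||_1 (illustrative non-smooth term)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, step, iters=500):
    """Basic (exact) proximal-gradient sketch for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                          # gradient of the smooth term
        x = soft_threshold(x - step * grad, step * lam)   # proximal step on the result
    return x
```

With a step size of 1/L, where L is the Lipschitz constant of the smooth gradient (here the squared spectral norm of A), each iteration decreases the objective.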

We show that both the basic proximal-gradient method and the accelerated proximal-gradient method achieve the same convergence rate as in the error-free case, provided that the errors decrease at appropriate rates.
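To illustrate the "errors decrease at appropriate rates" condition, the following sketch perturbs the smooth gradient at iteration k with an error whose norm shrinks like 1/k². This is a hypothetical simulation of an inexact oracle, assuming ℓ1-regularized least squares as in the sketch above; the specific decay schedule is one example of a summable error sequence, not the paper's exact condition:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t * ||.||_1
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def inexact_proximal_gradient(A, b, lam, step, iters=500, seed=1):
    """Proximal-gradient where the gradient of the smooth term carries
    a simulated error e_k with ||e_k|| = 1 / k**2, a decreasing
    (summable) error sequence."""
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    for k in range(1, iters + 1):
        grad = A.T @ (A @ x - b)
        e = rng.standard_normal(x.shape)
        e *= (1.0 / k**2) / np.linalg.norm(e)             # scale error to norm 1/k^2
        x = soft_threshold(x - step * (grad + e), step * lam)
    return x
```

Because the error norms are summable, the iterates still approach a minimizer of the exact objective rather than stalling at a noise floor.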

Using these rates, we perform as well as or better than a carefully chosen fixed error level on a set of structured sparsity problems.

In recent years, the importance of taking advantage of the structure of convex optimization problems has become a topic of intense research in the machine learning community.

……

Download the original paper and related source code at:
