3.1.2.1. pyprox.forward_backward_dual

pyprox.forward_backward_dual(grad_fs, prox_gs, K, x0, L, maxiter=100, method='fb', fbdamping=1.8, full_output=0, retall=0, callback=None)

Minimize the sum of a strongly convex function and a proper convex function composed with a linear operator.

This algorithm minimizes

F(x) + G(K(x))

where F is strongly convex, G is a proper convex function and K is a linear operator. The problem is solved through a duality argument.

Parameters :

grad_fs : callable

gradient of the conjugate function F^*; should take one argument: an ndarray.

prox_gs : callable

proximal operator of the conjugate function G^*; should take two arguments: an ndarray and a float (the step size).

K : callable or ndarray

a linear operator

KS : callable or ndarray

the adjoint (dual) of the linear operator K.

x0 : ndarray

initial guess for the solution.

L : float

Lipschitz constant of the gradient of the smooth part of the dual objective.

maxiter : int, optional

maximum number of iterations.

method : string, optional

one of 'fb', 'fista' or 'nesterov'.

fbdamping : float, optional

relaxation (damping) parameter used by the 'fb' iteration.

full_output : bool, optional

non-zero to return all optional outputs.

retall : bool, optional

Return a list of results at each iteration if non-zero.

callback : callable, optional

An optional user-supplied function to call after each iteration. Called as callback(xk), where xk is the current parameter vector.
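To make the expected callables concrete, here is a sketch of valid grad_fs and prox_gs arguments for the primal problem F(x) = 0.5*||x - b||^2 and G = lam*||.||_1; the names b, lam and the choice of problem are illustrative assumptions, not part of the API:

```python
import numpy as np

# Example primal problem: min_x 0.5*||x - b||^2 + lam*||K x||_1.
# F(x) = 0.5*||x - b||^2 is 1-strongly convex; its conjugate is
# F^*(y) = 0.5*||y||^2 + <b, y>, with gradient grad F^*(y) = y + b.
# G = lam*||.||_1 has conjugate G^* = indicator of the l-inf ball of
# radius lam, whose prox (for any step size) is the projection (clip).

b = np.array([3.0, -0.5, 2.0])
lam = 1.0
K = np.eye(3)                                    # linear operator (here the identity)

grad_fs = lambda y: y + b                        # gradient of F^*
prox_gs = lambda u, tau: np.clip(u, -lam, lam)   # prox of G^* (projection onto the ball)
```

With these callables, the Lipschitz constant L to pass is ||K||^2 times the Lipschitz constant of grad F^* (here 1, since F is 1-strongly convex).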

Returns :

xrec : ndarray

the recovered minimizer of the primal problem.

fx : list

objective values at each iteration (returned if full_output is non-zero).

Notes

This algorithm uses the equivalence of

min_x F(x) + G(K(x)) (*)

with

min_u F^*(-K^*(u)) + G^*(u) (**)

where x = grad(F^*)(-K^*(u)) and the convex conjugate is defined as

F^*(y) = sup_x <x, y> - F(x)
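For instance, for F(x) = 0.5*x^2 in one dimension, the supremum is attained at x = y, giving the self-conjugate F^*(y) = 0.5*y^2. A quick numerical check of the definition (a sketch, not library code):

```python
import numpy as np

# Approximate F^*(y) = sup_x <x, y> - F(x) by maximizing over a fine grid,
# and compare with the closed form 0.5*y^2 for F(x) = 0.5*x^2.
def conjugate_numeric(F, y, grid):
    return max(x * y - F(x) for x in grid)

F = lambda x: 0.5 * x ** 2
grid = np.linspace(-10.0, 10.0, 100001)
for y in (-2.0, 0.5, 3.0):
    assert abs(conjugate_numeric(F, y, grid) - 0.5 * y ** 2) < 1e-4
```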

It uses forward_backward as the solver for (**).
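A minimal self-contained sketch of this dual scheme (plain forward-backward on (**), without the 'fista'/'nesterov' variants; all names are illustrative, not the library's internals):

```python
import numpy as np

def fb_dual_sketch(grad_fs, prox_gs, K, KS, u0, L, maxiter=100):
    """Forward-backward iteration on the dual problem (**).

    The smooth dual term u -> F^*(-K^*(u)) has gradient -K(grad_fs(-KS(u))),
    so a gradient step of size 1/L followed by the prox of G^* reads
        u <- prox_gs(u + (1/L) * K(x), 1/L),  with x = grad_fs(-KS(u)),
    and the primal iterate is recovered via x = grad F^*(-K^*(u)).
    """
    u = u0.copy()
    tau = 1.0 / L
    for _ in range(maxiter):
        x = grad_fs(-KS(u))               # primal iterate
        u = prox_gs(u + tau * K(x), tau)  # forward-backward step on the dual
    return grad_fs(-KS(u))

# Example: min_x 0.5*||x - b||^2 + ||x||_1 (K = identity), whose minimizer
# is the soft-thresholding of b: x* = sign(b) * max(|b| - 1, 0).
b = np.array([3.0, -0.5, 2.0])
grad_fs = lambda y: y + b                       # grad of F^* for F = 0.5*||. - b||^2
prox_gs = lambda u, tau: np.clip(u, -1.0, 1.0)  # prox of G^* (l-inf projection)
ident = lambda v: v
x = fb_dual_sketch(grad_fs, prox_gs, ident, ident, np.zeros(3), L=1.0, maxiter=50)
# x -> array([2., 0., 1.])
```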