sigpy.alg.PrimalDualHybridGradient

class sigpy.alg.PrimalDualHybridGradient(proxfc, proxg, A, AH, x, u, tau, sigma, theta=1, gamma_primal=0, gamma_dual=0, max_iter=100, tol=0)[source]

Primal-dual hybrid gradient (PDHG).

Considers the problem:

\[\min_x \max_u - f^*(u) + g(x) + \left<Ax, u\right>\]

Or equivalently:

\[\min_x f(A x) + g(x)\]

where f and g are simple, i.e., their proximal operators can be computed efficiently.
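
Concretely, the standard PDHG iteration from the cited reference (with extrapolation parameter theta; written here as a sketch, the internal update order may differ) is

\[u^{k+1} = \mathrm{prox}_{\sigma f^*}\left(u^k + \sigma A \bar{x}^k\right)\]

\[x^{k+1} = \mathrm{prox}_{\tau g}\left(x^k - \tau A^H u^{k+1}\right)\]

\[\bar{x}^{k+1} = x^{k+1} + \theta \left(x^{k+1} - x^k\right)\]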

Parameters:
  • proxfc (function) – Function to compute the proximal operator of f^*.
  • proxg (function) – Function to compute the proximal operator of g.
  • A (function) – Function to compute the forward linear mapping A.
  • AH (function) – Function to compute the adjoint linear mapping of A.
  • x (array) – Primal solution.
  • u (array) – Dual solution.
  • tau (float or array) – Primal step-size (see the step-size note after this list).
  • sigma (float or array) – Dual step-size.
  • theta (float) – Extrapolation parameter for the primal variable.
  • gamma_primal (float) – Strong convexity parameter of g.
  • gamma_dual (float) – Strong convexity parameter of f^*.
  • max_iter (int) – Maximum number of iterations.
  • tol (float) – Tolerance for stopping condition.
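
As a rule of thumb (not something this class enforces), the fixed step sizes are commonly chosen to satisfy the convergence condition of Chambolle & Pock (2011),

\[\tau \sigma \|A\|_2^2 \le 1,\]

for example tau = sigma = 1 / ||A||_2. When g or f^* is strongly convex (gamma_primal > 0 or gamma_dual > 0), the reference describes accelerated variants that adapt tau, sigma, and theta across iterations.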

References

Chambolle, A., & Pock, T. (2011). A first-order primal-dual algorithm for convex problems with applications to imaging. Journal of mathematical imaging and vision, 40(1), 120-145.

__init__(proxfc, proxg, A, AH, x, u, tau, sigma, theta=1, gamma_primal=0, gamma_dual=0, max_iter=100, tol=0)[source]

Initialize self. See help(type(self)) for accurate signature.

Methods

__init__(proxfc, proxg, A, AH, x, u, tau, sigma) Initialize self.
done() Return whether the algorithm is done.
update() Perform one update step.
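
A minimal usage sketch for an l1-regularized least-squares problem follows. It assumes the prox callables are invoked as prox(alpha, v) and that x and u are updated in place; the matrix M, lam, array sizes, and step-size choice are illustrative assumptions, not part of the API:

    import numpy as np
    import sigpy as sp

    # Illustrative problem: min_x (1/2) ||M x - y||_2^2 + lam ||x||_1,
    # i.e. f(z) = (1/2) ||z - y||_2^2 and g(x) = lam ||x||_1.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((20, 10))
    y = rng.standard_normal(20)
    lam = 0.1

    A = lambda x: M @ x        # forward linear mapping
    AH = lambda u: M.T @ u     # adjoint linear mapping

    def proxfc(sigma, v):
        # Prox of f^*(u) = (1/2) ||u||_2^2 + <u, y>, the conjugate of f.
        return (v - sigma * y) / (1 + sigma)

    def proxg(tau, v):
        # Prox of g(x) = lam ||x||_1: soft-thresholding.
        return np.sign(v) * np.maximum(np.abs(v) - tau * lam, 0)

    # Step sizes chosen so that tau * sigma * ||A||_2^2 <= 1.
    L = np.linalg.norm(M, 2)
    tau = 1 / L
    sigma = 1 / L

    x = np.zeros(10)   # primal variable, assumed to be updated in place
    u = np.zeros(20)   # dual variable, assumed to be updated in place

    alg = sp.alg.PrimalDualHybridGradient(
        proxfc, proxg, A, AH, x, u, tau, sigma, max_iter=500)
    while not alg.done():
        alg.update()

    # x now holds the approximate solution.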