Conditional Gradient

Conditional Gradient [Frank-Wolfe Method]

The problem:

$$\min f(x) \quad \text{subject to} \quad x \in P$$

where $f$ is convex on the bounded polygonal region $P$.

How can we determine the search direction $\mathbf{d}_k$ for this problem? It is obtained from a linear subproblem. Linearizing $f$ at $x_k$ gives

$$Z_k(y) := f(x_k) + \nabla f(x_k)^T (y - x_k),$$

and, since the constant $f(x_k)$ does not affect the minimizer, the subproblem is

$$\min \; z_k(y) = \nabla f(x_k)^T (y - x_k) \quad \text{subject to} \quad y \in P.$$
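Below is a minimal sketch of how this linear subproblem could be solved, assuming $P$ is given in the form $\{y : A_{ub}\, y \le b_{ub}\}$; the function name `solve_subproblem` and the data `A_ub`, `b_ub` are illustrative, not from the text.

```python
import numpy as np
from scipy.optimize import linprog

def solve_subproblem(grad_fk, A_ub, b_ub):
    """Minimize grad_f(x_k)^T y over the assumed polytope {y : A_ub @ y <= b_ub}.

    The terms f(x_k) and -grad_f(x_k)^T x_k are constants in y, so minimizing
    z_k(y) is equivalent to minimizing grad_f(x_k)^T y.
    """
    # bounds=(None, None): no sign restriction on y; boundedness comes from A_ub, b_ub.
    res = linprog(c=grad_fk, A_ub=A_ub, b_ub=b_ub, bounds=(None, None))
    return res.x  # the subproblem solution y_k (a vertex when the LP solver returns one)
```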

We first look into the direction $(y - x_k)$.

Suppose $y_k$ is a solution of this problem.

  • $y_k$ is an extreme point of the polygon $P$ and hence, since $P$ is convex, the line segment joining $y_k$ and $x_k$ is contained in $P$, so the vector $y_k - x_k$ is a feasible direction.

  • Because of the convexity of $f$, the condition $\nabla f(\bar{x})^T(y-\bar{x}) \geq 0 \;\; \forall y\in P$ is sufficient for $\bar{x}$ to be a global minimizer over $P$. Hence, if $x_k$ is not optimal, there exists $y \in P$ with $\nabla f(x_k)^T(y-x_k) < 0$, so the direction $y_k - x_k$ is also a descent direction.

  • $z_k(y_k) \leq z_k(x_k) = \nabla f(x_k)^T(x_k-x_k) = 0$, so either $z_k(y_k) = 0$ or $z_k(y_k) < 0$.
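With the subproblem in hand, one way to assemble the full conditional gradient iteration is sketched below, assuming the `solve_subproblem` helper from the sketch above and a callable `grad_f` for $\nabla f$; the step size $2/(k+2)$ is the standard diminishing rule and is an assumption, since the text does not specify how the step along $\mathbf{d}_k = y_k - x_k$ is chosen. The stopping test reflects the last bullet: the iteration stops when $z_k(y_k) \approx 0$, and otherwise $z_k(y_k) < 0$ gives a feasible descent direction.

```python
import numpy as np

def frank_wolfe(grad_f, x0, A_ub, b_ub, max_iter=100, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad_f(x)
        y_k = solve_subproblem(g, A_ub, b_ub)   # extreme point y_k of P
        d_k = y_k - x                           # feasible direction, since P is convex
        if g @ d_k >= -tol:                     # z_k(y_k) ~ 0: first-order optimality
            break
        gamma = 2.0 / (k + 2)                   # assumed diminishing step size in (0, 1]
        x = x + gamma * d_k                     # convex combination of x_k and y_k stays in P
    return x
```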
