When $f$ is a quadratic function, the convergence analysis of Newton's method is straightforward.
Starting from any initial point $\b{x}^{(0)}$, Newton's method reaches the point $\b{x}^{*}$ satisfying $\nabla f(\b{x}^*) = \b{0}$ in a single step, provided the Hessian is invertible.
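To see the one-step property concretely, consider a sketch in NumPy (the matrix $Q$, vector $\b{b}$, and starting point below are illustrative choices, not from the text). For a quadratic $f(\b{x}) = \tfrac{1}{2}\b{x}^\top Q \b{x} - \b{b}^\top \b{x}$, the gradient is $Q\b{x} - \b{b}$ and the Hessian is the constant matrix $Q$, so one Newton step from any $\b{x}^{(0)}$ gives $\b{x}^{(0)} - Q^{-1}(Q\b{x}^{(0)} - \b{b}) = Q^{-1}\b{b} = \b{x}^*$:

```python
import numpy as np

# Quadratic objective f(x) = 0.5 x^T Q x - b^T x with Q symmetric positive definite
# (Q, b, and x0 are arbitrary illustrative values).
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

def grad(x):
    # Gradient of the quadratic: Qx - b
    return Q @ x - b

x0 = np.array([10.0, -7.0])              # arbitrary starting point
x1 = x0 - np.linalg.solve(Q, grad(x0))   # one Newton step: x0 - Q^{-1} grad(x0)
x_star = np.linalg.solve(Q, b)           # true stationary point Q^{-1} b

print(np.allclose(x1, x_star))  # True: a single step reaches x*
```

Because the Newton model of a quadratic is exact, the same cancellation happens regardless of the starting point.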
</span>
</span>
Suppose that $f \in \mathcal{C}^3$, and let $\b{x}^* \in \R^n$ be a point such that $\nabla f(\b{x}^*) = \b{0}$ and $\b{F}(\b{x}^*)$ is invertible. Then, for all $\b{x}^{(0)}$ sufficiently close to $\b{x}^*$, Newton's method is well defined for all $k$ and converges to $\b{x}^*$ with order of convergence at least $2$.
Let $\set{ \b{x}^{(k)} }$ be the sequence generated by Newton's method for minimizing a given objective function $f(\b{x})$. If the Hessian $\b{F}(\b{x}^{(k)}) > 0$ (that is, positive definite) and $\b{g}^{(k)} = \nabla f(\b{x}^{(k)}) \neq \b{0}$, then the direction
$$\b{d}^{(k)} = - \inv{ \b{F}(\b{x}^{(k)})} \b{g}^{(k)} = \b{x}^{(k+1)} - \b{x}^{(k)}$$
from $\b{x}^{(k)}$ to $\b{x}^{(k+1)}$ is a descent direction for $f$, in the sense that there exists an $\overline{\alpha} > 0$ such that for all $\alpha \in (0, \overline{\alpha})$,
$$ f(\b{x}^{(k)} + \alpha \b{d}^{(k)}) < f(\b{x}^{(k)}) $$
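The descent property can be checked numerically. The sketch below uses a hypothetical non-quadratic test function (not from the text): at a point where the Hessian is positive definite and the gradient is nonzero, the Newton direction $\b{d} = -\inv{\b{F}}\b{g}$ satisfies $\b{g}^\top \b{d} = -\b{g}^\top \inv{\b{F}} \b{g} < 0$, so $f$ decreases along $\b{d}$ for small enough $\alpha$:

```python
import numpy as np

# Illustrative non-quadratic objective: f(x) = x1^4 + x2^4 + (x1 - x2)^2.
def f(x):
    return x[0]**4 + x[1]**4 + (x[0] - x[1])**2

def grad(x):
    return np.array([4*x[0]**3 + 2*(x[0] - x[1]),
                     4*x[1]**3 - 2*(x[0] - x[1])])

def hess(x):
    return np.array([[12*x[0]**2 + 2, -2],
                     [-2, 12*x[1]**2 + 2]])

x = np.array([1.0, -0.5])          # arbitrary point with positive definite Hessian
g = grad(x)
d = -np.linalg.solve(hess(x), g)   # Newton direction d = -F^{-1} g

print(g @ d < 0)                   # directional derivative along d is negative
for alpha in [0.1, 0.01, 0.001]:
    print(f(x + alpha * d) < f(x))  # f decreases for small step sizes
```

Positive definiteness of $\b{F}(\b{x}^{(k)})$ is essential here: with an indefinite Hessian, $\b{g}^\top \b{d}$ can be positive and the Newton step may increase $f$.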