The Newton-Raphson method is an iterative algorithm for finding the root of a function $f(x)$, meaning the value of $x$ where $f(x)=0$. The method starts at an initial guess $x_i$ and refines the estimate by using the function's slope at that point. The update rule is given by:
$ x_{i+1}=x_i- \frac{f(x_i)}{f^\prime(x_i)} $
where:
- $x_i$ is the current estimate
- $f^\prime(x_i)$ is the derivative of $f(x)$ evaluated at $x_i$
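As a minimal sketch (not part of the note itself), the update rule can be implemented directly; the function names and the tolerance/iteration-cap parameters here are illustrative choices:

```python
def newton_raphson(f, f_prime, x0, tol=1e-10, max_iter=50):
    """Iterate x_{i+1} = x_i - f(x_i)/f'(x_i) until |f(x)| < tol."""
    x = x0
    for _ in range(max_iter):
        if abs(f(x)) < tol:
            return x
        x = x - f(x) / f_prime(x)
    return x

# Example: find the root of f(x) = x^2 - 2, i.e. sqrt(2), starting at x0 = 1
root = newton_raphson(lambda x: x**2 - 2, lambda x: 2 * x, 1.0)
```

The iteration cap guards against the non-converging cases discussed below.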
## Derivation
The method relies on the *linear approximation* of a function, which says that over a very small interval $\delta x$ the function $f(x)$ is approximately linear.
$ f(x_0+\delta x) \approx f(x_0)+f'(x_0)\cdot\delta x $
We can view it from the [[Rise Over Run Method]], where we look at a new point $(x_0+\delta x)$, which is close to the reference point $x_0$. The function value at that new point is the function value at the reference point, plus the slope ("rise") $f^\prime(x_0)$ times the distance to the reference point ("run") $\delta x$.
![[rise-over-run-2.png|center|400]]
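A quick numerical check of the linear approximation (a sketch, not from the note; $f(x)=\sin(x)$ and the step size are arbitrary choices):

```python
import math

# Linear approximation of f(x) = sin(x) around x0 = 1.0
x0, dx = 1.0, 0.01
approx = math.sin(x0) + math.cos(x0) * dx   # f(x0) + f'(x0) * dx
exact = math.sin(x0 + dx)                    # true value at the new point
error = abs(approx - exact)                  # shrinks like dx^2
```

The approximation error shrinks quadratically in $\delta x$, which is why the method works well once the estimate is close to the root.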
By setting $f(x_0+ \delta x)=0$ and solving for $\delta x$, we obtain the distance from $x_0$ at which the linearization of $f(x)$ crosses zero.
$
\begin{aligned}
f(x_0+\delta x)&= f(x_0)+f^\prime(x_0)\cdot\delta x \\[8pt]
0&=f(x_0)+f^\prime(x_0)\cdot\delta x \\[2pt]
\delta x &= -\frac{f(x_0)}{f'(x_0)}
\end{aligned}
$
The updated estimate $x_{i+1}$ of the root is then $x_i + \delta x$.
$
\begin{aligned}
x_{i+1}&=x_i+ \delta x \\[2pt]
x_{i+1}&=x_i- \frac{f(x_i)}{f'(x_i)}
\end{aligned}
$
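The derivation above can be traced for a single step (a worked sketch; $f(x)=x^2-2$ and the starting point are illustrative):

```python
# One Newton step for f(x) = x^2 - 2 starting at x_i = 2
f = lambda x: x**2 - 2
f_prime = lambda x: 2 * x

x_i = 2.0
delta_x = -f(x_i) / f_prime(x_i)  # -f(2)/f'(2) = -2/4 = -0.5
x_next = x_i + delta_x            # 2.0 - 0.5 = 1.5, closer to sqrt(2)
```

Repeating this step drives the estimate toward $\sqrt{2} \approx 1.4142$.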
## Non-Converging Cases
- *Slope near zero:* Near turning points, the derivative can be very small $f^\prime(x_i) \approx 0$. The resulting update step $-\frac{f(x_i)}{f^\prime(x_i)}$ becomes very large, resulting in a new estimate that is potentially far away from the root.
- *Unlucky starting point:* If the initial guess $x_0$ is too far from the root, Newton's method may fail to converge, or may converge to a different root than intended.
- *Oscillation or cycling:* In some cases, the estimates $x_i$ may oscillate without approaching the root.