In mathematics, the **Hessian matrix** or **Hessian** is a square matrix of __second-order partial derivatives__ of a scalar-valued function, or scalar field. It describes the local curvature of a function of many variables. Hessian matrices are often used in optimization problems, for example within the Newton-Raphson method.
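In general, for a twice-differentiable $f: \mathbb{R}^{n} \rightarrow \mathbb{R}$, the Hessian is the $n \times n$ matrix collecting all second-order partial derivatives:

$$(\mathbf{H} f)_{i j}=\frac{\partial^{2} f}{\partial x_{i} \partial x_{j}}, \quad i, j=1, \ldots, n.$$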

## Example 1: Computing a Hessian

**Problem:** Compute the Hessian of $f(x, y)=x^{3}-2 x y-y^{6}$.

**Solution:**

First compute both partial derivatives:

$$f_{x}(x, y)=\frac{\partial}{\partial x}\left(x^{3}-2 x y-y^{6}\right)=3 x^{2}-2 y$$

$$f_{y}(x, y)=\frac{\partial}{\partial y}\left(x^{3}-2 x y-y^{6}\right)=-2 x-6 y^{5}$$
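These first partials can be double-checked symbolically; the sketch below assumes SymPy is available:

```python
import sympy as sp

x, y = sp.symbols("x y")
f = x**3 - 2*x*y - y**6

# Differentiate with respect to each variable; these should match
# f_x = 3x^2 - 2y and f_y = -2x - 6y^5 computed above.
f_x = sp.diff(f, x)
f_y = sp.diff(f, y)
print(f_x)
print(f_y)
```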

With these, we compute all four second partial derivatives:

$$f_{x x}(x, y)=\frac{\partial}{\partial x}\left(3 x^{2}-2 y\right)=6 x$$

$$f_{x y}(x, y)=\frac{\partial}{\partial y}\left(3 x^{2}-2 y\right)=-2$$

$$f_{y x}(x, y)=\frac{\partial}{\partial x}\left(-2 x-6 y^{5}\right)=-2$$

$$f_{y y}(x, y)=\frac{\partial}{\partial y}\left(-2 x-6 y^{5}\right)=-30 y^{4}$$

Note that $f_{x y}=f_{y x}$, as guaranteed by Clairaut's theorem whenever the second partials are continuous. The Hessian matrix in this case is a $2 \times 2$ matrix with these functions as entries:

$$\mathbf{H} f(x, y)=\left[ \begin{array}{cc}{f_{x x}(x, y)} & {f_{x y}(x, y)} \\ {f_{y x}(x, y)} & {f_{y y}(x, y)}\end{array}\right]=\left[ \begin{array}{cc}{6 x} & {-2} \\ {-2} & {-30 y^{4}}\end{array}\right]$$

## Example 2

**Problem:** Consider the function $f(x)=x^{\top} A x+b^{\top} x+c$, where $A$ is an $n \times n$ matrix, $b$ is a vector of length $n$, and $c$ is a constant.

- Determine the gradient of $f$: $\nabla f(x)$.
- Determine the Hessian of $f$: $H_{f}(x)$.

**Solution:**
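The solution is left blank above; a sketch follows, assuming $A$ is not necessarily symmetric. Writing $f$ in components, $f(x)=\sum_{i, j} A_{i j} x_{i} x_{j}+\sum_{i} b_{i} x_{i}+c$, so

$$\frac{\partial f}{\partial x_{k}}=\sum_{j} A_{k j} x_{j}+\sum_{i} A_{i k} x_{i}+b_{k}, \quad \text { i.e. } \quad \nabla f(x)=\left(A+A^{\top}\right) x+b,$$

and differentiating once more,

$$H_{f}(x)=A+A^{\top}.$$

When $A$ is symmetric, these reduce to $\nabla f(x)=2 A x+b$ and $H_{f}(x)=2 A$. The closed-form gradient can be sanity-checked numerically with central differences (a sketch; names like `num_grad` are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))   # not necessarily symmetric
b = rng.standard_normal(n)
c = 1.5

def f(x):
    return x @ A @ x + b @ x + c

x0 = rng.standard_normal(n)
eps = 1e-6
I = np.eye(n)

# Central-difference approximation of the gradient at x0.
num_grad = np.array([(f(x0 + eps * I[k]) - f(x0 - eps * I[k])) / (2 * eps)
                     for k in range(n)])

# Should match the closed form (A + A^T) x0 + b.
assert np.allclose(num_grad, (A + A.T) @ x0 + b, atol=1e-5)
```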