Houjun Liu

SU-CS205L JAN232025

Issues with Direct Methods

  • for instance, direct solvers suffer from numerical stability issues: the quadratic formula needs a numerically stable reformulation to avoid catastrophic cancellation (see the sketch below); for cubics, there may be unacceptable errors since no such fix exists
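
As an aside, here is a minimal sketch of the stable quadratic formula mentioned above; the function name and error handling are illustrative assumptions:

```python
import math

def stable_quadratic_roots(a, b, c):
    """Real roots of a*x^2 + b*x + c = 0, avoiding catastrophic cancellation.

    Naively, (-b + sqrt(disc)) / (2a) cancels badly when b^2 >> 4ac; so
    compute the larger-magnitude root first, then recover the other from
    Vieta's relation x1 * x2 = c / a."""
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        raise ValueError("no real roots")
    q = -0.5 * (b + math.copysign(math.sqrt(disc), b))
    x1 = q / a
    x2 = c / q if q != 0.0 else 0.0   # q == 0 only when b == c == 0
    return x1, x2
```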

Continuous Collision Detection

Implementing collision detection: generally, three vectors \(v_1, v_2, v_3 \in \mathbb{R}^{3}\) are linearly independent; if they become linearly dependent, we know a collision has happened. In particular, if \(\text{rank}\,\mqty(v_1 & v_2 & v_3) < 3\), we have collided.

Problem! Solving this (taking the determinant of our matrix to figure out when a collision happens) results in a cubic polynomial: each \(v_i(t)\) is linear in \(t\), so \(\det \mqty(v_1(t) & v_2(t) & v_3(t))\) is cubic in \(t\). Solving a cubic directly is numerically quite unstable.

So, to make this work, we actually have to use a double-precision iterative solver.
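
For instance, here is a minimal sketch of such a refinement, assuming the cubic's coefficients and a rough initial guess (from bracketing or a direct formula) are already in hand; the function name and tolerances are illustrative:

```python
def refine_cubic_root(coeffs, t0, iters=50, tol=1e-14):
    """Refine a root of a*t^3 + b*t^2 + c*t + d = 0 via Newton's method,
    starting from a rough initial guess t0."""
    a, b, c, d = coeffs       # Python floats are IEEE double precision
    t = float(t0)
    for _ in range(iters):
        p  = ((a * t + b) * t + c) * t + d    # Horner evaluation of p(t)
        dp = (3.0 * a * t + 2.0 * b) * t + c  # p'(t)
        if dp == 0.0:
            break                             # flat spot: Newton undefined
        step = p / dp
        t = t - step
        if abs(step) <= tol * max(1.0, abs(t)):
            break                             # converged to double precision
    return t
```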

Residual and Solution Error

When solving \(Ac = b\), the current guess \(c^{(q)}\) has residual \(r^{(q)} = b - A c^{(q)} = Ac^{\text{exact}} - A c^{(q)}\); the error in the parameters, \(e^{(q)} = c^{(q)} - c^{\text{exact}}\), relates to the residual by:

\begin{equation} r^{(q)} = - A e^{(q)} \end{equation}

This holds because:

\begin{equation} r^{(q)} = b - Ac^{(q)} = Ac^{\text{exact}} - Ac^{(q)} = A \qty(c^{\text{exact}} - c^{(q)}) = -Ae^{(q)} \end{equation}

Substituting the SVD \(A = U \Sigma V^{T}\) into this:

\begin{equation} U^{T} r^{(q)} = - \Sigma \qty(V^{T}e^{(q)}) \end{equation}

Writing \(\hat{r}^{(q)} = U^{T} r^{(q)}\) and \(\hat{e}^{(q)} = V^{T} e^{(q)}\), componentwise we have \(\hat{r}^{(q)}_{k} = -\sigma_{k} \hat{e}_{k}^{(q)}\); small singular values lead to deceptively small residuals even when the error in the parameters is large.
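
To see this numerically, here is a small sketch (the matrix and magnitudes are made up for illustration): a guess that is wrong by \(10^{4}\) along the direction paired with \(\sigma_{2} = 10^{-8}\) still produces a residual of norm only \(10^{-4}\).

```python
import numpy as np

rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.normal(size=(2, 2)))   # random orthogonal U
V, _ = np.linalg.qr(rng.normal(size=(2, 2)))   # random orthogonal V
A = U @ np.diag([1.0, 1e-8]) @ V.T             # sigma_1 = 1, sigma_2 = 1e-8

c_exact = np.array([1.0, 1.0])
b = A @ c_exact

# Guess that is badly wrong along V's small-singular-value direction:
c_guess = c_exact + 1e4 * V[:, 1]

r = b - A @ c_guess     # residual: norm ~ sigma_2 * 1e4 = 1e-4
e = c_guess - c_exact   # parameter error: norm = 1e4
print(np.linalg.norm(r), np.linalg.norm(e))
```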

For background on what this is, see Line Search.

So, to update parameters, we have:

\begin{equation} c^{(q+1)} = c^{(q)} + \alpha^{(q)} s^{(q)} \end{equation}

where \(s^{(q)}\) is the search direction and \(\alpha^{(q)}\) is the learning rate (step size). Subtracting the oracle \(c^{\text{exact}}\) from both sides, we get an equation for the error:

\begin{equation} e^{(q+1)} = e^{(q)} + \alpha^{(q)} s^{(q)} \end{equation}

Multiplying both sides by \(-A\) and using the relationship above:

\begin{equation} r^{(q+1)} = r^{(q)} - \alpha^{(q)} \qty(A s^{(q)}) \end{equation}

Notice how we now have update expressions for both the residual and the error! Optimally, we want to step until the parameter error in the search direction is entirely eliminated, i.e., until \(e^{(q+1)} \cdot s^{(q)} = 0\); yet the error is unknown! But multiplying through by \(-A\) converts the error into the residual, which we can compute. So, here's a useful convergence criterion:

\begin{equation} r^{(q+1)} \cdot s^{(q)} = 0 \end{equation}

so the new residual is orthogonal to the search direction. Now, plugging in the expression for \(r^{(q+1)}\) from above and solving for \(\alpha^{(q)}\), we get:

\begin{equation} \alpha^{(q)} = \frac{s^{(q)} \cdot r^{(q)}}{s^{(q)} \cdot A s^{(q)}} \end{equation}

where \(s^{(q)}\) is a vector representing the Line Search direction, and \(r^{(q)}\) is the current residual.
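
Putting the two updates together, here is a minimal sketch of one exact line-search step, assuming \(A\) is symmetric positive definite (the function name is an assumption; reusing \(A s^{(q)}\) keeps the cost at one matrix-vector product per step):

```python
import numpy as np

def line_search_step(A, c, r, s):
    """One exact line-search step for Ac = b (A symmetric positive definite).

    Given the current guess c, its residual r = b - A @ c, and a search
    direction s, pick alpha so that the new residual is orthogonal to s."""
    As = A @ s
    alpha = (s @ r) / (s @ As)   # alpha^(q) = (s . r) / (s . A s)
    c_new = c + alpha * s        # parameter update
    r_new = r - alpha * As       # residual update, reusing A @ s
    return c_new, r_new
```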

Steepest Descent

We can just choose the search direction based on the current residual! Choose \(s^{(q)} = r^{(q)}\). So, our iteration is:

\begin{equation} r^{(q)} = b - Ac^{(q)}, \qquad \alpha^{(q)} = \frac{r^{(q)} \cdot r^{(q)}}{r^{(q)} \cdot Ar^{(q)}} \end{equation}

and

\begin{equation} c^{(q+1)} = c^{(q)} + \alpha^{(q)} r^{(q)} \end{equation}

A small algebra change: instead of recomputing the first equation from scratch, we can update the residual recursively (reusing \(Ar^{(q-1)}\) from the step-size computation):

\begin{equation} r^{(q)} = r^{(q-1)} - \alpha^{(q-1)} A r^{(q-1)} \end{equation}

Challenge: since we only drive the residual along the current direction to \(0\) (not the parameter error), it's possible to oscillate: later steps can reintroduce error along directions we have already searched.
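
Here is a minimal sketch of the full iteration above, under the assumption that \(A\) is symmetric positive definite; the tolerance and iteration cap are illustrative:

```python
import numpy as np

def steepest_descent(A, b, c0, tol=1e-10, max_iters=10_000):
    """Solve Ac = b for symmetric positive definite A with s^(q) = r^(q)."""
    c = np.array(c0, dtype=np.float64)
    r = b - A @ c                        # initial residual
    bnorm = np.linalg.norm(b)
    for _ in range(max_iters):
        if np.linalg.norm(r) <= tol * bnorm:
            break                        # relative residual small enough
        Ar = A @ r
        alpha = (r @ r) / (r @ Ar)       # exact step along the residual
        c = c + alpha * r
        r = r - alpha * Ar               # recursive residual update
    return c
```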

Conjugate Gradient

See Error-Analysis for Conjugate Gradient and Conjugate Gradient