This seemingly obscure function is useful for solving systems of differential equations in which each derivative is a linear combination of the functions themselves. We can plot the solutions of such a system as 2D points, one for each input t. For example, we can take x(t) = cos(t) and y(t) = sin(t), which traces out the unit circle when you let t range from 0 to ∞ (or rather, in this case 0 to 2π would be enough to trace out the full path). We can write this system as a multiplication of a matrix by a vector (I am using the floor and ceiling symbols to represent the borders of a matrix, not as their original functions):

⌈x'(t)⌉   ⌈0 -1⌉   ⌈x(t)⌉
⌊y'(t)⌋ = ⌊1  0⌋ * ⌊y(t)⌋
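Before diving into the matrix machinery, here is a minimal numpy sketch of the plotting idea above: sample t over one period, evaluate x(t) = cos(t) and y(t) = sin(t), and confirm that every sampled point lies on the unit circle. (The variable names are my own; this just illustrates the parametric picture, not the solver.)

```python
import numpy as np

# Sample the parametric curve x(t) = cos(t), y(t) = sin(t) over one full period.
t = np.linspace(0.0, 2.0 * np.pi, 200)
x = np.cos(t)
y = np.sin(t)

# Every sampled point lies on the unit circle: x^2 + y^2 = 1.
radius_sq = x**2 + y**2
print(np.allclose(radius_sq, 1.0))  # True
```

Plotting (x, y) as 2D points (e.g. with matplotlib) would draw exactly the circle described above.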
This matrix equation corresponds to the system x'(t) = -y(t) and y'(t) = x(t). For this system it is pretty easy to just guess an answer, in this case x(t) = cos(t) and y(t) = sin(t), the pair I mentioned earlier (check: x'(t) = -sin(t) = -y(t), and y'(t) = cos(t) = x(t)). But multiplying both of these functions by a constant would also work, and a single guess will not always satisfy our given initial conditions. How can we incorporate all possible initial conditions into our solution? To do this, we can use the matrix exponential. Consider the scalar function f(t) = ceᵏᵗ, where the initial condition (when t = 0) is c, and k controls how fast it grows. The derivative of this function is k * ceᵏᵗ, or k * f(t). If we substitute a matrix M for k, and make f(t) a column vector encoding x(t) and y(t), we get this equation: d/dt f(t) = M f(t). You may notice this looks just like the scalar equation, where the derivative is some value multiplied by the function. So, we can express f(t) like this: f(t) = eᴹᵗ * v(0), where v(0) is a column vector describing the initial condition. In the case of our original system, let's say x(0) and y(0) are the initial values of x(t) and y(t), respectively. If we plug them in and distribute t throughout the matrix, we get this:

f(t) = exp ( ⌈0 -t⌉ )   ⌈x(0)⌉
           ( ⌊t  0⌋ ) * ⌊y(0)⌋

But how would we go about actually computing this? Usually, the Taylor-Maclaurin series is used. Raising a matrix to a whole-number exponent is relatively easy: you just multiply the matrix by itself as many times as necessary (powers of a single matrix always commute with each other, so there is no ambiguity about the order). Dividing a matrix by a scalar is also straightforward: just divide each entry by that scalar. Using the Taylor-Maclaurin series, we can represent exp(x) as Σ(xⁿ/n!) for n = [0...∞]. From there, we can insert our matrix and evaluate as many terms as we want. In this case, though, if we expand the powers, we can actually find an interesting relationship. Our matrix (without the factor of t) is the 90°, or quarter-turn, counterclockwise rotation matrix about the origin, so its powers loop back every 4 terms. After doing some algebra, the sum of the series looks like this:

⌈1 - t²/2! + t⁴/4! - ...    -t + t³/3! - t⁵/5! + ...⌉
⌊t - t³/3! + t⁵/5! - ...     1 - t²/2! + t⁴/4! - ...⌋

Now, we can notice the entries are the Taylor-Maclaurin expansions of cos(t), -sin(t), and sin(t). Substituting those in, and putting the result back into our function:

f(t) = ⌈cos(t) -sin(t)⌉   ⌈x(0)⌉
       ⌊sin(t)  cos(t)⌋ * ⌊y(0)⌋

(Note: this is basically the matrix equivalent of Euler's famous formula: eⁱᵗ = cos(t) + i sin(t). Multiplying a complex number by i rotates it 90° counterclockwise about the origin, analogous to our matrix. Both trace out a circle, the only shape whose tangent line is perpendicular to the position vector at every point on its path.) Finally, distributing:

f(t) = ⌈x(0) cos(t) - y(0) sin(t)⌉
       ⌊x(0) sin(t) + y(0) cos(t)⌋

meaning x(t) = x(0) cos(t) - y(0) sin(t), and y(t) = x(0) sin(t) + y(0) cos(t). As a final check, you can plug these back into our original system and verify that they satisfy it: x'(t) = -x(0) sin(t) - y(0) cos(t) = -y(t), and y'(t) = x(0) cos(t) - y(0) sin(t) = x(t). Both of these equations are indeed true, meaning we have effectively solved our system of differential equations using an unorthodox but undeniably useful tool: the matrix exponential.
Grant Sanderson (3Blue1Brown) has an amazing video on his channel (
https://www.youtube.com/watch?v=O85OWBJ2ayo), which heavily inspired this graph and which I highly recommend. He also has many other brilliant videos about all sorts of mathematical phenomena, and provides captivating, out-of-the-box visual proofs.