How Does Differentiation Work in Computers? 🧮
How It Started
It all started when I was watching the YouTube video Finding The Slope Algorithm (Forward Mode Automatic Differentiation) by Computerphile:
In this video, Mark Williams demonstrates Forward Mode Automatic Differentiation and explains how it addresses the limitations of traditional methods like Symbolic and Numerical Differentiation. The video also introduces the concept of Dual Numbers and shows how they can be efficiently used to compute the gradient of a function at any point.
Let's start with the basic concepts of differentiation!
Differentiation
In mathematics, the derivative is a fundamental tool that quantifies the sensitivity to change of a function's output with respect to its input. The derivative of a function of a single variable at a chosen input value, when it exists, is the slope of the tangent line to the graph of the function at that point. The process of finding a derivative is called differentiation.
Mathematical Definition
A function of a real variable \( f(x) \) is differentiable at a point \( a \) of its domain if its domain contains an open interval containing \( a \) and the following limit \( L \) exists:
$$ L = \lim_{h \to 0} \frac{f(a + h) - f(a)}{h} $$
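To see this definition in action, here is a minimal Python sketch (the function \( f \) and the point \( a \) are arbitrary choices for illustration) that evaluates the difference quotient for shrinking values of \( h \):

```python
# Approximate f'(a) with the difference quotient (f(a + h) - f(a)) / h,
# watching it approach the true derivative as h shrinks.

def f(x):
    return x ** 2  # example function; analytically f'(x) = 2x, so f'(3) = 6

a = 3.0
for h in (1e-1, 1e-3, 1e-5, 1e-7):
    quotient = (f(a + h) - f(a)) / h
    print(f"h = {h:.0e}  ->  quotient = {quotient:.8f}")
```

The quotient closes in on the true value of 6, though pushing \( h \) ever smaller eventually backfires due to floating-point cancellation, one of the limitations of numerical differentiation that the video discusses.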
Why Do We Calculate Derivatives?
Derivatives are a fundamental concept in calculus with many practical use cases across science, engineering, economics, and computer science. They measure how a quantity changes in response to a small change in another, commonly called "rate of change".
Think of a derivative as answering:
"If I nudge the input just a little, how much does the output change?"
This makes derivatives crucial wherever change, sensitivity, or optimization is important. Some important applications include:
- CFD (Computational Fluid Dynamics) 🌊: Simulates fluid flow by solving Navier-Stokes equations using partial derivatives of velocity, pressure, and density. These derivatives capture how small changes propagate, enabling realistic real-time simulations of smoke, fire, and airflow.
- Image Processing & Edge Detection 🖼️: Image processing uses derivatives like Sobel filters and Laplacians to detect edges by identifying rapid changes in pixel intensity. This helps highlight boundaries for applications in computer vision and object recognition.
- Signal Smoothing & Filtering 📡: Derivatives detect sudden spikes or noise in audio and sensor data, enabling smoothing and feedback control. This improves performance in applications like music processing, GPS filtering, and motion stabilization.
- Machine Learning & AI 🧠: In machine learning, derivatives guide gradient descent by showing how to adjust weights to minimize loss. Backpropagation uses these derivatives to efficiently update neural network parameters during training (see the sketch after this list).
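To make the machine-learning case concrete, here is a minimal gradient-descent sketch; the loss function, starting weight, and learning rate are all arbitrary choices for illustration:

```python
# Minimize loss(w) = (w - 3)^2 by repeatedly stepping against its derivative.
# The derivative d(loss)/dw = 2 * (w - 3) tells us which direction (and how
# strongly) to nudge w in order to reduce the loss.

def loss_grad(w):
    return 2.0 * (w - 3.0)  # analytic derivative of (w - 3)^2

w = 0.0                 # initial weight
learning_rate = 0.1
for _ in range(50):
    w -= learning_rate * loss_grad(w)  # gradient-descent update

print(f"w after training: {w:.6f}")    # converges toward the minimum at 3.0
```

This one-variable toy is exactly the update rule neural-network training applies across millions of weights at once.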
Differentiation Methods
When it comes to differentiating with a computer, our first instinct is often the method we learned in high school: applying known rules for common functions step by step. Let's take a closer look at how that works!
Symbolic Differentiation
Symbolic differentiation builds on the fundamental definition of the derivative: taking the limit of the difference quotient. Once the derivatives of basic functions are established, we can systematically apply differentiation rules to compute the derivatives of more complex expressions.
$$ \boxed{L = \lim_{h \to 0} \frac{f(a + h) - f(a)}{h}} $$
Let's calculate it for a simple function: \( f(x)=x^n \)
$$ \begin{aligned} f(x) &= x^n \\ f'(x) &= \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} \\ &= \lim_{h \to 0} \frac{(x + h)^n - x^n}{h} \end{aligned} $$
Now apply the Binomial Theorem:
$$ (x + h)^n = x^n + n x^{n-1} h + \frac{n(n-1)}{2!} x^{n-2} h^2 + \cdots + h^n $$
Substitute this back into the limit:
$$ \begin{aligned} f'(x) &= \lim_{h \to 0} \frac{x^n + nx^{n-1}h + \frac{n(n-1)}{2!}x^{n-2}h^2 + \cdots + h^n - x^n}{h} \\ &= \lim_{h \to 0} \frac{nx^{n-1}h + \frac{n(n-1)}{2!}x^{n-2}h^2 + \cdots + h^n}{h} \\ &= \lim_{h \to 0} \left(nx^{n-1} + \frac{n(n-1)}{2!}x^{n-2}h + \cdots + h^{n-1}\right) \\ &= nx^{n-1} \end{aligned} $$
Using the power rule from symbolic differentiation, the derivative is:
$$ \frac{d}{dx} x^n = n x^{n-1} $$
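If you'd like to check this result on a computer, a symbolic math library such as SymPy performs exactly this kind of rule-based differentiation (a minimal sketch; the concrete exponent is just an example):

```python
import sympy as sp

x, n = sp.symbols('x n')

print(sp.diff(x ** 3, x))               # 3*x**2
print(sp.simplify(sp.diff(x ** n, x)))  # n*x**(n - 1): the power rule
```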
This rule is one of the foundational results and forms the basis for differentiating more complex expressions built from powers of \(x\). You've probably seen a reference sheet that lists differentiation rules and formulas like these.
By breaking a complicated function into smaller components and systematically applying basic rules such as the power, product, and chain rules, you can compute its derivative step by step.
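As a quick illustration, here is a SymPy sketch that differentiates a composite expression; the expression itself is an arbitrary example, but it exercises the product and chain rules together:

```python
import sympy as sp

x = sp.symbols('x')
expr = sp.sin(x ** 2) * sp.exp(x)   # a product of two composite pieces

# SymPy applies the product and chain rules automatically:
# d/dx [sin(x^2) * e^x] = 2x*cos(x^2)*e^x + sin(x^2)*e^x
print(sp.diff(expr, x))
```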
Stay Tuned
Hope you enjoyed reading this. Stay tuned for more cool stuff coming your way!