From Limits to Smoothness: Transformations, Limits, Continuity & Differentiability
Functions as Transformations
A function can be seen as a machine: it takes input values and returns output values. But in a more visual and geometric sense, a function can be thought of as a transformation: changing or warping the input space into a new shape.
This post is part of the "Intro to Calculus" series
Previously: Visualizing Multivariable Functions: Contour Plots, Vector-Valued Functions & Vector Fields (Beginner's Guide)
Next: What is a Derivative? (Beginner's Guide to Calculus for ML)
Mathematical Idea
If we start with a point \( x \in \mathbb{R} \), a function \( f(x) \) maps it to another real number \( y \).
For a 2D transformation:
\[ f(x, y) = (x, y^2) \]
This "bends" the coordinate plane vertically, keeping \( x \) unchanged but squaring the \( y \)-coordinate.
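For instance, the point \( (1, -2) \), which lies below the \( x \)-axis, gets folded upward:
\[ f(1, -2) = (1, (-2)^2) = (1, 4) \]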
Python Example: Transforming a Grid
Visual: a grid before and after the transformation \( f(x, y) = (x, y^2) \).
import numpy as np
import matplotlib.pyplot as plt

# Build a regular grid of points covering [-2, 2] x [-2, 2]
x, y = np.meshgrid(np.linspace(-2, 2, 20), np.linspace(-2, 2, 20))

# Apply the transformation f(x, y) = (x, y^2)
x_new = x
y_new = y ** 2

fig, axs = plt.subplots(1, 2, figsize=(10, 4))

# Original grid points
axs[0].scatter(x, y, s=5, color='gray')
axs[0].set_title("Original Grid")
axs[0].axis('equal')

# Transformed grid points: negative y-values are folded upward
axs[1].scatter(x_new, y_new, s=5, color='teal')
axs[1].set_title("Transformed Grid: $f(x, y) = (x, y^2)$")
axs[1].axis('equal')

plt.tight_layout()
plt.show()

What Is a Limit?
A limit describes what a function is approaching as the input gets closer and closer to a specific value, even if the function is not defined at that value.
Mathematical Definition
\[ \lim_{x \to a} f(x) = L \]
This means: as \( x \) gets arbitrarily close to \( a \), the value of \( f(x) \) gets arbitrarily close to \( L \).
Example:
Let:
\[ f(x) = \frac{x^2 - 1}{x - 1} \]
At \( x = 1 \), the function is undefined, but we can factor:
\[ f(x) = \frac{(x - 1)(x + 1)}{x - 1} = x + 1 \quad \text{(for } x \neq 1 \text{)} \]
So:
\[ \lim_{x \to 1} f(x) = 2 \]
Python Visualization
# Sample x-values near 1 (this grid never hits x = 1 exactly, so there is no division by zero)
x = np.linspace(0.5, 1.5, 200)
y = (x**2 - 1)/(x - 1)
plt.plot(x, y, label=r'$f(x) = \frac{x^2 - 1}{x - 1}$')
plt.axvline(1, color='red', linestyle='--', label='x = 1')
plt.axhline(2, color='green', linestyle='--', label='Limit = 2')
plt.xlabel('x')
plt.ylabel('f(x)')
plt.title('Limit Approaching x = 1')
plt.legend()
plt.grid(True)
plt.show()
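As a cross-check, the same limit can be computed symbolically; here is a minimal sketch using SymPy (assuming it is installed):
import sympy as sp

t = sp.symbols('t')  # use a fresh symbol so we don't clash with the NumPy array x above
print(sp.limit((t**2 - 1) / (t - 1), t, 1))  # SymPy handles the removable singularity -> 2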
Examples of Limits in Functions
One-Sided Limits
\[ \lim_{x \to a^-} f(x) \quad \text{and} \quad \lim_{x \to a^+} f(x) \]
The full limit exists only when both sides agree.
Piecewise Example:
\[ f(x) = \begin{cases} x^2 & \text{if } x < 1 \\ 3 & \text{if } x = 1 \\ 2 - x & \text{if } x > 1 \end{cases} \]
Then:
\[ \lim_{x \to 1^-} f(x) = 1 \quad \text{and} \quad \lim_{x \to 1^+} f(x) = 1 \Rightarrow \lim_{x \to 1} f(x) = 1 \]
But:
\[ f(1) = 3 \neq \lim_{x \to 1} f(x) \]
So the function has a removable discontinuity at \( x = 1 \).
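A quick numerical check (a minimal sketch, with the piecewise function coded inline) shows the one-sided values converging to 1 while the function value sits at 3:
def f(x):
    # piecewise: x^2 for x < 1, the value 3 at x = 1, and 2 - x for x > 1
    if x < 1:
        return x ** 2
    elif x == 1:
        return 3
    else:
        return 2 - x

for eps in [1e-1, 1e-3, 1e-6]:
    print(f(1 - eps), f(1 + eps))  # both columns approach 1
print(f(1))  # but f(1) = 3, so the limit and the value disagree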
Continuity and Differentiability
Continuity
A function \( f(x) \) is continuous at \( x = a \) if:
- \( f(a) \) is defined
- \( \lim_{x \to a} f(x) \) exists
- \( \lim_{x \to a} f(x) = f(a) \)
Differentiability
A function is differentiable at \( x = a \) if the limit of the difference quotient \( \lim_{h \to 0} \frac{f(a+h) - f(a)}{h} \) exists; informally, the graph is smooth there, with no sharp corners or cusps. Differentiability implies continuity, but not the other way around.
Example: \( f(x) = |x| \)
- Continuous everywhere
- Not differentiable at \( x = 0 \) because of a sharp point
Python Visualization
x = np.linspace(-2, 2, 400)
y = np.abs(x)
plt.plot(x, y, label=r'$f(x) = |x|$', color='blue')
plt.axvline(0, color='red', linestyle='--', label='x = 0')
plt.xlabel('x')
plt.ylabel('f(x)')
plt.title('Absolute Value: Continuous but Not Differentiable at x = 0')
plt.legend()
plt.grid(True)
plt.show()
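We can also see this numerically by comparing one-sided difference quotients at \( x = 0 \); a minimal sketch:
h = 1e-6
f = abs
left = (f(0) - f(-h)) / h    # backward difference, approximately -1
right = (f(h) - f(0)) / h    # forward difference, approximately +1
print(left, right)           # the one-sided slopes disagree, so f'(0) does not exist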
Relevance to Machine Learning
Understanding limits, continuity, and differentiability is essential for many foundational ideas in machine learning and deep learning:
Gradient Descent & Optimization
Most learning algorithms (like gradient descent) rely on functions being continuous and differentiable so we can compute gradients and follow them to minimize the loss.
Backpropagation
Neural networks use the chain rule to propagate error gradients backward, which requires differentiable activation and loss functions.
Loss Surfaces
The cost or loss function must be smooth and continuous for optimizers to navigate toward minima efficiently. Sharp discontinuities can trap or mislead optimization.
Activation Functions
Common activations (ReLU, sigmoid, tanh) are chosen based on their continuity and differentiability, which affects both model capacity and training dynamics.
Regularization & Generalization
Techniques like L2 regularization implicitly promote smoother (more continuous and differentiable) functions, which helps with generalization and avoiding overfitting.
Adversarial Robustness
Discontinuous or non-differentiable spots in a model's behavior can be exploited by adversarial examples. Smoothness leads to more stable and robust models.
Key Insight: If your model isn't differentiable, gradient-based learning breaks down. Smoothness isn't just elegant; it's essential!
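As a toy illustration of why differentiability matters, here is a minimal gradient-descent sketch on the smooth loss \( f(x) = x^2 \) (the starting point and learning rate are arbitrary choices for illustration):
def grad(x):
    # derivative of the loss f(x) = x^2
    return 2 * x

x = 3.0    # arbitrary starting point
lr = 0.1   # arbitrary learning rate
for _ in range(50):
    x -= lr * grad(x)
print(x)   # converges toward the minimum at x = 0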
Level Up
- Functions as Mappings: Think of every function as a way to reshape input space; this is crucial for understanding transformations in deep learning layers.
- Limits and Precision: Mastering limits builds your intuition for numerical stability, convergence, and approximation in ML algorithms.
- Continuity in Practice: Continuous loss functions ensure smooth training. Discontinuities can cause sudden optimization failures.
- Differentiability = Learnability: If a function isn't differentiable, gradient-based methods (like backpropagation) won't work.
- Piecewise Behavior: Recognize when piecewise functions like ReLU introduce non-differentiable points, and how this affects learning speed.
- Function Smoothness: Smooth, continuous, and differentiable models generalize better and are more robust to noisy data.
Best Practices
- Clearly define your domain: Before analyzing limits or continuity, specify where the function is defined and what happens near edges.
- Check one-sided limits: Always test left-hand and right-hand limits, especially for piecewise or discontinuous functions.
- Use simple plots for intuition: Visualizing limits or corners (like in \(|x|\)) makes differentiability easier to grasp.
- Simplify before evaluating: Use algebra (factoring, cancelling) to rewrite functions when limits seem undefined.
- Distinguish continuity from differentiability: Remember, a function can be continuous but not differentiable.
- Test critical points: Especially around \(x = 0\), corners, or undefined values; those are the hotspots for discontinuity or non-smooth behavior.
Common Pitfalls
- Assuming all functions are smooth: Not all continuous functions are differentiable. Don't confuse them.
- Forgetting removable discontinuities: A function might have a limit even when it's undefined at a point.
- Using only numeric evaluation: Relying only on plotting or calculators can miss underlying structure; combine with algebra.
- Overlooking piecewise definitions: For functions defined in parts, always check each region separately.
- Ignoring symmetry: Even/odd functions and absolute values have special properties that affect continuity and smoothness.
Try It Yourself
Limit Challenge: What is \(\displaystyle \lim_{x \to 2} \frac{x^2 - 4}{x - 2} \)?
Step-by-step:
- Factor numerator: \( x^2 - 4 = (x - 2)(x + 2) \)
- Cancel terms: \( \frac{(x - 2)(x + 2)}{x - 2} = x + 2 \) (for \( x \ne 2 \))
Final Answer: \[ \lim_{x \to 2} \frac{x^2 - 4}{x - 2} = 4 \]
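You can verify this symbolically as well; a minimal SymPy sketch (assuming SymPy is available):
import sympy as sp

x = sp.symbols('x')
print(sp.limit((x**2 - 4) / (x - 2), x, 2))  # -> 4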
Continuity Check: Is the function \[ f(x) = \begin{cases} x^2 & x < 1 \\ 3 & x = 1 \\ 2 - x & x > 1 \end{cases} \] continuous at \( x = 1 \)?
Step-by-step:
- Left-hand limit: \( \lim_{x \to 1^-} f(x) = 1^2 = 1 \)
- Right-hand limit: \( \lim_{x \to 1^+} f(x) = 2 - 1 = 1 \)
- But \( f(1) = 3 \)
Not continuous! Final Answer: \[ \lim_{x \to 1} f(x) = 1 \ne f(1) \]
Differentiability Test: Is \( f(x) = |x| \) differentiable at \( x = 0 \)?
Hint:
- Left-hand derivative (for \( x < 0 \)): \( f'(x) = -1 \)
- Right-hand derivative (for \( x > 0 \)): \( f'(x) = 1 \)
The one-sided derivatives don't match at \( x = 0 \), so it is not differentiable there. Final Answer: \[ f(x) = |x| \text{ is not differentiable at } x = 0 \]
Transformation Intuition: What does \( f(x, y) = (x, y^2) \) do to the plane?
Insight:
- Keeps \( x \) the same
- Squashes negative \( y \) to positive
- Bends the grid into a parabolic shape
Visualization: try plotting the grid with original vs. transformed coordinates!
Summary
Let's wrap up the key ideas from this post:
| Topic | Summary |
|---|---|
| Function as Transformation | Warping or reshaping input space, e.g., \( f(x, y) = (x, y^2) \) folds the plane |
| Limits | Describe how a function behaves near a point, not just at it |
| Continuity | A function is continuous if the limit exists and matches the value at the point |
| Differentiability | Smoothness: the function must have a well-defined slope (no corners) |
| Relevance to ML | Essential for gradients, backpropagation, and smooth training |
Got a question or suggestion?
Leave a comment below; I'd love to hear your thoughts or help if something was unclear.
Next Up
Now that you've explored how functions behave through transformations, limits, and smoothness, it's time to zoom in on how they change, and how we measure that change precisely.
In the upcoming post, weโll dive into:
- What a gradient really is, and how it generalizes the derivative to higher dimensions
- The meaning of instantaneous rate of change in both math and machine learning
- How limits give rise to derivatives, step by step
- Using gradients for approximation and direction-finding in complex systems
- How to calculate derivatives symbolically and numerically using Python
These tools are essential for optimization, learning, and understanding the terrain of functions.
Stay tuned; we're about to unlock the core mechanics of calculus and machine learning!