
๐Ÿ” From Limits to Smoothness: Transformations, Limits, Continuity & Differentiability

Understand functions as geometric transformations, visualize limits, and grasp continuity and differentiability with Python and math-based insights.

๐Ÿ” From Limits to Smoothness: Transformations, Limits, Continuity & Differentiability

๐Ÿ” Functions as Transformations

A function can be seen as a machine: it takes input values and returns output values. But in a more visual and geometric sense, a function can be thought of as a transformation, changing or warping the input space into a new shape.



📚 Mathematical Idea

If we start with a point \( x \in \mathbb{R} \), a function \( f(x) \) maps it to another real number \( y \).
For a 2D transformation:

\[ f(x, y) = (x, y^2) \]

This "bends" the coordinate plane vertically, keeping \( x \) unchanged but squaring the \( y \)-coordinate.

🧪 Python Example: Transforming a Grid

The code below plots a grid before and after the transformation \( f(x, y) = (x, y^2) \).

import numpy as np
import matplotlib.pyplot as plt

# Build a 20x20 grid of points covering [-2, 2] x [-2, 2]
x, y = np.meshgrid(np.linspace(-2, 2, 20), np.linspace(-2, 2, 20))

# Apply f(x, y) = (x, y^2): x stays the same, y is squared
x_new = x
y_new = y ** 2

fig, axs = plt.subplots(1, 2, figsize=(10, 4))
axs[0].scatter(x, y, s=4, color='gray')
axs[0].set_title("Original Grid")
axs[0].axis('equal')

axs[1].scatter(x_new, y_new, s=4, color='teal')
axs[1].set_title("Transformed Grid: $f(x, y) = (x, y^2)$")
axs[1].axis('equal')

plt.tight_layout()
plt.show()

Left panel: the original 2D grid. Right panel: the transformation bends the grid vertically, curving the \( y \)-coordinates upward while preserving the \( x \)-axis.

📉 What Is a Limit?

A limit describes what a function is approaching as the input gets closer and closer to a specific value, even if the function is not defined at that value.

📚 Mathematical Definition

\[ \lim_{x \to a} f(x) = L \]

This means: as \( x \) gets arbitrarily close to \( a \), the value of \( f(x) \) gets arbitrarily close to \( L \).

๐Ÿ” Example:

Let:

\[ f(x) = \frac{x^2 - 1}{x - 1} \]

At \( x = 1 \), the function is undefined, but we can factor:

\[ f(x) = \frac{(x - 1)(x + 1)}{x - 1} = x + 1 \quad \text{(for } x \neq 1 \text{)} \]

So:

\[ \lim_{x \to 1} f(x) = 2 \]
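
Numerically, you can watch \( f(x) \) approach 2 from both sides. A quick sketch in plain Python (the step sizes are illustrative choices):

def f(x):
    return (x**2 - 1) / (x - 1)

# Evaluate f at points approaching x = 1 from the left and from the right
for k in range(1, 6):
    h = 10 ** -k
    print(f(1 - h), f(1 + h))  # both columns approach 2 as h shrinks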

🧪 Python Visualization

import numpy as np
import matplotlib.pyplot as plt

# Note: this grid never hits x = 1 exactly, so there is no division by zero
x = np.linspace(0.5, 1.5, 200)
y = (x**2 - 1) / (x - 1)

plt.plot(x, y, label=r'$f(x) = \frac{x^2 - 1}{x - 1}$')
plt.axvline(1, color='red', linestyle='--', label='x = 1')
plt.axhline(2, color='green', linestyle='--', label='Limit = 2')
plt.xlabel('x')
plt.ylabel('f(x)')
plt.title('Limit Approaching x = 1')
plt.legend()
plt.grid(True)
plt.show()

Plot showing the limit of \( f(x) = \frac{x^2 - 1}{x - 1} \) as \( x \to 1 \).


🧩 Examples of Limits in Functions

🔹 One-Sided Limits

\[ \lim_{x \to a^-} f(x) \quad \text{and} \quad \lim_{x \to a^+} f(x) \]

The two-sided limit exists only when both one-sided limits exist and agree.

🔹 Piecewise Example:

\[ f(x) = \begin{cases} x^2 & \text{if } x < 1 \\\\ 3 & \text{if } x = 1 \\\\ 2 - x & \text{if } x > 1 \end{cases} \]

Then:

\[ \lim_{x \to 1^-} f(x) = 1 \quad \text{and} \quad \lim_{x \to 1^+} f(x) = 1 \Rightarrow \lim_{x \to 1} f(x) = 1 \]

But:

\[ f(1) = 3 \neq \lim_{x \to 1} f(x) \]

So the function has a removable discontinuity at \( x = 1 \).


Figure: piecewise function with a removable discontinuity at \( x = 1 \).
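
To reproduce a figure like this, here is a minimal sketch (reusing the numpy and matplotlib imports from above; the open circle marking the "hole" at \( x = 1 \) is a plotting choice, not part of the function):

import numpy as np
import matplotlib.pyplot as plt

# Left branch: x^2 for x < 1; right branch: 2 - x for x > 1
x_left = np.linspace(0, 1, 100, endpoint=False)
x_right = np.linspace(1, 2, 100)[1:]  # drop x = 1 itself

plt.plot(x_left, x_left**2, color='blue', label=r'$x^2$ ($x < 1$)')
plt.plot(x_right, 2 - x_right, color='blue', label=r'$2 - x$ ($x > 1$)')
plt.scatter([1], [1], facecolors='none', edgecolors='blue')  # limit value (hole)
plt.scatter([1], [3], color='red', label=r'$f(1) = 3$')      # actual value
plt.title('Removable Discontinuity at x = 1')
plt.legend()
plt.grid(True)
plt.show()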

🔗 Continuity and Differentiability

✅ Continuity

A function \( f(x) \) is continuous at \( x = a \) if:

  1. \( f(a) \) is defined
  2. \( \lim_{x \to a} f(x) \) exists
  3. \( \lim_{x \to a} f(x) = f(a) \)
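
As a quick sanity check, you can test these three conditions with SymPy for the earlier example \( f(x) = \frac{x^2 - 1}{x - 1} \) at \( x = 1 \) (a sketch, assuming sympy is installed):

from sympy import symbols, limit

x = symbols('x')
f = (x**2 - 1) / (x - 1)

# Condition 2: the two-sided limit exists (both one-sided limits agree)
print(limit(f, x, 1, dir='-'))  # 2
print(limit(f, x, 1, dir='+'))  # 2

# Condition 1 fails: substituting x = 1 gives 0/0, which SymPy reports as nan,
# so f(1) is undefined and f is not continuous at x = 1
print(f.subs(x, 1))  # nan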

✅ Differentiability

A function is differentiable at \( x = a \) if the derivative

\[ f'(a) = \lim_{h \to 0} \frac{f(a + h) - f(a)}{h} \]

exists. Differentiability implies continuity, but not the other way around: a continuous function can still fail to be differentiable at sharp corners or cusps.

๐Ÿ” Example: \( f(x) = |x| \)

  • Continuous everywhere
  • Not differentiable at \( x = 0 \): the left- and right-hand slopes (-1 and +1) disagree at the sharp corner

🧪 Python Visualization

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-2, 2, 400)
y = np.abs(x)

plt.plot(x, y, label=r'$f(x) = |x|$', color='blue')
plt.axvline(0, color='red', linestyle='--', label='x = 0')
plt.xlabel('x')
plt.ylabel('f(x)')
plt.title('Absolute Value: Continuous but Not Differentiable at x = 0')
plt.legend()
plt.grid(True)
plt.show()

The absolute value function is continuous but not differentiable at \( x = 0 \).


🤖 Relevance to Machine Learning

Understanding limits, continuity, and differentiability is essential for many foundational ideas in machine learning and deep learning:

  • 🧠 Gradient Descent & Optimization
    Most learning algorithms (like gradient descent) rely on functions being continuous and differentiable so we can compute smooth gradients to minimize loss.

  • ๐Ÿ” Backpropagation
    Neural networks use the chain rule to propagate error gradients backward โ€” which requires differentiable activation functions and loss functions.

  • 📉 Loss Surfaces
    The cost or loss function must be smooth and continuous for optimizers to navigate toward minima efficiently. Sharp discontinuities can trap or mislead optimization.

  • 🧩 Activation Functions
    Common activations (ReLU, sigmoid, tanh) are chosen based on their continuity and differentiability, affecting both model capacity and training dynamics.

  • ๐Ÿ“ Regularization & Generalization
    Techniques like L2 regularization implicitly promote smoother (more continuous and differentiable) functions, which helps with generalization and avoiding overfitting.

  • โš ๏ธ Adversarial Robustness
    Discontinuous or non-differentiable spots in the model behavior can be exploited by adversarial examples. Smoothness leads to more stable and robust models.


🧭 Key Insight: If your model isn't differentiable, gradient-based learning breaks down. Smoothness isn't just elegant; it's essential!
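
To make the gradient-descent bullet above concrete, here is a minimal sketch that minimizes the smooth, differentiable loss \( L(w) = (w - 3)^2 \) (the loss, starting point, and learning rate are illustrative choices, not from the post):

# Gradient descent on L(w) = (w - 3)^2, whose derivative is 2(w - 3)
def loss(w):
    return (w - 3) ** 2

def grad(w):
    return 2 * (w - 3)  # defined everywhere, so every update step is well-defined

w = 0.0    # illustrative starting point
lr = 0.1   # illustrative learning rate
for _ in range(100):
    w -= lr * grad(w)

print(w, loss(w))  # w converges toward the minimizer w = 3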


🧠 Level Up
  • 🔄 Functions as Mappings: Think of every function as a way to reshape input space; this is crucial for understanding transformations in deep learning layers.
  • 📏 Limits and Precision: Mastering limits builds your intuition for numerical stability, convergence, and approximation in ML algorithms.
  • 📐 Continuity in Practice: Continuous loss functions ensure smooth training. Discontinuities can cause sudden optimization failures.
  • 🧮 Differentiability = Learnability: If a function isn't differentiable, gradient-based methods (like backpropagation) won't work.
  • 📉 Piecewise Behavior: Recognize when piecewise models like ReLU introduce non-differentiable points, and how this affects learning speed.
  • 🧩 Function Smoothness: Smooth, continuous, and differentiable models generalize better and are more robust to noisy data.

✅ Best Practices
  • 📌 Clearly define your domain: Before analyzing limits or continuity, specify where the function is defined and what happens near the edges.
  • 🔍 Check one-sided limits: Always test left-hand and right-hand limits, especially for piecewise or discontinuous functions.
  • 📉 Use simple plots for intuition: Visualizing limits or corners (like in \(|x|\)) makes differentiability easier to grasp.
  • 🧮 Simplify before evaluating: Use algebra (factoring, cancelling) to rewrite functions when limits seem undefined.
  • 🧠 Distinguish continuity from differentiability: Remember, a function can be continuous but not differentiable.
  • 💡 Test critical points: Especially around \(x = 0\), corners, or undefined values; those are the hotspots for discontinuity or non-smooth behavior.

โš ๏ธ Common Pitfalls
  • โŒ Assuming all functions are smooth: Not all continuous functions are differentiable. Donโ€™t confuse them.
  • โŒ Forgetting removable discontinuities: A function might have a limit even when itโ€™s undefined at a point.
  • โŒ Using only numeric evaluation: Relying only on plotting or calculators can miss underlying structure โ€” combine with algebra.
  • โŒ Overlooking piecewise definitions: For functions defined in parts, always check each region separately.
  • โŒ Ignoring symmetry: Functions like even/odd functions or absolute values have special properties that affect continuity and smoothness.

📌 Try It Yourself

📉 Limit Challenge: What is \(\displaystyle \lim_{x \to 2} \frac{x^2 - 4}{x - 2} \)? 🧠 Step-by-step:
- Factor the numerator: \( x^2 - 4 = (x - 2)(x + 2) \)
- Cancel terms: \( \frac{(x - 2)(x + 2)}{x - 2} = x + 2 \) (for \( x \ne 2 \))
✅ Final Answer: \[ \lim_{x \to 2} \frac{x^2 - 4}{x - 2} = 4 \]
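
You can confirm the answer with SymPy (a quick check, assuming sympy is available):

from sympy import symbols, limit

x = symbols('x')
print(limit((x**2 - 4) / (x - 2), x, 2))  # 4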

📈 Continuity Check: Is the function \[ f(x) = \begin{cases} x^2 & x < 1 \\\\ 3 & x = 1 \\\\ 2 - x & x > 1 \end{cases} \] continuous at \( x = 1 \)? 🧠 Step-by-step:
- Left-hand limit: \( \lim_{x \to 1^-} f(x) = 1^2 = 1 \)
- Right-hand limit: \( \lim_{x \to 1^+} f(x) = 2 - 1 = 1 \)
- But \( f(1) = 3 \) 🤔
❌ Not continuous! ✅ Final Answer: \[ \lim_{x \to 1} f(x) = 1 \ne f(1) \]

๐Ÿ“ Differentiability Test: Is \( f(x) = |x| \) differentiable at \( x = 0 \)? ๐Ÿง  Hint:
- Left-hand derivative: \( f'(x) = -1 \)
- Right-hand derivative: \( f'(x) = 1 \)
โŒ Derivatives don't match at \( x = 0 \), so not differentiable! โœ… Final Answer: \[ f(x) = |x| \text{ is not differentiable at } x = 0 \]

๐ŸŒ Transformation Intuition: What does \( f(x, y) = (x, y^2) \) do to the plane? ๐Ÿง  Insight:
- Keeps \( x \) the same - Squashes negative \( y \) to positive - Bends the grid into a parabolic shape โœ… Visualization: - Try plotting the grid with original vs transformed coordinates!

✅ Summary

Let's wrap up the key ideas from this post:


  • Function as Transformation: warping or reshaping input space; e.g., \( f(x, y) = (x, y^2) \) folds the plane
  • Limits: describe how a function behaves near a point, not just at it
  • Continuity: the function is continuous if the limit exists and matches the value at the point
  • Differentiability: smoothness; the function must have a well-defined slope (no corners)
  • Relevance to ML: essential for gradients, backpropagation, and smooth training

🧭 Next Up

Now that you've explored how functions behave through transformations, limits, and smoothness, it's time to zoom in on how they change, and how we measure that change precisely.

In the upcoming post, we'll dive into:

  • What a gradient really is, and how it generalizes the derivative to higher dimensions
  • The meaning of instantaneous rate of change in both math and machine learning
  • How limits give rise to derivatives, step by step
  • Using gradients for approximation and direction-finding in complex systems
  • How to calculate derivatives symbolically and numerically using Python

🧠 These tools are essential for optimization, learning, and understanding the terrain of functions.

Stay tuned: we're about to unlock the core mechanics of calculus and machine learning!

📺 Explore the Channel


🎥 Hoda Osama AI

Learn statistics and machine learning concepts step by step with visuals and real examples.


💬 Got a Question?

Leave a comment or open an issue on GitHub; I love connecting with other learners and builders.

This post is licensed under CC BY 4.0 by the author.