Calculus

Limits

Intuition
lim_{x→c} f(x) = L

"As x approaches c, f(x) approaches L"

The function doesn't need to BE L at c — it just needs to GET CLOSE.
A limit exists when the left and right limits agree.

lim_{x→c^-} f(x) = L  (left-hand limit)
lim_{x→c^+} f(x) = L  (right-hand limit)
The limit exists iff both one-sided limits exist and are equal.

Key limits
lim_{x→0} sin(x)/x = 1
lim_{x→inf} (1 + 1/x)^x = e ≈ 2.71828
lim_{x→0} (e^x - 1)/x = 1
lim_{x→inf} e^(-x) = 0
lim_{x→inf} ln(x)/x = 0         (log grows slower than linear)
lim_{x→inf} x^n/e^x = 0         (exponential beats polynomial)
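These key limits can be sanity-checked numerically by plugging in values near the limit point — a rough check, not a proof (the sample points 1e-6, 1e6, 1e9 are arbitrary illustrative choices):

```python
import math

# lim_{x->0} sin(x)/x = 1: evaluate at a small x
print(math.sin(1e-6) / 1e-6)        # very close to 1

# lim_{x->inf} (1 + 1/x)^x = e: evaluate at a large x
print((1 + 1/1e6) ** 1e6)           # very close to e = 2.71828...

# lim_{x->inf} ln(x)/x = 0: evaluate at a large x
print(math.log(1e9) / 1e9)          # very close to 0
```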

Limit laws
lim [f(x) + g(x)] = lim f(x) + lim g(x)
lim [f(x) * g(x)] = lim f(x) * lim g(x)
lim [f(x) / g(x)] = lim f(x) / lim g(x)    (provided lim g(x) != 0)
lim [c * f(x)]    = c * lim f(x)
lim [f(x)]^n      = [lim f(x)]^n

Derivatives

Definition
f'(x) = lim_{h→0} [f(x+h) - f(x)] / h

Geometric meaning:  slope of the tangent line at x
Physical meaning:   instantaneous rate of change

If y = f(x), equivalent notations:
  f'(x)  =  dy/dx  =  d/dx[f(x)]  =  y'
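The limit definition translates directly into a numerical approximation: for small h, the difference quotient is close to f'(x). A minimal sketch (the function name and the step h = 1e-6 are illustrative choices):

```python
import math

def derivative(f, x, h=1e-6):
    """Forward difference quotient, straight from the limit definition."""
    return (f(x + h) - f(x)) / h

print(derivative(lambda x: x**2, 3.0))   # ~6.0, matching 2x at x = 3
print(derivative(math.sin, 0.0))         # ~1.0, matching cos(0)
```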

Derivative rules
Constant:       d/dx[c] = 0
Power rule:     d/dx[x^n] = n * x^(n-1)
Constant mult:  d/dx[c*f] = c * f'
Sum/diff:       d/dx[f +/- g] = f' +/- g'
Product rule:   d/dx[f*g] = f'g + fg'
Quotient rule:  d/dx[f/g] = (f'g - fg') / g^2
Chain rule:     d/dx[f(g(x))] = f'(g(x)) * g'(x)
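The product and chain rules can be spot-checked by comparing each formula against a numerical difference quotient (the test point x = 1.3 and the sample functions are arbitrary choices for illustration):

```python
import math

def derivative(f, x, h=1e-6):
    # Central difference: more accurate than the one-sided quotient.
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.3  # arbitrary test point

# Product rule: d/dx[sin(x) * x^2] = cos(x)*x^2 + sin(x)*2x
numeric = derivative(lambda t: math.sin(t) * t**2, x)
formula = math.cos(x) * x**2 + math.sin(x) * 2 * x
print(numeric, formula)   # should agree closely

# Chain rule: d/dx[sin(x^2)] = cos(x^2) * 2x
numeric2 = derivative(lambda t: math.sin(t**2), x)
formula2 = math.cos(x**2) * 2 * x
print(numeric2, formula2)
```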

Common derivatives
d/dx[x^n]     = n*x^(n-1)
d/dx[e^x]     = e^x
d/dx[a^x]     = a^x * ln(a)
d/dx[ln(x)]   = 1/x
d/dx[log_a(x)]= 1 / (x * ln(a))
d/dx[sin(x)]  = cos(x)
d/dx[cos(x)]  = -sin(x)
d/dx[tan(x)]  = sec^2(x)
d/dx[sqrt(x)] = 1 / (2*sqrt(x))

Why derivatives matter (engineering context)
Position s(t)   →  Velocity v(t) = s'(t)  →  Acceleration a(t) = v'(t)

Signal processing: rate of change of a signal
Optimization:      find maxima/minima (f'(x) = 0)
Taylor series:     approximate functions as polynomials
Gradient descent:  machine learning optimization
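The gradient descent idea above can be sketched in one dimension: repeatedly step opposite the derivative until you settle where f'(x) = 0. The objective f(x) = (x - 3)^2, the learning rate, and the step count are illustrative choices, not from the text:

```python
def grad_descent(df, x0, lr=0.1, steps=100):
    """Minimize by stepping against the derivative df."""
    x = x0
    for _ in range(steps):
        x -= lr * df(x)
    return x

# f(x) = (x - 3)^2 has derivative 2(x - 3) and its minimum at x = 3.
x_min = grad_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)   # converges toward 3, where f'(x) = 0
```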

Integrals

Definition
Indefinite integral (antiderivative):
  ∫ f(x) dx = F(x) + C    where F'(x) = f(x)

Definite integral (area):
  ∫_a^b f(x) dx = F(b) - F(a)    (Fundamental Theorem of Calculus)

Geometric meaning: signed area between f(x) and x-axis
Physical meaning:  accumulation (distance from velocity, etc.)
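The "signed area" meaning suggests a direct check: approximate the definite integral with a Riemann sum and compare against F(b) - F(a) from the Fundamental Theorem. A sketch using midpoint rectangles (the function name and n are illustrative):

```python
import math

def riemann(f, a, b, n=100_000):
    """Midpoint Riemann sum with n subintervals."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

# ∫_0^pi sin(x) dx = [-cos(x)] from 0 to pi = -(-1) - (-1) = 2
print(riemann(math.sin, 0, math.pi))   # ~2.0
```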

Common integrals
∫ x^n dx       = x^(n+1) / (n+1) + C     (n != -1)
∫ 1/x dx       = ln|x| + C
∫ e^x dx       = e^x + C
∫ a^x dx       = a^x / ln(a) + C
∫ sin(x) dx    = -cos(x) + C
∫ cos(x) dx    = sin(x) + C
∫ sec^2(x) dx  = tan(x) + C
∫ 1/sqrt(x) dx = 2*sqrt(x) + C

Integration rules
Constant mult:  ∫ c*f(x) dx = c * ∫ f(x) dx
Sum/diff:       ∫ [f +/- g] dx = ∫ f dx +/- ∫ g dx
u-substitution: ∫ f(g(x))*g'(x) dx = ∫ f(u) du    (u = g(x))

Example (u-substitution):
  ∫ 2x * e^(x^2) dx
  Let u = x^2, du = 2x dx
  = ∫ e^u du = e^u + C = e^(x^2) + C
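The antiderivative found by u-substitution can be verified numerically on a definite interval: by the Fundamental Theorem, ∫_0^1 2x·e^(x^2) dx should equal e^(1) - e^(0) = e - 1 (the interval [0, 1] and n are illustrative choices):

```python
import math

def riemann(f, a, b, n=100_000):
    """Midpoint Riemann sum with n subintervals."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

numeric = riemann(lambda x: 2 * x * math.exp(x**2), 0, 1)
exact = math.e - 1   # F(1) - F(0) with F(x) = e^(x^2)
print(numeric, exact)   # the two values agree closely
```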

Applications in Computing

Big-O and growth rates
Calculus gives the hierarchy of growth:

  O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)

Limits prove these relationships:
  lim_{n→inf} log(n)/n = 0        (log grows slower than linear)
  lim_{n→inf} n^k/e^n = 0         (polynomial < exponential)

Derivative tells you the rate of growth:
  d/dn[n^2] = 2n                  (grows linearly with n)
  d/dn[2^n] = 2^n * ln(2)        (grows as fast as itself!)
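Watching the ratios from those limits shrink as n grows gives numerical evidence for the hierarchy (the sample values of n are arbitrary; n is kept small enough that e^n doesn't overflow a float):

```python
import math

# log(n)/n and n^3/e^n both head toward 0 as n grows.
for n in [10, 50, 100]:
    print(n, math.log(n) / n, n**3 / math.exp(n))
```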

Summation formulas (discrete calculus)
sum_{i=1}^{n} 1     = n                       O(n)
sum_{i=1}^{n} i     = n(n+1)/2                O(n^2)
sum_{i=1}^{n} i^2   = n(n+1)(2n+1)/6          O(n^3)
sum_{i=0}^{n} r^i   = (r^(n+1) - 1)/(r - 1)  geometric sum (r != 1)
sum_{i=1}^{n} 1/i   ≈ ln(n) + 0.5772          harmonic number
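The closed forms can be checked against brute-force sums for a sample n (the values n = 20 and r = 3 are arbitrary; integer arithmetic keeps the comparison exact):

```python
n, r = 20, 3

assert sum(1 for i in range(1, n + 1)) == n
assert sum(i for i in range(1, n + 1)) == n * (n + 1) // 2
assert sum(i**2 for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(r**i for i in range(n + 1)) == (r**(n + 1) - 1) // (r - 1)
print("all closed forms match")
```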

These appear constantly in algorithm analysis:
  Nested loops over n:  sum i = O(n^2)
  Divide and conquer:   geometric sum with r = 1/2
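The divide-and-conquer case can be sketched directly: if each level does half the work of the one before, total work is n * (1 + 1/2 + 1/4 + ...) < 2n, i.e. O(n). The starting size n = 1024 is an illustrative choice:

```python
# Sum the per-level work of a halving recursion: geometric with r = 1/2.
n = 1024
total = 0.0
level_work = float(n)
while level_work >= 1:
    total += level_work
    level_work /= 2
print(total)   # stays below the geometric-series bound 2n = 2048
```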