Physics

Classical mechanics, electricity and circuits, wave properties, and thermodynamics — the physical foundations of computing and networking.

Classical Mechanics

Newton’s laws
1st Law (Inertia):   An object at rest stays at rest; an object in motion
                     stays in motion unless acted upon by a net force.

2nd Law:             F = ma    (force = mass × acceleration)
                     Units: Newton (N) = kg × m/s^2

3rd Law:             Every action has an equal and opposite reaction.
                     You push wall → wall pushes you back.

Work:    W = F × d × cos(θ)    (θ = angle between force and displacement)
Energy:  KE = (1/2)mv^2        (kinetic)
         PE = mgh              (gravitational potential)
Power:   P = W/t = F × v       (watts = joules/second)
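A quick numeric check of the work and power formulas (the 50 N / 10 m / 30° figures are made up for illustration):

```python
import math

# Pushing a crate: 50 N applied at 30° above horizontal, over 10 m.
F, d, theta = 50.0, 10.0, math.radians(30)
W = F * d * math.cos(theta)      # work, in joules

# Average power if the push takes 5 seconds.
t = 5.0
P = W / t                        # watts = joules/second

print(f"Work:  {W:.1f} J")       # 433.0 J
print(f"Power: {P:.1f} W")       # 86.6 W
```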

Conservation of energy:
  Total energy in a closed system remains constant.
  KE + PE = constant (no friction)
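A sketch of the no-friction case in code, tracking a dropped mass (mass and height are illustrative values):

```python
# Dropping a 2 kg mass from 10 m: at every height, KE + PE stays equal
# to the initial potential energy.
m, g, h = 2.0, 9.81, 10.0            # kg, m/s^2, m
E_total = m * g * h                  # all PE at the top: 196.2 J

for y in (10.0, 5.0, 0.0):
    PE = m * g * y
    KE = E_total - PE                # conservation: KE + PE = constant
    v = (2 * KE / m) ** 0.5          # invert KE = (1/2)mv^2 for speed
    print(f"y={y:4.1f} m   PE={PE:6.1f} J   KE={KE:6.1f} J   v={v:5.2f} m/s")
```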

Electricity and Circuits

Fundamental quantities
Voltage (V):     electrical pressure (potential difference)
                 Units: Volts (V) = Joules/Coulomb
Current (I):     flow of charge
                 Units: Amperes (A) = Coulombs/second
Resistance (R):  opposition to current flow
                 Units: Ohms (Ω) = V/A
Power (P):       rate of energy conversion
                 Units: Watts (W) = V × A

Ohm’s law and power
V = IR           Voltage = Current × Resistance
P = VI           Power = Voltage × Current
P = I^2R         Power in terms of current and resistance
P = V^2/R        Power in terms of voltage and resistance

Example: 120V outlet, 15A circuit breaker
  Max power: P = 120 × 15 = 1800W
  A 2000W device trips the breaker
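The breaker example as arithmetic (assuming the breaker trips at any current above its 15 A rating):

```python
# 120 V outlet on a 15 A breaker.
V, I_max = 120.0, 15.0
P_max = V * I_max                 # P = VI -> 1800.0 W

# A 2000 W device draws more current than the breaker allows:
device_W = 2000.0
I_device = device_W / V           # I = P/V, about 16.7 A
print(I_device > I_max)           # True: breaker trips
```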

Series and parallel circuits
Series:
  R_total = R1 + R2 + R3 + ...
  Current same through all components
  Voltage divides

Parallel:
  1/R_total = 1/R1 + 1/R2 + 1/R3 + ...
  Voltage same across all components
  Current divides
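The two total-resistance formulas in code, using three arbitrary example resistor values:

```python
resistors = [100.0, 220.0, 470.0]    # ohms (illustrative values)

r_series = sum(resistors)                           # R1 + R2 + R3
r_parallel = 1.0 / sum(1.0 / r for r in resistors)  # 1/(1/R1 + 1/R2 + 1/R3)

print(f"Series:   {r_series:.1f} Ω")    # 790.0 Ω
print(f"Parallel: {r_parallel:.1f} Ω")  # 60.0 Ω (less than the smallest resistor)
```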

Kirchhoff's laws:
  KCL (current): current into a node = current out
  KVL (voltage): sum of voltages around a loop = 0
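Both laws can be verified on a small circuit, say a 12 V source driving two parallel resistors (component values assumed for illustration):

```python
# Kirchhoff check: 12 V source feeding two resistors in parallel.
V = 12.0
R1, R2 = 4.0, 6.0                    # ohms (illustrative values)

I1, I2 = V / R1, V / R2              # Ohm's law per branch: 3.0 A and 2.0 A
R_eq = (R1 * R2) / (R1 + R2)         # parallel combination: 2.4 Ω
I_source = V / R_eq                  # total current drawn from the source

# KCL: current into the top node equals current out through the branches
assert abs(I_source - (I1 + I2)) < 1e-9

# KVL: around each loop, source voltage minus the resistor drop is zero
assert abs(V - I1 * R1) < 1e-9
assert abs(V - I2 * R2) < 1e-9

print(f"{I_source:.1f} A")           # 5.0 A
```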

Waves and Signals

Wave properties
Wavelength (λ):   distance between crests (meters)
Frequency (f):    cycles per second (Hertz)
Period (T):       time for one cycle = 1/f
Amplitude (A):    maximum displacement
Speed (v):        v = λ × f

Electromagnetic spectrum:
  Radio < Microwave < Infrared < Visible < UV < X-ray < Gamma

  WiFi 2.4 GHz:  λ = 3×10^8 / 2.4×10^9 ≈ 12.5 cm
  WiFi 5 GHz:    λ = 3×10^8 / 5×10^9 = 6 cm
  Light (green):  λ ≈ 550 nm, f ≈ 5.5×10^14 Hz
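These wavelengths follow directly from λ = v/f with v = c (the light figure lands near the 550 nm quoted above):

```python
c = 3e8                          # speed of light in vacuum, m/s

for name, f in [("WiFi 2.4 GHz", 2.4e9),
                ("WiFi 5 GHz  ", 5.0e9),
                ("Green light ", 5.5e14)]:
    wavelength = c / f           # rearranged from v = λ × f
    print(f"{name}  λ = {wavelength:.3e} m")
# 1.250e-01 m (12.5 cm), 6.000e-02 m (6 cm), 5.455e-07 m (~545 nm)
```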

Signal concepts (for networking)
Bandwidth:      range of frequencies a channel can carry
Throughput:     actual data rate achieved
Latency:        time for signal to travel from source to destination

Shannon's theorem:
  C = B × log_2(1 + SNR)
  C = max data rate (bits/sec)
  B = bandwidth (Hz)
  SNR = signal-to-noise ratio (as a linear power ratio, not dB)

  Higher bandwidth → more capacity
  Higher SNR → more capacity
  Doubling bandwidth doubles capacity (at fixed SNR)
  Doubling SNR adds ~1 bit/s per Hz (at high SNR)
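Plugging in numbers, for an assumed 20 MHz channel at 30 dB SNR:

```python
import math

B = 20e6                         # bandwidth: 20 MHz (illustrative)
snr_db = 30.0                    # assumed SNR in dB
snr = 10 ** (snr_db / 10)        # convert to a linear ratio: 1000

C = B * math.log2(1 + snr)       # Shannon capacity, bits/sec
print(f"C = {C / 1e6:.1f} Mbit/s")           # 199.3 Mbit/s

# Doubling the SNR adds about 1 bit/s per Hz:
C2 = B * math.log2(1 + 2 * snr)
print(f"Gain: {(C2 - C) / B:.2f} bit/s/Hz")  # 1.00
```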

Decibels:
  dB = 10 × log_10(P1/P2)      (power ratio)
  3 dB  ≈ double power
  10 dB = 10× power
  -3 dB ≈ half power
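The 3 dB rule is an approximation (10 × log10(2) ≈ 3.01), easy to confirm:

```python
import math

def db(power_ratio):
    """Convert a power ratio to decibels: dB = 10 × log10(P1/P2)."""
    return 10 * math.log10(power_ratio)

print(round(db(2), 2))      # 3.01  (the "3 dB" doubling rule)
print(round(db(10), 2))     # 10.0
print(round(db(0.5), 2))    # -3.01 (halving power)
```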

Thermodynamics

Laws
0th Law:  If A is in thermal equilibrium with B, and B with C,
          then A is in equilibrium with C (defines temperature)

1st Law:  Energy is conserved: ΔU = Q - W
          (internal energy change = heat in - work out)

2nd Law:  Entropy of an isolated system never decreases
          Heat flows spontaneously from hot to cold, not reverse
          No heat engine is 100% efficient

3rd Law:  Entropy approaches a constant minimum as T → 0;
          absolute zero (0 K = -273.15°C) is unattainable

Temperature conversions:
  F = C × 9/5 + 32
  C = (F - 32) × 5/9
  K = C + 273.15
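The three conversions as one-liners, checked against familiar fixed points:

```python
def c_to_f(c): return c * 9 / 5 + 32
def f_to_c(f): return (f - 32) * 5 / 9
def c_to_k(c): return c + 273.15

print(c_to_f(100))       # 212.0  (water boils)
print(f_to_c(32))        # 0.0    (water freezes)
print(c_to_k(-273.15))   # 0.0    (absolute zero)
```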

Entropy and information
Thermodynamic entropy: measure of disorder in a system
Information entropy:   measure of uncertainty in a message

Shannon entropy:  H = -sum(p_i × log_2(p_i))    bits
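The entropy formula in code; the coin and die cases are standard sanity checks:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p_i × log2(p_i)), in bits. Zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.0 bit   (fair coin: maximum uncertainty)
print(shannon_entropy([0.9, 0.1]))    # ~0.47 bits (biased coin: more predictable)
print(shannon_entropy([0.25] * 4))    # 2.0 bits  (fair 4-way choice)
```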

Connection:
  Landauer's principle: erasing 1 bit of information
  dissipates at least kT × ln(2) joules of heat
  At room temperature: ≈ 2.85 × 10^(-21) J per bit

  This sets the theoretical floor on the heat of computation: irreversible
  operations erase information. Real chips dissipate far more than this
  limit, mostly as resistive losses.
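The room-temperature figure above, reproduced (taking room temperature as 298.15 K, i.e. 25°C, an assumption):

```python
import math

k_B = 1.380649e-23               # Boltzmann constant, J/K
T = 298.15                       # room temperature, K (25°C, an assumption)

e_bit = k_B * T * math.log(2)    # Landauer limit: kT·ln(2) per erased bit
print(f"{e_bit:.2e} J per bit")  # 2.85e-21 J per bit
```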

See Also

  • Electronics — applying electricity to digital logic and CPU design

  • Signals — wave physics applied to signal processing

  • Information Theory — entropy bridges physics and computing