Computer Science Formulas
Computer science formulas for information theory, algorithm complexity, digital logic, Boolean algebra, and data structure analysis.
Master Theorem
Analyze divide-and-conquer algorithm complexity using the Master Theorem. Solve T(n) = aT(n/b) + f(n) with worked examples.
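A minimal Python sketch of the simplified Master Theorem for recurrences of the form T(n) = a·T(n/b) + n^d (the function name and string output are illustrative, not from the source):

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + n^d via the simplified Master Theorem."""
    c = math.log(a, b)  # critical exponent log_b(a)
    if d < c:
        return f"O(n^{c:.3g})"        # leaves dominate the recursion tree
    if d == c:
        return f"O(n^{d:.3g} log n)"  # work is balanced across all levels
    return f"O(n^{d:.3g})"            # the root's work dominates

# Merge sort: T(n) = 2T(n/2) + n  ->  O(n log n)
print(master_theorem(2, 2, 1))  # O(n^1 log n)
```

The comparison of d against log_b(a) decides which of the three cases applies; for example, Strassen's matrix multiplication (a=7, b=2, d=2) falls into the leaf-dominated case.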
Cache Hit Ratio Formula
The cache hit ratio formula measures cache effectiveness. Learn how to calculate hit rate, miss rate, and effective memory access time with examples.
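One common formulation of effective memory access time weights cache and memory latencies by the hit ratio; a small Python sketch (variable names and the two-level assumption are illustrative):

```python
def effective_access_time(hit_ratio, cache_time_ns, memory_time_ns):
    """EMAT = h*Tc + (1 - h)*Tm for a single cache level.

    A miss is assumed to be served from main memory at Tm.
    """
    return hit_ratio * cache_time_ns + (1 - hit_ratio) * memory_time_ns

# 95% hit ratio, 2 ns cache, 100 ns main memory
print(effective_access_time(0.95, 2, 100))  # ~6.9 ns
```

The miss rate is simply 1 − hit ratio, so even a few percent of misses can dominate the average when memory is far slower than cache.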
TCP Throughput Formula
The TCP throughput formula explains the maximum data transfer rate limited by window size, RTT, and packet loss. Includes worked examples.
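Two common bounds can be sketched in Python: the window-limited ceiling (one window per round trip) and the Mathis approximation for loss-limited throughput (function names are illustrative):

```python
import math

def tcp_throughput_window(window_bytes, rtt_s):
    """Window-limited ceiling: at most one full window per round trip."""
    return window_bytes / rtt_s  # bytes per second

def tcp_throughput_mathis(mss_bytes, rtt_s, loss_prob):
    """Mathis approximation: rate ~ (MSS/RTT) * sqrt(3/2) / sqrt(p)."""
    return (mss_bytes / rtt_s) * math.sqrt(1.5) / math.sqrt(loss_prob)

# 64 KB window over a 50 ms RTT caps out near 1.31 MB/s
print(tcp_throughput_window(65536, 0.05))
```

Note how RTT appears in the denominator of both bounds: halving the round-trip time doubles the ceiling even with no other changes.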
Amdahl's Law
Calculate maximum parallel speedup using Amdahl's Law. Enter the fraction of serial code and processor count to find the theoretical performance limit.
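Amdahl's Law as a one-line Python function (a minimal sketch; the parameter names are illustrative):

```python
def amdahl_speedup(serial_fraction, processors):
    """Speedup = 1 / (s + (1 - s)/N), where s is the serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

# 10% serial code on 8 processors
print(amdahl_speedup(0.10, 8))  # ~4.71x
```

As the processor count grows, the speedup approaches 1/s; with 10% serial code, no amount of hardware can exceed a 10× speedup.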
Shannon's Channel Capacity
Calculate maximum channel data rate in bits per second using Shannon's theorem C = B·log₂(1 + S/N). Enter bandwidth and signal-to-noise ratio in dB.
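A short Python sketch of Shannon's capacity formula, including the dB-to-linear conversion S/N = 10^(dB/10) (function name is illustrative):

```python
import math

def channel_capacity(bandwidth_hz, snr_db):
    """C = B * log2(1 + S/N), with S/N supplied in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Classic phone-line example: 3 kHz bandwidth, 30 dB SNR -> ~29.9 kbit/s
print(channel_capacity(3000, 30))
```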
Big O Notation Reference
Understand Big O notation for algorithm complexity. Covers O(1), O(log n), O(n), O(n log n), O(n²), and O(2ⁿ) with examples.
Hash Collision Probability (Birthday Problem)
Calculate the probability of hash collisions using the birthday problem formula. Understand collision resistance with worked examples.
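The standard exponential approximation of the birthday bound, sketched in Python (the function name is illustrative):

```python
import math

def collision_probability(num_items, num_buckets):
    """P(at least one collision) ~ 1 - e^(-k(k-1)/(2N))."""
    k, n = num_items, num_buckets
    return 1 - math.exp(-k * (k - 1) / (2 * n))

# Classic birthday problem: 23 people, 365 days -> ~0.50
print(collision_probability(23, 365))
```

The same formula explains hash collision resistance: for an n-bit hash (N = 2ⁿ buckets), collisions become likely after roughly 2^(n/2) hashes.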
Sorting Algorithm Complexity
Compare Big-O time and space complexity of sorting algorithms. Covers bubble, merge, quick, heap, and radix sort with best, average, and worst case analysis.
Big O Notation
Reference for Big O — O(1), O(log n), O(n), O(n log n), O(n²). Compares how sorting, searching, and graph algorithms scale with growth rate examples.
Binary Conversion Formulas
Convert numbers between binary, decimal, octal, and hexadecimal with step-by-step formulas. Covers bit operations and two's-complement notation basics.
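Python's built-ins cover these conversions directly; a quick sketch, including an 8-bit two's-complement example via masking:

```python
# Decimal 173 in other bases
print(bin(173))   # 0b10101101
print(oct(173))   # 0o255
print(hex(173))   # 0xad

# Back to decimal from a binary string
print(int("10101101", 2))  # 173

# 8-bit two's complement of -5: invert the bits and add 1,
# equivalent to masking the negative value to 8 bits
print(format(-5 & 0xFF, "08b"))  # 11111011
```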
Boolean Algebra Laws
Reference for Boolean algebra laws: De Morgan, identity, complement, and absorption. Covers AND, OR, NOT, XOR, and NAND truth tables for digital logic.
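De Morgan's laws can be checked exhaustively over all truth-table rows in a few lines of Python:

```python
from itertools import product

# Verify De Morgan's laws for every combination of inputs:
#   not(A and B) == (not A) or (not B)
#   not(A or B)  == (not A) and (not B)
for a, b in product([False, True], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))
    assert (not (a or b)) == ((not a) and (not b))
print("De Morgan's laws hold for all inputs")
```

Exhaustive truth-table checks like this work for any two-variable Boolean identity, since there are only four input rows to test.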
Hamming Distance Formula
Hamming distance counts differing bit positions between two binary strings. Used in error detection, correction codes, DNA analysis, and information theory.
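A minimal Python sketch of Hamming distance over equal-length strings (the function name is illustrative):

```python
def hamming_distance(a: str, b: str) -> int:
    """Count the positions where two equal-length strings differ."""
    if len(a) != len(b):
        raise ValueError("strings must be the same length")
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("1011101", "1001001"))  # 2
```

For integers, the same count can be computed as the number of set bits in the XOR of the two values.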
Nyquist-Shannon Sampling Theorem
Nyquist theorem states sampling rate must be at least 2× the signal frequency. Explains why CD audio uses 44,100 Hz to capture up to 22,050 Hz accurately.
Shannon's Entropy Formula
Reference for Shannon entropy H(X) = -Σ p(x) log₂ p(x), measuring information in bits. Covers data compression, cryptography, and feature selection.
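Shannon entropy from a probability distribution, as a short Python sketch (zero-probability outcomes are skipped since their contribution is zero in the limit):

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum(p * log2(p)) in bits, skipping p = 0 terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per flip
print(shannon_entropy([0.5, 0.5]))  # 1.0
```

A biased coin (say 90/10) yields about 0.47 bits per flip, which is why skewed data compresses better than uniform data.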