Algorithms and Computation
The value of learning about algorithms lies in:
- Solving computational problems
- Proving correctness
- Arguing the efficiency of your algorithm
- Communicating your ideas to others
Solving computational problems
A computational problem is a binary relation between the problem inputs and the correct outputs. An example of a computational problem is sorting numbers in an array or list.
Solving a problem involves designing an algorithm. An algorithm is a function that accepts an arbitrary input and deterministically maps it to a correct output.
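As a minimal sketch (Python; insertion sort is my own choice of example, not one named in these notes), one algorithm for the sorting problem maps any input list to its sorted output:

```python
def insertion_sort(values):
    """Return a new list containing the numbers in `values` in ascending order.

    This is one possible algorithm for the sorting problem: for every input
    list it deterministically produces the correct sorted output.
    """
    result = list(values)  # work on a copy so the input is left untouched
    for i in range(1, len(result)):
        key = result[i]
        j = i - 1
        # Shift elements larger than `key` one position to the right.
        while j >= 0 and result[j] > key:
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = key
    return result

print(insertion_sort([5, 2, 9, 1]))  # [1, 2, 5, 9]
```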
Proving correctness
A program’s correctness is proven by induction, typically on the size of the input or the number of steps the program takes.
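A minimal sketch (Python; the recursive `list_max` function is a hypothetical example, not from these notes) of how an inductive argument attaches to a program:

```python
def list_max(values):
    """Return the largest element of a non-empty list.

    Correctness by induction on the length n of `values`:
    - Base case (n == 1): the only element is trivially the maximum.
    - Inductive step: assume the recursive call returns the maximum of the
      first n - 1 elements; comparing it with the last element then yields
      the maximum of all n elements.
    """
    if len(values) == 1:                 # base case
        return values[0]
    rest_max = list_max(values[:-1])     # inductive hypothesis applies here
    return rest_max if rest_max > values[-1] else values[-1]

assert list_max([3, 7, 2]) == 7
```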
Arguing efficiency
An algorithm’s efficiency is measured by how quickly it produces the correct output. Rather than wall-clock time, efficiency is measured by counting the number of basic operations (ops) the algorithm performs.
The efficiency of an algorithm is expressed using asymptotic notation. The common notations include:
- Big O: an upper bound on the running time (worst case)
- Omega (Ω): a lower bound on the running time (best case); the linear-search sketch after this list contrasts the two
- Theta (Θ): a tight bound, used when the upper and lower bounds match
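As a sketch of how the bounds differ for one algorithm (Python; linear search is my own example), the best and worst cases of a search can be constant and linear respectively:

```python
def linear_search(values, target):
    """Return the index of `target` in `values`, or -1 if absent.

    Best case (Omega(1)): the target is the first element, one comparison.
    Worst case (O(n)): the target is last or missing, n comparisons.
    """
    for index, value in enumerate(values):
        if value == target:
            return index
    return -1

print(linear_search([4, 8, 15, 16], 4))   # best case: found immediately
print(linear_search([4, 8, 15, 16], 42))  # worst case: scans every element
```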
Example complexity annotations:
- O(1): constant. The computation time will be the same regardless of the input size.
- O(log n): logarithmic.
- O(n): linear. Computation time doubles when the input size doubles.
- O(n log n): log-linear
- O(n^2): quadratic (see the operation-counting sketch after this list)
- O(n^c): polynomial, for a constant c
- O(2^n): exponential
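A rough sketch (Python; the counting loops are my own illustration) of how operation counts grow for the constant, linear, and quadratic classes:

```python
def count_ops(n):
    """Count loop iterations for constant, linear, and quadratic work on input size n."""
    constant = 1                                             # O(1): independent of n
    linear = sum(1 for _ in range(n))                        # O(n): grows with n
    quadratic = sum(1 for _ in range(n) for _ in range(n))   # O(n^2): grows with n squared
    return constant, linear, quadratic

for n in (10, 20, 40):
    print(n, count_ops(n))
# Doubling n doubles the linear count and quadruples the quadratic count.
```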
Illustration of the complexities (figure from Towards Data Science).
Other learnings
- Inductive reasoning: a method of drawing conclusions by going from the specific to the general.
- Asymptotic analysis: computing the running time of an operation in mathematical units of computation (used in the study of algorithms).
- Pigeon hole principle: if n + 1 pigeons occupy n holes, then some hole must contain at least 2 pigeons. For example, if 3 pigeons occupy 2 holes, some hole must contain at least 2 pigeons (see the sketch below).
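A small sketch (Python; my own illustration) of the pigeon hole principle: placing n + 1 pigeons into n holes always leaves some hole with at least two pigeons:

```python
import random

def place_pigeons(num_holes):
    """Place num_holes + 1 pigeons into num_holes holes at random and
    return the occupancy count for each hole."""
    holes = [0] * num_holes
    for _ in range(num_holes + 1):           # n + 1 pigeons
        holes[random.randrange(num_holes)] += 1
    return holes

holes = place_pigeons(5)
# By the pigeon hole principle, at least one hole holds 2 or more pigeons.
assert max(holes) >= 2
print(holes)
```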