# Algorithm: Big O Notation

## Overview

Big O notation describes how efficient an algorithm is. Specifically, it characterizes how an algorithm's running time grows with input size, under the worst-case scenario.

Generally, for sufficiently large `n`, the value of a function is determined by its dominant term (growth rate). Big O captures this dominant term and ignores multiplicative constants.
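As a quick illustration (the cost function `f` below is a hypothetical example, not taken from any particular algorithm), for large `n` the `n^2` term of `3n^2 + 5n + 2` dwarfs the others, so the whole function is `O(n^2)`:

```python
# Hypothetical cost function: f(n) = 3n^2 + 5n + 2.
def f(n):
    return 3 * n**2 + 5 * n + 2

# For large n, the dominant term 3n^2 accounts for nearly all of f(n):
# the ratio f(n) / (3n^2) approaches 1 as n grows.
n = 1_000_000
print(f(n) / (3 * n**2))
```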

Note: in algorithms, the term *brute force* refers to the straightforward approach of trying every possible solution.
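As a small, hedged sketch of brute force (the function name and problem are illustrative, not from the text above), checking every pair of numbers for a target sum tries all possibilities:

```python
# Brute force: try every pair of indices to see whether two numbers
# sum to the target. Checking all pairs makes this O(n^2).
def has_pair_sum(values, target):
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            if values[i] + values[j] == target:
                return True
    return False
```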

### Statements

Given a sequence of statements

```
statement 1
statement 2
...
statement k
```

If we assume each statement is a constant-time operation, then the total time for running the above sequence of statements is as follows:

```
total time = time(statement 1) + time(statement 2) + ... + time(statement k)
```
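A minimal sketch of this rule (the function below is an illustrative example): several constant-time statements in sequence still sum to a constant.

```python
# Three constant-time statements in sequence: the total cost is the sum
# of each, which is still a constant (O(1) + O(1) + O(1) = O(1)).
def sequence_example(x):
    a = x + 1     # statement 1: O(1)
    b = a * 2     # statement 2: O(1)
    return b - 3  # statement 3: O(1)
```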

### Conditions

When presented with an `if-elif-else` statement,

```
if (condition 1): block 1 (sequence of statements)
elif (condition 2): block 2 (sequence of statements)
else: block 3 (sequence of statements)
```

the worst-case time is the slowest of the three possibilities:

```
max(time(block 1), time(block 2), time(block 3))
```
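As a sketch of this rule (function and branches are illustrative assumptions), a conditional whose slowest branch scans a whole list is `O(n)` overall, even though the other branches are constant time:

```python
# Worst-case cost of an if/elif/else is its slowest branch: the else
# branch scans the whole list (O(n)), so the conditional is O(n) overall.
def describe(values):
    if not values:             # block 1: O(1)
        return "empty"
    elif len(values) == 1:     # block 2: O(1)
        return "single"
    else:                      # block 3: O(n), scans all elements
        return "max=" + str(max(values))
```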

## Algorithm Times

### Constant Time `O(1)`

An algorithm is said to run in constant time if it executes in the same amount of time regardless of the input `value` provided.

```
# check_int: check if the supplied value is an integer type.
def check_int(value):
    return isinstance(value, int)
```

### Linear Time `O(n)`

An algorithm is said to run in linear time if the time it takes to execute is directly proportional to the input `value` size. In other words, every time the input size doubles, the runtime doubles.

```
# print_range: print values within the supplied 'value' range.
def print_range(value):
    for x in range(value):
        print(x)
```

### Quadratic Time `O(n^2)`

An algorithm is said to run in quadratic time if the time it takes to execute is directly proportional to the square of the input `value` size.

```
# print_nested_range: print value pairs within the supplied 'value' range.
def print_nested_range(value):
    for x in range(value):
        for y in range(value):
            print(x, y)
```

Note: in the above example, the `range(value)` is identical for both loops. More generally, `z` nested loops over the same `range(value)` give a running time of `O(n^z)`. However, if the two loops iterated over different ranges, of sizes `N` and `M`, then the running time would be `O(N * M)`.
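The different-ranges case can be sketched as follows (the function is an illustrative assumption): the loop body runs exactly `n * m` times.

```python
# Nested loops over two different ranges: the body executes n * m times,
# so the running time is O(n * m).
def count_pairs(n, m):
    count = 0
    for x in range(n):
        for y in range(m):
            count += 1
    return count
```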

### Exponential Time `O(a^n)`

An algorithm is said to run in exponential time if the time it takes to execute grows exponentially with the input `value` size. In other words, increasing the input size by 1 increases the runtime by a factor of `a`, where `a > 1`.

The following is `O(2^n)` exponential time:

```
# fibonacci: return the nth fibonacci number.
def fibonacci(value):
    if value == 0: return 0
    elif value == 1: return 1
    else: return fibonacci(value - 1) + fibonacci(value - 2)
```

### Factorial Time `O(n!)`

An algorithm is said to run in factorial time if the time to execute is directly proportional to the factorial of the input `value` size.

```
# factorial: return the factorial of the given value, printing each step.
def factorial(value):
    if value < 1:
        return 1
    result = value * factorial(value - 1)
    print(str(value) + '! = ' + str(result))
    return result
```

Note: factorial time grows faster than exponential time.
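An algorithm that genuinely performs `O(n!)` work is one that enumerates every permutation of its input, since `n` elements have `n!` orderings. A minimal sketch (the function name is an illustrative assumption):

```python
# permutations: return every ordering of the given items.
# There are n! orderings of n items, so this performs O(n!) work.
def permutations(items):
    if len(items) <= 1:
        return [list(items)]
    result = []
    for i in range(len(items)):
        rest = items[:i] + items[i + 1:]  # all items except items[i]
        for perm in permutations(rest):
            result.append([items[i]] + perm)
    return result
```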

### Logarithmic Time `O(log(n))`

An algorithm is said to run in logarithmic time if the time to execute is logarithmically proportional to the input `value` size. In other words, every time the input size doubles, the runtime increases by a constant.

Note: one example of an algorithm that runs in logarithmic time is the least common ancestor algorithm.

A binary search tree consists of a single root node, where each node (including the root) may contain at most two child nodes (left or right). By definition, the left child node is less than its parent node, while the right child node is greater than or equal to its parent node.

Note: we could instead restrict the left child to be less than or equal to its parent node, in which case the right child would be strictly greater than its parent node.

Note: searching a binary tree that does not enforce this ordering takes linear time.

When searching for values within a binary search tree, the logic is as follows:

```
Base Case: if `value` equals the root node, stop.
           If `value` is less than the root, traverse to the left child and enter *induction*.
           If `value` is greater than or equal to the root, traverse to the right child and enter *induction*.
Induction: if `value` equals the current node, stop.
           If `value` is less than the current node, traverse to the left child and repeat the *induction*.
           If `value` is greater than or equal to the current node, traverse to the right child and repeat the *induction*.
```

More specifically, when searching for a value in the tree, the search space `n` is repeatedly divided by 2 until the value is found. The worst case is therefore the maximum number of times the search space can be split in half before only one element remains.

Note: the search space `n`, refers to the number of nodes, or elements within the binary search tree.

Algorithmically, this can be represented as follows:

```
n / (2^x) = 1
```

This can be rewritten as follows:

```
2^x = n
log(2^x) = log(n)
x * log(2) = log(n)
x = log(n) / log(2) = log_2(n)
```

This means the maximum number of steps required to search a space of `n` elements is `log_2(n)`. Since `log_2(n)` and `log_10(n)` differ only by a constant factor, it is acceptable to generalize binary search tree lookup to `O(log(n))`.
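The same halving logic can be sketched over a sorted list instead of a tree (a minimal illustrative implementation, not from the text above):

```python
# binary_search: return the index of target in a sorted list, or -1.
# Each iteration halves the search space, so at most about log2(n)
# comparisons are needed -- O(log n).
def binary_search(sorted_values, target):
    low, high = 0, len(sorted_values) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_values[mid] == target:
            return mid
        elif sorted_values[mid] < target:
            low = mid + 1   # discard the lower half
        else:
            high = mid - 1  # discard the upper half
    return -1
```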

## Conclusion

### Sequential Statements

The following determines the overall Big O notation when one operation follows another:

```
Complexity time = O(n) + O(n) = O(n)

Complexity time = O(n) + O(n^2) = O(n^2)

Complexity time = O(n) + O(log(n)) = O(n)
```
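A sketch of the second rule (the function is an illustrative assumption): a linear loop followed by a quadratic loop is dominated by the quadratic term.

```python
# An O(n) loop followed by an O(n^2) loop: the costs add, and the
# quadratic term dominates, so the whole function is O(n^2).
def sequential_example(n):
    total = 0
    for x in range(n):      # O(n)
        total += x
    for x in range(n):      # O(n^2)
        for y in range(n):
            total += 1
    return total
```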

### Nested Logic

The following determines the overall Big O notation for operations nested inside one another:

```
Complexity time = O(n) * O(n) = O(n^2)

Complexity time = O(n) * O(log(n)) = O(n log(n))

Complexity time = O(n) * O(1) = O(n)
```
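The `O(n) * O(log(n))` case can be sketched as follows (the function is an illustrative assumption): an outer linear loop around an inner loop that halves its counter each step.

```python
# The outer loop runs n times; the inner loop halves its counter,
# running about log2(n) times per outer iteration, so the total work
# is O(n) * O(log n) = O(n log n).
def n_log_n_example(n):
    steps = 0
    for x in range(n):
        y = n
        while y > 1:
            y //= 2
            steps += 1
    return steps
```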

### Variable Complexity `O(n*sum(m))`

An algorithm is said to have variable complexity if the time it takes to execute depends on the contents of the input, not just its size. Here, the runtime is proportional to the sum of the lengths of the nested sequences.

```
seq1 = [[0, 1], [2], [3, 4, 5]]
s = 0
for seq2 in seq1:
    for x in seq2:
        s += x
```

In the above case, the outer loop runs once per element of the outer sequence. Since the second loop is nested within the first and the sub-sequence lengths vary, the total runtime is obtained by counting how many times `s += x` executes. Specifically, it executes `2 + 1 + 3 = 6` times.
