
Big O Notation


Big O notation describes two things:

  1. The amount of memory an algorithm needs to run
  2. The time the algorithm will take to run, based on how much data is passed to the algorithm
  • 'O' stands for order
  • O(1) is constant time
  • O(n) is linear time
  • O(n²) is quadratic time
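
A minimal sketch, in Python for illustration (the language and function names are assumptions, not part of this wiki), showing one function for each of the time complexities listed above:

```python
def get_first(items):
    # O(1): constant time -- one operation, regardless of how many items there are
    return items[0]

def contains(items, target):
    # O(n): linear time -- in the worst case, every item is inspected once
    for item in items:
        if item == target:
            return True
    return False

def has_duplicates(items):
    # O(n²): quadratic time -- every item is compared against every other item
    for i in range(len(items)):
        for j in range(len(items)):
            if i != j and items[i] == items[j]:
                return True
    return False
```

Doubling the input size leaves `get_first` unchanged, roughly doubles the work done by `contains`, and roughly quadruples the work done by `has_duplicates`.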