Big O notation is a theoretical measure of an algorithm's performance: it describes how the running time or memory usage grows as the problem size increases. It helps to analyse code under different performance scenarios, i.e. best case, average case, and worst case.
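As a brief illustration (a minimal sketch not taken from the original text; the function name and sample data are assumptions), a linear search shows how the three cases differ while the worst-case bound stays O(n):

def linear_search(items, target):
    # Return the index of target in items, or -1 if it is not present.
    for i, value in enumerate(items):
        if value == target:
            return i          # found after i + 1 comparisons
    return -1                 # examined all n items without success

data = [7, 3, 9, 4, 1]

print(linear_search(data, 7))   # best case, O(1): target is the first element
print(linear_search(data, 4))   # average case, O(n): about n / 2 comparisons
print(linear_search(data, 2))   # worst case, O(n): target absent, all n examined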
Algorithm Analysis using Big O notation
