Big-O Notation and Time Complexity in Python

Table of contents:

  • 1. What is Time Complexity?
  • 2. Big-O Notation
  • 3. Examples in Python
  • 4. Why this matters?
  • 5. Rules of thumb in Python


1. What is Time Complexity?

Time complexity is a way to measure how the runtime of your program grows as the input size increases.
It helps you predict performance without actually running the code with huge inputs.

  • Input size (n) - the number of elements your function processes (length of a list, number of nodes, etc.).
  • Runtime growth - how much slower your algorithm gets when n becomes very large.

Think of it as a way to classify algorithms by efficiency, not as an exact timing.
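
To see this growth in practice, you can time the same function on inputs of increasing size. Below is a rough sketch using the standard timeit module and a made-up helper, sum_list; the absolute timings depend on your machine, but the growth trend is what matters:

import timeit

def sum_list(data):
    # a simple O(n) function used only for this demonstration
    total = 0
    for x in data:
        total += x
    return total

for n in (1_000, 10_000, 100_000):
    data = list(range(n))
    # time 100 runs of sum_list on a list of size n
    t = timeit.timeit(lambda: sum_list(data), number=100)
    print(f"n={n:>7}: {t:.4f} s")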


2. Big-O Notation

Big-O notation expresses the upper bound of an algorithm's growth rate.
It answers the question: "How bad can it get, in the worst case?"

Here are the most common complexities, from fastest to slowest:

Big-O       Name         Example in Python
O(1)        Constant     Accessing a list element: my_list[0]
O(log n)    Logarithmic  Binary search in a sorted list
O(n)        Linear       Iterating through a list: for x in my_list:
O(n log n)  Log-linear   Sorting: sorted(my_list) (Timsort)
O(n^2)      Quadratic    Nested loops over the same list
O(2^n)      Exponential  Solving the Traveling Salesman Problem by brute force
O(n!)       Factorial    Generating all permutations of n items
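
To make the "worst case" concrete: a plain linear search (linear_search is a hypothetical helper written only for this illustration) returns immediately if the target happens to be first, but Big-O describes the worst case, where the target is last or absent and all n elements must be checked:

def linear_search(items, target):
    # return the index of target, or -1 if it is not present
    for i, value in enumerate(items):
        if value == target:
            return i          # best case: found right away
    return -1                 # worst case: scanned all n elements, O(n)

data = list(range(1_000_000))
linear_search(data, 0)        # best case: one comparison
linear_search(data, -1)       # worst case: n comparisons, which is what O(n) describes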

3. Examples in Python

Let's see real Python code examples.

O(1) - Constant Time

Does not depend on n:

my_list = [10, 20, 30, 40]
print(my_list[2]) # O(1)

No matter how long the list is, accessing an element takes the same time.
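
A quick (if rough) way to convince yourself is to index into a small list and a huge one and compare the timings. This sketch uses the standard timeit module; exact numbers will vary by machine:

import timeit

small = list(range(10))
huge = list(range(10_000_000))

# both lookups take roughly the same time, regardless of list length
print(timeit.timeit(lambda: small[5], number=1_000_000))
print(timeit.timeit(lambda: huge[5_000_000], number=1_000_000))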


O(n) - Linear Time

Grows proportionally with input size:

for x in my_list: # O(n)
    print(x)

If the list doubles in size, runtime roughly doubles too.
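
Most linear operations look the same under the hood: one pass that does constant work per element. Here is a small sketch with a made-up helper, count_even:

def count_even(numbers):
    # one pass over the input, so O(n)
    count = 0
    for x in numbers:        # n iterations
        if x % 2 == 0:       # O(1) work per element
            count += 1
    return count

print(count_even([10, 20, 30, 40]))  # 4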


O(n²) - Quadratic Time

Nested loops ➡️ the runtime grows much faster:

for x in my_list:
    for y in my_list:
        print(x, y)

If n = 10, we do 10 * 10 = 100 operations.
If n = 100, we do 100 * 100 = 10 000 operations.
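
Quadratic time often sneaks in when every element is compared with every other element, for example a naïve duplicate check (has_duplicate is a name made up for this sketch):

def has_duplicate(items):
    # compares every pair, roughly n * (n - 1) / 2 comparisons: O(n^2)
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

print(has_duplicate([10, 20, 30, 20]))  # True

The same check can be done in O(n) average time with a set (see the rules of thumb in section 5), which is exactly the kind of improvement Big-O helps you spot.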


O(log n) - Logarithmic Time

Halves the search space each step:

import bisect

my_list = list(range(100000))
index = bisect.bisect_left(my_list, 99999) # O(log n)

Binary search is very fast on huge lists because it discards half of the data each step.
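
bisect hides the halving, so here is a hand-written binary search sketch (binary_search is a name chosen for this example) that makes the "discard half each step" idea explicit:

def binary_search(sorted_items, target):
    # return the index of target in a sorted list, or -1: O(log n)
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1     # discard the left half
        else:
            high = mid - 1    # discard the right half
    return -1

print(binary_search(list(range(100000)), 99999))  # 99999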


O(n log n) - Log-Linear Time

Typical for efficient sorting algorithms:

sorted_list = sorted(my_list) # O(n log n)

This is much faster than a naïve bubble sort (O(n^2)) for big n.
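
For comparison, here is a naïve bubble sort sketch; on large inputs it falls far behind sorted() because it performs O(n^2) comparisons instead of O(n log n):

def bubble_sort(items):
    # classic bubble sort: nested passes over the list, O(n^2)
    data = list(items)                   # copy so the input is not modified
    n = len(data)
    for i in range(n):
        for j in range(n - 1 - i):
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
    return data

print(bubble_sort([5, 3, 1, 4, 2]))  # [1, 2, 3, 4, 5]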


4. Why this matters?

  • Code efficiency: Helps you choose the best data structures and algorithms.
  • Scalability: Ensures your program works well even with large inputs.
  • Interview prep: Common topic in technical interviews.

5. Rules of thumb in Python

  • Lists
    • Index access: O(1)
    • Append at end: O(1) (amortized)
    • Insert/remove at start: O(n) (shift elements)
    • Iterations: O(n)
  • Sets and Dictionaries (see the sketch after this list)
    • Lookup, insert, delete: O(1) average case
    • Worst case can degrade to O(n), but that's rare
  • Strings
    • Concatenation s1 + s2: O(n + m)
    • Iteration: O(n)
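
Here is a short sketch illustrating two of these rules, membership testing and string building (the variable names are made up for this example):

# Membership test: O(n) on a list, O(1) on average for a set
data_list = list(range(100_000))
data_set = set(data_list)
print(99_999 in data_list)   # scans the list: O(n)
print(99_999 in data_set)    # hash lookup: O(1) on average

# String building: repeated + copies the string over and over,
# while str.join builds the result in roughly one pass
parts = ["a"] * 10_000
joined = "".join(parts)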