Calculating the Time Complexity of Java Programs

In this post we will answer some common questions related to Time Complexity.

  1. What is Time Complexity?
  2. Why should you learn Time Complexity to become a better developer?
  3. Why is time complexity important in the IT field?
  4. How do you calculate the time complexity of a program?


Time Complexity measures the time taken to run an algorithm. It is commonly expressed as a count of the elementary operations the algorithm performs, which lets us compare and improve performance. Let's start with a simple example to understand the meaning of Time Complexity in Java.


Problem Statement

What are the possible ways to reach destination “D” from source “A”, and which one has the best Time Complexity?

Time Complexity


Consider three persons: X, Y, and Z.

Case 1:

In this case, person “X” starts his journey by car.

Direction = A-B-D

Distance = 4 km

Time Complexity = 5 min

Case 2:

In this case, person “Y” starts his journey by Metro.

Direction = A-D

Distance = 2 km

Time Complexity = 3 min

Case 3:

In this case, person “Z” starts his journey by bus.

Direction = A-C-D

Distance = 5 km

Time Complexity = 10 min

Therefore, in this problem the source and destination are the same, but the approaches differ (bus, car, and metro), and so does the time taken to reach the destination. That difference in time is what Time Complexity captures. Software companies are always looking for smart developers like person “Y”, and studying Time Complexity is what makes a developer that smart.

We all know that everyone today wants a high-paying job in the software industry, for example as a developer at Google, Facebook, LinkedIn, or Microsoft. Every developer writes programs, but the top companies look for smart developers (like “Y” in the example above) and are willing to pay them high salaries, because time is money.

Asymptotic Analysis

It is a method of describing the limiting (best, average, and worst) behavior of a program's running time.

The main objective of Asymptotic Analysis is to measure the efficiency of a program.

Asymptotic Analysis can be done in three ways, as shown below.

Asymptotic Analysis

1. Best Case

It is the minimum time taken for program execution. For example:

In the introduction we saw three different cases. Case 2 is the best case, with a Time Complexity of 3 minutes, because person “Y” reached the destination earliest.

2. Average Case

It is the average time taken for program execution. For example:

From the introduction example we can say that case 1 is the average case, with a Time Complexity of 5 minutes, compared to the other two cases.

3. Worst Case

It is the maximum time taken for program execution. For example:

From the introduction example we can conclude that case 3 is the worst case, with a Time Complexity of 10 minutes, compared to the other cases.

Asymptotic Notations

Asymptotic notations are used to express the running time of a program as a function of its input size.

There are three asymptotic notations used in Time Complexity, as shown below.

Asymptotic notations

1. Ο (Big Oh) Notation

* It is used to describe the performance or complexity of a program.

* Big O describes the worst-case scenario, i.e., the longest time the program can take to execute.

Common Asymptotic Notations

The table below lists the common asymptotic notations covered in this post, in ascending order of growth.

Constant       Ο(1)
Logarithmic    Ο(log n)
Linear         Ο(n)
n log n        Ο(n log n)
Quadratic      Ο(n²)
Exponential    Ο(2ⁿ)
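To make this ordering concrete, the short snippet below (my illustrative addition, not part of the original table) prints the number of steps each order of growth predicts for a few input sizes:

```java
public class GrowthRates {
    public static void main(String[] args) {
        // Step counts predicted by each common order of growth
        System.out.printf("%8s %10s %12s %12s%n", "n", "log n", "n log n", "n^2");
        for (int n : new int[] { 10, 100, 1000 }) {
            double logN = Math.log(n) / Math.log(2); // log base 2
            System.out.printf("%8d %10.1f %12.1f %12d%n",
                    n, logN, n * logN, (long) n * n);
        }
    }
}
```

For n = 1000, log n is only about 10 steps while n² is a million, which is why the notations lower in the table scale so much worse.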

Let's understand Big O notation thoroughly through Java examples of the common orders of growth:

  1. O(1) – Constant
  2. O(log n) – Logarithmic
  3. O(n log n) – n log n
  4. O(n) – Linear
  5. O(n²) – Quadratic
  6. O(2ⁿ) – Exponential

How to calculate the time complexity of a Java program, with examples

1. O(1)

O(1) is also called constant time: the program always executes in the same time regardless of input size. For example, whether the input array has 1 item or 100 items, the operation below requires only one step.

Note: O(1) is the best Time Complexity.

O(1) time complexity graph



package sample1;

public class Sample {

    static int y = 3;
    static int z = 5;

    public static void main(String[] args) {
        int x = y + z; // O(1): one step, regardless of input size
        System.out.println(x);
    }
}

Output: 8


O(1) complexity example

2. O(log n)

In an O(log n) function the number of steps grows only slowly as the input size increases, so it scales very well to large inputs. For example, the binary search program uses the divide-and-conquer technique (repeatedly splitting the problem into sub-problems of the same type until they become simple enough to solve directly) to search for an element in a large sorted array.

Note: In O(log n), the number of steps grows only logarithmically as the number of elements increases.

O(log n) complexity graph


package BinarySearch;

public class BinarySearch {

    int binarySearch(int arr[], int l, int r, int x) {
        if (r >= l) {
            int mid = l + (r - l) / 2;
            if (arr[mid] == x)
                return mid;
            if (arr[mid] > x)
                return binarySearch(arr, l, mid - 1, x); // search left half
            return binarySearch(arr, mid + 1, r, x);     // search right half
        }
        return -1;
    }

    public static void main(String args[]) {
        BinarySearch ob = new BinarySearch();
        int arr[] = { 2, 3, 4, 10, 40 };
        int n = arr.length;
        int x = 10;
        int result = ob.binarySearch(arr, 0, n - 1, x);
        if (result == -1)
            System.out.println("Element not present");
        else
            System.out.println("Element found at index " + result);
    }
}

Output: Element found at index 3


O(log n) Complexity example

3. O(n log n)

The O(n log n) function falls between the linear and quadratic functions (i.e., O(n) and Ο(n²)). It appears mainly in sorting algorithms with good Time Complexity, for example merge sort and quick sort.

Note: O(n log n) is a good Time Complexity for sorting.

O(n log n) y time complexity graph

package test;

import java.util.Arrays;

public class QuickSort {

    public static void main(String[] args) {
        int[] x = { 9, 2, 4, 7, 3, 7, 10 };
        System.out.println(Arrays.toString(x));

        int low = 0;
        int high = x.length - 1;
        quickSort(x, low, high);
        System.out.println(Arrays.toString(x));
    }

    public static void quickSort(int[] arr, int low, int high) {
        if (arr == null || arr.length == 0)
            return;
        if (low >= high)
            return;

        // pick the middle element as the pivot
        int middle = low + (high - low) / 2;
        int pivot = arr[middle];

        // partition: smaller elements to the left, larger to the right
        int i = low, j = high;
        while (i <= j) {
            while (arr[i] < pivot) {
                i++;
            }
            while (arr[j] > pivot) {
                j--;
            }
            if (i <= j) {
                int temp = arr[i];
                arr[i] = arr[j];
                arr[j] = temp;
                i++;
                j--;
            }
        }

        // recursively sort the two sub parts
        if (low < j)
            quickSort(arr, low, j);
        if (high > i)
            quickSort(arr, i, high);
    }
}

Output: [9, 2, 4, 7, 3, 7, 10]
               [2, 3, 4, 7, 7, 9, 10]



example of O(n log n) Complexity
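The section above names merge sort as the other classic O(n log n) sorting algorithm but only shows quick sort. Here is a minimal merge sort sketch for comparison (my illustrative code, not from the original post):

```java
import java.util.Arrays;

public class MergeSort {

    public static void main(String[] args) {
        int[] x = { 9, 2, 4, 7, 3, 7, 10 };
        mergeSort(x, 0, x.length - 1);
        System.out.println(Arrays.toString(x)); // [2, 3, 4, 7, 7, 9, 10]
    }

    // Split the range in half (log n levels), sort each half, then merge (O(n) per level)
    public static void mergeSort(int[] arr, int low, int high) {
        if (low >= high)
            return;
        int mid = low + (high - low) / 2;
        mergeSort(arr, low, mid);
        mergeSort(arr, mid + 1, high);
        merge(arr, low, mid, high);
    }

    // Merge the two sorted sub-arrays arr[low..mid] and arr[mid+1..high]
    private static void merge(int[] arr, int low, int mid, int high) {
        int[] tmp = Arrays.copyOfRange(arr, low, high + 1);
        int i = 0, j = mid - low + 1, k = low;
        while (i <= mid - low && j <= high - low)
            arr[k++] = (tmp[i] <= tmp[j]) ? tmp[i++] : tmp[j++];
        while (i <= mid - low)
            arr[k++] = tmp[i++];
        // any remaining right-half elements are already in place
    }
}
```

Merge sort always does log n levels of splitting with O(n) merging work per level, which is why, unlike quick sort, even its worst case stays O(n log n).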

4. O(n)

O(n) is also called linear time: the running time is directly proportional to the input size. For example, if the array has 5 items, the loop body executes 5 times.

Note: In O(n), as the number of elements increases, the number of steps increases proportionally.

O(n) Time Complexity graph


package sample1;

public class Sample {

    public static void main(String[] args) {
        int n = 5;
        for (int i = 0; i < n; i++) // O(n): the loop body runs n times
            System.out.print(i + " ");
    }
}

Output: 0 1 2 3 4


O(n) complexity example


5. O(n²)

O(n²) is also called quadratic time: the running time is directly proportional to the square of the input size. For example, if the array has 2 items, the loop body executes 4 times.

Note: In O(n²) the number of steps grows with the square of the number of elements, so performance degrades quickly for large inputs.

O(n2) Time Complexity graph

package sample1;

public class Sample {

    public static void main(String[] args) {
        int n = 2;
        for (int i = 0; i < n; i++)       // O(n) complexity
            for (int j = 0; j < n; j++)   // O(n) per outer iteration
                System.out.println(i + "," + j);
    } // O(n) × O(n) = O(n²) complexity
}

Output: 0,0
        0,1
        1,0
        1,1





O(n^2) Complexity example

6. O(2ⁿ)

O(2ⁿ) is also called exponential time. It typically arises in recursive algorithms that make two recursive calls per input element, so the amount of work roughly doubles with each additional input. The classic example is the naive recursive Fibonacci calculation.

package sample1;

public class Sample {

    // Naive recursive Fibonacci: each call makes two further calls,
    // so the number of calls roughly doubles as n grows -> O(2^n)
    static int fibonacci(int n) {
        if (n <= 1)
            return n;
        return fibonacci(n - 1) + fibonacci(n - 2);
    }

    public static void main(String[] args) {
        System.out.println(fibonacci(5));
    }
}

Output: 5

O(2^n) complexity example

2. Ω (Omega) Notation

* Omega describes the best-case scenario, i.e., the shortest time taken to execute the program.

3. θ (theta) Notation

* Theta describes a tight bound: it bounds a program's running time from both above and below, covering the best-case and worst-case scenarios together.
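To see all three notations on one program, consider a simple linear search (an illustrative example of mine, not from the original post): its best case is finding the key at index 0 in one comparison (Ω(1)), its worst case is scanning all n elements (Ο(n)), and a θ bound would describe the running time tightly from both sides.

```java
public class LinearSearch {

    // Returns the index of key in arr, or -1 if absent.
    // Best case Ω(1): key is at index 0. Worst case Ο(n): key is last or absent.
    static int linearSearch(int[] arr, int key) {
        for (int i = 0; i < arr.length; i++)
            if (arr[i] == key)
                return i;
        return -1;
    }

    public static void main(String[] args) {
        int[] arr = { 7, 3, 9, 1 };
        System.out.println(linearSearch(arr, 7)); // best case: found after 1 comparison
        System.out.println(linearSearch(arr, 1)); // worst case: all 4 elements checked
        System.out.println(linearSearch(arr, 5)); // absent: -1 after scanning everything
    }
}
```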

How to calculate the time complexity of a complete Java program


Consider the program below and calculate its time complexity.

package sample1;

public class Sample {

    public static void main(String[] args) {
        int n = 2;
        int p = 1;
        int a = 2 + 3;                        // O(1) complexity
        for (int i = 0; i < n; i++) {         // O(n) complexity
            for (int j = 0; j < n; j++) {     // O(n) complexity
                for (int k = 0; k < n; k++) { // O(n) complexity
                    System.out.print(p + " ");
                    p++;
                }
            }
        }
    }
}

Output: 1 2 3 4 5 6 7 8

Time Complexity = O(1) + O(n) × O(n) × O(n) // nested loops multiply

= O(1) + O(n³)

= O(1 + n³) // the highest-order term, n³, dominates

= O(n³)


Time Complexity example
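A related rule of thumb, sketched below with hypothetical step counters (my addition, not from the original post): loops that run one after another add their costs, while nested loops multiply them, which is why the three nested loops above give O(n³).

```java
public class LoopCombinations {
    public static void main(String[] args) {
        int n = 4;
        int steps = 0;

        // Sequential loops: O(n) + O(n), still O(n) overall
        for (int i = 0; i < n; i++) steps++;
        for (int i = 0; i < n; i++) steps++;
        System.out.println("sequential steps: " + steps); // 2n = 8

        // Nested loops: O(n) × O(n) = O(n²)
        steps = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                steps++;
        System.out.println("nested steps: " + steps); // n*n = 16
    }
}
```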

Time Complexity analysis table for different Algorithms From best case to worst case

Algorithm        Data structure   Best case     Average case   Worst case
Quick sort       Array            O(n log n)    O(n log n)     Ο(n²)
Merge sort       Array            O(n log n)    O(n log n)     O(n log n)
Heap sort        Array            O(n)          O(n log n)     O(n log n)
Smooth sort      Array            O(n)          O(n log n)     O(n log n)
Bubble sort      Array            O(n)          Ο(n²)          Ο(n²)
Insertion sort   Array            O(n)          Ο(n²)          Ο(n²)
Selection sort   Array            Ο(n²)         Ο(n²)          Ο(n²)

“That's all about Time Complexity in Java. I hope this post helps you become a better developer at a top software company.”

