How to Calculate the Time Complexity of Algorithms in Java
In this post you will find answers to some common questions about Time Complexity:
 What is Time Complexity?
 Why should you learn Time Complexity to become a better developer?
 Why is Time Complexity an important topic in the IT field?
 How do you calculate the time complexity of a program?
Introduction
Time Complexity measures the time taken to run an algorithm. It is commonly estimated by counting the number of elementary operations the algorithm performs, which makes it possible to compare approaches and improve performance. Let's start with a simple example to understand the meaning of Time Complexity in Java.
Example
Problem Statement
What are the possible approaches to reach destination “D” from the source “A” with the best Time Complexity?
Solution
Consider three persons X, Y and Z.
Case 1:
In this case person "X" starts his journey by car.
Route = A → B → D
Distance = 4 km
Time taken = 5 min
Case 2:
In this case person "Y" starts his journey by metro.
Route = A → D
Distance = 2 km
Time taken = 3 min
Case 3:
In this case person "Z" starts his journey by bus.
Route = A → C → D
Distance = 5 km
Time taken = 10 min
In this problem the source and destination are the same, but the approaches are different (bus, car and metro), and so the time taken is different. This is the idea behind Time Complexity: the same problem can be solved in different ways with different costs. Software companies are always looking for smart developers like person "Y", who choose the most efficient approach.
We all know that everyone today is looking for a high-paying job in the software industry, for example as a developer at Google, Facebook, LinkedIn or Microsoft. Every developer writes programs, but the top companies look for smart developers (like "Y" in the above example) and are willing to pay them high salaries, because time is money.
Asymptotic Analysis
Asymptotic analysis is the method of describing the limiting behavior (best, average and worst case) of a program's running time.
The main objective of asymptotic analysis is to measure the efficiency of a program.
Asymptotic analysis considers three cases, as shown below.
1. Best Case
It is the minimum time taken for program execution. For example,
of the three cases in the Introduction, case 2 is the best one, with a time of 3 minutes, because person "Y" reached the destination earlier than the others.
2. Average Case
It is the average time taken for program execution. For example,
from the Introduction example, case 1 is the average one, with a time of 5 minutes, compared to the other two cases.
3. Worst Case
It is the maximum time taken for program execution. For example,
from the Introduction example, case 3 is the worst case, with a time of 10 minutes, compared to the other cases.
Asymptotic Notations
Asymptotic notations are used to describe the running time complexity of a program.
They express a program's running time as a function of its input size.
There are three asymptotic notations commonly used for Time Complexity, as shown below.
1. Ο (Big Oh) Notation
* It is used to describe the performance or complexity of a program.
* Big O describes the worst-case scenario, i.e. the longest amount of time the program can take to execute.
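To see what "worst case" means concretely, here is a small sketch (the class and method names are hypothetical) using linear search: if the target is the last element, or not present at all, the loop must examine every element, so the worst case of linear search is O(n).

```java
public class LinearSearch {
    // Returns the index of target in arr, or -1 if it is absent.
    // Worst case: target is last or missing -> n comparisons -> O(n).
    static int linearSearch(int[] arr, int target) {
        for (int i = 0; i < arr.length; i++) {
            if (arr[i] == target) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] arr = {2, 3, 4, 10, 40};
        System.out.println(linearSearch(arr, 40)); // worst case: last element -> 4
        System.out.println(linearSearch(arr, 7));  // worst case: not present -> -1
    }
}
```

Big O only guarantees an upper bound: the same method may finish much faster when the target happens to be near the front of the array.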
Common Asymptotic Notations
The table below lists some common asymptotic notations in ascending order of growth.
Constant  Ο(1)
Logarithmic  Ο(log n)
Linear  Ο(n)
n log n  Ο(n log n)
Quadratic  Ο(n^2)
Cubic  Ο(n^3)
Polynomial  n^Ο(1)
Exponential  2^Ο(n)
Let's understand Big O notation thoroughly by working through Java examples of the common orders of growth:
 O(1) – constant
 O(log n) – logarithmic
 O(n log n) – n log n
 O(n) – linear
 O(n^2) – quadratic
 O(2^n) – exponential
How to calculate time complexity of a java program with an example?
1. O(1)
O(1), also called constant time, means the method always executes in the same amount of time regardless of the input size. For example, whether the input array has 1 item or 100 items, the method below performs only one step.
Note: O(1) is the best possible time complexity.
Coding
package sample1;

public class sample {
    static int y = 3;
    static int z = 5;

    public static void main(String[] args) {
        int x = y + z; // O(1) complexity
        System.out.println(x);
    }
}

Output:
8
2. O(log n)
In an O(log n) function the number of steps grows only logarithmically as the input size increases, so it scales very well to large inputs. For example, binary search uses the divide-and-conquer technique (repeatedly breaking a problem into smaller subproblems of the same type until they become simple enough to solve directly) to search for an element in a large sorted array.
Note: In O(log n), doubling the number of elements adds only one extra step.
Coding
package BinarySearch;

public class BinarySearch {
    int binarySearch(int arr[], int l, int r, int x) {
        if (r >= l) {
            int mid = l + (r - l) / 2;
            if (arr[mid] == x)
                return mid;
            if (arr[mid] > x)
                return binarySearch(arr, l, mid - 1, x);
            return binarySearch(arr, mid + 1, r, x);
        }
        return -1;
    }

    public static void main(String args[]) {
        BinarySearch ob = new BinarySearch();
        int arr[] = {2, 3, 4, 10, 40};
        int n = arr.length;
        int x = 10;
        int result = ob.binarySearch(arr, 0, n - 1, x);
        if (result == -1)
            System.out.println("Element not present");
        else
            System.out.println("Element found at index " + result);
    }
}

Output:
Element found at index 3
3. O(n log n)
The O(n log n) function falls between the linear and quadratic functions (i.e. O(n) and O(n^2)). It appears in efficient sorting algorithms such as merge sort and, on average, quick sort.
Note: O(n log n) is a good time complexity for sorting.
Coding
package test;

import java.util.Arrays;

public class QuickSort {
    public static void main(String[] args) {
        int[] x = { 9, 2, 4, 7, 3, 7, 10 };
        System.out.println(Arrays.toString(x));
        int low = 0;
        int high = x.length - 1;
        quickSort(x, low, high);
        System.out.println(Arrays.toString(x));
    }

    public static void quickSort(int[] arr, int low, int high) {
        if (arr == null || arr.length == 0)
            return;
        if (low >= high)
            return;
        int middle = low + (high - low) / 2;
        int pivot = arr[middle];
        int i = low, j = high;
        while (i <= j) {
            while (arr[i] < pivot) {
                i++;
            }
            while (arr[j] > pivot) {
                j--;
            }
            if (i <= j) {
                int temp = arr[i];
                arr[i] = arr[j];
                arr[j] = temp;
                i++;
                j--;
            }
        }
        // recursively sort the two sub-parts
        if (low < j)
            quickSort(arr, low, j);
        if (high > i)
            quickSort(arr, i, high);
    }
}

Output:
[9, 2, 4, 7, 3, 7, 10]
[2, 3, 4, 7, 7, 9, 10]

4. O(n)
O(n), also called linear time, grows in direct proportion to the input size. For example, if the array has 5 items, the loop body runs 5 times.
Note: In O(n), as the number of elements increases, the number of steps increases at the same rate.
Coding
package sample1;

public class sample {
    public static void main(String[] args) {
        int n = 5;
        for (int i = 0; i < n; i++) // O(n) complexity
            System.out.println(i);
    }
}

Output:
0
1
2
3
4
5. O(n^{2})
O(n^2), also called quadratic time, grows in proportion to the square of the input size. For example, if the array has 2 items, the nested loops print 4 times.
Note: In O(n^2) the number of steps grows with the square of the number of elements. Of the complexities covered here, it is among the worst.
Coding
package sample1;

public class sample {
    public static void main(String[] args) {
        int n = 2;
        for (int i = 0; i < n; i++) // O(n) complexity
            for (int j = 0; j < n; j++) // O(n) complexity
                System.out.println(i + "," + j);
        // nested loops multiply: O(n) * O(n) = O(n^2) complexity
    }
}

Output:
0,0
0,1
1,0
1,1
6. O(2^{n})
O(2^n) denotes exponential time. It typically arises in recursive algorithms that solve a problem of size n by making two recursive calls, so the amount of work roughly doubles with each additional input element.
Coding
package sample1;

public class sample {
    // Recursive Fibonacci: each call spawns two more recursive calls,
    // so the running time grows as O(2^n).
    static int fib(int n) {
        if (n <= 1)
            return n;
        return fib(n - 1) + fib(n - 2);
    }

    public static void main(String[] args) {
        System.out.println(fib(4));
    }
}

Output:
3
2. Ω (Omega) Notation
* Omega describes the best-case scenario, i.e. the shortest amount of time the program can take to execute.
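A sketch of the best case (the class name and the comparison counter are hypothetical, added only to make the count visible): in a linear search, if the target happens to be the very first element, the method returns after a single comparison, so linear search is Ω(1).

```java
public class BestCaseDemo {
    // Counter added so the number of comparisons is visible.
    static int comparisons;

    static int linearSearch(int[] arr, int target) {
        comparisons = 0;
        for (int i = 0; i < arr.length; i++) {
            comparisons++;
            if (arr[i] == target) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] arr = {2, 3, 4, 10, 40};
        linearSearch(arr, 2); // best case: target is the first element
        System.out.println(comparisons); // 1 comparison -> Omega(1)
    }
}
```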
3. θ (theta) Notation
* Theta bounds a program's running time from both above and below, i.e. it applies when the best-case and worst-case growth rates are the same.
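Summing an array is a simple case where the best and worst cases coincide: the loop always runs exactly n times no matter what values the array holds, so the running time is Θ(n). A minimal sketch (the class name is hypothetical):

```java
public class ThetaDemo {
    // Always performs exactly n additions, regardless of the values:
    // best case = worst case = Theta(n).
    static int sum(int[] arr) {
        int total = 0;
        for (int value : arr) {
            total += value;
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sum(new int[]{1, 2, 3, 4, 5})); // prints 15
    }
}
```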
How to calculate the time complexity of a Java program?
Example
Consider the program below and calculate its time complexity.
package sample1;

public class sample {
    public static void main(String[] args) {
        int n = 2;
        int p = 1;
        int a = 2 + 3; // O(1) complexity
        for (int i = 0; i < n; i++) { // O(n) complexity
            for (int j = 0; j < n; j++) { // O(n) complexity
                for (int k = 0; k < n; k++) { // O(n) complexity
                    System.out.println(p++);
                }
            }
        }
    }
}

Output:
1 2 3 4 5 6 7 8
Time Complexity = O(1) + O(n) * O(n) * O(n) // nested loops multiply
= O(1) + O(n^3)
= O(1 + n^3) // the highest-order term dominates, i.e. n^3
= O(n^3)
Time complexity table for common sorting algorithms, from best case to worst case.
Algorithm  Data structure  Best case  Average case  Worst case
Quick sort  Array  O(n log n)  O(n log n)  O(n^2)
Merge sort  Array  O(n log n)  O(n log n)  O(n log n)
Heap sort  Array  O(n)  O(n log n)  O(n log n)
Smooth sort  Array  O(n)  O(n log n)  O(n log n)
Bubble sort  Array  O(n)  O(n^2)  O(n^2)
Insertion sort  Array  O(n)  O(n^2)  O(n^2)
Selection sort  Array  O(n^2)  O(n^2)  O(n^2)
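The entries above can be checked empirically by counting operations instead of measuring wall-clock time. The sketch below (class and method names are hypothetical) counts comparisons in a straightforward bubble sort: for n elements it always performs n(n-1)/2 comparisons, matching the O(n^2) columns above. (The O(n) best case in the table assumes the common early-exit optimization for an already-sorted array, which this sketch omits.)

```java
public class BubbleSortCount {
    // Sorts arr in place and returns the number of comparisons made.
    // Without the early-exit optimization this is always n*(n-1)/2.
    static long bubbleSort(int[] arr) {
        long comparisons = 0;
        for (int i = 0; i < arr.length - 1; i++) {
            for (int j = 0; j < arr.length - 1 - i; j++) {
                comparisons++;
                if (arr[j] > arr[j + 1]) {
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                }
            }
        }
        return comparisons;
    }

    public static void main(String[] args) {
        int n = 100;
        int[] arr = new int[n];
        for (int i = 0; i < n; i++)
            arr[i] = n - i; // reverse-sorted input
        System.out.println(bubbleSort(arr)); // n*(n-1)/2 = 4950
    }
}
```

Doubling n to 200 yields 19900 comparisons, about four times as many, which is exactly the quadratic growth the table predicts.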
That's all about Time Complexity in Java. I hope this post is useful on your way to becoming one of the best developers at a top software company.