DEV Community

rengadeviv

TAKE DOWN OF TIME AND SPACE COMPLEXITY

Before starting to learn data structures and algorithms, it is super important to know what time and space complexity are and why they matter.

An algorithm is a step-by-step procedure for solving a computational problem, worked out in the design phase. For a given problem there can be multiple solutions using different data structures and algorithms. How do you, as a programmer, decide which solution is better? That is why we analyze algorithms.

ASYMPTOTIC ANALYSIS

Asymptotic analysis measures the order of growth of a program in terms of its input size.

When we analyze algorithms, we consider input sizes greater than or equal to zero, because an input size can never be negative in mathematical analysis. Likewise, the function describing the time taken can never be negative, because an algorithm or program can never take negative time to run.

For instance, the sum of the first n numbers can be coded in multiple ways, as shown below.

I1: int fun1(int n)
    {
        n = n*(n+1)/2;
        return n;
    }

I2: int fun2(int n)
    {
        int sum = 0;
        for(int i=1;i<=n;i++)
            sum = sum+i;
        return sum;
    }

I3: int fun3(int n)
    {
        int sum = 0;
        for(int i=1;i<=n;i++)
            for(int j=1;j<=i;j++)
                sum++;
        return sum;
    }

Significance

  1. The idea is to measure the order of growth.
  2. It does not depend on the machine or the programming language.
  3. There is no need to implement anything; we can analyze the algorithm itself.
  4. In asymptotic analysis, we assume that mathematical and comparison operations always take constant (the same) time, irrespective of the input values.

ORDER OF GROWTH

Time complexity, or order of growth, describes the amount of time taken by a program with respect to the size of its input. The order of growth specifies how the program's running time behaves as the input size increases.

NOTE: if the order grows faster, the program is less efficient; if the order grows slower, the program is more efficient.

A function f(n) is said to grow faster than g(n) if the ratio g(n)/f(n) tends to 0 as n grows (equivalently, g(n) grows faster than f(n) if f(n)/g(n) tends to 0), where f(n) and g(n) represent the time taken for every input size n >= 0.

DIRECT WAY TO FIND & COMPARE GROWTHS

  1. Ignore lower order terms.
  2. Ignore the constant of the leading term.

For example: 2n^2 + n + 6   order of growth: O(n^2)
             100n + 3       order of growth: O(n)

How do we compare terms?
Here is the inequality: C < log log n < log n < n^(1/3) < n^(1/2) < n < n^2 < n^3 < n^4 < 2^n < n^n, where C is a constant.

ASYMPTOTIC NOTATION

Asymptotic notations are the mathematical notations used to represent the order of growth of a function. They come from mathematics, because the running time of an algorithm is itself a mathematical function.

In time complexity analysis, we use these functions to describe algorithms.

O (Big-O) : exact or upper bound
Θ (Theta) : exact (tight) bound
Ω (Omega) : exact or lower bound

Any function can be characterized by an upper bound, a lower bound, or a tight bound. When we cannot find the exact (tight) bound of a function's time complexity, we can use an upper bound or a lower bound instead.

The function f(n) = O(g(n)) if and only if there exist positive constants c and n0 such that f(n) <= c*g(n) for every n >= n0.

For instance:
2n+3 <= 10n for n >= 1, where f(n) = 2n+3, c = 10, g(n) = n,
so here f(n) = O(n)

2n+2 <= 5n for n >= 1, where f(n) = 2n+2, c = 5, g(n) = n,
so here f(n) = O(n)

2n+3 <= 5n^2 for n >= 1, where f(n) = 2n+3, c = 5, g(n) = n^2,
so here f(n) = O(n^2)

Let's dig into an algorithm to find its time complexity.

Algorithm swap(a,b){
  temp = a;    ...... 1 time
  a = b;       ...... 1 time
  b = temp;    ...... 1 time
}                -------------
                 f(n) = 3  ...... a constant (fixed) value

so here the order of time = O(1)

Why do we analyze things and find complexities? Here is an analogy: if you want to go to school, you simply take a car and reach the place (no analysis required). If you want to go to Mars, you absolutely need to do some analysis, and you must take care of every minute detail. So it depends on your requirements.

We can analyze an algorithm using the Frequency Count Method:

Algorithm sum(A,n)
{
   s = 0;             ...... 1
   for(i=0;i<n;i++)   ...... i=0 ... 1, i<n ... n+1, i++ ... n
   {
      s = s + A[i];   ...... n
   }
   return s;          ...... 1
}

Here the initialization i = 0 happens one time, so ...... 1
the condition check i < n happens n+1 times, so ...... n+1
the increment i++ happens n times, so ...... n
the operation s = s + A[i] happens n times, so ...... n

Summing it all up: 1 + 1 + (n+1) + n + n + 1 = 3n + 4 ...... O(n)

The time taken by an algorithm is found by assigning 1 unit of time to each statement; if a statement repeats, its frequency of execution is multiplied in.

f(n) = 3n + 4, a polynomial of degree 1
O(n) – order of n

SPACE COMPLEXITY

The space calculation for the above algorithm:
A .... n   (array of n elements)
n .... 1   single allocation
s .... 1   single allocation
i .... 1   single allocation
.................
n + 3, so the space complexity is O(n)

Another Example

Algorithm Add(A,B,n)
{
   for(i=0;i<n;i++)                  ...... n+1
   {
      for(j=0;j<n;j++)               ...... n * (n+1)
      {
         C[i,j] = A[i,j] + B[i,j]    ...... n * n
      }
   }
}
f(n) = (n+1) + n(n+1) + n^2
     = 2n^2 + 2n + 1                 ......... O(n^2)

SPACE COMPLEXITY
For the above example,
A .... n * n
B .... n * n
C .... n * n
i .... 1, j .... 1, n .... 1
summing up: 3n^2 + 3
order of space = O(n^2)

Multiplication of two matrices

Algorithm Multiply(A,B,n)
{
   for(i=0;i<n;i++)                             ...... n+1
   {
      for(j=0;j<n;j++)                          ...... n * (n+1)
      {
         C[i,j] = 0;                            ...... n * n
         for(k=0;k<n;k++)                       ...... n * n * (n+1)
         {
            C[i,j] = C[i,j] + A[i,k] * B[k,j];  ...... n * n * n
         }
      }
   }
}
summing up: (n+1) + n(n+1) + n^2 + n^2(n+1) + n^3
          = n + 1 + n^2 + n + n^2 + n^3 + n^2 + n^3
          = 2n^3 + 3n^2 + 2n + 1
f(n) = 2n^3 + 3n^2 + 2n + 1   ......... order of growth = O(n^3)

SPACE COMPLEXITY

For the above example:
A .... n * n
B .... n * n
C .... n * n
n .... 1
i .... 1
j .... 1
k .... 1
..................
f(n) = n*n + n*n + n*n + 1 + 1 + 1 + 1
     = 3n^2 + 4 ....... O(n^2)

TYPES OF TIME FUNCTIONS

O(1)       Constant
O(log n)   Logarithmic
O(n)       Linear
O(n^2)     Quadratic
O(n^3)     Cubic
O(2^n)     Exponential
O(3^n), O(4^n), ............. O(n^n)

Follow my blogs; this series continues with asymptotic notations and the different types of analysis.
