Algorithm complexity is a measure of how long an algorithm takes to complete given an input of size $n$. If an algorithm has to scale, it should produce the result within a finite and practical time bound even for large values of $n$. For this reason, complexity is calculated asymptotically, as $n$ approaches infinity.
In this article we are going to take a look at:
- Asymptotic notations
- How to compare two functions using limits
- The master theorem
- How to solve recurrence relations
Asymptotic Notations
Counting the number of operations allows you to compare the efficiency of algorithms. As mentioned in the Introduction section, the analysis is carried out with the expectation of a sufficiently large amount of processed data, where $n \to \infty$; therefore, the growth rate is of key importance, not the exact number of operations.
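To make the growth-rate point concrete, here is a small sketch (my own illustration, not from the article): a linear-time algorithm with a large constant factor eventually beats a quadratic one with a tiny constant, because the growth rate, not the constant, dominates.

```python
# Hypothetical operation counts for two algorithms solving the same problem.
def fast_but_heavy(n):
    return 100 * n        # large constant factor, but linear growth

def slow_but_light(n):
    return n * n          # constant factor 1, but quadratic growth

# For small n the quadratic algorithm performs fewer operations...
print(fast_but_heavy(10), slow_but_light(10))          # 1000 vs 100
# ...but for large n the growth rate wins over any constant factor.
print(fast_but_heavy(10_000), slow_but_light(10_000))  # 1000000 vs 100000000
```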
At this moment, we are ready to dive into the details. Let's define the function classes and consider the asymptotic notations that describe the running time of a particular program or algorithm. For a given function $g$:
$O(g(n))$ - functions whose growth is no faster than that of $g$ (an upper bound)
$\Omega(g(n))$ - functions whose growth is no slower than that of $g$ (a lower bound)
$\Theta(g(n))$ - functions whose growth is the same as that of $g$ (a tight bound)
For a given function $g(n)$, the notation $\Theta(g(n))$ denotes the set of functions

$\Theta(g(n)) = \{ f(n) : \exists\, c_1, c_2 > 0 \text{ and } n_0 \text{ such that } 0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0 \}$
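A quick numeric sanity check of the two-sided $\Theta$ bound, with my own example (not from the article): $f(n) = 3n^2 + 10n$ is $\Theta(n^2)$, witnessed by $c_1 = 3$, $c_2 = 4$, $n_0 = 10$.

```python
# Check c1*g(n) <= f(n) <= c2*g(n) for every n from n0 up to 10_000.
f = lambda n: 3 * n * n + 10 * n
g = lambda n: n * n
c1, c2, n0 = 3, 4, 10

assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("3n^2 <= 3n^2 + 10n <= 4n^2 holds for all n >= 10")
```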
Master Theorem
Let us have a recurrence relation of the following form:

$T(n) = \begin{cases} a \cdot T(n/b) + O(n^c), & n > 1 \\ O(1), & n = 1 \end{cases}$

where $a \in \mathbb{N}$, $b \in \mathbb{R}$, $b > 1$, $c \in \mathbb{R}^{+}$. Then the solution of the given recurrence relation has the form:
If $c > \log_b a$, then $T(n) = \Theta(n^c)$.
If $c = \log_b a$, then $T(n) = \Theta(n^c \log_b n)$.
If $c < \log_b a$, then $T(n) = \Theta(n^{\log_b a})$.
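The three cases can be encoded directly. A minimal sketch (the function name and output format are my own):

```python
import math

def master_case(a, b, c):
    """Asymptotic solution of T(n) = a*T(n/b) + O(n^c) by the master theorem."""
    crit = math.log(a, b)  # the critical exponent log_b(a)
    if math.isclose(c, crit):
        return f"Theta(n^{c} log n)"   # case c = log_b(a)
    if c > crit:
        return f"Theta(n^{c})"         # case c > log_b(a)
    return f"Theta(n^{crit:g})"        # case c < log_b(a)

print(master_case(2, 2, 1))  # e.g. merge sort: T(n) = 2T(n/2) + O(n)
print(master_case(1, 2, 0))  # e.g. binary search: T(n) = T(n/2) + O(1)
print(master_case(4, 2, 1))  # T(n) = 4T(n/2) + O(n)
```

For merge sort this yields $\Theta(n \log n)$, and for binary search $\Theta(\log n)$, as expected.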
Proof:
Let's consider the recursion tree. It has $\log_b n$ levels, where each level $k$ has $a^k$ vertices, each of which costs $(n/b^k)^c$.
Let’s summarize the cost of the tree at all levels:
$T(n) = \sum_{k=0}^{\log_b n} a^k \cdot \left(\frac{n}{b^k}\right)^c = n^c \sum_{k=0}^{\log_b n} \left(\frac{a}{b^c}\right)^k$
We get the three cases:
If $c > \log_b a$, then $\sum (a/b^c)^k$ is the sum of a decreasing geometric progression (since $a/b^c < 1$), which doesn't depend on $n$. It means that the sum is just a constant, so $T(n) = \Theta(n^c)$.
If $c = \log_b a$, then $a/b^c = 1$: every term of the sum equals $1$, the sum is $\log_b n + 1$, and $T(n) = \Theta(n^c \log_b n)$.
If $c < \log_b a$, then $a/b^c > 1$: the progression is increasing and its last term dominates, giving $n^c \cdot (a/b^c)^{\log_b n} = a^{\log_b n} = n^{\log_b a}$, so $T(n) = \Theta(n^{\log_b a})$.
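One can watch the decreasing geometric sum flatten out numerically. My own illustration, with $a = 2$, $b = 2$, $c = 2$, so $a/b^c = 1/2 < 1$:

```python
import math

a, b, c = 2, 2, 2
r = a / b**c  # common ratio of the progression, here 0.5 < 1

for n in (2**5, 2**10, 2**20):
    levels = int(math.log(n, b))
    s = sum(r**k for k in range(levels + 1))
    # The partial sums stay below the limit 1/(1 - r) = 2, however large n is.
    assert s < 1 / (1 - r)
print("the sum is bounded by a constant for every n")
```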
There are three main methods for solving recurrence relations: the substitution method, the recursion tree, and the master theorem. Let's consider them.
Substitution method
In order to solve a recurrence relation using the substitution method, it's necessary to complete two steps:
Make an assumption about the solution;
Use mathematical induction to prove the assumption's correctness.
Example
Show that the solution of $T(n) = T(\lceil n/2 \rceil) + 1$ is $O(\lg n)$.
Proof:
Assume that $T(n) \le c \cdot \lg(n - a)$ (we used the definition of big-$O$ from the Asymptotic Notations part), where $a$ is a number which will be selected later. Now we are ready to substitute it into the initial recurrence, using $\lceil n/2 \rceil \le (n + 1)/2$:

$T(n) \le c \cdot \lg(\lceil n/2 \rceil - a) + 1 \le c \cdot \lg\left(\frac{n + 1 - 2a}{2}\right) + 1 = c \cdot \lg(n + 1 - 2a) - c + 1 \le c \cdot \lg(n - a)$

The last inequality holds because $n + 1 - 2a \le n - a$ when $a \ge 1$, and $-c + 1 \le 0$ when $c \ge 1$. We proved that $O(\lg n)$ is a solution for the given relation when $a \ge 1$, $c \ge 1$.
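The induction can also be sanity-checked numerically. A sketch assuming the base case $T(1) = 1$ (which the recurrence leaves unspecified) and the witnesses $c = 2$, $a = 1$, checked from $n_0 = 4$ onward:

```python
from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(ceil(n/2)) + 1, with the assumed base case T(1) = 1.
    return 1 if n == 1 else T((n + 1) // 2) + 1

c, a = 2, 1  # witness constants, checked numerically below
assert all(T(n) <= c * log2(n - a) for n in range(4, 5000))
print("T(n) <= 2*lg(n - 1) holds for 4 <= n < 5000")
```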
Let’s see another example.
Show that the solution of $T(n) = T(n - 1) + n$ is $O(n^2)$.
Proof:
Assume that $T(n) \le c \cdot n^2$. Let's substitute this inequality into the given recurrence. Then,

$T(n) \le c \cdot (n - 1)^2 + n = cn^2 - 2cn + c + n = cn^2 + n(1 - 2c) + c \le cn^2$

The last step holds for any $c > 1/2$ and sufficiently large $n$, since then the negative term $n(1 - 2c)$ outweighs the constant $c$; in particular, $c = 1$ works for all $n \ge 1$.
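Here the recurrence even has a closed form to check against. A quick numeric confirmation, assuming a base case $T(1) = 1$ (which the article leaves unspecified):

```python
def T(n):
    # Iterative unrolling of T(n) = T(n-1) + n with the assumed T(1) = 1.
    total = 0
    for k in range(1, n + 1):
        total += k
    return total

# T(n) equals the closed form n(n+1)/2, and c = 1 already witnesses T(n) <= c*n^2.
assert all(T(n) == n * (n + 1) // 2 for n in range(1, 300))
assert all(T(n) <= n * n for n in range(1, 300))
print("T(n) = n(n+1)/2 <= n^2 for 1 <= n < 300")
```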
Recursion Tree
Another way to find a solution for a recurrence relation is to use a recursion tree.
In this method, the recurrence relation is converted into a recursion tree whose nodes represent the cost incurred at the various levels of recursion. To find the total cost, the costs of all levels are summed up.
Example
Use a recursion tree to determine a good asymptotic upper bound on the recurrence $T(n) = 4T(n/2 + 2) + n$. Use the substitution method to verify your answer.
Proof:
The tree has $\log_2 n$ levels. The subproblem size for a node at depth $i$ is roughly $n/2^i$. The subproblem size reaches $1$ when $n/2^i = 1 \Rightarrow i = \log_2 n$.
The last level, of depth $\log_2 n$, contains $4^{\log_2 n} = n^2$ nodes.
Each of the $4^i$ nodes at depth $i$ costs about $n/2^i$, so level $i$ contributes $4^i \cdot n/2^i = 2^i n$ in total. Summing the internal levels and the $n^2$ leaves gives

$T(n) = \sum_{i=0}^{\log_2 n - 1} 2^i n + \Theta(n^2) = n(n - 1) + \Theta(n^2) = \Theta(n^2)$
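The $\Theta(n^2)$ bound can be verified numerically by unrolling the simplified recurrence $T(n) = 4T(n/2) + n$ for powers of two (a sketch that drops the small additive term inside the recursive call and assumes the base case $T(1) = 1$): the exact value comes out to $2n^2 - n$, squarely within $\Theta(n^2)$.

```python
def T(n):
    # Simplified recurrence T(n) = 4*T(n/2) + n with assumed T(1) = 1,
    # evaluated for n a power of two.
    return 1 if n == 1 else 4 * T(n // 2) + n

for k in range(1, 15):
    n = 2**k
    assert T(n) == 2 * n * n - n  # exact closed form for powers of two
print("T(n) = 2n^2 - n = Theta(n^2) for n = 2, 4, ..., 2^14")
```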