
Asymptotic Notations and Basic Efficiency Classes


Asymptotic complexity is a way of expressing the main component of the cost of an algorithm, using idealized units of computational work. Consider, for example, the algorithm for sorting a deck of cards, which proceeds by repeatedly searching through the deck for the lowest remaining card. The asymptotic complexity of this algorithm is proportional to the square of the number of cards in the deck.
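The card-sorting algorithm above can be sketched as a short instrumented routine (a hypothetical helper, `sort_by_repeated_min`, not from the text) that counts comparisons, the basic operation; for n cards it performs n(n − 1)/2 comparisons, which grows as the square of n:

```python
def sort_by_repeated_min(cards):
    """Sort by repeatedly finding the lowest remaining card,
    counting card-to-card comparisons (the basic operation)."""
    cards = list(cards)
    comparisons = 0
    for i in range(len(cards)):
        low = i
        for j in range(i + 1, len(cards)):
            comparisons += 1          # one comparison per pair examined
            if cards[j] < cards[low]:
                low = j
        cards[i], cards[low] = cards[low], cards[i]
    return cards, comparisons

# For n = 5 cards the count is 5 * 4 / 2 = 10 comparisons.
deck, count = sort_by_repeated_min([7, 3, 9, 1, 5])
```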
To compare and rank such orders of growth, computer scientists use three notations: O (big oh), Ω (big omega), and Θ (big theta).
First, we introduce these notations informally; then, after several examples, we give formal definitions. In the following discussion, t(n) and g(n) can be any nonnegative functions defined on the set of natural numbers. In the context we are interested in, t(n) will be an algorithm's running time (usually indicated by its basic operation count C(n)), and g(n) will be some simple function to compare the count with.

O-notation

Definition: A function t(n) is said to be in O(g(n)), denoted t(n) ∈ O(g(n)), if t(n) is bounded above by some constant multiple of g(n) for all large n, i.e., if there exist some positive constant c and some nonnegative integer n0 such that
t(n) ≤ cg(n) for all n ≥ n0


Figure a: Big-oh notation: t(n) ∈ O(g(n))

The definition is illustrated in Figure a where, for the sake of visual clarity, n is extended to be a real number. As an example, let us formally prove one of the assertions made in the introduction: 100n + 5 ∈ O(n²). Indeed,
100n + 5 ≤ 100n + n (for all n ≥ 5) = 101n ≤ 101n²

Thus, as values of the constants c and n0 required by the definition, we can take 101 and 5, respectively. Note that the definition gives us a lot of freedom in choosing specific values for constants c and n0. For example, we could also reason that
100n + 5 ≤ 100n + 5n (for all n ≥ 1) = 105n
to complete the proof with c = 105 and n0 = 1.
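Both pairs of witnesses can be spot-checked numerically over a finite range (the helper `holds` below is ours, not from the text; a finite check illustrates but does not replace the proof):

```python
# Check witnesses (c, n0) for t(n) <= c * g(n) over a finite test range.
def holds(t, g, c, n0, upto=10_000):
    """True if t(n) <= c * g(n) for every n in [n0, upto)."""
    return all(t(n) <= c * g(n) for n in range(n0, upto))

t = lambda n: 100 * n + 5   # the function being bounded
g = lambda n: n * n         # the comparison function n^2

assert holds(t, g, 101, 5)  # first choice: c = 101, n0 = 5
assert holds(t, g, 105, 1)  # second choice: c = 105, n0 = 1
```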

Ω-notation

Definition: A function t(n) is said to be in Ω(g(n)), denoted t(n) ∈ Ω(g(n)), if t(n) is bounded below by some positive constant multiple of g(n) for all large n, i.e., if there exist some positive constant c and some nonnegative integer n0 such that
t(n) ≥ cg(n) for all n ≥ n0
The definition is illustrated in Figure b.

Figure b: Big-omega notation: t(n) ∈ Ω(g(n))

Here is an example of a formal proof that n³ ∈ Ω(n²): n³ ≥ n² for all n ≥ 0, i.e., we can select c = 1 and n0 = 0.
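The same witnesses can be spot-checked over a finite range (again, an illustration of the inequality rather than a proof):

```python
# Spot-check the Omega witnesses c = 1 and n0 = 0 for n^3 in Omega(n^2):
# n^3 >= 1 * n^2 should hold for every n >= 0 in the tested range.
assert all(n ** 3 >= 1 * n ** 2 for n in range(0, 10_000))
```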

Θ-notation

Definition: A function t(n) is said to be in Θ(g(n)), denoted t(n) ∈ Θ(g(n)), if t(n) is bounded both above and below by some positive constant multiples of g(n) for all large n, i.e., if there exist some positive constants c1 and c2 and some nonnegative integer n0 such that
c2g(n) ≤ t(n) ≤ c1g(n) for all n ≥ n0
The definition is illustrated in Figure c.

Figure c: Big-theta notation: t(n) ∈ Θ(g(n))

For example, let us prove that ½n(n − 1) ∈ Θ(n²).

First, we prove the right inequality (the upper bound):
½n(n − 1) = ½n² − ½n ≤ ½n² for all n ≥ 0

Second, we prove the left inequality (the lower bound):
½n(n − 1) = ½n² − ½n ≥ ½n² − ½n · ½n (for all n ≥ 2) = ¼n²

Hence, we can select c2 = ¼, c1 = ½, and n0 = 2.
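The classic Θ example ½n(n − 1) ∈ Θ(n²), with witnesses c1 = ½, c2 = ¼, and n0 = 2, can likewise be spot-checked numerically (the helper `theta_holds` is ours, not from the text):

```python
# Check the two-sided bound c2 * g(n) <= t(n) <= c1 * g(n) for n >= n0.
def theta_holds(t, g, c1, c2, n0, upto=10_000):
    """True if c2*g(n) <= t(n) <= c1*g(n) for every n in [n0, upto)."""
    return all(c2 * g(n) <= t(n) <= c1 * g(n) for n in range(n0, upto))

t = lambda n: n * (n - 1) / 2   # the function being bounded
g = lambda n: n * n             # the comparison function n^2

assert theta_holds(t, g, 0.5, 0.25, 2)
```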


