# DECODING TIME

TIME! It is an autonomous entity that manifests its being in the world. No one has ever conquered it to date; the big shots of the world have merely tweaked it in certain ways that made them successful in their fields. Today’s topic is time complexity, a term that seems hell-bent on making a computer science engineer’s life miserable. In the field of computer science, the concept of time complexity is crucial: a programmer must know the most efficient way an algorithm can solve a problem.

# ORDER OF GROWTH

Let’s start with the most basic and foremost topic in time complexity, i.e. order of growth. It tells us how the execution time of an algorithm depends on the length of the input, which helps us estimate the running time of a piece of code with ease. We ignore the smaller terms because they become insignificant compared to the dominant term for large inputs. Order of growth simplifies the whole process of analyzing time complexity, as it shows how an algorithm’s running time scales with the size of its input.
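To make the idea concrete, here is a small sketch (the function names `linear_ops` and `quadratic_ops` are made up for this illustration) that counts the basic operations performed by a linear loop and a quadratic nested loop. As the input grows tenfold, the linear count grows tenfold while the quadratic count grows a hundredfold, which is exactly what order of growth captures.

```python
# Count the basic operations done by a linear loop vs. a quadratic nested loop.
def linear_ops(n):
    count = 0
    for _ in range(n):      # runs n times -> order of growth is n
        count += 1
    return count

def quadratic_ops(n):
    count = 0
    for _ in range(n):      # outer loop runs n times
        for _ in range(n):  # inner loop runs n times per outer iteration -> n*n total
            count += 1
    return count

for n in (10, 100, 1000):
    print(n, linear_ops(n), quadratic_ops(n))
```

Notice that constants and smaller terms (say, a few setup statements before the loop) would barely change these counts for large `n`, which is why we drop them.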

# NOTATIONS AND LIMITING BEHAVIOR OF FUNCTIONS

O-notation:

To denote asymptotic upper bound, we use O-notation. For a given function g(n), we denote by O(g(n)) (pronounced “big-oh of g of n”) the set of functions:
O(g(n))= { f(n) : there exist positive constants c and n0 such that 0≤f(n)≤c∗g(n) for all n≥n0 }

Ω-notation:

To denote asymptotic lower bound, we use Ω-notation. For a given function g(n), we denote by Ω(g(n)) (pronounced “big-omega of g of n”) the set of functions:
Ω(g(n))= { f(n) : there exist positive constants c and n0 such that 0≤c∗g(n)≤f(n) for all n≥n0 }

Θ-notation:

To denote asymptotic tight bound, we use Θ-notation. For a given function g(n), we denote by Θ(g(n)) (pronounced “big-theta of g of n”) the set of functions:
Θ(g(n))= { f(n) : there exist positive constants c1, c2 and n0 such that 0≤c1∗g(n)≤f(n)≤c2∗g(n) for all n≥n0 }
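As a worked example of these definitions, take f(n) = 3n + 2 and g(n) = n. Choosing c1 = 3, c2 = 4, and n0 = 2 (constants picked by hand for this sketch), the Θ inequality 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) holds for all n ≥ n0, so f(n) = Θ(n); the same constants also witness f(n) = O(n) and f(n) = Ω(n). We can verify this numerically:

```python
# Check the Θ-notation inequality 0 <= c1*g(n) <= f(n) <= c2*g(n) for n >= n0,
# using f(n) = 3n + 2, g(n) = n, and hand-picked constants c1=3, c2=4, n0=2.
def f(n):
    return 3 * n + 2

c1, c2, n0 = 3, 4, 2
holds = all(0 <= c1 * n <= f(n) <= c2 * n for n in range(n0, 10_000))
print(holds)  # the inequality holds over this range
```

A numerical check like this is of course not a proof, but here the algebra is easy: 3n ≤ 3n + 2 always, and 3n + 2 ≤ 4n whenever n ≥ 2.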

=> While calculating the time complexity we usually consider the big-O notation, as it gives us the upper bound of the function, which in turn corresponds to the worst-case scenario.

# UNDERSTANDING THE NOTATIONS IN AN EASY WAY

1. O(N) — the set of functions that grow at a rate slower than or equal to that of the expression. It is commonly used to describe the worst case of an algorithm’s time complexity.
2. Ω(N) — the set of functions that grow at a rate faster than or equal to that of the expression. It is commonly used to describe the best case of an algorithm’s time complexity.
3. Θ(N) — the set of functions bounded both above and below by the expression, i.e. lying between big-O and Ω. It is commonly used to describe the average case of an algorithm’s time complexity.

I HOPE THIS BLOG WAS INFORMATIVE AND GAVE YOU A BASIC IDEA OF TIME COMPLEXITY AND HOW IT IS CALCULATED.