**Big-Omega** notation (**Ω**) denotes the lower bound of a function.

**Definition:** Suppose we have two functions **f(n)** and **g(n)**. Then **f(n) = Ω(g(n))** if and only if there exist two positive constants **c** and **k** such that **f(n) >= c * g(n)** for all **n >= k**.

For example:

Suppose **f(n) = 2 * n + 3**. Then

**2 * n + 3 >= 1 * n** for all **n >= 0**.

In the above inequality,

c = 1, k = 0, and g(n) = n
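As a quick sanity check, a small script (a sketch; the witness constants c = 1 and k = 0 are taken from the example above) can verify the inequality over a finite range of n:

```python
# Check the Big-Omega witness for f(n) = 2n + 3 with g(n) = n, c = 1, k = 0.
def f(n):
    return 2 * n + 3

def g(n):
    return n

c, k = 1, 0

# f(n) >= c * g(n) must hold for every n >= k; here we test n in [k, 1000).
assert all(f(n) >= c * g(n) for n in range(k, 1000))
print("f(n) >= c * g(n) holds for all tested n >= k")
```

A finite check like this cannot prove the asymptotic claim, but it is a handy way to catch a wrongly chosen c or k.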

Now we know that 1 < log(n) < n^(1/2) < n < n log(n) < n^2 < n^3 < … < 2^n < 3^n < … < n^n.

In the above ordering, the functions up to and including n (that is, 1, log(n), n^(1/2), and n itself) are lower bounds of f(n) = 2 * n + 3.

This symbolizes:

- f(n) = Ω(n): **True**
- f(n) = Ω(n^2): **False**
- f(n) = Ω(1): **True**, and so on…

**Finally, in Big-Omega we take the nearest (tightest) such g(n), which in this case is Ω(n).**
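The True/False claims above can be checked empirically (a sketch; it fixes c = 1 and tests a finite range of n, which suggests but does not prove the asymptotic statements):

```python
# Empirically test which candidate g(n) are lower bounds of f(n) = 2n + 3.
def f(n):
    return 2 * n + 3

# Candidate lower-bound functions g(n) from the list above.
candidates = {
    "Omega(1)":   lambda n: 1,
    "Omega(n)":   lambda n: n,
    "Omega(n^2)": lambda n: n * n,
}

c, k = 1, 1
for name, g in candidates.items():
    holds = all(f(n) >= c * g(n) for n in range(k, 10_000))
    print(name, "holds for all tested n" if holds else "fails for some n")
```

Ω(1) and Ω(n) pass, while Ω(n^2) fails as soon as n^2 overtakes 2n + 3 (at n = 4), matching the list above.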


What is the difference between a knife and an orange? Yup, two different things. The connection there is that, among its other uses, you can use a knife to peel an orange.

The big omega notation is mathematical notation used to talk about the asymptotic growth of functions. Saying that "g is Omega of f" (or writing "g ∈ Ω(f)") means that g grows asymptotically at least as quickly as f.

Big omega notation is useful to express lower bounds. The sentence "each comparison-based sort runs in Ω(n log n)" means that the (worst-case) time complexity of any comparison-based sort is a function that grows at least as quickly as the function n log n.

Among other things, you can also use this notation to talk about the best-case performance of algorithms. For example, you can say things like "on any input of size n, MergeSort will perform Ω(n log n) steps".
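To make the MergeSort claim concrete, here is a sketch that counts the comparisons a hand-written merge sort actually performs (the instrumented implementation is mine, not from the thread) and prints them next to ⌈log2(n!)⌉, the information-theoretic floor behind the Ω(n log n) bound for comparison sorts:

```python
import math
import random

def merge_sort(xs, counter):
    # Standard top-down merge sort; counter[0] tallies element comparisons.
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid], counter)
    right = merge_sort(xs[mid:], counter)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1  # one comparison between elements
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

n = 1024
counter = [0]
result = merge_sort(random.sample(range(n), n), counter)
assert result == sorted(result)
# log2(n!) is the minimum number of comparisons any comparison sort
# needs in the worst case; on a random input the observed count
# typically lands in the same ballpark.
print("comparisons:", counter[0], "| ceil(log2(n!)):",
      math.ceil(math.log2(math.factorial(n))))
```

On a random permutation the count sits near n log2(n), which is why the Ω(n log n) lower bound is tight for merge sort.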