Random Forest over Decision Tree. Why is RF better?

Let’s imagine an example: you’d like to go on a date and want to pick a restaurant for it. To make a selection, you ask a friend. Before recommending anything, your friend will ask you a few questions, such as: what type of food do you like, what is your budget, and how far are you willing to travel? Based on the answers you give, they will recommend a place.
From a machine learning perspective, the questions you were asked are nothing but features.
Now, you would probably consider asking multiple friends and then make the most suitable choice, right? And each of your friends will ask you a different set of questions; probably similar, but not the same.
This is exactly how a random forest works.

(PS: This was just an illustrative example. A human brain doesn’t consist of DT models but of neural networks, and far more complex ones.)

It depends on your problem statement. We can’t be biased towards one model for every problem statement.

A decision tree (DT) is a non-linear ML model, while a random forest (RF) is essentially a bagging technique, which is a type of ensemble model: each tree is trained on a bootstrap sample of the data (and typically considers a random subset of features at each split).
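The bootstrap step behind bagging is simple to sketch: sample the training data with replacement, once per tree, so every tree sees a slightly different dataset. This is a minimal illustration (the function name `bootstrap_sample` is my own, not a library API):

```python
import random

def bootstrap_sample(data, rng):
    """Draw a sample of the same size, with replacement (the 'bagging' step).
    Each tree in the forest would be trained on one such sample."""
    return [rng.choice(data) for _ in data]

rng = random.Random(0)       # fixed seed so the sketch is reproducible
data = list(range(10))
sample = bootstrap_sample(data, rng)
print(sample)                # same length as data; duplicates are expected
```

Because sampling is with replacement, some points appear multiple times and others not at all, which is what de-correlates the trees.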

An RF is nothing but a combination of multiple decision trees, and the model’s output is an aggregation of the individual trees’ outputs: a majority vote for classification, or an average for regression.
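The aggregation itself can be shown in a few lines. This is a toy sketch with hypothetical helper names and hard-coded tree predictions, just to make "majority vote vs. average" concrete:

```python
from collections import Counter

def aggregate_classification(tree_predictions):
    """Majority vote across the trees' predictions for one sample."""
    return Counter(tree_predictions).most_common(1)[0][0]

def aggregate_regression(tree_predictions):
    """Mean of the trees' predictions for one sample."""
    return sum(tree_predictions) / len(tree_predictions)

# Three hypothetical trees vote on the same sample:
print(aggregate_classification(["italian", "thai", "italian"]))  # italian
print(aggregate_regression([4.0, 5.0, 4.5]))                     # 4.5
```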

The performance of an RF is usually better than that of a single DT because averaging over many de-correlated trees reduces variance, and with it, overfitting.
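You can check this claim on a synthetic dataset. A sketch assuming scikit-learn is available; the dataset and hyperparameters here are arbitrary choices for illustration, not a benchmark:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# A synthetic classification problem, split into train and test sets.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

print("single tree accuracy:", tree.score(X_te, y_te))
print("random forest accuracy:", forest.score(X_te, y_te))
```

On most runs the forest scores higher on the held-out set, though on an easy or tiny dataset a single tree can tie it.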

Each one has its own advantages and disadvantages: a single DT is faster to train and easy to interpret, while an RF is typically more accurate and robust but slower and harder to explain.