What is the difference between cluster computing and grid computing?

Cluster computing is a method of minimizing downtime. In a cluster, another computer can take over the role of a failed member, so the service keeps running. A typical example is a back-end database with high-uptime requirements.
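The failover idea can be sketched in a few lines. This is a toy illustration, not a real clustering stack: the `Node` and `Cluster` classes are made up for the example, and each request is simply routed to the first node that is still alive.

```python
class Node:
    """A hypothetical cluster member that can serve requests."""
    def __init__(self, name):
        self.name = name
        self.alive = True

    def handle(self, request):
        if not self.alive:
            raise RuntimeError(f"{self.name} is down")
        return f"{self.name} handled {request}"


class Cluster:
    """Minimal failover sketch: route each request to the first live node."""
    def __init__(self, nodes):
        self.nodes = nodes

    def handle(self, request):
        for node in self.nodes:
            if node.alive:
                return node.handle(request)
        raise RuntimeError("all nodes are down")


primary = Node("primary")
standby = Node("standby")
cluster = Cluster([primary, standby])

print(cluster.handle("query-1"))  # served by primary
primary.alive = False             # simulate a failure
print(cluster.handle("query-2"))  # standby transparently takes over
```

Real clustering software adds heartbeats, shared or replicated state, and leader election, but the client-facing behavior is the same: a failure of one member does not take the service down.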

Grid computing is a method of increasing compute power by breaking a large task into small pieces that are worked on in parallel by the peers in the grid. A typical example is a render farm, where each machine renders individual frames of an animated movie.
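The divide-and-distribute pattern can be sketched with a local process pool standing in for the grid peers. This is only an analogy: `render_frame` is a made-up placeholder for an expensive per-frame job, and a real grid would dispatch the pieces over a network rather than to local processes.

```python
from multiprocessing import Pool


def render_frame(frame_number):
    """Stand-in for an expensive per-frame render job."""
    return f"frame-{frame_number}.png"


if __name__ == "__main__":
    frames = range(8)  # the big task, split into independent pieces
    with Pool(processes=4) as pool:  # workers stand in for grid peers
        rendered = pool.map(render_frame, frames)
    print(rendered)
```

The key property that makes the work "grid-friendly" is that each piece is independent, so it can be shipped to any available peer and the results merged at the end.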