A friend asked this question, and it is a common question, so let us get the definitions straight.
Grid computing is the oldest of these terms. It refers to technology that allows a number of computers to be combined into a system working together. It often involves heterogeneous environments, that is, computers of different capacities, configurations, and even different operating systems.
As an example of grid technology, look at ProActive, which is anything but simple. It is based on the book "A Theory of Distributed Objects". I have read it, and it is deep.
Cloud computing is a newer term, and it is about where you get your computers: from the cloud. You can get just one, and that is still cloud computing, but not grid computing.
Now you can see how the terms are related. Grid is the technology for organizing the work of many computers, and it may be very complex. It may be visible or hidden, but you hardly ever implement it yourself; you usually build your system on some existing grid technology. Cloud is where the computers are. For example, you may use the EC2 cloud. If you want a grid computing system, you choose a grid technology and then run it on your cloud computers.
Another example to consider is Google App Engine. You write your application (currently in Python, using Google BigTable) and deploy it to Google's network. For you, it is a cloud. Your application may need many computers, and Google will take care of that. Internally, they run a grid, but it is hidden from you.
Distributed computing is so similar to grid computing that for practical purposes they are one and the same. However, distributed computing is more concerned with breaking a computation into parts that can run concurrently and independently, or with organizing a system of computations out of many components, and less concerned with the computers it runs on.
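To make the "breaking a computation into parts" idea concrete, here is a minimal sketch in Python (my own illustration, not tied to any particular grid product): it splits a sum over a range into independent chunks and runs them concurrently. A real grid would schedule the chunks on many machines; here a local thread pool stands in for the scheduler.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """Sum the squares in one independent chunk of the range."""
    lo, hi = bounds
    return sum(n * n for n in range(lo, hi))

def distributed_sum_of_squares(limit, chunks=4):
    # Step 1: split the range into independent parts.
    step = limit // chunks
    bounds = [(i * step, (i + 1) * step if i < chunks - 1 else limit)
              for i in range(chunks)]
    # Step 2: run each part concurrently; the parts share no state,
    # so they could just as well run on different machines.
    with ThreadPoolExecutor() as pool:
        results = pool.map(partial_sum, bounds)
    # Step 3: combine the partial results.
    return sum(results)

if __name__ == "__main__":
    # Same result as computing sum(n * n for n in range(1000)) in one piece.
    print(distributed_sum_of_squares(1000))
```

The point is the shape of the solution, not the thread pool: the work is decomposed into self-contained tasks and the results are merged, and only then do you ask which computers the tasks land on.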
Computing-on-demand is more of a marketing term. It is very close to SaaS (Software as a Service). It emphasizes that you own neither your equipment nor your applications; you just rent the computing power, usually for some specific purpose. One example of computing-on-demand is SalesForce. As far as its customers are concerned, it is a hands-off CRM (Customer Relationship Management) system that they just use. Internally, however, it runs on some grid technology. And the many programmers who build their applications on SalesForce treat the platform as a cloud, because that is where they get their computing power.