Two women are selling apples. The first sells 30 apples at 2 for $1, earning $15. The second sells 30 apples at 3 for $1, earning $10. So between them they’ve sold 60 apples for $25.

The next day they set the same goal, but work together. They sell 60 apples at 5 for $2, but they’re puzzled to find that they’ve made only $24.

What became of the other dollar?

If you look at the average price of the apples on the first day, it is $25 divided by 60, or 41⅔ cents per apple.

When the women sell the apples at 5 for $2 on the second day, they are only charging 40 cents per apple. The decrease in price of 1⅔ cents per apple, over 60 apples, accounts for the missing dollar.
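The price comparison above can be checked with exact fractions; a minimal sketch (the variable names are illustrative, not from the puzzle):

```python
from fractions import Fraction

# Day 1: each woman sells 30 apples at her own rate.
day1_total = 30 * Fraction(1, 2) + 30 * Fraction(1, 3)  # $15 + $10 = $25
avg_price = day1_total / 60                             # average price per apple

# Day 2: 60 apples at 5 for $2, i.e. 40 cents apiece.
day2_total = 60 * Fraction(2, 5)                        # $24

print(avg_price)              # 5/12 of a dollar = 41 2/3 cents
print(day1_total - day2_total)  # 1 -> the missing dollar
```

Working in `Fraction` rather than floats keeps the ⅔-cent difference exact.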

Here is another way to look at it. On the second day, every time 5 apples are sold for $2, assume the first woman takes $1 for 2 of her apples and the second woman takes $1 for 3 of hers.

This way the first woman sells 24 apples and the second sells 36, with each woman taking in $12.

This means that 6 of the first woman's apples have effectively been sold at the second woman's rate, at a loss of $1. Had the first woman sold those 6 apples at her own rate of 2 for $1, they would have brought in $3; sold at the second woman's rate of 3 for $1, they bring in only $2. $3 – $2 = $1, the loss compared to the first day.
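The transfer argument can also be verified directly; a short sketch under the same assumed batch split (2 apples from the first woman and 3 from the second in each 5-for-$2 batch):

```python
from fractions import Fraction

rate1 = Fraction(1, 2)  # first woman: 2 apples for $1 -> 1/2 dollar per apple
rate2 = Fraction(1, 3)  # second woman: 3 apples for $1 -> 1/3 dollar per apple

# Over 12 batches the first woman sells 24 apples, the second 36,
# so 6 of the first woman's 30 apples are sold at the second rate.
transferred = 6
loss = transferred * rate1 - transferred * rate2  # $3 - $2

print(loss)  # 1 -> exactly the missing dollar
```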