Coins have a value of c^i, for every i ≥ 0, for some integer constant c > 1. For example, if c = 3, you have coins with values 1 (= 3^0), 3, 9 (= 3^2), 27, ...
You need to design an algorithm that, given an integer value n, makes change for n using the fewest coins. You can assume there is an unlimited supply of coins of each denomination c^i with c^i ≤ n.
I've come up with the greedy algorithm, but I'm stuck on this question:
Show that any optimal solution has at most c − 1 coins of value c^i, for any i.
I get it, and I see how it works, but I don't know how to verbalize/show it. Can someone please point me in the right direction?
Suppose for contradiction that there is an optimal solution s* that uses more than c − 1 coins of value c^i for some i, i.e., at least c such coins. Then you can replace c coins of value c^i with a single coin of value c^{i+1}, thereby reducing the number of coins by c − 1 ≥ 1. But then s* cannot be an optimal solution. A contradiction.
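To see the claim concretely: the greedy algorithm here amounts to writing n in base c, and each base-c digit is at most c − 1, so no coin value is ever used c or more times. A minimal sketch (the function name `greedy_change` is my own, not from the question):

```python
def greedy_change(c, n):
    """Greedy change-making with denominations c^0, c^1, c^2, ...

    Returns a dict mapping coin value -> count. Each count is a
    base-c digit of n, hence at most c - 1, matching the claim.
    """
    coins = {}
    # Find the largest power of c that does not exceed n.
    p = 1
    while p * c <= n:
        p *= c
    # Take as many coins of each power as fit, largest first.
    while n > 0:
        q, n = divmod(n, p)  # q is a base-c digit, so q <= c - 1
        if q:
            coins[p] = q
        p //= c
    return coins


print(greedy_change(3, 10))  # uses coins 9 and 1
```

For c = 3 and n = 10 this returns {9: 1, 1: 1} (two coins), and for any input every count stays below c, which is exactly the exchange-argument bound.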