I had my first computing module class yesterday, and it was quite fun. Just some pseudocode, so I can't say it was really fun :p
Anyway, I just learned that in C, when an integer is divided by an integer, the result is an integer and the decimal part is thrown away.
For example, 375/100 would normally be 3.75, but in C, if both operands are integers, the result is 3. How is this useful?
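Before getting to that, here's a tiny C snippet just to see the truncation in action (it only prints the example values above):

    #include <stdio.h>

    int main(void)
    {
        printf("%d\n", 375 / 100);   /* integer division: prints 3, the .75 is dropped */
        printf("%f\n", 375.0 / 100); /* floating-point division: prints 3.750000 */
        return 0;
    }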
The given question was to count the least number of coins needed to make up an amount of money. Say it's $3.75, and we have $1, 50c, 20c, 10c, 5c and 1c coins; then we would need a minimum of 6 coins to make it up (3x $1, 1x 50c, 1x 20c, 1x 5c).
The algorithm is to count with the biggest coin value first. Then, how do we get the remaining value? First, convert the money into cents (in this case 375). Divide by 100 and you get 3, meaning 3 coins of $1 are needed; the decimal part is thrown away. To get the remaining value, use the modulo operator: 375 % 100 = 75. Yay!
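Putting it together, here's a rough sketch of that counting idea in C (the coin array and variable names are my own, not from the class):

    #include <stdio.h>

    int main(void)
    {
        /* coin values in cents, biggest first */
        int coins[] = {100, 50, 20, 10, 5, 1};
        int n = sizeof(coins) / sizeof(coins[0]);

        int amount = 375;  /* $3.75 in cents */
        int total = 0;

        for (int i = 0; i < n; i++) {
            int count = amount / coins[i]; /* how many of this coin fit (decimals thrown away) */
            amount = amount % coins[i];    /* remaining value after using those coins */
            total += count;
            if (count > 0)
                printf("%d x %dc\n", count, coins[i]);
        }

        printf("Minimum coins: %d\n", total); /* prints 6 for 375 */
        return 0;
    }

Each pass through the loop uses integer division to count coins of one value, and the remainder carries over to the next, smaller coin.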