user24580807245 - 1 year ago 106

Python Question

I am tasked with writing a program that takes a monetary amount and finds the minimum number of coins needed to make that amount. Here is my code.

```python
import math

n1 = eval(input("Enter a monetary amount: "))
n1 = n1 * 100

dollars = 0
quarters = 0
dimes = 0
nickels = 0
pennies = 0

dollars = n1 / 100
n1 %= 100
quarters = n1 / 25
n1 %= 25
dimes = n1 / 10
n1 %= 10
nickels = n1 / 5
n1 %= 5
pennies = n1

print(int(dollars), int(quarters), int(dimes), int(nickels), int(pennies))
```

Whenever I enter a number that needs nickels, it doesn't count them. For example, the output for 1.05 is

`1 0 0 0 0`

and the output for 1.15 is

`1 0 1 0 4`

Any hints would be appreciated, thanks.

Edit: fixed a typo that I had; the code is still not working as intended, though.

Answer

Looks like a typo: `nickels` vs. `nickles`.

Edit: now that you've fixed the typo, it looks like it's definitely a rounding issue. Since you're converting from dollars to a whole number of cents, turn the value into an integer before doing any of the division and modulo operations. Change your `n1 = n1 * 100` line to `n1 = int(round(n1 * 100))`. I tried this out on my computer and it seemed to work.
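To make the rounding issue concrete: many decimal amounts have no exact binary floating-point representation, so `1.15 * 100` comes out slightly below 115, and truncating with `int()` silently drops a cent. Rounding to an integer first avoids that. Below is a minimal sketch of the whole approach; the `count_coins` helper name is made up for illustration, and it uses `divmod` in place of the separate `/` and `%=` steps, which does the same thing more compactly:

```python
def count_coins(amount):
    # Convert dollars to an exact integer number of cents first.
    # Without round(), 1.15 * 100 evaluates to 114.99999999999999
    # and int() would truncate it to 114, losing a penny.
    cents = int(round(amount * 100))

    # divmod(a, b) returns (a // b, a % b) in one step.
    dollars, cents = divmod(cents, 100)
    quarters, cents = divmod(cents, 25)
    dimes, cents = divmod(cents, 10)
    nickels, pennies = divmod(cents, 5)
    return dollars, quarters, dimes, nickels, pennies

print(count_coins(1.15))
print(count_coins(1.05))
```

With the integer conversion in place, 1.15 yields one dollar, one dime, and one nickel, and 1.05 yields one dollar and one nickel, instead of the outputs shown in the question.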

Source (Stack Overflow)