I've made a super simple script to pop out some hourly rates from a pool of tips. Thing is, this one specific result always comes out wrong. What the heck is going on?
var tips = prompt('Enter final tips after payouts and cleaning');
//Hours worked for both positions
var tendHrsFirst = 11;
var tendHrsSecond = 10;
var barThourly = ((tips/(tendHrsFirst+++tendHrsSecond)));
//This result here always comes out as if tendHrsFirst is 12 and not 11.
var barToneTotal = (tendHrsFirst * barThourly);
//This result is always correct
var barTtwoTotal = (tendHrsSecond * barThourly);
You are incrementing with tendHrsFirst++, so it actually is 12. That +++ is really two operators: tendHrsFirst++ followed by a plain +. The post-increment bumps tendHrsFirst by 1, and afterwards you add both numbers. The division itself still sees the old value (11), but by the time barToneTotal is calculated, tendHrsFirst is already 12. Not sure why you think that's a good idea. Cleaning up your code (whitespace around operators) should help avoid such mistakes.
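Something like this sketch should behave the way you expect. I've kept your variable names; the Number() conversion is just a suggestion, since prompt() returns a string and you probably want a real number for the math:

// prompt() returns a string, so convert it explicitly (suggestion, not required for division)
var tips = Number(prompt('Enter final tips after payouts and cleaning'));
// Hours worked for both positions
var tendHrsFirst = 11;
var tendHrsSecond = 10;
// Plain addition, no ++ anywhere, so tendHrsFirst stays 11
var barThourly = tips / (tendHrsFirst + tendHrsSecond);
// Share for each position
var barToneTotal = tendHrsFirst * barThourly;
var barTtwoTotal = tendHrsSecond * barThourly;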