kdweber89 - 1 year ago 68

Javascript Question

I've looked and haven't found a simple question and answer on Stack Overflow about finding the average of an array.

This is the array that I have

`var grades = [80, 77, 88, 95, 68];`

I first thought that the answer to this problem would be something like this:

```
var avg = (grades / grades.length) * grades.length;
console.log(avg);
```

However this gave me an output of NaN.

So then I tried this:

```
for (var i = 0; i < grades.length; i++) {
  var avg = (grades[i] / grades.length) * grades.length;
}
console.log(avg);
```

This gave me an output of 68. (I'm not sure why).

So I have two questions: 1. Why was my output 68? 2. Could somebody help me actually find the average of an array?


Answer Source

You calculate an average by adding all the elements and then dividing by the number of elements.

```
var total = 0;
for (var i = 0; i < grades.length; i++) {
  total += grades[i];            // accumulate the sum of all elements
}
var avg = total / grades.length; // divide the sum by the count
```

The reason you got 68 is that your loop overwrites `avg` on every iteration, so the final value is the result of the last calculation. And since your division and multiplication by `grades.length` cancel each other out, `avg` ends up equal to the last element, `grades[4]`, which is 68. Your first attempt printed `NaN` because dividing an array by a number coerces the array to the string `"80,77,88,95,68"`, which cannot be converted to a number.
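If you prefer to avoid the explicit loop, the same sum-then-divide idea can be written with `Array.prototype.reduce` — a sketch using your `grades` array:

```javascript
var grades = [80, 77, 88, 95, 68];

// reduce folds the array into a single value: here, the running sum
// starting from 0; dividing by the length then gives the average.
var avg = grades.reduce(function (sum, grade) {
  return sum + grade;
}, 0) / grades.length;

console.log(avg); // 81.6
```

Both versions do the same arithmetic; `reduce` just expresses "add up all the elements" in one expression.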