newman - 6 months ago
Javascript Question

Issue trying to find length of sequence in an array

I'm trying to find the length of the sequence in an array between the first and the second occurrence of a specified number.

For Example:

lengthOfSequence([0, -3, 7, 4, 0, 3, 7, 9], 7)
would return 5, because the sequence from the first occurrence of 7 (index 2) through the second occurrence (index 6) spans 5 indices.

I feel like the code I have written should work, but after console logging it, it looks as if my method is only pushing the first index to my indexes array variable, and it's pushing it twice. Why would this be happening?

Here is my code for context:

var lengthOfSequence = function (arr, n) {

    var indexes = [];

    for (var i = 0; i < arr.length; i++) {
        if (arr[i] === n) {
            indexes.push(arr.indexOf(arr[i]));
        }
    }

    return arr.indexOf(indexes[1]) - arr.indexOf(indexes[0]);
}

So, for example, if I use my array from earlier, lengthOfSequence([0, -3, 7, 4, 0, 3, 7, 9], 7), my for loop would find the first occurrence of 7 (index 2) and push it to my indexes array variable, but it would just do it twice. So my indexes array would just be [2, 2]. Why would it not be [2, 6]?


indexOf does not do what you think it does. It returns the index of the first item it finds with the provided value. For both occurrences of 7 in the array, it returns that same first index, 2.
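You can see this directly in a console, using the array from the question:

```javascript
var arr = [0, -3, 7, 4, 0, 3, 7, 9];

// indexOf always scans from the front, so it reports the FIRST match:
console.log(arr.indexOf(7));      // 2
console.log(arr.indexOf(arr[6])); // also 2, even though arr[6] is the 7 at index 6
```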

Since you want the index only and you are already iterating over the array with your loop, you can simply use i itself:
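A completed version along those lines might look like this; note the + 1 is an addition so the count includes both endpoints, matching the expected result of 5 from the question:

```javascript
var lengthOfSequence = function (arr, n) {
    var indexes = [];

    for (var i = 0; i < arr.length; i++) {
        if (arr[i] === n) {
            indexes.push(i); // i already IS the index of this occurrence
        }
    }

    // indexes[0] and indexes[1] are the first and second occurrence;
    // + 1 counts both endpoints, giving 5 for the example above.
    return indexes[1] - indexes[0] + 1;
};

console.log(lengthOfSequence([0, -3, 7, 4, 0, 3, 7, 9], 7)); // 5
```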