Is it ever possible that (a == 1 && a == 2 && a == 3) could evaluate to true in JavaScript?

Question:

Is it possible that (a == 1 && a == 2 && a == 3) can evaluate to true?

This is an interview question asked by a large technology company. I'm trying to find the answer. I know we would never write code like this in day-to-day work, but I'm curious.

Answer:

It is possible if a is an object whose valueOf method returns a value that is incremented each time the object is coerced to a primitive, as in the example below:

const a = {
  valor: 1,
  valueOf: function() {
    // called whenever a is coerced to a primitive, e.g. by the == operator
    return a.valor++;
  }
};

On the first coercion of a, valueOf returns 1. After that, each time a is coerced again, it returns the previous value plus 1 (1, 2, 3…).

Hence, (a == 1 && a == 2 && a == 3) will be true, because the three comparisons coerce a and receive 1, 2, and 3 in turn.

When a == 1 is evaluated, a coerces to 1 (true) and a.valor changes to 2, so a == 2 is also true, and so on.
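The incrementing behavior can be observed directly by coercing a with Number() between comparisons (a small runnable sketch of the same object as above):

```javascript
const a = {
  valor: 1,
  valueOf: function () {
    // each coercion consumes the current value and increments it
    return a.valor++;
  }
};

console.log(Number(a)); // 1 — first coercion
console.log(Number(a)); // 2 — second coercion
console.log(a == 3);    // true — third coercion yields 3
```

Each explicit Number() call triggers valueOf just like == does, so the third comparison sees 3.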

To illustrate further, I could also write it this way, with the same result:

const a = {
  valor: 1,
  valueOf: function() {
    return a.valor++;
  }
};

if (a == 1 && a == 2 && a == 3 && a == 4 && a == 5) {
  // even this "a" in the console.log below increments the value
  console.log("done. The value of a is " + a);
}
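A different solution sometimes given for this interview question uses a getter instead of valueOf, so the increment happens on property access rather than on coercion. This is a sketch, not from the answer above; the names obj and value are illustrative assumptions:

```javascript
// hypothetical wrapper object: reading obj.a runs the getter each time
const obj = {
  value: 1,
  get a() {
    // every read of obj.a returns the current value and increments it
    return this.value++;
  }
};

console.log(obj.a == 1 && obj.a == 2 && obj.a == 3); // true
```

The trade-off: the getter approach requires accessing a as a property (obj.a), while the valueOf approach works with a bare variable but only fires on primitive coercion.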

There is a similar question on SOen (Stack Overflow in English).
