helion3 - 2 months ago

JavaScript Question

I'm generating an isometric tile map using a diamond pattern:

```
var tileWidth = 128;
var tileHeight = 94;
var tileWidthHalf = tileWidth / 2;
var tileHeightHalf = tileHeight / 2;

for (var x = 0; x < rows; x++) {
    for (var y = 0; y < cols; y++) {
        var screenX = (x - y) * tileWidthHalf;
        var screenY = (x + y) * tileHeightHalf;
        drawTile(screenX, screenY);
    }
}
```

This renders correctly, but now I'm having trouble converting screen coordinates (mouse location) back to the isometric coordinates.

I've tried reversing the math:

```
var x = _.floor(screenY / (tileWidth / 2) - (screenX / tileWidth / 2));
var y = _.floor(screenY / (tileHeight / 2) + (screenX / tileHeight / 2));
```

It works fine for the `0, 0` tile, but I'm unable to come up with the right math for the rest. Am I missing something trivial, or am I just all wrong about the process?

Answer

I don't see how you came up with this solution. You have to solve the system of equations for `x` and `y`, which gives the following:

```
x = 0.5 * ( screenX / tileWidthHalf + screenY / tileHeightHalf)
y = 0.5 * (-screenX / tileWidthHalf + screenY / tileHeightHalf)
```
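These formulas can be sanity-checked against the forward mapping with a short round-trip sketch (variable names follow the question's code; the concrete tile sizes are the ones from the question):

```javascript
var tileWidthHalf = 64;   // tileWidth / 2
var tileHeightHalf = 47;  // tileHeight / 2

// Forward mapping, as in the question's render loop.
function tileToScreen(x, y) {
  return {
    screenX: (x - y) * tileWidthHalf,
    screenY: (x + y) * tileHeightHalf
  };
}

// Inverse mapping from the formulas above.
function screenToTile(screenX, screenY) {
  return {
    x: 0.5 * ( screenX / tileWidthHalf + screenY / tileHeightHalf),
    y: 0.5 * (-screenX / tileWidthHalf + screenY / tileHeightHalf)
  };
}

// Round trip: tile (3, 5) -> screen -> tile (3, 5).
var s = tileToScreen(3, 5);
var t = screenToTile(s.screenX, s.screenY);
```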

If you need the tile index, use `floor` as in your code.

I can only guess what your alignment of the tiles in the coordinate system looks like, but from the screenshot you posted in the comments, I assume that you need to swap `screenX` with `(screenX - tileWidthHalf)` to get accurate values.
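Putting the half-tile offset and `floor` together, a hit-test helper might look like the sketch below. Note that the offset is an assumption about how your tiles are anchored, as described above; `mouseToTileIndex` is a hypothetical name, not something from your code:

```javascript
var tileWidthHalf = 64;   // tileWidth / 2
var tileHeightHalf = 47;  // tileHeight / 2

// Convert a mouse position to integer tile indices, shifting the x
// coordinate by half a tile width (assumed alignment, see above).
function mouseToTileIndex(mouseX, mouseY) {
  var sx = mouseX - tileWidthHalf;
  return {
    x: Math.floor(0.5 * ( sx / tileWidthHalf + mouseY / tileHeightHalf)),
    y: Math.floor(0.5 * (-sx / tileWidthHalf + mouseY / tileHeightHalf))
  };
}
```

A click at `(tileWidthHalf, tileHeightHalf)`, i.e. inside the first diamond under this alignment, then resolves to tile `(0, 0)`.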

Source (Stack Overflow)
