JavaScript Question

How to use Uint8Array, Uint16Array, Uint32Array

I recently started using WebGL and I am trying to understand the difference between Uint8Array, Uint16Array, and Uint32Array, and how you would use them. I found some information about them here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array but can anyone explain the difference between them and how you would use them?

Answer

The Uint*Array constructors create typed arrays (as @zfor commented, they are not regular arrays, so there is no push method, for example) that hold unsigned integers only. The difference between them is how many bytes each element occupies in memory: a Uint8Array element is 1 byte, so the largest value it can hold is 255; a Uint16Array element is 2 bytes, so the limit is 65535; a Uint32Array element is 4 bytes, so the limit is 4294967295.
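A quick sketch to illustrate the element sizes and limits (BYTES_PER_ELEMENT is part of the standard typed array API):

var u8  = new Uint8Array(1);
var u16 = new Uint16Array(1);
var u32 = new Uint32Array(1);

console.log(u8.BYTES_PER_ELEMENT);  // 1
console.log(u16.BYTES_PER_ELEMENT); // 2
console.log(u32.BYTES_PER_ELEMENT); // 4

u8[0]  = 255;        // largest value a Uint8Array element can hold
u16[0] = 65535;      // largest value for a Uint16Array element
u32[0] = 4294967295; // largest value for a Uint32Array element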

When constructing a Uint*Array, you pass the array length as the first argument:

var arr = new Uint8Array(1); // one element, initialized to 0

If you pass an array, a buffer, or another typed array instead, the constructor turns its contents into a Uint*Array; values outside the element's range wrap around:

var arr = new Uint8Array([10, 257]);
console.log(arr[0]); // 10
console.log(arr[1]); // 1 (same thing: 257 % 256)
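Passing an ArrayBuffer creates a view over the same memory rather than a copy, so different Uint*Array views see the same bytes. A small sketch; the byte values shown assume a little-endian platform, which is what most machines use:

var buffer = new ArrayBuffer(4);      // 4 bytes of raw memory
var bytes  = new Uint8Array(buffer);  // 4 elements, 1 byte each
var words  = new Uint16Array(buffer); // 2 elements, 2 bytes each

words[0] = 258;        // 258 = 0x0102
console.log(bytes[0]); // 2 (low byte, on little-endian machines)
console.log(bytes[1]); // 1 (high byte)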

Now, some more examples of how values wrap:

arr[0] = 256;
console.log(arr[0]); // 0 (256 % 256)

arr[0] = 255;
console.log(arr[0]); // 255 (within range, stored as-is)
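Since you mention WebGL: a common use for these types is index buffers, where the typed array you choose determines the type you pass to drawElements. A minimal sketch, assuming a WebGL context gl has already been created from a canvas and the vertex data is already set up; Uint16Array indices pair with gl.UNSIGNED_SHORT (Uint8Array pairs with gl.UNSIGNED_BYTE, and Uint32Array requires the OES_element_index_uint extension in WebGL 1):

// Hypothetical triangle indices; Uint16Array can address up to 65536 vertices.
var indices = new Uint16Array([0, 1, 2]);

var indexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, indices, gl.STATIC_DRAW);

// The type here must match the typed array uploaded above.
gl.drawElements(gl.TRIANGLES, indices.length, gl.UNSIGNED_SHORT, 0);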