skeggse - 4 days ago
JavaScript Question

Using HTML5 WebGL Shaders for Computation

It seems to me that one could, in theory, use WebGL for computation, such as computing primes or π or something along those lines. However, from what little I've seen, the shader itself isn't written in JavaScript, so I have a few questions:


  1. What language are the shaders written in?

  2. Would it even be worthwhile to attempt to do such a thing, taking into account how shaders work?

  3. How does one pass variables back and forth at runtime? Or, if that isn't possible, how does one pass information back after the shader finishes executing?

  4. Since it isn't JavaScript, how would one handle very large integers (like BigInteger in Java, or a ported version in JavaScript)?

  5. I would assume this automatically compiles the script so that it runs across all the cores in the graphics card; can I get a confirmation?



If relevant, in this specific case, I'm trying to factor fairly large numbers as part of a [very] extended compsci project.

EDIT:


  1. WebGL shaders are written in GLSL.
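
For context, here is a minimal sketch of what a fragment shader looks like from the JavaScript side, assuming a WebGL context `gl`; the uniform name `resolution` and the `compileShader` helper are illustrative, not part of any particular library:

```javascript
// A minimal GLSL fragment shader, held in a JavaScript string as WebGL
// expects. It runs once per output pixel; each pixel here just encodes
// its own normalized coordinates, but any per-pixel arithmetic could go
// in its place.
var fragmentShaderSource =
  'precision highp float;\n' +
  'uniform vec2 resolution;\n' +  // illustrative uniform: output size in pixels
  'void main() {\n' +
  '  vec2 p = gl_FragCoord.xy / resolution;\n' +
  '  gl_FragColor = vec4(p, 0.0, 1.0);\n' +  // the RGBA output is the "result"
  '}\n';

// Compiling it follows the standard WebGL pattern:
function compileShader(gl, type, source) {
  var shader = gl.createShader(type); // e.g. gl.FRAGMENT_SHADER
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader));
  }
  return shader;
}
```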


Answer

There's a project currently being worked on to do pretty much exactly what you're describing: WebCL. I don't believe it's live in any browsers yet, though.

To answer your questions:

  1. Already answered, I guess!
  2. Probably not worth doing in WebGL. If you want to play around with GPU computation, you'll probably have better luck doing it outside the browser for now, as the toolchains are much more mature there.
  3. If you're stuck with WebGL, one approach might be to write your results into a texture and read that back (see the read-back sketch after this list).
  4. With difficulty. Much like CPUs, GPUs can only work natively with values of certain sizes, and everything else has to be emulated (the limb-arithmetic sketch below shows the general idea).
  5. Yep.
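
To make item 3 concrete, here is a rough sketch of the read-back step, assuming the shader has already drawn its results into the currently bound framebuffer; `width`, `height`, and the packing convention in `decode` are assumptions for illustration, not fixed API:

```javascript
// Read the RGBA bytes of the render target back into JavaScript.
// width/height are placeholders for the render target's dimensions.
var pixels = new Uint8Array(width * height * 4); // 4 bytes per RGBA pixel
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

// Decode one 32-bit result from the four 8-bit channels of pixel n.
// This packing is just one possible convention; the shader must have
// written its output the same way.
function decode(n) {
  var i = n * 4;
  return pixels[i] +
         pixels[i + 1] * 256 +
         pixels[i + 2] * 65536 +
         pixels[i + 3] * 16777216;
}
```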
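
And on item 4, the usual emulation strategy, on a GPU as on a CPU, is to represent a big number as an array of small limbs and propagate carries yourself. A minimal JavaScript sketch of the idea (the 16-bit limb size is an arbitrary choice for illustration):

```javascript
// Limb-based big-number addition: each number is an array of base-65536
// digits, least significant first. 16-bit limbs leave plenty of headroom
// for carries within ordinary double-precision arithmetic.
var BASE = 65536;

function add(a, b) {
  var result = [];
  var carry = 0;
  for (var i = 0; i < Math.max(a.length, b.length) || carry; i++) {
    var sum = (a[i] || 0) + (b[i] || 0) + carry;
    result.push(sum % BASE);
    carry = Math.floor(sum / BASE);
  }
  return result;
}

// Example: 70000 is [4464, 1] (1 * 65536 + 4464), and
// add([4464, 1], [4464, 1]) yields [8928, 2], i.e. 140000.
```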