Gpu.js – GPU Accelerated JavaScript (gpu.rocks)
321 points by olegkikin on July 14, 2017 | 77 comments



From their post-hackathon devpost at https://devpost.com/software/gpu-js,

What I learned: Not to make a compiler during a hackathon.


goldblum moment


I've built an animated raytracer using this library.

http://raytracer.crypt.sg

The compilable JavaScript subset is still fairly limited, so vector operations are quite painful. Beyond that, it's a great library to start with for parallel computation in the browser if you have no background in WebGL/GLSL.
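For anyone curious what that subset looks like in practice, here is a minimal sketch along the lines of the matrix-multiply example from the gpu.js docs (treat the exact method names as illustrative; they have shifted between versions, e.g. setOutput vs. dimensions). Kernels are plain JS functions indexed via this.thread, and everything is scalars and array lookups, which is part of why vector math gets verbose:

    // assumes gpu.js has been loaded, exposing the GPU constructor
    const gpu = new GPU();

    // 512x512 matrix multiply; each kernel invocation computes one output cell
    const multiplyMatrix = gpu.createKernel(function (a, b) {
      let sum = 0;
      for (let i = 0; i < 512; i++) {
        sum += a[this.thread.y][i] * b[i][this.thread.x];
      }
      return sum;
    }).setOutput([512, 512]);

    // build two 512x512 input matrices of random numbers
    const makeMatrix = () =>
      Array.from({ length: 512 }, () => Array.from({ length: 512 }, Math.random));
    const c = multiplyMatrix(makeMatrix(), makeMatrix()); // c[y][x] = dot product of row y and column x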


I love this, but the occasional stutter makes my GC detector tingle :)


It reminds me of the path-traced online game that I made three years ago :) http://powerstones.ivank.net/


Nice but please, at least, put an audio off button :)


One thing that might be helpful is if you linked to the Github repo on the site: https://github.com/gpujs/gpu.js. Especially for a project like this, high-performance computing, I'm gonna want to look at the code before I trust it.


There's a github corner-triangle-tabby-thingy in the top right corner of the page. Took me a moment of exasperation to spot, granted.


I saw this a few months back when I was researching GPGPU implementations in the browser. It looked like one of the better projects out there. Do you know how much theoretical GPU allocation you can get from a JS implementation vs say a native wrapper?

Worth noting that Clarifai just released an SDK for offline mobile DL training/evaluation. Not browser based but I'd be curious what the difference in GPU utilization is practically.


Is JS on the GPU (WebGL, I assume) already efficient enough to make the concept of "computation time instead of banners" finally viable?


This will never become a thing. If my phone heats up and dies after twenty minutes because the news site I'm reading is crunching numbers on my phone's GPU, I am (and most other folks on HN are) going to be VERY salty. Compute time on user machines isn't free, and users will notice.

On top of that, display advertising pays when the user sees the ad. Coin mining only pays when the user has completed enough useful work to send results back to the server. The odds of that happening for a huge number of websites are very low.


Doesn't have to be 100% CPU calculations.


I've never come across this concept before. What is it? Is it something like, to see the page content, you must have a key which is either time-consuming to compute, or is just given to you if you first view an ad?


It's the use of hashcash as money, instead of making users see ads.

https://en.wikipedia.org/wiki/Hashcash


I would guess that some people imagine a business model for websites where the user pays with computational power. Using GPU would be a way to make this worthwhile.


the beginning of the end of adblock :(


Ad blocking will always work as long as: a) ad networks insist on having users connect to their servers and b) users have the ability to blacklist said servers.


If you treat it like a proof-of-work system, it rather depends on c): making sure that search engines can still index your content. Because if that is not a requirement, it's actually rather trivial to defeat any ad blocker: pose a computational problem that takes x seconds to compute on average hardware, show the ad while the computation is going on, and only serve the content once the client sends back the answer. Sure, blockers can still block the ad itself, but it will just make the user stare at a blank screen.. or more likely lose him forever ;).
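For concreteness, here is a rough sketch of what such a gate could look like on the client side; the challenge string, difficulty, and endpoint are all hypothetical, and none of this comes from gpu.js or any real ad network. The client brute-forces a nonce whose hash has enough leading zero bits, then trades it for the content:

    // hypothetical proof-of-work gate: find a nonce whose SHA-256 digest of
    // (challenge + nonce) starts with at least `difficultyBits` zero bits
    async function solveChallenge(challenge, difficultyBits) {
      const encoder = new TextEncoder();
      for (let nonce = 0; ; nonce++) {
        const digest = new Uint8Array(
          await crypto.subtle.digest('SHA-256', encoder.encode(challenge + nonce))
        );
        let zeros = 0;
        for (const byte of digest) {
          if (byte === 0) { zeros += 8; continue; }
          zeros += Math.clz32(byte) - 24; // leading zero bits within this byte
          break;
        }
        if (zeros >= difficultyBits) return nonce;
      }
    }

    // the page would show the ad while this runs, then exchange the nonce for content:
    // solveChallenge('server-issued-string', 20).then(nonce => fetch('/content?nonce=' + nonce));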


One can always disable WebGL.


My point is it could enforce a wait time. Disabling WebGL only makes it worse, as you'd have to wait much longer for the solution.


Mine bitcoins, see content.


Unfortunately I don't see this option in Safari 11.0 ...


Cool! Shameless self-promotion: I wrote a similar thing for a blog post.

travelling salesman in js+gpu: https://amoffat.github.io/held-karp-gpu-demo/

gpgpu.js: https://github.com/amoffat/gpgpu.js


Brilliant and thank you for posting it here!

This is just the example and encouragement I needed to get my friend interested in what I have been thoroughly absorbed by these past months, having so far failed to impress upon my buddy how much he has been missing.

(Dude is my best friend in the business and, until an amicable split recently, my partner in a number of projects. His latest is precisely what I know your effort will help him figure out now: a delivery-related app which is kind of a consumer breakaway from the last thing we worked on together.

As cheesy as this is, I have to say I am very grateful for your provision of a great chance to get my friend a good direction, right when I have been afraid for him. My best bud he is, but not a hacker [edit] of the modern mold, and his glory days are from 40 years ago banging out mainframe clients. (Cool though: he sold a DB2 front end on the first IBM PC, which was a tour de force of memory management, caching the data in a perfect scheme to create a floppy copy of the query results from partial tables.

Example job was what I would use an Excel pivot table for today: flip sales orders against regions and tally sales rep targets, margins and success rates. Querying the big iron is not happening, it's shared and hosted and the computation cycles cost (I forget the term for IBM capacity charges), so the job is to juggle the PC as input terminal for the sales office secretary inputting the work rate stats, write local copies of the input and raw query results, and pull up missing data from the mainframe to grab the entries from other sales offices and, e.g., if they sold to a new customer in a new city, get any other results which are sales to the locale by other departments, as they were only measured by senior management by the margin made and terms of supply length.

To be clear, the different offices were different departments and not restricted by geography, but knowing that they were more profitable than the other departments was a key factor in bonuses and morale. A big company, so this could be a good deal of data. Only if the corporation already had customers in an area did bonuses get affected by relative performance, because they knew they would grow their market after sunk costs were eaten, and they advertised the more profitable products first and piggy-back sold off that as a matter of policy. This meant that the exact price of a sale could be critical to the departments, and securing advertising for their business lines was a significant advantage. Once the company had established enough gross sales overall it applied a less biased marketing policy, but the idea was to increase the incentives for people to both keep their prices high and to open new geographical areas. Since the advertising affected the margins quarterly, the department my buddy worked for later got him to work on the ability to target customers in areas which had trade-magazine circulation in the highest densities, to increase the efficiency and also leave the other departments to less efficient advertising deals.

Anyhow, this was a rolling process of growing a new database continuously, and it was a further tool for departmental management to book sales at the most advantageous times, which meant that both their copy and the mainframe were only eventually consistent. I'm told that the hardest part was that, because effectively the entire storage capacity of this PC was in use and getting a scratch space would bust the performance horribly, everything was sequenced down to the disk spinning, sending the inputs and query results to their final location in discrete actions, with no later sorting. I have only the stories, and I was in school when my friend got this job, but I love the balance of the specification he had to deliver.

What's cool is my buddy got the job on spec; they didn't believe him because they considered him a kid, he was barely graduated and visited the business because his dad was a supplier of theirs. And his pitch was basically "I can make a database do anything you want but think is impossible, hire me!"
His pay was nominal because of the speculative pitch, like $500, but he got a new PC and the expansion as part of his deal, and so got himself bootstrapped. At 20, in Chicago, in like 1982. I can only imagine how that must have felt...

But I don't think modern Web stacks are a world my friend can come to terms with. I've struggled, and I'm still in my youthful 40s. I'm trying to get him into the whole WebAssembly thing, which I think is the best thing to happen this century. Time to get rid of all the JS implementation grief and love the primitives. I like how it started with a Scheme and has devolved to be almost a pure lambda layer. I have such admiration for my friend, but what keeps getting between us is that he is an amazing salesman who has lost no energy in all these years, yet the way it is just not clicking for him is a showstopper every time to getting things shipped. We have a long friendship, and I just feel that he feels he has lost a touch without which he fears that things would get out of his grasp to fix. I'm secretly writing a parallel app to the same timetable as his, hell-bent on filling his place in the likely event that he is unable to get the perfection he needs to show his stuff. I'm not giving up on the guy; your self-promotion is my absolute pleasure, thanks again for posting!)


Things like this make me fear banner ads mining bitcoins (not that they aren't already).


--- Sorry if this is too far off topic --- It strikes me as funny that the Brave browser purposefully blocks all ads by default, seemingly without much negative consequence for the end user -- not directly, anyway.

edit: Point being, the browser that capitalizes on blockchain virtual currencies/tokens would purposefully block such a sneaky device.

Companies who want to advertise through the browser's ecosystem have to purchase the company's tokens on the Ethereum public chain, and the end user has to OK viewing ads on sites they visit regularly in order for those publishers to see returns; beyond that, the user doesn't have to OK anything at all. The browser capitalizes on publishers/content providers who don't want to lose ad views, and the user base grows because of the recent and growing public desire for privacy and dismissal of unwanted rampant advertising that is sometimes quite inappropriate.

----

All that said, I'm quite excited to try this out, too.

Due to my regular dev workflow in JS, I'm looking forward to it being published to npm, so it can more easily be used as a module and bundled for the odd front-end project/experiment.


I'd think pegging CPU/GPU at 99% would be even more of a turn off than banner ads. Especially on mobile where it'd drain your battery.

Interesting thought though. I wonder if Windows or MacOS had some hidden process that did bitcoin mining at ~5% CPU, how much that would be worth worldwide.


Okay, Bill Gates alone is worth more than twice all the bitcoin ever mined. Apple is worth almost 20x that. So, per risk/reward, probably not a great strategy for either company. https://howmuch.net/articles/worlds-money-in-perspective.


Please adjust your equations for time.


Adjusting to the Gates or Apple timeline doesn't get you anywhere. Bitcoin won't exist in 30 years.


Why not? It has defied such predictions for several years now.


Meaning? MSFT and AAPL have proved enduring returns over decades, and ... is a flash in the pan? So adjusting over time means what?


Nitpick: Bitcoins can't effectively be mined on anything but specialized ASIC hardware. Not even high-end GPUs. They could theoretically mine other cryptocurrencies, though.


Nitpick of your nitpick: GPUs and even CPUs can mine Bitcoin. It's considered ineffective because the power consumption and hardware time relative to the value (and compared to what custom ASICs can do once you've made that investment) make it prohibitively expensive for the returns one gets. If it were being mined on other people's power and time, it would be more effective than not mining at all, since you'd not be paying for the power or hardware time. Ethereum or Litecoin would be more effectively mined that way than Bitcoin, but anything you're getting for free is free. Of course, the ethical and moral implications here are very negative.


This is why I use ublock and noscript.

Fuck advertising.


v0.0 Alpha

My project manager just heard: "So, you're saying this is production-ready? Great!"


Hi!

Did you run the benchmark? I did. It shows 60x faster speed.

I'd like you to close your IDE and start integrating our web site with Stripe so we can start taking payments. I want the order page up by 5 PM. I'll give you the pricing information by then.

Also, can you bump the displayed version to 5.1? It just feels more mature.

Good job getting this out the door, now let's start making some money so we can pay your salary.


> Made under 24 hours for a NUS Hackers hackathon, out of 68 competing projects

It's not meant to be production ready


I've been watching this project for over a year. There is quite a bit of work and care put into it. https://github.com/gpujs/gpu.js/graphs/contributors


However, I'm not claiming it's production ready, just that it's more than a weekend project.


"production ready" is a state of mind, not project.


Next steps: a computational-graph-based deep learning system with a TensorFlow-like API in JavaScript/Node la-la land.

https://github.com/tqchen/tinyflow would be a great showcase (and useful) project to port for Gpu.js


Does Node.js support WebGL now?



Is anyone else getting consistently slower speeds on their GPU compared with their CPU? I seem to be getting:

CPU: 0.426s ±7.6%

GPU: 2.399s ±4.7%

Running Chrome, Latest Stable. Windows 7. It seems odd to me that it would take 6x longer when my graphics card (GTX 690) would theoretically be much faster than my CPU (Intel i7-3930k)


I'm also getting the same (sort of) results:

CPU: 0.363s ±6.7%

GPU: 1.077s ±1.8%

Chromium 61, Windows 7, i7-5930k, Nvidia 1080. I did notice my GPU clock didn't move from the idle state.


FWIW I had the GPU at 85% faster with a built-in GPU (Intel Iris) on an Intel i5.


To be fair, that is a pretty old GPU. GPUs have advanced a lot more than CPUs recently.


This is very similar to the idea of Sh/RapidMind (eventually acquired by Intel.)

Some details:

https://en.wikipedia.org/wiki/Lib_Sh https://www.cs.umd.edu/class/spring2005/cmsc828v/papers/p787...

Never really took off. In the end OpenCL and CUDA were the winners in this space, and both, while explicit GPU languages, can be simulated on the CPU. I think this pattern will continue.


What is NUS, National University of Singapore?


That's my alma mater! ;)


Yes.


How does this compare to Turbo.js?


turbo.js is a barebones library (a port of NanoCL[1]). It

- doesn't support hardware or drivers that still don't support high-precision (highp) float textures (all things Apple, basically)[2]

- requires you to use GLSL

(I'm the creator of turbo.js)

[1] - https://github.com/turbo/NanoCL [2] - https://github.com/turbo/js/blob/dev/turbo.js#L45-L135


How does it compare to weblas (GPU Powered BLAS for Browsers, https://github.com/waylonflinn/weblas)?

I had in mind matrix operations for neural networks, as in https://github.com/transcranial/keras-js.


I have a ray tracer built on it at http://xinan.github.io/ray-tracer/


I am curious: how does it get an AST from the JS and compile it to compute kernels?


Looks like the JS parser is created using Jison, a parser generator, and the generation of GLSL from the AST is performed in src/backend/webgl/function-node.js using a single-pass tree walk.
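Roughly: the kernel's source text is recovered with Function.prototype.toString, parsed into an AST, and the tree is walked while emitting GLSL strings. A toy illustration of the idea, not gpu.js's actual code (it uses acorn here instead of Jison and handles only a couple of expression node types):

    const acorn = require('acorn'); // any ESTree-producing parser works for this sketch

    // emit GLSL for a tiny expression subset of JavaScript
    function emit(node) {
      switch (node.type) {
        case 'BinaryExpression':
          return '(' + emit(node.left) + ' ' + node.operator + ' ' + emit(node.right) + ')';
        case 'Identifier':
          return node.name;
        case 'Literal':
          // GLSL float literals need a decimal point
          return Number.isInteger(node.value) ? node.value + '.0' : String(node.value);
        default:
          throw new Error('unsupported node type: ' + node.type);
      }
    }

    const kernel = function (a, b) { return a * b + 1; };
    // wrap in parens so the anonymous function parses as an expression
    const ast = acorn.parse('(' + kernel.toString() + ')', { ecmaVersion: 5 });
    const ret = ast.body[0].expression.body.body[0]; // the ReturnStatement
    console.log('return ' + emit(ret.argument) + ';'); // -> return ((a * b) + 1.0);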



After running these GPU benchmarks I had to reboot my MacBook because all my open windows went scrambled and unreadable.

Not sure what the root cause of this is.


Is it repeatable? It could be a driver problem. If it doesn't happen again, it may be path dependent on other things you've run since boot. Or it could be bad hardware, or bad cooling.

It's definitely not the fault of the GPU benchmarks - that shouldn't cause system wide problems.


Sounds like your GPU is dying.


Are you serious? It's just one year old... Intel Iris Graphics 6100 1536 MB in my MBP 13, Early 2015.


Yea, had that happen to me before. Turns out the solder went bad and cracked from too many heat cycles. Returned it and got it replaced.


I don't see how that can happen in this case since the GPU is in the same chip as the CPU.


GPU drivers are crap.


Function.prototype.toString is one of the more interesting JavaScript features. This is a fun use of it.
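For anyone who hasn't played with it: calling toString on a function gives you back its source text, which is what makes re-compiling a plain JS function into something else possible at all.

    function kernel(a, b) { return a + b; }
    console.log(kernel.toString());
    // -> "function kernel(a, b) { return a + b; }"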


Well that worked entirely as expected, browser locks up, unresponsive and utterly slow.


This looks cool. Can someone help explain a real-world use case to me?


Super excited to experiment around with this.


This is cool


A major caveat: it's still JavaScript, so performance varies by browser, among other variables.

The benchmark on an iPad Air 2 (iOS 10.3.2) is close for both Chrome and Safari.

Safari

CPU: 6.110s ±1.9% GPU: 0.487s ±1.0% (12.55 times faster!)

Chrome 59

CPU: 4.454s ±51.8% GPU: 0.483s ±0.9% (9.22 times faster!)

Project looks promising though, congratulations to the team.


Apple does not allow for real Chrome on iOS, so Chrome on iOS is just a wrapper around Safari.


Yes, this uses WKWebView. Now that Google has made iOS Chromium open source, is there any project trying to integrate the Blink engine into it? Although of course it couldn't be released to the App Store.


If this example proves anything, it's that the GPU result is the same in both browsers.


Yes, it is. Added the same.


Notice the ±51.8% in the second benchmark. You probably got unlucky and hit garbage collection, or some other process decided to use the CPU while you were running it.



