The compilable JavaScript subset is still fairly limited, so vector operations are quite painful. Beyond that, it's a great library to start with for parallel computation in the browser if you have no background in WebGL/GLSL.
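For anyone who hasn't tried it: a kernel in gpu.js is an ordinary-looking JS function that gets compiled to GLSL, which is why the subset is restrictive. Roughly the matrix-multiply example from the project's README (API details such as setOutput vs. the older dimensions call vary by version):

    const gpu = new GPU();

    // 512x512 matrix multiply: each GPU thread computes one output cell.
    // Everything is scalar arithmetic and indexing; there are no vector
    // types, which is what makes vector math painful in the subset.
    const multiplyMatrix = gpu.createKernel(function (a, b) {
        let sum = 0;
        for (let i = 0; i < 512; i++) {
            sum += a[this.thread.y][i] * b[i][this.thread.x];
        }
        return sum;
    }).setOutput([512, 512]);

    // Plain nested JS arrays go in; a 2D array of numbers comes out.
    const rand = () => Array.from({ length: 512 }, () =>
        Array.from({ length: 512 }, Math.random));
    const c = multiplyMatrix(rand(), rand());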
One thing that might be helpful is if you linked to the GitHub repo on the site: https://github.com/gpujs/gpu.js. Especially for a project like this (high-performance computing), I'm going to want to look at the code before I trust it.
I saw this a few months back when I was researching GPGPU implementations in the browser; it looked like one of the better projects out there. Do you know how much theoretical GPU allocation you can get from a JS implementation versus, say, a native wrapper?
Worth noting that Clarifai just released an SDK for offline mobile deep-learning training/evaluation. It's not browser-based, but I'd be curious what the difference in GPU utilization is in practice.
This will never become a thing. If my phone heats up and dies after twenty minutes because the news site I'm reading is crunching numbers on my phone's GPU, I am (and most other folks on HN are) going to be VERY salty. Compute time on user machines isn't free, and users will notice.
On top of that, display advertising pays when the user sees the ad, while coin mining only pays once the user has completed enough useful work to send results back to the server. For a huge number of websites, visits are too short for that to ever happen.
I've never come across this concept before. What is it? Is it something like: to see the page content, you must have a key that is either time-consuming to compute or is simply given to you if you view an ad first?
I would guess that some people imagine a business model for websites where the user pays with computational power. Using the GPU would be a way to make this worthwhile.
Ad blocking will always work as long as: a) ad networks insist on having users connect to their servers and b) users have the ability to blacklist said servers.
If you treat it like a proof-of-work system, it rather depends on c): making sure that search engines can still index your content. Because if that isn't a requirement, it's actually rather trivial to defeat any ad blocker: pose a computational problem that takes x seconds to compute on average hardware, show the ad while the computation is running, and only serve the content once the client sends back the answer. Sure, blockers can still block the ad itself, but that just makes the user stare at a blank screen... or, more likely, loses them forever ;).
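Concretely, a minimal sketch of that gate using the Web Crypto API. The endpoints (/challenge, /content) and the difficulty scheme are hypothetical, not anyone's shipping implementation:

    async function sha256Hex(text) {
        const digest = await crypto.subtle.digest('SHA-256',
            new TextEncoder().encode(text));
        return Array.from(new Uint8Array(digest))
            .map(b => b.toString(16).padStart(2, '0')).join('');
    }

    // Grind nonces until the hash starts with `difficulty` zero hex digits.
    async function solveChallenge(challenge, difficulty) {
        const target = '0'.repeat(difficulty);
        for (let nonce = 0; ; nonce++) {
            if ((await sha256Hex(challenge + nonce)).startsWith(target)) {
                return nonce;
            }
        }
    }

    // The ad is shown while this grinds; the server verifies the nonce
    // before it serves the real content.
    async function unlockContent() {
        const { challenge, difficulty } = await (await fetch('/challenge')).json();
        const nonce = await solveChallenge(challenge, difficulty);
        const res = await fetch('/content?challenge=' + challenge + '&nonce=' + nonce);
        document.body.innerHTML = await res.text();
    }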
This is just the example and encouragement I needed to get my friend interested in what has thoroughly absorbed me these past months; I've been failing to impress upon him how much he's been missing.
(The dude is my best friend in the business and, until an amicable split recently, my partner in a number of projects. His latest is precisely what I know your effort will help him figure out now: a delivery-related app that is kind of a consumer breakaway from the last thing we worked on together.
As cheesy as this is, I have to say I'm very grateful you've provided such a good chance to point my friend in the right direction, right when I've been afraid for him. My best bud he is, but not a hacker of the modern mold; his glory days were 40 years ago, banging out mainframe clients.

Cool, though: he sold a DB2 front end on the first IBM PC, which was a tour de force of memory management, caching the data in a perfect scheme to create a floppy copy of the query results from partial tables. An example job was what I would use an Excel pivot table for today: flip sales orders against regions and tally sales rep targets, margins, and success rates. Querying the big iron directly wasn't happening; it was shared and hosted, and the computation cycles cost money (I forget the term for IBM capacity charges). So the job was to juggle the PC as an input terminal for the sales office secretary entering the work-rate stats, write local copies of the input and raw query results, and pull missing data from the mainframe to grab the entries from other sales offices. E.g., if they sold to a new customer in a new city, pull any other results that were sales to that locale by other departments, since senior management measured them only by the margin made and the terms of supply length.

To be clear, the different offices were different departments and not restricted by geography, but knowing that they were more profitable than the other departments was a key factor in bonuses and morale. In a big company, this could be a good deal of data. Bonuses were affected by relative performance only if the corporation already had customers in an area, because they knew they would grow their market after sunk costs were eaten; they advertised the more profitable products first and piggyback-sold off that as a matter of policy. This meant the exact price of a sale could be critical to the departments, and securing advertising for their business lines was a significant advantage. Once the company had established enough gross sales overall it applied a less biased marketing policy, but the idea was to increase the incentive for people both to keep their prices high and to open new geographical areas. Since advertising affected the margins quarterly, the department my buddy worked for later got him to work on the ability to target customers in areas with the highest trade magazine circulation densities, to increase efficiency and also leave the other departments with less efficient advertising deals.

Anyhow, this was a rolling process of continuously growing a new database, and it became a further tool for departmental management to book sales at the most advantageous times, which meant that both their database and the mainframe were only eventually consistent. I'm told the hardest part was that effectively the entire storage capacity of the PC was in use, and carving out scratch space would have busted performance horribly, so everything was sequenced down to the disk spinning: inputs and query results were written to their final location in one discrete action, with no later sorting. I have only the stories (I was in school when my friend got this job), but I love the balance of the specification he had to deliver.

What's cool is that my buddy got the job on spec. They didn't believe him because they considered him a kid; he was barely graduated and visited the business because his dad was a supplier of theirs. His pitch was basically, "I can make a database do anything you want but think is impossible, hire me!"
His pay was nominal because of the speculative pitch, something like $500, but he got a new PC and the expansion as part of his deal, and so got himself bootstrapped. At 20, in Chicago, around 1982. I can only imagine how that must have felt...
But I don't think the modern web stack is a world my friend can come to terms with. I've struggled with it myself, and I'm still in my youthful 40s. I'm trying to get him into the whole WebAssembly thing, which I think is the best thing to happen this century: time to get rid of all the JS implementation grief and love the primitives. I like how it started with a Scheme-like syntax and has devolved into almost a pure lambda layer. I have such admiration for my friend, but what keeps getting between us is that he's an amazing salesman who has lost no energy in all these years, yet the way the modern stack just isn't clicking for him is a showstopper every time it comes to shipping. We have a long friendship, and I sense he feels he's lost a touch without which things would slip out of his grasp to fix. I'm secretly writing a parallel app on the same timetable as his, hell-bent on rolling it out in his place in the likely event that he can't get the perfection he needs to show his stuff. I'm not giving up on the guy. Your self-promotion is my absolute pleasure; thanks again for posting!)
--- Sorry if this is too far off topic ---
This makes me find it funny that the Brave browser purposefully blocks all ads by default, seemingly without much negative consequence for the end user (not directly, anyway).
edit: Point being, the browser that capitalizes on blockchain virtual currencies/tokens would purposefully block such a sneaky device.
Companies that want to advertise through the browser's ecosystem have to purchase the company's tokens on the public Ethereum chain, and the end user has to OK viewing ads on the sites they visit regularly in order for those publishers to see returns; the user is free to OK nothing at all. The browser capitalizes on publishers/content providers who don't want to lose ad views, and the user base grows because of the lately growing public desire for privacy and the dismissal of rampant unwanted advertising, which is sometimes quite inappropriate.
----
All that said, I'm quite excited to try this out, too.
Due to my regular dev workflow in JS, I'm looking forward to it being published to npm, so it can more easily be used as a module and bundled for the odd front-end project/experiment.
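Something like this is what I'm hoping for. The package name and export shape here are assumptions on my part, not a published API:

    import GPU from 'gpu.js';

    const gpu = new GPU();
    const double = gpu.createKernel(function (v) {
        return v[this.thread.x] * 2;
    }).setOutput([4]);

    console.log(double([1, 2, 3, 4])); // roughly [2, 4, 6, 8]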
I'd think pegging the CPU/GPU at 99% would be even more of a turn-off than banner ads, especially on mobile, where it'd drain your battery.
Interesting thought, though. I wonder: if Windows or macOS had some hidden process that mined bitcoin at ~5% CPU, how much would that be worth worldwide?
Okay, Bill Gates alone is worth more than twice all the bitcoin ever mined, and Apple is worth almost 20x that. So, on a risk/reward basis, probably not a great strategy for either company. https://howmuch.net/articles/worlds-money-in-perspective.
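If you want to run the numbers yourself: expected daily take from any proof-of-work coin is just your share of the network hashrate times daily issuance times price. A back-of-envelope sketch; every input below is a placeholder, not real market data:

    // Expected daily revenue = your share of network hashrate times the
    // coins issued per day, times the coin price.
    function dailyRevenueUSD({ myHashrate, networkHashrate, blocksPerDay, blockReward, priceUSD }) {
        return (myHashrate / networkHashrate) * blocksPerDay * blockReward * priceUSD;
    }

    console.log(dailyRevenueUSD({
        myHashrate: 1e9,        // assumed total across the hidden fleet
        networkHashrate: 5e18,  // assumed network total
        blocksPerDay: 144,      // Bitcoin's ~10-minute block interval
        blockReward: 12.5,      // subsidy at the time; it halves periodically
        priceUSD: 2500          // placeholder price
    }));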
Nitpick: Bitcoins can't effectively be mined on anything but specialized ASIC hardware. Not even high-end GPUs. They could theoretically mine other cryptocurrencies, though.
Nitpick of your nitpick: GPUs and even CPUs can mine Bitcoin. It's considered ineffective because the power consumption and hardware time relative to the value (especially compared to what custom ASICs can do once you've made that investment) make it prohibitively expensive for the returns you get. But if it's being mined on other people's power and hardware time, it's more effective than not mining at all, since you're not paying for the power or the hardware. Ethereum or Litecoin would be more effectively mined that way than Bitcoin, but anything you're getting for free is free. Of course, the ethical and moral implications here are very negative.
Did you run the benchmark? I did. It shows a 60x speedup.
I'd like you to close your IDE and start integrating our website with Stripe so we can start taking payments. I want the order page up by 5 PM. I'll give you the pricing information by then.
Also, can you bump the displayed version to 5.1? It just feels more mature.
Good job getting this out the door, now let's start making some money so we can pay your salary.
Is anyone else getting consistently slower speeds on their GPU compared with their CPU? I seem to be getting:
CPU: 0.426s ±7.6%
GPU: 2.399s ±4.7%
Running Chrome, latest stable, on Windows 7. It seems odd to me that it would take ~6x longer when my graphics card (GTX 690) should theoretically be much faster than my CPU (Intel i7-3930K).
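If anyone wants to reproduce this, gpu.js lets you force the execution mode, so you can time the same kernel both ways. A sketch; the mode option and setOutput follow the current docs and may differ across versions:

    function buildKernel(engine) {
        return engine.createKernel(function (a) {
            // Toy workload: sum a row of 512 values per output element.
            let sum = 0;
            for (let i = 0; i < 512; i++) {
                sum += a[this.thread.x][i];
            }
            return sum;
        }).setOutput([512]);
    }

    function timeKernel(kernel, input, runs = 10) {
        const t0 = performance.now();
        for (let i = 0; i < runs; i++) kernel(input);
        return (performance.now() - t0) / runs; // ms per run
    }

    const input = Array.from({ length: 512 }, () =>
        Array.from({ length: 512 }, Math.random));
    console.log('cpu ms/run:', timeKernel(buildKernel(new GPU({ mode: 'cpu' })), input));
    console.log('gpu ms/run:', timeKernel(buildKernel(new GPU({ mode: 'gpu' })), input));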
Never really took off. In the end, OpenCL and CUDA were the winners in this space, and both, while explicitly GPU languages, can be simulated on the CPU. I think this pattern will continue.
Looks like the JS parser is created using Jison, a parser generator, and the GLSL is generated from the AST in src/backend/webgl/function-node.js using a single-pass tree walk.
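For anyone unfamiliar with the approach, here's an illustrative toy (not gpu.js's actual code) of what a single-pass AST-to-GLSL walk looks like, handling a few common node types:

    function astToGlsl(node) {
        switch (node.type) {
            case 'BinaryExpression':
                return '(' + astToGlsl(node.left) + ' ' + node.operator + ' ' + astToGlsl(node.right) + ')';
            case 'Identifier':
                return node.name;
            case 'Literal':
                // GLSL float literals need a decimal point.
                return Number.isInteger(node.value) ? node.value + '.0' : String(node.value);
            case 'ReturnStatement':
                return 'return ' + astToGlsl(node.argument) + ';';
            default:
                throw new Error('Unsupported node type: ' + node.type);
        }
    }

    // E.g. the AST for `return a * 2` comes out as "return (a * 2.0);".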
Is it repeatable? It could be a driver problem. If it doesn't happen again, it may be path-dependent on other things you've run since boot. Or it could be bad hardware or bad cooling.
It's definitely not the fault of the GPU benchmarks; running them shouldn't cause system-wide problems.
Yes, this uses WKWebView. Now that Google has made Chromium for iOS open source, is there any project trying to integrate the Blink engine into it? Though of course it couldn't be released on the App Store.
Notice the ±51.8% in the second benchmark. You probably got unlucky and hit garbage collection, or some other process decided to use the CPU while you were running it.
What I learned: Not to make a compiler during a hackathon.