There are some rough edges but this combo is quite refreshing after React. The best thing is that I could omit npm from my stack. Having just a monolith (Go) server greatly simplifies things if you're an indie dev.
Yeah, I also wondered how quick it would be to ~~rip off~~ replicate “diskprices.com for X”, and it took a weekend (OK, more) to build https://tvpricesindex.com (also using HTMX).
TVs are much less of a “commodity” of course, but it’s a nice experiment that I’ll iterate on a bit more over time. It was also my first deploy on Railway (nice enough, definitely compared to Heroku, but they have a way to go), and a chance to use other libraries I wanted to learn.
Will probably start grouping models better soon, and offer other filters.
BTW: If you plan to do something like this with PAAPI (the Product Advertising API), know that Amazon constantly holds the axe of a ban over you if you don’t generate “qualified” sales for 30 days straight.
The Amazon PAAPI is "interesting" because to even access it, you need 3 qualified sales. But... I wanted to use the API to create the website I'd post the affiliate links on. It's a chicken-and-egg problem. That's why I went with eBay.
Yep, I don't have much experience with affiliate programs (I like to build more SaaS-like apps[1]), but from my research Amazon has gotten really stingy since around 2019. Happy to see an alternative.
[1] I didn't know about the 3-sales precondition either. I guess some people bought albums on Amazon via my other music-related app.
Yep, I'm fighting this battle at the moment. I got the required 3 sales, but then they rejected me a few days later. From the rejection email: "Your website/app did not meet our content standards as the content on your site is insufficient."
There's clearly precedent, as I know Jeremy's diskprices qualifies. Fingers crossed that Amazon play fair...
I don't understand their worries. What are they trying to achieve with this? I can understand e.g. not wanting to have affiliate links on fraudulent/deceptive websites, but how is a search interface hurting anybody?
Yeah, my first lines of Python (or code at all) were Selenium/BeautifulSoup. The problems with those are sensitivity to (DOM) changes, speed, and scale (not that I'm too concerned about those at the moment; it's just a fun, refreshing project).
> Error: Could not find aspects. Most likely because the server is under heavy load. Please contact support at magicsourceltduk@gmail.com and tell me where you saw the link to this website because I have honestly no idea where all this traffic is coming from. Many thanks!
It’s a privacy measure so that third party sites can’t see which page a user is coming from. It’s not necessarily anyone’s business to know the previous URL in a user’s browser history.
This would be very useful to me in helping to source CPUs for a 40+ PB Ceph cluster that I’m building at a university. Could this be extended to include server-grade AMD SP3 CPUs? I’m in the market soon for 30+ Milan processors.
This is awesome, but it would be cool to see a GPU version. Friends keep telling me second-hand GPUs from crypto or AI are the sweet spot, at a ~10-20% discount on eBay.
This is a myth. I ran 150k AMD GPUs. We overclocked/undervolted all of them. While they were run continuously "hard", they didn't fail. Some of them ran for years out in containers in the middle of dusty fields through 4 seasons. They don't burn out the way people think they might.
That said, a lot of the GPUs used for crypto don't apply to AI, unless you really want to be constrained. Just be super aware of what you're buying and what you plan to use it for.
The system I built would auto tune the GPUs. The failure case for a single GPU is to crash the entire system and there are 12 in a box. The machines would reboot hundreds of times until they were tuned to stability. Through 4 seasons (including snow). Again, we had the majority of these cards with zero failures, across multiple years.
The things that were more likely to fail were things like PSUs. One time we had a bad batch of those and had to replace nearly every one of them. We cracked a few open and they were clearly hand-soldered by someone in China and shorting out internally due to failed connections. We saw a lot more thermal-cycling failure from those than we did from GPU cards that were pick-and-place assembled by a machine with solder paste.
Can I ask what this was for? I'm struggling to think what you'd be doing in the middle of a field with a crate full of mostly-GPU compute. Something with machine vision?
> Some of them ran for years out in containers in the middle of dusty fields through 4 seasons.
...story time? This sounds more interesting than hardware prices;)
> That said, a lot of the GPUs used for crypto, don't apply to AI, unless you really want to be constrained. Just be super aware of what you're buying and what you plan to use it for.
Yeah, I'll have to do my homework. While AI would be cool, I'm actually largely interested in running video games, but I keep vacillating because I'm a terrible cheapskate :)
Some disagree, because almost all miners underclock and run close to ideal temperatures to increase efficiency per watt. So it might turn out to be a deal, because the card was treated better than it would have been by a random gamer.
It seems like there is demand for simple search tools that focus on a single product. The big marketplaces don't have good search filters for most products and probably struggle to justify investing engineering effort. Paired with affiliate links, it looks like there's good money to be made building these. I like that these appear to be data-centric instead of review-centric; I don't trust random review websites.
diskprices.com has a good concept, but every time I tried to use it, I found that it has too many "garbage" listings (weird brands, 3rd-party sellers, obvious scams, etc.), to the point that it would be easier to just search a few reputable brands manually on Amazon.
Again nothing against the website itself, just unfortunate experience in practice.
Edit: Filtering appears to be fully broken for me, with "Error: Could not retrieve items. Please contact support." (Originally thought it was just filtering on sockets with a '/').
It may make sense to run an hourly or daily job to collect data from the API and then implement the filters exclusively within your back-end. This pattern can work well with rate-limited APIs and a dataset that changes fairly slowly. There's some risk that an item shown will already be sold (user would click back and try another).
When it comes to filtering, there are enough unique selections a user can make that letting the eBay API handle filtering for you will cause far too many cache misses.
At a previous job I considered a system that would make synchronous calls to the backend API until it went down (or we got rate limited). When the backend was unavailable, we'd switch to filtering in our service using the data we'd previously cached.
I.e. if a cached query asked for (cpu>=3.0GHz, cores>=2), we can also answer (cores>=4) by filtering the previous result. This wouldn't be able to find any CPUs below 3GHz unless there were other cached responses. This works well when a "best effort" response is desirable, even when it's incomplete (roughly like the sketch below).
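Something like this sketch, in Go since the site is apparently a Go server (the `Listing`/`Query` types and `fetchAll` are made up purely to illustrate the idea, not the actual implementation):

```go
package pricecache

import (
	"context"
	"sync/atomic"
	"time"
)

// Listing and Query are made-up types for illustration only.
type Listing struct {
	Title string
	GHz   float64
	Cores int
}

type Query struct {
	MinGHz   float64
	MinCores int
}

// snapshot holds the latest []Listing fetched by the background job.
var snapshot atomic.Value

// refreshLoop polls on a schedule (fetchAll is a hypothetical wrapper
// around the eBay API call) so request handlers never have to call
// the API directly.
func refreshLoop(ctx context.Context, fetchAll func(context.Context) ([]Listing, error)) {
	ticker := time.NewTicker(1 * time.Hour)
	defer ticker.Stop()
	for {
		if items, err := fetchAll(ctx); err == nil {
			snapshot.Store(items)
		}
		select {
		case <-ctx.Done():
			return
		case <-ticker.C:
		}
	}
}

// Filter answers a query purely from the cached snapshot. It's
// best-effort: results are only as fresh and as broad as the data
// the background job has stored.
func Filter(q Query) []Listing {
	items, _ := snapshot.Load().([]Listing)
	var out []Listing
	for _, l := range items {
		if l.GHz >= q.MinGHz && l.Cores >= q.MinCores {
			out = append(out, l)
		}
	}
	return out
}
```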
That's a very good idea, thanks! I think I'll have to do exactly that. Maybe in the fallback scenario, I can display a warning that data might be incomplete.
Kind of useless to have a giant list of entries at $10 with 1 bid and no indication of how long is left. Since there's no reason to bid until the last minute, I personally only want to see Buy It Now items or auctions ending within the hour.
Also, some items are listed but out of stock, which should be filtered out.
Worth pointing out this is specifically regarding "cpubenchmark.net", not "cpu.userbenchmark.com" or anything else from User Benchmark, which is known to have a heavy bias against AMD.
Haha yes! I think even their website design is great, it reminds me of a purer form of the internet before cookie banners, popups and auto-playing ads.
I saw this when it was posted yesterday and was sad it didn't make front page, so I'm glad it got revitalized! It also sent me down a fun rabbit hole yesterday when I saw there was a 72 core chip for only $200 listed.. and I was introduced to the byzantine world of Intel's Xeon Phi (which you can buy in coprocessor form for <$50 on eBay).
Tried disabling adblock and checked devtools, which shows no failed requests or errors, though there were 4 "itemspecifics" requests that all returned 204 with no body.
Yes, apologies for that. I had to disable that call as it was making too many requests to the eBay API endpoint, which resulted in the rate limit being hit.
The Aus version went offline for a while, so I built this quickly. The day I launched, it came back online, haha. Oh well, it was a fun experience.
Initially it was static HTML (generated by Python), but that was leading to large page sizes, so I transitioned to ag-grid, which worked surprisingly well.
One thing to be aware of is that for older CPUs, the limiting factor is often compatible motherboards. The original boards are more likely to have died than CPUs from the same generation, and new ones aren't being made.
I was thinking that an old motherboard may require a BIOS update for a relatively newer but socket-compatible CPU. I remember having to eBay in a random CPU just to be able to turn on a motherboard and update the BIOS for the better CPU I'd bought earlier.
No webpage scraping is involved here; the data comes from the eBay API. It's not using any database at the moment, just in-memory caching in the Go server. The problem is that there are a lot of cache misses, which result in API calls, and eBay only allows 5,000 of those a day.
Funnily enough, the website seems to be crashing due to the API issues while the web server is at <1% CPU utilization.
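Roughly, the kind of daily call budget you end up needing in front of the API looks like this (a simplified sketch with made-up names, not the actual code):

```go
package ebaycache

import (
	"sync"
	"time"
)

// apiBudget tracks how many eBay API calls have been spent today.
// Names are illustrative; resetting at UTC midnight is an assumption
// made here for simplicity.
type apiBudget struct {
	mu    sync.Mutex
	day   time.Time
	used  int
	limit int // eBay allows 5000 calls per day
}

// Allow reports whether another API call fits in today's budget.
// When it returns false, the handler serves (possibly stale) cached
// data instead of calling out to eBay.
func (b *apiBudget) Allow() bool {
	b.mu.Lock()
	defer b.mu.Unlock()
	today := time.Now().UTC().Truncate(24 * time.Hour)
	if !today.Equal(b.day) {
		b.day, b.used = today, 0
	}
	if b.used >= b.limit {
		return false
	}
	b.used++
	return true
}
```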
The "Coming Soon" columns make it hard to see the two main columns (price and name) at the same time on a small mobile screen. I wish I could hide those. But nice idea.
Please include cheaper GPUs as well. Maybe I'm just looking for more/different video ports for my PC? The name doesn't imply any restriction to training hardware. Also, using Best Buy as the only data source seems odd; maybe there's a way to use PCPartPicker or similar as a data source and expand from there?
It was a great experience, at least to the extent I needed it. What sold me on it is its support for template composition[1], something I've struggled with using the standard `html/template`. And I really like the idea of generating statically typed templates which are just Go functions at the end of the day. In theory, I can share them between many of my apps, especially as I'm using a monorepo for all my projects. This is yet to be tested though.
There are some rough edges around using `script` templates[2]; for example, they don't let you pass `this` and `arguments` from HTML event handlers[3]. This is not a show-stopper though, because you can always fall back to defining your event handler in a plain HTML <script></script> block, albeit without type checking of the arguments or deduplication of the scripts. It also hurts Locality of Behaviour, which I'm a fan of.
That being said, I think it has some great potential. I can completely imagine people building fully-fledged UI libraries in this once component reusability is improved.
But the best part of using templ for me was that I could completely get rid of Node and NPM from my stack.
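To give a flavour, the composition I mean looks roughly like this in templ (a minimal sketch from memory, so check the docs for the exact syntax):

```templ
package ui

// Layout renders the page chrome and drops the caller's markup in
// via { children... }.
templ Layout(title string) {
	<html>
		<head><title>{ title }</title></head>
		<body>
			{ children... }
		</body>
	</html>
}

// HomePage composes Layout and supplies the body as children.
templ HomePage() {
	@Layout("Home") {
		<h1>Hello from a composed template</h1>
	}
}
```

On the Go side you then render `HomePage()` like any other component, e.g. with `templ.Handler(HomePage())` in your router, which is what makes the "statically typed templates are just Go functions" part so pleasant.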
I’ve been using pongo2 for a year or two, as I always found the standard html/template annoying to work with for inheritance, as you mentioned. I’ve got years of Django experience and always liked their templates, and pongo2 is basically just Django templates in Go, so it’s nice.
But being able to break things into reusable components with templ is compelling (and works fairly well from my experimenting). I’m just not in love with how you pass data into it and I ran into a few awkward points with the string interpolation of data when doing inline scripts.
I’m sure if I spent more time with it, I’d overcome these issues quickly.
I remember checking out pongo2 briefly before deciding on templ, but my lack of familiarity with its syntax kept me away. I guess the calculus is much different if you have prior experience, but for me it was attractive that templ uses Go syntax in its templates (with some exceptions).
Yeah, passing data can be annoying if for some reason you can't use templ's `script` block, if that's what you meant! I think they should make sure the `script` block works in every situation, and then it will be way less awkward.
https://www.cpuscout.com/
https://diskprices.com/
https://gpuprices.us/
https://instances.vantage.sh/
https://listofdisks.com/
https://tvpricesindex.com/