Show HN: CPU Prices on eBay (cpuscout.com)
152 points by attilakun 11 months ago | hide | past | favorite | 89 comments
Tech stack: Go + templ + htmx

There are some rough edges but this combo is quite refreshing after React. The best thing is that I could omit npm from my stack. Having just a monolith (Go) server greatly simplifies things if you're an indie dev.
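A rough sketch of the shape this takes (hypothetical handler and helper names, not the actual cpuscout code): the Go server renders HTML fragments and htmx swaps them into the page, so there's no JSON layer, no client-side framework, and no npm.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
)

// cpuRows renders listing rows as an HTML fragment. With templ this
// would be a generated, type-safe component instead of string building.
func cpuRows(cpus []string) string {
	out := ""
	for _, c := range cpus {
		out += fmt.Sprintf("<tr><td>%s</td></tr>", c)
	}
	return out
}

// cpusHandler returns just the fragment; htmx swaps it into the page,
// so the whole "API" is plain HTML over HTTP.
func cpusHandler(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "text/html")
	fmt.Fprint(w, cpuRows([]string{"Ryzen 9 5950X", "EPYC 7763"}))
}

func main() {
	// Exercise the handler in-process instead of binding a real port.
	srv := httptest.NewServer(http.HandlerFunc(cpusHandler))
	defer srv.Close()
	resp, _ := http.Get(srv.URL)
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(string(body))
}
```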





I would also add

https://shucks.top/


Yeah, I also wondered how quick it would be to ~~rip off~~ replicate “diskprices.com for X”, and it took a weekend (ok, more) to build https://tvpricesindex.com (also using HTMX).

TVs are much less of a “commodity” of course, but it’s a nice experiment that I’ll iterate on a bit more over time. It was also my first deploy on Railway (nice enough, especially compared to Heroku, but they have a way to go), and I used some other libraries I wanted to learn.

Will probably start grouping models better soon, and offer other filters.

BTW: if you plan to do something like this with PAAPI (Product Advertising API), know that Amazon holds a constant axe over you: they'll ban you if you don’t generate “qualified” sales for 30 days straight.


The Amazon PAAPI is "interesting" because to even access it, you need to have 3 qualified sales. But... I wanted to use the API to create a website where I'd post the affiliate links on. It's a chicken-and-egg problem. That's why I went with eBay.


Yep, I don't have much experience with affiliate programs (I like to build more SaaS-like apps[1]) but from my research Amazon have gotten really stingy circa 2019. Happy to see an alternative.

[1] I didn't know about the 3-sales precondition either. I guess some people bought albums on Amazon on my other music-related app.


Yep, I'm fighting this battle at the moment. I got the required 3 sales, but then they rejected me a few days later. From the rejection email "Your website/app did not meet our content standards as the content on your site is insufficient."

There's clearly precedent, as I know Jeremy's diskprices qualifies. Fingers crossed that Amazon play fair...


I don't understand their worries. What are they trying to achieve with this? I can understand e.g. not wanting to have affiliate links on fraudulent/deceptive websites, but how is a search interface hurting anybody?


Amazon prices can easily be scraped using Selenium, even headlessly. This could be turned into an API without much difficulty, I imagine.


Yeah, my first lines of Python (or code at all) were Selenium/BeautifulSoup. The problems with these are sensitivity to (DOM) changes, speed, and scale (not that I'm too concerned with those at the moment; it's just a fun, refreshing project).


Hugged:

> Error: Could not find aspects. Most likely because the server is under heavy load. Please contact support at magicsourceltduk@gmail.com and tell me where you saw the link to this website because I have honestly no idea where all this traffic is coming from. Many thanks!


> I have honestly no idea where all this traffic is coming from.

Aren't there HTTP request headers that tip this off?


The Referer header behavior changed for HTTPS 7+ years ago. As I recall, it can be suppressed by the source site (e.g. inbound Google traffic).

https://stackoverflow.com/questions/21922143/why-isnt-the-th...

Why this is good for anyone other than Google Analytics is beyond my current understanding.


It’s a privacy measure so that third party sites can’t see which page a user is coming from. It’s not necessarily anyone’s business to know the previous URL in a user’s browser history.


> Why this is good for anyone other than Google Analytics is beyond my current understanding.

Better privacy. There have long been extensions to strip out or fake referrer information from requests.


This would be very useful to me in helping source CPUs for a 40+ PB Ceph cluster that I’m building at a university. Could this be extended to include server-grade AMD SP3 CPUs? I’m in the market soon for 30+ Milan processors.



Yes, it's possible. Will do this as soon as the rate limiting troubles with the eBay API subside a bit.


This is awesome, but would be cool to see a GPU version. Friends keep telling me 2nd market GPUs from crypto or ai is the sweet spot of ~10-20% discount on ebay


> 2nd market GPUs from crypto or ai is the sweet spot of ~10-20% discount on ebay

I thought those were iffy because they were run so hard? If not, I need to go shopping:)


This is a myth. I ran 150k AMD GPUs. We overclocked/undervolted all of them. While they were run continuously "hard", they didn't fail. Some of them ran for years out in containers in the middle of dusty fields through 4 seasons. They don't burn out the way people think they might.

That said, a lot of the GPUs used for crypto don't apply to AI, unless you really want to be constrained. Just be super aware of what you're buying and what you plan to use it for.


Electronics (within spec) don't usually mind heat, thermal cycling is what they really hate.


The system I built would auto tune the GPUs. The failure case for a single GPU is to crash the entire system and there are 12 in a box. The machines would reboot hundreds of times until they were tuned to stability. Through 4 seasons (including snow). Again, we had the majority of these cards with zero failures, across multiple years.

The things that were more likely to fail were parts like PSUs. One time we had a bad batch of those and had to replace nearly every one. We cracked a few open, and they were clearly hand soldered by someone in China, shorting out internally due to failed connections. We saw a lot more thermal-cycling failure from those than from GPU cards that were pick-and-place assembled by a machine with solder paste.


Can I ask what this was for? I'm struggling to think what you'd be doing in the middle of a field with a crate full of mostly-GPU compute. Something with machine vision?


Mining ethereum when it was proof of work. The middle of the field was just one location out of 7. We did "real" data centers too.


> Some of them ran for years out in containers in the middle of dusty fields through 4 seasons.

...story time? This sounds more interesting than hardware prices;)

> That said, a lot of the GPUs used for crypto, don't apply to AI, unless you really want to be constrained. Just be super aware of what you're buying and what you plan to use it for.

Yeah, I'll have to do my homework. While AI would be cool, I'm actually largely interested in running video games, but I keep equivocating because I'm a terrible cheapskate:)


The things that will fail are fans...


Some disagree, because almost all miners underclock and run at close-to-ideal temperatures to increase efficiency per watt. So it might turn out to be a deal, because the card was treated better than it would be by a random gamer.


They probably ran at a constant temperature too, none of the getting hot then cooling down to fatigue solder joints.


Would you buy a rental car for only a 10-20% discount?


"AI" GPUs like the 3090 and 3060 are starting to shoot up in price, unfortunately.


Looks just like https://diskprices.com/ but for CPUs :-)


It seems like there is demand for simple search tools that focus on a single product. The big marketplaces don't have good search filters for most products and probably struggle to justify investing engineering effort. Paired with affiliate links, it looks like there's good money to be made building these. I like that these appear to be data-centric instead of review-centric; I don't trust random review websites.


diskprices.com has a good concept, but every time I tried to use it, I found that it has too many "garbage" listings (weird brands, 3rd-party sellers, obvious scams, etc.), to the point that it would be easier to just search a few reputable brands manually on Amazon.

Again nothing against the website itself, just unfortunate experience in practice.


Edit: Filtering appears to be fully broken for me, with "Error: Could not retrieve items. Please contact support." (Originally thought it was just filtering on sockets with a '/').


I suspect eBay may have banned the project. They are not very happy about scraping, like most e-commerce sites.


Yes, got IP rate limited by eBay, sorry about that! Thought my caching was enough but apparently it is not!

eBay only allows 5000 API calls per day for most of the APIs useful to me, which is very easy to hit: https://developer.ebay.com/develop/apis/api-call-limits

My infinite scrolling implementation probably didn't help either but I couldn't help myself, it was so easy to implement with HTMX.


It may make sense to run an hourly or daily job to collect data from the API and then implement the filters exclusively within your back-end. This pattern can work well with rate-limited APIs and a dataset that changes fairly slowly. There's some risk that an item shown will already be sold (user would click back and try another).
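A sketch of that pattern (made-up types and field names, assuming a simple periodic refresh into an in-memory snapshot):

```go
package main

import (
	"fmt"
	"sync"
)

// Listing is a trimmed-down stand-in for what the eBay API returns.
type Listing struct {
	Title string
	Cores int
	Price float64
}

// Store holds a periodically refreshed snapshot of API data.
type Store struct {
	mu    sync.RWMutex
	items []Listing
}

// Refresh does one batched API pull per interval, not one per user query.
func (s *Store) Refresh(fetch func() []Listing) {
	items := fetch()
	s.mu.Lock()
	s.items = items
	s.mu.Unlock()
}

// Filter answers user queries entirely from the local snapshot,
// never touching the rate-limited API.
func (s *Store) Filter(minCores int, maxPrice float64) []Listing {
	s.mu.RLock()
	defer s.mu.RUnlock()
	var out []Listing
	for _, it := range s.items {
		if it.Cores >= minCores && it.Price <= maxPrice {
			out = append(out, it)
		}
	}
	return out
}

func main() {
	s := &Store{}
	fetch := func() []Listing { // stand-in for the real eBay API call
		return []Listing{{"Ryzen 5 3600", 6, 80}, {"EPYC 7302", 16, 250}}
	}
	s.Refresh(fetch)
	// In production you'd run Refresh on an hourly ticker in a goroutine.
	fmt.Println(len(s.Filter(8, 300))) // → 1
}
```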

Also, I don't think I see any affiliate links in use. I believe eBay runs a program, so you could make some money here: https://partnernetwork.ebay.com/solutions/creating-affiliate...


> It may make sense to run an hourly or daily job to collect data from the API and then implement the filters exclusively within your back-end.

Absolutely! I thought I could get away with just in-memory caching for the MVP but it looks like I can't.


When it comes to filtering, there are enough unique selections a user can make that, if you're letting the eBay API handle filtering for you, you'll see far too many cache misses.

At a previous job I considered a system that would use synchronous API calls to the backend API until it went down (or we got rate limited). When the backend was unavailable we'd switch to filtering in our service using the data we'd previously cached.

I.e. if a cached query asked for (cpu >= 3.0 GHz, cores >= 2), we can also answer (cores >= 4) by filtering the previous result. This wouldn't be able to find any CPUs below 3 GHz unless there were other cached responses. It works well when a "best effort" response is desirable, even when it's incomplete.
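A minimal sketch of that subsumption check (made-up types, assuming filters are simple ≥ thresholds):

```go
package main

import "fmt"

// Query captures the filter thresholds a user can set.
type Query struct {
	MinGHz   float64
	MinCores int
}

// Subsumes reports whether a result cached for q is a superset of
// what query p needs, so p can be answered by re-filtering it.
func (q Query) Subsumes(p Query) bool {
	return q.MinGHz <= p.MinGHz && q.MinCores <= p.MinCores
}

type CPU struct {
	GHz   float64
	Cores int
}

func (q Query) Matches(c CPU) bool {
	return c.GHz >= q.MinGHz && c.Cores >= q.MinCores
}

// Refine filters a cached result down to a narrower query's answer.
func Refine(cached []CPU, p Query) []CPU {
	var out []CPU
	for _, c := range cached {
		if p.Matches(c) {
			out = append(out, c)
		}
	}
	return out
}

func main() {
	cachedQuery := Query{MinGHz: 3.0, MinCores: 2}
	cached := []CPU{{3.2, 4}, {3.6, 2}, {4.0, 8}}
	newQuery := Query{MinGHz: 3.0, MinCores: 4} // narrower: cores >= 4
	if cachedQuery.Subsumes(newQuery) {
		fmt.Println(len(Refine(cached, newQuery))) // best-effort answer
	}
}
```

Note that a query looser than the cached one (e.g. MinGHz of 0) fails the Subsumes check, which is exactly the "can't find CPUs below 3 GHz" caveat.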


That's a very good idea, thanks! I think I'll have to do exactly that. Maybe in the fallback scenario, I can display a warning that data might be incomplete.


It's working for me. The breakage was probably due to the high traffic from the site being posted here on HN.


Kind of useless to have a giant list of $10 entries with 1 bid and no indication of how long is left. Since there's no reason to bid until the last minute, I personally only want to see Buy It Now items or auctions ending within the hour.

Also, some items are listed but out of stock, which should be filtered out.


Very good points, thanks. Will be looking into these.


Cool. It's a really good idea!


cpubenchmark score would be much more useful than GHz, when comparing CPUs across 10+ years of generations.

It doesn't look like you have EPYCs listed, or Threadripper? (These are some of the best CPU deals on eBay.)


Worth pointing out this is specifically regarding "cpubenchmark.net", not "cpu.userbenchmark.com" or anything else from User Benchmark, which is known to have a heavy bias against AMD.


I'll have to look into whether there's some public API to pull cpubenchmark score as I agree, it'd be immensely useful.

To your other point, EPYC and Threadripper are listed now:

https://www.cpuscout.com/?Processor+Type=Epyc

https://www.cpuscout.com/?Processor+Model=AMD%2520Ryzen%2520...


I guess after https://news.ycombinator.com/item?id=39066480 showed up on HN, every hacker with some spare time wanted to replicate diskprices.com for other commodities.


U saw diskprices.com and copied that, didn’t u ;)


Haha yes! I think even their website design is great, it reminds me of a purer form of the internet before cookie banners, popups and auto-playing ads.


Both look like a CodeCanyon script. Copying is not the issue; I hope the scraped data is correct.


I saw this when it was posted yesterday and was sad it didn't make front page, so I'm glad it got revitalized! It also sent me down a fun rabbit hole yesterday when I saw there was a 72 core chip for only $200 listed.. and I was introduced to the byzantine world of Intel's Xeon Phi (which you can buy in coprocessor form for <$50 on eBay).


Every entry just gets stuck on "loading..."

Tried disabling adblock and checked devtools, which shows no failed requests or errors, though there were 4 "itemspecifics" requests that all returned 204 with no body.


Yes, apologies for that. I had to disable that call, as it was making too many requests to the eBay API endpoint, which caused the rate limit to be hit.


This is great! I would love to see a passmark/$ metric on here as well.


This is only useful for USA residents


Yes, although this might not be too difficult a fix. I'll look into it.


I'm another fan of diskprices.com :-)

https://listofdisks.com/

The Aus version went offline for a while so I built this quickly. The day I launched it came back online haha. Oh well, was a fun experience.

Initially it was static HTML (generated by Python), but that was leading to large page sizes, so I transitioned to ag-grid, which worked surprisingly well.


You know what would be helpful? Some way to see when each CPU was released, its generation, maybe a CPU benchmark score.


One thing to be aware of is that for older CPUs, the limiting factor is often compatible motherboards. The original ones are more likely to have died than CPUs from the same generation, and new ones aren't being made.


Since we're talking about the used market, has anyone even seen a CPU fail? I'm used to CPUs and RAM being insanely reliable.


I was thinking that an old motherboard may require a BIOS update for a relatively newer but socket-compatible CPU. I remember having to eBay in a random CPU just to be able to turn on a motherboard and update the BIOS for the better CPU I'd bought earlier.


https://www.cpuscout.com/?Socket+Type=Socket%2520TR4

Error: Could not retrieve items. Please contact support.


Yes, sorry, got rate limited by the eBay API!


So you don't use a database to store all the data and update it when needed? Are you allowed to scrape the data?


No webpage scraping is involved here; the data comes from the eBay API. It's not using any database at the moment, just in-memory caching in the Go server. The problem is there are a lot of cache misses, which result in API calls, of which eBay only allows 5000 a day.

Funnily enough, the website seems to be crashing due to the API issues while the web server sits at <1% CPU utilization.


https://www.cpuscout.com/?Brand=AMD

Got a 4xx or 5xx error on this (probably important?) URL. Told me to contact support.


Looks awesome! Isn't there one of these for laptops as well?


The Coming Soon columns make it hard to see the two main columns (price and name) at the same time on a small mobile screen. I'd like to be able to hide those. But nice idea.


I need this for GPUs.


Yep


Working on it: https://gpuprices.us/


Please include cheaper GPUs as well. Maybe I'm just looking for more/different video ports for my PC? The name doesn't imply any restriction to GPUs for training. Also, using Best Buy as the only data source seems odd; maybe there's a way to use PCPartPicker or similar as a data source, and expand from there?


There are plenty of cheap options. You can remove the default 12 GB memory filter, adjust utilization, and limit the price to $300.

On the data source: Best Buy just had the least walled-off APIs, and prices seem lower than Amazon's, so it's the first.


"Error: could not retrieve items. Please contact support" immediately after the first click on some checkbox. Was this even tested?


Very nice! How are you getting the data? eBay API?


"If the given search has not been performed recently, it's pulled fresh from the eBay API. Otherwise, records are cached for 2 hours."


Number Of Cores filter immediately 500s.


How are you liking templ? I've played around with it a bit and am not totally sold on it yet.


It was a great experience, at least to the extent I needed it. What sold me on it is its support for template composition[1], something I've struggled with using the standard `html/template`. And I really like the idea of generating statically typed templates, which are just Go functions at the end of the day. In theory, I can share them between many of my apps, especially as I'm using a monorepo for all my projects. This is yet to be tested though.

There are some rough edges around using `script` templates[2]; for example, it doesn't allow you to pass `this` and `arguments` from HTML event handlers[3]. This is not a show-stopper though, because you can always fall back to defining your event handler in an HTML <script></script> block, albeit without type checking of the arguments or deduping of the scripts. It also hurts Locality of Behaviour, which I'm a fan of.

That being said, I think it has some great potential. I can completely imagine people building fully-fledged UI libraries in this once component reusability is improved.

But the best part of using templ for me was that I could completely get rid of Node and NPM from my stack.

How was your experience?

[1]: https://templ.guide/syntax-and-usage/template-composition

[2]: https://templ.guide/syntax-and-usage/script-templates#script...

[3]: https://github.com/a-h/templ/issues/494


I’ve been using pongo2 for a year or two as I always found the standard html/template to be annoying to work with for inheritance as you mentioned. I’ve got years of Django experience and always liked their templates, and pongo2 is basically just Django templates in Go so it’s nice.

But being able to break things into reusable components with templ is compelling (and works fairly well from my experimenting). I’m just not in love with how you pass data into it and I ran into a few awkward points with the string interpolation of data when doing inline scripts.

I’m sure if I spent more time with it, I’d overcome these issues quick.


I remember checking out pongo2 briefly before deciding on templ, but my lack of familiarity with its syntax kept me away. I guess the calculus is much different if you have prior experience, but for me it was attractive that templ uses Go syntax in its templates (with some exceptions).

Yeah, passing data when for some reason you can't use templ's `script` block can be annoying, if that's what you meant! I think they should make sure the `script` block works in every situation; then it'll be way less awkward.


Hell yeah, includes stuff like zilogs.


If it could tell the difference between socket 2011 v1, v2, v3, and v4, I might use it.


I have great experiences building webapps with Rust and HTMX.


Looks great! Can you please add LGA 4677? Thanks!


It could be useful but currently too broken


It works fine for me, I like it!


Interesting idea (very out of scope):

CPUs go into motherboards which use power supplies and RAM and a storage device to boot and run operating systems/code

This all adds up to an "at the wall" power draw once plugged in (watts × 24 hours ÷ 1000 = kilowatt-hours per day)

Then, you take how efficient the hardware is at say, a proof-of-work mining algorithm (hashrate)

Then you can sort by which CPUs are most efficient

I've yet to find a database that does this (probably for good reason, mining is pretty dumb)
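The back-of-envelope math above, sketched as code (made-up example numbers):

```go
package main

import "fmt"

// Rig pairs a wall-power measurement with a mining hashrate.
type Rig struct {
	Name      string
	WallWatts float64 // measured at the wall, whole system
	Hashrate  float64 // hashes per second for some PoW algorithm
}

// KWhPerDay converts a constant wall draw into daily energy use.
func (r Rig) KWhPerDay() float64 {
	return r.WallWatts * 24 / 1000
}

// Efficiency is hashes per joule: the sort key for "most efficient CPU".
func (r Rig) Efficiency() float64 {
	return r.Hashrate / r.WallWatts
}

func main() {
	r := Rig{"example box", 200, 10000}
	fmt.Printf("%.1f kWh/day, %.0f H/J\n", r.KWhPerDay(), r.Efficiency())
}
```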



