
I love this blog post.

“It can display seven equally spaced vertical columns of text (critical importance), has driver issues (minimal importance), wake issues (who cares), it costs as much as four smaller monitors (this is good), I need a huge desk (hell yeah), there are multiple image quality issues (well it’s not like I have to look at it all day)…”

It is like “I spent fifteen hundred dollars on a multitude of hassles due to purchasing the wrong type of display, but due to the lack of bezel this is a prime efficiency move.”




> Multiple image quality issues

Only the first one (dirty screen) is a real issue, but it is subtle and irrelevant to programming; the second one (checkerboard), as the post explains, is solved by toggling an option in settings.

> Driver issues

The post explains that it works perfectly with current NVidia drivers on Linux, and on Windows both AMD and NVidia have had driver support for HDMI 2.1 for years.


I chuckled at the "only" in "The 8K display is only $1500 at BestBuy!" lol. I spent $400 on my projector that I use for my main screen, and it works great. But when I did that, I had previously only bought $200 projectors, so even that was not an "only" for me.


Reality warps when you and everyone you know pulls $200k+ annually.


I've never spent more than $75 on a monitor. I only buy used. Monitors depreciate like crazy and businesses are constantly getting rid of them, even when they're only a few years old. Yeah, you aren't going to get some 9001Hz 10K giga-OLED whatever, but I'm a programmer. If it displays text with reasonable contrast without hogging my whole desk, it does everything I need it to do.

The most expensive one - the $75 one - is a 24" 1920x1200 IPS display with HDMI, DP, VGA, 2x DVI, S-Video, and YPbPr composite. Never seen those last two on a monitor before, but there they are. I don't use that display as my main one anymore, but I keep it around because it's awesome and it plugs into literally anything.


It's an 8k projector?

Remember dropping a grand on a 30-inch 2560x1600 back in the day and thinking that was the ultimate.

The 40 to 45 inch range is the ideal; otherwise screen real estate goes too far into the peripheral vision.

The other issue with a lot of really big screen real estate is managing lots of windows. With dual screens you can usually maximize applications more easily than with one, because when you maximize on a super big screen it just takes up everything.

And usually the most relevant stuff is in the upper left hand corner, which gets pushed to the far upper left corner, actually pretty far out of your main field of vision.

But I still love the 43-inch 4K TV I've been using since 2010 or so.


No, I wish! It's 1080 the picture isn't amazing but it works fine and it's 100" on my wall across the room from my couch, so I'm happy. I've toyed with the idea of 4k projectors but they're usually magnitudes more expensive than 1080 projectors!


May I ask what projector? I’m thinking about getting one as well


Sorry, this recommendation will probably disappoint, it's from Walmart. It's a Vankyo Performance V700W. I can't necessarily recommend it.

I have a problem with over-shopping for things: spending too much of my life researching and getting frustrated before either never buying, buying above budget, or impulse buying, making the research time wasted. So if I can instead work with something I can drive to Walmart and spend $300-400 on, I am happy.

It's been fine but it's nothing special. Does the job and the picture is pretty clear when focused properly. It's bright enough and has good color and picture quality for my purposes. It's 100" on my wall across the room from my couch, so we use it a lot for gaming and watching videos. For programming stuff it works, but I can't optimize for space in the IDE by bumping the font size down like I would with high DPI monitors.

I'm also on my second one, as the first was left on constantly and started to develop dark spots. They were kind of fun to watch, as they'd start really bright and then fade, but obviously only in hindsight, because it made the screen hard to see. The last time I bought it, the price had dropped; I think it was under $200. I have had it for about a year, I turn it off when I'm not using it, and it's holding up a lot better!


Is it a dumb projector?


It is and not very high quality. Sorry I didn't mean to recommend everyone get a projector here or pump projectors. I just enjoy my setup and it was relatively cheap!

It will do screen mirroring though. It has 2 inputs and I use those directly and it doesn't offer apps or anything from what I've seen.


Once you no longer see pixels you'll never want to go back.


I'd give a lot to go back to my 20 year old eyes that could see pixels without special glasses. Sure I can't see pixels (well, maybe I still could on a janky third party CGA monitor from 1983), but it isn't worth it. (I'd say save your eyesight, but realistically I'm not aware of anything you can do to keep it past about 45.)


I think you'd have to sit further back than is otherwise natural (and then have the issue of legibility/lost workspace) to achieve "can't see the pixels" on this.

Sure it's 8K, but at 65" it's only got a PPI of about 135. For comparison, Apple (computer) displays and a handful of third parties that target Mac use are generally 200-220 PPI. That is "can't see the pixels" density, even if you smash your face against it.
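For anyone who wants to sanity-check those numbers: PPI follows directly from the resolution and the diagonal size. A minimal sketch (the 65"/8K and 27"/5K figures are just illustrative, assuming standard 16:9 panels):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(int(ppi(7680, 4320, 65)))   # 8K at 65" -> 135
print(int(ppi(5120, 2880, 27)))   # 5K at 27" -> 217
```

So an 8K panel would need to shrink to roughly 40" before it reached the ~220 PPI ballpark of the Mac-targeted displays mentioned above.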


220 ppi output with no subpixel rendering (i.e. modern Macs) has clearly visible jagged edges in angled lines and letters if you've got good vision or correct your vision to better than 20/20 (in my case, I get headaches if I don't).

If you are coming from typesetting world, laser printers from the early 1990s did 600dpi (dots per inch), and that remains sufficient for smooth lines, though newer printers will do 1200dpi too. Going down to 300dpi printouts is crap.

Heck, newer Kindles do 300ppi and that can clearly be improved.

Apple's "retina", like all things in life, does work for 90% of the human population as advertised, but there's still a big number of people who have better angular resolution than what they target.


There's always "better" of course, but my point was more that "can't see the pixels" doesn't usually mean "I can't see the pixels if I sit back from the display a bit". When the iPhone 4 was introduced, no one said "what's the difference, I can just hold my (other phone) away from my face further and I don't see the pixels!"

Can you reference an example that shows this phenomenon with angled lines? I haven't had an eye test specifically, but my vision is generally fine, and I don't see the effect you're referring to, on for example a lower-case "y".


I have a 55" 8K and I can't see the pixels while sitting 2ft away. Everything is crisp and I have a huge workspace. For mac I use 4k native so 2x integer scaling.


I've used both. I quite honestly don't care. I've heard many people that share your sentiment. But some of us just don't. Visible pixels are totally fine for me.


I went back from using different displays in HiDPI to using a single 43" 4K screen set to 100% scaling. Screen real estate trumps invisible pixels [for me, at the moment].


Don't you long for the warm glow of CRT phosphors??


Not at all. CRTs are terrible in every way. What I'd like is uLED displays to maximize contrast and minimize lumens.


I didn't see any mention of how many times he has to pick up his mouse when it gets to the edge of the pad to get the mouse from one edge of the screen to the other.


Author here: I use a Logitech G Pro X Superlight but also I use the i3 window manager and rely on keyboard shortcuts for a lot of the navigation. I have the mouse sensitivity set so that the cursor can traverse the width of the screen when moving the mouse about 13 cm, without any acceleration. This is still precise enough that I can move the mouse pixel by pixel if needed.
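As a back-of-the-envelope check on that setup, "7680 pixels in about 13 cm of mouse travel" implies an effective sensitivity of roughly 1500 counts per inch (a sketch using the figures quoted above; the 7680 px width is from the 8K spec):

```python
def effective_cpi(screen_width_px: int, travel_cm: float) -> float:
    """Mouse counts per inch needed to cross the screen width
    in the given amount of physical hand travel, with no acceleration."""
    return screen_width_px / (travel_cm / 2.54)

print(round(effective_cpi(7680, 13)))  # -> 1501, i.e. roughly 1500 CPI
```

That is well within range for a gaming mouse like the Superlight, which is presumably why pixel-level precision is still workable.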


I find it annoying that they've kind of gotten rid of mouse trails and other easy ways of finding the mouse pointer.

That's one of the main drawbacks of a massive screen: if you lose the pointer, it takes a lot longer to find it. The search time doesn't scale linearly with the width of the monitor; it scales with the square, since you're scanning an area.

So a 50-inch monitor is going to take roughly (50/30)² ≈ 2.8 times longer to find the mouse pointer on than a 30-inch one.
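Assuming search time tracks screen area and both monitors share the same aspect ratio, the ratio is just the square of the diagonal ratio; a quick sketch:

```python
def search_time_ratio(diag_a_in: float, diag_b_in: float) -> float:
    """Relative pointer-search time, assuming time scales with screen area
    and both screens share the same aspect ratio."""
    return (diag_a_in / diag_b_in) ** 2

print(round(search_time_ratio(50, 30), 2))  # 50" vs 30" -> 2.78x
print(round(search_time_ratio(60, 30), 2))  # doubling the diagonal -> 4.0x
```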

I don't like those hotkeys that highlight it; I like the mouse trail. That's the one where I can most easily find it. But generally those went out of fashion about 15 years ago.


Pointer trails are still a feature in windows last I checked, and hitting ctrl to animate a circle around it works pretty much everywhere. I don't use either of these features nowadays, and usually find my cursor by moving it until I get to somewhere with high contrast.

I haven't seen anything I like quite as much for quickly finding the cursor as macos's "wiggle for giant cursor" feature.


Just switch to a neon pink pointer and make it larger. Problem solved.


Who needs a pad? And who needs the mouse to move more than 2 to 3 cm max, with dynamic acceleration? I'm just doing 0.5 cm twitches most of the time.


That's easily solved with mouse sensitivity settings, it doesn't matter the size of the screen if you set it properly.


With all that text, I'm hoping their religion is keyboard shortcuts.


i3/sway was recommended in the post, so yes.


Maybe they use a marble mouse or something like that.


The example he's chosen is of a ridiculously sized TV. 65" is living room TV size.

There are smaller OLED displays that would be more suitable (while still rather big). Many are 'just' 4K, but the smaller sizes should give one a decent pixel size.


I actually spent $3500 on mine haha, back in 2021. Early adopter tax...


yeah but I kind of get it..


Absolutely. I remember watching Swordfish and wanting Stanley’s setup



