I was tickled to scroll down and see my own stupid avatar. Claimed my profile and wrote a note.
Thank you so much to everyone in the Ruby community. I spoke at conferences around the world for 13 years, made a ton of friends, and officially retired from speaking at Rails World in Toronto back in September. Some of the best people I've ever met, and the community transformed my life and career.
Was lucky to be at the conference when you gave it, and it's stuck in my head ever since. When I'm struggling for motivation or my side projects seem just too big, I give it another watch.
If I were building a competing expense system, the pricing slider would show how $5/user/month compares to how many dollars in rewards you (as the business owner) would be giving away, based on total spend, by using a card issued by a "free" SaaS app (sketched below).
If a company decided to sign up for corporate cards directly such that _they_ were reaping all those rewards, that'd be perfectly fine and reasonable, and it happens all the time. But as it is, employers are giving away the store by signing up for cards issued by these apps, and the software landscape is such that they don't have a ton of other options.
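To make that concrete, here's a back-of-the-envelope sketch of the comparison that slider would surface. All the inputs (annual card spend, headcount, cashback rate) are made up for illustration:

```python
# Hypothetical comparison: flat per-seat pricing vs. the card rewards a
# business forfeits to a "free" expense app. All numbers are illustrative.

def forfeited_rewards(annual_card_spend: float, cashback_rate: float = 0.015) -> float:
    """Rewards the business gives up when the app keeps the card's cashback."""
    return annual_card_spend * cashback_rate

def paid_tool_cost(users: int, price_per_user_month: float = 5.00) -> float:
    """Annual cost of a $5/user/month expense tool."""
    return users * price_per_user_month * 12

spend, users = 2_000_000, 50  # made-up mid-size company
print(f"Rewards given away to the 'free' app: ${forfeited_rewards(spend):,.0f}/yr")  # $30,000/yr
print(f"Cost of the $5/user/month tool:       ${paid_tool_cost(users):,.0f}/yr")     # $3,000/yr
```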
If your company is worrying about whether it or its employees are getting the credit card rewards for spending, instead of focusing on whether it's delivering a useful, valuable product or service that its customers will happily pay for, then it has lost its way.
As the author, I can answer this: because I believe there is such a thing as "enough money." And I'm personally at a point where the decision in my mind is not between "build a great product that makes less money" and "release a compromised product that makes more money". Instead, the decision is between "build this app for my own private use" and "release it and put up with dealing with users so that others can use it too". And I'm genuinely on the fence about the latter most of the time.
Also, I have a proven track record of stopping things once they've run their course, so I'm pretty confident I'll be able to resist the temptation. Recent evidence: announcing that my latest conference talk would be my final one: https://www.youtube.com/watch?v=loTaZAkIZP0
This is a huge reason why I still use StandardJS and (shifting back to Ruby) why I rejected countless requests to implement line-length or other metrics-analysis rules for Standard Ruby (https://github.com/standardrb/standard). There is always a legitimate edge case when it comes to the length of lines and functions, and the alternative of chopping them off arbitrarily is rarely an improvement.
The HDMI port on the RTX 3090 is HDMI 2.1 (48 Gbps) and should be able to push resolutions higher than 4K, but it's no better than the DisplayPort here. Both HDMI and DisplayPort can carry audio signals, but the Apple Studio Display only pipes audio over USB, so a single passive HDMI or DP cable will not transmit audio.
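Back-of-the-envelope math on why the bandwidth matters (my numbers, not the thread's): the commonly quoted payload rates are ~25.92 Gbps for DP 1.4 (HBR3 × 4 lanes after 8b/10b encoding) and ~42.6 Gbps for HDMI 2.1 FRL (48 Gbps raw, 16b/18b encoding). A rough check for driving the Studio Display's 5K panel at 60hz:

```python
# Approximate bandwidth check for 5120x2880 @ 60 Hz. The ~7% blanking
# allowance is a rough figure; real timings vary by standard (CVT-RB etc.).

def required_gbps(w: int, h: int, hz: int, bpp: int, blanking: float = 1.07) -> float:
    """Approximate video data rate in Gbit/s including blanking overhead."""
    return w * h * hz * bpp * blanking / 1e9

DP_1_4_PAYLOAD = 25.92   # HBR3 x4 lanes, after 8b/10b encoding
HDMI_2_1_PAYLOAD = 42.6  # 48 Gbps FRL, after 16b/18b encoding

for bpp in (24, 30):  # 8-bit vs. 10-bit per color channel
    need = required_gbps(5120, 2880, 60, bpp)
    print(f"5K60 @ {bpp} bpp needs ~{need:.1f} Gbps | "
          f"fits DP 1.4: {need < DP_1_4_PAYLOAD} | fits HDMI 2.1: {need < HDMI_2_1_PAYLOAD}")
```

By this rough math, uncompressed 10-bit 5K60 is just past what DP 1.4 can carry without DSC, while HDMI 2.1 has headroom to spare.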
I am a lifelong gamer, have excellent vision, and routinely fail side-by-side tests comparing frame rates in excess of 60hz. Not everyone can perceive the difference. (One explanation is that some people may be able to essentially interpolate frames well enough that the information lost is not perceptible, in the same way that you might not be able to tell the difference between 144hz and 240hz.)
That being the case, I'd much rather trade away frames in exchange for improved image quality, as I'm doing in this case.
This fascinates me. I hear it all the time that some people can see the difference and others can't. I was thinking maybe it's age related, like the degrading frequency response of the ear, but many people my age don't perceive it either. I'm in my twenties and the difference between 60hz and >=120hz is like day and night. My vision is not very good, though.
I even prefer using a high-refresh-rate screen for office and coding use. I notice it a lot when dragging/moving stuff and scrolling.
Yeah, indeed. Not rocket science. I use a Mac all day for work and productivity. I play games sometimes on a PC. And as it pertains to games, I personally care way more about resolution and image quality than I do about frame rates exceeding 60hz, which is why I'm super grateful to have this display and a beefy GPU that can easily drive it at native resolution at 60hz.
Important to note that this is a 110dpi monitor and therefore neither retina nor suitable for integer resolution scaling, whereas the Studio Display is 218dpi. I care a lot more about resolution and dot pitch than about frame rates higher than 60hz.
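For reference, those dpi figures fall straight out of resolution and panel size. Both calculations below assume 27-inch diagonals (the Studio Display is 27"; the exact size of the 110dpi monitor isn't stated, so a 27" 1440p panel is used for illustration):

```python
import math

def dpi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch, measured along the panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'27" 2560x1440: {dpi(2560, 1440, 27):.0f} dpi')  # ~109, i.e. "110dpi"
print(f'27" 5120x2880: {dpi(5120, 2880, 27):.0f} dpi')  # ~218 (Studio Display)
```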
That’s a good point. What I meant was that the DP connector found on graphics cards is pretty common (“standard”), so it should be included. As a frame of reference, a similarly priced monitor I have has 2x HDMI, DisplayPort, Thunderbolt 3, and a few USB-A ports if you use it as a Thunderbolt hub. And it’s a 144hz ultrawide with HDR support.
And while I acknowledge that Apple made a tradeoff for pixel density, they weren’t forced to make tradeoffs for other features that are common on high-end monitors.
> the DP connector found on graphics cards is pretty common (“standard”)
So are D-sub connectors and HDMI. Should it have those as well? What about S-Video? Composite?
> they weren’t forced to make tradeoffs for other features that are common on high end monitors
They don't sell any products with DisplayPort connectors, nor do I think they've ever done so (certainly not recently). Selling a product with that connector is a commitment on some level to supporting what people plug into it, and as demonstrated by the OP, PCs are not really supported. It's also more engineering effort, but that's probably marginal.
USB-C is a standard connector, and cheap cables using its DP alternate mode adapt it to legacy devices. Apple are known for getting rid of legacy connectors. USB-C is a superior technology; it's only a matter of time before it's used for everything, and I appreciate Apple's efforts to hasten that day.
I have a Mac. I also have a work-issued PC. I have both machines connected to the same displays, and switch between them.
DisplayPort isn't some obsolete technology here. I understand why this display doesn't have HDMI, or DVI, or VGA, or Composite, or S-Video, etc. But DisplayPort would be nice.
That's how data cables/ports work. There is a DisplayPort "protocol", then there are DisplayPort physical connectors and cables. Then there are other types of physical connectors and cables that can also carry the DisplayPort "protocol". The same is true for USB, HDMI, etc.
Yes, it's confusing. But it's also a good thing, as you can carry lots of stuff over the same cable and connector. A USB-C cable and connector can carry USB, Thunderbolt, DisplayPort, HDMI, even analogue audio (a toy sketch of this follows below).
Before USB-C, Thunderbolt used the Mini DisplayPort connector and cables.
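Here's that layering as a toy table. It's illustrative, not exhaustive, and alt-mode support varies by device and cable:

```python
# Physical connectors vs. the signaling protocols they can carry.
# Illustrative only; actual support depends on the device and cable.
CONNECTOR_CARRIES = {
    "DisplayPort":      ["DisplayPort"],
    "Mini DisplayPort": ["DisplayPort", "Thunderbolt 1/2"],
    "HDMI":             ["HDMI"],
    "USB-C":            ["USB", "Thunderbolt 3/4", "DisplayPort (alt mode)",
                         "HDMI (alt mode, rarely implemented)",
                         "analog audio (accessory mode)"],
}

for connector, protocols in CONNECTOR_CARRIES.items():
    print(f"{connector} connector can carry: {', '.join(protocols)}")
```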
I don't think Apple really wants you to buy their monitor for use on a PC. It's just a support headache, between the cables and graphics cards and refresh rates and gamma and such.
That it happens to work on a PC at all is, I guess, an unintentional side effect that they'd rather not even acknowledge. You can try it unsupported but it's not their problem...
They've never been good at supporting standards not invented there. Hell, ever tried to use a Magic Mouse with a PC? Everything about it feels completely off. Part of me wonders if subtle incompatibilities, where hardware works 80% of the way, are maybe even part of their deliberate strategy to introduce small headaches into the PC experience and frustrate you into switching.
(Of course OSX has its own share of frustrations too, but nothing like using PC hardware on a Mac or vice versa).
I have one and somehow managed to get it into a mode where the UI was an appropriate size and it ran at 120Hz. It worked perfectly for almost a year; then I updated to Monterey and can't for the life of me get it to run in HiDPI mode at my old preferred scaling. I've tried all the third-party monitor-control apps (EasyRes doesn't have the scaling I'd like in its list of HiDPI resolutions).
I might just get rid of the monitor. I'm not convinced I loved it for my workflow anyway, compared to two monitors where I can switch my Mac workspaces independently.