(Am picking this nit because the PC is historically very important to our broad industry, and maybe people in the future should still be aware it was a very solidly-constructed piece of office equipment, not a relatively flimsy-looking home computer.)
I grew up using an IBM PCjr, and it's not flimsy at all---it is solid and well constructed (with the possible exception of the keyboard). I will grant you that it's marginal at being a PC, but it's not cheaply built.
This week we had a UTF-8 issue that only impacted Safari. One of my colleagues had bought a brand new MacBook Pro, and our web-based software was fatally struggling with its default browser. Chrome was working fine.
It took us days to get to the root cause. Turns out the issue wasn't Safari, it was macOS. Chrome was working fine because, presumably, they do the UTF-8 handling on their own.
I always hear rosy things about how polished the Apple ecosystem is, but you can imagine how unamusing it is for my small business to spend $3000 of staff time on such a basic thing. I guess I understand it better now when people rant about Safari being the new IE.
At the risk of repeating comments, welcome to Unicode Hell. And possibly to cross platform development :)
Also, just because Chrome feels faster and logs you into your Google account without permission, it doesn't mean you shouldn't support alternative browsers. Do not only test your website in Chrome. It may get dismissed as broken by non-Chrome users.
> At the risk of repeating comments, welcome to Unicode Hell. And possibly to cross platform development :)
As someone who has spent a huge amount of time over the last few weeks on an extremely obscure and hard-to-reproduce Unicode bug in Windows, I can vouch for this. So can my ever-receding hairline. Turns out, Unicode is hard.
I had one a few weeks ago where, if the user had updated Safari but hadn't updated macOS, then their browser didn't _actually_ support WebP, even though Safari 14 lists it as "supported".
File format support is baked into the OS. Apple actually makes static linking very difficult for exactly this reason--so they can reliably introduce support for new file formats on OS upgrade, rather than some apps needing to catch up.
The classic Safari nonsense bugs. When I see a new Sentry issue pop up, I can instantly guess whether it's going to be Safari or a real error just by looking at the error text.
I've had a problem with Google Slides in Firefox, but I put the blame for that solely on Google.
Using right-click context menus to copy/paste only works in Google Chrome. My understanding is that, for security reasons, the browser doesn't allow websites to initiate a clipboard read/write. Google Slides recreates the right-click menu in JS, so no clipboard events can be made from it. But Google Chrome apparently grants Google websites special permissions, allowing them to initiate clipboard events anyway.
I'd say that's definitely something that is an abuse of market share, as Google is using market dominance in the browser space to have a competitive advantage in the unrelated presentation software market.
> Chrome was working fine because presumably they do the UTF-8 handling on their own.
This is why I use Qt and ship all the dependencies, vetted manually, on every platform (freetype, libav, etc.). The only way to stay sane is to bypass the operating system as much as possible and use the exact same code no matter the platform, except for the barest-bones operations (opening a GL context, getting key/mouse events, etc.). Text handling and rendering, video and image decoding... that should never depend on the OS you're using; those are properties of the apps. The more time passes, the more I think that networking should be moved into the apps too, bypassing the OS network stack entirely, given how much trouble it tends to cause (hello, the Bonjour / DNS-SD / ... mess).
> Text handling and rendering, video and image decoding, ... that should never depend on the OS you're using
I like having my fonts look the same everywhere, and only needing to configure them once. I also like my video and text to decode securely and with hardware acceleration.
We had unicode characters in strings in our database. When those were being fetched by the front-end on Safari, the browser was making them a different character. This meant if the exact same string was going back to our server after being in Safari, any kind of string-matching was failing. So let's say you were looking up something based on the string, it was not there.
We did some unicode testing on different platforms and this was never an issue. Also of note, these weren't complicated emojis or anything like that (e.g. character & skin-tone), just the usual accented characters that don't occur in English.
The resolution for now has been using String.prototype.normalize() with some tricks. It works for our case, but obviously it is not a silver bullet.
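For anyone hitting the same thing, the core of that kind of workaround (a minimal sketch; the variable names are made up) is just normalizing both sides to one canonical form before comparing:

```javascript
// Two encodings of "café" that render identically:
const fromDb = "caf\u00E9";      // NFC: é is one precomposed code point
const fromSafari = "cafe\u0301"; // NFD: plain "e" + combining acute accent

// A naive comparison fails, so any string-keyed lookup misses:
console.log(fromDb === fromSafari); // false

// Normalizing both sides to NFC makes the comparison form-insensitive:
const nfc = s => s.normalize("NFC");
console.log(nfc(fromDb) === nfc(fromSafari)); // true
```

As the replies below note, this papers over the symptom; doing the comparison in a unicode-aware layer is the more durable fix.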
That's not a UTF-8 issue. That's Unicode. And macOS is not incorrect here.
> So let's say you were looking up something based on the string, it was not there.
What this tells me is that you're storing unicode text in a database without using unicode-aware string comparisons. This means that, regardless of macOS, you could run into this issue whenever someone submits text in a different normalization form than whatever you've been using: your lookup will fail.
And it's not just NFC vs NFD. There's also NFKC and NFKD, which are the compatibility forms. For example, if I type fi (U+FB01 LATIN SMALL LIGATURE FI), that has the same NFC and NFD forms, but in NFKC and NFKD it becomes fi.
> The resolution for now has been using String.Prototype.normalize() with some tricks.
Please fix your database to use a proper unicode-aware comparison instead. I don't know what database you're using, but this may involve simply telling it what the text encoding of the column is. This is especially true when you start thinking about compatibility normalization (i.e. NFKC/NFKD). I believe the compatibility forms are generally appropriate for string comparisons (e.g. if someone searches for "fi" they should get results for "fi"; try it out in your browser's search field right now if you like), but you certainly don't want to store your text that way. And since you're presumably relying on your database to do the lookup for you, this means you need your database to be unicode-aware; otherwise you have no choice but to store the compatibility canonicalization form in order to get search to be correct.
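To make the canonical vs. compatibility distinction concrete, here's a quick sketch you can paste into any JS console:

```javascript
// U+FB01 is the single-code-point ligature "fi".
const ligature = "\uFB01";

// The canonical forms (NFC/NFD) leave the ligature alone...
console.log(ligature.normalize("NFC") === "\uFB01"); // true
console.log(ligature.normalize("NFD") === "\uFB01"); // true

// ...but the compatibility forms (NFKC/NFKD) decompose it to plain "fi":
console.log(ligature.normalize("NFKC") === "fi"); // true

// A compatibility fold, roughly what a find-in-page match does:
const fold = s => s.normalize("NFKC").toLowerCase();
console.log(fold("\uFB01nance") === fold("FINANCE")); // true
```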
Heh... This reminds me of a bug we ran into a few years ago when rehydrating JSON content for a website my team was working on.
To render a page, we pulled up some JSON data from our database and rendered HTML based on that. To rehydrate the page, we baked the JSON data itself into the page too. We did that by naively JSON.stringify()'ing our database values into a script tag at the bottom of the page. The browser was essentially eval-ing the content into a variable for us.
But oh how young and naive we were. Before long one of our clients started complaining that our web app was broken for them. We took a look and sure enough, the page was broken. We were scratching our heads for a few hours over that one. The client had somehow managed to insert a weird, invisible character into a string in one of the JSON objects we were rendering. It turns out, contrary to all expectations and sense, JSON isn't actually a strict subset of javascript. There are some perfectly valid JSON values which aren't valid javascript. Somehow our client had accidentally created one such value. When it was rendered back out into our script tag, the browser threw a parse error because our JSON wasn't valid javascript. And that made it abort loading our page's javascript bundle, and everything went sideways.
Of course, the other problem with this approach is that if any string happened to contain "</script>" then the browser would have considered that point the end of the javascript content. That would have also done weird, wild and potentially dangerous things to our page. But luckily we found that before any of our users stumbled on it. At least, as far as we know.
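For what it's worth, the usual fix covers both failure modes at once (a sketch; the function name is mine): escape the two code points that are legal in JSON but, before ES2019, illegal in javascript string literals (U+2028/U+2029), and escape "<" so "</script>" can never appear in the output:

```javascript
function safeJsonForScriptTag(value) {
  return JSON.stringify(value)
    .replace(/\u2028/g, "\\u2028") // LINE SEPARATOR: valid JSON, broke JS parsers
    .replace(/\u2029/g, "\\u2029") // PARAGRAPH SEPARATOR: same problem
    .replace(/</g, "\\u003c");     // neutralizes "</script>" (and "<!--")
}

const payload = { note: "line\u2028break", html: "</script><b>boom</b>" };
const embedded = safeJsonForScriptTag(payload);

// No raw troublemakers left, and the data still round-trips:
console.log(embedded.includes("\u2028"));    // false
console.log(embedded.includes("</script>")); // false
console.log(JSON.parse(embedded).html);      // "</script><b>boom</b>"
```

The escapes are applied to the serialized string, so JSON.parse on the client sees ordinary \uXXXX escapes and recovers the original values exactly.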
I've noticed on several occasions that text written by Mac users and published online would replace the Swedish letters ä and ö with a¨ and o¨. I think it's due to some combination of how they do Unicode normalization and how Firefox on Windows renders the text. I haven't seen it for at least a year, so I'm guessing it's resolved now.
Same with Firefox on MacOS. Chrome and Safari do find "fi" for "fi" and "fi" for "fi".
This suggests some questions.
1. Are fi.com and fi.com the same as far as DNS is concerned, or could both exist as separate websites?
2. If they are separate websites, do the various password manager browser extensions recognize they are different, or might they fill in your fi.com password when you are getting phished with fi.com? Same question for the various browser's built-in password and form savers.
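On question 1, my understanding is that this gets resolved before DNS ever sees the name: WHATWG URL parsers run hostnames through IDNA/UTS-46 processing, which includes an NFKC-style compatibility mapping, so the ligature is folded away. A quick check in Node (behavior inferred from UTS-46; worth verifying in your own environment):

```javascript
// A hostname containing the U+FB01 ligature is mapped to plain "fi"
// by the URL parser's IDNA/UTS-46 step, before any DNS lookup happens.
const u = new URL("https://\uFB01.example/login");
console.log(u.hostname); // "fi.example"
```

If that holds, the two names are the same on the wire, which would also answer the password-manager question for any manager that keys off the parsed hostname rather than the raw text.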
Thanks for trying to be helpful. Of course, I over-simplified our 4 years of development and the specialized field we are in. There isn't really a database involved here. I am sure there are many more correct ways for us to do certain things; that said, I respectfully disagree: macOS is the issue here.
When you can be compatible with everything else, please try to be. macOS isn't in this case. I would rather stand on the shoulders of giants who spare my staff from having to learn about "NFKC" and "NFKD". It's nicer to spend that time with our families. That's my humble opinion and personal values, of course.
> When you can be compatible with everything else, please try to be. MacOS isn't in this case.
You say that like macOS is bucking the trend. NFC is a very common normalization form for text input, regardless of whether the browser chooses to normalize. And older versions of the W3C "Character Model for the World Wide Web: String Matching" note[1] even recommended using NFC explicitly, although the current version says that skipping Unicode normalization is the recommended form of matching for "new specifications".
As for Safari, it's been doing NFC normalization ever since 2006[2], and it sounds like this was originally at least partially-motivated by fixing a compatibility issue with Windows.
In any case, Safari converting text to NFC is certainly differing behavior from non-WebKit browsers, but if you give someone text in NFD and they delete and retype it, it's likely to end up in NFC anyway, which means this is a problem regardless of browser.
> I always hear rosy things about how polished the Apple ecosystem is
But this sounds more like a bug in your software. Many accented characters have two representations in Unicode, so you always have to normalize strings before making comparisons, or use a comparison function that takes the different encodings into account.
It sounds like this would be a nightmare in Unix filenames; even if everything in my system is perfectly clean UTF-8, I could be trying to open "café.txt" and get a No such file or directory error because the input has é encoded a different way from what is saved on disk. You would need to list the directory contents, decode everything, perform the comparison on Unicode code points, and pray there is no more than one match. Any web UI that deals with files would have this kind of issue.
Some older filesystems, like HFS+, would normalize filenames. In APFS, Apple decided not to normalize filenames, but this caused a lot of issues. Later they made APFS normalization-insensitive (so you cannot store two files with the same name but different normalizations). Different normalizations are handled transparently by macOS frameworks.
On Linux with btrfs:
$ touch schön
$ cat $(echo -e "scho\u0308n")
cat: schön: No such file or directory
$ touch $(echo -e "scho\u0308n")
$ ls -l
total 0
-rw-r--r-- 1 daniel daniel 0 Dec 17 12:00 schön
-rw-r--r-- 1 daniel daniel 0 Dec 17 12:00 schön
$ ls | hexdump -c
0000000 s c h o 314 210 n \n s c h 303 266 n \n
We learned that the hard way. Still, the basic dev expectation is to get out the same thing you put in. It shouldn't sound outrageous that we expected to get the same representation. That's all.
With all due respect, software development is complicated and it sounds like you just weren’t aware of this aspect of it. Which is fine. But it’s a bug in _your_ software, not Safari.
Yes, and this threw us off because we thought we weren't communicating the encoding correctly with Safari. A day later we finally realized that's not the issue.
From my perspective, the hours worked may add up to $3000 in salary, but the staff member who solved the problem may feel very good about their accomplishment.
Apple has severely cut back on QA and simply cannot claim to publish quality software anymore. The number of very easy to find bugs I run into makes that obvious. As an example, the latest iOS update introduced a bug that causes the input box to disappear completely on an iMessage thread so I can’t send messages unless I close it and come back. Happens daily. Embarrassing.
It doesn’t help that I have a friend who works at Apple and acts as though the continued bugginess is my fault unless I take the time to undergo the pain of submitting bugs, for which one never receives a response and which are ignored for literally years on end. We are all unappreciated beta testers.
> Several readers used the same term for what they think the industry has made them become: gamma testers.
> "Gamma testing is just like beta testing in that you get a product that's too buggy for prime time and you help the vendor iron out the problems," explained one. "The only difference is that now we're bug testing what's supposed to be a shipping product that we paid for. And likely as not, these days they'll try to charge us for the privilege of reporting the bugs we find."
> Plus, WORDVISION is designed to be "support-free" after the sale, and our unique Pioneer's Club of "gamma testers" is helping insure we live up to that goal.
> ... at least a decade since I started calling myself an unpaid gamma tester, for example. But people thought that one meant I specialized in displays and monitors...
Yea, I’m feeling this. I submitted a bug report, and while there was a lot of “care”, the result was “not a bug, won’t fix”. On the one hand, I can understand that. But on the other, it’s a crappy user experience, and the Apple I once supported, I’d expect to fix it.
It’s why I’d rather support open source now, as, at least if no one cares to fix something, I can do it myself, or pay someone else to.
The bug: the default iOS mail client chokes if you copy and paste an email link into the to, cc, or bcc field, i.e. “mailto:test@test.com”.
The response was “that’s how links work, that’s not a bug in the client”, and while technically that’s true, the best user experience would be to parse out the “mailto:” when it’s pasted into the to, cc, or bcc field rather than failing to send the email.
And yes, from safari you can select “open in mail” and it will work. But if you need to copy a second mail link to cc, you’re stuffed.
And there’s no way to edit out the “mailto:” portion.
So the solution is to copy and paste the link into a text editor/notepad, remove the “mailto:” portion and then copy and paste the remaining portion into the mail client to,cc,bcc field.
And I think a better solution would be to either:
A) automatically parse out the mailto: portion when pasting into the to, cc, or bcc field.
Or B) prompt the user before doing A, i.e. “do you mean to send to test@test.com?”
I had a MacBook Pro for about 6 months, about 5 years ago. I filed, afaik, 5 bugs, 4 of them for screen-related issues (no external screen without powering the laptop, external screens not coming back after sleep, etc.), all bugs that had been present across the whole series (other colleagues got the same $2500 beast, with the same issues).
Not a single one was answered in the time I worked there. A few got some 'me toos'
At the same time I filed several bug reports for the new Gnome desktop that was rolling out and they all got answered, fixed or at least closed with a short answer.
It was only a short time I worked in that ecosystem. But as far as it worked out for me, I got zero support for several major bugs. Definitely not what I expected. Definitely not an OS I am willing to spend productive time on.
When was the last time you heard of someone actually buying a copy of Windows, though? I've owned a lot of different versions of Windows in the past 30 years, but never paid for a single one.
I know someone who bought a copy of Windows 10 for their computer, because they asked me to help install it. It was definitely not an OEM thing, and the packaging had a USB drive in it.
Don't you just get a free license when you buy their expensive ~locked down~ hardware? Afaik you don't get free macOS licenses for Hackintoshes or virtual build machines.
> Don't you just get a free license when you buy their expensive locked down hardware?
Apple hardware capable of running macOS may be expensive, but "locked down"? Definitely not. You can run Debian on anything from old PowerMacs to the new M1 model, although I admit that the driver situation on M1 isn't feature-complete.
Don’t be like that. Providing it for free does not mean that it has to work on your device. By that definition, no software (including non-Apple) for macOS is ever free because it requires a more expensive computer to run.
Besides, there’s guides-a-plenty all over the internet about making “Hackintoshes”.
Then the pricing comparison between Windows and macOS that the grandparent made doesn't make sense: one cannot be bought without accompanying hardware, unlike the other, which effectively prices macOS into the hardware itself. The comparison falls flat from the start.
By that logic, iOS and Android are also free, and you can also get Windows for free with various OEM laptops and systems. The difference is that you can choose to purchase a Windows license to use on a machine you build yourself, versus the inability to purchase a separate macOS license without any Apple hardware.
So any comparison on the price of the OS is moot, since the cost of macOS is priced into the accompanying hardware (MacBook, iMac, etc.), as you cannot buy it without.
Characterizing macOS as something Apple intends to distribute for free is mostly false. They intend to charge for it with the purchase of a Mac.
Microsoft also distributes Windows "for free" on their website, and if you care not at all about license terms, as we evidently don't when it comes to macOS, then it presents no moral quandary to disable its activation annoyances. Is Windows free?
There is no license that can be purchased for macOS. It literally has no cost in any circumstance. This is just tortured sophistry to arrive at the tremendous conclusion that Windows does, and sometimes doesn't, cost money. Is Windows free? "Sometimes!"
You're intentionally stepping around my arguments. Sure, it has no cost for Mac owners, but the ISO is free to download, and Hackintoshes are very much a thing. Again: if we use your definition, then anything requiring paid hardware cannot, by definition, be free, which is absurd to suggest. No one I've met claims that free macOS-exclusive software isn't actually free. Why is the OS any different?
Violating the EULA that Apple makes you agree to during install (by making a Hackintosh), while a tortious action, is not necessarily illegal, and it doesn't cost money.
It's wonderful that so many smart people are uncovering bugs in widely used software, but it gives me a sense of dread for all the countless bugs of varying severity still undiscovered. Our digital society is like a Jenga tower. I think I need another coffee.
For Safari users seeing persistently confusing or inconsistent results: it’s also worth noting that Safari caches aggressively and quite often ignores cache-invalidating HTTP headers. So you might just be seeing what’s in memory, over and over.
Er, the different experiences Safari users are reporting may very well have to do with how it’s showing cached resources. If some people are reporting frequently different results for a race condition and others are reporting stable results, aggressive caching very well might be a factor.
When your browser caches an image, it doesn't cache the pixel buffer itself; it just caches the original file (the PNG in this case). Because the PNG is non-deterministic, it can render differently every time you open it, even if you save it to disk and open it from there.
It's a race condition. Under the right circumstances, eg hardware at the right speed, under the right load, it might very well only result in one thing or another. The page is only ever serving one image anyways, so you can cache it as hard as you want.
Let's not forget that Apple want us to trust them to review our image files before they call the police, when their client-side content scanner detects too many matches against an opaque police database of hashes.
I don't think Apple have revealed the specifics of what process their human reviewers use (as a "security by obscurity" approach, and to prevent the bad press of admitting that they potentially subject low wage contractors to reviewing mentally damaging illegal images), but it's at least conceivable that an iPhone user could receive a series of images which look like (for example) bad scans of paper documents, but, when rendered on a reviewer's screen, look like images that require police attention.
I see the blend of both on iOS Safari as well, and it loads differently each time I refresh. It’s a race condition as described, so presumably whichever concurrent sub-render finishes first wins. That’s more than enough computing noise to produce that output.
That is indeed the "intended" behaviour. I don't own an iOS device, and based on initial testing by someone else, it seemed like it wasn't working. But from what I hear now, it does work on iOS too.
The "ambiguity" is introduced by Apple's incorrect algorithm. So, sure, Apple can introduce whatever mistakes their users will put up with and you can call this "ambiguous" and you can write tools that report oh dear, this data was "ambiguous" but what anybody who isn't inside the Reality Distortion Field sees is that Apple screwed up as usual.
If you're hosting an image, you can also have some fun with referrer, user-agent, geo-ip, etc, based redirection. Where you post an image in a forum or similar, and different users see a different image.
Same here, but I found the image flipped randomly over several reloads. Seems the race condition exists in that POC also. Perhaps there's a hardware condition in the authors' particular iDevice that causes it to consistently favor one path.
Apple is optimizing some PNGs so they can encode/decode them in parallel. These are still compliant PNGs, so non-Apple decoders can still read them. The optimization relies on an `iDOT` chunk that macOS handles but other PNG readers ignore. There is a bug in the macOS parallel decoder, however, causing these images to look different on macOS than they do elsewhere. (Even within the same macOS instance, you can get two different results from the same image.) The author has also created a PNG that demonstrates a race condition within the macOS PNG decoder: https://www.da.vidbuchanan.co.uk/widgets/pngdiff/race.html
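If you want to check whether a given PNG carries that chunk, walking the chunk list is easy. This is my own sketch (Node.js), not Apple's code; it only reads the chunk layout the PNG spec defines (4-byte big-endian length, 4-byte ASCII type, data, 4-byte CRC) and doesn't verify CRCs:

```javascript
// Return the chunk types of a PNG buffer in order, e.g. to spot "iDOT".
function listPngChunks(buf) {
  const MAGIC = Buffer.from([0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a]);
  if (!buf.subarray(0, 8).equals(MAGIC)) throw new Error("not a PNG");
  const types = [];
  let off = 8;
  while (off + 8 <= buf.length) {
    const len = buf.readUInt32BE(off);                   // data length
    types.push(buf.toString("ascii", off + 4, off + 8)); // chunk type
    off += 12 + len;                                     // length + type + data + CRC
  }
  return types;
}

// e.g. listPngChunks(require("fs").readFileSync("img.png")).includes("iDOT")
```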
Surely someone has realised by now that non-Apple plebs will have no idea what this is about. Any screenshot/photo of the screen to show the "problem"? Thank you.
https://en.wikipedia.org/wiki/IBM_PCjr
https://en.wikipedia.org/wiki/IBM_Personal_Computer