There is also mkcert by Filippo Valsorda (no relation to mkcert.org) at https://github.com/FiloSottile/mkcert

Yup, mkcert is used by caddy which is used by localias :)

Is it really true that the phone must be passively listening? The field of the payment terminal will induce current in the NFC coil, and that should be able to wake the phone as necessary.

That's a common way of doing it, but Apple devices actively amplify the signal in card emulation mode as well, which gives them longer range than physical cards or "purely passive" devices.

But it also means they can't do the neat trick of paying with a completely dead (i.e. not even reserve battery power) phone that some early Android and Windows Phone devices could do.


Maybe I’m not understanding properly, but iPhones absolutely can do NFC payments when the phone is dead. Your nominated “express” card will work for transit payments, and I believe car and house keys continue to work too.

No, that still requires some battery on iOS, i.e. it's only possible in the same "power reserve" mode that still sends the occasional "Find my iPhone" Bluetooth beacon.

Field-powered mode is possible in at least some NFC chipsets, but I suspect that Apple either values a consistent NFC range more than usability with a completely dead battery (the amplifier that gives Apple Pay its significantly higher NFC range obviously needs power), sees it as a security feature (reserve mode is capped to a few hours, I believe), or uses an NFC controller that simply doesn't support it.


That seems like an obvious security vulnerability. If the phone has no power, how does it authenticate the payment request?

It only works with approved transit providers, and you have to explicitly enable it, so the exposure is fairly limited.

https://www.apple.com/uk/apple-pay/transport/


If you think about it, it's still more secure than a physical card – that also doesn't have any authentication method at transit terminals, but unlike the "Express Transit" option on iOS, you can't turn that functionality off at all.

That is true. But why would you replace a simple filter that is known to work with this thing that yields worse performance? There are no upsides to this, and that is why the domain expert is baffled. We have been making filters like this for a long time and it is known how they should be evaluated, yet this thing is still touted as state of the art by its creators. If the image quality is lower with this filter and you say that it does not matter, then the quality of the stream is too high to begin with. This filter is not going to solve that.


It apparently did better with users' subjective evaluations. I guess they liked the "extra detail" look, even if it's fake. End users aren't zooming in to freeze frames or, heaven forbid, taking screenshots (think of the copyright!!)


End users aren't domain experts. That's like if I was allowed on the factory floor where my favorite products are made so that I could make subjective, uninformed decisions about the manufacturing process which also affect everyone else.


Is it? Or is it like if you were allowed into a focus group where they let you try out two manufactured products, so that you could make subjective, uninformed decisions about the product lineup which also affect everyone else?


Laundering manufactured consent for cheaper, inferior products as consumer choice is a pretty old trick.

A consumer's perceived preference can be modulated by any number of issues about which the consumer is not properly informed, such as preferring one sweetener over another even when there are valid health concerns, thanks to generational brainwashing and an intentional lack of consumer education. What we have today is the consumer-market equivalent of the unsophisticated investor market: people getting economically exploited and feeling like it was their own idea.

In the case of video encoders... Actual domain experts should not be forced to cede to the whim of an untrained eye simply for the purpose of capitalists extracting more money from them.

It's one thing for Netflix to experiment with the latest advances in machine learning, as a lot of us do. But when the opening sentence of their blog post is "When you are binge-watching the latest season of Stranger Things or Ozark, we strive to deliver the best possible video quality to your eyes", it's hard not to take issue with A) the commodification of entertainment and the patronizing consumer-speak, and B) a misleading proposition about commitment to quality, which the cited response article makes clear is untrue in the case of this product.

Deferring to domain experts in this case is better for every consumer, as they will not be duped into accepting an increasingly degraded experience in exchange for increasing monthly rates. People feel very strongly about film quality, and for good reason. We're talking about the preservation of art and culture, but Netflix doesn't see it this way; to them, it's all a commodity, and they manufacture consent for commodification. If the average user can't see this and push back against the degradation of quality that comes with the commodification of art, I have a hard time deferring to them over an actual expert.


Yeah, but there are classical algorithms with tunable sharpening that are cheaper and widely known (e.g. Catmull-Rom, sketched below, or John Costella's "Magic Kernel Sharp").

My suspicion is that none of this mattered though, because the evaluation was probably "perceptual equivalence" vs bitrate. I can easily believe it might be a marginal win over traditional algorithms from that perspective.
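
For reference, a minimal sketch of the Catmull-Rom kernel mentioned above (my own illustration, not from the thread; the coefficients are the standard Keys cubic with a = -0.5, whose small negative lobes are what give resampled images a mildly sharpened look):

    # Illustrative only: the Catmull-Rom interpolation kernel.
    import numpy as np

    def catmull_rom(x):
        """Catmull-Rom kernel (Keys cubic, a = -0.5); support is [-2, 2]."""
        x = np.abs(np.asarray(x, dtype=float))
        out = np.zeros_like(x)
        near = x < 1
        far = (x >= 1) & (x < 2)
        out[near] = 1.5 * x[near]**3 - 2.5 * x[near]**2 + 1
        out[far] = -0.5 * x[far]**3 + 2.5 * x[far]**2 - 4 * x[far] + 2
        return out

    # Taps at half-pixel offsets; note the negative lobes between 1 and 2.
    print(catmull_rom(np.linspace(-2, 2, 9)))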


That is the exact same phenomenon. Artificial sharpening introduces high-frequency components into a signal. If you band-limit (low-pass) a signal to fit below the Shannon-Nyquist limit, you will get ringing, as the signal cannot be represented accurately and will smear in the time domain. Given a bandwidth constraint, artificial sharpening above a certain threshold will result in ringing.
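
To make the ringing concrete, here is a minimal sketch (my own illustration; the gain and cutoff are arbitrary): sharpen a step edge, band-limit it with an ideal (brick-wall) low-pass, and compare the overshoot against the unsharpened edge.

    # Illustrative only: sharpening plus band-limiting produces ringing.
    import numpy as np

    n = 512
    edge = np.zeros(n)
    edge[n // 2:] = 1.0                        # ideal step edge

    # Unsharp-mask style sharpening: boost high frequencies.
    blurred = np.convolve(edge, np.ones(9) / 9, mode="same")
    sharpened = edge + 1.5 * (edge - blurred)

    def brickwall_lowpass(sig, keep_bins):
        """Zero all FFT bins above keep_bins (ideal band-limiting)."""
        spec = np.fft.rfft(sig)
        spec[keep_bins:] = 0
        return np.fft.irfft(spec, n=len(sig))

    plain = brickwall_lowpass(edge, 40)
    sharp = brickwall_lowpass(sharpened, 40)

    # The sharpened edge overshoots noticeably more after band-limiting.
    print("overshoot, plain:     %.3f" % (plain.max() - 1.0))
    print("overshoot, sharpened: %.3f" % (sharp.max() - 1.0))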


Images don't have infinite bandwidth, so that doesn't apply. The filter used in H.264 and newer codecs is exact and nearly reversible; there aren't artifacts from applying it. The artifacts come from the rounding afterward.


I guess Tailscale or Mullvad should consider hosting you.


This raises the question: are stills still how people use Wikipedia?


Due to the way iOS apps are sandboxed together with their user-created content, a lot of users have video projects that are locked inside CapCut with no easy way to access them following the ban of the TikTok suite of apps. Remind me how your iPhone is yours when your creations on your device can be locked away from you.


Well, I have access in Files to a lot of content from my apps - it's a decision by the app creator not to use this and to keep the created content in the locked area of the app.

For example, the apps from Omni do this, as do Obsidian, Linea…

Let’s assign the blame where it should be here.


> Let’s assign the blame where it should be here.

Obviously the blame lies with Apple for locking away your device's contents from you. Developers should not have more control over what you can access on your device than you do. Even if they make bad choices (like making the files hard to access), it should be you who has the final say, not them.

Apple making it possible for developers to make bad choices and go against users' control over their own devices is to blame.


But all other platforms also make it possible for developers to make bad choices, so I’m not sure why Apple is being singled out here?


Because Apple invented this kind of walled garden.


Try to hide the files on GNU/Linux.


That's easy: just store all the user-generated data in an encrypted file. You might be able to copy the blob, but the vast majority of users won't be able to extract the files within.
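
As a minimal sketch of that approach (illustrative only; it assumes the third-party `cryptography` package and skips real key management):

    # Illustrative only: user data lives in one opaque encrypted blob that
    # is copyable but unreadable without the key.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # the app would hide this key away
    box = Fernet(key)

    project = b'{"clips": [], "timeline": []}'   # user-generated data
    blob = box.encrypt(project)                  # what the user can copy

    with open("project.dat", "wb") as f:
        f.write(blob)

    # Without the key the blob is useless; with it, everything comes back.
    assert Fernet(key).decrypt(blob) == project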


This is much harder and can't be done accidentally, unlike on Android.


An iPhone is a very non-typical device. Apple is a non-typical company which builds lock-in to every step of the process.

If you chose to use iAnything then it's a bit late to start complaining about lock-in now.


When ‘not typical’ is actually the norm for a huge swath of users, perhaps non-typical is not the right term?


In a pile of devices, Apples are non-typical. The number of users is not terribly relevant.

However, sure, lots of users chose Apple knowing exactly what it is. Apple's not going to change since their model clearly appeals to lots of people.

If you don't like Apple's model, then don't choose Apple devices. What everyone else chooses is somewhat irrelevant to you. (Other than network effects noted earlier.)


The two things you brush over are the most important though - and feed into each other: network effects are relevant (and very much so because they affect all sorts of things you can do with something) and they are directly influenced by the number of users, which makes them incredibly relevant. What others choose are also relevant because of these network effects.

I can hack up a "device" with a Raspberry Pi Zero or whatever and call it "HaxyDeck" and claim it is all open to anyone who wants to tinker with it, but in the end it'd be irrelevant because only I (and perhaps a couple of other people) would have it. The aspects you want to ignore (number of users, being something other than Apple, what others are using) would actually affect my use of HaxyDeck directly: since I'd be the only one (or one among a tiny number) using it, I'd be the only one having to make it do the things I want, it won't have software from others, it won't support software other people may want to use for communication, and software from services that theoretically have nothing to do with phones or computers (e.g. banks) won't work because HaxyDeck's userbase is irrelevant to them, etc. All of these have to do exactly with what others are doing.

Basically, see how all the non-Android Linux phones (like the PinePhone) are faring. You can't just ignore the effect that a large user base has on a platform (be it a device, an OS, or even a service) and say "just use something different".


They’re roughly 18% of the phone market (as percent of users), but 68% of the market as a share of revenue.

They are hardly irrelevant, especially if you like money.


Non-typical compared to what? It's not any better on Android, unless you root it. Google has been going out of its way to deny users access to data stored on their phone, by allowing and encouraging apps to claim sole ownership of data, as well as removing interoperability features (around which Android OS was initially designed), all in the name of sekhurity.


That's not iOS's fault. Apps can store their files in a folder visible in the Files app, or can ask the user to open a file or folder from a file provider (also visible in the Files app), or to save a file or folder to a file provider (always visible in the Files app).

It's not the 2011 iOS anymore; if an app today hides its video projects from the user, it's entirely the app's fault.
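
For context, exposing an app's Documents folder in the Files app comes down to two standard Info.plist keys; an illustrative fragment (not from any specific app):

    <!-- With both keys set to true, the app's Documents folder becomes
         visible in the Files app (and via Finder/iTunes file sharing). -->
    <key>UIFileSharingEnabled</key>
    <true/>
    <key>LSSupportsOpeningDocumentsInPlace</key>
    <true/>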


Arguably this is still on Apple, because they don’t let you access the full filesystem as you can on other operating systems, and in particular because an app developer may rightfully want to create a class of internal-use files that are not explicitly exposed to the typical user, but would be available to users seeking them out.

I imagine, for example, that if the internal project files for a popular video editing app were accessible, we’d see competing and/or open source apps emerge that could parse them, were the original app to become suddenly unavailable. Instead they’re just lost because your phone won’t let you access them.


Well, you can access them by using something like iMazing. I agree that there should be a way to see the entire file system, even if read-only.


Blame can be shared. The OS vendor for providing a way for applications to hide files on the user's filesystem from the user, and the application for using it instead of making the user's documents available to the user. They are both working together in unison against the user.


imo it's the platform's choice to have default-visible or default-sandboxed program outputs and data.

while possible, it is fairly non-trivial for iOS apps to have read/write access to a shared folder where they can drop arbitrary files, which can then be accessed by other apps or discovered by the user. it often requires copious permission-negotiation codepaths from the developer, and a fearlessness of scary permission-warning dialogs from the end user.

even on modern (commercially popular flavors of) Android, which no longer embodies the "free software" ethos of the Linux core the OS was built around, you can't access formerly accessible application sandbox folders without installing third-party browsing tools or plugging into a desktop computer to mount the storage, and cross-application sandbox access is similar to iOS.

in the "personal computing way" mentioned by the article (even today on desktop environments, less so on MacOS) program outputs are default-visible, and developers have to go out of their way to firewall or obscure or encrypt it from being accessible by the user or other programs using OS-provided pathways.

i think this is 100% on the OS + hardware + application platform provider (with Apple as all three on iOS).


VLC has a built-in compressor as a plugin by default. Try it and see what you think.


You can buy an LG-compatible remote on Amazon for $10-20 with a pause button that will most certainly work. The codes are still being interpreted; they just removed or repurposed the physical button on the remote. My kid destroys LG remotes on a yearly basis, and I have purchased the cheap one (not the "magic wand", or whatever they call the Wii-like experience). It does what I need it to do: change the brightness, input, channel, and volume, as well as HDMI-ARC play/pause.


No, IIRC Apple Intelligence does not make such claims. It can keep context within interactions and knows about information in some of your silos. Recall has a much bolder feature set in that it wants to be aware of everything you see on your screen.

