Doesn't even have to be a surprise. Pretty much every startup employment agreement in existence gives the company ("at the board's sole discretion") the right to repurchase your shares upon termination of employment. OpenAI's PPUs are worth $0 until they become profitable. Guess which right they'll choose to exercise if you don't sign the NDA?
The Pico 4 is actually a really good headset. It's not officially released in the US market, but can be imported from EU sellers, which is how I got mine. The resolution is good enough to replace a monitor, imo. I've used it with the Immersed app and liked it. It's also really small and light compared to other headsets.
The Pimax Crystal has great resolution and virtually no glare, but it is too heavy and bulky to wear for extended periods of time.
At work, I set up and support the Meta Quest 3. It's also a good headset, but I don't feel compelled to get one at all. The visuals and comfort are on par with the Pico 4, but the Pico 4 is cheaper and not dragged down by the Meta software. Though it should be noted that I use it for PCVR almost exclusively, with Virtual Desktop.
fantastic list! although obviously no such plan can really cover the whole space, i can't help but suggest adding some programming-language representation to the list; a simple interpreter or compiler project, or alternatively some symbolic computation, e.g. algebra or differential calculus. being exposed to this stuff via Racket was hugely transformative for me (e.g. https://beautifulracket.com/stacker/why-make-languages.html), but if lisps aren't your thing there's plenty of great projects in e.g. java as well: https://craftinginterpreters.com/contents.html
Very cool! Here are some of my favourite Eastern European films, in case anyone wants recommendations.
Poland:
Rękopis znaleziony w Saragossie (1965) !!!!
Sanatorium pod klepsydrą (1973) !!!!
Ziemia obiecana (1975) !!!!
Osobisty pamiętnik grzesznika przez niego samego spisany (1986)
Pociąg (1959)
Pokolenie (1955)
Kanal (1957) !!!!
Popiół i diament (1958)
Czechoslovakia:
Holubice (1960) !!!!
Ostře sledované vlaky (1966)
Marketa Lazarova (1967) !!!!
Obrazy starého sveta (1972)
Spalovač mrtvol (1969) !!!!
Slnko v sieti (1962)
Zlaté kapradí (1963)
Údolí včel (1968)
Hungary:
Csillagosok, katonák (1967)
A Pál utcai fiúk (1968)
A tanú (1969) !!!!
Kárhozat (1988)
Két félidő a pokolban (1961)
Az ötödik pecsét (1976)
Szegénylegények (1966)
Szindbád (1971)
Szürkület (1990) !!!!
NOTE: the films with !!!! next to them are my absolute favourites and if you don't have time to watch all of these films, I implore you to at least watch these.
Nothing is secure. Once we remember that, we'll stop nitpicking improvements.
Use your own server? Great, it's secure software-wise, but if someone breaks into your house, it's all of a sudden the worst liability ever. The next thing you know, your entire identity, your photos, everything is stolen. You have excellent technical security, but perhaps the weakest physical security.
So new plan, you use a self-hosted NextCloud instance on a VPS somewhere. That's actually not much smarter than using iCloud - VPSs handle data warrants all the time. They also move your data around as they upgrade hardware, relocate servers, and so forth.
So new plan, you use iCloud E2E encryption. You have to trust that Apple does as they say, and trust that their algorithms are correctly functioning. Maybe you don't want to do that, so new plan:
You use a phone running GrapheneOS, with data stored on a VPS, with your own E2E setup. Great - except you need to trust your software, and all the dependencies it relies on. Are you sure GrapheneOS isn't a CIA plant like ArcaneOS was? Are you sure your VPN isn't a plant, like Crypto AG? And even if the VPN is legitimate, how do you know the NSA doesn't have wiretaps on data going in and out, allowing for greatly reducing the pool of suspects? Are you sure that even if the GrapheneOS developers are legitimate, the CIA hasn't stolen the signing key long ago? Apple's signing key might be buried in an HSM in Apple Park requiring a raid, but with the GrapheneOS developer being publicly known, perhaps a stealth hotel visit would do the trick.
So new plan, you build GrapheneOS yourself, from source code. Except, can you really read it all? Are you sure it is safe? After all, Linux was nearly backdoored with only two inconspicuous lines hidden deep in the kernel (the 2003 incident). So... if you read it all, and verify that it is perfect, can you trust your compiler? Your compiler could have a backdoor (remember the "login" demo from Ken Thompson's "Reflections on Trusting Trust"?), so you've got to check that too.
At this point, you realize that maybe your code, and compiler, is clean - but it's all written in C, so maybe there are memory-safety bugs (buffer overflows and the like) that haven't been detected yet, so the CIA could get in that way (kind of like with Pegasus). In which case, you might as well carefully rewrite everything in Rust and Go, just to be sure. But at that point, you realize that your GrapheneOS phone relies on Google's proprietary bootloader, which is always signed by Google and not changeable. Can you trust it?
You can't, and then you realize that the chip could have countless backdoors that no software can fix (say, with Intel ME, or even just a secret register bit), so new plan. You immediately design and build your own CPU, your own GPU, and your own silicon for your own device. Now it's your own chip, with your own software. Surely that's safe.
But then you realize there's no way, even after delidding the chip, to verify that the fabrication plant didn't tweak your design. In which case, you might need your own fabrication plant... but then you realize that there's the risk of insider attacks... and how do you even know those chip-making machines are fully safe? How do you know the CIA didn't come knocking and make a few minor changes to your design, and then use a National Security Letter to gag the factory from giving you any whiff of it?
But even if you managed to get that far, great, you've got a secure device - how do you know that you can securely talk to literally anyone else? Fake HTTPS certificates from shady vendors are a thing (TrustCor?). You've got the most secure device that is terrified to talk to anybody or anything. You might as well start your own Certificate Authority now and have everyone trust you. Except... aren't those people... in the same boat now... as yourself... And also, how do you know the NSA hasn't broken RSA and the entire encryption ecosystem with that supercomputer and those mathematicians of theirs? How do you know we aren't facing a whole new Dual_EC_DRBG, and that Curve25519 isn't rigged?
The rabbit hole will never end. This doesn't mean that we should just give up - but it does mean we shouldn't be so ready to nitpick the flaws in every step forward, as there will be no perfect solution.
Oh, did I mention your cell service provider always knows where you are, and your identity, at all times, regardless of how secure your device is?
Edit @INeedMoreRAM:
For NextCloud, from a technical perspective it's fantastic, but your data is basically always going to be vulnerable to either a technical breach of Linode, an insider threat within Linode, or a warrant served (either a real warrant, or a fraudulent warrant, which can happen).
You could E2E encrypt it with NextCloud (https://nextcloud.com/endtoend/), which would solve the Linode side of the problem, but there are limitations you need to look into. Also, if a warrant was served (most likely authentic if police physically show up, at least more likely than a fraudulent one served remotely), you could always have your home raided, recovery keys found, and data accessed that way. Of course, you could destroy the keys and rely only on your memory - but, what a thing to do to your family if you die unexpectedly. Ultimately, there's no perfect silver bullet.
Personally... it's old school, but I use encrypted Blu-rays. They take forever to burn, but they come in sizes up to 100GB (and 128GB in rare Japanese versions), they are physically stored in my home offline, and I replace them every 5 years. This is coupled with a NAS. It's not warrant-proof, but I'm not doing anything illegal - it is, however, fake-warrant-resistant and resistant to insider threats at tech companies, and I live in an area where I feel relatively safe (even though this is, certainly, not break-in-proof). Could also use encrypted tape.
> [TikTok]'s addiction-based advertising machine is probably close to the theoretical maximum of how many advertisements one can pour down somebody’s throat.
Well put. It's interesting that we pivoted in my adult lifetime from:
1. Myspace's emphasis on sharing things on your own webpage, essentially a hosted blog
2. Facebook's evolution from "hosted blog" to "friend update aggregator" to "chat client" to "friend update & ad aggregator"
3. Instagram's callback to simple update sharing (with pictures) and a chronological ad-free news feed
4. Snap's ephemeral sharing
5. Facebook's slow agglomeration and bastardization of all of the features that made Instagram and Snap distinct.
6. TikTok's addictive advertising machine that barely includes any friend connections at all.
Initially I was concerned that this would mean the death of real social media, just like the article initially suggests. But I really like the conclusion the article ultimately comes to: we basically don't have social media right now, we have advertising engines masquerading as social media. Better that Facebook, Instagram, and Snapchat show their true colors and become disgusting advertising machines just like TikTok.
If we're lucky, that means a federated, open, mostly-ad-and-suggestion-free open source social media experience can fill the power vacuum for intimate, interpersonal, high-latency communication over the internet. microblog seems promising, but I think even mastodon could provide the experience I'm looking for.
No matter how many times you say that, it still won't be true. A mere few minutes spent looking at the Matrix protocol and a few minutes spent looking at XMPP (if you can get past the fact that it's XML) will make that clear. Matrix is basically a case study in how not to design a federated chat protocol. It's bloated, it ignores all prior art and reinvents the wheel everywhere it can, and its excessive complexity means there is unlikely to ever be a wide ecosystem of good-quality implementations (especially of servers). Just because they made a foundation and threw around the word "standard" a lot doesn't mean it deserves to be the standard for chat protocols.
> Every policy or process doc I write now has a section called “Reasons to Revisit.” It is essentially a reverse success criteria. Rather than a short list of things I would expect to see if the policy was successful
Wow, what a great piece of advice. It seems so obvious and simple in retrospect and yet I never thought about something like this.
One of the things I miss about working at Facebook is the internal FB (called Workplace). It basically replaced email for the whole company, and worked really well for long-form posts where you might write up a proposal or make an announcement and get a bunch of comments. I now work at a Slack-centric company, and it just doesn't compare - there is no 'newsfeed', so unless you remember to check all the channels, you end up missing stuff. You can send out a group email, but people are reluctant to reply-all, so this doesn't get good discussion either.
We sell a desktop app with an involvement in dev and support of about 0.25-0.50 FTE, with revenues in the range of $50K/month, although it was launched 9 years ago and the first year was only about $2K/month. The server side is just one 4GB Windows server for user signups, billing and license validation. One good thing about desktop apps is that the server side is so cheap; you are basically selling IP.
It has these features:
* B2B in a niche market (TAM < 100K-200K users)
* Some viral component so you do not have to spend money on ads for growth.
* Sold as a subscription and only as a subscription. Don't innovate with licensing; focus on product - this is important. When users have fewer buying options, they decide faster. That's why Steve Jobs reduced 50 Mac models to just 3.
* When the subscription ends, the application must stop working. This is also very important. You want your entire user base to be able to install the last version. You do not want to support older versions, you only want to support one.
* Has to have a very generous trial so that users have time to find use cases with your product. Better a trial based on actual usage than an expiring trial based on calendar days. You want your users to actually use your product and depend on it.
Restricting background apps by default is probably good. But Google didn't have to lock users in to their own service to provide a good push notification system on Android.
The correct solution here would have been a system-level, open source, provider-agnostic push notification API built into AOSP. The Web Push API is a great example of how a push API can be provider-agnostic: https://developer.mozilla.org/en-US/docs/Web/API/Push_API. It makes sense that push notifications should be consolidated to one provider -- this way, your device only needs to maintain one 24/7 TCP connection to a server which it receives all notifications from all apps through, rather than having each app run its own notification service in the background, killing your data and battery life.
Google's push service works by providing an HTTP endpoint to apps that they can use on the backend to deliver notifications. The Web Push API also works this way, but the returned endpoint can be for any provider. If you use Chrome, the domain will be `google.com`, but on Firefox, it's `mozilla.org`. But each provider's endpoint uses the same standardized API, so the backend doesn't have to care what the URL is. And this isn't a security risk, because everything sent through the service is encrypted.
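To make the provider-agnostic idea concrete, here's a minimal Python sketch of what a backend's delivery code can look like under this model. The endpoint URLs and the helper function are illustrative (not any real library's API), and a real Web Push sender would also add VAPID authorization headers and encrypt the payload per the Web Push RFCs; the point is only that the backend builds the exact same request regardless of which provider issued the endpoint.

```python
# Sketch: a provider-agnostic push sender. The subscription's endpoint URL is
# the ONLY thing that varies between providers; the request shape is identical.

def build_push_request(endpoint: str, encrypted_payload: bytes, ttl: int = 60) -> dict:
    """Describe the HTTP request a backend would POST to a push endpoint.
    Real Web Push also requires VAPID auth and RFC 8291 payload encryption,
    omitted here for brevity."""
    return {
        "method": "POST",
        "url": endpoint,  # whatever URL the subscription handed us
        "headers": {
            "TTL": str(ttl),                     # how long the provider may queue it
            "Content-Encoding": "aes128gcm",     # payload is already encrypted
        },
        "body": encrypted_payload,
    }

# The same code path serves a Chrome (Google) endpoint and a Firefox (Mozilla)
# endpoint; only the URL differs, so the backend never special-cases a provider.
chrome_req = build_push_request("https://fcm.googleapis.com/example-endpoint", b"...")
firefox_req = build_push_request("https://updates.push.services.mozilla.com/example-endpoint", b"...")
assert chrome_req["headers"] == firefox_req["headers"]
```

An AOSP-level equivalent would work the same way: apps hand their backend an opaque endpoint, and the OS maintains the single long-lived connection to whichever provider the user chose.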
This could have been done for Android, but that would have given Google much less control over the platform, so they decided to do it in a monopolistic way. This is one of the many ways Google aims to maintain their monopolistic control over "open source" Android. Another example is SafetyNet hardware-backed attestation.
Projects like GrapheneOS are really interesting because they are finally providing a real secure, private, de-Googled OS option, with excellent app compatibility thanks to sandboxed Play Services (which allows you to run Play Services without giving it root access to your device [1]). But will we ever be able to fully decouple Google from Android? I'm not sure. I expect most users of custom ROMs to continue installing Play Services on top of them for a long time, if they want something as basic as push notifications to work.
By the way, while I personally have qualms with Google, I believe users should have the choice to use Google apps, if they are comfortable with the inherent privacy risks. This belief is shared by the GrapheneOS developers. [2] My problem with Android is that it was not built to function without Play Services. Community projects that enable it to do so, like microG, will always be a cat-and-mouse game with Google. I think sandboxed Play Services is the sanest approach to the problem of Play dependencies that the community has come up with so far, but if we want real change to happen, we need to target app developers. Each developer has the choice of whether or not to include Google within their app. We just need to convince them that they don't need Google.
Google owns `developer.android.com` and points developers towards using Play Services APIs wherever possible. Perhaps the community should create their own open-source alternatives to the most commonly used APIs (e.g. push notifications) and developer documentation that provides instructions on how to switch from Google (ideally making the transition as easy as possible by emulating the Google API syntax). It would probably be possible to offer an installable platform-agnostic push notification API that developers can use. Providers could be installed as separate apps. Google could be one such provider. Perhaps we could implement a Web Push compatible API, and reverse engineer Chrome's Firebase integration to implement it as an option from the get-go, hoping other providers show up over time. We could perhaps allow users to self-host their push service as well - for example, Gotify, an open source self-hosted push server [3], could be modified to accept push notifications in the Web Push format.
In hacker circles there seem to be two groups of people: those who think Android is a lost cause because it will always be controlled by Google, and that Linux phones are the only real alternative, and those who believe we can actually "steal" Android back from Google and make it into a true open source project. I fall into the latter camp, but I think a lot more work still needs to be done if we want to achieve this.
> We aren't against users using Google services but it doesn't belong integrated into the OS in an invasive way. GrapheneOS won't take the shortcut of simply bundling a very incomplete and poorly secured third party reimplementation of Google services into the OS. That wouldn't ever be something users could rely upon. It will also always be chasing a moving target while offering poorer security than the real thing if the focus is on simply getting things working without great care for doing it robustly and securely.
I'm not sure that the problem is missing manpower on the implementation side - it's more that it's hard to know as a developer which current blend of XEPs is the recommended combination and which might have the most chance of working between a given client & server (and server & client) combo.
Things like the XMPP compliance suite XEPs have helped a bit with this, but looking at the XEP list and trying to work out which XEPs you should be using on a given day is still daunting - as well as trying to track which clients are most likely to be supporting them.
The idea on Matrix is that you say "Hi, I talk Matrix CS API 0.4" and be done with it - and you end up with much more social pressure to keep up to date with the current latest spec, because otherwise you are simply falling behind (rather than happening to choose not to implement some XEPs).
It boils down to a question of governance & social dynamics rather than anything related to writing code (magically or otherwise).
Have you ever heard of "The Narcissist's Prayer"? It goes like this:
That didn't happen.
And if it did, it wasn't that bad.
And if it was, that's not a big deal.
And if it is, that's not my fault.
And if it was, I didn't mean it.
And if I did...
You deserved it.
Tether defenders are really working their way through the steps here.
18 months ago, it was "That didn't happen." (Tether is 100% backed by USD cash.)
6 months ago, it "wasn't that bad." (It might not be 100% USD cash, but it's cash-equivalent assets like short-term commercial paper.)
Now that there's strong evidence the commercial paper is just fake money shuffling between Tether/Bitfinex/other shady crypto investments, we get "that's not a big deal." (Look at the way banks work! They only need 4% collateral! Tether's probably got at least that much...)
Next step is finding out that their actual liquidity isn't capable of holding up under a real-life stress test, and the defenders will be talking about "not my fault." (This was a once-in-a-lifetime crash, they couldn't have foreseen it, crypto's still way better than the fiat banking system!)
When thousands of people lose their retirements in a gigantic defi crash, it'll be "you deserved it." (Everyone knows crypto is risky, you shouldn't have believed Tether was the same as USD.)
Hello,
While I'm employed to develop AGPL software, and I'm fond of this license, it's clear that with the wrong actors it can be a threat to some businesses.
I'll tell you a little story that happened around 10 years ago:
I got a call from a representative of Oracle. He asked me if we were using MySQL, and if I could describe how, because he wanted to help us make better use of this tool.
We were pretty happy with MySQL at the time, and I went into deep detail about how we used it.
At the end, he told me point blank that PHP's MySQL driver was licensed under the GPL and that we had to license our whole codebase under the GPL, since it was contaminating our code as a whole.
(Even if we had encapsulated all the accesses to the driver around a single class)
The alternative was to pay for the right to use it under a non-GPL-contaminating license.
Oracle then called and threatened us many times.
The argument that made them stop was when we told them that we were hosting the applications ourselves. This argument would not have been sufficient with the AGPL.
Their claim was unfounded, but I can assure you that I didn't sleep well for a while!
Unless something has recently changed, XMPP also failed on the server side for normal users:
- Finding a practical provider is nearly impossible, because all of the major ones closed their public XMPP service years ago, and most people don't know what qualities to look for in a small provider, let alone understand why those qualities matter.
- Even if they get help from a friend and manage to find a reliable server that supports the magic combination of XEPs that are needed to make things work, chances are that server is no longer accepting new users.
- Even if they manage to get an account on a featureful[1], trustworthy, stable, free[2] server, many don't last more than a few years, so there is a high chance that the user's new online identity will evaporate before long.
[1](Required features include end-to-end encrypted group chats with offline delivery, among other things.)
[2](Yes, free. That is critically important for mass adoption today.)
I'm not a normal user, in the sense that I have far more technical knowledge than most people who use messaging services. I actively used XMPP for at least a decade; probably closer to two. I tried last year to piece together an XMPP-based solution that my friends and family could use, but even I was unable to solve all the problems I found in a way that would work for normal users.
And then there's the client side, which is also problematic, as soupbowl said.
This has been the state of things for quite a few years. The only people I still see pushing for XMPP are people who have major blind spots. Some of them have experience only with small groups, and have not considered how a whole world of people conversing presents different challenges. Others (usually tech folks) have had a good XMPP provider and comfortable client software for so long that they have forgotten how much knowledge it takes to get set up in the first place. A few are people who run small for-profit XMPP services or develop XMPP software, aka conflicts of interest.
Dear Jabber, I'm sorry. I liked you. I had high hopes for you. I might even still consider you for niche purposes. But as a general purpose messaging network for regular people, you have failed. It would take a lot of effort and resources to make you viable, and I don't see any sign of that coming. The closest you came was when Google and Facebook embraced you, but those days are long gone.
Meanwhile, Matrix is succeeding. The rough spots are being fixed. The bloated clients are getting competition. The missing features are appearing. Even as a work in progress, it is already usable for some of us, and it's consistently moving in the right direction. Importantly, I can tell a regular person where to get it and (often with no help) they can be chatting with me in a few minutes.
I'd extend this even further: never use a single account for more than one purpose. Create a separate account at the same company for the other purpose. Some examples:
- Every product/project you manage should be on a separate account on Google, Facebook, Amazon, etc.
- Nothing tied to your consumer accounts should be used for anything business related.
- Your Amazon shopping account and your AWS account should definitely not be the same account.
- Don't use services where you're required to mix accounts like this.
Some examples:
- If the issue in this thread is actually an automatic ban, account siloing very likely would have avoided the issue. ([edit] likely, it would not have been possible to silo accounts in this case, as a child comment points out)
- Facebook apparently lacks the willingness to block an account from Marketplace, but they will block an account from all Facebook properties.
- PayPal has banned businesses because the account was created years ago by someone who was 17 at the time.
The point of this isn't that it's impossible to connect the dots that you are the same person. The point is to make it difficult for an automated system to deal too much damage to you, and to make it difficult for someone looking at a single account's history to accumulate too many "strikes" against that account.
Reposted here since it seems the page linked is no longer visible (at least for me).
------------
Apple’s great GPL purge
Posted on 5 February 2012 by meta
Apple obligingly allows you to browse and download the open source software they use in OS X. Since they have listings for each version of OS X, I decided to take a look at how much software they were using that was only available under the GNU General Public License. The results are illuminating:
10.5: 47 GPL-licensed packages.
10.6: 44 GPL-licensed packages.
10.7: 29 GPL-licensed packages.
This clearly supports the idea that Apple is aggressively trying to remove all GPL-licensed software from OS X. While the removal of Samba and GCC got some attention, the numbers show that there’s a more general purging going on.
The 29 remaining GPL-licensed packages aren’t too healthy either. Lion apparently ships with bash 3.2. That’s from 2006. The current version is 4.2.10. Why no upgrade? Because Apple’s shipping the last version of bash that was under the GPL version 2.
The message is pretty obvious: Apple won’t ship anything that’s licensed under GPL v3 on OS X. Now, why is that?
There are two big changes in GPL v3. The first is that it explicitly prohibits patent lawsuits against people for actually using the GPL-licensed software you ship. The second is that it carefully prevents TiVoization, locking down hardware so that people can’t actually run the software they want.
So, which of those things are they planning for OS X, eh?
I’m also intrigued to see how far they are prepared to go with this. They already annoyed and inconvenienced a lot of people with the Samba and GCC removal. Having wooed so many developers to the Mac in the last decade, are they really prepared to throw away all that goodwill by shipping obsolete tools and making it a pain in the ass to upgrade them?
There's a quote about this rule (get started and the job will be easier to complete):
"What you can do, or dream you can, begin it,
Boldness has genius, power, and magic in it.
Only engage, and then the mind grows heated,
Begin it, and the work will be completed!"
- Goethe