Hacker News | amethyst's comments

*parts of the Bay Area. I'd say the majority of areas are still monopolized by Comcast, including my neighborhood of course.


If you haven't already, check to see if either Google Fiber or Monkeybrains is available in your area. Last I checked, the regs are still in place that prevent landlords from denying you access to an ISP of your choice.


Google Fiber isn't rolled out to most of the bay area. Monkeybrains is wireless with speeds significantly slower than what Comcast offers me. I've checked just about every wired ISP possible, and Comcast is the only option that services my neighborhood.

And FWIW, I own my house in the east bay — I am the landlady ;)


> Google Fiber isn't rolled out to most of the bay area.

I'm aware. And it's my understanding that -sadly- much of the Google "Fiber" deployment in the area is a WISP, just like Monkeybrains is. Quite a while back, Google Fiber bought Webpass and continued doing WISP deployment in the SFBA under the Google Fiber brand. (Because it's politically dreadfully hard to run fiber optic cables in the area.)

If you haven't contacted Monkeybrains for a minimum and expected speed quote at your site in a year or five, it's worth doing it again. It's my understanding that they aperiodically upgrade the hardware in their core network as well as the sort of hardware that they deploy at customer sites.

Monkeybrains' down-to 100/100 service is -on paper- far, far slower than the up-to 1400/40 service I was getting from Comcast, but the actual, delivered speed that I'm seeing from Monkeybrains varies between 300mbit and ~1000mbit (sustained) depending on what other folks are doing on their network. [0] I'm in a fifty-apartment building, so it's possible that they've installed faster gear on my roof than they install in smaller (or single-family) buildings. Reports on the Internet seem to be somewhat mixed, with some single-family buildings reporting ~1gbit service, and others reporting ~45mbit.

[0] Typical prime-time speed is something like 400mbit. Off-hours speed is frequently very close to 1gbit. The only time I've seen the minimum speed was when I had a poorly-crimped Ethernet cable between my router and the rest of my LAN that would intermittently only link up at 100mbit.


Seconding monkeybrains - they can usually get to houses that the other ISPs can't/won't service, and speed is pretty spiffy


In markets where Comcast has actual real competition, they "include" the unlimited data (aka no cap) with no extra charge when you sign up for their gigabit plans.


You pay for their highest-tier, highest-bandwidth plan and they have the audacity to impose a cap, making that bandwidth work against you? Crazy. My household internet usage is quite modest, nothing anyone on HN would call data intensive—casual video streaming being the lion's share—yet I blow through 1TB every month without fail.


And yet it's the default when formatting a device on macOS.


Being afraid to not use the default is evidence of not being a power user!


Default or not, are there sensible alternatives on a Mac? I'm not sure if I'd consider OpenZFS on Mac "sensible" - but I haven't owned a Mac in decades, so... what are the alternatives to APFS?


Occasionally newbies mistake it for that, but oftentimes it is the voice of experience and battle scars driving sensible decisions.


And yet, as someone working on core language infra, we apply exactly that sort of ideal when making changes. If a diff doesn't break any tests, then it's "safe" to land, and if something does indeed break afterwards, then it's the broken team's responsibility to fix forward or otherwise provide proof that it's a big enough problem to roll back. If we end up in SEV review for a change, and there were no broken tests on the diff, then there are going to be some hard questions for the team that didn't write tests.

Ie, tests aren't mandatory, but if you aren't writing tests, it's your responsibility when someone else's change breaks your project.


Tests are hard for UI components. Even when the web page has all the expected elements, the appearance may be broken. At least for UI projects, your approach will fail.


I still miss my desktop sheep every once in a while: https://github.com/Adrianotiger/desktopPet?tab=readme-ov-fil...

Edit: the best part was running it a couple dozen times to get an entire flock walking, falling, and rolling all over your desktop, and watching everything grind to a halt under CPU strain!


I added this to my website a while ago. You can open a terminal and summon as many as your computer can handle with something like `sheep 100`.

https://dustinbrett.com/


The right click menu on the site is quite the trick. It took me way too long to figure out it wasn't the native Firefox menu.


(Also, among other things, all the posts are "editable" texts within a "text editor" in the simulated desktop. The whole site is bonkers ...)


very cool website! I did sheep 4000 and my pc immediately exploded


quite cool. how did you make that website ?



this is so sick


Aww, memories. One of my old colleagues would mess with my computer and added a bunch of these. I left them there, much to his chagrin. I got my revenge one night when he was in the office late, near my desk: the office was completely dark, the sheep baa'd, and it scared the crap out of him.


Oh wow! Not sure if it was this exact program, but I remember some similar sheep roaming my desktop when I was young. It had the ability to draw pictures in MS Paint, and would often do so when you were working on something...


I was a big fan of VirtuaGirl myself. (NSFW if you Google about it).


Incredibly enough, the "VirtualGirl" executable is archived at the Internet Archive:

- https://archive.org/details/virtuagirl265

It has been found worthy of preservation.


I had to admit it was also my first thought. The sprite was very very well done, actually ...


You’re getting downvoted but it really was pretty fantastic.


Also: having Ayanami Rei from Neon Genesis Evangelion sitting on your windows.


It looks like this repo is a rewrite of an earlier "scmpoo.exe" that roamed the internet in the mid-1990s. That was fun to set up on school computers to automatically launch at random times.


That's the one I had. I got it from my friend in sixth grade, and who knows where he got it from!


Haha thank you! This was my first thought too!


Wow, thank you! That brings back unexpected memories of playing on my dad's first laptop.


You might prefer Eternal Terminal as a TCP mosh alternative:

https://eternalterminal.dev/


My 2017 Golf R would intentionally turn off the dashboard backlights at night if the lights weren't on, providing me excellent feedback that they weren't on. Either headlights should default to auto, or more cars need proper feedback to the driver when they aren't on.


Justice runs on a log scale, and the base is left for individual jurisdictions or judges to decide.


Or it's a simulation and someone keeps pushing changes to production.


Which would also count as new physics.


With even more literal meaning of new.


Someone keeps running gparted on our partition


All the expert software engineers agree this is the most likely explanation. Have physicists looked into this?


> All the expert software engineers agree this is the most likely explanation.

That's quite a strong claim. I'm skeptical. Sources?

> Have physicists looked into this?

They shelved it right next to "God Made The Universe" in the "Unfalsifiable Propositions" section, under the title "Grad Students Made The Universe."


I'm reading their comment as a joke about how software engineers tend to overestimate their own expertise on things like physics and are not actually anywhere close to experts.

Software engineers presenting weird pseudo science as serious physics is one way this manifests.

I could be wrong.


I wonder if their introspection is good enough to have our population on a Grafana dashboard somewhere


Somewhere aliens are making fun of how shoddy our simulation is coded.


It's definitely a simulation at this point


> Python makes it incredibly tough.

I disagree, Python makes it incredibly easy to work with threads in many different ways. It just doesn't make threads faster.
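To illustrate the "easy to work with, just not faster" point: spinning up threads in Python takes very little code, but for CPU-bound work the GIL serializes execution, so adding threads doesn't add speed. A minimal sketch (names here are illustrative, not from the thread):

```python
# Starting threads in Python is easy; the GIL just means CPU-bound
# threads take turns rather than running in parallel.
import threading

counter_lock = threading.Lock()
total = 0

def count(n):
    global total
    s = sum(range(n))        # CPU-bound work; holds the GIL while computing
    with counter_lock:       # lock protects the shared accumulator
        total += s

threads = [threading.Thread(target=count, args=(100_000,)) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(total)  # equals 8 * sum(range(100_000))
```

Each thread is trivial to create and join, but wall-clock time here is roughly the same as running the eight sums sequentially.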


In what way? Threading, asyncio, tasks, event loops, multiprocessing, etc. are all complicated and interact poorly if at all. In other languages, these are effectively the same thing, lighter weight, and actually use multicore.

If I launch 50 threads with runaway while loops in Python, it takes minutes to launch and barely works after. I can run hundreds of thousands and even millions of runaway processes in Elixir/Erlang that launch very fast, and processes keep chugging along just fine.


> If I launch 50 threads with runaway while loops in Python, it takes minutes to launch and barely works after. I can run hundreds of thousands and even millions of runaway processes in Elixir/Erlang that launch very fast, and processes keep chugging along just fine.

I'm not sure that argument helps your position on threading. I once saw a java program spin off 3000 threads doing god knows what. Debugging the fucking thing was impossible.


The point there is that processes in Elixir and Erlang are effectively like functions, in that you do not need to "manage" them in any sort of way. They are automatically distributed across all cores, pre-emptively scheduled, killable, have a built-in inbox, etc. One doesn't need to worry about what concurrency library to use nor manually create mailboxes using queues or whatever else. It just works, and you fire them off to do whatever you need. So there is no ceremony. Threads in many other languages and in Python in particular, require a huge amount of ceremony and management.
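For contrast, here's roughly what that "ceremony" looks like in Python: an actor-style worker needs an explicit `Queue` as its mailbox, a receive loop, and a shutdown convention, all hand-rolled (this is an illustrative sketch, not anyone's production code):

```python
# Hand-rolled actor in Python: explicit mailbox, explicit loop,
# explicit sentinel for shutdown -- all things Erlang gives you for free.
import queue
import threading

def worker(mailbox, results):
    while True:
        msg = mailbox.get()
        if msg is None:          # sentinel: stop the worker
            break
        results.append(msg.upper())

mailbox = queue.Queue()
results = []
t = threading.Thread(target=worker, args=(mailbox, results))
t.start()

for word in ["hello", "world"]:
    mailbox.put(word)
mailbox.put(None)                # request shutdown
t.join()
print(results)  # ['HELLO', 'WORLD']
```

In Erlang/Elixir the mailbox, scheduling, and kill semantics are built into the process primitive itself; here every piece is the programmer's job.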


> require a huge amount of ceremony and management

I think Java made it quite easy to spin off threads, and again, it doesn't help the argument. It just made the f'ing thing worse. Race conditions are still f'ing hard to solve. Particularly when a shared-mutable-state exists outside of the program.


The whole purpose of threads is to improve overall speed of execution. Unless you're working with a very small number of threads (single digits), that's a very hard to achieve goal in Python. I wouldn't count this as easy to use. It's easy to program, yes, but not easy to get working with reasonably acceptable performance.


And the python people would just point to multiprocessing...which works pretty well.
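As a sketch of that pointer (function names are illustrative): `multiprocessing` sidesteps the GIL by running workers in separate interpreter processes, each with its own GIL, so CPU-bound work can actually use multiple cores:

```python
# multiprocessing runs workers in separate processes, so CPU-bound
# tasks can execute on multiple cores in parallel.
from multiprocessing import Pool

def cpu_bound(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(cpu_bound, [10_000] * 4)
    print(results[0] == sum(i * i for i in range(10_000)))  # True
```

The trade-off, as the replies note, is that arguments and results must be picklable and cross a process boundary, which is where the extra challenges (and yet another queue implementation) come in.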


Which has its own set of challenges and yet another implementation of queue.


Yes, but the shared-mutable-state issue goes away.

