> Technically Gentoo is also in the running, but can you imagine trying to compile all your packages from scratch on a system that benchmarks worse than a raspberry pi 3?
Uh, I actually did this. It wasn't so bad honestly, it just took about a day to rebuild everything.
Honestly the Sony VAIO that I had was _awesome_ in some regards: the high-resolution display was extremely crisp! It fit comfortably in my inside jacket pocket, and the battery didn't suck.
The only issue I had honestly was the proprietary connector needed for ethernet (though this was more annoying in 2012 when I was doing this; these days laptops don't seem to have ethernet at all); the only other issue was that the GPU was extremely slow with Linux.
It was probably extremely slow in Windows too, but Vista (which was installed on the thing) was far, far too heavy to tell why it was slow at all.
The nearest laptop I've found that is superior to the Sony VAIO P-Series in all areas (aside from being a bit taller) is the GPD P2 Max, which is basically perfect... if only it had a passively cooled ARM CPU.
Compiling your own software is a really humbling experience. When it takes way more time to compile a browser than a full-fledged OS, or you find out that seemingly simple programs need to pull in a mind-boggling number of dependencies, you really start to question the state of the software world.
I think the main reason browsers are so extremely slow to compile is the heavy templating.
But, I agree, I can compile my entire OS including user-space software and desktop environments in about the same time it takes to compile Chrome.
Which is scary.
But then again, people want it to do everything (WebUSB, WebGL, etc., etc.). So it stands to reason that it's inherently complicated and difficult to compile.
I wonder if the high iteration time hampers development...
> I returned to Microsoft as a Program Manager on the Edge team in mid-2018, unaware that replatforming atop Chromium was even a possibility until the day before I started. Just before I began, a lead sent me a 27 page PDF file containing the Edge-on-Chromium proposal. “What do you think?” he asked. I had a lot of thoughts (most of the form “OMG, yes!“) but one thing I told everyone who would listen is that we would never be able to keep up without having a cloud-compilation system akin to Goma.
Maybe you already know, but in case not or someone else needs this:
try with --no-install-recommends, it skips a lot of bs.
I don't recall exactly what it was, but I remember installing something like a tiny library and it wanted to also install mysql-server or something like that >_<
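For example (the package name here is just a placeholder), either per install or as a permanent default:

    # skip recommended-but-not-required packages for one install
    sudo apt-get install --no-install-recommends some-package

    # or make it the default for all future installs
    echo 'APT::Install-Recommends "false";' | sudo tee /etc/apt/apt.conf.d/99norecommends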
Why is that not good advice for a novice too? I do this by reflex every time I install Debian or Ubuntu, and in my experience it did not create a situation that needed "expertise in apt".
Gentoo was fun, too bad I don’t have time for it anymore. I used to go for nice walks when Firefox was compiling. Great opportunity to go outside and take a break.
USE flags in Gentoo also allow for a much more configurable system.
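For example, a rough sketch of what that looks like in /etc/portage/make.conf (the flags here are purely illustrative; pick your own):

    # /etc/portage/make.conf
    # global USE flags: build X and ALSA support, skip GNOME/KDE/bluetooth bits
    USE="X alsa -gnome -kde -bluetooth"
    MAKEOPTS="-j2"   # parallel build jobs, sized to the machine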
I use a GPD Micro PC throttled to 6 watts TDP, which means the fan can stay off permanently. It fits in a jeans back pocket, and has an ethernet port. And a serial port. And a full size HDMI port. And three full size USB ports, and a USB-C port.
I also did this (around ~2008); a friend of mine and I built near-identical Atom boxes with first-gen (Diamondville) 64-bit Atoms on Intel motherboards running 865 chipsets IIRC. The GPU/chipset was louder than the CPU because the CPU was completely passive. I did emerge Xorg on that... it took I think a day and a half(ish), even after optimizing the heck out of the compile options with march=native... it was slow as heck. But it lasted me for years as a little project box until I replaced it with a 4th gen i5.
You really do start to ask yourself if you need a package if compiling it will take a day or two. Hence OpenOffice never got installed.
Save for the processor being better than any VAIO's, I disagree. I find all of these to be absurdly unreliable (crappy firmware) and very cheap hardware for the price, not comparable at all to the typing experience on the P-Series. And the "trackpoint substitute" is a disaster, resembling a "tiny touchpad" more than a trackpoint.
Hrm, interesting... I disagree with your opinion about the hardware quality; it feels sturdy and the keys travel quite nicely. The screen is fantastic in color reproduction (for my needs), has high resolution and gets bright enough.
There’s no trackpoint/nipple and I hadn’t considered that a problem, as I’m weird and spent a lot of time getting used to only using the keyboard some years ago - so an oversight on my end, and you’re completely right, the touchpad sucks.
The firmware is extremely bare bones, but I wouldn’t say it sucks since I don’t have any reason to believe it’s bad. (Nor good, it just works for me.)
I think you are mistaking the GPD Pocket 2 Max for the non-Max version. The Max version has a real touchpad, while the GPD Pocket 2 has that tiny thing on the upper right.
I have the GPD Pocket 2 and while the trackpoint thing is not the most precise, you get used to it, and I have no complaints regarding the hardware. It just works.
At one point in college, I was using an old Thinkpad x41 tablet and wanted to mess around with gnuradio. I wanted it on my tablet laptop since I had that on me most hours of the day. Compiling gnuradio took several hours. I was running arch so I wasn't unfamiliar with compile times for things I grabbed from the AUR, but it was atrocious. I started it in my first class of the day and would just throw my laptop in my bag while it was still compiling and walk quickly to my next class so I could grab a power outlet before the Pentium M sucked up all the battery.
Glad you enjoy your life at 800 MHz! I appreciated your article although the plural form to address a single person (not the editorial "we") makes me uneasy for political considerations.
So many more things could be easily enjoyable on such hardware if the software ecosystem allowed it. I'm also curious what hardware modularity like Framework's could have achieved two decades ago: if you could easily plug in a chip to decode/encode video quickly, this computer could probably play any kind of video.
> We have no idea what crates.io thinks it makes sense to require javascript to look up packages but here we are.
Apparently, without a specific Accept header, crates.io thinks i want a JSON response for a crate lookup, not the homepage. Now i don't even remember why i was requesting this URL to start with (not in a script), but i don't understand the logic of that, and the maintainers in the chatrooms seemed to consider it not a bug.
I'm also very curious about the antiX "proudly anti-fascist" distro, but the fact that they're two debian releases late (still on stretch) does not exactly attract me.
I don't want to speculate about Artemis specifically, but first-person plural pronouns to refer to oneself typically aren't a "royal we" or anything like that; it's just what helps some folks feel comfortable, especially those who have DID or who label themselves as plural. See https://www.reddit.com/r/plural/wiki/index (keywords: "plurality," "multiplicity," ...)
I'm dating someone who refers to themself in the first person plural; it becomes perfectly natural pretty quick :)
I never knew this was a thing. I'm not on board with promoting the use of "we" as a replacement for first person singular as being an acceptable societal norm, unless you're the Queen.
Sorry, but it is too close to contributing to mental health, or personality, disorders for me.
> I never knew this was a thing. I'm not on board with promoting the use of "we" as a replacement for first person singular as being an acceptable societal norm, unless you're the Queen.
> Sorry, but it is too close to contributing to mental health, or personality, disorders for me.
Wait until you find out about languages like Hindi where the plural form can be used for respect even when referring to an individual :)
I hadn't realized that referring to an individual in the plural was even a point of contention until comments on this thread pointed it out (likely because I'm used to it from Hindi). Don't forget, the author may be bi/multilingual.
> Wait until you find out about languages like Hindi where the plural form can be used for respect even when referring to an individual :)
In fact it’s pretty common amongst a lot of languages. Most Latin-derived languages use the plural to show respect. But of course, never to talk about yourself. You’ll use the pluralized form when talking to strangers or to people who are above you hierarchically (but this usage tends to disappear in a lot of companies).
As a French person, reading someone speak about themselves as "we" is shocking, not because it looks like there are multiple people involved (though it also does) but because it looks like the person is trying to be "above" you hierarchically. Of course i know it isn’t what’s intended, but language interpretation is an automatic mechanism.
> As a French person, reading someone speak about themselves as "we" is shocking, not because it looks like there are multiple people involved (though it also does) but because it looks like the person is trying to be "above" you hierarchically. Of course i know it isn’t what’s intended, but language interpretation is an automatic mechanism.
That's interesting, culturally. In India, it's very common for people from North India, for example, to be much more "pride-based", where individual identity is important and people often use the plural for themselves, while in the more southern states there's less emphasis and the singular is much more common. A lot is dependent on culture.
> Wait until you find out about languages like Hindi where the plural form can be used for respect even when referring to an individual :)
I speak German, which uses the plural sie / polite Sie.
It's completely different to "we" being used by an individual to refer to themselves in English.
The fact that a language construct might exist in other languages is irrelevant. Calling a girl "it" in English would similarly be bizarre, although that is the grammar in German.
Something cannot be a disorder unless it causes harm. Things that are not disorders and are out of the ordinary can be considered adaptations and can be advantageous.
Harm to whom? Many things can cause harm to oneself (socially, at least) without harming anyone else. Being odd about your pronouns is one of those things.
Pedantically, there is no such thing as "correct" or "incorrect" English as there is no standards body that dictates such rules. You can verb any word you want and people will understand what you mean.
You don't have to read the article if it doesn't pass muster with you. Why bother caring?
Who's to say it's "incorrect English?" You clearly understood what was said. This person's identity seems like the least interesting and most irrelevant part of this discussion.
After finishing the article, my main take-away was how impressive it is that such a quirky tech setup could work for both of them. I was comparing it to my relationship and how difficult it is to share any item/space which is also customized to either of our preferences. It gave me hope.
I grew up with the ~10 MHz 8086 PC, and I was on bulletin boards and the Internet around the 486 era, still stuck in the "tens of megahertz" era. Even wireframe 3D rendering at 640x480 was glacially slow. CAD applications on a CPU without a floating point unit were just unbelievably painful.
800 MHz and a solid state disk is luxurious if you're not wasteful with it. As the article's author points out, this is "not up to you" ("we"/"us") any more, other people get to decide how much JavaScript to shovel on top of web applications.
It seems ambiguous to me, I was honestly trying to figure out if there was more than one person using the author's laptop, or if it was a multi-author article or something.
Not that English isn't chock-full of ambiguity - I just haven't managed to identify a benefit over using the more commonly accepted "I" here.
I interpreted this as the "editorial we" or perhaps the "author's we":
> The editorial we is a similar phenomenon, in which an editorial columnist in a newspaper or a similar commentator in another medium refers to themselves as we when giving their opinion. Here, the writer casts themselves in the role of spokesperson: either for the media institution who employs them, or on behalf of the party or body of citizens who agree with the commentary. The reference is not explicit, but is generally consistent with first-person plural.
I find that sad. Do you also hold that view when interacting with people different in other ways? People that dress differently, hold different political or religious views, people from different places than you?
I have acquaintances who wax lyrical on the topic of LGBTQ+ folks, feeling that somehow, non-LGBTQ+ folk are being "oppressed". I keep asking them, "What are they taking from you? What could you do before that you're not allowed to do now?". They typically don't respond or change the topic. We both know that the types of behaviours that are no longer "acceptable" lie on a spectrum that starts with "being casually disrespectful" and ends with outright *ism. It's sad to me that a lot of people value the rights of some to be jerks, over the rights of others to partake equally in society and to feel equally safe and valued in public spaces.
Oh I’m very much supportive of living your life as you see fit. But it’s the pronouners who insist on you using they/them (grammatically silly) or who insist on sharing their pronouns when it’s obvious what they are, then attempting to guilt you into sharing yours even though they know what yours are. This behavior tends to bleed into other annoying personality traits as well.
And yes I do tend to avoid other annoying types of people too, across the spectrum.
> or who insist on sharing their pronouns when it’s obvious what they are, then attempting to guilt you into sharing yours even though they know what yours are.
Don’t worry, you’re already using singular “they” without even realizing!
This argument is always brought up - "it was used centuries ago, so it's still valid now!"
Except: all normal human beings trip up on it; those who believe in the they/them BS trip up on it; and almost no one who uses this argument supports other things that were done centuries ago, so it's not really arguing in good faith.
Arranged marriages at very young ages were a thing centuries ago, should we bring those back?
If you want to use custom pronouns, nobody is stopping you. The rest of the world is just annoyed and tired of hearing you desperately announce it every time we see you and we're not going to play along.
I wasn’t using it as an argument for reverting to some past usage, just pointing out that language and grammar evolve. Just like it evolved away from that usage, it may evolve back.
I find the fact that some people are massively triggered by this fascinating. Personally I’m happy to use whatever pronouns people desire for themselves if they make it clear to me. I get it wrong occasionally cos I have a lifetime of doing something different, but if someone has explicitly made their desire known to me, I’ll make the tiniest effort of referring to them as they wish. It’s not a chore for me. I think of it as being polite to that person.
If I tell you my name is Mike, and prefer to be called that, would you insist on calling me Michael or Micky or Mickster? Even if I told you I was uncomfortable with that (for my own reasons which I don’t have to share with you)?
> every time we see you and we’re not going to play along
I’m guessing “we” is not referring to yourself here.
Ugh. Hacker News has traditionally been one of the worst places to discuss queer politics, but reading this has made me feel so frustrated that I can't help but weigh in.
> I find that sad. Do you also hold that view when interacting with people different in other ways? People that dress differently, hold different political or religious views, people from different places than you?
Yep! Ready to get nihilistic? Their existence is pretty much inconsequential to me. Sexuality, gender identity and appearance have quite literally zero bearing on the way I address other people. Unless someone makes a concerted effort to be my acquaintance, I will likely forget about their existence within the hour. That doesn't mean I can't sympathize; but the internet has greatly distorted our idea of how important other people actually are. We conflate identity with politics and alliances, we grok importance by follower count and Google search results; it's a disgusting mess that can only be effectively deterred by not caring.
Is it sad? Hard to say, but I certainly feel like it's a less frustrating way to live your life when compared to bending over backwards for everyone. I operate with my own interests at heart; as much as I despise Ayn Rand's philosophy, she wasn't wrong when she said that the greatest minority is the individual.
> I have acquaintances who wax lyrical on the topic of LGBTQ+ folks, feeling that somehow, non-LGBTQ+ folk are being "oppressed".
I don't think it's hard to sympathize with that sentiment, even though I'm a gay man myself. I feel embarrassed by the level of entitlement that the rest of the community seems to push, in public and online. A decade ago, the LGBT movement was pretty cut and dried - queer people wanted to integrate into society as normal individuals, without any pretense or opportunity for judgement. In response, they became a protected class and everything was pretty much solved. There hasn't been a legitimate reason to be mad as a gay person since those bakers refused to make a gay wedding cake, and even that only incensed me because it was against the law. As far as I see it, the modern LGBT movement is far too infatuated with liberties that don't exist, and hunting boogeymen that don't care. It makes me ashamed to be queer and wish that I could live in a world where my only identity didn't boil down to "the gay guy".
Which era of English would you like to certify as being the one and only correct English? US English? British English? 20th century, 17th century, or 14th century English?
> Which era of English would you like to certify as being the one and only correct English? US English? British English? 20th century, 17th century, or 14th century English?
How about the one that I personally use, not the one you insist on me using?
> Languages evolve.
Naturally and logically, over time, voluntarily - not through a small subset of the population smugly correcting you and shoving it in your face when they themselves (pun intended) regularly mess up the they/them pronoun BS.
I have heard it from a "sovereign citizen". They seem to use it when wanting to talk about themselves (flesh) inclusive of their various personhoods and corporate entities. I imagine that traffic cops find it unsettling for a lone driver to say "we" are going somewhere, as if there are other people somewhere unseen in the vehicle.
I have a laptop from 2009 or 2010 running at 800 MHz with a 32-bit CPU. It has to run an older version of Ubuntu (18.04) because nothing supports it nowadays. Even 32-bit packages are hard to get. I see no reason to use antiX or other esoteric distros since ubuntu runs fine on it and supports the hardware. I doubt antiX supports more hardware.
Someone else recommended it here, but I don't see the advantages over a robust package repository like ubuntu 18 or a minimal ram only distro like puppylinux. https://cheapskatesguide.org/articles/antix.html
Funny enough I got puppylinux running from a dos (windows) partition and running entirely from RAM on just 2GB on a Toshiba Portégé M200. I've even got Windows XP Tablet edition running on the SSD, but it can't really connect to much online due to the TLS limitations. And newer versions of the linux kernel don't support the wireless chipset. It is also difficult putting an old non-PAE kernel into a newer distro.
TLS really killed the utility of a lot of older computers with regards to using the "modern internet".
I have an old Dell with a 32-bit 2.33 GHz T2700. Linux fully supports the GPU, and no issues with missing 32-bit packages on OpenSUSE Tumbleweed. It's a spare browsing / retro gaming machine hooked up to the TV in the guest room. For gaming, it runs everything from arcade MAME to Mario Kart 64 like a champ. For browsing, it's not speedy but not bad on heavy HTML sites like gmail/youtube.
I agree antiX was a poor choice. No issue with PAE kernel on Tumbleweed i686. If OpenSUSE ever drops x86 support, there's always Debian or Arch 32 (if I want to stick with a rolling distro).
I still have a tablet PC from 2005 in rotation, and the lack of 32-bit apps is definitely a killer, but not terrible.
My original reason for reviving it was for use as a whiteboard in Zoom calls, but there's no 32-bit Zoom app - and I'm sure screen sharing while decoding 15 people's video would've been out of the question anyway. So I run a VNC server on it, and share out a VNC session from my work laptop instead.
I've also hit the issue where I've had to compile software for x86 using modern build toolchains. It takes forever, and more often than not, I run out of RAM (only 1GB). To get myself out of a pinch, I've mounted a 16GB USB 2.0 flash drive as swap space. Sure, it makes compiling even the most basic software a multi-hour process, but where this machine isn't my daily driver, it's still easier (to me, at least) than cross-compiling.
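In case anyone wants to do the same, the basic recipe is just (device name is an example, and mkswap wipes whatever is on it):

    # WARNING: wipes the target device/partition
    sudo mkswap /dev/sdb1
    sudo swapon /dev/sdb1
    swapon --show   # confirm the new swap space is active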
> I'm also very curious about antiX "proudly anti-fascist" distro
"Anti-fascist" doesn't actually mean that - it's a political dog-whistle.
> they're two debian releases late
That's in line with their use of Palemoon, which lags behind normal Firefox feature (and security) releases due to their decision to support older features (mostly XUL) (not that this is very avoidable, because maintaining an XUL fork is very hard work, and not for the faint of heart).
There's no unified "anti-fascist" movement, but the common theme among the self-described anti-fascists I know is the belief that physical violence has a legitimate place in democratic processes.
Frankly they remind me of a line by Nietzsche about staring too long into an abyss.
> the belief that physical violence has a legitimate place in democratic processes.
That's not exactly the point, though. We are not in a democratic process (unless by democracy you mean giving away powers to congress), and our society is very violent towards the most vulnerable segments of it.
Do you think giving back just a tiny portion of that daily violence we face is immoral or wrong? How is it justified for people to threaten us with guns if we don't pay rent to some arbitrary landlord or to detain us if we dare steal food for basic survival, yet attacking bank windows or punching an actual genocidal nazi in the face is seen as violent?!
Communists and anarchists who are willing to use force against their opponents. (this is based on both media reports and first-hand experiences in Portland, Oregon)
If you mean against the ruling class threatening a million species and the neonazis promoting eradication of many branches of our species, then yes i'm certainly advocating to stop these people by any means necessary.
> > We have no idea what crates.io thinks it makes sense to require javascript to look up packages but here we are.
> I've had a similar experience with crates.io:
They do have an API (ps: I built crates.live on top of it). I think they have very good reasons to block the crawling of their main website. Otherwise, people might abuse it. Actually, they recommend you identify yourself when crawling their API so they don't limit you. I didn't do it, and found no problem constantly calling their APIs.
So first, "crawling" a website is not abusing it. It's simply using the website and there's nothing wrong with that. Then, i believe that "not found" JSON message was not intended as an anti-scraping measure, but was in their view a meaningful error in the sense that i did not request info about a specific crate so the API responded "not found".
What's weird is that with no specific Accept header, http://crates.io returns a 301 with some HTML, and https://crates.io/ returns a 404 with some JSON, while in a browser you get a proper 200 with HTML. I just found that pattern very confusing, but hey, maybe i'm just an old dinosaur and that is the future of web development.
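For the record, the difference shows up with plain curl (at least it did when i tried; behavior may have changed since):

    # no explicit Accept header -> the JSON response
    curl -i https://crates.io/

    # explicitly ask for HTML
    curl -i -H 'Accept: text/html' https://crates.io/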
> Glad you enjoy your life at 800 MHz! I appreciated your article although the plural form to address a single person (not the editorial "we") makes me uneasy for political considerations.
You assume Artemis identifies as a single person. In all likelihood, they are a plural system. Statements like yours are microaggressive at best.
A plural system? You say that as though it requires no explanation.
I don't get the "political considerations" part, but this is the first time I've encountered anyone referring to themselves as "we" online, and I also found it jarring.
A plural system is multiple identities or personalities in one body/mind. Plural systems are increasingly demanding to be recognized and respected as such -- and companies are starting to comply. Much like trans and nonbinary identity, plurality is an aspect of identity we're all going to have to deal with now.
Genuine question: How do you differentiate this from full-blown mental illness? Because this sounds 100% like what society traditionally recognizes as schizophrenia/split personality disorder. Or, in more extreme cases & phrased less politely, insanity.
Does the United States currently recognize plural folks as a protected class? Do we even have the infrastructure to recognize them in any meaningful fashion? To extrapolate on that, how much research has gone into understanding the dysphoria that these people experience? Do we have a medical basis of understanding when it comes to how plural systems affect the mind? Do we even know if it's healthy to address plural systems as their individual components?
I apologize in advance if this sounds antagonistic, but putting plural identities on the same level as queer and trans ones seems... a little premature, if you ask me.
She got the Dragon Speech software, and I was surprised at how good it was. You can of course dictate all your notes, documents, emails. It also provides means to navigate your OS, start programs, close them, and a lot more. It is expensive, but she could do most of her work with two hands that didn't work.

A while back I saw a video about a guy who wrote code using such software (not sure what he used in particular). This can be tedious: "open bracket", "new line", etc. He had spent a long time tuning it so it was fast and efficient. He used a set of custom grunts and noises as "macros" for all the bracket, brace, and other symbols that are in heavy use in programming languages. If you were just listening to him and didn't know what he was doing, it sounded a bit distressing.
You'd use a custom vocabulary as well. So rather than "curly open" you'd use "heck", and instead of "enter" it would be "bark". I'm just making the actual words up here, but the point is to use a different/more simplified vocabulary that's also easier to understand by the computer.
I worked with a guy who wrote code like this. He was, indeed, pretty productive, but it was hell sitting next to him without good headphones. Was this guy you're referring to a long haired, kinda scruffy guy who had worked at Amazon at one point?
Did your GF or friend ever consider using foot pedals at all? I knew a programmer once who used various foot pedal combinations for different punctuation marks and tabs.
I had been looking into pedals before this ever started, and we looked at some different options, but could not find something that seemed worth it. I really want a set of foot controls to act as my mouse, since growing a third arm is currently not practical. I keep looking around and I know there are some solutions out there, but nothing in my price range that seems solid.
A cozy laptop sounds nice. I bet IRC is more than fast enough, surprised it didn't get a mention. Also, if you just want to read some text on the web as fast as possible, w3m might be worth a shot. I use it in TTY2 all the time to look stuff up. Browser CDN caches like Decentraleyes or LocalCDN might also be worth trying especially with the mnestic set up: you would only have to load certain JS bundles once per session.
>a dishonorable mention to twitter for being slower than Discord, we wish we were making that up
If you're just browsing Twitter, then the Nitter frontend (https://github.com/xnaas/nitter-instances) is way, way faster. Does not have algo-recs either, which could be positive. If you need to post, I assume you've tried spoofing user agent to mobile? This might help with bloated sites in general.
The 1000x480 resolution seems interesting. Maybe this machine would make a good single-purpose device for writing.
Also, somewhat related: Former Debian maintainer Joey Hess famously used a Dell Mini 9 for all his coding [1, 2]. I wonder if the Sony has a better, less cramped keyboard compared to the Mini 9.
Another interesting guy doing valuable work on low-end, underclocked hardware is Nils M. Holm [3].
Myself, I can get most of my stuff done on a Thinkpad T42 (underclocked to 600 MHz to reincarnate its dying GPU). With the ram-booted Tiny Core Linux, this thing still flies. I'm having a hard time ditching it because of the 4:3 IPS screen and excellent keyboard. I've even used it to produce lengthy radio programs for my country's public broadcasting.
Aside from web browsing, there seem to be more than enough software solutions, hacks, workarounds and programming languages for doing valuable work on rather old hardware these days. Really interesting times we're living in.
Then again, might be true that with yesterday's hardware, you're limited to solving yesterday's problems. I guess I'm fine with yesterday's problems in many aspects of life.
Some people don't know that, aside from media creation and consumption, we don't need that much power to do other things.
Most of my university assignments were done on an Acer Aspire One netbook (1.3/1.6 GHz dual-core Atom, 2 GB DDR2 RAM) and I had no problem. Programming in C, C++, and Python on Debian is simply great, and simulating circuits with SPICE-related software on Windows 7 is also good.
I started using it because it was lighter and more comfortable than the newer laptop I had (a 15" 4th gen Intel i5 laptop), and as a small device for reading PDFs it's great, so I ended up using it more and more, and for more tasks, keeping it exclusively for academic usage and leaving the other for games and media.
I still have my Samsung NC10; I had it running an IRC bot until recently in power-save mode with no fans. Opening a modern version of a browser is pretty revealing about how much heavier the web has become though.
The NC10 was/is a great machine. Considering the dimensions, it had a remarkably good keyboard. I also liked the "fanless mode". It felt quite sturdy, and, iirc, you could open the screen all the way down, to 180 degrees. The one I had for some time did suffer from its symptomatic "white screen" issue, though.
I did some writing on this machine, and I always felt really concentrated, quite possibly because of the small screen.
Seconding this. The NC10 was an amazing little thing for writing a lot when on the go. Too bad the white screen issues have killed most of them off by now.
As for media creation, SaaS is where it's at for weak endpoints. My ancient chromebook's battery is going and it could never run CAD, office, or video editing natively, but it runs Onshape, which is SaaS 3-D CAD, and Google Workspace/Suite/Apps whatever it's called this week, and WeVideo SaaS video editing, perfectly fast with no slowdowns or problems pretty much ever. The Onshape viewer works great on my phone and tablet, so if I'm building something far away from my desk, I've got the prints with me. Unlike my desktop keyboard, my tablet touch screen is sawdust-proof.
Another discovery I made a long time ago was that network connections are usually fast enough, and small battery-friendly CPUs slow enough, that it's faster to send a video file to AWS (or have it there to begin with), spawn a linux box on AWS, run handbrake in CLI mode to convert the video to some obscure format on a very CPU-beefy machine, download the converted file, and delete the huge (and expensive) AWS instance, than it is to transcode video locally. Some CPU-based transcoding is very slow if you don't have a lot of cores, and it's brutal thermally and on the battery.
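The round trip is roughly this, with placeholder hostnames and the instance provisioning elided:

    # copy the source up, transcode on the beefy box, pull the result back
    scp input.mkv user@big-ec2-box:
    ssh user@big-ec2-box 'HandBrakeCLI -i input.mkv -o output.mp4 --preset "Fast 1080p30"'
    scp user@big-ec2-box:output.mp4 .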
If you only have one SaaS app in your life, the old meme was: what do you do when the internet is down? Well, the internet is almost never down for me; I'd pick up my laptop and go to a cafe or library if it was. And everything I do is online or SaaS or VPN'd in, so I wouldn't be crabby about one app being down, I'd be crabby about being completely and totally shut down.
That anti-SaaS argument in the 2020s is like arguing that people have to drink bottled water because what would they do if tap water stopped working one day? If we're in a situation where the tap water stops working, then we've got bigger problems than which bottled water company to enrich.
The linked article seemed surprised that a 2009 device could play video, but I had been using MythTV for 7 years by that point, including occasional HD video on a relatively weak set-top-box class of computer, and had been doing youtube for a while, so his specs for playback seem very low compared to what I was doing in '09 on small devices, but whatever.
What is the allure and purpose of going back to 800 MHz? I mean, I did it myself this week, but was frustrated enough to think it's a really dumb idea, a waste of time. I can't even articulate why I did it in the first place.
I used a Raspberry Pi 4 (1500 MHz) as a daily driver for 4 days. Struggled with hidpi scaling, no Signal Messenger, an overheating CPU, Youtube at 360p, HTML Gmail.
I went so far as to upgrade the Pi to an SSD, plus a heat sink. Considered adding active cooling... but then said nope, back to the Macbook Pro. Why do we even try?
Change your workflow. You cannot expect a less powerful system to perform the same as a more powerful system.
Rather than watching YouTube directly, use youtube-dl with VLC. Rather than using HTML Gmail, use IMAP and a native email client. Rather than using Eclipse, use vim.
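For example, something like this for the YouTube case (the 480p cap and URL are placeholders; pick whatever your machine decodes comfortably):

    # download a modest-resolution copy once, then play it with a native player
    youtube-dl -f 'best[height<=480]' -o video.mp4 'https://www.youtube.com/watch?v=...'
    vlc video.mp4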
We all fall into patterns. We grow to find comfort in those. But, we can't expect to maintain those patterns when circumstances change.
I also found it confusing. I was wondering if it was this person's preferred pronoun but their Twitter [0] lists "she" as of "January 2022" and all the testimonials use "she" too [1].
"They" is almost exclusively singular in those circumstances, regardless of its etymology. Similar to how "you" derives from the old English second person plural pronoun, but is in virtually all variants of modern English acceptable for the second person singular.
"So this thing’s main job is to help us stay off our phone, since touch screens are the hardest on the health of our hands."
I have never heard this before. On the other hand I have heard about keyboards being an issue many times. Anybody else know anything about touchscreens being harder on hands than keyboards?
I can only speak for myself but I find using a smartphone upright one-handed for longer than a few minutes uncomfortable because of the bizarre positions it forces my fingers and palms into. It's a flat, slate-like object and our hands are designed to grip round things that protrude into our palms. I also need to keep my thumb free to use the touchscreen and my hands are on the smaller side so I end up balancing it on a protruding pinky and contorting the inside back edge of my hand. It's no problem for me to use two hands or hold it in a screen-up orientation but holding it in front of me for more than a minute isn't fun (despite having decent dexterity and grip strength from deadlifting, typing, playing instruments, etc).
I do wish landscape typing hadn't gradually gone by the wayside. I used to be a firm landscape-typer and found it much more comfortable for my hands, but I've accepted that phones just aren't designed for it anymore - too often the keyboard fills too much of the screen to see the textbox adequately.
I sometimes tend to let my phone fall into the nook of where my pinky finger meets my hand, when I want to hold it near-vertically/upright. I can agree that this doesn't really work well long-term, it's much more comfortable to hold it flat since it's less likely to simply fall out of my hand (...) that way.
I occasionally brace my index finger against the top edge of the display; this used to work great on my Note 3 with its giant bezel (particularly at the top), my current Mate 20 Pro's notched edge-to-edge screen doesn't play well with this though :(
I definitely find "large" phones to be very uncomfortable and cramping to use for more than a few minutes. That's why I haven't upgraded from my first-gen iPhone SE. That's even a little too big - the iPhone 4 and previous iPhones were the perfect size. I'm not a heavy smart phone user, but for the two weeks I had an XS, it was the most difficult phone I ever had. I couldn't hold onto it, and was dropping it constantly. So I gave it to my brother and bought another SE used (before the XS, I had an SE that got water damaged.)
None of the studies you linked concluded any significant findings. The most significant, which was unremarkable, was from the first:
"There is limited evidence that MTSD use, and various aspects of its use (i.e. amount of usage, features, tasks and positions), are associated with musculoskeletal symptoms and exposures. This is due to mainly low quality experimental and case-control laboratory studies, with few cross-sectional and no longitudinal studies."
That tripped me up for a few minutes. I decided that since the author is using "we" as a singular, not a plural, it is that person's experience only and not meant to be a blanket statement. Also, I can only assume that the author meant a touch screen on a phone and not, say, a touch screen on a laptop, because I can't imagine how that would be difficult on someone's hands.
"our" is not the royal we here, this person refers to themselves with a plural sounding pronoun for identity reasons. They are not making a statement about ergonomics generally even though of course I can see how that can be confusing.
Touchscreens aren't really hard on healthy people's hands, but this person clarifies they are disabled, so this sounds like an edge case for them: if you're disabled in some ways, a nice fat keyboard is just going to be a lot more gentle on your hands and fingers than the tiny thumb keyboards of mobile devices.
Additional context: this is a person who wrote a brainfuck interpreter in sed on my couch using an iPhone. I tend to trust their input as to typing comfort implicitly.
That's hilarious (and very impressive), but also not applicable to the vast majority of humans that use touch screens! I mean, just how much do people type on touch screens?
Out of curiosity I checked eBay to see the going rate for this particular portable; of the two listed, one has a starting price of $350 and I watched the other go from a $99 bid to $150+ in an hour. Apart from the quirkiness or the need to replace one's recently dead machine, I can't wrap my head around such a high price for such low performance. For a little more than the higher priced unit, one can get a Gemini PDA or similar device with a more modern and faster processor, and come out even more portable and with excellent battery life (though I did note the author's need for a non-touchscreen device due to a handicap, the touchscreen on a modern portable doesn't have to be used if there's another pointing device).
Thanks! Kind of feel left out when folk here start remembering their c64 and Ataris and whatnot!
My first computer was a celeron 500MHz with windows 98 (maybe there was a 300MHz with win 3.1 but I never got it working).
So, this blog is nostalgia! Winamp and the Linux clone!? DDR2!? Back in my day we had some other thing whose name I don’t remember (SDRAM?), and we ruled the city because with winrar we could use the T1 of the university to download stuff, then split it across four 3.5” 1.44MB floppy disks to install on our computers!
Oh, and CD-R changed the game forever! And usb… it took a while and a few dongles (parallel to usb, serial to usb, ps2 to usb) and hunting down the proper .inf file, but it was glorious!
I got my first computer in the days of CD-ROM and was amazed that a CD could hold more data than my Win95 (later Red Hat 6 (not RHEL)) Pentium Packard Bell's 512MB HDD could.
"Thanks kind of feel left out when folk here start remembering their c64 and Ataris and whatnot! My first computer was a celeron 500MHz..."
In a lot of ways, between the c64 and the celeron 500MHz is an enormous, almost unrecognizable leap, whereas between the celeron 500MHz and the machine in my hand is just a lot of incremental change. I had a machine ~2000 that was de facto a 500MHz Duron (running at its full 1GHz overheated very quickly), and I used the same basic paradigm on that as I'm using now. Emacs, browser, terminal windows, MP3 player. Wifi needed a dongle. The integrated webcam is new since then.
My first computer connected to the internet was the family Pentium 100MHz with 64MB of ram that ended up retired by my father for a much more powerful Athlon whatever. It used to run Windows 98. I inherited that one and immediately ditched the Windows 98, as it couldn't let me listen to music while browsing and using office apps; audio was stuttering all the time.
This is actually the reason I started using linux. I remember the internet was usable at 100MHz back in the day, and I could play some videos (obviously at a much lower res; I was using a CRT). The funny thing is some of the apps mentioned here already existed at the time, so it resonates with my experience, although back in the day I tended to prefer apps running in terminals unless it was absolutely necessary. Emails, music player: I used a terminal for all of that. My computing life was not that different from today, bar the video resolution increase. And the web wasn't less interesting or usable.
It is incredible how crap the internet has become, such that we can't reasonably expect to browse it comfortably on what would have looked like a supercomputer at the time of my 100MHz Pentium.
>I used the same basic paradigm on that as I'm using now. Emacs, browser, terminal windows, MP3 player.
I don't disagree with your overall point, but two parts of that paradigm, Emacs and the terminal emulator, do date back to the C64 era. Here's Richard Stallman on developing GNU Emacs: "There were people in those days, in 1985, who had one-megabyte machines without virtual memory. They wanted to be able to use GNU Emacs. This meant I had to keep the program as small as possible." [0] Emacs may never have run on Commodore computers, but my impression is that it ran on very similar computers.
Closer than you think since IIRC the PIC, like the Z80, takes 4+ clock cycles to complete one instruction and the 6502 can sometimes do it in 1 (albeit a much simpler/limited core, but obviously Commodore/Apple/Nintendo et al made it work).
That's basically the CPU running at 1.024 Mhz. The video hardware is dumb, runs independent of the CPU, and just scans a region of memory to send pixels to the display. All software pushing pixels otherwise.
You are not wrong with the NES, C64 and other machines using a graphics chip with sprites and other hardware features to assist in various ways. But, quite a lot happened on the CPU.
BTW, this game is done on a 1MHz 6809, all software pushing pixels.
(I would skip out to the middle somewhere in that video to get a sense of what is being done)
On that game specifically, it's a single frame buffer. Screen divided into two halves, each drawn while the display is delivering the other to the player.
The Fujitsu FM-7 line of 8-bit computers actually shipped with two 6809 compatible CPUs (Hitachi 6309 IIRC) and the second one just did software graphics the whole time pretending really hard to be a GPU.
Oh wow, great to see someone else still enjoying and even using a Vaio P!
After lots of lusting over them back when they were new (1), I managed to find a used gen 2 one a while back and just adore it. To me the gen 2 series devices are still some of the most beautiful gadgets ever designed, but I am a huge Sony fanboy so ymmv.
I rock a neon green version with a blazingly fast 1.6GHz Atom and a crisp 1600x768 screen - it's still quite usable like OP describes, runs fine with Lubuntu, though nowadays I only use it to play some DOSBox games once in a while.
(1) I forgot the name/url, but there was this kinda famous website of some shop in Hong Kong that would import all these great - mostly Japan-only - laptops to the US/EU, even in often very rare configurations (UMTS etc). Maybe someone else on here remembers!?
I used to go through dynamism.com and conics.net for importing Panasonic Let's Note and Sony laptops. Was it either of those that you were trying to recall?
Is it truly necessary to use such an archaic laptop to get the two essential features described at the beginning of the article: ultra-light weight and a trackpoint? I know the Surface Go 3 is light enough, but IIRC the type cover has a touchpad, not a trackpoint. In theory, with the ongoing miniaturization of electronics, there should be a modern option that meets these criteria. But of course, the mass-market nature of hardware means that there won't always be a current-generation device that is optimal for a disabled user like the author of this article.
For a brief window of time around 2013 Acer made a really nice little core i5 11" laptop (Aspire V5-171). Mine still works but I'm upset that the mass market seems to think that size is only for refurb and chromebooks now.
This was my thought as well. Even an older MacBook Air can run Linux and is lightweight enough to carry around, if weight is an issue.
It sounds like this person is just cheap (it's fine to be thrifty), but I'd rather spend a little more on hardware that doesn't get in my way of accessing medical information or communicating with others if that's my only method due to illness or medical conditions.
A Vaio P is not a cheap option. They were expensive new and are kinda collectable, so prices are high on eBay and other places compared to other laptops from the same time. It's also considerably smaller than even the 11" MBA, and weighs half a pound less too. Remember how Steve Jobs pulled the MacBook Air out of a manila envelope? The Vaio P was designed to be pulled out of a jacket pocket.
It’s not even necessarily that low depending on the standards you apply / references you use, lots of chips have base clocks which are quite low especially for low-power CPUs or SoC.
For instance the Atom x6200FE has a 1GHz base clock. According to its spec sheet it can’t even burst (while the higher-rated X6211E has a 1.2GHz base clock and can burst to 3).
Your problem’s more likely to be that it’s an Atom from 2008 (which implies lots of performance-related concerns, like being pre-Bay Trail and thus in-order), than it being 800 base / 1.3 burst.
I believe 0.8 Hz would be about par with the earliest electronic relay machines. So (assuming it doesn't take 10MW), just about useful to compute admiralty tables.
Like you, I was a bit disappointed when I realized that I wasn't about to read some half practical computing at 800mHz
Near the beginning of the pandemic I got frustrated with the trackpad and battery life of my lenovo yoga, so I bought a ~$250 asus l204m.
Aside from my 2011 15 inch MacBook Pro which also had its issues, this has become my favorite laptop. I don't mind the small keyboard surprisingly, and I find myself getting light work and practice problems done while my wife and I watch TV.
The cons: video playback, the screen resolution, and something about how the screen refreshes is also odd. 4GB max memory. I carry a dongle to use a generic usb-c charger.
The pros: Actual 10 hour battery life (mint xfce), and I can get 12 if I drop the screen brightness. Full size HDMI port. Great linux compatibility (from what I can tell). MicroSD expansion sits flush. Light and small, and I actually prefer 11-12 inch laptops now. Only costs $250 so I throw it in a bag if I'm going somewhere.
I get the fun around these devices and cyberdecks, and I have a couple raspberry pi projects, but at $250 for an x64 processor and 4GB of memory with a keyboard, screen and battery, it's not even a close call for me.
Despite being (ostensibly) state-of-the-art technology, I feel similarly about my 2021-vintage MediaTek MT8183-powered Chromebook.
Despite costing sub-$300, its CPU is comparably powerful (according to Passmark) to the Vaio VGN-P588E's contemporary desktop CPU, the Intel Q6600. Of course, few PCs in 2009 had 4GB of RAM (to say nothing of the GPUs of the time).
The MT8183-based machine offers a surprisingly capable computing experience, allowing for simultaneous Meet presentation + JavaScript-heavy web application usage, all at that retro computing price point.
Where it ceases to feel like my X61, however, is in battery life. Where the X61 only lasts a few hours of heavy usage with a fresh battery, the MT8183 chugs along for 12+ hours.
The "Cadmium" distro (Debian based) seems to have some support for a "Duet" device, which I assume is the mt8183 based Lenovo Chromebook Duet. They say that the cameras, hw accelerated video decoding, and external video output do not work.
> Ripcord is a desktop chat client for group-centric services like Slack and Discord. It provides a traditional compact desktop interface designed for power users. It's not built on top of web browser technology: it has a small resource footprint, responds quickly to input, and gets out of your way. Shareware is coming back, baby.
Some years of using this and I'm quite a fan. Voice works, but not streaming video, last I checked.
Wow, seeing XMMS brings back some memories. Before Pandora and streaming players, I had a machine under the bed that only ran XMMS to play music in the room. It allowed control via the gamepad port, so there was a gamepad in the room to play/pause/switch songs.
Via TLP on Linux, I've been capping my laptop's CPU (i7-8665U) to 800MHz whenever I'm on battery. 800MHz on a relatively modern CPU is remarkably fast and sufficient for most things.
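For reference, the setting I'm using in /etc/tlp.conf (the value is in kHz):

    # /etc/tlp.conf -- cap CPU frequency on battery (kHz)
    CPU_SCALING_MAX_FREQ_ON_BAT=800000

    # apply without rebooting
    sudo tlp start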
Out of curiosity I've got the CPU frequency being polled periodically and updated in my taskbar, and the CPU spends a remarkable amount of time bouncing between ~600MHz and ~800MHz, because even when actively working, it's quite quiescent. Obviously compiling, running the test suite, browsing, etc. will cause it to jump up to full speed (4.1GHz with turbo, or thereabouts).
One of the things I've found myself doing is paying a bit more attention to _what_ is consuming CPU resources when that frequency goes up. For example, I noticed that Zoom will randomly consume a couple of % of a CPU for about 20-30 seconds periodically. I know it also maintains some kind of notification hook to Zoom infrastructure. I don't need that persistent feature, so now I have a lightweight bash script that looks to see if I'm in a Webinar or Meeting, and if not, nukes zoom. The advantages are probably minimal, at best, but it took my fancy for whatever reason :)
> Via TLP on Linux, I've been capping my laptop's CPU (i7-8665U) to 800MHz whenever I'm on battery.
in most cases, on modern hardware, limiting the frequency significantly below its nominal maximum will reduce battery life. for a fixed amount of work (e.g. parsing an HTML document), it is more efficient to complete the task as quickly as possible then return to a low-power state. the picture gets somewhat murkier when considering increased voltage requirements at higher clock speeds and certain fixed-wakeup workloads, but the majority of scheduler tuning for battery-powered devices over the past decade has been towards going to sleep as quickly as possible, even if that requires a high peak frequency.
The device referred to in the article gave me the idea to buy/make a similarly lightweight device with which you can surf the Internet, chat via XMPP, and use e-mail, but at the same time receive calls and SMS. Also, the device needs to run for a long time (ARM?). Sounds like something that could be done on a Raspberry Pi, but I'm not sure. It would be a good replacement for the phone, to be honest. It would be possible, of course, to take a simple phone for calls and SMS, but I recently read an article about the fact that such phones can have backdoors. In general, it would be cool to give up calls completely, but unfortunately there are people who don't have other options. Besides, I go to places where the Internet does not work. In general, are there such devices in the real world that meet the above requirements?
Loved this, I found myself nodding along as I was reading the article. I had a similar setup a few years ago when I was still studying and this brought back a lot of memories. In my case it was a Compaq Mini with a 32 bit Slitaz install running on a similarly specced Atom CPU and 1 gig of RAM. Firefox was usable when combined with the noscript extension, but like you I limited myself to lightweight websites and to the mobile alternatives of the heavy ones. I even had the Audacious* Winamp skin that you mentioned in the blog post.
One other thing I remember being especially problematic was those websites that had large footers and navbars. Medium was one of the main culprits. The navbar and footer covered a large portion of the small screen, leaving me unable to read more than a few lines at a time. Back then I had "fixed" it by making an extension that removed them from the DOM, but now I realize that uBlock origin supports a similar feature. I'm pleasantly surprised to see that there are others out there with a similar setup!
I do wish that more laptop vendors would consider the really-thin-and-light market. There used to be all sorts of weird pocket/palm PCs available in the sub-2lb range, but I guess that phones and tablets have pretty much cannibalised the market.
There are plenty of 10in Chromebooks out there, and some can be set up to run Linux easily. I think some of the Acer models are below 2lbs. Other than that, there are some little 6 and 7in mini netbooks, but they aren't cheap. You might as well just buy a decent phone or tablet for that price. I think the netbook phase was a means to an end, where we now have very small, capable PCs in our pockets at all times.
I've got an XPS that I never turn on, an S21 phone that I use sparingly, and a USB-C to HDMI adapter at home that lets me turn my phone into my desktop computer (Dex).
This is me typing on a work computer, but I don't count that. My computer is a phone.
I've owned almost all generations of these little machines (in addition to a Psion Netbook) and to this date they're among the design & form concepts that I miss the most. With the docking station and optional extended battery, PCMCIA, etc, they would adapt to any work environment, but even in bare-bones mode, you could get some actual work done — which isn't always the case with an iPad unless you add bulky extras. I wish Apple had the balls that Sony had back then.
I'm not following hardware news really; just buying whatever Apple cranks out. But I'm considering buying a late-'90s Psion 5mx as a mobile typewriter.
On Emulation, Mednafen should run fast if you set the right core for SNES, avoid any opengl and shader output, and compile it yourself with "-march=native -O3" and the rest of CFLAGS and CXXFLAGS.
It should emulate any 8 and 16-bit systems well, even the GBA (which is 32-bit).
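Something along these lines for the build, assuming the usual autotools flow:

    # build Mednafen tuned to the local CPU
    CFLAGS="-march=native -O3" CXXFLAGS="-march=native -O3" ./configure
    make && sudo make install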
Also, on low-end systems, solene@ from openbsd wrote a challenge on her personal site (gopher and gemini too) about keeping yourself on a single-core device (grub/lilo option available just in case) with 512 MB of RAM at most.
Regarding games: PS1 games may run with PCSX-Reloaded, aka PCSXR (the most-current-before-RetroArch fork of PCSX), if the machine has any 3D acceleration. However, RetroArch's version is probably out; its requirements are higher.
N64 emulation may also be possible; however, AFAIK it's a gamble whether accuracy is OK and games don't glitch all over.
Of course, mashing the gamepad (or the keyboard) is not good for the hands. OpenTTD is indeed much more relaxing on the fingers.
Yes, it was pretty limited and not quite useful yet for real work, but it shows what could be achieved on an 8-bit 2MHz CPU with less than 64K of usable memory.
Personally, I would use the iPhone 3GS as the baseline minimal hardware that can support all of the must-have features of real-world software, because the 3GS was the first model capable of running the VoiceOver screen reader (in addition, of course, to all the other things it could do). But then, I'm sure I'm over-emphasizing the must-have feature that matters to me.
I think both comments are true. Take something as lean as GEOS and upgrade it to support 5K displays, HDR color, Unicode, etc., and I still think it would be several orders of magnitude smaller and faster than most of today's software.
The form factor is great; Sony's implementation isn't. The keyboard is poor, the trackpoint not very responsive or accurate. (And I like TrackPoints!)
Mine is maxed out with 2GB of RAM, a dual-core Atom, and the Intel Poulsbo GPU (GMA 500). The author of this piece is running the screen at way below its true resolution of 1600x768. Not a typo: it's a high-DPI, letterboxed ultra-widescreen.
It came with Win10. It was unusable; 10-15 minutes to boot and log in.
I tried Xubuntu $CURRENT then, maybe 18.04. Very poor; ~8 minutes to boot and log in.
It just about manages to run Windows 7 ThinPC, the "thin client" version of Win7. It works, but it's not responsive. I am considering downgrading to TinyXP.
I have tried multiple Linuxes:
• Xubuntu (too heavy: used lots of RAM at idle, very slow)
• Devuan + Xfce (usable, took 250-300MB RAM, a bit sluggish)
• Crunchbang++ (worked, responsive, but surprising memory footprint of over 200MB)
• MX Linux (worked fairly well, looked weird, felt clunky; adjusted screen DPI & it broke the desktop cosmetically: text didn't fit inside buttons, etc.)
Currently it runs Raspberry Pi Desktop, x86 edition. This is surprisingly good. It idles at a bit under 200MB of RAM, and LXDE (now PIXEL) works well, including support for a vertical taskbar. Screen DPI can be scaled. Quite snappy.
I tried installing Xfce and it substantially increased RAM usage, to circa 350MB — nearly double. That was an unwelcome surprise, but it shows how hard the RasPi folks have cut Debian down; I am impressed.
I am playing with antiX in VBox as I type this (no, not on the Vaio P), and it's weird, but it works well, idling after a full update at an amazing 106MB of RAM. I may try this on the Vaio.
I wish I could get the Vaio's GPU working, but no modern Linux can run the ancient Poulsbo drivers.
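For anyone who wants to compare idle-RAM figures like these on their own machine, here's the rough way I measure them (nothing distro-specific; on modern kernels the "available" column is the honest number):

    free -m                                   # overall idle usage, in MB
    ps -eo rss,comm --sort=-rss | head -n 10  # ten biggest resident processes (RSS in KB)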
Google search results show that their repeated response to questions like that on their forums is that anyone who disagrees with them in any way, or criticizes anything they've done, is literally Hitler. They're apparently very hard to get along with.
It's a rather interesting and extreme solution to endless bikeshedding arguments. Should "our" editor be vim or emacs? Well, anyone who disagrees with me is literally Hitler, so we're going to use XYZ, and you'll like it or go away. That's an interesting strategy for saving time on eternal discussions.
It goes back a long way, and it stems from very minimalist, free-software-only Linux communities. Many of these communities I've been privy to for 15+ years take pride in running in the smallest memory footprint and are anti-Java, anti-JavaScript, anti-Poettering (systemd as a backdoor), etc., and this is the premise of it. These communities will get very mad at you for suggesting you add in the things they deeply hate, i.e. JavaScript or non-free software blobs.
I did poke around a bit to try to see if there was an explanation, or any explicitly anti-fascist features. I guess it is just intended as some kind of in-group identifier? I had thought maybe they were implying systemd was fascist, since they stress that the distro doesn't use systemd.
I guess I assumed there would be a stronger connection: given that the distro is called "anti"-X and it's "anti"-fascist, I expected some explicit Tor integration or something.
I think the subtext is that some people associated opposition to systemd with right-wing politics (not entirely unfounded, if I understand correctly), so the maintainers wanted to get out ahead of it when releasing another distro without systemd.
>>antiX is not anti politics, it is anti fascist politics. Politics is everywhere whether you like it or not. Put simply, we do not tolerate politics or people spreading hate/prejudice/violence against people because of their skin colour, race, religion (or none), gender, sexuality.
The only other time I've heard of "IT-fascism" is when this 'abebeos' character threatened someone's job for not taking seriously a 50% claim on a $7k bounty for 'documentation, testing, and integration work'.
The idea behind it is using only free software, and nothing non-free. The guy who runs AntiX goes by the username Anticapitalista and the theme is not supporting corporate computer systems and consumerist cruft.
I recently spent a fortnight using a 2009ish Asus Eee PC with a whopping 2GB of RAM. It wasn't terrible running Manjaro Linux i3. For browsing and social media it was basically the same experience as the author describes. But as a handy note-taking and Anki machine by my bed it was perfect. I could type out notes on its almost-full-sized keyboard without distraction. I could review audio and picture cards just as fast as on my phone, but with no danger of a notification or of "accidentally" opening Facebook.
On a small screen a tiling window manager is a must, IMHO. No space is wasted on bars or widgets, and any app can go full screen at the touch of a button.
The only reason I stopped was that it developed a whine in the CPU fan that was particularly annoying.
Nice post - reminds me of my experiences using an IBM ThinkPad T42 (2GB RAM, 1.6GHz single-core CPU) and a Raspberry Pi Model B (512MB RAM, 700MHz single-core CPU). Quite a bit of compilation is required these days due to a lack of binaries, but it's still a fun exercise for a weekend.
I once clocked my CPU down to 400MHz for a week or so (an i7-7700K, I think) when the pump of my cooling loop died, to keep it below 100°C, although it would have throttled down on its own once at the thermal limit. Other than games running terribly (of course, duh), I couldn't really tell much of a difference. Some things were slower, like compiling code or 7z, but it didn't feel like a throwback to the late '90s, because what made computers slow in those days were HDDs. Oh, and there is GPU acceleration for so many things these days... watching 4K videos, for example, was no problem at all.
I ran these specs or a very close approximation as a daily driver for many years on a couple Gateway Atom netbooks. I consistently ran Debian unstable (Sid) with minimal window managers and desktop environments from 2011-2015 or so with mostly i686-PAE kernels.
I was confused by the constant use of "we" in the writing here, and at first assumed this person was sharing the netbook with multiple other people. By the end I came to realize it was something more like a split-personality usage? I found it odd.
Wow. Atoms were slow when they were new: 2-4 times slower than an E5300 Core 2 Duo or whatever was common in 2009.
That said, with maxed-out RAM and a cheap SSD they were "enough", and they came in some neat form factors. I had the Lenovo S10 netbook, but the 1024x600 screen was very hard to live with. They didn't offer anything special in the way of power savings or battery life, either.
For the price, a 2-3 year old Dell or HP laptop was a better choice, and then the iPad came out...
I have a PowerBook G4 with upgraded memory. The only caveat is the battery, which gets hot quickly, so I have to keep it wired to power every time.
I installed a few applications, including a web browser, but then got bored. There is nothing it does that a modern computer cannot do, and do miles better. I'm not sure what I'm going to do with it, TBH. I made the purchase on impulse and now I'm paying for it. Fortunately it didn't cost too much.
I bought a Chromebook 4 on Black Friday (Celeron N4000, 1.1 GHz, $90 then, $120 now) and similar to the author, I find it pretty useful but sometimes requiring patience.
Best part is it doesn’t have any work stuff on it, so I can do my own light tasks on it without any temptation (due to inability) to have work leak into that time. That’s worth a multiple of the purchase price by itself.
A 733MHz Pentium III with 512MB of RAM (upgraded a few times; originally 128MB) was my daily driver for a long time, and if I stayed away from the web-based stuff, would probably be quite usable for a lot of day-to-day things today as well, including native software development.
> (Shoutout to lib.rs btw for offering a rust crate database that actually works without javascript. We have no idea what crates.io thinks it makes sense to require javascript to look up packages but here we are.)
Well, Rust is a language primarily targeted at web developers after all.
Oh, I've been looking for a similar Sony VAIO, but one packing a Core 2 Duo... there are just so many Sony laptops that I could never find which model it was (a college prof used it on trips).
> Technically Gentoo is also in the running, but can you imagine trying to compile all your packages from scratch on a system that benchmarks worse than a raspberry pi 3?
Hmm, I knew at least one person who did it... Yes, it's exactly how you'd imagine.
I initially started using Gentoo on a Pentium 166 laptop with 64MB of RAM in the early 2000s. It was pretty usable after the couple of days it took to install everything.
This computer reminds me of my fondness for the Compaq Presario 615dx: a little higher spec, but the same kind of computer, one of the inch-thick ones. That laptop had a really nice keyboard, IMO.
I was using Linux Mint and the Bluefish/Kate text editors.
I notice IceWM gets mentioned here; I'm always looking for a good environment to run for my VNC sessions. Right now I'm using FVWM; can anyone comment on the pros/cons of switching to Ice?
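Not an IceWM-vs-FVWM verdict, but trialling Ice under VNC is cheap. A minimal ~/.vnc/xstartup sketch (icewm-session is the wrapper that also starts the tray and background setter; remember to chmod +x the file):

    #!/bin/sh
    # ~/.vnc/xstartup: launch IceWM for the VNC session
    unset SESSION_MANAGER
    unset DBUS_SESSION_BUS_ADDRESS
    [ -r "$HOME/.Xresources" ] && xrdb "$HOME/.Xresources"
    exec icewm-session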
Ah, to use an ancient device and be genuinely happy you won't waste time on video games, only to eventually install some games that do run smoothly on the ancient CPU. I too have had this experience.
I like the new 16-inch MacBook Pro, but I definitely prefer my iMac Pro and am looking forward to replacing it with something new presented this year. Life is too short for crappy hardware.
There are other usages of "we" for an individual. I think the writer is trying to use what is called the editorial we, not the royal we.
I will say its use in that article is rather jarring, however.
edit: Looking it up, it appears this has a fancy name encompassing the royal we, the editorial we, and more: Nosism. https://en.wikipedia.org/wiki/Nosism
Interestingly, the past blog posts use a singular "I" up until the middle of the post "A Story of Microsuites, and Atrophy" from 21 September 2021: it starts with "Let me give you a view" but ends with "We lived there for three long years".
But diagnosing over the Internet, while a fun pastime for the diagnoser (diagnostician?), is very unreliable.
I figured maybe the author used they/them as their pronouns. But the main page has some testimonials referring to “she”. So idk what’s up with the royal “we”.
Sometimes in science, people use the collective "we" in their writing. But like you say, the OP is using the royal "we", so I don't think scientific writing is the explanation either.
Their next blog post talks about being banned from Twitter because they put in a birthdate that made them under 13, so I'm guessing at some point they decided that they were reborn as some sort of collective (Twitter profile says "hardware witches | server maids"). Thus, "we" instead of "I".
It's Hacker News. This kind of stuff is pretty normal here.
It is. You have to understand that the way you see yourself is not necessarily the way other people see themselves. For centuries people hid this from society, but now people are open about it and it turns out that it's pretty OK.
I'm not a native speaker, but AFAIK if you use "they" to avoid specifying a gender, it's "singular they" (https://en.wikipedia.org/wiki/Singular_they), so using the first person plural still doesn't really make sense...
First-person pronouns aren't gendered, so even if this were a they/them-using person, the "we" still doesn't make sense. What it did do was distract from the blog post, to the point that I stopped reading it.
I just kept picturing someone creepily talking to their hairless cat while saying 'we'
I stopped reading because there was no background to anything. I landed there because I saw it on HN, but I had no idea what was going on. After a few paragraphs, I lost interest.
Uh, they started to use an old laptop (presumably one they just had lying around) because they wanted to spend less time on their phone, and here's how it worked out. To be honest, that's a fairly common story structure here on HN (with the precise details differing, of course). What more background do you want or need?
I can’t quite put my finger on it, but I’d appreciate something along the lines of who they are and why the reader should invest their time to read on.
Funny, I had a different theory: a concerned, tech-savvy parent trying to save their children and spouse from the hazardous impact of touch screens.
> So this thing’s main job is to help us stay off our phone, since touch screens are the hardest on the health of our hands.
So let's heal those small children's hands with this small keyboard :-)
I really wish someone like System76 would take this concept farther: a Happy Hacking-style keyboard layout, no fan, no trackpad (clit mouse OK), and a focus on durability and repairability. IP67/68 would be cool. Sell the low specs as a feature, both for the ADHD types who shouldn't be doing too many flashy-light things and for hobbyists to see what they can squeeze out of it.