
Finally jumped ship to a Jellyfin-based home server and couldn't be happier.

The UI is surprisingly good and polished (especially for users who don't have to manage the library), video quality is amazing (with BD source files, who would have thought, but even DVD is often better than what modern streaming provides), and I can cache movies on my phone when needed.

It works in ANY browser under ANY OS, doesn't have ads, doesn't track me, and has all the content I could ever desire (which I wouldn't be able to find in any one service; in some cases, in ANY service).

I can have any combination of a subtitle language and a voiceover.

The overall cost was only 500 for a used M1 Air and 16TB of external storage.


I ditched all the services years ago and use a similar setup. Works well enough for me.

I find the free version of Plex, once I configure away all their own streaming junk, is perfectly good (and it runs acceptably on my ancient Synology). Are there any compelling reasons for me to look into Jellyfin?

It’s free as in freedom and open source. This isn’t just a thing for people who are preachy; it’s also a sign that it’s less likely to change the terms of the deal, so to speak.

I'm perfectly fine with paying for software, so the price wasn't a leading factor in my choice (and I have contributed to Jellyfin).

It just looks to me that Plex (as a company) isn't really as reliable for self-hosting in the long run. So even though Plex has better client support (for example, on Xbox and PlayStation), I decided against it in favor of something that only I am in control of.

I initially intended to buy a license for Emby, but it doesn't support hardware transcoding on Apple Silicon yet, so Jellyfin it is.

If you are happy with Plex, there's no reason to switch, IMO. If something goes wrong, it likely won't take you long to connect an alternative to the same media library.


The main reason that finally pushed me to move over was that Plex works locally (with no internet connection) for only so long, and then you are done for without a beacon/login home, I guess. Dumbest "feature" ever, but I guess I get why they do it.

We had our power and internet go out for an extended amount of time earlier this year, and shortly after I converted us over from Plex to Jellyfin. Quite an easy and painless setup: I mostly just recreated my libraries in Jellyfin and was good to go.


Jellyfin is how you're serving it. Where is the content coming from?

They said Blu-ray and DVD.

Some of us have quite a few DVDs/Blu-rays that we could rip. A lot of good stuff can be found in bargain bins and closing-down sales (I got all nine series of The X-Files for next to nothing). Personally, I don't bother ripping them and just download them instead, but I am not paying, like, five times for the same movie/TV series that I have already paid for.

A lot of films have been re-released on different media formats, with several different "cuts" which normally add maybe a few extra minutes of dialogue.


I have to investigate this further. I bought a bunch of UHD Blu-ray documentaries that I can't watch because my 4K TV is a 'monitor', which I only found out was a problem after also having to buy an expensive 4K HDMI cable that was supposed to fix the problem.

Since I couldn't work out how to back up the discs, I now also just buy second-hand DVDs for when I need my pacifier box. Anyway, for this old man it turns out that the old movies are the best: no shaky camera and clear, audible dialogue. Plus some of the modern discs are now also polluted with adverts, which is horrible.

If we lived in a sensible world, buying the item once should give me rights to every format available. I might even consider digital purchases again if that was the case, especially since we have seen too many of these services fold and take our purchases with them.


...and the more low-trust the society becomes, as if that's not already the case in plenty of places.

It's no coincidence that people have always judged and shunned such overt manipulators, as well as tried to downplay the underlying mechanisms of manipulation in general (outside of the sales types, who are often looked upon as slimy and not deserving of trust).

A low-trust society is not a fun place to live in.


Hardly a surprise, given the nature of Spark and the benchmark's prerequisites. Comparing a positively ancient, distributed, JVM-based compute framework running on a single node with modern native tools like DuckDB or Polars, and all that on a SELECT from a single table: does it tell us anything new?

Even Trino runs circles around Spark, with some heavier jobs simply not completing in Spark at all (total data size up to a single PB, with about 10TB of RAM available for compute), and Trino isn't known for its extreme performance. StarRocks is noticeably faster still, so I wouldn't write off distributed compute just yet, at least for some applications.

And even then, performance isn't the most important criterion when choosing an analytics tool; more probably depends on integrations, access control, security, extensibility, maintenance, scaling, and support by existing instruments. Boring enterprise stuff, sure, but for those older frameworks it's all either readily available or can be added quickly with little experience (writing a Java plugin for Trino is as easy as it gets; a rough sketch follows below).

With DuckDB or Polars (if used as the basis for a data lake/warehouse, etc.), it may degrade into an entire team of engineers wasting resources on implementing the tooling around the tooling instead of providing something actually useful for the business.
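To illustrate the Trino point above, here is a minimal sketch of a plugin that registers one scalar SQL function. The function name and logic are hypothetical, but io.trino.spi.Plugin and the scalar-function annotations are the actual extension points:

    import io.trino.spi.Plugin;
    import io.trino.spi.function.Description;
    import io.trino.spi.function.ScalarFunction;
    import io.trino.spi.function.SqlType;
    import io.trino.spi.type.StandardTypes;

    import java.util.Set;

    // Discovered via a META-INF/services/io.trino.spi.Plugin entry in the jar.
    public class ExamplePlugin implements Plugin {
        @Override
        public Set<Class<?>> getFunctions() {
            // Expose the class holding our annotated function(s)
            return Set.of(ExampleFunctions.class);
        }

        public static class ExampleFunctions {
            // Hypothetical UDF: SELECT is_weekend(epoch_day) once the plugin is loaded
            @ScalarFunction("is_weekend")
            @Description("True if the given epoch day falls on a Saturday or Sunday")
            @SqlType(StandardTypes.BOOLEAN)
            public static boolean isWeekend(@SqlType(StandardTypes.BIGINT) long epochDay) {
                // ISO day-of-week: Monday = 1 ... Sunday = 7
                int dayOfWeek = java.time.LocalDate.ofEpochDay(epochDay).getDayOfWeek().getValue();
                return dayOfWeek >= 6;
            }
        }
    }

Roughly, you build this against the trino-spi version matching your cluster, add the service-loader entry, and drop the jar into the plugin directory on the coordinator and workers; that's about the whole ceremony.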


SteamOS has way more appeal to gamers in 2025 than it could have had in, say, 2004.

On the surface, the lack of popular multiplayer titles that require a kernel-level anti-cheat is a heavy downside, but gaming is extremely fragmented these days. In 2004 everyone, save for the casual players, at least tried DOOM 3 and Half-Life 2. In 2025 Fortnite has an all-time peak of 12M players, but at the same time there are many millions of Minecraft players who never even launched Fortnite. And Dota 2/LoL players who've never launched either of those two. And then you see a bunch of indie titles selling tens of millions of copies, and their player base is completely unrelated to those above.

The days of the gaming monoculture are long gone, and the inability to play a limited number of games-as-a-service titles is not as severe a handicap anymore, especially since people who play those kinds of games aren't typically as interested in any other titles. For better or worse, peer pressure doesn't work as heavily these days as it used to.


I was a heavy gamer in 2004 and never played HL2 or DOOM 3. I know many such people. I think games like Mario Party, Smash, and Mario Kart were far more ubiquitous.


That just sounds like all you had access to was a Nintendo console, not necessarily by your own choice. I missed out on all the early Zelda, Metroid, and Mario home console games because we were a PlayStation family until the Wii.


I played plenty of PC games such as Warcraft, StarCraft, and random stuff on Steam. I was just not much into FPSes (although TF2 was an exception). I also had all three consoles (all of my teenage paychecks went into games), but I think it was really Nintendo games that were commonly played by everyone I knew. Even if you didn't have one, you'd play them via local multiplayer at someone's house.


Saying “everyone” played those two titles is still incorrect. Personally I think the landscape was more fragmented then.


It absolutely is fragmented. Even though I own a Wii, I've never played Zelda or any Mario games, and I don't think I know anyone who owns a modern Nintendo. We all live in bubbles. And we change bubbles occasionally; I no longer play FIFA or CoD, mostly because of the kernel anti-cheat. I got bored of CS:GO. I play less gory games now because of family. We play fewer Lego games because we grow up.


Until the 2010s, PC gaming was fairly niche in US/Canada. Growing up I didn't personally know anyone that gamed primarily on a PC.

I feel like it was when Minecraft took off that people started investing in gaming PCs, kids were asking their parents for them, etc.


Your definition of heavy gamer, I think, differs from the norm if your main games were Mario Kart et al.


Yeah, I am pretty sure most heavy gamers in 2004 were knee-deep in MMOs and FPSes.


There isn’t a single one way to be a dedicated gamer.

Inevitably everyone has finite time and access to games and has to make choices about what to play.

As a Mac guy, I always found the game platform wars weird because even on the weakest gaming platform there are still more good games than anyone can individually play. And even on Windows, probably the strongest gaming platform, you’re still missing out on many significant games.

I totally understand buying a system because it has some game that you absolutely must play. I bought an OG Xbox back in the day because I thought I desperately needed to play Deus Ex: Invisible War when it didn’t come to Mac. Got burned on that one, but at least I had Halo before it came to Mac (and was in the end much better there than on Xbox due to expanded online multiplayer).

What I actually don’t get is folks who have to play the hot game of the week every week. Just seems expensive in terms of money, time, and space for different systems, and you only scratch the surface of the games.


What made you go with comparing things to 2004? It seems random; there is so much that is different in the Linux ecosystem generally, and Valve just put the situation on a rocket and shot it into space.

Point taken, it really is marvelous! When I was running Gentoo Linux and Windows 2000 back then, I never thought things would become so portable and simple!


> What made you go with comparing things to 2004?

I guess the HL2 release?

Steam launched in late 2003 and the first non-Valve Steam games appeared in 2005, so "thereabouts" can be a reason as well for the "Valve era".


> the lack of popular multiplayer titles that require a kernel-level anti-cheat is a heavy downside

It's a downside if all you want to do is play those games. But it's an upside if you're hoping they someday ditch all that nonsense. This puts more pressure on those publishers.


More likely is that some Linux distro like SteamOS gets a large enough install base that it actually makes sense as a target, and these big platforms make their anti-cheat work on at least that distro. As unfortunate as it is, not having a very strong anti-cheat, or a system like Valve's VAC bans to detect and lock cheaters out, leads to really shitty online experiences in public lobbies for PvP games.


Some anti-cheat works with Proton if the game dev allows it. But anti-cheats are generally not effective on Linux because you can just load your cheat as a kernel driver.


What kind of device and kernel-driver attestation is possible on Linux at present?


Secure Boot, signed drivers, attestation: it's all possible. But you can just sign your own driver anyway, so it's kinda useless.

It might be possible with a more secure mode that you boot into when you launch a game, one that only allows specific drivers and programs, like the game and maybe Discord.


Your comparisons are a mess.

"Casual player" is very poorly defined.

You are comparing concurrent players with unique players (IIRC half a billion for Fortnite?)

"Many millions" hardly means anything when you use it to cover 3 orders of magnitude.

And so on and so forth...


True. Things were better the old way with so many kids at least having a video game like Melee or CoD or Halo in common. I would've liked those to run on Linux, but that doesn't matter so much.


Eh, multiplayer games are doomed.

Computer-vision-based cheats already exist: an external machine records the game's final rendered frames, processes them with specialized YOLO models, and controls "mice" and "controllers" to aim for you.

If the aim of kernel-level anti-cheats was to combat cheating, they have failed and are completely worthless.


You don't need an external machine. Since games are set up to allow Twitch etc. streaming, it's easy for apps on the same machine to get access to the video.


That's like saying online banking is doomed because rubber-hose cryptanalysis exists. The defense does not have to stop 100% of the exploits to be effective.

I hate kernel-level anti-cheats, but they do provide friction and reduce cheating.


There are open source solutions out there. If anything there is less friction now.


CC barely manages to follow all of the instructions within a single session in a single well-defined repo.

'You are totally right, it's been 2 whole messages since the last reminder, and I totally forgot that first rule in claude.md, repeated twice and surrounded by a wall of exclamation marks'.

I would be wary of trusting its memories across several projects.


Create an instruction.md file with a YAML-like structure on top. Put all the instructions you are giving repeatedly there (e.g. "a dev server is always running, just test your thing", "use uv", "never install anything outside of a venv"). When you start a session, always emphasize this file as a holy bible to follow. It improves performance, and every few messages keep reminding it. That YAML summary on top (see the skills.md file for reference) is what these models are RL'd on, so it works better.
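For illustration, such a file might look something like this (the names and rules are just placeholders built from the examples above):

    ---
    runtime: python
    package_manager: uv
    dev_server: already running on localhost, never start another
    test_command: uv run pytest
    ---

    # Standing rules
    - Never install anything outside of a venv; use uv for all dependencies.
    - A dev server is always running; just test your change against it.
    - Run the tests before declaring a task done.

The short YAML summary at the top is what you keep pointing the model back to; the prose rules below are the detail it can re-read when needed.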


This should not really be necessary and is more of a workaround for bad patterns / prompting in my opinion.


I agree it's a workaround. Ideally the model should follow instructions directly, or check whether a server is already running before starting another one. Though training cannot cover every use case, and different devs work differently, so I guess it's acceptable as long as it stays on track and can do the work.


How big is your claude.md file? I see people complain about this but I have only seen it happen in projects with very long/complex or insufficient claude.md files. I put a lot of time into crafting that file by hand for each project because it's not something it will generate well on its own with /init.


I always just tag the relevant parts of the codebase manually with the @ syntax and tell it to create this, add unit tests, then format the code and make sure it compiles. There is nothing important enough, in my opinion, that I have felt the need to create an MD file.


Where can I find docs about Claude @ syntax?


I think the parent comment is simply referring to “@“-ing files in the chat.

So if you want CC to edit “file.R”, the prompt might look like:

“Fix the function tagged with ‘TODO-bug’ in @file.R”

That file is then prioritized for the agent to evaluate.


Also, I am confused by the “wall of exclamation marks”. Is that in the Claude.md file or the Claude Code output? Is that useful in Claude.md? Feels like it’s either going to confuse the LLM or probably just get stripped.


When I first got started with CC, and hadn't given context management too much consideration, I also encountered problems with non-compliance with CLAUDE.md. If you wipe context, CLAUDE.md seems to get very high priority in the next response. All of this is to say that, in addition to the content of CLAUDE.md, context seems to play a role.


At what point does futzing with your claude.md take time equivalent to just writing the code yourself?


What's the right size claude.md file in your experience?


My experience is with Copilot, and it uses various models, but the sweet spot is between 60 and 120 lines, with pseudo-XML tags between sections.

It might be different across platforms due to how stuff is set up, though.


My AGENTS.md is 845 lines and it only started getting good once it got that long. I'm still wanting to add much more... I'm thinking maybe I need a folder of short doc files and an index in AGENTS.md describing the different doc files and when to use them instead.


I know Copilot supports nested agent files per folder.


Very long OR insufficient. Ah yes, the Goldilocks Claude.md.


Yep -- every message I send includes a requirement that CC read my non-negotiables, repeat them back to me, execute tasks, and then review output for compliance with my non-negotiables.


More accessible by default than modern Electron-based apps.


In every hobby there are two groups of people: those who enjoy the actual act of doing the thing, and those who enjoy the tooling (equipment, methodologies, discussions around the thing, etc.).

Not saying that one is inherently more worthy than the other, but, no surprise, the first group is usually better at actually _doing_ the thing.


The difference is, AMD wasn't a competitor to ATi. One mostly built CPUs, while the other built GPUs. These two, on the other hand, compete in several major product categories. Overall, not a good look.


I doubt we would be seeing Dell selling NVidia ARM CPUs anytime soon.

However, I do imagine Intel GPUs, which were never great to start with, might be doomed long term.

Another possibility is that there goes oneAPI, which I doubt many people would care about, given how many rebrands SYCL has already gone through.


>One mostly built CPUs, while the other built GPUs.

I mean that also applies to Intel and Nvidia. Intel does make GPUs but their market impact is basically zero.


Anki helps most at the beginning of the language journey. With a vocabulary of zero words you won't be able to read anything at all, and there's nothing to talk to an LLM about yet.

Those 'useless' static cards are extremely efficient for learning the first 2,000-3,000 words, which is key to being able to start reading. After about 4,000 there's little sense in using SRS anymore, and then I'd rather spend more time with an actual book, but getting there with Anki felt like using a cheat code compared to how I learnt my first foreign language. It's not exciting, it's pure toil, but it does work.

And when it comes to the next stage, I can't imagine how random LLM-generated texts are better than, say, graded readers or real books. Most people would likely find it more interesting to spend an hour or two a day following an exciting story and characters they care about, and it's (based on a sample of one) way easier to memorise all of those new words when there's an emotional connection to each one (just as we form associations between words and experiences while growing up).

As for the app itself: I have tried it with my native language, and at the advanced level it produced sterile and slightly unnatural text with the complexity of typical fiction. If someone can read that, I don't see why they would bother. At the beginner level, the app generated a couple of news stories which, though grammatically simple, used vocabulary that I would never recommend to a novice. Local news of the "a firefighter saved a kitten stuck in a tree" variety is much more useful for that kind of learning, and you can get it from any free newspaper.

LLMs are extremely useful for learning foreign languages, but I feel like this isn't the way to go.


With the number of Steam Decks sold estimated at 3-4 million, and the number of monthly active Steam users at around 130 million, Decks alone would account for roughly 2-3% of users, so I think it's safe to say that 0.21% does not represent the SteamOS install base. As far as I know, SteamOS doesn't show up as Arch anyway, but rather as its own thing.

