homebrewer's comments | Hacker News

Coccinelle for C, used by Linux kernel devs for decades; here's an article from 2009:

https://lwn.net/Articles/315686
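
If you've never seen one, a semantic patch is basically a unified diff with metavariables. A rough sketch from memory (close to the classic example of dropping the redundant NULL check before kfree):

    @@
    expression E;
    @@
    - if (E != NULL)
    -         kfree(E);
    + kfree(E);

Coccinelle matches that against every C file you point it at and rewrites all the call sites, regardless of formatting or variable names.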

Also IDE tooling for C#, Java, and many other languages; JetBrains' IDEs can do massive refactorings and code fixes across millions of lines of code (I use them all the time), including automatically upgrading your code to new language features. The sibling comment is slightly "wrong" — they've been available for decades, not mere years.

Here's a random example:

https://www.jetbrains.com/help/rider/ConvertToPrimaryConstru...

These can be applied across the whole project with one command, fixing however many occurrences there are.
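
To give a concrete idea of what that particular inspection does (my own toy example, not Rider's actual output), it rewrites roughly this:

    // before
    public class Greeter
    {
        private readonly string _name;

        public Greeter(string name)
        {
            _name = name;
        }

        public string Hello() => $"Hello, {_name}";
    }

    // after: C# 12 primary constructor
    public class Greeter(string name)
    {
        public string Hello() => $"Hello, {name}";
    }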

JetBrains also has "structural search and replace", which takes language syntax into account; it works at a higher level than the plain-text search you get in text editors and pseudo-IDEs (like VS Code):

https://www.jetbrains.com/help/idea/structural-search-and-re...

https://www.jetbrains.com/help/idea/tutorial-work-with-struc...
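
Roughly (from memory, and the placeholder name is mine), a search template is ordinary code with $placeholders$ in it, and the replacement reuses them, e.g. to migrate size() comparisons to isEmpty():

    search:   $list$.size() == 0
    replace:  $list$.isEmpty()

Because it matches the syntax tree rather than text, formatting, comments and nesting don't matter.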

For modern .NET you have Roslyn analyzers built into the C# compiler, which often come with associated code fixes, though AFAIK those can only be driven from the IDE. Here's a tutorial on writing one:

https://learn.microsoft.com/en-us/dotnet/csharp/roslyn-sdk/t...
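
The skeleton is pretty small. A hand-wavy sketch (the diagnostic id, the message, and the "empty catch" rule are all made up for illustration, not taken from the tutorial):

    using System.Collections.Immutable;
    using Microsoft.CodeAnalysis;
    using Microsoft.CodeAnalysis.CSharp;
    using Microsoft.CodeAnalysis.CSharp.Syntax;
    using Microsoft.CodeAnalysis.Diagnostics;

    [DiagnosticAnalyzer(LanguageNames.CSharp)]
    public sealed class EmptyCatchAnalyzer : DiagnosticAnalyzer
    {
        private static readonly DiagnosticDescriptor Rule = new(
            "DEMO001", "Empty catch block", "Catch block swallows the exception",
            "Usage", DiagnosticSeverity.Warning, isEnabledByDefault: true);

        public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics
            => ImmutableArray.Create(Rule);

        public override void Initialize(AnalysisContext context)
        {
            context.ConfigureGeneratedCodeAnalysis(GeneratedCodeAnalysisFlags.None);
            context.EnableConcurrentExecution();
            // Called for every catch clause in the compilation.
            context.RegisterSyntaxNodeAction(Analyze, SyntaxKind.CatchClause);
        }

        private static void Analyze(SyntaxNodeAnalysisContext ctx)
        {
            var clause = (CatchClauseSyntax)ctx.Node;
            if (clause.Block.Statements.Count == 0)
                ctx.ReportDiagnostic(Diagnostic.Create(Rule, clause.CatchKeyword.GetLocation()));
        }
    }

Pair it with a CodeFixProvider and the IDE can offer the rewrite automatically.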


Scary to see how many people replied to LLM slop without realising it.

Even simple Half-Life 1 mods built on textures and models from Half-Life 2 look much closer to 2 than one would expect. This mod, for example (though it's not the only one):

https://moddb.com/mods/half-life-dark-future

You won't confuse it with modern Half-Life 2, but the original HL2 engine had far worse graphics than the latest version. Makes you realize how much of the difference between HL2 and HL1 is due to different textures and level design.


And Viktor Antonov's (RIP) art style.

edit: there is also the fact that map compilers for GoldSource games have advanced far beyond what they could do back in 1999. The lightmaps and light sources alone can be far more intricate nowadays than what you would get from the official Valve tools back then.


The other thing though is that Original Quake Back In The Day ran on a Pentium 75 (it leaned heavily on the floating-point unit) with a dumb framebuffer. All the rasterising of polygons was pure software, as was all the geometry processing. Running GLQuake was a huge improvement but it required an expensive add-in card that piggybacked onto your VGA card, and a whole different binary.

Now you can just kind of pile it into a block of RAM, aim a chunky ASIC at it, and pull the trigger every frame.

In the late 90s a mate of mine made a phenomenal video of a Quake demo (you could record all player movements and camera positions as a ".dem" file) that he'd rendered out, raytraced in POV-Ray. I printed it to VHS for him as part of a showreel, and never thought to keep a copy myself.


While lighting is important, not using halflife.wad and going above the original budget of 500 polys per "scene" is what makes modern works look much better.

Most of the original textures are under 128×96 px and some suffer from awful palettisation artefacts with purple and orange halos. We still cannot use more than 8 bpp but we can use 512×512 textures and do a better job at reducing to 256 colours. I use pngquant for that.
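
For reference (flags from memory, check the man page), the invocation is as simple as:

    # quantise down to 256 colours at the slowest/best-quality setting
    pngquant --speed 1 256 --output texture_256.png texture.png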

In GoldSrc, lightmaps cannot get more intricate though: they're tied to the texture scale, so you cannot get a finer lightmap unless you also make larger textures and scale them down, and these two combined will wreck your "AllocBlock" budget, into which all your textures and lightmaps must fit.

ericw-tools and its dirtmapping are still welcome improvements over the "traditional" *HLT compilers.
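
If I remember the switches right, dirtmapping is just an extra flag on the light stage of the usual pipeline, roughly:

    qbsp mymap.map
    vis mymap.bsp
    light -extra4 -dirt mymap.bsp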


> In GoldSrc, lightmaps cannot get more intricate though: they're tied to the texture scale, so you cannot get a finer lightmap unless you also make larger textures and scale them down, and these two combined will wreck your "AllocBlock" budget, into which all your textures and lightmaps must fit.

AFAIK some of the improvements include much better light-bouncing techniques, transmission of surface colors like Source does, more accurate lights, spotlights that emulate what Source spotlights do, and faster compilation (computers also got faster, and MT support helps a lot). That alone allows level designers to be more ambitious, taking advantage of faster iteration to place even more lights.

I do agree that there are likely dozens if not hundreds of reasons why maps can, and usually do, look way better today than what could be done in the past. Hell, even level designers' growing proficiency with the tools over time is surely also a reason.


I think the biggest reason is just better hardware. In 1998 many of the props were just blocky level geometry.

I used to do a bit of mapping back then (nothing that survived to this day, thankfully); as I recall, practically nobody used the official map compilers. As often happens, the community wrote replacements that were much faster for debug "-O0" builds, and generated lightmaps of significantly higher quality for the release "-O2" builds.

It was either ZHLT or VHLT, or something like that; looks like more alternatives have been written since then.

https://gamebanana.com/tools/5391

https://github.com/seedee/SDHLT


The lighting is one of the main areas that really improved a lot.

For standard Q1 mapping, ericw-tools [0] is great (the page has some nice previews).

This project seems to use Nuclide for building, which by default uses the vmap compiler [1][2]. That's really Q3 tooling, but I think FTE handles it well internally, as the newer format has some more modern features.

> Powerful BSP compiler. Use VMAP to bake levels like you're used to from similar engine technology, with high quality lightmaps, cubemap-based environment mapping and adjustable vertex colors on spline-based meshes.

[0] https://ericwa.github.io/ericw-tools/

[1] https://developer.vera-visions.com/d4/d50/radiant.html#autot...

[2] https://github.com/VeraVisions/vmap


There was a similar path with Unreal3. The lighting in the early games (2006) looks quite harsh by modern standards; one of the highlights of Mirror's Edge (2008) was DICE using Illuminate Labs' third-party "Beast" lighting, and then Epic moved to "Lightmass" around 2009 with the public UDK toolset.

The Z from ZHLT ended up working for Gearbox Software.

A shame to only now learn of Viktor Antonov's passing. His work on HL2 and Dishonored remains some of my favorite examples of video game world building of all time. These places felt real and lived in, in a way few other video games have matched for me.

Yeah Cry of Fear really pushed the GoldSource engine to its limits (I think it implemented a custom renderer but the models just push the base engine's limits with regards to maximum polygons and texture sizes).

Don't; I'm pretty old myself, and I've only a vague idea of what gopher is because it was never used in this part of the world, and internet access also came pretty late. Maybe GP is in a similar position.

Looking at what they did to commercial software that used to have excellent, high density UIs, maybe they should stay where they are.

zram has been "obsolete" for years; I don't know why people still reach for it. Linux supports proper memory compression in the form of zswap:

https://wiki.archlinux.org/title/Zswap
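
Enabling it is a one-liner; from memory (the wiki above has the details), either a boot parameter or a runtime toggle:

    # on the kernel command line:
    #   zswap.enabled=1 zswap.compressor=zstd
    # or at runtime:
    echo 1    > /sys/module/zswap/parameters/enabled
    echo zstd > /sys/module/zswap/parameters/compressor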


I didn't realize zswap also uses in-memory compression. It might be a combination of poor naming and zram being continuously popular.


It is not obsolete. It's also useful for other things.

Because I'd rather compress RAM when running low on memory than swap to my disks. zram is also the default on some distros (e.g. Fedora).


Did you read the link? Additional disk swap is optional, and if for some reason you'd still like to have one, it's easy to disable writeback so that only RAM is used.

And even if one enables zswap and configures nothing else, compressing RAM and only swapping out to disk under extreme pressure is still the default behavior.


> Additional disk swap is optional

Where did you read that?

My experience has been that not only is the pool size in reality limited by the amount of swap space you've allocated, the data is also uncompressed again when it's offloaded there, which is ridiculous when you think about it.


I presume you mean the Oracle cloud?


Nobody is happy with Oracle anything! It has some users because it is free. It has paid users because Larry Ellison bribed the government. Nobody would choose it voluntarily.


No, gcp. Was a happy customer for many years, now I work there.


You don't have backups if you only have one "backup". Look up the sysadmins' 3-2-1 rule.


No, the right way to do this on Linux and FreeBSD is to use ZFS with zfs send/receive. Creating snapshots and sending them is efficient enough that ZFS can serve as the underlying storage for moderately loaded databases and VMs.

They are atomic and require zero downtime. They can be encrypted and resent to other machines. Cloning whole machines from them is easy and efficient.
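
A minimal sketch of what that looks like (pool, dataset, and host names are made up):

    # atomic, instant snapshot of a running dataset
    zfs snapshot tank/db@2024-06-01

    # full copy to another machine
    zfs send tank/db@2024-06-01 | ssh backuphost zfs receive backup/db

    # a week later: send only the blocks that changed in between
    zfs snapshot tank/db@2024-06-08
    zfs send -i tank/db@2024-06-01 tank/db@2024-06-08 | ssh backuphost zfs receive backup/db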


I don't ever edit the English Wikipedia because my English is not nearly up to the standard, and suggestions for improvement (worthwhile IMO) are usually ignored. Grok at least won't ignore you. (I tend to post suggestions to unpopular pages with sparse edit history, which is probably the reason they go unnoticed.)


I used to frequent IRC channels and forums where no such thing as an old question existed. Someone would ask an interesting question on IRC and days or weeks later a response would appear. On forums the response could be more than a year "delayed". Gradually things shifted to newer new new news that couldn't possibly be new enough. Then debates happen where people sometimes link to the vastly superior olds. Wikipedia finally caught up and questions are no longer ignored. Instead they are archived long before an ignored status could be earned.

