When people talk about the Waterfall Model, they forget the context of the era it originated in. I worked on a waterfall project early in my career. Keep in mind that back in the 90s most software companies didn't have CI/CD, there was no cloud, and releases were hand-delivered to the customer, usually rolled out about every 6 months, because agile simply wasn't technically possible yet for most software shops. Waterfall was a valid method back then given the technical (and cultural) limitations.
Not having lived through that era, I'm not sure today's CD is better. Too often the web breaks features people rely on. Worse, it happens randomly, with no notice and no transition period while you update. I miss the stability of knowing things will keep working the way they have for a long time, and that when they break, it's at a time I chose to apply the upgrade. (It didn't always work out this way; random bugs were a thing and still are.)
Quality control also mattered far more: you couldn't ship something riddled with bugs, because customers who hit one would be forced to look at your competitors.
Meanwhile, the Royce paper that takes down the simplistic series of steps arranged in a waterfall is a sort of formalization of an approach better visualized in the late 80s as "the spiral model."
The concept behind both, well before the 90s, is that you don't know what you don't know, so it's faster, cheaper, and more likely to succeed if you build, learn from, and document prototypes before committing, in order to establish what you're really making.