Fun rabbit hole here about the III FR80, the computer used to make the effects. 1970s computer that could scan and print to film stock. There's a great website that's preserved a lot of info on this computer here: http://www.chilton-computing.org.uk/acl/technology/fr80/over...
III (or Triple-I) was a computer company founded by Edward Fredkin. He's a fascinating early computer scientist, doing pioneering work in cellular automata, reversible computing, as well as more mundane things like the trie data structure. https://en.wikipedia.org/wiki/Edward_Fredkin
ZOMG - a brain neuron from decades ago just woke up - I worked on some weird stuff in reversible computing in the '80s (think of assignment always preserving the functional state at the point of assignment) - my boss used to talk about his gates, usually referring to them by a more profane name. :-)
I've wondered if reversible computing research has had any real world impact. The theory is it would reduce heat since you're not increasing entropy, but I wonder in practice if that has ever been useful.
All the experience I had was theorem proving for crypto apps - that said, I think there is only the appearance of reversibility at the symbolic level; underneath it's the same old mess. I vaguely recall that at full tilt, the room with all the Suns would get quite toasty. :-)
The title sequence, created by John Whitney, right?
His brother James' 1966 abstract piece 'Lapis' is a favourite of mine, created using John Whitney's analog computer tech. There's a version on youtube[1], sadly only 480p, which gives you an idea, but if you see it projected it's just amazing.
I'd always thought Tron was the film with the first CGI, but I've just been reading https://www.empireonline.com/movies/features/history-cgi/ and apparently it wasn't even the first film to use 3D CGI. Tron did, however, make extensive use of it.
I know this may sound weird, but only the first 15 minutes of the movie and a few other small scenes were CGI. Most of the movie used old-school film effects and hand-drawn cel animation. But the CGI in Tron was mind-blowingly advanced for the era, including the first CGI facial animation.
I had always heard that the first 3D CGI was done for Tron, but that the first to arrive in theaters was Star Trek II: The Wrath of Khan. I was told that the Genesis explosion was the first fully computer-generated scene in a film.
I love the era of people making practical effects to look like computer graphics. The Hitchhiker's Guide TV show also had hand-drawn animation that was supposed to look like computer graphics.
Max Headroom was such an awesomely weird show. And somewhat prophetic, too: the constant media barrage, the networks' power over the people - the show was 30 years ahead of its time.
The series predicted its own demise. A lot of the show's critical focus was on the mistake TV networks make by using on-demand demographic (ratings) data to cancel shows mid-season over slight drops (feeding a shorter attention span). It's considered among the first shows canceled over a short but predictable mid-season dip in its ratings: a fad show lost to a short attention span that didn't see the depth below the surface memes.
I believe Matt Frewer has said in interviews that he'd never wear the makeup and prosthetics again (it nearly killed him a few times, and he's in his 60s now). That said, given how far Hollywood's de-aging VFX and motion capture have advanced, I wonder if we've nearly hit the point where VFX could do Max Headroom "for real" with motion capture. It would be funny to do with VFX today what was practical effects pretending to be VFX back then.
A chore to sit through?! Yes, it looks dated, but it's still an amazing film. It is basically a Terminator film before Terminator: a chase film with an unstoppable machine.
They really aren't comparable, but the HBO series has been such a disappointment. Yes, it looks good, but the story points are really weak. I'm still watching it, but my excitement has dropped a lot.
I wasn't as impressed by the remake; I thought it was pretty obvious the writers ran out of source material after the first season. It's one of those stories that would have worked as an episode of Black Mirror, and you could stretch it out to a movie format, but I don't think it lends itself to multiple seasons of hour-long episodes. Remember the '70s TV spinoff Beyond Westworld? Don't feel bad if you don't.
The second season was directly teased from the first season. (I called a lot of the reveals of the second season assuming they'd be twists in the first season.) The problem with the writing in the second season was assuming enough people had read at least the Wikipedia summaries of the primary references they had cited directly and indirectly in the first season. Which is a tricky problem with "mystery box" writing in that you assume at least some of the audience will do the research, but you have to keep the show accessible enough for the people following the show but not the mystery solving / ancillary materials.
So far the promos for S3 make it sound a lot more straightforward with less "required reading", but we'll see.
Surely this would have been easier to recreate with practical effects? You can get the same pixelated effect looking through translucent privacy door screens.
Westworld wasn't just a fantasy, they were doing 'real' science fiction in that they were saying look, this is happening. Here's what computers can do right now.
There are plenty of examples of faked CGI back in the day. As has been pointed out, alongside some real CGI, TRON had a lot of hand-drawn cel animation. The 'wireframe' of the skyscrapers in Escape From New York was done by building them as black-painted models edged with neon strips. In both cases it is possible to tell.
My lab purchased a digital color frame buffer in 1980 for $30K. That would be around $150K in current prices. It had a screen buffer of 512 by 512 at one byte per pixel, i.e. a quarter megabyte of memory. It was dual-ported: write to memory, read out to the display. Memory price was the bottleneck then. Four years after that the Apple Mac came out with 1/8 megabyte of B/W memory for $3K, arguably the first consumer graphics computer. Again its price was memory-bound.
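The memory arithmetic above checks out; a quick sketch (just arithmetic, using only the sizes stated in the comment):

```python
# 1980 frame buffer: 512 x 512 pixels at one byte per pixel.
width, height, bytes_per_pixel = 512, 512, 1
buffer_bytes = width * height * bytes_per_pixel
print(buffer_bytes)            # 262144 bytes
print(buffer_bytes / 1024**2)  # 0.25 - a quarter megabyte

# The original Mac's 128K of RAM for comparison.
mac_bytes = 128 * 1024
print(mac_bytes / 1024**2)     # 0.125 - an eighth of a megabyte
```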
This channel is full of interesting content. It's sad it didn't last very long (something like 4 months between the channel creation and the last video).
I just didn't have the time to keep making these videos. The research alone took a huge amount of work, and then there was writing the script, recording, editing, and finally fighting YouTube's copyright mechanism, which won't recognize what I'm doing as fair use.
The computer displays were showing films, yes. It is unclear to me that the wireframe animations were not computer generated; I always thought they were.
That is, for the wireframe animations when a ship is docking, I always thought that the animation was in fact a film of a series of discrete computer generated vector images, maybe displayed on a CRT and frame-by-frame stop-motion animated. It looks like that to me, and the tech was clearly available at the time.
Maybe they were drawn by hand and not CRT vector images, but if so they went out of their way to simulate it. I guess I wouldn't put it past Stanley.
For Escape from New York (1981), it was cheaper to build a New York skyline set, paint it black and then paint all the edges white and film it, than create CGI wireframe graphics.
The computer-screen graphics in 2001 have a look that is very common for hand-drawn film elements of that era. The slightly blurry edges of the lines are pretty typical of this process.
Compare that to the mission briefing in Star Wars Episode IV, where the scientist shows a wireframe animation of the attack flight path. It occupies just a tiny area in the background of the final frame, but it was created by a computer. It took several seconds to draw each wireframe onto a screen, which was then recorded by a film camera triggered to advance frame by frame by the computer. There is a making-of film for this sequence somewhere. This is the process you suggest, but even about ten years after 2001 it was only barely possible.
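That capture process - draw one slow vector frame, expose one film frame, repeat - can be sketched as a loop. This is just an illustrative simulation; the function names are mine, not from any real system:

```python
def render_wireframe(frame_index):
    # Stand-in for the seconds-long vector draw on the CRT.
    return f"wireframe {frame_index}"

def capture_sequence(num_frames):
    """Simulate the computer driving a single-frame film camera."""
    film = []
    for i in range(num_frames):
        image = render_wireframe(i)  # slow draw completes first...
        film.append(image)           # ...then the camera exposes one frame
    return film

frames = capture_sequence(24)  # one second of footage at 24 fps
```

At several seconds per frame, even a few seconds of finished animation meant hours of machine time, which is why the technique was so rarely used in that era.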