A whole website in a single JavaScript file (deno.com)
280 points by lambtron on April 6, 2022 | 161 comments



Since a bunch of people are setting up a straw-man to criticize this post for "not just serving plain HTML" I'll share my opinions on this.

Almost nobody is going to use Deno to serve a basic HTML site with fewer than a dozen pages; they are going to use it to build a monolith application to get content from a database, or a service, then generate server-rendered pages, and also host an API from the same codebase, or something similar.

Setting up a monolith application means using something like Ruby-on-Rails, Django, Spring, ASP.net, or rolling your own with a Node.js backend serving an SSR front-end, or hydrating React views.

If you haven't experienced this already, you will come away with one of two conclusions.

1. Wow, this is so much fun, as long as I stay on the happy path of ... Rails ... Django ... ASP.net.

2. Wow, setting up all these moving parts really sucks; I'm mostly writing configuration files, and then Googling the error messages when I inevitably screw something up.

What I think Deno is trying to do is make the process of getting a server-side rendered application with modern tooling up and running with minimal ceremony, while still enabling the developer to customize the system and build things that are not covered by the documentation guides. In addition, their solution is one that can be more easily hosted on edge servers than most other options out there.

I'm glad they are doing it, because it's a sorely lacking area of modern web development. The only other people I am aware of who are taking this on in a meaningful way are Remix. I would be happy for there to be more entrants into this field.

Best of luck to everyone out there.


Your focus on things being "modern", which is no indicator of value except for hipsters and recruiters, made you forget about the third conclusion with which you might come away from setting up a monolith application:

3. Wow, this actually works. I am so productive and I can actually get my work done quickly, the same way as hundreds of thousands of other normal developers out there. And then I can have free time and a mind free of stressing over what's cool or not.

It's about what you value. If it is your time and happiness, if you mainly want to just build stuff to solve real-world problems, then choosing a good old monolith application will get you there safely and quickly in 90% of situations.

If you value being hip and cool, then Deno, hydrating whatever, edge servers, and sorely lacking areas of modern web development will 100% get you there.


My thoughts weren't really meant to be a condemnation of monolithic web app frameworks, just a summary of my conclusions after using them for years.

The reasons I have for justifying use of React, JSX, GraphQL, etc. have nothing to do with being "hip and cool" and everything to do with happiness and productivity. Using modern tools is both more enjoyable and more productive in my experience, as someone who used Ruby-on-Rails with templated pages for four years.


“Modern” also means the solution may benefit from reflection on past solutions.

The accounting software I use, written in 2004, evolved over the decades starting from 1980s software. It all works (hence, I’m using it now), but comparing the user experience and source code of various features developed at various times reveals a lot about what “modern” means.

Here are a few things:

Standardisation: Modern software tends to use proven designs, such as SQL over a bespoke query language.

Performance: The strictness of ABIs and APIs makes it difficult to restructure for the sake of optimisation without introducing undesirable breaking changes. Old codebases may be essentially “frozen” and stuck on old versions of libraries. Fresh software can use the latest versions of libraries, with their new APIs & ABIs and whatever optimisations come along.

Fundamental coding improvements: Old languages don’t have ergonomic, performant closures (see lambda functions in Python). New languages do (see arrow functions in ES6). Replace “closures” with any core language feature, like null coalescing, match statements, object & array destructuring (or even better: pattern matching assignment), or hygienic macros. (Let’s not discuss async/await as that doesn’t lie in the “unquestionably better” column.)

Better error messages & debugging: Over time, we collectively as developers have figured out what helps and what hinders when trying to track down troubles.

——

I also build stuff that solves real world problems. I once used Python and Django for everything. Now, I’ve moved to Node.js, TypeScript, React, and generally that ecosystem. It works far better than Python+Django. My development speed is blazingly fast in comparison, and the results stand up — they’re shippable.

Once I was familiar with Python & Django, they got the job done.

Now, I’m familiar with TypeScript & Node, they get the job done better.

Familiarity is the invisible force whose absence prevents us from distinguishing things that don’t work from things that do work but are merely different from what we know.


> I am so productive and I can actually get my work done quickly, **the same way as hundreds of thousands of other normal developers out there.**

I seek out new technologies because I want to be _more_ productive than the average engineer. I don't believe there are hundreds of thousands of developers who are more productive than someone who makes an effort to learn stuff like deno, vite, unjs, fauna, etc. as soon as possible.


There is no need to manually set up your own Node.js SSR framework for React. Next.js exists, and is quite mature at this point. Next.js is quite fun. Highly recommend it.

The novel thing, for this, in my mind, is the edge hosting.


But just like all other frameworks, the cost isn't just getting it started, it's learning the framework, learning the ecosystem, and slowly building up your knowledge of the edge cases when you start moving past its limits.

Next.js puts you squarely back in monolith server-side territory in that regard, it's just a different flavor. Of course it has its upsides if you're doing a purely React application. But you've now got all the complexity of a server-side monolith and a frontend framework, with the added bonus complexity of components that have to switch between client and server-side contexts, then state management and hydration across that boundary as well.

Now people are going to say "Ha, idiot, you just simply..." but that's just part of the learning, so you're in no better place with Next.js/SSR in general unless you want to build an interactive application. This isn't against Next.js by the way, I did enjoy it when I needed it, it's just that there's no avoiding some of the inherent domain problems regardless of which framework you pick.


I wonder if this is why I keep reverting to PHP: most stuff I can build without frameworks, or with just the odd library, because I know how.

Every time I give myself some free project experiments (I have a local Pi for tinkering), I always feel it’s unnecessarily complicated, or I feel that hill I have to climb, at which point I can already dream up the solution in PHP with frontend JS/HTML.


I've passed over that learning hump, put some projects into production with React, NextJS, and Vue, and I still think PHP + native JS/HTML is the more pragmatic tooling choice. It's simpler, more robust, and easier to debug. I've been experimenting with some other backend frameworks too, specifically Node and Ruby stacks, and the PHP ecosystem and language are ahead by a country mile.

I think if you want to pick a pragmatic toolkit to work with long term and kick the churn to the curb, PHP is a great choice for web applications and will continue to be for a long time thanks to its diaspora.


PHP (or, in my case these days, Deno serving templates, typically nunjucks) with https://htmx.org/ in them for interactivity gets you productive, quickly, for a lot of the tasks I need to do.

That said, React makes building even more interactive, complicated things simple, and is easy to hire for, so that's what we use at work.

I do think the world is still ripe for a PHP-like language. The template-style webserver-aware way of doing things, with modern features. Maybe one day.


Definitely my biggest concern working with nextjs so far is that the line between when something will execute server side and when it will execute client side is a bit blurry until you test it. It's manageable but requires some trial and error IME.


That was problematic for me when I was testing it out. Can’t access the window object without excessive workarounds that I couldn’t make work with my project. I ended up reusing existing React components and moving it all to browser-rendered React, and even that experience has me leaning toward moving it to Vue or dropping the framework altogether and grinding everything out in CSS again. That said, I did enjoy using styled components. I found that pattern pleasant to reason around.


Is it Node’s Rails or Django? That’s roughly what we want to encourage. It’s been a long time coming.

Channeling my inner Cicero, Fuck Rust.


Yep, I've been using Next.js for several years and it's mostly great. One thing that I found is that Next.js still suffers from configuration fatigue, this was especially true before the team moved from Webpack to SWC. I'm glad to see that the team is moving more in the direction of "configured by default", but you still inevitably need to fiddle with the dials if you want to use things outside of the documented tool integrations.


ASP.NET Core is pretty minimal in ceremony nowadays. You can easily do single file web development.

https://github.com/dodyg/practical-aspnetcore/blob/net6.0/pr...


> Almost nobody is going to use Deno to serve a basic HTML site with less than a dozen pages,

Assumes facts not in evidence. People are already building JavaScript monstrosities to serve entirely static blog content.


> monstrosities

why do you have to sprinkle that snark + negativity in there? it implies a toxic role of "you are superior" and "people who use JavaScript to serve static blog content" are inferior.

why can't we all just get along? especially in this tight-knit programming community that is supposed to be full of love and collaboration. in today's modern day of inclusion and emphasis on mental health, you're spreading hate towards people who use JavaScript to serve static blog content and talking down to them. not very 2022 of you, in my opinion.

let's work to get rid of the culture where you imply somebody else's code project doesn't meet your standards, and that since they wrote it, they are dumber than you for making poor design decisions in your opinion.


> especially in this tight-knit programming community that is supposed to be full of love and collaboration

You must be new here.


While I likewise don't like the tone of the previous poster, I think that any field should have a healthy dose of "elitism" and "competition", and the previous user is right.


Why do you think the previous poster is right? I personally think they are being sort of silly.

Don’t get me wrong, I absolutely think you should consider your tools, but I recently had to get a screw out of a bookcase. It was stuck in free air and was just going round and round, so I ended up using a hammer to push it in the right direction. This is far from the only time I’ve used my hammer for something it wasn’t intended for, and it got some new scratches on the side, but at the end of the day, the screw got out.

Which is important to remember here on HN, because this isn’t really a “programming community”, it is (or was perhaps) a platform for investors to sneak a peek at interesting opportunities, masked as social media for hackers. And the ”monstrosity” was launched after all, which is the most important aspect of any product.

Aside from that, I think it’s reasonable to assume that a lot of smaller and personal websites are used as learning experiences. So naturally a lot of them are going to be built “suboptimally” because the reason the tool was chosen was the tool itself.


Should have used some pliers...


> full of love and collaboration

Why do you have to sprinkle that positivity in there? It implies a toxic role of “I am naive”.

Ah, I’m just jerking your chain. For the most part I agree with you. Calling the way other people do things a “monstrosity” really is uncalled for.


Having to support these "monstrosities", I can't see it as toxic but merely as a way of life...


>"tight-knit programming community that is supposed to be full of love and collaboration."

Now you owe me a coffee and a monitor.


>People are already building JavaScript monstrosities to serve entirely static blog content.

I think we just have to accept that that's how websites are built now. It drove me nuts for a while, too. But modern JS engines are blindingly fast, and 2-3mb of JS download (that will be cached aggressively) is a non-issue for the vast majority of users.

I started talking to a junior developer the other day about server side rendering in the days of Rails/PHP/etc. and he looked at me like I was crazy. Couldn't even grasp the concept. I think for better or worse this is where we are headed.


Since forcing the industry to realize the possibility of server-side rendering at gunpoint is oppressive and impractical I suppose that there is some sense in which we need "accept that's how websites are built now," but there's certainly no problem with speaking about the benefits of server-side rendering and drawbacks of JS-ification.

> 2-3mb of JS download (that will be cached aggressively)

It's a giant problem on mobile. Connection quality varies and the larger the payload the greater the probability that it just doesn't all load. Caching? Mobile Safari will just reload the entire damn page periodically if you swap apps -- I don't think it caches in the same way that desktop browsers do.


A couple versions ago mobile browsers started clearing cache when you leave the application to help save battery life if I remember correctly. Very annoying because now I am unable to open a bunch of tabs in Safari for HN threads before I go off grid.


This sounds made up.


Whether or not the reason is true, the behavior seems more or less consistent with my experience. Mobile Safari caching seems forgetful on surprisingly short horizons (especially when low power mode is on). Mobile Firefox seems better but sucks down battery life on the order of what Google / Apple Maps in active navigation do.

If you have a reason to believe the reality is different, though, happy to hear about more details beyond "this sounds made up."


> I think we just have to accept that that's how websites are built now. It drove me nuts for a while, too.

I don't think we should simply accept that's how things are done now.

> I started talking to a junior developer the other day about server side rendering in the days of Rails/PHP/etc. and he looked at me like I was crazy.

A web developer like the one you mention, who can't even conceive of server side rendering, is a bad web developer. That lack of core understanding means they have no idea how a web browser works and only view the world through JavaScript. They have no concept of progressive enhancement.

So this means they'll likely spend the next five years poorly implementing features a web browser already has. They're going to make websites, er "apps", that won't work properly in micro browsers. Hyperlinks won't work, or will be flaky enough that they might as well not work. And best of all, their stuff won't work on entry-level devices that are extremely popular.

> But modern JS engines are blindingly fast, and 2-3mb of JS download (that will be cached aggressively) is a non-issue for the vast majority of users.

You say this but it's not the common case. A lot of people have shitty devices because they're cheap. Whether they're entry level phones or shitty bullpen office systems people everywhere are stuck with them.

No matter how fast Chrome might execute some JITed inner loop is immaterial when the next line adds the DOM's billionth div pretending to be a button.

Thanks to cargo culting CI/CD there's little guarantee some app is going to reference the same JavaScript file on different days so the cached version will be tossed and yet another copy of Doom will be downloaded.

There are plenty of uses for JavaScript and cool web technologies. There are also lots of places where JavaScript is indispensable and enables awesome things. But requiring 3MB of JavaScript to read a static blog post is just poor craftsmanship. It's not even interesting as a project because you've signed the reader up to execute some unsolicited code to do who knows what. If you love JavaScript, render all your markdown with it on your device and just send me the static output. Don't make me download Doom and the markdown just to turn it into a few paragraphs of text. Web developers should respect their audience enough not to make them spend unnecessary resources to use their stuff.


I would like to see web apps move in the direction of server-side rendered static pages on initial load, and then progressively hydrate the app with data from the back-end on demand.

Rails does this surprisingly well using Stimulus with web sockets to mediate the exchange of events and data between the client and server layers.

Similar strategies are used in Phoenix Live View apps.

Load static markup and data -> request more data if you need -> send events and data to the server -> respond with the new state to display if different than the client’s version.
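For illustration, a minimal sketch of the client half of that loop in plain browser JS (not the actual Stimulus or LiveView API; the endpoint and message format here are assumptions):

    // Hypothetical live-view client: send user events up, patch the DOM
    // with whatever freshly rendered state the server sends back.
    const socket = new WebSocket("wss://example.com/live"); // assumed endpoint

    document.addEventListener("click", (e) => {
      const action = e.target.dataset.action; // e.g. <button data-action="increment">
      if (action) socket.send(JSON.stringify({ action }));
    });

    socket.addEventListener("message", (e) => {
      // The server only responds when the new state differs from the client's.
      const { selector, html } = JSON.parse(e.data);
      document.querySelector(selector).innerHTML = html;
    });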


At the risk of sounding rude, what you're describing has been the status quo for data-heavy SPA scenarios for a couple years. NextJS, NuxtJS, Angular Universal, Gatsby, they all allow you to preload data into a server side render and then let the client side JS take over on demand.


Have you actually had a look at Rails 7 with Turbo? It's nothing like what you're describing with NextJS etc. in that it aims to keep Javascript to an absolute minimum.


Yes, they’re sending HTML chunks over the wire. It’s the same end result though, except you’re still not exempted from learning JS.


It puts rendering back on the server where it belongs. The JS is minimal.


I’ve seen garbage and exemplary sites produced with both methods. Rendering HTML with data hydrated server side and assembling it with a library client side vs rendering JSON and generating the HTML client side is a weird religious stance to take.


Some of the jamstack frameworks like Gatsby do this. They turn into SPAs after the first page of pre-rendered HTML is served.


These sites aren't fast though... As soon as your computer isn't the newest or your internet connection isn't the fastest, or you simply have other things happening in the background, these JS-heavy sites slow down to a crawl where more reasonable things would render instantly. They also make it easier to hide nefarious code and often break accessibility and customizability the user agent is supposed to provide.


Look at the bright side... more $$$ for your efforts.


Perhaps, but if your goal is to turn a pile of markdown/asciidoc/rst files into a blog, there are better and more purpose-built tools for that, e.g. Jekyll, Hugo, Gatsby, 11ty.


> Setting up a monolith application means using something like Ruby-on-Rails, Django, Spring, ASP.net, or rolling your own with a Node.js backend serving a SSR front-end, or hydrating React views.

https://github.com/cheatcode/joystick


You can use a service worker proxy to cache and serve files to an application, which keeps things fast while still working with raw, uncompiled HTML and JS files.
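A minimal sketch of that pattern (the cached paths are placeholders):

    // sw.js: cache-first service worker serving raw, uncompiled files.
    self.addEventListener("install", (event) => {
      event.waitUntil(
        caches.open("v1").then((cache) => cache.addAll(["/", "/index.html", "/app.js"]))
      );
    });

    self.addEventListener("fetch", (event) => {
      event.respondWith(
        caches.match(event.request).then((hit) => hit || fetch(event.request))
      );
    });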


I don't think most of the people commenting here realize that the entire site loads just fine with Javascript disabled. Essentially this is HTML getting generated on the server.

The fact that it is in Deno rather than PHP, Ruby or Python is the point of the article.


Thank you. Being unfamiliar with Deno, I was trying to figure out what was loading the libraries when there was only one request/response over the wire. I of course was making the assumption that the single JS file was being run in the browser, not on the server.


I was doing the same!


3 of us :D


> Essentially this is HTML getting generated on the server.

Not a web developer. How is this different from CGI or a regular web server? This an honest question - I don't understand the significance.


Yes, everything old is new again. I just browsed the docs, but it seems it's not so different, other than that it's automatically run on servers all over the world. For now, this is free. Also, it automatically updates after pushing a commit to GitHub, which seems nice.


> Also, it automatically updates after pushing a commit to GitHub, which seems nice.

also not new, but you know that already :)


it is not different.

CGI is replaced by JS

Apache/nginx is replaced by CDN

web server is replaced by edge nodes (aka, glorified runtimes automagically spread across many endpoints)


It's written in JS/React so you can share frontend and backend code and write the entire site in one language, or even use the exact same components for server-side and client-side rendering. This is a boon for development. Ignore the nay-sayers.


The demo doesn't use React, only JSX


The import { h } made me think it was preact, but I guess it's not even that.


I mean, this is a "single file website" in the sense that `<iframe src="https://google.com"></iframe>` is a "search engine implementation in one line of code".

The only semi-interesting thing here is that this demo pulls dependencies from 3rd party registries via HTTP without an explicit install step. It's really not that different from doing regular Node.js development with a committed node_modules (hi, Google), except that if deno.land or crux.land go down, you've lost your reproducibility.

The thing about "familiar/modern technologies" seems like superficial vanity. A vanilla Node.js equivalent might look something like this:

    import {createServer} from 'http'
    import {parse} from 'url'

    const route = path => {
      switch (path) {
        case '/': return home()
        case '/about': return about()
        default: return error()
      }
    }

    const home = () => `Hello world`
    // etc...

    createServer((req, res) => {
      res.write(route(parse(req.url).pathname))
      res.end()
    }).listen(80)
Which is really not anything to write home about, nor an intimidating monstrosity by any measure. Serving cacheable HTML is really not rocket science, it simply does not require "the latest and greatest" anything.


I wouldn't say an iframe and this are in any way, shape, or form comparable. This is a "full-fledged" website.

> except that if deno.land or crux.land go down, you've lost your reproducibility.

Dependencies are cached. This is no different from if npm would go down.

> The only semi-interesting thing here is that this demo pulls dependencies from 3rd party registries via HTTP without an explicit install step

Given that this seems interesting to you, it seems you haven't heard of Deno (https://deno.land). It is not related to node in terms of environment; it's a new, completely separate runtime.

In regards to your node example, this is fairly different: the dependency pulled in from deno.land is a wrapper around the built-in http server, which does various error handling for you and simplifies the usage. The router isn't a simple switch statement either; it's a minimal router based on URLPattern (the web's version of path-to-regexp). Comparing these to the node built-ins isn't exactly a fair comparison, I would say.
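For anyone unfamiliar with URLPattern, a rough sketch of the shape of such a router (this is illustrative, not nanossr's actual code):

    const routes = [
      [new URLPattern({ pathname: "/" }), () => new Response("home")],
      [new URLPattern({ pathname: "/users/:id" }),
        (match) => new Response(`user ${match.pathname.groups.id}`)],
    ];

    function route(req) {
      for (const [pattern, handler] of routes) {
        const match = pattern.exec(req.url);
        if (match) return handler(match);
      }
      return new Response("not found", { status: 404 });
    }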

Also, on top of this, with node you need configuration to get TypeScript working, then you need a package.json, etc. etc.


Yes, I know what Deno is, and when I say "semi-interesting", I mean I'm trying to find a silver lining to praise Deno for. To clarify, the similarity is that this claims to be a "single file" thing by importing the meat of the functionality from elsewhere. Which is not really interesting at all, because using batteries to make websites was already a thing with PHP in the 90s. Or, as I mentioned, it's not that different from just using express or path-to-regexp or lodash or whatever in a typical Node.js setup.

Caching dependencies is very different from general reproducibility. Committing node_modules guarantees that the app works even if the NPM registry were to implode. Try to deploy your deno thing from a cold state (e.g. maybe you're moving to a different AWS region or a different provider or whatever) while there's a deno.land outage and it will blow up. I'm actually curious what this caching story looks like for large cloud fleet deployments. Hopefully you don't have every single machine individually and simultaneously trying to warm up their own caches by calling out to domains on the internet, because that's a recipe for network flake outs. At least w/ something like yarn PNP, you can control exactly how dep caches get shuttled in and out of tightly controlled storage systems in e.g. a cloud CI/CD setup using AWS spot instances to save money.

These deno discussions frankly feel like trying too hard to justify themselves. It's always like, hey look Typescript out of the box. Um, sure, CRA does that too, and it does HMR out of the box to boot. But so what? There's a bunch of streamlined devexp setups out there, from Svelte to Next.js to vite-* boilerplates. To me, deno is just another option in that sea of streamlined DX options, but it isn't (yet) compatible with much of the larger JS ecosystem. </two-cents>


IMO the silver lining to Deno is incredibly simple: its compatibility with the web platform. I’m not sure what you mean by not compatible with the larger ecosystem, as Deno is basically spec compliant except for the Deno namespace (which you can polyfill out).

If you haven’t experienced any pain authoring isomorphic JS with Node, that’s great! My experience has been the opposite of that. But with Deno, everything feels completely web native. You never need to worry about modules, syntax, platform features (even localStorage!), or packages… it just works.

On top of that, all the built-in tooling is high quality and I’ve never felt the need to replace them. A formatter, test runner, bundler, type checker, doc generator, benchmarker, and even the built in deployment platform. In fact, I’ve never experienced a more smooth deployment experience anywhere. There is nothing this cohesive in Node.

If you need one more reason, Deno is arguably the most secure runtime in the world. I would not be surprised to see more corporations start to use Deno for user submitted programs as we’ve seen recently with Supabase and Slack, for this reason.


To be clear, IMHO, Deno looks fine for what it is. The features are great. The cons ironically mostly boil down to "it's not node", i.e. ejecting a non-trivial app from CRA into some vite setup is doable with some effort, but migrating to Deno is, charitably, likely a monumental task that nobody would ever undertake, even considering the upsides.

At the risk of diving too deep into opinion territory, I'm not all that enthusiastic about isomorphic JS (and I say this as someone who's worked on an isomorphic framework). The promise of a low learning curve is certainly appealing, especially for those still in the learning phase, but at least in my experience I find that isomorphism falls a bit short in practice because server and client semantics are just... different.

When I talk about compatibility, I'm specifically talking about non-platform compatibility, i.e. library authors need to consciously target Deno if they want to support it, and the way to do so may be entirely non-trivial (e.g. the lengths that postgres.js goes to, compared to slonik, comes to mind). But most of the JS ecosystem lives on NPM and imports things willy-nilly with no regard for whether their thing will work in Deno because that's the path of least resistance. This is not Deno's fault of course, just an unfortunate reality.


Leo, your comments save me a lot of typing on threads like this, and since I recently wrote[1] what beeandapenguin wrote above almost to a point (sans security), I feel obliged to expand a bit.

You are right about incompatibility being a major issue; Deno recognizes that as well, hence, they are working on a compatibility mode that allows using Node specific libraries in Deno[2].

> migrating to Deno is, charitably, likely a monumental task that nobody would ever undertake, even considering the upsides.

This is, of course, contingent on the architecture used: for code tightly coupled to frameworks/runtimes it is indeed a monumental task. I have two small to mid size SaaS apps happily running on Node.js, but I'm looking forward to replacing it with Deno solely for the streamlined DX. The apps follow DDD architecture, thus framework specific stuff is decoupled into a service/adapter and changing it is a day's work. The major technical road-block for now is indeed incompatibility of third-party libraries/SDKs written for Node.js (Google SDK, MongoDB driver, etc.).

[1] https://itnext.io/moving-libraries-to-deno-the-whys-and-hows...

[2] https://github.com/denoland/deno/issues/12577


> The major technical road-block

Yeah, this is primarily what I'd expect would hold back migrations, both in actual technical terms (e.g. Deno-flavored libraries for some tasks may not exist at all) and buy-in from engineers. Don't get me wrong, I'd love to seriously consider Deno for our (very large) codebase (a 1000+ package monorepo with 400+ engineers committing), and I say this as someone who's successfully lead a number of massive migrations for this monorepo. But Node -> Deno at even 1/100 of this scale is, in my mind, potentially orders of magnitude more difficult than even a monorepo-wide Flow -> Typescript conversion, which is already a fairly daunting migration.


> Committing node_modules guarantees that the app works even if the NPM registry were to implode. Try to deploy your deno thing from a cold state (e.g. maybe you're moving to a different AWS region or a different provider or whatever) while there's a deno.land outage and it will blow up

You can just move your DENO_DIR (cache) along with the rest of your code the same way you can move your node_modules folder.

See: https://deno.land/manual/linking_to_external_code


Or you can use `deno vendor` to check in your dependencies into version control, or put a caching HTTP proxy between you and the origin server. Don’t be fooled: Node & NPM have these same problems.


> I wouldnt say an iframe and this are in any way shape or form comparable. this is a "full-fledged" website.

This is what's called an "analogy".

But your other points are valid.


Now, add jsx and ssr to your example, deploy it, then compare with the deno version in terms of performance, code length, and dev time.


Why? Just so you can tell another developer that there's a compiler transpiling non-standard syntax into function calls that concatenate strings at runtime? While the output HTML that the user sees is exactly the same? That's exactly why I'm calling it library vanity. My example is SSR, that's literally the default baseline. It doesn't make a very strong argument to imply my 5 min thing will somehow be worse if only you get to decide what random garbage to add to it to make the alternative look better. E.g. make hegel types work in the original and then let's talk loss of productivity from arbitrary decisions.

Deployment for a vanilla node.js thing is as simple as adding `node index` as the entry point in your favorite provider (because they all have node.js images these days), I've had such a thing humming along for years. Again, it's really not rocket science.


Different use cases. You equate SSR to just serving strings. Others need to use jsx + SSR together (be it personal preference or hard requirement).

Imperative vanilla code vs Declarative components. Both should have their place.


I'm with lhorie. SSR literally is about serving strings... you're the one equating it with server-side JSX. JSX is syntactical sugar that abstracts vanilla JS which in turn renders strings.

Rendering HTML on the server has always been the standard way of doing it, so the whole concept of SSR is funny to me. We've been creating new abstractions that trade old problems for new problems, and then newer abstractions that trade out problems again, since the dawn of time.


My point is that it doesn't matter, serving strings or rendering react comps. Folks who have to work with jsx + ssr for one reason or another will appreciate what deno's team has done here.

And yeah sure, you can always take a simple demo app with declarative components and turn it into a few lines of imperative vanilla code and say it's simpler this way. But then what? How are you tackling scaling, organization, composability, and deployment? (these are the real things the deno team is trying to show here, are they not?) By the time you design everything out and put all these in place for your vanilla code, you'll end up spending just as many resources (if not more) as you would have using declarative components with deno.


What I was trying to get at is that whether you have to work with JSX or whatever, that doesn't really have much correlation with Deno per se. CRA/Next/Remix give you decent JSX setups out of the box too (for scopes where JSX is actually justifiable), and so on for all the popular framework flavors, so it kinda doesn't do Deno any justice to say what amounts to "hey look, it can do the most basic of things when you pull in a bunch of libs".

If the point of the article was to highlight a super simple, no-fuss edge computing deployment thing, maybe it would have been better to lead with that? Because if you lead w/ "A whole static website in a single JS file", then let's not blame me for pointing out that that's a relatively trivial task to accomplish with other technologies.


Yes, you are certainly not alone on that. The headline could be made better. Focus should be more on the composability and tooling side of things.


Agree 100%. The first time I heard the term "Server side rendering" I wondered what the hell it meant! Must have been coined by the new-fangled DOM-manipulator army. Modern web development is a big, clunky, slow mess, and for no good reason.


SSR means server-side rendering. String-ness is irrelevant (everything is a string as far as HTTP is concerned). The difference is between serving HTML vs JS for the purposes of generating a DOM tree. The article is using nanossr specifically to serve HTML, MPA-style. My thing is using template strings, which is what systems like lit-html use for their flavor of "declarative components".
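To make "using template strings" concrete, that flavor of declarative components amounts to something like this (a sketch, not lit-html itself):

    // A "component" is just a function from props to an HTML string.
    const Layout = ({ title, body }) => `
      <!doctype html>
      <title>${title}</title>
      <main>${body}</main>`;

    const Home = () => Layout({ title: "Home", body: "<h1>Hello world</h1>" });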

Whether one wants to squint at this and think of React is neither here nor there, IMHO. Svelte, for example, cannot implement this website in this MPA format with only one `.svelte` file, but I don't think it's necessarily more verbose or slower to develop with than, say, Gatsby.


>The only semi-interesting thing here is that this demo pulls dependencies from 3rd party registries via HTTP without an explicit install step.

We used to call those script tags back in the olden days...


> except that if deno.land or crux.land go down, you've lost your reproducibility.

I wouldn’t say you lost it, I’d say you never had it in the first place.


This is great, I'm not sure why this is being misinterpreted so much. Serving generated HTML from Node using Express many years ago was also great at the time. You can still do that, but in my experience the tooling is quite dated/fragmented and the ecosystem+language has evolved significantly since then. Nowadays, SSR+Node generally refers to front-end frameworks with SSR capabilities (Next, Nuxt, SvelteKit, etc.) or generation of static files. Building a dynamically server side rendered site using TypeScript+JSX but without the issues of client side frameworks, hydration, SPA routing, etc. sounds revolutionary, even though it shouldn't(?)


This starts to make sense when you consider the self-flagellation of a full server-side-rendering production setup that has existed over the past decade, to the point many SPA products completely give up on SSR - or nowadays throw themselves at the walled garden of Vercel/Next.js etc to solve it for them.


I am not a fan of Vercel's strategy, but exactly how is the open source and MIT licensed Next.js a walled garden?


It's not. Next.js works on any platform supporting Node, and Vercel supports more than just Next.js. They support dozens of other frameworks.[0] They do promote Next.js on Vercel, but they don't stop you from using other systems.

[0]: https://vercel.com/new/templates


Ever since Vercel killed `now`'s ability to serve an index.html with `now .`, I stopped being a fan. That and the pricing blunders have unfortunately pushed me away.

On the other hand, https://docs.ycombinator.lol/ has been running for years without me ever having to worry once, so there's that. Maybe it's time to give them another try.


That never stopped working. Neither on the edge side (as your live site indicates, which has been edge cached for 109 days in San Francisco), nor on the deployment side, as the example below shows:

Try it out: https://hi-there.vercel.app

  ▲  /tmp/ mkdir hi-there
  ▲  /tmp/ cd hi-there
  ▲  hi-there/ echo '<h1>hi sillysaurusx</h1>' > index.html
  ▲  hi-there/ vc
  […]
  Inspect: https://vercel.com/rauchg/hi-there [690ms]
  Production: https://hi-there.vercel.app [copied to clipboard] [9s]
In under 10s which includes the project setup and linking prompts. The main difference is that `now` got shorter to `vc`

Happy to hear more feedback on pricing. Please reach out to rauchg@vercel.com


Happy to hear that. It did, at one point -- I am quite certain of this -- but that was years ago.

I think it was in the transition between now and vercel. Back then, `now` had some sort of requirement where it only worked automatically if the folder had a package.json file. A plain index.html file required configuration. Delighted to see that this is no longer the case.

I'll give `vc` a shot.

The pricing issues stemmed from the fact that I use updown.io to monitor my domains. Since it pings them every 15 minutes or so, and since vercel spins up servers on demand, that means I was paying for essentially 24/7 service, which was an unwelcome bill over what I signed up for. Customer support gave a 50% refund, which I appreciated, and I've downgraded to a free account ever since.


Sorry Guillermo but you’re not being truthful here. “$ now” worked on any node server, not just static content, it would just run “npm start”. Now v2+ instead requires a handler-based approach.


I think I misunderstood his comment. I focused on the `index.html` example, but I think he was talking about servers (thanks for pointing that out). That said, the transition was good for servers too depending on what you were using. For example, `package.json` with Next.js was completely seamless!


Hosting next.js on Vercel and Azure is like night and day. A lot of smaller things start breaking, like `NEXT_PUBLIC_` environment variables. For me coming from Vercel, using Azure has been an uphill battle.


There are plenty of non-Vercel attempts at this and it's kind of table stakes for frameworks hoping to get popular. There is a push for server-first/MPA app development where SSR is assumed to be the baseline. Remix [1], Qwik [2], and Marko [3] are in that camp. I'm not sure about Remix but the other two have a goal of authoring the entire app as components but only sending the JS needed for the parts that can change on the client.

[1] https://remix.run/ [2] https://github.com/BuilderIO/qwik [3] https://markojs.com/


The truth I learned building my own SSR JavaScript framework is that it's not that complicated, it's just made complicated.

1. Get some data from a database/source.

2. Pass that data to a template/component.

3. Convert that template/component to HTML (using the given framework's version of renderToHTML()).

4. Return the HTML to your HTTP request.

For example, here's the SSR for my framework: https://github.com/cheatcode/joystick/blob/development/node/.... It blew my mind when I wrote it because I was under the impression it would be more difficult.
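Those four steps in handler form (renderToHTML, PostList, and getPosts are stand-ins for whatever your framework and database actually provide):

    async function handler(req) {
      const posts = await getPosts();       // 1. get data from a source
      const page = PostList({ posts });     // 2. pass it to a component
      const html = renderToHTML(page);      // 3. convert the component to HTML
      return new Response(html, {           // 4. return it to the HTTP request
        headers: { "content-type": "text/html; charset=utf-8" },
      });
    }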


> rendered dynamically, just in time, at the edge, close to the user.

HTML. You're serving HTML.

Doesn't really matter that the server-side language is JS, PHP, or BASIC.


Aaron from Deno here. Of course it's producing HTML as an output, but the point is that you can use JSX and familiar technologies like Tailwind to dynamically generate that HTML at the edge vs client side.

And unlike a pure static site, you can add API or form routes
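The core of the example boils down to roughly this (import URLs and versions are from memory and may differ from the post):

    /** @jsx h */
    import { serve } from "https://deno.land/std/http/server.ts";
    import { h, ssr } from "https://crux.land/nanossr@0.0.4";

    // Tailwind-style classes are extracted server side and emitted as
    // plain CSS in the response; no client-side JS is shipped.
    const Home = () => (
      <div class="min-h-screen flex items-center justify-center">
        <h1 class="text-4xl font-bold">Hello from the edge</h1>
      </div>
    );

    serve((req) => ssr(() => <Home />));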


Wow, a lot of misunderstanding of what Deno is and how it works in these comments! Must be frustrating for you.

I'm a huge fan of runtimes that reduce boilerplate and configuration, so that's what makes me most interested in Deno. What I'm most concerned about is that we're pushing the idea that Deno's approach to third party imports solves all the problems of npm et al. If we teach developers to think of third party and native libraries as equivalent, I think we're hiding a lot of problems rather than solving them, which could be even worse.


I can appreciate what's being done here, but I think a more compelling demo would have had a bit of dynamic rendering just to emphasize the point (since in the real world, a fully-static site like this would be better served by a static-only hosting service with no custom server running at all, even one on the edge). Even something as simple as grabbing the current timestamp and displaying it in the returned HTML, just to show that logic is running on every request.


Oh. So you've reinvented PHP. Nice.


This is a very lazy comment. I'm sure it makes you feel smart, but it drags down the entire conversation, and doesn't add anything of value. You seem very capable and accomplished, so I'm confused why you would spend any of your time to simply shit-post on someone who is trying to build something of use to many people.


You are right. It is a lazy comment. I would delete it if I could at this point but that's not possible.

It really comes about from my frustration. So much effort is pushed into new tech, and the result (at least as I and others in the comments here noted) is something reflective of pre-existing technology that has been around for decades.

I get that through enough of these exercises true innovation does emerge. However, there is a whole lot of "re-inventing the wheel" in between, which is frustrating, as it seems to be prevalent.


Wow toxic much? PHP doesn't solve the same problems as JSX


es6 is much more pleasurable to code than PHP. Check Little Javascripter [0]

[0] https://www.crockford.com/little.html


It matters to me as a developer who would choose a hosting environment based on my familiarity with the language I have to write in.


> /// <reference no-default-lib="true"/>: Deno comes with various TypeScript libraries enabled by default; with this comment, we tell it to not use those, but instead only the ones we specify manually.

Interesting. I checked the docs on this and it’s not quite clear to me why this is needed in this case, or what the benefit is in taking this approach. Is this strictly a build time optimization, or is it necessary in this example?


It’s actually not needed at all for runtime. It’s purely for editor experience and type checking.
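For context, the directive block in question looks like this (the exact lib list is per-file, so treat this as illustrative):

    /// <reference no-default-lib="true" />
    /// <reference lib="dom" />
    /// <reference lib="deno.ns" />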


Tons of big websites use something quite similar to this for their maintenance pages - pop a page of HTML in a JS function, upload it to a Cloudflare worker, and attach that worker to a wildcard route to catch everything on your domain temporarily when you want users to see your maintenance page. It's a common strategy that works well.
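The whole worker can be as small as this (a sketch using the service-worker-style Workers API; the HTML is a placeholder):

    const PAGE = `<!doctype html><h1>Down for maintenance, back soon.</h1>`;

    addEventListener("fetch", (event) => {
      event.respondWith(
        new Response(PAGE, {
          status: 503,
          headers: { "content-type": "text/html; charset=utf-8" },
        })
      );
    });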


This is great.

I've been dabbling in Deno for a while now. Standard lib is there. Testing is there. All the packages I'd ever want are there. Linting, a strong style guide, and a documentation generator too.

And unlike other beasts, it feels so minimal and out of the way.


I fucking love what Ryan is trying to do with Deno. The entire JS landscape is unnecessarily complicated and NPM is making it even worse.

Deno is making JS development fun again. Major props. I hope Deno Deploy is a commercial success for the team.


Is there a particular advantage to how this is done here with Deno or is this just an example of server side rendering being possible in Deno? The latter is fine as I'm a fan of Deno :) just missing why it's such a popular post (maybe more Deno fans?)


It was mainly intended to be a "cute" example of the latter.

Technically if you were doing this in Node, you would need at least a package.json and would have to configure your TS/JSX transpilation, etc...


You mean you are a user? Fans are for music :)


I'm actually still a Node user! I really like how Deno has been innovating this space though and it's come a long way in the last couple of years so I'm looking forward to making the switch at some point. Even if I never do for some reason (most likely node API specific libraries) it has really given Node some reasons to innovate in a few places and I love Deno for that as well.


Wait a sec, how does Deno know these classes are actually Tailwind CSS classes? How can I disable that or use an alternative?


The `ssr` function uses the twind module to extract the classes and generate plain CSS.


I don’t think a simple, static site is a great example for an entire framework


What other kinds of examples would you like to see?

The goal was to showcase simple yet intuitive JSX + tailwind at edge, we didn't elaborate on more advanced use-cases like authenticated pages, API endpoints/forms, dynamic pages (location, etc...) or parametric routes.


I'd love an example where the user updates some piece of data. The update should be displayed right away to the user and in a DB.


There is another blog post coming up in a couple of weeks (hopefully) that will demo this.


If the deno executable is roughly the same size as a popular "web browser"[1], why not just distribute JavaScript files and let users run them in deno. Or let users "import" them into their own scripts.

[1] The one I downloaded weighs in at 85MB. That is smaller than some popular smartphone apps.

As I understand it, deno is designed to be somewhat safer than nodejs.

I can edit and compile deno much easier than I can compile a popular web browser. Some popular web browsers are not meant to be edited or compiled by users. If there are things that users dislike about these programs they are powerless to change them.

The "web browser" is created and controlled by companies in the business of user data/metadata collection and advertising. AFAIK, the people behind deno are not engaged in such practices.

The "modern" web browser has become a Trojan Horse of sorts. Instead of a program for displaying and reading HTML, it has become a vehicle by which users indiscriminantly/involuntarily run other peoples' programs, often written in Javascript, in a context where a third party, the commercially-oriented browser vendor, can quietly collect user data/metadata, mediate and manipulate user interaction.

Deno takes the Javascript interpreter out of the browser.


I've been looking at Deno very briefly recently (overall a good impression) and I was very much surprised that just getting the visitor's IP address took like a dozen lines of code.

My test case was, basically reproducing something like

  <?php print_r($_REQUEST); echo "\n"; print_r($_SERVER); ?>
and I was a little surprised how much convenience was baked into it and how you wouldn't have access to all that in other libs. That someone created an issue[1] makes me think it's not just that I'm not looking hard enough, and that it actually is tedious.

[1]: https://github.com/denoland/deno_std/issues/1884
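For reference, a minimal sketch of what I landed on with the std serve wrapper (the ConnInfo shape may have changed since; the raw Deno.listen route is longer still):

    import { serve } from "https://deno.land/std/http/server.ts";

    serve((req, connInfo) => {
      // remoteAddr is a Deno.NetAddr for TCP connections.
      const { hostname } = connInfo.remoteAddr;
      return new Response(`your ip: ${hostname}`);
    });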


Wow that's cool. Would love to see a more "interactive" to-do or "guestbook" site that shows how that side of things works.

Also wondering if this can be done serverlessly, or if it requires something always-on?


Deno Deploy is serverless, and Deno CLI is always on, so you can do either with the same code


TiddlyWiki is a whole wiki in a single file, been around a while, great system:

https://tiddlywiki.com/


I keep seeing comments about static html vs generated html. So I have a question (please respond):

Why can’t we just run the example Deno program to generate snapshots of html?

It seems like some of us think pure static html is a good goal for some things, so why not use this Deno program to create the same html responses in generated files?

It’s probably the same amount of code because instead of writing a http response you write a file.

Of course you lose some functionality this way, but your app, your rules imo.
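A hypothetical sketch of what I mean, assuming the app exports its fetch handler and you know the route list:

    import { handler } from "./main.tsx"; // assumed export of the app's handler

    await Deno.mkdir("dist", { recursive: true });
    for (const path of ["/", "/about"]) {
      const res = await handler(new Request(`http://localhost${path}`));
      const out = path === "/" ? "dist/index.html" : `dist${path}.html`;
      await Deno.writeTextFile(out, await res.text());
    }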


Warning: Mocking-JS plug

Meanwhile, I just created a JavaScript-free website.

Never have to worry about broken NPM, cookies, tracking, APIs, or JS-based malware.

And I use my iPhone/Android to edit/create web pages in Markdown, then my CI will build it and post it for me.

Look at the snazzy 3-D CSS, also JS-free.

Did I mention that I have a no-nonsense Privacy Policy?

3-D web page. https://egbert.net/blog/index.html


Your link is giving a ERR_SSL_VERSION_OR_CIPHER_MISMATCH.


Drop your Chrome-based browser.

Website is only for those who do most-secured approaches.


Alas, I wanted to believe, but I pretty much immediately uncovered a bug with the way my browser’s back button works.

In iOS Safari, click the “CLI” link at the top, then swipe the page to the right to go back. If you do it slowly it works, but the first time I tried I did a regular flick-swipe from about the height of the page where the version number is. I was trapped in deno.land and couldn’t go back.

(Maybe that’s a bug in deno.land though, not deno.com?)


Where’s Tailwind coming from in this example? It doesn’t seem to be imported, and I can’t see anything about it in their playground link either: https://dash.deno.com/playground/website-in-a-single-js


nanossr seems to be using twind (https://github.com/tw-in-js/twind) under the hood.


> For rendering of the JSX, we use nanossr, which is a tiny server-side renderer that uses twind under the hood for styling


In Firefox 98.0.2 / Windows 10, at the playground, the URL does not reflect the route, and refreshing will display the hidden but currently selected route, rather than the URL route.

https://imgur.com/a/ZP95YcD


Yup, this is very unfortunate, but sadly it's a consequence of browser cross-origin privacy protections. We cannot fix it without injecting some third-party code into your site automatically (which we don't want to do).


Does Deno have a solid web framework? Next.js-style frameworks are being worked on / forked for it, like Aleph.js.


Also check out https://fresh.deno.dev


Remix is working on making their framework work with Deno. You can already try it out.


Seemed pretty simple to add a bagel route https://dash.deno.com/playground/cold-parrot-17 with page not found handling. Love the minimalistic nature of this.


If I publish my website to npm and then import and run it in an index.js file, does that also count as "a whole website in a single JavaScript file"?


> These are served using the serve function from the standard library.

I guess Node.js could learn a lesson here.


Deno was started by the original author of Node, and they took lessons learned to this new platform.

Also note that Deno is an anagram of node :)


"time to interactive: 1.0s / first content paint 1.0s"

My man, let me introduce you to ... HTML. It has "time to interactive" at 0.0 seconds and content paints instantly!


I think you might have misunderstood the blog post. It is server-side rendering on the edge, shipping nothing more than plain HTML/CSS to the browser. There is no client side JS *at all* here.


I understand the blog post, and the capability itself is neat, but I'm having a hard time understanding the utility in what they are showcasing from the actual example site[0].

As other posters have pointed out, why not do it in HTML from the start? It's simpler and more efficient than this - or any - framework. Just drop the ol' HTML file on your server and away you go!

I understand that the supposed "real" utility in this would be when you want to do JS-y things in HTML (auth, API, handle state, etc), but they don't show any of that on their showcase site...so...yeah.

0. https://website-in-a-single-js.deno.dev/


I agree we could have elaborated on auth, API endpoints or parametric routes (maybe a follow-up!).

But this example does showcase a few things you don't typically get with a single vanilla HTML file:

- JSX + reusable/shared components

- Multiple URLs / pages

- Tailwind


I actually think this is quite neat, but I am a bit worried about caching.

Someone mentioned Rails, and Rails has a lot of facilities to set correct cache headers for assets (CSS, JS, images, etc.) and for dynamic content (for logged-in users and/or for pages that are dynamic but public).

If you're deploying static files via a vanilla web server, you also get a lot of that for free, via the file meta-data.

I would expect a framework for publishing sites to showcase a minimum of good caching (client cache, ability to interact with a caching reverse proxy like varnish - and/or a cdn).


I get that after reading your blog post, so that's fair. Maybe it's just a case of the magic trick that's missing that third act.

Clicking around with the Dev Console open and watching the pages in Sources was enjoyable.


I know it's taken from their marketing lingo, but edge makes sense only if you have a "core". This demo app doesn't use that model. It's just "cloud", or lambda if you want.

I'm not sure Deno (the service, not Deno the language) is actually proposing a model - similar to Cloudflare, for example - where you have your infra somewhere and they only host the "edge", in a CDN, or spread around the world.


The shown page could be served in 0.01 seconds if it was...html. Or, if you MUST lob complexity at it, use a static site generator... that generates...HTML


Try playing around with the pagespeed link provided in the article. You can see that in the default view the network is set to slow 4G throttling with an RTT of 150 ms so it's going to be impossible to get times like 0.01 seconds. Even just loading https://x.com, a site that literally serves a single character of content "x", gets 0.8 seconds.


You can see for yourself with the provided Pagespeed metrics that the root document was served in around 30ms (corresponding to TTFB).

If you can elaborate on how statically-served HTML would render orders of magnitude faster than server-side-rendered HTML with a similar response time, I'd love to hear it.


Unless the server runs at negative cycles per second, more cycles means more time taken. Did I miss-math?


You've shown a way to add multiples of ~0.0000000003 seconds to the time but haven't explained how the page is going to go from .01 seconds to 1.0 seconds as a result when TTFB is 0.03 seconds.


The demo is literally a static site generator that generates HTML


It's not, unless "static site generator" is one of those terms (like "literally" or "REST"[0]) that has morphed to mean the opposite of what it was supposed to mean. (Not to be confused with "serverless", which never meant what the word suggests.)

This application is generating HTML on the server, but so does PHP. A Web site backed by a PHP application is the antithesis of a static site.

0. https://news.ycombinator.com/item?id=23672561


The shown page is HTML. It's a plain-old website that works the same way websites have always worked: the page is generated at request time, and HTML is served directly to the client. It's not technically a "static" site because the HTML is not cached ahead of time, but apart from the fact that the whole back-end is a single JS file, there's nothing special here.


No, not even remotely accurate.

The browser still has to fetch and render the HTML. JS-heavy sites do tend to be slower, but no site has a 0s TTI/FCP.


I think it's actually much worse than that - it's a little tricky to tell on mobile, but going from "stats" to "bagel" and back again - looks like this mucks up the client side cache? While one whole second is "90s slow" for a first render of a static site - it's truly ludicrous for navigating back to an already cached page?

How are the cache control headers with this set-up - is there a Varnish or similar CDN/cache doing useful work? (I'm assuming not; more importantly, I'm worried pointing something like Fastly at this will fail to cache static pages.)


The request headers have no-cache set. I assume this is because it's in a live developer playground page instead of a production deployment.


> I assume this is because it's in a live developer playground page instead of a production deployment.

Not how I read it - the first link is to a site deployed via "deno deploy", the last one is a link to the same content in a playground.

>> Hosted on Deno Deploy, this little website is able to achieve a perfect pagespeed score. Serviced from an anycast IP address over encrypted HTTP from 29 data centers around the world, the site is fully managed and will be available indefinitely at zero cost.

>> Everything mentioned here can be viewed on a playground.

For what it's worth, the deployed site seems a little snappier to me now on mobile - maybe I'm just less grumpy after dinner...


You know what's funny: I'm a fool who still had "disable cache" checked in the dev tools from a previous session when I was checking the request headers. At the current moment, now that I have unchecked that in dev tools, it is actually respecting the cache for those images. I can't definitively say if that is different than before, but either way, at least cache seems to be verifiably working normally now that I have that unchecked.


Still, that's quite different from live editing static files on disk - then you'd normally get cache and invalidation (if-modified-since/304 etc).


Report a bug in Lighthouse then.


All apps should be written as a single, stand-alone, self-contained file.


why


I personally find it insane to use a monstrosity, in terms of LOC and complexity, like the V8 engine to generate a static HTML web site. I also disagree with other JS-based static site generators for the above reasons. I strongly believe it's a bad idea to have to locally install nodejs or deno and write JavaScript to generate a few HTML pages. Also, I disagree with plain HTML because of duplication.

http://mkws.sh/ uses the standard UNIX tools to generate `HTML` pages, featuring a powerful sh-based templating engine: https://adi.onl/pp.html. Dependencies and complexity are kept to a minimum. It does the minimum required, generating HTML pages while keeping duplication low, in approx. 400 SLOC.


Glad to hear that you prefer shell scripting, but please consider that there are many (web) devs out there who prefer JavaScript:

"Many [devs] are more familiar with the Chrome DevTools console than they are with a Unix command-line prompt. More familiar with WebSockets than BSD sockets, MDN than man pages. (...) Many developers, we think, prefer web-first abstraction layers." -- https://deno.com/blog/the-deno-company


Not sure if it's a matter of preference or a matter of the right tool for the job and industry-agreed best practices regarding simple vs complex. In the sense that you may prefer the round peg, but it doesn't fit the square hole.

I've been doing JavaScript myself for about 15 years, unfamiliar with the UNIX philosophy (which, like any doctrine, is to be doubted). I started doing web development using plain HTML the "old school" way, I personally dare to say the _normal_ way.

Before the rise of SPAs I very much agreed with the idea of progressive enhancement which is coming again into attention with the likes of https://turbo.hotwired.dev/.

While doing SPAs I always felt that stuff constantly didn't fit, that we were constantly using unfit tech, doing hacks for the benefits of using a single (unfit?) language both on the server and the client, partial loads (faster loading times), and having a single codebase for all OSes. Stuff felt hacky most of the time and we were hiding those complex hacks under what _seemed_ like elegant and simple abstractions. But I believe most experienced JavaScript developers agree that the elegance and simplicity are mostly on the surface. I constantly felt dissatisfied with the code I wrote. I refuse to go on a full rant regarding SPAs and JavaScript but that's the gist of it.

While configuring my dev environment I stumbled upon the https://suckless.org/ guys. Their code embodies the UNIX philosophy well, although some people, including me, might say that some stuff is too simple. Simplicity for the sake of simplicity is not a good idea (nothing for the sake of anything is a good idea, to be honest) but rather a consequence of your understanding of what's not really needed.

While investigating the UNIX world more, discovering OpenBSD and using it as a daily driver, things started to fit and make sense.

Now, regarding how mkws fits generating static sites, it mimics building a small C project except the Makefile is replaced by a shell script, so all the principles fit and are well established. pp is the compiler, .upphtml files are the sources, html files are the output binaries. Everything integrates and fits well. I feel satisfied about how everything works.

Code is small and simple, abstractions are kept to a minimum. I, as a single person, am able to investigate, understand and change every part of the generating process. Can't say the same thing about a JavaScript static site generator; you don't really need the V8 engine to generate a few HTML files, that's complex, and most of us agree simple is relatively good, complex relatively bad, as an industry best practice.

Regarding SPAs, I believe they were a quick solution until we properly solve the problems they solve via progressive enhancement.


> a powerful sh based templating engine

> it mimics building a small C project

You're using practices from a language unrelated to webdev, a scripting language unrelated to webdev, and what looks to be a closed-source generator built and (un)maintained by a single person. There probably is a context where this setup makes sense, but to me it's anything but simple or intuitive.





