<3 Deno (matklad.github.io)
557 points by scraptor on Feb 12, 2023 | 218 comments



Interesting to see a different side to the benefits of Deno.

Personally, I find direct Typescript interpretation to be one of the best features. Typescript is not necessarily a simple type system, but it is an expressive one:

- It has great type inference avoiding Java style codespam

- It has gradual typing, enabling "risky" type casts

- It has fully functional types, enabling the productivity and readability improvements from functions as data

- It is 10 seconds away from having dependent types

Typescript took most of the best programming language research from the past 20 years and popularized it in a way that actually makes js hackers more productive without getting too much in the way.

The only hard thing has been that js interpreters don't understand typescript, requiring an awkward transpilation step with weird symlinks and other hidden complexity that works 99% of the time, but wastes a lot of time when it doesn't work just right.

If Deno could just solve that, it would be a great improvement. If it's also faster (as claimed) that's even better. This article shows a lot of other ways in which small, simple improvements add up to a much better product.

Can't wait until it's production ready.


Yeah, I’m a big fan of strong, static types, and was a big JS hater pre-ES6 (still a light JS hater), but I gotta say, TS is damn good. It really is almost as type safe as a classic strongly, statically typed language, and almost as productive “in the small” as a dynamically typed scripting language.

If you don’t need systems language type performance, and don’t need true parallelism, it’s a pretty terrific choice. Starts out productive/simple in the early days of a project, stays productive/simple as it grows. Plus, if you’re doing “mobile apps + web app + RESTful services”, pretty nice that you can do it all in one language (assuming React Native on mobile), and that there’s no JSON related boilerplate.


It’s hard to beat a pedigree like Anders’ for Typescript. He had Delphi and C# to cut his teeth on. I can’t imagine many other people that would be better qualified to design something pragmatic on top of the ugliness of JavaScript


That's very true. I see other programming languages trying to add/enhance types now (ruby, python, go to some extent), and not having a beast like Anders on the team must make it tough to ship an amazing type system.


I feel like pedigrees are somewhat underrated in software engineering. Everywhere I go there always seems to be some sort of egotistical “I can make the best thing”


Well, yes, and I've done that plenty myself haha

I have the impression that many of us programmers have a hard time really accepting that different applications of programming require vastly different skillsets, so we often underestimate how hard things can be.

For example, a software developer with 15 years of experience probably has almost none of the skills necessary to build a new database server.


Exactly!

I feel like one of the skills that levels you up from junior to senior engineer is learning never to utter the words “it will be easy”


Well, that should be easy!


Creating a new language that interoperates with another is way different than retrofitting an existing language to add types.


Missing Turbo Pascal and J++ on the list.


Typescript is great compared to plain JS, but it has been deep into diminishing-returns territory with type system features for a couple of years now.

A lot less would be a lot more, because as of now Typescript lures otherwise entirely competent programmers into writing complex 'type system puzzles' which are entirely obscure to everybody except the person who wrote that code, and it takes a lot of discipline and experience to resist the lure and keep things simple.


TS shouldn't have embraced all the JS quirks.

I want a type system that embraces the correct way to do things instead of allowing anything you want no matter how bad it actually is. I want my type system to inform me that my code is really bad for performance and should be redone.

Instead, I constantly deal with people who think their code is fine just because the TS checker doesn't complain.


I feel the same way, I lost amazing engineers to this weird typing obsession.

Excessive typing is very costly and should be left to popular libraries that solve generic problems. This is where you get the best bang for your buck.

It should not be present in your business code, high level business logic is rarely prone to type errors anyway.

Static typing was always a means to an end, but unfortunately it was recently sold as a virtue and many people have bought it.


I kind of agree, some Typescript projects look like Haskell envy (with all GHC extensions enabled).


The Haskell type system is WAY more simple than the type soup TS allows you to make.

I'd gladly take Hindley-Milner typing and typeclasses (or better yet, module typeclasses) over the current TS situation.


I suggest a digression through GHC extensions.


I feel similarly to this, and often find when working on a TypeScript codebase that I get "distracted" by spending more time trying to get my types just perfect.


Discriminated unions are such a powerful feature that I would be very reluctant to choose another language that doesn’t have them.


The nice thing about them is that you have been using them all the time if you do dynamic typing.

They are the most natural way to encode structural information, especially across language/system boundaries.

- Clojure’s multimethods (multiple dispatch)

- Clojure spec’s “or”

- JSON schema’s “anyOf”

There are likely more features and tools that are not strictly/statically discriminated/tagged unions, but in practice they help you to code that way.

In TS this is IMO the best feature. In a sense it just adds tooling support on top of a coding style that is very close to dynamic typing.


It is the way I always wanted to use JavaScript, but it was too cumbersome to utilise without a good type system.

My favourite use-case for discriminated unions is to represent all possible client events as a DU, as well as the entirety of the application state. This allows you to efficiently communicate with a backend over websockets via a very thin transport layer that only needs to handle one message type.
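
For anyone who hasn't used the pattern, a minimal sketch of what that thin transport layer can look like (the event names and fields here are made up):

  // Every client event shares a literal `type` tag that the handler switches on.
  type ClientEvent =
    | { type: "join"; roomId: string }
    | { type: "chat"; roomId: string; body: string }
    | { type: "leave"; roomId: string };

  function handle(event: ClientEvent): string {
    switch (event.type) {
      case "join":  return `joined ${event.roomId}`;
      case "chat":  return `${event.roomId}: ${event.body}`; // `body` only exists in this branch
      case "leave": return `left ${event.roomId}`;
    }
  }

  // One websocket message type on the wire, narrowed by the tag after parsing.
  handle(JSON.parse('{"type":"chat","roomId":"general","body":"hi"}'));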


CLOS multi-methods actually.


"You can take those Algebraic Data Types from my cold dead hands" - Ron Minsky


What did you mean by weird symlinks? I've been doing TypeScript a few years and haven't had to interact with symlinks.


TS has terrible type inference. At work I collect a list of the failures.


If only handling types wasn’t so difficult and error prone. I suppose being able to manage types with human readable text files is a benefit, theoretically.


What editor are you using?


I recently built a full-stack app with Deno (and their Fresh framework https://fresh.deno.dev) called Invo (https://invo.ee) and the whole experience was delightful. I was able to move really fast and I never had to touch a configuration file, which also immensely helped with my motivation of actually building something - far too often I get stuck in a pile of config (I do Clojure mostly) and lose any motivation I had.


Just wanted to let you know that your app doesn't work. I downloaded the invoice but it shows `404 Not Found The requested deployment does not exist. code: DEPLOYMENT_NOT_FOUND`


Thanks for mentioning! Fixed now.


Can you mention your complete tech stack?


Developed using Fresh framework (https://fresh.deno.dev) and deployed to Deno Deploy (https://deno.com/deploy). For persistent storage I use a simple MySQL instance from DigitalOcean (https://www.digitalocean.com/products/managed-databases). The PDF creation is done via Browserless (https://www.browserless.io/), which really just captures a website I point it towards and creates a PDF out of it.


Nitpick:

> Additionally Invo supports USD, EUR, JPY and GDP currencies.

Did you mean `GBP` here?


Yup indeed! My bad, fixed now.


FWIW I just tried to create an invoice using your page and changing currencies doesn't appear to stick (Safari on MacOS).


Thanks for mentioning! It was broken indeed, and I just fixed it :)


Cheers! Very quick fix - Impressive!


Nice, simple, and to the point.

One usability issue - the currency selector is hidden behind a gear icon on the last Total line, whereas you see the currency in previous lines without the ability to change it. I would move it somewhere much more obvious.


Good input! I moved it to be right above the "Items", so it's the first thing to do with currencies, and changed the icon to reflect the currently active currency. WDYT?


It's perfect!


This argument is so interesting, essentially it's "there is no other Deno, so you don't have to worry about using a slightly incompatible implementation and getting bugs".

But a lot of people think that multiple implementations and specs are good things (I personally don't know which way I go on this one). And thinking about it, I kind of don't buy the argument. There are specs for JavaScript and TypeScript, and multiple standalone JS and TS engines: node, QuickJS, duckjs, Bun, etc.

I also don't buy the "minimal, practical" take on dependency management. It sounds like they just built a package manager into Deno itself and put versions in the URL, which feels like the opposite of minimal and more like "batteries included". Again not saying this is bad--the Python ecosystem suffers pretty badly from not having their dependency story totally mapped out--just quibbling over the characterization.

On the other hand, I do buy the security thing. AFAIK this is the first scripting runtime to essentially integrate pledge [0], which is real interesting.

[0]: https://man.openbsd.org/pledge.2


My take is that Deno borrows a lot from web standards and functionality provided across browsers.

In the case of dependency management, they use a pattern which would also work for client code shipped to a web browser (in that case the client would assemble the code - like if you had a reference to load mathjax or google analytics or whatever, or you could alternatively vendor and ship them yourself).

To me Deno is basically modern web standards + very nice tooling for use as a scripting or serverside language (typescript interpreter, format, compile, vendor, test, lockfiles/import maps, ability to explicitly include/exclude capabilities from the runtime, etc).


It's a step in a positive direction that Deno makes you explicitly grant a script permission to do some things, but that's not at all what is meant by capabilities, and that's absolutely the wrong word to use to describe it. A capability is a first-class concept that exists at runtime, and is a value you must pass to dependencies that need it. You should also be able to use it to create new, dependent capabilities with fewer rights, and they should in general be much finer-grained than "can access the internet" to be of real use.


> they should in general be much finer-grained than "can access the internet"

Just for the record, Deno does allow finer grained control than that:

> --allow-net=<allow-net> Allow network access. You can specify an optional, comma-separated list of IP addresses or hostnames (optionally with ports) to provide an allow-list of allowed network addresses.

https://deno.land/manual@v1.30.3/basics/permissions
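
As a rough sketch of how that plays out (the hostname is made up; the flag is the one quoted above, and Deno.permissions is the runtime API for checking what a script was granted):

  // run with something like: deno run --allow-net=api.example.com main.ts
  const status = await Deno.permissions.query({ name: "net", host: "api.example.com" });
  if (status.state === "granted") {
    const res = await fetch("https://api.example.com/health");
    console.log(res.status);
  }
  // fetch() against any other host would fail with a permission error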


Agreed - the Deno model doesn't bring much security, because you have to grant all code in the whole app the same permissions, which for any non-trivial app will amount to "mostly everything".

First class capabilities are a much better approach, a real solution to supply chain attacks.


Sure, but for things like web access you can grant access to just the domains your app uses and everything else is blocked; same for file access, where you can grant access to just "views".

Though most people will likely just --allow-all because it is easy.


For now. Realms will address this.


Thanks, it was confused thinking on my part, corrected!


> The only big drawback of Deno is the language

Ooft. Although TypeScript does have a bit of a learning curve, and a lot of gotchas, I still have to say it's my favorite language to use. If you combine the delicious syntax-sugar of ES6 (and beyond) with a sprinkling of tasteful types, the language leaves you full and energized without feeling bloated.


TypeScript’s type system is very complex and yet fails to achieve type safety to the degree that really strict languages do.

I’m not that into class-based languages and prefer a language to have macros (more for the DSLs they enable than for use in application code). However, even if those preferences were off the table, I’d still rather have something like Kotlin on the server than TS. It’s simpler and isn’t weighed down by 30 years of cruft.


Typescript isn't really a class-based language.

It's a language that supports both functional and OO paradigms.

Also, if macros/DSLs are what you're after, then you'd probably be interested in typescript's decorators. (you can use the old experimental decorators based on the old standard from ages past, and the new *standard* decorators introduced in typescript 5 which are based on the current stage-3 ecmascript proposal).
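
For reference, a tiny sketch of what the stage-3 style looks like (hypothetical names, assuming a compiler that supports the new proposal):

  // A method decorator that logs calls; `context` carries metadata like the method name.
  function logged(value: (...args: any[]) => any, context: ClassMethodDecoratorContext) {
    return function (this: unknown, ...args: any[]) {
      console.log(`calling ${String(context.name)}`);
      return value.apply(this, args);
    };
  }

  class Greeter {
    @logged
    greet(name: string) {
      return `hello, ${name}`;
    }
  }

  console.log(new Greeter().greet("world")); // logs "calling greet", then "hello, world"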

The only things typescript as a language is really lacking are:

1. Pattern matching (minor issue)

- 90%+ of pattern matching is really just a more concise switch/case syntax

- the remaining ~10% can be done by hand with mapping or just basic if statements

2. Immutable data-structures (minor issue)

- The `as const` suffix handles most of the use cases (see the sketch after this list).

- There is a Records/Tuples proposal which would give some better syntactic sugar and some run-time performance improvements too.

3. Trait system (minor issue)

- Typescript compiles down to Javascript which at the lowest level is actually a prototype-based language. Traits can be approximated using run-time mixins which alter the prototypes to essentially perform the same functions as traits.

- But, it would be nice to have compile-time rather than run-time support for traits.

- I think most languages are still only coming to terms with the fact that traits are often better than "traditional OOP classes".

4. Effects system (investigate)

- It's too early to tell, but having an effects system (à la OCaml 5) will probably lead to entirely new programming paradigms. I'm not sure where it will lead, but having native algebraic effects to replace try/catch/throw would be very nice.
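
Following up on the `as const` point in item 2, a tiny sketch of how far that suffix gets you:

  const config = {
    retries: 3,
    hosts: ["a.example", "b.example"],
  } as const;

  // Everything becomes readonly and narrowed to literal types:
  //   { readonly retries: 3; readonly hosts: readonly ["a.example", "b.example"] }
  // config.retries = 4;     // error: 'retries' is read-only
  // config.hosts.push("c"); // error: 'push' does not exist on a readonly tuple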

Edit: Formatting


As I said, the primary reason I wanted macros was for the DSLs they make possible, not their use in application code. For example when I’m working on a Phoenix app, I very rarely write a macro, but I get a great deal of benefit from Elixir’s macros, since they make Phoenix’s router DSL, Ecto’s database query syntax and the standard testing DSLs possible.

I disagree that pattern matching is "really just a more concise switch/case syntax" and use it regularly in function heads and in parsing nested data structures.

More importantly, I reject the premise that the four items you mentioned are the only things really lacking or even that the problem with TS was a lack of features as opposed to its bad features, superfluous features and poor design decisions. For example, I'm not a fan of any of the following:

- loose typing

- multiple ways to declare variables, with different scoping rules

- multiple ways to declare functions, with different treatment of "this"

- the ability of code in any module anywhere in the system to affect code in the module you're looking at without leaving any evidence of this in the module you're looking at

- the ability of functions to alter parameters they're called with

- the sort function in the standard library being destructive by using the above

- etc...

That said, JS/TS really is missing some important things that didn't make your list:

- a good concurrency model

- macros

- compile time guarantees about references

- list comprehensions (actually minor)

- a sufficient standard library

- etc...


Fwiw, the function syntax in Typescript/Javascript is generally expressive enough to use as a basis for DSLs, and not needing to use macros for this also has a lot of benefits. I'm a big fan of Rust, but I find often library authors overuse custom syntax in macros when just writing functions would have been easier, just as expressive, and much simpler to integrate with an IDE. Where macros do exist in Javascript-land, they tend to be either to implement upcoming features, or as optimisation tools (e.g. converting CSS-in-JS expressions at build-time rather than run-time). The one exception here is probably JSX, but even there lots of people prefer to use function calls instead.

But for things like testing blocks, configuration, HTML/CSS generation, etc, it's generally enough to use the language by itself.


If JavaScript is “expressive enough to use as a basis for DSLs”, why do you suppose it is that various explicit attempts to recreate RoR in JS, going way back to SailsJS have failed to recreate its router DSL or even similarly terse models or controllers?

Of course not everyone wants a Rails-inspired MVC framework experience, but why do you figure even those who did couldn’t achieve the same level of syntactic sugar?

(I know Ruby doesn’t have macros either, but it does have enough expressiveness in a few other ways to do some things Python and even JS can’t)


It's interesting that you mention Python, because obviously the RoR equivalent there is Django, which leaves something to be desired when it comes to DSLs and expressive APIs. Symfony and Laravel also come to mind here, and I guess Spring in the Java world. I think it's difficult to argue that the creation of a major web framework hinges on being able to express things in a nice DSL.

Rather, I think the truth is that server-side Javascript tends to be used in a different way to Ruby - generally Node projects that I've worked on are much smaller (and often coupled with other similarly-sized services), and are often just minimal APIs designed to act as a middleman between a database and a front-end application. It's not an issue with missing expressivity, but rather a tendency not to need (or at least, not to want) the full-featured powerhouse that is RoR.


These comments are getting a little large, so let me break it into topics:

# DSLs

I'm not familiar with a Phoenix development workflow, but judging by what you've written already, it seems that DSLs are the main sticking point for you.

If I understand correctly, you want to use them to minimise the amount of code you need to write and/or to guide the project to follow a specific pattern.

I already see this being used in the js/ts sphere:

- NestJS uses decorators for routing and to make the MVC pattern more terse: https://stackblitz.com/edit/nestjs-typescript-starter-pcysqn...

- AdonisJS uses decorators for its ORM: https://docs.adonisjs.com/reference/orm/decorators#column

---------------------

# Pattern matching

Can you give an example of what you mean by "use it regularly in function heads and in parsing nested data structures" in regards to pattern matching?

I'm under the impression that it's just a shorthand switch/case that uses filters as conditions.

The current ecmascript proposal for pattern matching (which would end up in js and ts if passed) seems to be inspired in part by Elixir/Erlang.

---------------------

# Features that you're not a fan of

One of the things that makes TS so interesting is that it's a broad language. It doesn't impose a single way of developing on you.

You can start a project based on how you know how to develop.

- If you already know about functions and typing systems then you can start programming right away.

- If you only understand 90's style OOP then you can start programming right away

Developers should have the option to define the programming styles of the project that they create.

But to go through your list:

> loose typing

I don't see this as a negative.

Those that want strict typing use strict typing, those that want loose typing use loose typing.

> multiple ways to declare variables, with different scoping rules

The `var` keyword is only there for backwards compatibility.

But, if you do try to use it in a new project, you'll be alerted immediately by eslint and you can change it to `let` or `const` depending on your intention.

> multiple ways to declare functions, with different treatment of "this"

All this comes down to is if you want to do OOP (function keyword), or functional (arrow functions).

Use the one that suits you best.

>the ability of code in any module anywhere in the system to affect code in the module you're looking at without leaving any evidence of this in the module you're looking at

What do you mean by this? It sounds like you're describing side effects in general, and nothing to do with modules.

> the ability of functions to alter parameters they're called with

What do you mean by this?

When you pass by reference, you get the same reference. When you pass by value, you get a copy of the value.

If you want to alter the original source of something directly, you pass by reference.

If you want a temporary copy of a value, you just pass the primitive value and you'll get your own copy of it scoped to a function. Now you can make whatever changes you want to it, and when you're happy you can return the final value. This has the added benefit of the outer value not being stuck in a half-changed state if the function throws in the middle of its execution before the changes to the value were finished.

> the sort function in the standard library being destructive by using the above

I'm not sure what you mean by this. This is just pass-by-reference vs pass-by-value again.
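
For context, the complaint is that Array.prototype.sort sorts in place and returns the same array object rather than a copy; a quick illustration:

  const scores = [3, 1, 2];
  const sorted = scores.sort();   // sorts *in place* and returns the same array
  console.log(scores);            // [1, 2, 3] -- the original was modified
  console.log(sorted === scores); // true
  // a non-mutating spelling: const copy = [...scores].sort();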

---------------------

# Features that you think are missing from JS/TS

> a good concurrency model

JS/TS already has an event loop, async/await functions, pull-style generators, and threads.

One of the things I suggested in my previous comment (but marked as needing further investigation) was an effects system. They can be used to make so-called "colorblind" functions which can be valuable for some async workflows.

> macros

I don't see how these are different from decorators. Unless maybe you're talking about compile-time macros, in which case that's more in line with metaprogramming and things like extending the language with your own (re)compilers like babel.

> compile time guarantees about references

Isn't this what `readonly`, `as const` and the `satisfies` operator already achieve in TS?

> list comprehensions (actually minor)

Yeah, this is minor. Just syntactic sugar, nothing more to say here.

> a sufficient standard library

Yeah, a lot of early weirdness was caused by not having a standard library. But now we have things like lodash and deno's std.


What About Named Parameters?

I'm surprised Typescript has not tried to add this major missing feature from Javascript.

I know you can hack this using an Object, but for me Named Parameters are so useful for code readability and intent - especially for calling functions, to make it obvious what's happening and so that a different order of arguments doesn't cause bugs.
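
The object "hack" mentioned above, for reference (a small sketch; the function and field names are made up):

  // Emulating named parameters with a destructured options object
  function createUser({ name, isAdmin = false }: { name: string; isAdmin?: boolean }) {
    return { name, isAdmin };
  }

  createUser({ name: "ada", isAdmin: true }); // the call site reads like named arguments
  createUser({ isAdmin: true, name: "ada" }); // argument order no longer matters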


Immerjs is close enough to language level immutable data structures that I don't miss them that much.

I'm hoping the proposal for match syntax finally lands. Something like Swift's protocols would be nice.

TS inherits some unfixable mistakes from JS but overall I find it a pretty comfortable and productive language.


> typescript's decorators

Typescript has decorators?!!

Damn, I wish I'd known about this 3 months ago.


Decorators in TypeScript is still an experimental feature which may change depending on if/how it's implemented in JavaScript.

> To enable experimental support for decorators, you must enable the experimentalDecorators compiler option either on the command line or in your tsconfig.json

https://www.typescriptlang.org/docs/handbook/decorators.html

> ..the current decorators proposal, which is a work in progress

https://github.com/tc39/proposal-decorators

Oh, but I see that the proposal is now at Stage 3, which means the specs and syntax are stable ("completely described") and ready for browsers to implement.

On further digging, it seems decorators will be a standard feature included in TypeScript 5.0 planned for release on March 14th.

TypeScript 5.0 Iteration Plan - https://github.com/microsoft/TypeScript/issues/51362


~3 months ago the TC-39 (ES/JS language steering committee) hadn't moved their decorators proposal to Stage 3 and Typescript hadn't implemented it yet.

Typescript supported a much older (and different) decorators proposal under an experimental compiler flag, but my momma always taught me never to use flags marked "experimental" in Production.


Shit, even something like ReScript. OCaml is weighed down by 30 years of cruft and it's not known for a simple type system either. Yet it's still easier to work with than TypeScript and you get more for it.


ES6 classes are still based on prototype inheritance underneath.


Afaik, it's still missing a high quality REPL which python can give you. Extremely useful for exploratory coding.

Otherwise I have to agree, Typescript is a pretty damned good compromise.


Deno’s REPL is quite nice. It’s syntax highlighted, supports Typescript and autocompletion as well as creature comforts like enter opening a new line when the cursor is in between quotations. You can also pass a file to load and then work interactively from that. The URL-based imports allow you to paste code from another project and not worry about `npm install`ing anything.


You can get pretty close with Vitest, a unit test as a REPL, and Vitest's ability to run tests as soon as the file changes.


I never understood why people feel that typescript has a learning curve. I never formally learned typescript; I just started using it intuitively. I already knew javascript, and I already understood how types work from using other languages, so there wasn't anything to learn.

When people say typescript is hard to learn, do they know javascript, or are they starting from scratch?


I had a similar experience where I was able to pick up most of it just by tinkering around with it, but I ended up tripping over a few places where I expected it to behave a certain way and the type system was a little too weak, or where the syntax for some of the advanced types was too unique to guess how it works without some studying. I made a blanket statement about the curve too, since I feel like the community as a whole is still adopting it. Library authors adopted it early, but the average front-end dev (judging from podcast topics and documentation I see) seems to be really getting serious about learning it in depth. YMMV.


It has an insanely complex type system. It's kind of incredible, I don't know of any other type systems off hand which are totally turing-complete. That's honestly really impressive.

I don't think you need to use most of these types until you start getting into stuff like needing union types. You can get away with reading and writing most typescript without getting too crazy.

I think the most compelling thing I read about type hinting is that it's like salt - A little and the dish doesn't taste the same without it but too much and you've ruined the thing


OCaml, Haskell and C++ (with templates) for example.


Same.

Python's type hints and Ruby's Sorbet (or their weird RBS thing), I can understand.

But TypeScript? Besides a few things that are not obvious at first (like callable interfaces), I don't think you really need to think about most stuff.


What are you comparing it to and what is lacking in those other languages?


Good question. To be fair, JavaScript was my first "real" programming language, so my brain has adapted the most to its weirdness. That's probably why TypeScript is very useful for me. But to answer your question, I've been using C# primarily for the last 4 years, and I dabble in other languages like Rust and Python. I used C++ in school (that's what they used for programming classes if you'll believe it). TypeScript feels pretty nice, but I do adore C#. I wish Microsoft would make a T# (as someone else put it once).


I love deno and all of its ideas highlighted in the article (dependency management, security, performance) so much that I've built an entire framework around making scripting enterprise grade, centered around deno support. The project is open-source [1]; we even expose a web Monaco editor and made the LSP support available using websocket and json-rpc. The goal here is to build enterprise-grade one-off scripts and workflows in deno, that can benefit from permissioned credentials, autogenerated UI (by parsing the main function parameters), a workflow engine similar to airflow/temporal, etc ...

We have a hub [2], centered around deno, that serves as our library of integrations. An integration, for us, is just a script that uses the right dependency to do an atomic action like fetching data or doing a POST.

We are betting big on deno, and are hoping windmill will be the framework that makes it enterprise-ready for things other than web servers (which most Deno frameworks currently focus on).

[1]: https://github.com/windmill-labs/windmill

[2]: https://hub.windmill.dev


While trying to use the public instance, finding created scripts was kinda hard: they're under "Home". When a script is being viewed, "Home" is not highlighted, and it seems like one of the last places to check.

Also, starting with an empty workspace and importing examples from the hub manually might be less confusing. With public workspace having all these examples, "Home" is cluttered and it's not obvious that new scripts also end up there.

Overall it's very impressive, thanks for open sourcing!


Just wanted to say I bookmarked windmill a couple weeks ago and am really excited to try it out.


Very cool project, I bookmarked it as well and excited to try.


It's great to see other JS runtimes progressing, I have a continued frustration with the lack of decent TS tooling and ESM support in Node.js.

Jest and Mocha both do not support ESM very well out of the box and there is an expectation that you are using babel to transpile to CommonJS for everything to work nicely. (See Jest's _experimental_ support for ESM mocking!). Node.js is only starting to build out a testing framework, and it has a way to go yet until it has any level of feature parity with the established libraries.

Then you have the naive assumption that any package written in JS works with TS. Which is true, they do work, but the missing types will always lead you to look for a TS-first alternative. A good example of this is Joi vs Zod for validation, with the latter performing type inference based on your validation parameters, which is great. But it really does feel like you have a subset of a package ecosystem lurking on NPM.

I am not that confident Deno is the answer, but I would say those are my biggest frustrations writing production grade JS/TS code today.


One particularly nice aspect to Deno is its built-in test runner. There’s no complex configuration, like grabbing ts-jest, setting up your transform filters, ensuring that you’re injecting `@jest/types` and then almost invariably spending time debugging precisely why it still can’t grok an export keyword.

The Node testing ecosystem has its obvious advantages in terms of how much it’s been buttressed by third-party efforts (which makes it incredibly flexible), but just being able to write a TypeScript test file, run `deno test` and just have it work is really a breath of fresh air.
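
A minimal example of what that looks like (the std version pin here is just illustrative):

  // math_test.ts -- picked up by `deno test`, no config needed
  import { assertEquals } from "https://deno.land/std@0.177.0/testing/asserts.ts";

  function add(a: number, b: number): number {
    return a + b;
  }

  Deno.test("add sums two numbers", () => {
    assertEquals(add(2, 3), 5);
  });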


node has a built in test runner too, but i've not used it yet.


Recently stabilized in v18, yes, and it’s pretty great, but it doesn’t support TypeScript without a transpilation step, so you’ll still need to implement a build pipeline before you’re able to run a test.

For pure JS projects it’s quite nice though.


ah indeed. it should :(


Have a look at https://github.com/esbuild-kit/tsx

tsx is a CLI command (alternative to node) for seamlessly running TypeScript & ESM, in both commonjs & module package types.

It's powered by esbuild so it's insanely fast.

I also suggest using its alias: esno


Your point about the "TypeScript subset" of the package ecosystem is good. However, I've found TS support (native or via @types/) tends to be a strong indicator of package quality.


I tried Deno and really liked it. The pros:

- It uses the v8 engine so you've got the same APIs you know and love for the browser like fetch() - Something that nodejs is just catching up to now

- It's not nodejs, the whole name means 'destroy node'. You can get a lot done without being buried in dependencies. Well-maintained core libraries.

- Server-Side WASM is a really interesting idea for encapsulating and integrating stuff written in, say, C++

The cons:

- Some of the third party ecosystem seems to be suffering from a lack of recent maintenance

- Server-side WASM is limited to 32-bit until they figure that out, which also limits whatever you do with it to a 4GB memory ceiling

Other than all that though it's probably not a bad choice for anyone not bothered by those limitations and just wants everyone who speaks typescript on the frontend to be able to grasp what's on the server side end of things too.

I think it could probably thrive with more usage traction.


Node.JS and Deno are both running with V8. fetch() isn't part of the JS standard library, it's a Web API. Deno just made (the good imo) design decision to support WebAPIs over something more custom despite the fact that it's not a runtime for web apps.

Server-side WASM is the future for FaaS imo, sandboxing built in and an effective object-capability approach to dependency safety is amazing.


fetch is also an API in node (last few versions)

https://nodejs.org/dist/latest-v18.x/docs/api/globals.html


Hello Java servlets.


Fetch is not included in v8. If it were, node would have had it earlier...

Deno doesn't mean 'destroy node', it's just node reordered.


i think the name is actually derived from 'node'.split('').sort().join('') but i like "destroy node" a lot better lol


judging by the comments here everything I knew was apparently a confabulation


Explanations that are visceral and have narrative power tend to be believed and spread much more widely than the true, boring explanation.

A classic "true fact" that's false: Hey did you ever hear that people in england used to brick up their windows due to a window tax? (true) Thís is the origin of the phrase "daylight robbery" (false)


Node also uses V8, the lack of API parity with browser is unrelated.


Another pro in my book: it "compiles" to a single binary. I was writing a small side project and when it was time to deploy, it was as easy as to push the binary to my server via `scp`. That single binary also includes a copy of `sqlite` since it is a WASM dependency. Also, cross compilation. I can compile it from my m1 mac for an x86 linux server.


This and the built in formatter make me feel like Deno’s developer experience was directly inspired by golang


I mean, the standard library is inspired by golang's[1], so I would assume that other parts of their design are also inspired by go.

[1]: https://deno.land/std@0.177.0#contributing


> copy of `sqlite` since it is a WASM dependency

How's the performance vs the ffi version? I'm about to use deno with sqlite and have to decide which one to use. The lack of WAL in the WASM version gives me pause though.


> How's the performance vs ffi version?

Can't say because I haven't compared them. This is a small personal project that will have less than 10 users, so I'm not that worried.


I'm a huge fan of it personally. To me, relative to Node the way this handles feels, broadly, closer to a browser -- by leaning on Web standards there's a variety of knowledge transfer between client-side work and server-side for things like packaging, API calls, and various other functionality; and it builds in enough basic tooling that I don't have to find or install or develop opinions on external libraries for handling anything from watching files to using environment variables to running tests (if I necessarily cared to do that with personal projects).

The limitations that stick out most to me personally are that there's still no ARM support, and that, while the Node compat is getting a lot better, it's still hard to get a clear sense of where it is for individual packages -- what runs natively/is isomorphic, vs what'll run under a compat layer, vs what just won't run at all.


32-bit ARM you mean? I’m not a big user, but it seems to be native on my ARM mac.


That's less of a distinction with 32- vs 64-bit for me (though I know that does also pose issues for, say, users of any Raspberry Pi other than the 4B/CM4) and more that there just aren't any first-party ARM builds for Linux.

(That said I can't really speak to macOS here, as I really only use Macs when workplaces require me to, which also doesn't necessarily translate to having sudo access.)


I do wonder how it compares with a more suitable server language like go. Being better than js is a low bar


Personally, given a preference I like go more, but sometimes you just need everyone on the frontend to be able to read your code on the backend without needing them to spend a week learning a new programming language


As a frequent js/ts avoider currently using deno and deno deploy for a few bits and pieces, I've found both truly pleasant to use. My needs are simple (cut to fit my js ecosystem ignorance) but given those limits, the only problems I've had so far are with some npm deps that don't work (&/or I don't know how to make work). No big issue as there's often some deno-proper equivalent, or if not, some other npm package that's OK.

Usually when I do anything in the node/npm world I have to steel myself. But Deno & Deno deploy are actually fun.


Agreed. Deno Deploy is actually a great product. I'm using it for all of my side projects now.


I'm a big Deno fan and use it whenever I can. I think tasks are great and they solve the most common needs.

In my experience, however, as soon as my tasks section got large enough that a separate file was needed, it became long-running enough that I wanted generic, file-dependency-aware builds, which required a make-like solution. What should large projects that need to build assets or perform long-running work as part of a build process do without dipping into another clunky tool?

I have similar issues with threading development-only import maps that reference local modules in a polyrepo during development, and with sharing common Deno configs. Customization doesn't scale well to task one-liners even though I want it to. Do folks know of an open-source polyrepo reference that serves as a paradigm for patterns that scale?


> Deno comes with a built in deno vendor command, which fetches all of the dependencies of the current project and puts them into a subfolder, making production deployments immune to dependencies’ hosting failures.

While this is certainly nice, should third-party code be actually committed in your version control system? I always treated it like a bad idea. Adding node_modules to gitignore is second nature now.


If you work a company with a large monorepo, you often want to vendor all third party dependencies in the tree — you don’t want the build of this massive repo to be down because one of the five package managers it depends on is down. And if there’s a single vendor folder used by all projects, you’re certain that all parts of the tree are using the same version of the external dependency.


What is wrong with

  ["10","10","10"].map(parseInt)

?


It passes the value and index to `parseInt`, where the 2nd argument is `base`. So it does not return `[10, 10, 10]` as you'd expect. It returns `[10, NaN, 2]` instead.

You have to do `['10', '10', '10'].map((val) => parseInt(val, 10));` to get the "expected" output. In addition, you should always provide `base` to `parseInt`, otherwise it has its own interpretation inferred from the value you give it.

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...


or just this:

  ['10', '10', '10'].map(val => parseInt(val))
Leading zeroes can often be considered invalid input, and you may be able to safely assume that it has been dealt with already.

Also the parens around a single argument of an arrow function are a superstitious thing suggested by some well-known react developer a while back, I think.


I have my linting setup to force a base on `parseInt`, for the extra characters I do feel it's worth it for the safety.

The extra parentheses is just habit from Prettier formatting :)



Also, no reason to write extra characters just to prevent people from using hexadecimal, it's often convenient and it comes for free!


Octal is sneakier than hexadecimal, especially if the values are from user input.

['07', '08', '09'].map(val => parseInt(val)) // => [7, 0, 0]



Plus there's this if you really want to support old browsers: https://github.com/zloirock/core-js/blob/ce52fdc735c5c809c9e...


was

['07', '08', '09'].map(val => parseInt(val)) // => [7, 8, 9]


My favourite interview question is asking what `"192.168.0.1".split(".").map(parseInt)` would return and why.


That seems like a very bad interview question


Can you elaborate on why you think so? I'm not asking for the correct answer to this question, of course (indeed, if the candidate answered correctly it would mean they already knew it and the question had no point). I expect to confuse the candidate by providing the correct answer and then asking them to figure out why their assumption differs from it. This is common in software development, and observing how someone tries to understand surprising code provides valuable insight into their skills.

Another question that I like is asking the candidate to find issues in a fragment of code. The fragment is crafted to contain style issues, some bugs and architecture issues. This helps gauge the candidate's level by which issues they can spot. Not ideal, but I think it works well enough.


I think the assumption was that you were asking for the correct answer. Definitely makes sense after you explained it though (at least, it makes sense to me.)


I didn't know about that one. What the hell, JavaScript?


map passes the index as the second argument to the mapping function, parseInt accepts the second input as the base. So the result is [10 (0 is treated as 10), NaN (base 1 doesn't work), 2 ("10" in base 2 is 2)].

You can try ["10","10","10"].map(console.log) for more info... the third argument is the source array:

   > 10 0 ['10', '10', '10']

   > 10 1 ['10', '10', '10']

   > 10 2 ['10', '10', '10']


Whoops, of course it does!


> ["10","10","10"].map(parseInt)

Click the 'Run' button:

https://www.typescriptlang.org/play?#code/MYewdgziA2CmB00QHM...


This is a classic in the "Top 10 reasons JS is dumb" type articles.

But yeah, it is just a case of map providing 3 args to the function (element, index, array) and parseInt using 2 values (string, radix), so we have element->string (yay!), index->radix (oof!) and array->ignored (meh!)


IIRC `map(parseInt)` is equivalent to `map((value, index) => parseInt(value, index))` (my argument order may be wrong), which is `parseInt(string, radix)`, so you get a different numeric base depending on the array index.


Actually it's even `map((element, index, array) => parseInt(element, index, array))`. In this case `parseInt` does not use the third argument, but with other functions it might cause even more chaos.

The `Array.map` function API was badly designed. I'd like to know who I should blame for that design.


I don't think the API is that bad, just that writing "point free" in any language with loose typing is dangerous.


I fully agree with the author here.

The good old Unix philosophy of designing simple tools that do only one thing, and do it right, has proven extremely useful in the context of using the shell as a scripting interface.

Composability is paramount to be able to glue small cogs together to quickly build a larger machine for ones needs.

But this way of thinking should not apply to everything, especially not when the tool itself is designed to be used as a direct interface with the user (like a shell).


A lot of complexity in computer science stems from duplicating functionality due to developers not wanting to learn the best tools. The best tools when it comes to implicit dependencies are Nix/Guix. These have solved the issue across languages for almost two decades. Unfortunately, they have a reputation of being 'hard', so instead of someone attempting another try at the actual problem (cross-language implicit dependencies), what we have here is yet another javascript runtime that 'solves' this problem, but only if you're using javascript.

Deno sounds well thought out, but the article realistically presented no compelling reason to try it.

And realistically, it presented many reasons not to. Javascript is a horrendous language. Not because of the syntax, not because of the library support, but simply because even self-dubbed javascript 'experts' are often confuddled by its standardized behavior. We could probably count on several fingers the number of people who can accurately describe javascript's semantics. As the rest of the computing world moves towards sane languages, this seems like backwards 'progress'.


Nix still has issues with things going out of date / falling out of support.

One of the good points about deno raised in the article is that deno removes that problem by having first-party solutions to things that developers usually need built-in.

Also, while JS has plenty of footguns, Deno being built around TS, and Deno TS developers using ESLint as standard practice, eliminates those footguns.


An incredibly expressive and powerful functional language... With no type system or static analysis to speak of? Hard nope, that's the express path to write once, read never.


The reason to try it that I took away was that it's easy to install and then you get all these niceties - LSP, formatter, scripts, with no extra effort! I love that!!!


Nix flake init does this for just about any language.


Ah and yet nix itself does not yet have a particularly useful language server, to my knowledge


So good that I never saw them in production, not even on the list of officially supported Linux distributions for commercial products.


Really nice. I've only seen Deno as something hard to adopt for very little reason, but this gives a really great reason!



Making things harder for developers “because it’s better for them” rarely works out.


At this point, I would love to see a compiler for a strict subset of TypeScript with a threading model similar to that of Go.

The biggest thing holding back TypeScript is the awkward transpilation dance, a weak standard library in Node and relatively poor performance - none of which are the fault of the language.

Break free TypeScript, you gotta move out of your parents' place eventually


AssemblyScript is a bit like that: https://www.assemblyscript.org/

But it goes into a slightly different direction from what you want (the main motivation was a "stripped down TS for WASM", not a "stripped down TS with go routines"). It will be interesting to see how they'll tackle multithreading (now that WASM threading is out in the wild).


Microsoft has a native compiler for a Typescript subset on their MakeCode IoT project.


> ["10", "10", "10"].map(parseInt)

[ 10, NaN, 2 ]

what... the... fuck...

I have seen the WAT video but forgot about this example.


Not that surprising when you remember that map and parseInt both deal with more than one argument.


That example isn't in the WAT video (I just re-watched it).


It would have been a pretty long talk if they had included all the WAT


Tried out Deno recently! Having a super powerful REPL is really important to me, so I was trying chrome://inspect connecting to Deno. But I guess that when you do this, you have a JS REPL rather than a TS REPL, is that right? What do people usually do for that?


I can't think of a situation where I'd need my repl to be statically typed, since it's for disposable code; I'm not even sure I would want it to be. Did you have a case where that would be helpful?


I suppose step one might be "being able to paste some TypeScript code without generating syntax errors"? But it'd be interesting to see extra completion options or type errors surfaced, possibly even asynchronously after you've already run the code, letting you know about a mistake you might be making in your disposable code.


That's fair, I could see that


After reading this, if anything, it made me appreciate containers more.


I've recently been playing with Deno. It's early days yet but for all the general scripting tasks I might use Python for, I find Deno a better fit. First class typescript is a killer feature, as is the fact that there is a semblance of stdlib which node simply doesn't have.

My two main quibbles about Deno are:

1. The rather pointless security model that seems about as relevant as it did for PowerShell.

2. It's quite young, there's still some pointy edges.


The whole point of deno is that you can’t use it for general scripting without jumping through a bunch of hoops and then it isn’t portable.


What hoops? I am using it for general scripting and very happy with it.


I would like Deno or Bun to be easier to compile than Firefox or Chromium: Less resource-intensive. Less CPU, less memory, less storage space and less time required to compile.

The present: Use an enormous, complex, sometimes closed source, omnibus program containing a Javascript engine to remotely source and run Javascript files, i.e., other peoples' programs, listed in webpages (sometimes cascades of them as dependencies) in a way that is non-transparent to the user, allowing selected access to certain features of the user's computer, all mediated and controlled by Big Tech with its dependence on advertising and associated data collection and surveillance and conflict of interest with respect to users.

The future: Use a standalone Javascript engine to remotely source and run other peoples' Javascript programs in a way that is transparent to the end-user, with all access to the user's computer, if any, under the full control of the user, with no dependence on advertising and no need for data collection and surveillance.

"Javascript" might be replaced with some other language. Typescript, WASM, whatever. In the earlier days of the web before Javascript, people tried to use "Java applets". Back in those days when an applet was encountered the browser would ask the user for permission to run it. As I remember it, this was about as annoying as cookie permission popups, another idea that was implemented in the earlier web then disappeared, but has now been re-implemented due to emrging privacy laws.


Isn’t the JavaScript runtime the largest single component in a browser? Is there really that much fat to trim?


From all of the quirks JS has, I think

["10", "10", "10"].map(parseInt)

is the sanest. Taken in isolation, the parameters of both Array.prototype.map and parseInt make sense, and a typed language wouldn't help there too much. All hope is on ESLint rules to not let you take shortcuts which could burn you.


[...] and a typed language wouldn't help there too much.

That would be impossible in a typed language. The argument of map() is supposed to be a function with signature (T, Integer, T[]) -> R and parseInt() has signature (String, Integer) -> Integer.


Yeah the third parameter would by chance save the situation here but it's not a rule that you'd always be protected. Also it would be perfectly fine to cast parseInt to (String, Integer, _) in a sufficiently smart type system.


The third parameter, I would argue, is a bit weird to begin with. If you need the entire collection, then you are probably doing more than map(), and if you really need it, then you could probably just close over it. But with that somewhat weird third parameter removed, you are right, that is something you could easily run into in a typed language.


As usual, a bit of a love-hate relationship with TypeScript. What TS has done to JS, though, is absolutely incredible. I wonder if it's possible to abstract the engine, somehow, and apply it on top of a new language. I would love TS without all the warts of JS.


One of the frustrating thing with npm-scripts is the mandatory one-liner script. It is not possible to write a script on several lines.

I would really appreciate it if deno-task-shell could be used in a shebang for writing scripts that span several lines in dedicated files.


this was an interesting read, although the beginning was a bit confusing.

I laughed a little when I saw the author hinting at TS creating complex structures. A reason I never adopted it. A great use of TS would be to type the external interfaces though!


Definitely this is a weird take to me because most web developers just use a Mac and then setting up an environment is fairly easy with brew and nvm. I haven’t tried but I imagine linux is also pretty straightforward. If you’re doing web dev on Windows I think it would behoove you to use WSL rather than mess around with the weirdness around path limits and the like.

However, the security issues are valid but not necessarily an issue solved by the framework. Ultimately some kind of package manager that had a manual review step before packages could go out would be a nice step, and would work at scale as well.

Manually listing permissions for an application works for security until people get tired, or are just so used to saying yes. It's also very bold of deno to assume their sandbox is impossible to escape. It's much more likely that no one cares to really try.



What Deno features are tied to TS? Can you eg use the "vendor" feature (or hook it) if you wanted to use it with other languages that compile to JS or Wasm, like ClojureScript or Rust?


> post ES6 [...] has some really good parts, like injection-proof template literal semantics.

What is injection-proof template literal? Any link about that? Thanks :)


https://julialang.org/blog/2013/04/put-this-in-your-pipe/#do... for what is this abstractly and why that is needed.

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe... for specific implementation in JS. Compare with f-strings in Python, which are superficially similar, but can’t be used to, eg, construct SQL or HTML not prone to injection attacks.


Seems like going in to more of a .NET land....


One issue that's been bothering me about Deno is this one:

https://github.com/denoland/deno/issues/14244

You have to use a third-party Docker container if you want to use Docker locally and develop on an M1 Mac (I use Docker to separate my different dev environments). And the issue was just ignored and became stale.


You don't really need Docker even with Node (because it has good dependency management), and especially with Deno (good built-in tooling too)


> You don't really need Docker even with Node

I'm assuming you've never run into node-gyp issues then?


In my experience they're pretty rare. Not many Node packages need a native build (mostly node-sass in my experience, which has now been deprecated in favor of a Dart implementation which I believe compiles to JavaScript), and I don't think Deno even supports importing non-JS/WASM modules to begin with, so that type of problem should be fully impossible here


I usually see these binary dependencies in node projects:

- sass

- imagemin-gifsicle

- imagemin-jpegtran

- imagemin-optipng

- imagemin-svgo

Deno does support native plugins, so we could see the same thing happen there.


I think there's still a meaningful difference between runtime plugins (which to my understanding have to exist locally on the file system, outside of normal dependencies) vs normal dependencies that just sometimes have a native component. Deno projects won't occasionally accidentally have a native dependency like Node projects do; adding one will be a Big Deal that you choose to do on purpose, probably reserved for extreme cases, and for those special cases maybe you also choose to bail out and use Docker


Dart has had AOT/JIT support since version 2.


That's "just" the lack of an ARM64 image. That is annoying and they should fix but the workaround (of using someone who has built the image) isn't unreasonable.

You still see this frequently in a number of ecosystems (eg https://stackoverflow.com/questions/70061068/could-not-find-...)


Yeah, they also can't produce musl libc binaries, so no Alpine Linux either.

Regardless, my experience on the platforms it works on has been very positive.


Looks great, I just hope they get to release an arm64 version for linux soon as well.


I'd love them if they fixed the import/require module mess.


I don't know, Deno seems to be a solution looking for a problem. For example, the built-in bundle command works for Deno itself and only "might" work for the web; other solutions such as esbuild are recommended instead.


> Deno seems to be a solution looking for a problem

On the contrary. Ryan Dahl, the person who created node, came up with deno, specifically to address a number of problems (or regrets) he had with node: see https://youtu.be/M3BM9TB-8yA.


Yes, I know that. But Dahl's regrets are not my problems, really. Remember that Dahl is NOT the author of npm, and without npm, where would Node be now? Probably nowhere.

For example, what is the problem with calling the TypeScript compiler? I have a build setup anyway, so why does it need to be integrated? That's an example of a solution where I don't see the problem.


> without npm, where would Node be now? Probably nowhere.

I disagree. There were and still are good competing package managers and registries. Perhaps, in an alternate universe, yarn might have won. Or in yet another universe, node would have package management built-in, as deno does now. In any case, node would be fine.

> what is the problem with calling the TypeScript compiler? I have a build setup anyway

Just because your project has a build setup, doesn't mean all other projects must have one, too...


No, projects can do what they want, but if having a build setup is the biggest problem you need to solve, you are pretty lucky.

Anyway, my real problem is to be able to easily create modules that I can use everywhere, on the backend, and in the browser. I don't think Deno makes that easier than Node, because for the web, I still need to add a build step using esbuild. So for me, Deno is pretty pointless. But that's just my opinion.


That kind of import means private packages are not allowed? So no more private packages in Deno?



Does it have guidance on publishing your own package to GitHub?


Not specifically for GitHub, but:

https://deno.land/manual@v1.30.3/advanced/publishing

The point seems to be that it doesn't need anything special, you just import files, and Deno takes care of the rest.


As soon as Deno introduced me to the idea of just using URLs for importing dependencies, it immediately felt like obviously the best way

Want a private repository? Stand up an apache server. Or an S3 bucket. CDN in front? Sure, why not

Want to casually host and not deal with a repository? Import directly from github

Namespacing? Domain names. And they already have ownership controls/auth that your ecosystem doesn't then have to reinvent

Combine that with import-mapping (being able to tell the runtime to load X when it sees Y), and I don't see any reason to do it any other way
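
A rough sketch of what that combination looks like in practice (the host, package name, and version below are hypothetical):

    // main.ts: import straight from any HTTPS host
    import { greet } from "https://example.com/libs/greet@1.0.0/mod.ts";

    // import_map.json: tell the runtime what to load when it sees the bare prefix
    // {
    //   "imports": { "greet/": "https://example.com/libs/greet@1.0.0/" }
    // }
    // ...after which `import { greet } from "greet/mod.ts"` resolves to the same
    // module, and repointing "greet/" at a mirror later is a one-line change.

Deno reads the map via the --import-map flag (or its config file), so the URL-based scheme and the indirection layer compose cleanly.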


One potentially compelling reason is dependency resolution. You can’t express Cargo-style “collect _requirements_ for all dependencies, and then pick the minimal set of dependencies which satisfies all requirements” with just URLs. You can potentially encode fancy constraints for each specific dep into a URL, but you can’t have an algorithm which globally looks at the whole set of dependencies. If you want that use case, you’ll need some extra tooling which reads the constraints and writes the import map.

OTOH, it’s not clear if fancy constraints solve more problems than they create.

Notably, if the only constraint is `^x.y.z`, then the greedy, expressible-via-URLs algorithm of always selecting the latest semver-compatible version yields the minimal solution. (For instance, if one dependency asks for foo@^1.2.0 and another for foo@^1.4.0, the latest 1.x release satisfies both with a single copy of foo.)


Yeah that's fair. Personally I'm not too worried about that use-case, but it is something that would be tricky to solve with URLs that get treated as black boxes


Yes, the only requirements/issues, as I see it, are:

- Need a fallback plan if the original hosting dies

- Need to specify an immutable version/hash for the URL


Import maps take care of the first, like I said: https://deno.land/manual@v1.30.3/basics/import_maps

And Deno supports lockfiles for the second: https://deno.land/manual@v1.30.3/basics/modules/integrity_ch...

It also has a vendoring mechanism, if you prefer that approach: https://deno.land/manual@v1.30.3/tools/vendor
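
Roughly, the latter two look like this with the CLI at the manual version linked above (a sketch; the file names are placeholders and flags may differ in newer releases):

    # record hashes of all remote imports, then verify them on later runs
    deno cache --lock=deno.lock --lock-write main.ts
    deno run --lock=deno.lock main.ts

    # or snapshot remote deps into ./vendor (also generates an import map pointing at it)
    deno vendor main.ts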


Then it's cool. Same as in Haskell, where I could use multiple versions in the same module.


Any reviews of fresh.deno.dev?


Fun fact: I remember when I was at Google in 2017.

Ryan Dahl was there as well, as a junior engineer (L4).

L4s are typically engineers who graduated 1-2 years ago.

Meanwhile, Ryan had built a runtime that is used worldwide and is popular on the same scale as Ruby, Java, Python... But he was slotted as an L4.


I interviewed at Google pre-pandemic, and they wanted me to start at an L4 as well.


That's nuts, what was their reasoning?


I don’t particularly want to get into all of it at this moment, as there were a lot of weird moving parts. In the end Cloudflare gave me a very straightforward process and a better offer, so I ended things with Google after the in-person interview but before they actually extended an offer.

What I will say is that from what I hear this sort of under-leveling is pretty normal and just how it is. Maybe it’s better now, I don’t know.


> What I will say is that from what I hear this sort of under-leveling is pretty normal and just how it is. Maybe it’s better now, I don’t know.

I was offered a position with G after I left a previous employer. I also felt that I was underleveled and expressed this, then declined the offer.

This earned me a call from the SVP I would have been reporting to. The general philosophy they related to me was that G likes to see people perform _at Google_ and that leveling up is very easy, so if you're good, being underleveled isn't a problem because it will correct itself.

"If it's so easy and you're impressed with my track record and believe it qualifies me for the role, as you said before, you should be convinced I'm at the level you mentioned." They hemmed and hawed and bit and said they would see what they could do. Eight (!) months later, long after I'd already accepted another role and told them about it, they got back to me with the higher level.


I can't tell whether they're high on their own supply or just trying to get people to accept lower comp. The fact that they took 8 months to get back to me suggests the former.

As an SVP, why would you take time out of your day to try and convince a candidate to join the company only to low ball them? Makes no sense.

The whole "we'll level you up quickly if you perform" is almost always bullshit. It usually takes at least 1yr no matter what.


And the fact that the “level up quickly” promise is verbal means they aren’t bound to anything. Get them to put it in writing if it’s important to you _before_ you sign an offer letter.


It's not almost always bullshit. People just can't grok that their management doesn't agree that they're performing as stellarly as they think.


Generally you're ramping up for 3-6mo and then most companies want to see that you're performing at the expected level for at least 6mo. That's 9-12mo bare minimum.

If you were downleveled and they promised you staff+, then they are probably going to be more stringent, and there are also more factors out of your control (e.g. your team, org, manager, etc.), so good luck getting "quickly" promoted.


Then you also wait for the promo cycle, which is every 6 months. Getting promoted within a year is the best case.

In the normal case, you wouldn't be promoted, because there wouldn't be a big enough project for the next level.


When an organization tries its best to look and act like a government agency, what do you expect?


Doesn't seem better now. Maybe certain teams do better? Seems rampant.


You got trolled


They're clearly incentivized to underlevel people and then take whoever accepts, so how is it a troll if it's business as usual?


That is absurd. I joined with a few years of experience from FAANG, and they slotted me as L5 (senior).


> But he was slotted as an L4

Here's some context on that from Ryan: https://tinyclouds.org/residency

It does not sound like a typical Google software engineer position or one where leveling is particularly significant.


There are many types of talent, even in programming, and Google requires a very specific one.

As an extreme example, the best brain surgeon or businessman in the world would completely fail Google's software interview and be rejected.


I can guarantee you that the bar for L4 is neither that specific nor that high.

A new grad who has been at Google for 2 years would likely reach L4.


Deno #_#


Deno is awesome, but I'm not satisfied with the implementation. I wrote a very simple web server which listens for GitHub CI notifications (with Deno I implemented it all, including the crypto, with 0 dependencies, wow). We're a tiny company, so notifications come in about once an hour. It eats about 50 MB of RAM and about 6m of CPU all the time. A simple Go server eats around 20-30 MB of RAM and < 1m of CPU all the time. I just don't understand what Deno is doing all the time with that 6m of idle CPU. Hopefully it'll be improved in the future.

Also, I think this whole granular access thing is weird. I never wanted it; container isolation is more than enough. Hopefully it doesn't add much drag to Deno's progress.
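
For context, here is a minimal sketch of the kind of dependency-free GitHub webhook listener described above, using the std HTTP server and the built-in Web Crypto API; the std version, env var name, and handler body are illustrative assumptions, not the commenter's actual code:

    // run with: deno run --allow-env=WEBHOOK_SECRET --allow-net server.ts
    import { serve } from "https://deno.land/std@0.177.0/http/server.ts";

    const encoder = new TextEncoder();
    const secret = Deno.env.get("WEBHOOK_SECRET");
    if (!secret) throw new Error("WEBHOOK_SECRET is not set");

    // Import the shared secret as an HMAC-SHA256 key (no third-party crypto needed).
    const key = await crypto.subtle.importKey(
      "raw",
      encoder.encode(secret),
      { name: "HMAC", hash: "SHA-256" },
      false,
      ["sign"],
    );

    const hex = (buf: ArrayBuffer) =>
      [...new Uint8Array(buf)].map((b) => b.toString(16).padStart(2, "0")).join("");

    serve(async (req) => {
      const body = await req.text();
      // GitHub sends "sha256=<hex digest of the body>" in this header.
      const signature = req.headers.get("x-hub-signature-256") ?? "";
      const digest = await crypto.subtle.sign("HMAC", key, encoder.encode(body));
      // Plain comparison for brevity; a real server should compare in constant time.
      if (signature !== `sha256=${hex(digest)}`) {
        return new Response("bad signature", { status: 401 });
      }
      // ...handle the CI notification here...
      return new Response("ok");
    }, { port: 8000 });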


Deno packs a lot in and V8 is pretty big to begin with. It makes sense to me that a Go binary will be leaner in pretty much every metric than a Deno script.

I think the win with Deno is the JS ecosystem and the situations where JS's dynamism is an advantage. Sometimes (often, in my own estimation) that's worth double the memory usage and 20% of the raw CPU performance.



