Hacker News | daffl's comments

One thing that I think is often overlooked when it comes to Deno is how the combination of secure defaults, web API compatibility, and the dependency loading system can essentially eliminate the need for deployment builds and containers, making it a great candidate for a truly open source, non-proprietary serverless platform.

Especially in combination with content-addressed file systems like IPFS (which give you additional features like built-in checksums, immutability, de-duplication, and storage resilience), you can have a Deno instance running and, as the deployment step, simply send it a file hash or URL to execute (with the appropriate security precautions, like a signature or encryption, on top). I built a prototype to see how it could work, and a full Deno app deployment to my own VPS took less than 2 seconds. No git pushes, no container build pipelines, no proprietary cloud services.


I'd really like to hear the perspective and success stories of "Good first issue" labels and other means of encouraging open source contributions from other open source projects.

In my experience it unfortunately often hasn't been a net benefit for the projects I worked on. A "good first issue" takes a lot of time to write and then often never gets addressed at all, or it takes even more time to review and give feedback, ultimately causing more work than fixing it directly, since most "first issue" contributors do not come back to contribute again.

GitHub has done a lot to streamline the process of contributing to an open source project. I think what is still missing (or I don't know about it) is an overall resource where you can learn about open source best practices (or just "Best Practices", since you should be using them everywhere) like writing tests, writing good docs, using conventional commits etc. outside of individual projects - and then, in addition to the "Good first issue" label, also indicate which of those best practices apply to that issue, e.g. "This is a good first issue if you know about NodeJS, Mocha tests and Markdown".


When there is an issue with a clear bug report that can be fixed in one or two lines and the solution is very clear, I like to answer something like:

"Thanks for the report! The problem is in link-to-file-in-GitHub. Do you want to send a PR to fix it? If not, we can fix it."

Sometimes they accept the proposal, sometimes not. It is a little more work to fetch the change, rebase, and merge it. But some people like the opportunity and perhaps may become a contributor in the future.

(If they don't accept, fix the problem soon and say thanks again with a link to the fix in case they want to see the change.)


A big challenge I'm seeing is when it comes to tests. Even though it is documented in the contribution guide and - I'd like to think - fairly easy to get running (clone, `npm i`, `npm test`), many first-time bug fixes or features do not include tests.


As an author they have definitely taken more time than if I fixed the issues myself. But for me, that's not the point, the point was to help other fellow devs get started in OSS programming, kind of "light mentoring". At that, it has been wildly successful. I haven't had much free time to continue doing this in the last couple of years though.


Absolutely. I really enjoy helping to get people involved in open source, too, and help run a once-a-month workshop on how to get started with open source here in Vancouver.

I'm just not sure if this is something all open source projects - many of which don't have a lot of resources available - can handle. The very large open source projects that already have a lot of contributors who can put in the additional time are definitely at an advantage here.


If an issue takes a lot of time to write, then it's probably not a good first issue. For my project, I found there are all kinds of small things that are easy to do but that I don't have time for, so instead I write a good first issue.

Usually I also put some pointers on how to get started, like "check such method in such file", or "have a look at that reducer action".

It doesn't take long to do this and it's been working pretty well. It's a very good way for new developers to get something done easily and to get familiar with the code base.

As of now there are 40 such issues, including 28 completed: https://github.com/laurent22/joplin/issues?q=label%3A%22good...


This is interesting because it definitely seems to be different for different types of projects like frontend and end-user applications vs. backend and other libraries.


It definitely seems like a lot of the time, the people making these lists could just resolve the issues with not a lot more effort than writing all this 'good first issue' documentation.

It's one thing to throw a tag 'good-for-beginners' on a bunch of issues, but the amount of effort going into some of these 'good first issue' things is astonishing compared to how tiny the issues are in the first place.


A while ago I made a demo of a P2P chat application that runs on the beaker browser and uses the experimental P2P APIs. It's running at dat://feathers-chat.hashbase.io (slides from the talk at https://vanjs-decentralized.hashbase.io). If the peers can see each other you should be able to register with any email address and send chat messages between browser instances.

What I realized is that this technology ticks a lot of the boxes of what we right now think only the big cloud providers can do. By using a more decentralized protocol, it is by design:

- Actually Serverless

- Offline-first

- Real-time

- Auto-deploying

- Live-updating

- 100% uptime

I really think there is something there from both a developer and a user experience perspective. The problem is that a lot of it is still very experimental and far from the usability and maturity of e.g. Firebase or Heroku.


dude I love how many people in Vancouver seem to be interested in or even working with dat :) thumbsup


;)


Can you explain how P2P chat is saved into dat?

How can someone without the keys add their messages to the feed? Wouldn't only the initial creator have the keys necessary to add to the feed?


This is a really cool idea, but how should I be using it? Trying to test it out just now I opened up two tabs in Beaker but both rooms just have the one user in them.

Love the design, by the way.


I believe this is happening because it is using the same peer connection. If you start two separate instances of the browser (e.g. in a VM or on another machine) you should be able to see both users.

The design is taken from the Chat guide for https://feathersjs.com. Feathers is a JS library that allows you to architect APIs in a way that makes them protocol-independent. That worked great in this case because I just had to swap out the existing REST/websocket Feathers adapter for a DAT/Beaker API Feathers adapter.
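The adapter swap works because application code only ever talks to a common service interface. A stripped-down sketch of that idea (a simplification for illustration, not the real Feathers API):

```javascript
// The app only calls the service interface; transports (REST, websockets,
// or a P2P archive like DAT) are swappable adapters behind it.
class MemoryMessageService {
  constructor() {
    this.messages = [];
  }
  async find() {
    return this.messages;
  }
  async create(data) {
    this.messages.push(data);
    return data;
  }
}

// Application logic stays identical no matter which adapter backs the service.
async function demo(service) {
  await service.create({ text: "hello" });
  return service.find();
}

demo(new MemoryMessageService()).then((msgs) => console.log(msgs.length)); // 1
```

To move the chat from a REST backend to Beaker, only the class behind `demo` has to change, not the application logic.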


To handle the real-time event syncing we created https://github.com/feathersjs/feathers-sync. It uses a central Redis DB or a MongoDB tailable collection to synchronize service events between different application instances. Another option is to use your websocket library's clustering adapter; for Socket.io, for example, there is https://github.com/socketio/socket.io-redis.

This is a very good question. We're definitely planning on adding a section about performance and scaling to the documentation very soon.


I'll quickly add to what daffl said: there is one more manual method outlined here (https://github.com/feathersjs/feathers/issues/265#issuecomme...) that doesn't require running another non-Feathers server.


That sounds like something worth keeping track of. Do you want to create an issue at https://github.com/feathersjs/feathers/issues for it? The nice thing is that you most likely won't have to change your services when adding a new transport layer.


Express claims to be a "Fast, unopinionated, minimalist web framework for Node.js", so we decided that minimal fits well, because Feathers at its core is just a very small extension to, and in the spirit (or so we'd like to think ;) of, Express.


Feathers doesn't assume much on the client. You can use a module loader or the bundled Feathers client which works right out of the box. To be able to use ES imports in the browser I personally use http://stealjs.com


Yup and I use Webpack. The beauty is you get to take your pick. The downer is it's not set up for you. Tradeoffs for now...


Full disclosure - I am one of the Feathers developers.

I think part of it is a philosophy difference. If you want a one-stop solution you will very likely always end up more locked in, and might be happy with Meteor. But, as cool and advanced as the technology is, eventually investors do want a return on their investment, and it is really hard to deliver that without compromising the identity of an open source project. With so much cash in the bank it is also easy to become over-ambitious instead of focusing on doing one thing well. That is tricky in a field that is changing as quickly as web development. The more you add to the project and the more you try to do yourself, the harder it will be to react to the ever-changing landscape, and you can see that with the challenges of the Meteor package system.

We wanted to explore an approach that doesn't hide complexity by adding more and then putting some generators and configuration files in front of it to be able to say "look it's super easy". The Node ecosystem thrives on tiny modules that do one thing well and Feathers - as a very small extension to Express - does exactly that: Providing a pattern that allows you to create REST and real-time APIs without trying to tell you how to do everything else.


The download builder ran out of space on MongoDB and started returning empty files some time today. It should now be generating everything properly again.


Thanks for pointing that out. It should be fixed now.



