I had a second look and the server doesn't look too hard to deploy. I like that there's reasoning behind requiring it, although I suspect SQLite would be more than capable of handling this easily.

I'm trying to deploy the server right now so I can try Plandex; it would be easier if I hadn't forgotten my Postgres password...

As a tip, self-hosting would be much easier (which may be something you don't want to encourage) if you provided a plain Docker image; then it would just be "pull the Docker image, specify the local directory, specify the DB URL, done".
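
Roughly something like this (the image name, port, mount path, and env var are just guesses on my part, not anything that exists today):

  docker run -d \
    -p 8080:8080 \
    -v /srv/plandex:/plandex-data \
    -e DATABASE_URL=postgres://user:pass@db-host:5432/plandex \
    plandex/plandex-server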

By the way, why does it need a local directory if it has a database? What's stored in the directory?




Agreed on providing a Docker image. I made an issue to track it here: https://github.com/plandex-ai/plandex/issues/78

I do want to make self-hosting as easy as possible. In my experience, there will still be enough folks who prefer cloud to make it work :)

There's a local .plandex directory in the project which just stores the project id, and a $HOME/.plandex-home directory that stores some local metadata on each project--so far just the current plan and current branch.
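
So roughly (simplified, exact file names aside):

  my-project/.plandex/           <- just the project id
  ~/.plandex-home/<project-id>/  <- current plan, current branch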


I see, thanks for the explanation! If you're only storing that little data, removing the requirement for a local directory would make deployment easier; that metadata could just go into the database.


Oh sorry, my comment was referring to the local files created by the CLI. The server uses the file system much more heavily to enable efficient version control, with an embedded git repo for each plan. Everything in a plan that's version-controlled (context, the conversation, model settings, and tentative file updates) is stored in this repo instead of the database.
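
Conceptually it works along these lines (a simplified go-git sketch of the idea, not the actual server code, and the paths are made up):

  // Simplified sketch of "one embedded git repo per plan".
  // go-git is used here for illustration only.
  package main

  import (
      "log"
      "os"
      "path/filepath"
      "time"

      git "github.com/go-git/go-git/v5"
      "github.com/go-git/go-git/v5/plumbing/object"
  )

  // commitPlanUpdate opens (or initializes) the plan's repo and commits
  // whatever changed: context files, the conversation, model settings, etc.
  func commitPlanUpdate(planDir, msg string) error {
      repo, err := git.PlainOpen(planDir)
      if err == git.ErrRepositoryNotExists {
          repo, err = git.PlainInit(planDir, false)
      }
      if err != nil {
          return err
      }
      wt, err := repo.Worktree()
      if err != nil {
          return err
      }
      if _, err := wt.Add("."); err != nil {
          return err
      }
      _, err = wt.Commit(msg, &git.CommitOptions{
          Author: &object.Signature{Name: "plandex", Email: "server@localhost", When: time.Now()},
      })
      return err
  }

  func main() {
      // Hypothetical plan directory; the real layout is different.
      planDir := filepath.Join(os.TempDir(), "example-plan")
      if err := os.MkdirAll(planDir, 0o755); err != nil {
          log.Fatal(err)
      }
      // Pretend some context was just loaded into the plan.
      if err := os.WriteFile(filepath.Join(planDir, "context.md"), []byte("example context\n"), 0o644); err != nil {
          log.Fatal(err)
      }
      if err := commitPlanUpdate(planDir, "load context"); err != nil {
          log.Fatal(err)
      }
  }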


Ah, that makes sense, thank you.



