
A whole new generation of developers is learning what the previous generation called "briefcase applications". When client-server applications were the thing in the late 90s and early 2000s, internet speeds were a serious limitation. That forced many architectures into "local-first", disconnected-dataset, eventually-synchronised desktop applications. Microsoft's ADO famously touted the "disconnected dataset" for this, after Borland's "client dataset" pioneered the concept on the Windows desktop. Eventually even Java got disconnected datasets, in the form of CachedRowSet. All of these technologies solved real practical problems more than two decades ago. One I worked on involved collecting water-flow data from several thousand rivers in Southern Africa: hundreds of users carried laptops running desktop apps with authentication and business rules, and either came back to head office to synchronise or uploaded from a branch with a network connection.
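
For anyone who never saw one of these, here is a minimal sketch of the pattern using Java's CachedRowSet, the disconnected dataset Java eventually got. The connection string, credentials, and readings table are hypothetical stand-ins for something like that river-gauging app.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import javax.sql.rowset.CachedRowSet;
    import javax.sql.rowset.RowSetProvider;

    public class BriefcaseSketch {
        public static void main(String[] args) throws Exception {
            // 1. While connected: populate the rowset, then drop the connection.
            CachedRowSet rows = RowSetProvider.newFactory().createCachedRowSet();
            rows.setUrl("jdbc:postgresql://head-office/hydro"); // hypothetical DSN
            rows.setUsername("field_user");
            rows.setPassword("secret");
            rows.setCommand("SELECT id, station_id, flow_rate FROM readings");
            rows.execute(); // fetches rows, then releases the connection

            // 2. In the field, fully offline: edits accumulate inside the rowset.
            while (rows.next()) {
                if (rows.getDouble("flow_rate") < 0) {
                    rows.updateDouble("flow_rate", 0.0); // local business rule
                    rows.updateRow();
                }
            }

            // 3. Back at head office (or a networked branch): push the batch.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://head-office/hydro", "field_user", "secret")) {
                rows.acceptChanges(conn); // throws SyncProviderException on conflict
            }
        }
    }

All the "eventually synchronised" machinery hides behind acceptChanges(): the default provider is optimistic and simply throws when the underlying rows changed beneath you, which is exactly the complexity that never went away.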

They worked and they were necessary.

Things changed once internet connectivity became cheap and ubiquitous, and the complexity of building such applications no longer merited the effort.

Now, the swing back to "local-first" is mainly about user experience. But the same theoretical complexities of "eventually synchronised" that existed for the "briefcase app" are still present: let two users edit the same record offline and you still have to pick a conflict-resolution strategy at merge time.

Is the complexity worth it though?



