Hacker News

Yeah, I think you've got it.

1. Initial page load dumps in as much data as possible to get things started.

2. All writes go through the Django API (we just look for a success/fail response).

3. All new/real-time data gets pushed to the client via socket.io. This includes data that you yourself created.

#1 is kind of a given, but #2 & #3 are more personal preference/app-specific I think.
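A minimal sketch of step 1, assuming a Django-style view that embeds the initial dataset as JSON in the rendered page so the Backbone models can bootstrap without an extra round trip. All names here are illustrative, not the actual app's code.

```python
import json

def initial_payload(user_teams, recent_items):
    """Build the bootstrap blob dumped into the page on first load.

    Hypothetical helper: the view would pass this string into the
    template context, and the page would expose it to the client,
    e.g. <script>var BOOTSTRAP = {{ payload|safe }};</script>,
    so Backbone reads window.BOOTSTRAP instead of fetching.
    """
    return json.dumps({
        "teams": user_teams,
        "items": recent_items,
    })

payload = initial_payload(["team-a"], [{"id": 1, "text": "hello"}])
print(payload)
```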

We could have done #2 through socket.io, but as noted in the post, Python is our primary language and we wanted to push as much of the logic to it as we could.

At first we had content creators reading responses from the API with all other team members getting their updates from socket.io. This was problematic because creators would get content twice (once over the update channel for the team and once from the API). It resulted in some strange race conditions and a lot of "have I already received this message?" code. In the end, it was easiest for us to just always defer to the data from the socket (which was faster as well).
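The resolution above, "always defer to the data from the socket," can be sketched roughly like this: the API response is checked only for success or failure, and the actual model update is applied exactly once, when it arrives on the socket channel that every team member (including the creator) listens to. This is an illustrative sketch, not the app's actual code.

```python
class ClientStore:
    """Hypothetical client-side store following the defer-to-socket pattern."""

    def __init__(self):
        self.items = {}

    def on_api_response(self, ok):
        # Writes only report success/fail; no data is merged here, so
        # there is no "have I already received this message?" bookkeeping.
        if not ok:
            raise RuntimeError("write failed; surface an error to the user")

    def on_socket_message(self, item):
        # Single source of truth for new data, for creators and
        # teammates alike, so each item is applied exactly once.
        self.items[item["id"]] = item

store = ClientStore()
store.on_api_response(ok=True)                    # creator's write acknowledged
store.on_socket_message({"id": 7, "text": "hi"})  # data arrives once, via socket
```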




What is your system for pub/sub from the client's perspective? Do you have matching models in Backbone & Django?

I guess what I'm mainly interested in is how you know which users to route which data to via socket.io.

Great stack! We use something very similar.


The client passes a unique key with a short TTL to the Node.js server that maps them to the proper user and teams for the duration of their session.
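A rough sketch of that scheme, under some assumptions: the Django side mints a one-time key with a short TTL that maps to the user and their teams, and the Node.js socket server redeems it on connect to decide which rooms to route events to. The names and the 30-second TTL below are illustrative, not the actual implementation.

```python
import time
import uuid

SESSION_TTL = 30.0  # seconds; "short TTL" per the comment, value assumed

class SessionKeys:
    """Hypothetical one-time session keys mapping a client to user + teams."""

    def __init__(self, now=time.monotonic):
        self._now = now       # injectable clock, for testing
        self._keys = {}       # key -> (expiry, user_id, teams)

    def issue(self, user_id, teams):
        """Mint a unique key for the client to pass to the socket server."""
        key = uuid.uuid4().hex
        self._keys[key] = (self._now() + SESSION_TTL, user_id, list(teams))
        return key

    def redeem(self, key):
        """Return (user_id, teams) if the key is valid and unexpired."""
        entry = self._keys.pop(key, None)  # one-time use
        if entry is None:
            return None
        expires, user_id, teams = entry
        if self._now() > expires:
            return None
        return user_id, teams
```

On redemption, the socket server would join the connection to a per-user room and one room per team, and all pushes would then target those rooms for the rest of the session.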


This makes a lot of sense and I look forward to trying this out in my own project. Thanks.



