
I've been playing around with different publish/subscribe implementations for Node.js and was wondering which one would be best for a specific application. The application's requirements involve real-time syncing of objects in multi-channel, multi-user 3D environments.

I started off using socket.io: I created a basic array of channels, and when a user sends a message, it loops through the users in that channel and sends the message to each user's client. This worked well, and I had no problems with it.
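The channel-loop approach described above can be sketched in plain JavaScript. The `Channel` object and its methods here are illustrative, not socket.io API; in the real app each entry in `clients` would be a socket.io client and `client.send(msg)` would go over the wire:

```javascript
// Minimal sketch of a channel that broadcasts by looping over its users.
function Channel(name) {
  this.name = name;
  this.clients = [];   // one entry per connected user
}

Channel.prototype.join = function (client) {
  this.clients.push(client);
};

// "Publish" by looping through every user in the channel
Channel.prototype.broadcast = function (message) {
  this.clients.forEach(function (client) {
    client.send(message);   // socket.io client.send(...) in the real app
  });
};

// Usage with stub clients that just record what they receive
var room = new Channel('lobby');
var received = [];
room.join({ send: function (m) { received.push('a:' + m); } });
room.join({ send: function (m) { received.push('b:' + m); } });
room.broadcast('hello');
// received is now ['a:hello', 'b:hello']
```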

For object persistence, I added Redis support using node_redis. Then I replaced the client.send loop over the channel array with Redis pub/sub as a layer of abstraction. But I noticed that I needed to create a new Redis client for each user who made a subscription, and I still needed to store socket.io client information so I could send messages to clients on publish. How scalable is that? Are there other (better) implementations or further optimizations I could make? What would you do?
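The per-user-subscriber pattern described above looks roughly like the sketch below. A tiny in-memory stand-in replaces `redis.createClient()` so the shape is visible without a server; with node_redis, a connection in subscriber mode cannot issue other commands, which is why one extra connection per subscribing user is needed. All names here (`fakeRedis`, `addUser`) are illustrative:

```javascript
// In-memory stand-in for a node_redis client in subscriber mode.
function fakeRedis() {
  return {
    channels: {},
    subscribe: function (channel) { this.channels[channel] = true; },
    on: function (event, handler) { this.handler = handler; },
    // test helper: deliver a message the way redis pub/sub would
    _deliver: function (channel, message) {
      if (this.channels[channel]) this.handler(channel, message);
    }
  };
}

var sockets = {};       // userId -> socket.io client (still kept around)
var subscribers = {};   // userId -> dedicated subscriber connection

function addUser(userId, socket, channel) {
  sockets[userId] = socket;
  var sub = fakeRedis();   // real app: redis.createClient(), one per user
  sub.subscribe(channel);
  sub.on('message', function (ch, msg) {
    sockets[userId].send(msg);   // the socket.io client is needed here
  });
  subscribers[userId] = sub;
}

var got = [];
addUser('u1', { send: function (m) { got.push(m); } }, 'room1');
subscribers['u1']._deliver('room1', 'sync-state');
// got is now ['sync-state']
```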

5 Answers

Yes, you have to create a new Redis client for every subscribing connection. That is heavyweight and doesn't scale well. However, a new Redis client connection does not consume much memory, so if your system has no more than about 5,000 users it is fine. To scale further, you can add Redis slave servers to absorb the heavy publish/subscribe load, and if you are concerned about opening a lot of connections you can raise your OS's ulimit.
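For the read-scaling suggestion above, a slave is pointed at the master in its configuration. A minimal sketch (the address and port are placeholders, not values from this setup):

```conf
# redis.conf on the slave (placeholder master address)
slaveof 192.168.1.10 6379
```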

You don't need to store the socket.io client in the message you send. Once Redis receives a message on a subscribed channel, it hands it to your handler, which sends it to the particular socket.io client:


subscribe.on("message", function (channel, message) {
  var msg = { message: [client.sessionId, message] };
  // keep a rolling buffer of the last 15 messages
  buffer.push(msg);
  if (buffer.length > 15) buffer.shift();
  client.send(msg);
});

To subscribe to multiple channels, I suggest you pre-store each user's channel list (you can use MongoDB or Redis for storage) and subscribe to all of them at once:

var store = redis.createClient();
var subscriber = redis.createClient();

store.hgetall(UID, function (e, obj) {
  // hgetall returns string fields, so the channel list is assumed
  // to be stored as a comma-separated string here
  obj.ChannelArray.split(',').forEach(function (channel) {
    subscriber.subscribe(channel);
  });
});

1 Comment

This is good feedback. I will play with these suggestions. Thanks.

I use Faye.js. It's seriously the simplest pub/sub implementation I could find. Maybe this will help!

1 Comment

This looks promising. The only con I can see is the number of supported transports: two (WebSocket and polling) vs. socket.io's six supported transports.

Take a look at this question regarding Redis pub/sub and socket.io.

2 Comments

This works well for a single channel, but I'm not so sure about multiple channels. Maybe if you kept track of the channel name inside the message and confirmed the user is in that channel on each message, but that seems redundant and inefficient.
@Detect: It may seem redundant and inefficient, but you can still outperform other solutions, because Redis is natively optimized for a high number of concurrent connections. The only way to find out is to benchmark each solution.
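The single-subscriber variant discussed in these comments can be sketched as follows: one shared connection subscribes to every channel, the channel name arrives with each delivery, and membership is checked before forwarding. A plain object stands in for the real subscriber connection, and `channelMembers`/`forward` are illustrative names, not library API:

```javascript
// channel -> { userId: socket } membership table
var channelMembers = {};

function join(channel, userId, socket) {
  (channelMembers[channel] = channelMembers[channel] || {})[userId] = socket;
}

// Single "message" handler for the one shared subscriber:
// route by channel name, forwarding only to that channel's members.
function forward(channel, message) {
  var members = channelMembers[channel] || {};
  Object.keys(members).forEach(function (userId) {
    members[userId].send(message);
  });
}

var log = [];
join('room1', 'a', { send: function (m) { log.push('a:' + m); } });
join('room2', 'b', { send: function (m) { log.push('b:' + m); } });
forward('room1', 'pos-update');   // only room1's member receives it
// log is now ['a:pos-update']
```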

Or you can try Juggernaut. It uses socket.io + Redis pub/sub middleware + node.js, and it supports all transports. It handles all the channel subscribing and publishing between the client and Redis. Juggernaut is scalable; the only concerns are the overhead on Redis and the lack of built-in authentication (you can work around that). However, Redis can handle more than 150k writes/reads per second, so it should be no problem for your case.

https://github.com/maccman/juggernaut



FYI, Socket.io v0.7 will support channels and should simplify your existing code (no more pub/sub library dependencies).

See: http://cl.ly/0B0C3f133K1m3j422n0K
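With that room support, the hand-rolled channel array collapses to `socket.join(room)` plus `io.sockets.in(room).emit(...)`. The wiring below shows the call pattern; the stub `io` and `socket` objects are stand-ins I've added so the sketch runs without the library (with the real thing you would pass the object from `require('socket.io').listen(...)`):

```javascript
// Wire a connection into a room and rebroadcast its sync events there.
function wireRooms(io) {
  io.sockets.on('connection', function (socket) {
    socket.join('lobby');                         // subscribe to the room
    socket.on('sync', function (data) {
      io.sockets.in('lobby').emit('sync', data);  // publish to the room
    });
  });
}

// Illustrative stubs, only to exercise the wiring above.
var delivered = [];
var stubIo = {
  sockets: {
    on: function (event, cb) { this.onConnection = cb; },
    in: function (room) {
      return {
        emit: function (event, data) {
          delivered.push(room + '/' + event + '/' + data);
        }
      };
    }
  }
};
wireRooms(stubIo);

var stubSocket = {
  join: function (room) {},
  on: function (event, cb) { this.onSync = cb; }
};
stubIo.sockets.onConnection(stubSocket);   // simulate a connection
stubSocket.onSync('player-state');         // simulate a sync event
// delivered is now ['lobby/sync/player-state']
```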

1 Comment

That's an improvement. I'm worried that keeping socket.io client info in memory won't be scalable without a shared-memory solution. That's why I was playing around with Redis as an alternative.
