
I am trying to build a web app that has both a Python part and a Node.js part. The Python part is a RESTful API server, and the Node.js part will use socket.io and act as a push server. Both will need to access the same DB instance (Heroku Postgres in my case). The Python part will need to talk to the Node.js part in order to send push messages to be delivered to clients.

I have the Python and DB parts built and deployed, running under a "web" dyno. I am not sure how to build the Node part -- and especially how the Python part can talk to the Node.js part.

I am assuming that the Node.js part will need to be a new Heroku app, so that it too can run on a 'web' dyno, benefit from the HTTP routing stack, and let clients connect to it. In that case, will my Python dynos be accessing it just like regular clients do?

What are the alternatives? How is this usually done?
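For concreteness, here is roughly what I imagine the Python side would look like if the answer is "access it like a regular client" (the Node app URL, the /push endpoint and the PUSH_SECRET variable are all made up for illustration):

```python
# Hypothetical sketch: the Python dyno POSTs to the Node.js app's public
# URL like any other HTTP client. The Node.js app would verify the shared
# secret and then broadcast the payload to connected socket.io clients.
import os
import requests

def send_push(channel, payload):
    resp = requests.post(
        "https://my-node-push-app.herokuapp.com/push",   # made-up app URL
        json={"channel": channel, "payload": payload},
        headers={"X-Push-Secret": os.environ["PUSH_SECRET"]},
        timeout=5,
    )
    resp.raise_for_status()
```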

  • This is an alternate approach, so I'm just posting it as a comment. I just finished watching a great IPython video from PyCon 2012. In it, Fernando mentioned that they were using Tornado and 0MQ to create a very loosely coupled clients/brokers/servers environment. Tornado is an async server platform (like Node.js) written in Python. I'm just getting into the 0MQ docs and it's fascinating. You might want to look at Tornado as a Pythonic replacement for Node.js. Commented Nov 21, 2012 at 17:40
  • I've actually heard of it, and it makes sense, but I have some general issues with this: (1) it doesn't answer my question of how Heroku dynos talk to each other, and whether I need a new Heroku app for this; (2) can Heroku dynos even talk to each other using 0MQ?; (3) it couples my push-server logic and my API logic, which are rather different projects, into one project just so that I can put them on one dyno; (4) it thus makes me write my API async-style, whereas writing it sync can be a lot simpler and more manageable here, especially if I use gevent/gunicorn. Commented Nov 21, 2012 at 18:21
  • All true. As I said, that was why I made it a comment. Commented Nov 21, 2012 at 18:40
  • Yes, I'm in a similar boat Nitzan - I have a REST API (Python Tornado) and a Node web client. At the moment I'm trying to publish the REST API on Heroku so that I can reach the service with a URL. Locally it would listen on port 8080, but I don't think that maps so well to Heroku :-( Commented Sep 21, 2015 at 17:58
  • You would want to listen on port $PORT (env variable), and when accessing it from your Node.js app you would use regular port 80, which the Heroku HTTP router will map to the right port for your instance (see the sketch after these comments). That's an altogether different issue than what I was facing, but I hope this helps nonetheless. Commented Sep 24, 2015 at 17:59
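A minimal sketch of that port binding, assuming a Tornado app as in the comment above (the handler and the local fallback port are just for illustration):

```python
# Bind the server to whatever port Heroku injects via $PORT;
# fall back to 8080 for local runs.
import os
import tornado.ioloop
import tornado.web

class PingHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("ok")

app = tornado.web.Application([(r"/", PingHandler)])
app.listen(int(os.environ.get("PORT", "8080")))
tornado.ioloop.IOLoop.current().start()
```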

1 Answer


After playing around a little and doing some reading, it seems like Heroku apps that need this have two main options:

1) Use some kind of back-end that both apps can talk to. Examples would be a DB, Redis, 0MQ, etc. (a Redis sketch follows below).

2) Use what I suggested in the question: have the two apps talk to each other directly over HTTP. I actually went ahead and implemented it, and it works.
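For option 1, here is a minimal sketch of the publishing side using Redis pub/sub (the channel name and REDIS_URL handling are illustrative; the Node.js app would subscribe to the same channel with its own Redis client and forward messages over socket.io):

```python
# Publish push events to a Redis channel that the Node.js process subscribes to.
import json
import os
import redis

r = redis.from_url(os.environ["REDIS_URL"])

def queue_push(channel, payload):
    # The Node.js side runs SUBSCRIBE on "push-events" and emits to its clients.
    r.publish("push-events", json.dumps({"channel": channel, "payload": payload}))
```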

Just thought I'd share what I've found.


2 Comments

Hi Nitzan, can you share your code - just a management summary would do :-)
Not sure what a "management summary" of code is, let me try though: both are "regular" Heroku apps, and both have only "web" dynos. The code for the API app uses a secret token to validate that it receives requests only from my other app (since its HTTP interface is open to the whole wide world). The token is part of the env settings for both apps, so it is visible only to me.
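A minimal sketch of that shared-secret check, assuming a Flask-style API (the header name and APP_SECRET_TOKEN variable are illustrative; the same token would be set in both apps' config vars):

```python
# Reject any request that doesn't carry the shared secret set in the env.
import hmac
import os
from flask import Flask, abort, request

app = Flask(__name__)
SECRET = os.environ["APP_SECRET_TOKEN"]

@app.before_request
def require_shared_secret():
    token = request.headers.get("X-App-Token", "")
    # Constant-time comparison so the token can't be guessed byte by byte.
    if not hmac.compare_digest(token, SECRET):
        abort(403)
```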
