
I want to put node.js on the cloud for an application that handles sensitive corporate information. I am afraid node.js is not as secure as some of the older servers, since it has not been in the wild for long. I saw people recommending a reverse proxy in front of it to make it safer, and I understand why that helps: the server is not directly exposed to the world. But XSS and other attacks are still possible. From a security perspective only, does anyone think that node.js is on par with the older servers? Any tips on how to convince your boss and the corporate security team?

  • If it is corporate, you should just put it behind the NAT (firewall). Also, node.js can be pretty safe against most attacks. Commented Aug 14, 2011 at 9:17
  • @alfred: thanks, but "pretty safe against most attacks" will not convince the corporate security team. Also, this is a cloud app; it's not behind the corporate NAT. Commented Aug 14, 2011 at 9:45
  • 2
    Your question is far away from the reality. Event "older srevers" are vulnerable to xss and other attacks while this are bugs in YOUR CODE not, the server. From this standpoint older servers are as insecure as nodejs. Reverse proxys can only protect you from for example "http.createserver" when the proxy server detects a malformed http request (protocol error) and drops the request but a hypotetical bug in nodejs would have exposed sensitive information. Commented Aug 14, 2011 at 11:25
  • @Yaron I think it is hard to convince a corporate security team about a product as new as node.js... Commented Aug 14, 2011 at 14:51
  • @Tobias the fact that old servers are still vulnerable only shows how hard this problem is. And while I agree most problems are in user code, node.js still does some parsing, e.g. to fill in the request/response objects. So my question is: what risks are not mitigated by a reverse proxy, and is there any known benchmark or anything else that can show that node.js is secure? Commented Aug 14, 2011 at 15:23

3 Answers


In theory, a reverse proxy wouldn't pass on any requests that it itself couldn't process (including those it's designed to block intentionally).

However, if there were a bug in node.js that would, for example, make it disclose the contents of certain variables when a request like

GET \x0c\xa0

is received, then the proxy would just pass on that request and relay the answer to the client (attacker).

So there are still risks...




The way to convince the boss and security teams is to demonstrate that you have thought through the issues and have a reasonable and realistic plan to test them.

In any corporate setting, your proxy will only be a small part of the overall security, and that is how the risks are managed.

In order to test something like this, you will need to throw a number of *un*reasonable requests at the proxy. I like juand's suggestion, for example; you should also throw very large requests at the proxy.

A Node.js proxy should be at least as secure as Apache, or indeed a custom Python/C++ proxy, since you need only allow it to proxy very specific items.



Why not create a strict gatekeeping proxy in Python, C++, etc., which controls access? Everyone who passes this proxy is a trusted user, and node.js works only with them.
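That gatekeeper idea amounts to a credential check in front of the backend. A minimal sketch, with the header handling and token store as hypothetical stand-ins for a real authentication mechanism:

```javascript
// Sketch of the "gatekeeper" idea: the front proxy checks a credential and
// only authenticated requests ever reach the Node.js backend.
// VALID_TOKENS is a stand-in for a real credential store.
const VALID_TOKENS = new Set(['s3cr3t-token']);

function isTrusted(headers) {
  const auth = headers['authorization'] || '';
  const token = auth.startsWith('Bearer ') ? auth.slice('Bearer '.length) : '';
  return VALID_TOKENS.has(token);
}

// In the proxy's request handler, untrusted requests would be rejected:
//   if (!isTrusted(req.headers)) { res.writeHead(401); return res.end(); }
```

Note this only narrows who can talk to node.js; a bug triggerable by an authenticated user is still reachable.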

1 Comment

I don't think this is really an answer to the OP.
