
Node.js and open files limit in Linux

I run a Node.js client that sends a lot of requests to my server (which is also written in Node.js).

The server gets a request for a specific file and uploads it to S3.

Everything goes well for about two minutes, and then I get the following error:

{ [NetworkingError: getaddrinfo ENOTFOUND]
  message: 'getaddrinfo ENOTFOUND',
  code: 'NetworkingError',
  errno: 'ENOTFOUND',
  syscall: 'getaddrinfo',
  region: 'us-east-1',
  hostname: 'XXXX.s3.amazonaws.com',
  retryable: true,
  time: Sun Oct 12 2014 11:27:54 GMT-0400 (EDT),
  _willRetry: false }

After doing some research, I found that this probably happens because I'm trying to open too many file handles or sockets, which can only really happen after a while.

But as I understand it, Node.js should handle this issue for me. In other words, Node.js should know the file descriptor limit and only open new descriptors within it. Isn't that one of the advantages of a single-threaded, event-based runtime? (Where am I wrong?)

If Node.js doesn't do this, what is the best solution for this error that does not involve increasing my open files limit? (That seems like a bad idea, because we need good performance on this machine. Moreover, how would I be sure that the error won't reappear after I increase the number? And how would I know what limit the OS should have for this application?)

asked by Or Smith

1 Answer

The default open file descriptor limit is 1024 on Ubuntu. You can set it from the terminal with ulimit -n:

ulimit -n <number>   # e.g. ulimit -n 4096

But this changes the limit for the current login session only. To make the change permanent, use these commands:

ulimit -n   # show the current open files limit

sudo vi /etc/security/limits.conf   # open the limits file in vi

Now set both the soft and hard limits for your user and for root (replace "user" with your actual username):

user soft nofile 9000
user hard nofile 65000
root soft nofile 9000
root hard nofile 65000

sudo vi /etc/pam.d/common-session

Now add this line:

session required pam_limits.so

http://posidev.com/blog/2009/06/04/set-ulimit-parameters-on-ubuntu/
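
If you want to confirm that the new limit is what your application actually sees, a Node process on Linux can read its own limits from /proc/self/limits. This is a minimal sketch, not part of the original answer, and it is Linux-only:

// Print the open-files limit the current Node process sees.
// /proc/self/limits is a Linux-specific virtual file.
const fs = require('fs');

const limits = fs.readFileSync('/proc/self/limits', 'utf8');
const openFiles = limits
  .split('\n')
  .find((line) => line.startsWith('Max open files'));
console.log(openFiles); // e.g. "Max open files  9000  65000  files"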

EDIT:

As for getting around this problem without touching the ulimits: you will have to keep the total number of requests in flight below your current ulimit -n.

You can do that either by creating a pool of long-lived HTTP connections and pushing your requests through it, or by capping the maximum number of concurrent requests your app fires and queueing anything over the cap for later processing.
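
For example, with the AWS SDK for Node (v2, which matches the error object above), you can hand the client a shared keep-alive agent so it reuses a bounded pool of sockets instead of opening a new connection per request. This is a sketch; the maxSockets value of 256 is an arbitrary choice, not a recommendation:

// One shared agent for all S3 calls: keepAlive reuses connections,
// maxSockets caps how many sockets may be open at once.
const https = require('https');
const AWS = require('aws-sdk');

const agent = new https.Agent({ keepAlive: true, maxSockets: 256 });
const s3 = new AWS.S3({ httpOptions: { agent } });

And for the queue approach, here is a minimal concurrency limiter (a hypothetical helper, not from the original answer) that lets at most max tasks run at once and queues the rest:

// Returns a function that schedules promise-returning tasks,
// running no more than `max` of them concurrently.
function createLimiter(max) {
  let active = 0;
  const queue = [];

  function next() {
    if (active >= max || queue.length === 0) return;
    active++;
    const { task, resolve, reject } = queue.shift();
    task().then(resolve, reject).finally(() => {
      active--;
      next(); // a slot freed up, start the next queued task
    });
  }

  return (task) => new Promise((resolve, reject) => {
    queue.push({ task, resolve, reject });
    next();
  });
}

// e.g. keep at most 500 uploads in flight, well below a 1024 fd limit:
const limit = createLimiter(500);
// limit(() => s3.upload(params).promise());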

This is just the concept; if you want more help with this problem, show us some code so we can see what you are trying to do.

Hope this helped.

answered by Mahmoud.Mahfouz