Can anyone say if there is any practical limit to the number of databases in MongoDB? I started running into serious problems once I passed 120 databases. Even simple things fail, like:
> show dbs
Mon Feb 10 16:35:32 DBClientCursor::init call() failed
Mon Feb 10 16:35:32 query failed : admin.$cmd { listDatabases: 1.0 } to: 127.0.0.1:27017
Mon Feb 10 16:35:32 Error: error doing query: failed src/mongo/shell/collection.js:155
Mon Feb 10 16:35:32 trying reconnect to 127.0.0.1:27017
Mon Feb 10 16:35:32 reconnect 127.0.0.1:27017 failed couldn't connect to server 127.0.0.1:27017
>
Mon Feb 10 16:36:01 trying reconnect to 127.0.0.1:27017
Mon Feb 10 16:36:01 reconnect 127.0.0.1:27017 failed couldn't connect to server 127.0.0.1:27017
>
Mon Feb 10 16:37:01 trying reconnect to 127.0.0.1:27017
Mon Feb 10 16:37:01 reconnect 127.0.0.1:27017 ok
and
> getMemInfo()
{ "virtual" : 32, "resident" : 7 }
Mon Feb 10 16:39:00 DBClientCursor::init call() failed
Mon Feb 10 16:39:00 query failed : admin.$cmd { replSetGetStatus: 1.0, forShell: 1.0 } to: 127.0.0.1:27017
> shell
Mon Feb 10 16:39:38 ReferenceError: shell is not defined (shell):1
Mon Feb 10 16:39:38 trying reconnect to 127.0.0.1:27017
Mon Feb 10 16:39:38 reconnect 127.0.0.1:27017 ok
Yet the log file remained enigmatic.
What version of MongoDB are you running, and on what host? Here is a test on CentOS 6.5, with mongodb 2.2 x86_64 installed directly from EPEL.
Here is a sample Python script that creates 1000 databases:
from pymongo import MongoClient

mc = MongoClient()
for i in range(1000):
    print i
    # inserting a document implicitly creates the database on disk
    mc['db%s' % i].test.insert({"test": True})
output:
...snip...
506
Traceback (most recent call last):
File "overload_mongo.py", line 6, in <module>
mc['db%s'%(i)].test.insert({"test":True})
File "/usr/lib64/python2.6/site-packages/pymongo/collection.py", line 357, in insert
continue_on_error, self.__uuid_subtype), safe)
File "/usr/lib64/python2.6/site-packages/pymongo/mongo_client.py", line 929, in _send_message
raise AutoReconnect(str(e))
pymongo.errors.AutoReconnect: [Errno 104] Connection reset by peer
There it is. Looking at the mongod log:
ERROR: Uncaught std::exception: boost::filesystem::basic_directory_iterator constructor: Too many open files: "/index/bauman/db/_tmp/esort.1392056635.506/", terminating
The good ol' "too many open files" problem.
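Before changing anything, you can confirm the diagnosis by comparing how many file descriptors mongod already has open against the limit it was started with. A minimal sketch, Linux only; it assumes a single mongod process and that you run it as root or as the same user mongod runs as:
import os
import subprocess

# Find the mongod PID (assumes exactly one mongod is running)
pid = subprocess.check_output(["pidof", "mongod"]).split()[0].decode()

# Count the descriptors mongod currently has open
open_fds = len(os.listdir("/proc/%s/fd" % pid))

# Read the per-process limit mongod was started with
with open("/proc/%s/limits" % pid) as f:
    max_files = [line.rstrip() for line in f if line.startswith("Max open files")]

print("mongod has %d file descriptors open" % open_fds)
print(max_files[0])
If the open count is sitting right at the soft limit, you have found your culprit.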
If you are on an enterprise Linux platform, you can drop the following into /etc/security/limits.d/mongodb.conf and start a new session:
mongodb hard nofile 99999
mongodb soft nofile 99999
mongodb hard nproc 99999
mongodb soft nproc 99999
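After restarting mongod from a new session, you can sanity-check that the limits.d entries were actually picked up. A small sketch using the standard resource module; the filename and the su invocation in the comment are only illustrations:
# Run this in the new session as the mongodb user, e.g.
#   su -s /bin/sh mongodb -c 'python check_limits.py'
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("nofile: soft=%d hard=%d" % (soft, hard))

soft, hard = resource.getrlimit(resource.RLIMIT_NPROC)
print("nproc:  soft=%d hard=%d" % (soft, hard))
Both pairs should report 99999 if the new limits are in effect.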
I don't know how to achieve a similar result on Windows.
The 'problem' is that MongoDB wants to memory map every single database file, so your host OS has to let it open (and map) that many files.
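To get a feel for how high the limit needs to be, you can count the files under your dbpath; with the mmap-based storage these versions use, each data and namespace file is something mongod may want to open and map. The path below is only a placeholder, substitute your own dbpath:
import os

# Placeholder; point this at your mongod's actual dbpath
DBPATH = "/var/lib/mongo"

total = 0
for root, dirs, files in os.walk(DBPATH):
    total += len(files)

print("%d files under %s" % (total, DBPATH))
# The nofile limit should comfortably exceed this count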
Same code as above
python overload_mongo.py
Output
...snip...
995
996
997
998
999
All better