PhantomBot average/max memory usage


Hey guys, dear developers, I really need to know the average and max RAM usage of PhantomBot. I'll rent a VPS based on what you guys say. Sorry for my bad English. :>

Have a nice day!


Check out the post below for recommendations


The more viewers you have, the more memory it uses, but I believe you should expect around 90-256 MB of RAM used (256 MB being an upper limit). Don't quote me on that.


So, I have a low viewer count (3-15 viewers). My bot has been staying around 225 MB resident on CentOS, so I agree with UsernamesSuck. It also depends on which modules you enable and how much data has to be cached. A larger channel will need more memory, as we have to cache more data from Twitch (username-to-ID mapping, for example). Then as you add more custom commands (!addcom), notices, and so on, the bot needs more memory.
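If you want to check your own bot's resident memory on Linux, something like the following works. A minimal sketch: the "PhantomBot" pattern in the `pgrep` is an assumption, so match it to whatever your actual launch command looks like in the process list; as written it falls back to the current shell's PID just so the command prints something even when the bot isn't running.

```shell
# Find the bot's PID (pattern is an assumption -- adjust to your setup),
# falling back to this shell's own PID if nothing matches.
pid=$(pgrep -f PhantomBot | head -n 1)
pid=${pid:-$$}

# ps reports RSS (resident set size) in KB; convert to MB for readability.
ps -o rss= -p "$pid" | awk '{printf "%.0f MB\n", $1 / 1024}'
```

RSS is the number to compare against the figures people quote in this thread; the JVM's virtual size will look much larger and is mostly irrelevant for VPS sizing.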

I wouldn’t go any lower than an instance with 512MB of RAM. The operating system still needs memory.



Would you have an estimate of how much memory the Discord modules and stuff take up?


Highly dependent on the member count of your server, roles, channels, members who use invisible status, etc.


Thank you guys for this awesome information! I really appreciate it!


So, I ran NetBeans Profiler on my bot with Discord. Mind you, I only have myself and my bot in my Discord. The usage was about 20 MB total, adding everything up (assuming I matched up and caught everything). Of course, like Koji said, the number of channels, users, etc. will impact this. The 20 MB at least shows you the "minimum."


Thanks very much. :slight_smile:


Yeah, the bot stays in the 20-30 MB range, even in large event channels. However, the JVM (Java Virtual Machine) likes to set a big heap at times, so if you look at Task Manager on Windows you'll probably see the bot using around 150 MB.
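If that inflated number bothers you, you can cap the heap yourself with the standard JVM sizing flags. A minimal sketch, assuming your launch script ultimately runs something like `java -jar PhantomBot.jar` (the jar name and any other flags are assumptions -- check your own launcher for the exact command):

```shell
# -Xms sets the starting heap, -Xmx the maximum the JVM may grow to.
# With -Xmx256m the process can't balloon far past what the bot
# actually needs, so Task Manager numbers stay closer to real usage.
java -Xms64m -Xmx256m -jar PhantomBot.jar
```

Don't set `-Xmx` too low, though: if the heap cap is smaller than what the bot needs under load, you'll get `OutOfMemoryError` crashes instead of savings.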

If you get over 10,000 viewers, I would recommend having at least a decent server (4 cores, 2-4 GB of memory).



Let’s start with 10 on a regular basis and work our way up from there first, shall we? :rofl: