Question for #FediAdmin, especially those managing a single-user instance (or one with just a few users):

How much RAM is your fedi software currently using, whether it's #Mastodon #Akkoma #GoToSocial or one of the fifty shades of #Misskey? (And by RAM I mean "in total": not just the software's main process, but also the database and additional stuff.)

It's really just out of curiosity. For instance, soc(dot)breadcat(dot)run currently uses approx 4.5 GB (4 GB for Akkoma's main processes and ~500 MB for the DB, on Docker).
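
(If you're also on Docker and want a comparable number, something like `docker stats --no-stream --format "{{.Name}}: {{.MemUsage}}"` prints current memory per container; summing the fedi container(s) plus the database container gives the kind of total I'm after. Other setups will obviously need other tools.)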

Repost appreciated :nice-three-hearts:
Thanks for all the answers! Now I have a good idea of how much my Akkoma install ✨ sucks ass ✨
@Breadly Here's the heap-in-use graph for a GoToSocial instance over the last week (current main/snapshot), running on a 1 GB / 1 CPU machine. The DB is just SQLite, and media processing is done with the bundled wasm ffmpeg, so any overhead there is already included in the graph. The machine is configured with 2 GB of swap to handle spikes.
@dumpsterqueer @Breadly Yeah around the same for me with GTS: 300MB RES with sqlite
@Breadly the database in my case is shared between several services (Pleroma, Prosody, Nextcloud, etc.), so I'm not sure what share of that figure I should count for it, but I'd say 1.8 GiB or less
@Breadly GoToSocial, average RAM usage is 420MiB.

@Breadly my single-user instance that also follows Yassie has:

MEM USAGE: 409MB / 16.64GB
MEM%: 2.47%

CPU TIME: 24m48s
AVG CPU%: 1.80%

Running on Podman (Docker) with a SQLite DB. I rarely use this instance, though; I can imagine spikes closer to 1 GB with actual posting and sharing.

@Breadly around 9 GB (including database, Elasticsearch, Debian, etc.) for a small Masto instance

(tho it's definitely configured to use a lot more than would strictly be required, and a lot of that usage would be fine offloaded onto zram)

@Breadly akkoma is using 435.7MiB with a 1GiB limit, + 700MiB for postgres, shared with a matrix server

@Breadly

I'm basically the only user
650MB (SQLite)
#GoToSocial 0.19.1

@Breadly@soc.breadcat.run looks like i'm at about 1.8 gigs for my evy.pet sharkey instance, which i share with a few friends. this includes the instance itself, redis, and postgres. all running in docker
@Breadly@soc.breadcat.run Enby.Life sits at around 4GB (50% of system capacity) with occasional spikes up to 5 or 6 GB during periods of heavy traffic. That's for Postgres, 2x Redis servers, Reverse Proxy, 2x Sharkey queue workers, 3x Sharkey API servers, ModShark daemon, and OS filesystem cache.

@Breadly I'm at 371MB currently for my home gotosocial which is used by me and like five or six active bots

the sqlite DB uses 4.4 GB on disk and the instance has been running since mid-2022

@Breadly My single-user Mastodon instance has 8 GB RAM + 4 GB swap, and shows about 3.4 GB used total. It runs a basic Linux system plus Postgres for the database and whatever stuff Mastodon needs server-side including Redis, but no full-text search.

I initially had it with 4 GB RAM but that was too tight a squeeze for a mastodon.

@Breadly@soc.breadcat.run single-user Sharkey instance on a very-over-specced dedicated server
PostgreSQL is currently using 16GB of virtual address space, about 4GB of resident memory (but it's also backing a NextCloud, a MediaWiki, and some of my own code)
Sharkey itself has allocated 32GB of virtual address space, about 3.5GB of resident memory (running 4 processes)
there's also about 150MB of memory for Redis, but I feel that's a rounding error at this point
@Breadly@soc.breadcat.run All Sharkey. PlasmaTrap-proper lives on a VM with 16GB of RAM, of which it uses ~8GB.
My single-user private instance hovers at ~700MB total.
@Breadly about a gigabyte for Iceshrimp.net
@Breadly@soc.breadcat.run 4.4 GB in total; Postgres seems to be consuming 3.3 of those. Misskey (vanilla). Don't know if 31 users fit the bill, but about 10 are active. Hope that helps!
@Breadly I run Mastodon 4.3.9 and it uses 10 GB; 6 GB of that is Elasticsearch, which is optional, but I like being able to search more easily. The server has 16 GB total. I run my Mastodon instance on a lightweight version of Ubuntu called Lubuntu, on bare metal rather than Docker, so all the configuration is up to me.
@Breadly@soc.breadcat.run
i run sharkey with 8 users using a docker stack (sharkey, postgresql, redis) on debian lxc and the total ram usage stays around 1 GB
sharkey uses ~700-800 MB, postgres uses ~350-400 MB and redis uses ~30 MB
@Breadly Single user here, using GoToSocial on Docker with SQLite (1.3 GB now). The whole thing has never reached 400 MB of RAM, always below.
@Breadly dreampi.es with its three users and GoToSocial + SQLite DB: `Memory: 349.2M (peak: 885.5M, swap: 93.9M, swap peak: 203.8M)`

The reverse proxy (Caddy) adds a bit more, but that is also proxying tons of other services:
`Memory: 72.6M (peak: 157.7M, swap: 5.8M, swap peak: 14.6M)`
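
(Those `Memory:` lines are systemd's per-service cgroup accounting, by the way; on a recent enough systemd, `systemctl status <unit>` prints the same figure with peak and swap included, which makes it an easy way to capture the whole service rather than a single process.)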

@Breadly I'm running the full Mastodon stack for my single-user instance (including Elasticsearch and an additional MinIO for media) on a VM with 8 GB RAM, of which about 6 GB is reported as being used by applications.

DB size is about 22 GB after a good 8 years.

@Breadly With Mastodon, 1.4 GiB total for the containers (the VPS has 2 GiB of RAM). I'm not running Elasticsearch or any optional services.

I get rare OOM kills and I haven't done any memory usage optimization yet.

@Breadly `docker stats` for me reports ~900 MiB for the Akkoma container and ~950 MiB for the Postgres container; the latter is also used by Synapse (1.128 GiB). Total memory usage on the system is 3.5 GiB out of 4 GiB.

@Breadly Just over 4 GB total RSS for Mastodon (web, streaming, Sidekiqs, Redis, PostgreSQL, and Elasticsearch), assuming I've not missed or miscounted anything (it runs on k8s without metrics, so that's entirely plausible).

TBH it's lighter than I thought it was going to be, which further makes me wonder if I made a mistake.