Are you using any MCP servers? If so, what?
Basically it’s a layer that lets your LLMs plug in to tools.
They generally run on your machine (I use Docker to sandbox them) and may or may not call out to useful APIs.
One example: I just wrote one that connects to my national weather service’s RSS feed, so my LLMs can fetch and summarize the weather for me in the morning.
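The fetch-and-summarize step above boils down to pulling item titles out of an RSS document. A minimal sketch of that part in stdlib Python — the feed structure and `summarize_feed` name are my own placeholders, not the poster's actual tool:

```python
import xml.etree.ElementTree as ET

def summarize_feed(rss_xml: str, limit: int = 3) -> str:
    """Pull the first few <item> titles out of an RSS document."""
    root = ET.fromstring(rss_xml)
    titles = [item.findtext("title", default="") for item in root.iter("item")]
    return "; ".join(titles[:limit])

# Hypothetical sample of what a weather-service RSS feed might contain:
sample = """<rss><channel>
  <item><title>Today: Sunny, high 22C</title></item>
  <item><title>Tonight: Clear, low 9C</title></item>
</channel></rss>"""

print(summarize_feed(sample))
# Today: Sunny, high 22C; Tonight: Clear, low 9C
```

An MCP server would expose something like this as a tool and let the model do the actual summarizing.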
Works well with Gemma 3n
Never heard of this tool, but I’ll check it out.
Mostly I’ve just been writing my own Dockerfiles and spinning up my own MCP instances.
They’re actually not hard to build, so I’m building my own for small utilities; that way I don’t get caught up in an is_even-style dependency web.
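For the sandboxing approach mentioned above, a sketch of the kind of Dockerfile I mean — `server.py` and the base image are placeholders, not the poster's actual setup:

```dockerfile
# Minimal sandbox for a small MCP utility server (placeholder names)
FROM python:3.12-slim

# Run as an unprivileged user so the tool can't touch the host as root
RUN useradd --create-home mcp
USER mcp
WORKDIR /home/mcp

COPY server.py .

# Many MCP servers speak over stdio, so no ports need exposing
CMD ["python", "server.py"]
```

Keeping each utility in its own slim image is what avoids the dependency-web problem: each server carries only what it needs.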
Made some 30 of them, talking to the app server and all the containers inside Docker.
Now we can ask how they’re all doing, posing application-level questions about records, levels, etc., as well as system-level questions like how much RAM the DB server is using. Makes for a fun demo; not sure how useful it will be in production.
Can you explain more about your setup?
I’ve been playing with something similar, where I built a shared file system and a messaging system for my agents. But how are you running your AIs?