Overview
Obot is an open-source MCP Gateway and AI platform that can be deployed in the cloud or on-prem.
Getting Started
To quickly try a live demo of the Obot MCP Gateway and chat experience, visit https://chat.obot.ai.
To run Obot yourself, you'll need to set up Docker with something like Docker Desktop. Once that's ready, run:
docker run -d --name obot -p 8080:8080 -v /var/run/docker.sock:/var/run/docker.sock -e OPENAI_API_KEY=<API KEY> ghcr.io/obot-platform/obot:latest
Then open your browser to http://localhost:8080 to access the Obot UI.
Replace <API KEY> with your OpenAI API key. Setting it is optional, but you'll need to set up a model provider from the Admin UI before using chat. You can also set ANTHROPIC_API_KEY in the same way, with its value set to your Anthropic API key. Setting both is supported, but OpenAI models will be used as the defaults.
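For example, a run that configures both providers might look like this (same flags as the command above, with placeholder keys to substitute):
docker run -d --name obot -p 8080:8080 -v /var/run/docker.sock:/var/run/docker.sock -e OPENAI_API_KEY=<OPENAI API KEY> -e ANTHROPIC_API_KEY=<ANTHROPIC API KEY> ghcr.io/obot-platform/obot:latest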
For more installation methods, see our Installation Guide.
The Three Parts of Obot
The platform consists of three main components that work together to deliver a comprehensive AI solution.