Deployment
Learn to deploy OpenAgent.
Introduction
This guide provides instructions for deploying OpenAgent using containerized solutions, including Docker and Kubernetes. It assumes a basic understanding of terminal commands and containerization concepts.
The guide is opinionated towards a containerized deployment. If you prefer an alternative deployment method, the source code is available at: https://github.com/RSS3-Network/OpenAgent.
Note that while OpenAgent itself is production-ready, the deployment methods outlined here may not be universally applicable; you may need to adapt them to the specific requirements of your own setup.
Prerequisites
Hardware Requirements
Here is a recommended hardware configuration:
- CPU: 4 Cores
- RAM: 8 GB
- GPU: an NVIDIA card with 16 GB VRAM
- Storage: 20 GB SSD (adjust based on your model size)
Configure via .env
A .env file is required for deploying your OpenAgent instance. Duplicate the .env.sample file in src/ to begin with.
Here we provide a sample .env file.
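The keys shown below are illustrative placeholders only; the authoritative variable names are the ones defined in the .env.sample file itself:

```ini
# Illustrative placeholders only; the authoritative keys are defined in src/.env.sample.

# Model backend (assumption: a local model served through Ollama)
MODEL_NAME=llama3
OLLAMA_HOST=http://ollama:11434

# Database connection (hypothetical values; change before use)
POSTGRES_USER=openagent
POSTGRES_PASSWORD=change-me
POSTGRES_DB=openagent
```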
Deploy via docker-compose
Sample docker-compose.yaml
For your reference, a production docker-compose.yaml looks roughly like the following sketch.
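The service names, image tag, and exposed port below are assumptions rather than the project's official compose file; treat it as a starting point and prefer the compose file shipped with the repository:

```yaml
# Minimal sketch: service names, image tag, and ports are assumptions.
services:
  openagent:
    image: ghcr.io/rss3-network/openagent:latest   # assumed image location; check the repository
    env_file:
      - .env
    ports:
      - "18000:18000"   # Chainlit UI (see "Integrate into Any App" below)
    restart: unless-stopped

  ollama:
    image: ollama/ollama:latest   # only needed when running a local model
    volumes:
      - ollama:/root/.ollama
    restart: unless-stopped

volumes:
  ollama:
```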
Start OpenAgent
Once you have the compose file and .env file ready, simply run:
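```bash
# Start all services in the background; run from the directory that
# contains docker-compose.yaml and .env
docker compose up -d
```

If you prefer to watch the startup output, omit -d or follow the logs with docker compose logs -f.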
🎉 And you are done! (Yes, it's that simple.)
Deploy via Terraform on Google Cloud
If you are reading this, you probably don't need much guidance.
See our Terraform module:
- https://github.com/RSS3-Network/terraform-gcp-openagent
- https://registry.terraform.io/modules/RSS3-Network/openagent/gcp/latest
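As a rough sketch, the module can be consumed from the registry as shown below; the input variables are assumptions for illustration, so check the module's documented inputs before applying:

```hcl
# Sketch only: the input variables here are hypothetical; see the module's
# registry documentation for the actual inputs it accepts.
module "openagent" {
  source  = "RSS3-Network/openagent/gcp"
  version = "~> 1.0"             # pin to a version listed on the registry

  project_id = "my-gcp-project"  # hypothetical
  region     = "us-central1"     # hypothetical
}
```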
Integrate into Any App
To integrate OpenAgent into any app, see:
Integrate with Any App
Learn how to integrate OpenAgent with any App.
You may also visit localhost:18000 to open Chainlit, which is built into OpenAgent by default, and start using OpenAgent. See:
Chainlit
Learn how to interact with OpenAgent via Chainlit.
Initialization with Local Models
If you have successfully deployed your OpenAgent instance and you are using a local model, there are a few initialization steps to take. OpenAgent uses Ollama to manage your local models; see:
Ollama
Learn how to deploy an open source model with OpenAgent using Ollama.
Download the Model
Send this request to Ollama to download the chosen model:
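```bash
# Pull the model through Ollama's /api/pull endpoint.
# Assumes Ollama is reachable on its default port 11434; adjust the host
# if it runs as a separate service (for example http://ollama:11434).
curl http://localhost:11434/api/pull -d '{
  "name": "${MODEL_NAME}"
}'
```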
Replace ${MODEL_NAME} with the model you are using. For compatible models, see:
Compatible Models
Learn about the models compatible with OpenAgent.
Conclusion
This guide provides basic instructions for deploying an OpenAgent instance in containerized environments. For more advanced configurations, refer to the respective Docker or Kubernetes documentation.
This guide adopts an opinionated stance in favor of containerized deployment. We advocate for this approach because it ensures consistency and scalability and makes the application easy to port across environments. However, we appreciate the diverse preferences and needs within our community; for those who wish to compile and run the application locally, the source code is available at: https://github.com/RSS3-Network/OpenAgent.