Documentation Index
Fetch the complete documentation index at: https://docs.lucid.foundation/llms.txt
Use this file to discover all available pages before exploring further.
Deploy from CLI
Launch an Agent Using the Base Runtime
To deploy an agent from the command line, use the base runtime container image: ghcr.io/lucid-fdn/agent-runtime:v1.0.0. This image is designed to work with any OpenAI-compatible endpoint for inference, which you specify using the PROVIDER_URL environment variable.
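A minimal launch might look like the following sketch; the endpoint URL is a placeholder assumption (a local OpenAI-compatible server), not a value from this documentation:

```shell
# Run the base agent runtime, pointing inference at an
# OpenAI-compatible endpoint (hypothetical local server URL).
docker run --rm \
  -e PROVIDER_URL="http://localhost:8000/v1" \
  ghcr.io/lucid-fdn/agent-runtime:v1.0.0
```

In practice you will also want to pass the configuration variables described below.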
Configuration
The deployment is configured through several environment variables:
- LUCID_MODEL: Specifies the model to be used.
- LUCID_PROMPT: Defines the prompt for the agent.
- LUCID_TOOLS: Lists any additional tools or configurations.
Inference and Receipts
- Inference: Handled via PROVIDER_URL, allowing you to connect to any OpenAI-compatible endpoint.
- Receipts: Managed through LUCID_API_URL, which is decoupled from the inference process to ensure flexibility and modularity.
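Putting the variables together, a fully configured launch could be sketched as follows. Every value shown is a hypothetical placeholder, and the comma-separated format for LUCID_TOOLS is an assumption; because PROVIDER_URL and LUCID_API_URL are decoupled, they can point at entirely different services:

```shell
# Hypothetical full configuration (all values are placeholders).
# PROVIDER_URL handles inference; LUCID_API_URL handles receipts.
docker run --rm \
  -e LUCID_MODEL="gpt-4o-mini" \
  -e LUCID_PROMPT="You are a helpful agent." \
  -e LUCID_TOOLS="search,calculator" \
  -e PROVIDER_URL="http://localhost:8000/v1" \
  -e LUCID_API_URL="https://receipts.example.com" \
  ghcr.io/lucid-fdn/agent-runtime:v1.0.0
```

This separation means you can, for example, swap the inference backend (a local server, a hosted provider) without touching how receipts are recorded.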