Configuration Resolution
While Model Resolution maps to the direct model configuration as defined in `config.toml`, the final configuration is built by merging it with other layers of configuration from different sources. This gives full flexibility over where every configuration key is defined.
Configuration Sources
The following configuration sources are supported, ordered from lowest to highest precedence (replace `openai` and `variant` with your provider and variant names as needed).
- Defaults: These are hardcoded, (hopefully) sane default values for common provider properties.
- `PROMPTCMD_PROVIDERS_*`: Non-provider-specific environment variables
- In `config.toml` under `[providers]`: Properties common across all providers
- `PROMPTCMD_OPENAI_*`: Environment variables for the OpenAI provider
- In `config.toml` under `[providers.openai]`: Properties for the OpenAI provider
- `PROMPTCMD_OPENAI_VARIANT_*`: Environment variables for the variant
- In `config.toml` under `[providers.openai.variant]`: Properties for the variant
- In the prompt file's frontmatter: Properties for the variant
- Command-line arguments: e.g., `--config-temperature`
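Conceptually, this amounts to a layered merge: each source is applied on top of the lower-precedence ones, and a key defined higher up replaces the same key from below. The following Python snippet is only a sketch of that idea, not the tool's actual implementation, and the values are placeholders:

```python
# Illustrative sketch only -- not the tool's actual implementation.
# Each dict stands for one configuration source; values are placeholders.

def resolve_config(layers):
    """Merge layers ordered from lowest to highest precedence; later layers win."""
    resolved = {}
    for layer in layers:
        resolved.update(layer)
    return resolved

layers = [
    {"temperature": 1.0},   # hardcoded defaults
    {},                     # PROMPTCMD_PROVIDERS_* environment variables
    {},                     # [providers] in config.toml
    {"temperature": 0.5},   # PROMPTCMD_OPENAI_* environment variables
    {"temperature": 1.0},   # [providers.openai] in config.toml
    {},                     # PROMPTCMD_OPENAI_VARIANT_* environment variables
    {},                     # [providers.openai.variant] in config.toml
    {"temperature": 0.2},   # prompt file frontmatter
    {},                     # command-line arguments (--config-temperature)
]

print(resolve_config(layers))  # {'temperature': 0.2} -- the frontmatter value wins
```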
General Strategy
Generally, resolution adheres to the following principles:
- Specialized configurations override general ones (e.g., variant vs. base provider).
- `config.toml` takes precedence over environment variables.
- A frontmatter-defined configuration overrides both `config.toml` and any environment variable.
- A command-line configuration overrides everything.
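Put differently, for any single key the winning value comes from the most specific source that defines it. Again as a purely illustrative sketch (hypothetical, not the actual implementation):

```python
# Hypothetical per-key view of the same rules: walk the sources from highest
# to lowest precedence and return the first value found.

def resolve_key(key, sources_highest_first):
    for source in sources_highest_first:
        if key in source:
            return source[key]
    raise KeyError(f"{key} is not defined by any source")

sources = [
    {},                     # command-line arguments
    {},                     # prompt file frontmatter
    {},                     # [providers.openai.variant] in config.toml
    {},                     # PROMPTCMD_OPENAI_VARIANT_* environment variables
    {"temperature": 1.0},   # [providers.openai] in config.toml
    {"temperature": 0.5},   # PROMPTCMD_OPENAI_TEMPERATURE
    {},                     # [providers] in config.toml
    {},                     # PROMPTCMD_PROVIDERS_* environment variables
    {"temperature": 1.0},   # hardcoded defaults
]

print(resolve_key("temperature", sources))  # 1.0 -- [providers.openai] beats the env var
```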
Example
If `config.toml` contains:

```toml
[providers.openai]
temperature = 1.0
```

and the environment variable `PROMPTCMD_OPENAI_TEMPERATURE=0.5` is set, the final configuration honors the TOML value.
However, if the prompt file additionally contains:

```yaml
---
config:
  temperature: 0.2
---
```

or the prompt is executed with `--config-temperature 0.2`, then these take precedence over everything else.
Groups
When Model Resolution results in a group, the above configuration resolution strategy is applied to its members.
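As a hypothetical illustration (member names and layer contents are made up), this amounts to running the same merge once per member:

```python
# Hypothetical sketch: when model resolution yields a group, the same layered
# merge is applied once per member. Member names and layers are made up.

def resolve_config(layers):
    resolved = {}
    for layer in layers:
        resolved.update(layer)
    return resolved

# Placeholder per-member layers; in practice each member has its own defaults,
# environment variables, config.toml sections, frontmatter, and CLI overrides.
member_layers = {
    "openai.variant-a": [{"temperature": 1.0}, {"temperature": 0.2}],
    "openai.variant-b": [{"temperature": 1.0}],
}

resolved = {member: resolve_config(layers) for member, layers in member_layers.items()}
print(resolved)
# {'openai.variant-a': {'temperature': 0.2}, 'openai.variant-b': {'temperature': 1.0}}
```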