While there are circumstances in which AIDR can be operated against an open, freely accessible LLM, organizations typically use LLMs protected by API keys, running on dedicated company-hosted instances, or otherwise requiring container-level configuration to interact seamlessly with AIDR for GenAI. Note that operating the AIDR proxy in forward-proxy (“enriched”) mode requires the LLM to be configured in the container settings, not via API key. Configuring the connection in the container also eliminates the need to send the key with every request.
The container environment variables in this section configure a backend connection to LLMs at the container level. Note that many of them have no defaults, as they are only read when connecting to a specific LLM instance.
AIDR can be configured to route traffic to a single AWS account or multiple AWS accounts.
If there is only one AWS account configured, this account is used as the default for routing all traffic.
If multiple AWS accounts are registered, a default account still exists, but it is not used to route all traffic. Requests with incorrect AWS account information will result in an error message.
- There must be a default registered account. In a single-account configuration, that account is set as the default.
- For any additional named credential sets, the suffix X in the environment variables should be the actual AWS Access Key ID. See the Single AWS Account and Multiple AWS Accounts examples below.
| Environment Key | Required | Example Value | Description |
|---|---|---|---|
| HL_LLM_PROXY_AWS_ACCESS_KEY_ID_DEFAULT | True | | Default AWS Access Key ID |
| HL_LLM_PROXY_AWS_SECRET_ACCESS_KEY_DEFAULT | True | | Default AWS Secret Access Key |
| HL_LLM_PROXY_AWS_SESSION_TOKEN_DEFAULT | False | | Default AWS Session Token (required if using temporary credentials) |
| HL_LLM_PROXY_AWS_REGION_DEFAULT | False | us-east-1 | Default AWS Region, used if none is specified for another set |
| HL_LLM_PROXY_AWS_BEDROCK_BASE_URL | False | https://bedrock-runtime.{region}.amazonaws.com | Default Bedrock base URL, used if none is specified for another set |
| HL_LLM_PROXY_AWS_SAGEMAKER_BASE_URL | False | https://runtime.sagemaker.{region}.amazonaws.com | Default SageMaker base URL, used if none is specified for another set |
| HL_LLM_PROXY_AWS_ACCESS_KEY_ID_X | False | | AWS Secret Access Key for the credential set (the suffix X is the set's AWS Access Key ID) |
| HL_LLM_PROXY_AWS_SESSION_TOKEN_X | False | | AWS Session Token for the credential set |
| HL_LLM_PROXY_AWS_REGION_X | False | | AWS Region for the credential set; falls back to the default region |
| HL_LLM_PROXY_AWS_BEDROCK_BASE_URL_X | False | | Bedrock base URL for the credential set; falls back to the default Bedrock base URL |
| HL_LLM_PROXY_AWS_SAGEMAKER_BASE_URL_X | False | | SageMaker base URL for the credential set; falls back to the default SageMaker base URL |
If only one AWS account is registered with AIDR GenAI, this account is used by default to route traffic.
Example - Default AWS Account
This will create a default credential set. The application can then use this credential set to configure AWS clients.
export HL_LLM_PROXY_AWS_ACCESS_KEY_ID_DEFAULT=<your-access-key-id>
export HL_LLM_PROXY_AWS_SECRET_ACCESS_KEY_DEFAULT=<your-secret-access-key>
export HL_LLM_PROXY_AWS_REGION_DEFAULT=us-east-1

AIDR can be configured to route traffic to multiple AWS accounts. When multiple credential sets are registered, the default account is not used to route all traffic; requests with incorrect AWS account information will result in an error message.
When configuring multiple AWS accounts, you must create a default account and additional accounts.
Example - Default AWS Account
This will create a default credential set.
export HL_LLM_PROXY_AWS_ACCESS_KEY_ID_DEFAULT=<your-access-key-id>
export HL_LLM_PROXY_AWS_SECRET_ACCESS_KEY_DEFAULT=<your-secret-access-key>
export HL_LLM_PROXY_AWS_REGION_DEFAULT=us-east-1

Example - Additional AWS Accounts
If your AWS Access Key ID is AKIAXXXXXXXXFOO, this will create a credential set named AKIAXXXXXXXXFOO.
export HL_LLM_PROXY_AWS_ACCESS_KEY_ID_AKIAXXXXXXXXFOO=<your-secret-access-key>
export HL_LLM_PROXY_AWS_SESSION_TOKEN_AKIAXXXXXXXXFOO=<your-session-token>
export HL_LLM_PROXY_AWS_REGION_AKIAXXXXXXXXFOO=us-east-1

There must be a default AWS account, whether a single account or multiple accounts are registered.
For any additional named credential sets, the suffix X in the environment variables should be the actual AWS Access Key ID. For example, if your AWS Access Key ID is AKIAXXXXXXXXFOO, set HL_LLM_PROXY_AWS_ACCESS_KEY_ID_AKIAXXXXXXXXFOO and corresponding variables using that exact key ID. This creates a credential set named after that AWS Access Key ID.
Arbitrary settings can be sourced from AWS Secrets Manager. This is intended for use with Instance Profile authentication only.
The environment variable format is HL_LLM_PROXY_SECRETS_<MANAGER NAME>_SECRET_<PROXY SETTING NAME>.
- For <MANAGER NAME>, use AWS.
- For <PROXY SETTING NAME>, use any of the environment variable names for a setting.
The following examples pull HL_LICENSE and HL_LLM_PROXY_CLIENT_ID from a secret named HiddenLayer_AIDR in AWS Secrets Manager.
HL_LLM_PROXY_SECRETS_AWS_SECRET_HL_LICENSE=arn:aws:secretsmanager:us-west-2:123456789012:secret:HiddenLayer_AIDR-abCdEF
HL_LLM_PROXY_SECRETS_AWS_SECRET_HL_LLM_PROXY_CLIENT_ID=arn:aws:secretsmanager:us-west-2:123456789012:secret:HiddenLayer_AIDR-abCdEF

When creating new secrets, AWS encourages you to store the secret value as a JSON blob. For example, '{"HL_LLM_PROXY_CLIENT_ID": "xxxx-xxxx-xxxx-xxxx"}'.
- This implementation requires the keys in the JSON blob to match the proxy environment variable setting names.
- Alternatively, you can store a plaintext value for the secret.
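As an illustration, a secret like the one referenced above could be created with the AWS CLI; the secret name and client ID shown here are placeholders, not values shipped with AIDR:

```bash
# Sketch: create a secret whose JSON keys match proxy env var setting names.
# The secret name and client ID are placeholder values for this example.
aws secretsmanager create-secret \
  --name HiddenLayer_AIDR \
  --secret-string '{"HL_LLM_PROXY_CLIENT_ID": "xxxx-xxxx-xxxx-xxxx"}'
```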
AIDR can be configured to route traffic to an Azure OpenAI account.
| Environment Key | Required | Example Value | Description |
|---|---|---|---|
| HL_LLM_PROXY_AZURE_TENANT_ID | False | | A unique identifier for your Azure tenant |
| HL_LLM_PROXY_AZURE_CLIENT_ID | False | | A unique identifier assigned to your application registration |
| HL_LLM_PROXY_AZURE_CLIENT_SECRET | False | | A unique secret assigned to your application registration |
| HL_LLM_PROXY_AZURE_BASE_URL | False | | Default Azure application base URL, used if none is specified for another set |
| HL_LLM_PROXY_AZURE_REGION | False | eastus | Default Azure region |
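For illustration, a minimal sketch of a service-principal configuration; every value below is a placeholder:

```bash
# Sketch: Azure service-principal credentials for AIDR (placeholder values).
export HL_LLM_PROXY_AZURE_TENANT_ID=<your-tenant-id>
export HL_LLM_PROXY_AZURE_CLIENT_ID=<your-client-id>
export HL_LLM_PROXY_AZURE_CLIENT_SECRET=<your-client-secret>
export HL_LLM_PROXY_AZURE_REGION=eastus
```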
| Environment Key | Required | Example Value | Description |
|---|---|---|---|
| HL_LLM_PROXY_OPENAI_AZURE_API_KEY | False | | The API key to access Azure OpenAI |
| HL_LLM_PROXY_OPENAI_AZURE_SCHEME | False | | The URL scheme for the Azure OpenAI endpoint |
| HL_LLM_PROXY_OPENAI_AZURE_HOST | False | | The name of the Azure OpenAI host |
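A minimal sketch of API-key access, assuming placeholder values; the host name here follows Azure OpenAI's usual <resource>.openai.azure.com pattern:

```bash
# Sketch: API-key access to Azure OpenAI (placeholder values).
export HL_LLM_PROXY_OPENAI_AZURE_API_KEY=<your-azure-openai-api-key>
export HL_LLM_PROXY_OPENAI_AZURE_SCHEME=https
export HL_LLM_PROXY_OPENAI_AZURE_HOST=<your-resource>.openai.azure.com
```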
AIDR can be configured to route traffic to an OpenAI account.
| Environment Key | Required | Example Value | Description |
|---|---|---|---|
| HL_LLM_PROXY_OPENAI_DEFAULT_MODEL | False | gpt-4o | Enforces the use of only the model specified in this setting |
| HL_LLM_PROXY_OPENAI_BASE_URL | False | https://api.openai.com | Default OpenAI base URL, used if none is specified for another set |
| HL_LLM_PROXY_OPENAI_API_KEY | False | | The API key to access OpenAI |
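As a sketch, an OpenAI backend pinned to a single model could be configured as follows; the key is a placeholder, and the base URL and model mirror the table's example values:

```bash
# Sketch: OpenAI backend restricted to one model (placeholder API key).
export HL_LLM_PROXY_OPENAI_API_KEY=<your-openai-api-key>
export HL_LLM_PROXY_OPENAI_BASE_URL=https://api.openai.com
export HL_LLM_PROXY_OPENAI_DEFAULT_MODEL=gpt-4o
```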
AIDR can be configured to route traffic to a Hugging Face model.
| Environment Key | Required | Example Value | Description |
|---|---|---|---|
| HL_LLM_PROXY_HUGGINGFACE_TGI_PROVIDER | False | huggingface-tgi | Identifies Hugging Face TGI as the model provider |
| HL_LLM_PROXY_HUGGINGFACE_TGI_NAMES | False | Qwen/Qwen3-0.6B | The Hugging Face model name |
| HL_LLM_PROXY_HUGGINGFACE_TGI_BASE_URLS | False | | Default Hugging Face base URL, used if none is specified for another set |
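A minimal sketch using the table's example values; the base URL is a placeholder for wherever your Text Generation Inference endpoint is listening:

```bash
# Sketch: route to a Hugging Face TGI deployment (placeholder base URL).
export HL_LLM_PROXY_HUGGINGFACE_TGI_PROVIDER=huggingface-tgi
export HL_LLM_PROXY_HUGGINGFACE_TGI_NAMES=Qwen/Qwen3-0.6B
export HL_LLM_PROXY_HUGGINGFACE_TGI_BASE_URLS=http://tgi:8080
```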
AIDR can be configured to route traffic to a custom model, such as one served by Ollama.
| Environment Key | Required | Example Value | Description |
|---|---|---|---|
| HL_LLM_PROXY_CUSTOM_{{MODELNAME}} | False | HL_LLM_PROXY_CUSTOM_LLAMA=tinyllama | Used to provide an appropriate name for the model |
| HL_LLM_PROXY_CUSTOM_{{MODELNAME}}_PROVIDER | False | HL_LLM_PROXY_CUSTOM_LLAMA_PROVIDER=ollama | Provider of the custom model |
| HL_LLM_PROXY_CUSTOM_{{MODELNAME}}_BASE_URL | False | HL_LLM_PROXY_CUSTOM_LLAMA_BASE_URL=http://ollama:11434 | Base URL of the endpoint where the model can be queried |
| HL_LLM_PROXY_CUSTOM_{{MODELNAME}}_API_KEY | False | | The API key for the custom model |
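Putting the rows together, a sketch for a local Ollama deployment serving tinyllama; the host and port mirror the table's example values:

```bash
# Sketch: register a custom model named LLAMA backed by a local Ollama instance.
# Host and port mirror the table's example; adjust for your deployment.
export HL_LLM_PROXY_CUSTOM_LLAMA=tinyllama
export HL_LLM_PROXY_CUSTOM_LLAMA_PROVIDER=ollama
export HL_LLM_PROXY_CUSTOM_LLAMA_BASE_URL=http://ollama:11434
```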