# LLM Proxy
HiddenLayer LLM Proxy is a service that provides detection and response capabilities for generative AI solutions in the HiddenLayer AI Security Platform.
{% admonition type="warning" name="Locally deployed AIDR proxy in Hybrid mode" %}
- The LLM Proxy API endpoint is supported only for a locally deployed AIDR proxy running in Hybrid mode. This API is not supported in HiddenLayer SaaS.
- This documentation is published on the documentation portal to help distinguish a locally running proxy from the SaaS proxy.
- For more API documentation, see the HiddenLayer Developer Portal.
{% /admonition %}
Version: 1
## Servers
Self-hosted server
```
https:///
```
## Download OpenAPI description
[LLM Proxy](https://docs.hiddenlayer.ai/_bundle/docs/products/aidr-g/llm_proxy_api.yaml)
## Health and Metrics
### Metrics
- [GET /metrics](https://docs.hiddenlayer.ai/docs/products/aidr-g/llm_proxy_api/health-and-metrics/metrics.md): Returns Prometheus-formatted metrics
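A sample of consuming this endpoint: the response body is the standard Prometheus text exposition format, which a scraper or a quick script can parse into name/value pairs. This is a minimal sketch; it assumes the default unlabelled `name value` line shape and skips `# HELP`/`# TYPE` comment lines.

```python
def parse_prometheus_metrics(text: str) -> dict[str, float]:
    """Parse simple name/value lines from Prometheus text exposition,
    skipping blank lines and # HELP / # TYPE comments."""
    metrics: dict[str, float] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # The value is the last whitespace-separated token on the line.
        name, _, value = line.rpartition(" ")
        try:
            metrics[name] = float(value)
        except ValueError:
            continue  # ignore lines that don't end in a numeric sample
    return metrics

# Illustrative payload of the kind GET /metrics returns:
sample = """# HELP requests_total Total requests
# TYPE requests_total counter
requests_total 42
process_cpu_seconds_total 1.5
"""
print(parse_prometheus_metrics(sample))
# → {'requests_total': 42.0, 'process_cpu_seconds_total': 1.5}
```

In practice you would point Prometheus itself at the `/metrics` path of your locally deployed proxy rather than parsing by hand.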
## OpenAI
### Reverse Proxy OpenAI Chat completions
- [POST /v1/chat/completions](https://docs.hiddenlayer.ai/docs/products/aidr-g/llm_proxy_api/openai/reverseproxyopenaicompletions.md): Generate completions for a given prompt
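Because this route mirrors OpenAI's Chat Completions request schema, an existing OpenAI client can target the proxy by swapping only the base URL. The sketch below builds such a request; the proxy host, API key, and model name are placeholder assumptions, not values from this document.

```python
import json

# Hypothetical values — substitute your own proxy host and OpenAI API key.
PROXY_BASE = "https://aidr-proxy.example.com"
OPENAI_API_KEY = "sk-example"  # forwarded upstream by the reverse proxy

url = f"{PROXY_BASE}/v1/chat/completions"
headers = {
    "Authorization": f"Bearer {OPENAI_API_KEY}",
    "Content-Type": "application/json",
}
# Standard OpenAI Chat Completions request body.
payload = {
    "model": "gpt-4o-mini",  # any model your upstream account supports
    "messages": [{"role": "user", "content": "Hello!"}],
}
body = json.dumps(payload)
print(url)
```

Send `body` with any HTTP client (for example `requests.post(url, headers=headers, data=body)`).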
### Proxy OpenAI Chat completions
- [POST /api/v1/proxy/openai/chat/completions](https://docs.hiddenlayer.ai/docs/products/aidr-g/llm_proxy_api/openai/proxyopenaichatcompletions.md): Generate completions for a given prompt and return the response and detection analysis
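Unlike the reverse-proxy route, this variant returns the model response together with the proxy's analysis in one body. The field names below (`analysis`, `detections`) are assumptions for illustration only; consult the endpoint reference for the real response schema.

```python
def is_flagged(response: dict) -> bool:
    """Return True if the proxy's analysis reported any detections.
    The "analysis" and "detections" field names are hypothetical —
    check the endpoint reference for the actual schema."""
    analysis = response.get("analysis", {})   # hypothetical field
    return bool(analysis.get("detections"))   # hypothetical field

# Example response shape, for illustration only:
example = {
    "choices": [{"message": {"role": "assistant", "content": "Hi!"}}],
    "analysis": {"detections": []},
}
print(is_flagged(example))  # → False
```

A caller would typically branch on the analysis result before surfacing the completion to an end user.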
## Azure
### Proxy Azure endpoint
- [POST /api/v1/proxy/azure/{endpoint_name}/score](https://docs.hiddenlayer.ai/docs/products/aidr-g/llm_proxy_api/azure/proxyazureendpointscore.md): Generate text or chat completions for a given Azure endpoint and return the response and analysis
### Proxy Azure endpoint and deployment using Azure Entra
- [POST /api/v1/proxy/azure/{endpoint_name}/{deployment_name}](https://docs.hiddenlayer.ai/docs/products/aidr-g/llm_proxy_api/azure/proxyazureendpointentra.md): Generate text or chat completions for a given Azure endpoint and deployment, authenticating with Azure Entra, and return the response and analysis
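The two Azure routes above take path parameters. This sketch fills in the templates; the proxy host, endpoint name, and deployment name are placeholders for your own deployment, not values from this document.

```python
# Hypothetical values — substitute your own proxy host and resource names.
PROXY_BASE = "https://aidr-proxy.example.com"
endpoint_name = "my-azure-endpoint"
deployment_name = "my-deployment"

# Scoring route, addressed per endpoint:
score_url = f"{PROXY_BASE}/api/v1/proxy/azure/{endpoint_name}/score"
# Azure Entra route, addressed per endpoint and deployment:
entra_url = f"{PROXY_BASE}/api/v1/proxy/azure/{endpoint_name}/{deployment_name}"
print(score_url)
print(entra_url)
```

The Entra route additionally expects an Azure Entra token in the request's authorization header; see the linked endpoint reference for the required headers and body.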
## HuggingFace TGI
### Reverse Proxy HuggingFace TGI endpoint
- [POST /tgi{name}/v1/chat/completions](https://docs.hiddenlayer.ai/docs/products/aidr-g/llm_proxy_api/huggingface-tgi/reverseproxyhuggingfacetgicompletions.md): Generate completions for a HuggingFace TGI endpoint
### Proxy HuggingFace TGI endpoint
- [POST /api/v1/proxy/tgi/{name}](https://docs.hiddenlayer.ai/docs/products/aidr-g/llm_proxy_api/huggingface-tgi/proxyhuggingfacetgicompletions.md): Proxy chat completions for a given HuggingFace TGI endpoint and return the response and analysis
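Both TGI routes interpolate the configured endpoint name into the path. The sketch below follows the path templates exactly as listed above (note that in the reverse-proxy route, `{name}` follows `/tgi` directly); the proxy host and endpoint name are placeholder assumptions.

```python
# Hypothetical values — substitute your own proxy host and TGI endpoint name.
PROXY_BASE = "https://aidr-proxy.example.com"
name = "my-tgi"

# Reverse-proxy route (OpenAI-compatible chat completions), per the
# template /tgi{name}/v1/chat/completions as listed above:
reverse_proxy_url = f"{PROXY_BASE}/tgi{name}/v1/chat/completions"
# Proxy route that also returns the detection analysis:
proxy_url = f"{PROXY_BASE}/api/v1/proxy/tgi/{name}"
print(reverse_proxy_url)
print(proxy_url)
```

As with the OpenAI routes, the reverse-proxy path accepts a standard chat-completions request body, while the proxy path returns the response together with the analysis.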