Return Prometheus-formatted metrics
LLM Proxy
HiddenLayer LLM-Proxy is a service that provides detection and response capabilities for Generative AI solutions in the HiddenLayer AI Security Platform.
Locally deployed AIDR proxy in Hybrid mode
- The LLM Proxy API endpoint is supported only for a locally deployed AIDR proxy running in Hybrid mode. This API is not supported in HiddenLayer SaaS.
- This documentation is published on the documentation portal to help distinguish a locally running proxy from the SaaS proxy.
- For more API documentation, see the HiddenLayer Developer Portal.
Servers
- Mock server: https://docs.hiddenlayer.ai/_mock/docs/products/aidr-g/llm_proxy_api/
- Self-hosted server: https://<YOUR-SELF-HOSTED-AIDR-INSTANCE-ENDPOINT>/
Request URLs for GET /metrics:
- Mock server: https://docs.hiddenlayer.ai/_mock/docs/products/aidr-g/llm_proxy_api/metrics
- Self-hosted server: https://<YOUR-SELF-HOSTED-AIDR-INSTANCE-ENDPOINT>/metrics
curl -i -X GET \
https://docs.hiddenlayer.ai/_mock/docs/products/aidr-g/llm_proxy_api/metrics
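
The endpoint takes no request body and returns plain text in the Prometheus exposition format. Below is a minimal Python sketch of the same request, assuming the mock server above is reachable and that the endpoint requires no authentication (verify against your deployment); the metric names in the comments are illustrative placeholders, not the proxy's actual metrics.

import requests

# Mock server base URL from the Servers list above; swap in your
# self-hosted AIDR instance endpoint for a real deployment.
BASE_URL = "https://docs.hiddenlayer.ai/_mock/docs/products/aidr-g/llm_proxy_api"

# GET /metrics returns the Prometheus text exposition format, i.e. lines like:
#   # HELP http_requests_total Total HTTP requests processed.
#   # TYPE http_requests_total counter
#   http_requests_total{method="GET"} 42
# (illustrative names only; the proxy's real metric names may differ)
response = requests.get(f"{BASE_URL}/metrics", timeout=10)
response.raise_for_status()

# Print the metric samples, skipping the # HELP / # TYPE comment lines.
for line in response.text.splitlines():
    if line and not line.startswith("#"):
        print(line)

In practice this endpoint is typically registered as a scrape target in a Prometheus server's configuration rather than polled by hand; a manual request like the one above is mainly useful for verifying that the proxy is exposing metrics.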