
LLM Proxy

HiddenLayer LLM-Proxy is a service that provides detection and response capabilities for Generative AI solutions in the HiddenLayer AI Security Platform.

Locally deployed AIDR proxy in Hybrid mode
  • The LLM Proxy API endpoint is supported only for a locally deployed AIDR proxy running in Hybrid mode. This API is not supported in HiddenLayer SaaS.
  • This documentation is published on the documentation portal to help distinguish a locally running proxy from the SaaS proxy.
  • For more API documentation, see the HiddenLayer Developer Portal.
Servers
Mock server
https://docs.hiddenlayer.ai/_mock/docs/products/aidr-g/llm_proxy_api/
Self-hosted server
https://<YOUR-SELF-HOSTED-AIDR-INSTANCE-ENDPOINT>/

Health and Metrics

Operations

Metrics

Request

Returns Prometheus-formatted metrics.

curl -i -X GET \
  https://docs.hiddenlayer.ai/_mock/docs/products/aidr-g/llm_proxy_api/metrics

Responses

Response with Prometheus-formatted metrics

Body: text/plain
string
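The metrics endpoint returns the standard Prometheus text exposition format: `# HELP` and `# TYPE` comment lines followed by sample lines of the form `name{labels} value`. A minimal sketch of parsing such a response follows; the sample payload and metric names are illustrative only, not actual LLM-Proxy output.

```python
# Minimal sketch: parse Prometheus text-format metrics such as those
# returned by GET /metrics. The sample below is hypothetical; real
# LLM-Proxy metric names may differ.

SAMPLE = """\
# HELP http_requests_total Total HTTP requests handled.
# TYPE http_requests_total counter
http_requests_total{method="GET",path="/metrics"} 42
http_requests_total{method="POST",path="/v1/chat"} 7
"""

def parse_metrics(text: str) -> dict[str, float]:
    """Map each sample line (metric name plus labels) to its value."""
    samples: dict[str, float] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and HELP/TYPE comment lines
        # The value is the last space-separated token on the line.
        name, _, value = line.rpartition(" ")
        samples[name] = float(value)
    return samples

metrics = parse_metrics(SAMPLE)
```

In a monitoring setup you would normally point a Prometheus scraper at the endpoint rather than parse it by hand; the sketch is only meant to show the shape of the response body.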

OpenAI

Operations

Azure

Operations

HuggingFace TGI

Operations