Large corporations typically rely on cloud providers to serve their large language models, each offering a suite of tools and services to build and deploy LLM applications. For instance, Microsoft Azure provides Azure AI services, AWS has Bedrock, and GCP offers Vertex AI. At PAI, we meet our customers where they are by offering a diverse range of integration options through our LLM runtime security solution, Layer. This includes both agent-based and agentless integration options, providing maximum flexibility for securing LLM applications at runtime. In this brief overview, we focus on how you can secure your LLM applications using our agentless integration option.
Agentless security allows you to monitor and secure LLM endpoints without installing agents or instrumenting each LLM application with an SDK. This approach is highly effective because it lets us scan and monitor all your LLM endpoints from the outside, using the information your cloud provider already exposes. For companies that prefer not to install agents due to security or operational concerns, our agentless integration option offers a minimally invasive, cost-effective, and simple way to secure LLM applications in production.
We achieve this by tapping into the existing logging capabilities within vendor environments to collect data. For example, major cloud providers offering AI services typically provide an API gateway to manage LLM endpoints. Using this gateway, we can ingest or scrape logs directly into Layer (when available), monitor any application, and automatically discover new ones as they are detected.
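To make the discovery mechanism concrete, here is a rough sketch of what log-based endpoint discovery can look like. The JSON log schema, field names, and endpoint URLs below are hypothetical and chosen only for illustration; they do not reflect Layer's actual ingestion format or any specific gateway's log layout.

```python
import json


def discover_endpoints(log_lines, known_endpoints):
    """Scan gateway access-log entries for LLM endpoints not yet inventoried.

    Each log line is assumed to be a JSON object with an "endpoint" field
    naming the backend LLM deployment the gateway proxied the call to --
    a hypothetical schema for illustration only.
    """
    discovered = set()
    for line in log_lines:
        try:
            entry = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip malformed log entries
        endpoint = entry.get("endpoint")
        if endpoint and endpoint not in known_endpoints:
            discovered.add(endpoint)
    return discovered


# Example: two endpoints are already inventoried; a new deployment
# appears in the gateway logs and is flagged automatically.
logs = [
    '{"endpoint": "https://prod.example.com/gpt-4o", "status": 200}',
    '{"endpoint": "https://prod.example.com/gpt-35-turbo", "status": 200}',
    '{"endpoint": "https://staging.example.com/new-model", "status": 200}',
]
known = {
    "https://prod.example.com/gpt-4o",
    "https://prod.example.com/gpt-35-turbo",
}
print(discover_endpoints(logs, known))
```

Run periodically against fresh gateway logs, a loop like this is enough to keep an endpoint inventory current without touching the applications themselves.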
While agentless security offers numerous benefits, it also has some limitations. It can only monitor and discover endpoints that are proxied through the gateway from which Layer is collecting logs. Additionally, agentless security does not provide full runtime protection in the same way eBPF does. Unlike eBPF, agentless security cannot run directly alongside your workloads, which limits its visibility into workflows that extend beyond the LLM application.
We currently support agentless integration for Azure AI and are in the process of rolling out support for AWS and GCP. If you're using any of these cloud providers' AI services and have implemented a gateway such as Azure API Management, reach out to us to try our agentless offering, which can be set up in just one day.
Learn more about Layer and set up a demo with a sales expert here.