
Injection: LLM Insecure Plugin Design

Identifier: llm_insecure_plugin_design

Scanner(s) Support

GraphQL Scanner, REST Scanner, WebApp Scanner

Description

LLM insecure plugin design occurs when plugins integrated with large language models do not properly validate their inputs or enforce adequate permission controls. An attacker can exploit this to make a plugin execute harmful code or expose sensitive data, with outcomes as severe as remote code execution or data theft. Developers often skip thorough input validation or implicitly trust parameters supplied by the model or by external sources, which leaves these integrations open to serious security threats if left unaddressed.
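To make the failure mode concrete, here is a minimal Python sketch that is not taken from any specific framework: the run_diagnostics_plugin functions and the ALLOWED_COMMANDS allow-list are illustrative assumptions. The insecure variant passes model-generated text straight to a shell, while the safer variant treats that text as untrusted input and validates it before executing anything.

import shlex
import subprocess

# Hypothetical allow-list of commands this plugin is permitted to run.
ALLOWED_COMMANDS = {"uptime", "df", "free"}

def run_diagnostics_plugin_insecure(llm_output: str) -> str:
    # Insecure: the model-supplied string is executed verbatim in a shell,
    # so a prompt-injected payload like "uptime; cat /etc/passwd" runs as-is.
    return subprocess.run(
        llm_output, shell=True, capture_output=True, text=True
    ).stdout

def run_diagnostics_plugin_safer(llm_output: str) -> str:
    # Safer: validate the untrusted model output against an explicit
    # allow-list and avoid invoking a shell at all.
    parts = shlex.split(llm_output)
    if len(parts) != 1 or parts[0] not in ALLOWED_COMMANDS:
        raise ValueError(f"Rejected plugin input: {llm_output!r}")
    return subprocess.run([parts[0]], capture_output=True, text=True).stdout

The same reasoning applies to plugins that build SQL queries, file paths, or HTTP requests from model output: the model's output should be treated with the same distrust as any other user-controlled input.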

Configuration

Example

Example configuration:

---
security_tests:
  llm_insecure_plugin_design:
    assets_allowed:
    - REST
    - GRAPHQL
    - WEBAPP
    skip: false

Reference

assets_allowed

Type : List[AssetType]

List of assets that this check will cover.

skip

Type : boolean

Skip the test if true.
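For example, assuming the same configuration format shown above, the check can be limited to REST assets by listing only that asset type; setting skip to true disables the check entirely:

---
security_tests:
  llm_insecure_plugin_design:
    assets_allowed:
    - REST
    skip: false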