Hacker Next
Design Patterns for Securing LLM Agents Against Prompt Injections (simonwillison.net)
104 points by simonw 23 hours ago | 24 comments