Guardrails
Rules and constraints you set to keep AI output safe, on-topic, and within boundaries. For example, "Do NOT modify existing tests" is a guardrail. Guardrails prevent AI from going rogue on your codebase.
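In practice, a guardrail can live in two places: in the prompt (stating the rule up front) and in a post-check (verifying the AI's output didn't break it). Here is a minimal sketch in Python; the rule strings, function names, and `tests/` path convention are illustrative assumptions, not any specific tool's API.

```python
# Hypothetical guardrails for an AI coding assistant (illustrative only).
GUARDRAILS = [
    "Do NOT modify existing tests.",
    "Stay within the files listed in the task.",
    "Never commit secrets or credentials.",
]

def build_system_prompt(task: str) -> str:
    """Prepend the guardrails to the task so every request carries them."""
    rules = "\n".join(f"- {rule}" for rule in GUARDRAILS)
    return f"Follow these guardrails:\n{rules}\n\nTask: {task}"

def violates_guardrails(changed_files: list[str]) -> bool:
    """Post-check: flag any proposed change that touches a test file."""
    return any(path.startswith("tests/") or "_test" in path
               for path in changed_files)

print(build_system_prompt("Refactor the parser"))
print(violates_guardrails(["src/parser.py"]))         # allowed
print(violates_guardrails(["tests/test_parser.py"]))  # blocked by guardrail
```

The prompt-side rule steers the model; the post-check catches the cases where steering fails, which is why real setups usually pair the two.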