How Aikido Uses AI

Aikido uses AI throughout the full software development lifecycle. It starts in the IDE, continues in pull requests and scans, helps teams fix and enforce security standards, and extends into runtime protection and pentesting.

AI across the SDLC

From first code to production, Aikido uses AI to:

  • prioritize issues early

  • generate and refine fixes

  • generate API specs

  • enforce custom code and cloud rules

  • learn repo-specific standards through extra code context

  • track AI usage in production

  • validate security with AI-powered pentests

1. Start in the IDE

Aikido brings AI directly into the IDE. Developers can prioritize findings and apply fixes before code reaches a pull request.

Read Aikido AI in IDE.

2. Reduce noise during scans and review

As code moves through scans and pull requests, Aikido uses AI to cut SAST noise and surface the findings that matter first. It assesses exploitability and adjusts priority based on the surrounding code context and likely impact. Check out Denoise via SAST AutoTriage.
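To make the triage idea concrete, here is an illustrative sketch (not Aikido's actual algorithm) of how a raw SAST severity can be re-scored using simple context signals like reachability; all names and weights are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    rule: str
    base_severity: int   # 1 (low) .. 10 (critical) from the raw SAST scan
    reachable: bool      # is the flagged code reachable from an entry point?
    in_test_code: bool   # findings in test fixtures rarely matter in production

def triage_priority(f: Finding) -> int:
    """Re-score a raw finding using simple context signals (illustrative weights)."""
    score = f.base_severity
    if not f.reachable:
        score -= 4       # unreachable code: deprioritize
    if f.in_test_code:
        score -= 3       # test-only code: deprioritize
    return max(score, 1)

findings = [
    Finding("sql-injection", 9, reachable=True, in_test_code=False),
    Finding("weak-hash", 6, reachable=False, in_test_code=False),
    Finding("hardcoded-secret", 7, reachable=True, in_test_code=True),
]
ranked = sorted(findings, key=triage_priority, reverse=True)
```

The point of a pass like this is ordering, not suppression: every finding survives, but the exploitable ones rise to the top of the review queue.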

Code Quality is also AI-powered. It reviews changes newly introduced in pull requests and helps enforce engineering standards across many languages. More info in Code Quality Overview.

You can also add extra code context. This gives Aikido AI more signal and less noise. Use it to explain accepted exceptions, architectural choices, and repo-specific standards. Aikido then uses that context to make Code Quality comments more relevant. Check out how to Add Extra Code Context.

3. Fix issues with AutoFix

Aikido generates reviewable fixes for code, dependencies, infrastructure, and containers. You can apply fixes in the IDE or open pull requests for review.

You can also refine generated fixes with follow-up instructions, so the patch better matches your codebase and standards.

Read AutoFix Overview, AutoFix for SAST and IaC Issues, AutoFix for Open Source Dependencies, AutoFix for Containers, and Refine AutoFixes with Aikido AI.

4. Generate specs and enforce policies

Aikido can generate an OpenAPI specification directly from backend code. That helps teams start API scanning without maintaining the spec by hand.

Read Autogenerate OpenAPI via Aikido AI (Code2Swagger).
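To show what spec generation produces, here is a minimal sketch of an OpenAPI 3.0 document of the kind a code-to-spec tool might emit for a backend with one endpoint. The builder function and route format are hypothetical, used only to illustrate the output shape, not Code2Swagger's internals:

```python
def build_openapi(title: str, routes: list[dict]) -> dict:
    """Assemble a minimal OpenAPI 3.0 document from route descriptions."""
    spec = {
        "openapi": "3.0.3",
        "info": {"title": title, "version": "1.0.0"},
        "paths": {},
    }
    for r in routes:
        # One operation object per (path, method) pair.
        spec["paths"].setdefault(r["path"], {})[r["method"].lower()] = {
            "summary": r["summary"],
            "responses": {"200": {"description": "OK"}},
        }
    return spec

spec = build_openapi("orders-service", [
    {"path": "/orders/{id}", "method": "GET", "summary": "Fetch one order"},
])
```

Once a spec like this exists, an API scanner can enumerate and probe every declared path without anyone maintaining the document by hand.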

Aikido also uses AI for custom rule creation. Teams can define custom code checks in natural language and generate custom cloud misconfiguration checks for their environment.

Read Add Custom Code Rules and Custom CSPM Rules.
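As an illustration of the idea, a natural-language rule such as "flag any call to eval or exec" could compile down to a simple AST check like the sketch below. The rule format and compilation are hypothetical, not Aikido's internal representation:

```python
import ast

# Hypothetical compiled form of the rule "flag any call to eval or exec".
BANNED = {"eval", "exec"}

def find_banned_calls(source: str) -> list[int]:
    """Return line numbers of direct calls to banned builtins."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in BANNED):
            hits.append(node.lineno)
    return hits

sample = "x = eval(user_input)\nprint(x)\n"
```

The benefit of expressing rules in natural language is that the team states the intent once and the generated check enforces it on every scan.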

5. Track runtime AI usage

Zen Firewall tracks LLM provider usage, model activity, token consumption, and estimated cost. That gives teams visibility into how AI is used in running applications.

Read Tracking AI / LLM usage with Zen Firewall.
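The kind of accounting involved can be sketched as a per-model token aggregator with estimated cost. This is illustrative only, not Zen Firewall's implementation; the class, model names, and per-1k-token prices are hypothetical:

```python
from collections import defaultdict

# Hypothetical per-1k-token prices for illustration; real prices vary
# by provider and model.
PRICE_PER_1K = {"gpt-4o": 0.005, "claude-sonnet": 0.003}

class UsageTracker:
    """Aggregate token counts and estimate cost per model."""

    def __init__(self) -> None:
        self.tokens: dict[str, int] = defaultdict(int)

    def record(self, model: str, tokens: int) -> None:
        self.tokens[model] += tokens

    def estimated_cost(self) -> float:
        # Unknown models contribute zero cost but still count tokens.
        return sum(self.tokens[m] * PRICE_PER_1K.get(m, 0.0) / 1000
                   for m in self.tokens)

tracker = UsageTracker()
tracker.record("gpt-4o", 1200)
tracker.record("claude-sonnet", 3000)
```

Aggregating at the firewall layer means the numbers cover every service that calls an LLM provider, with no instrumentation inside application code.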

6. Validate security with Pentests

Aikido Pentest uses autonomous agents to discover, exploit, and validate vulnerabilities across applications, APIs, and infrastructure.

Read Pentest Overview.
