AI-assisted traffic analysis inside Burp Suite Professional.
BurpGPT Pro brings model-provider settings, active scan analysis, targeted manual scans, and an interactive AI Chat editor into Burp. Use hosted model providers — OpenAI, Anthropic, Google AI Gemini — or keep selected traffic on infrastructure you control with local and self-hosted endpoints.
- $ provider configured
- $ ai-scanner active
- $ targeted-scan ready
- $ ai-chat ready
- $ context req · res · url · headers
- $ output target → site-map
What ships in the box.
Three analysis workflows
Run AI-assisted active scan checks, send selected traffic for targeted review, or ask follow-up questions in the AI Chat editor.
Unified provider settings
Configure hosted, local, or self-hosted model providers once, validate them with Test request, and reuse them across scanning and chat.
Prompt-driven analysis
Use the prompt library and placeholder reference to tailor model input with request, response, URL, method, header, and body context.
Generated insights
live> scan complete · 3 informational issues added to Target site map
Provider-flexible AI analysis for modern Burp workflows.
BurpGPT Pro brings large language model workflows directly into Burp Suite Professional. Configure the active provider once, test the connection from the extension, then reuse the same setup across active scanning, targeted manual analysis, and AI Chat.
The AI scanner integrates with Burp's active scan flow and creates informational issues for generated insights. For narrower testing, send a selected request or response from the context menu and review the result in Target → Site map.
The AI Chat editor lets you ask follow-up questions about selected HTTP traffic while choosing whether to include the current request, response, or both with the next message.
Hosted providers are supported alongside local and self-hosted options. Examples include major cloud model APIs and compatible local runtimes; check the supported provider documentation for the current provider list. When you configure a local endpoint, selected traffic is sent only to that endpoint.
Prompt library import/export and the placeholder reference make it practical to keep repeatable analysis prompts for authentication, session handling, request parsing, response review, and other engagement-specific workflows.
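Placeholder substitution of this kind can be sketched in a few lines. The placeholder names below ({URL}, {METHOD}, {REQUEST}, {RESPONSE}) are illustrative only, not BurpGPT Pro's actual tokens; consult the extension's placeholder reference for the real names.

```python
# Illustrative sketch of prompt-placeholder substitution.
# Placeholder names are hypothetical; see BurpGPT Pro's
# placeholder reference for the actual tokens.

TEMPLATE = (
    "Review the following HTTP exchange for session-handling issues.\n"
    "URL: {URL}\nMethod: {METHOD}\n\n"
    "Request:\n{REQUEST}\n\nResponse:\n{RESPONSE}"
)

def render_prompt(template: str, context: dict) -> str:
    """Fill each placeholder with the selected traffic context."""
    prompt = template
    for name, value in context.items():
        prompt = prompt.replace("{" + name + "}", value)
    return prompt

prompt = render_prompt(TEMPLATE, {
    "URL": "https://example.com/login",
    "METHOD": "POST",
    "REQUEST": "POST /login HTTP/1.1\nHost: example.com",
    "RESPONSE": "HTTP/1.1 302 Found\nSet-Cookie: session=abc123",
})
```

Keeping templates in this shape is what makes the import/export workflow useful: the same prompt can be reused across engagements with only the traffic context changing.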
Flexible
Cloud, local, and self-hosted backends
3
Active scan, targeted scan, and AI Chat
Private
Keep sensitive traffic on your own infrastructure
One licence. Every workflow.
Legacy
Archived
- Former open-source Community edition.
- No longer actively maintained.
- Kept available for reference and historical community use.
Pro
£99
£79 /year
Introductory price for early Pro customers.
- Compatible with the currently supported Burp Suite Professional release line.
- Works with hosted, local, and self-hosted model-provider setups, including major cloud APIs and compatible local runtimes.
- Analyse traffic with AI-powered active scan checks, targeted context-menu scans, and the AI Chat editor.
- Keep sensitive traffic on infrastructure you control by using local or self-hosted provider endpoints.
- Validate provider setup with Test request before scanning or chatting.
- Tune provider-specific request parameters such as temperature, top_p, stop, and max_tokens.
- Build reusable scan prompts with request, response, URL, method, header, and body placeholders.
- Manage prompts with add, remove, import, export, and send-to-settings workflows.
- Keep generated insights inside Burp's native issue workflow for easier triage.
- Provider accounts, quotas, and API usage costs are managed separately.
Questions, answered.
Which analysis workflows does BurpGPT Pro support?
BurpGPT Pro supports three analysis workflows:
- Enable the AI scanner to run AI-assisted checks during Burp active scans.
- Right-click selected traffic and choose Scan with AI for targeted manual analysis.
- Use the AI Chat editor tab to ask follow-up questions about the selected request, response, or both.
Can I use one licence on multiple devices?
One BurpGPT Pro licence can be activated on up to three devices. To use BurpGPT Pro on more devices, you need another licence.
Why might licence activation fail?
There are several common reasons for licensing issues:
- Network or security controls may be blocking HTTP/HTTPS traffic to burpgpt.app or burpgpt.netlify.app.
- The activation limit may have been reached. The licence dialog shows your current activation count.
- Clock-related issues can occur if the local system time is out of sync.
- A corrupted .jar download can cause validation or activation errors. Re-download the file and try again.
- BurpGPT Pro requires a supported Burp Suite Professional version.
Can AI-generated findings be inaccurate?
Yes. LLM output can be incomplete, inaccurate, or overly confident, especially when the prompt or supporting traffic context is too broad.
BurpGPT Pro is designed to support security testing workflows, not replace manual validation. Treat generated findings as leads and verify them before relying on them in a report.
Does my traffic stay on my own infrastructure?
If you use a local or self-hosted model provider on infrastructure you control, BurpGPT Pro sends selected prompts, chat messages, HTTP requests, and HTTP responses only to the endpoint configured in Provider settings.
If that endpoint is local, the selected traffic stays local. If you configure a remote self-hosted endpoint, selected traffic is sent to that endpoint.
Local-provider traffic is not sent to a hosted third-party provider unless you explicitly configure one.
Does the licence include model-provider accounts or usage costs?
No. BurpGPT Pro integrates with external model providers, but accounts, setup, quotas, and usage costs are managed separately with each provider.
Hosted providers may charge separately for API usage. Local or self-hosted providers may have their own infrastructure costs.
Can I use a local, self-hosted, or custom model provider?
Yes. BurpGPT Pro supports local, self-hosted, and custom provider setups when they expose a compatible chat API.
In Provider settings, configure:
- The base URL.
- The API key, if required.
- The exact model identifier.
- Any JSON request parameters.
Then run Test request before scanning or chatting.
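The four settings above map directly onto a standard chat-completion request. The sketch below assumes an OpenAI-compatible endpoint; the base URL, model identifier, and parameter values are hypothetical placeholders, so substitute your own provider's details.

```python
import json
import urllib.request

# Hypothetical values: adjust the base URL, API key, and model
# identifier to match your own provider. Assumes the provider
# exposes an OpenAI-compatible chat API.
BASE_URL = "http://localhost:11434/v1"  # e.g. a local runtime
API_KEY = ""                            # leave empty if not required
MODEL = "llama3"                        # exact model identifier

# JSON request parameters comparable to those exposed in
# Provider settings (temperature, max_tokens, and similar).
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Reply with the word: ok"}],
    "temperature": 0.2,
    "max_tokens": 64,
}

req = urllib.request.Request(
    BASE_URL + "/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        **({"Authorization": f"Bearer {API_KEY}"} if API_KEY else {}),
    },
)
# To actually send it: body = urllib.request.urlopen(req).read()
```

If a manual request in this shape succeeds against your endpoint, the same values should pass the extension's Test request check; if it fails, fix the endpoint or credentials before blaming the extension.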
Looking for more answers? Visit our documentation site for a comprehensive FAQ section.