Enhance your Burp Suite with Large Language Models (LLMs)

Experience enhanced web security testing with BurpGPT, our Burp Suite extension that integrates major cloud-based model providers such as Anthropic, Google AI Gemini, and OpenAI for advanced vulnerability scanning and traffic-based analysis. It also supports fully local processing through Ollama and Hugging Face, ensuring that no data ever leaves your device.

Effortless security testing

Effortlessly integrate BurpGPT into your security testing workflows with user-friendly documentation.

Developed by AppSec specialists

Developed by application security experts, BurpGPT represents the cutting edge of web security testing.

Continuously evolving security

BurpGPT continuously improves based on user feedback, ensuring it meets evolving security testing needs.

BurpGPT. Unleash the power of LLMs.
The ultimate Burp Suite add-on.

BurpGPT is a robust tool developed to enhance the precision and efficiency of application security testing. Combining advanced language-processing capabilities with an intuitive interface, it improves security testing for beginners and seasoned testers alike.

By leveraging the power of large language models (LLMs), you can perform sophisticated technical tasks such as evaluating the cryptographic integrity of custom libraries or even detecting zero-days. With your imagination and the quality of your prompts as the only constraints, assess web applications using an AI co-pilot to perform web traffic analysis.

BurpGPT is a comprehensive tool for consultants and security experts alike. By employing local large language models (LLMs), a feature exclusive to the Pro edition, BurpGPT eliminates third-party data sharing, ensuring the confidentiality of client engagements.

Additional Pro edition features include prompt libraries and support for custom-trained models. Have internal repositories of appsec data? Train your own model and harness your internal knowledge base.

Join the community of forward-thinking professionals who use BurpGPT. Elevate your application security testing and ensure privacy and compliance using large language models.

2k+
GitHub ⭐
250+
active users
120,000+
open-source LLMs supported from
Hugging Face and Ollama

Pricing

Community

£0 /year
  • Compatible with Burp Suite Professional edition.
  • Maximise Burp's scanner efficiency with seamless LLM integration.
  • Supports four deprecated GPT-3 models from OpenAI.

Pro

£99

£79 /year

Limited-time promotional offer while we actively enhance BurpGPT Pro.

  • Compatible with Burp Suite Professional edition.
  • Maximise Burp's scanner efficiency with seamless LLM integration.
  • Supports leading LLM providers, including Anthropic, Google AI Gemini, OpenAI, and more.
  • Local LLM support via Hugging Face and Ollama for complete data privacy.
  • Access 120,000+ PyTorch open-source models on Hugging Face Model Hub.
  • Access 100+ open-source models on Ollama Model Library.
  • Compatible with custom-trained models built on Flax, PyTorch, and TensorFlow.
  • Scan targeted traffic to reduce API usage costs and skip unnecessary Burp-wide scans.
  • Manage, store, and share prompts with an extensive prompt library.
  • Enhanced user interface and improved user experience.
  • ...

Frequently Asked Questions

Is it possible to operate BurpGPT editions independently as standalone applications?
No. To use either the BurpGPT Community or Pro edition, you must have Burp Suite Professional installed on your system.
How many devices are allowed for activation with a single BurpGPT Pro license?
A single BurpGPT Pro license allows activation on a maximum of three devices. If you require activation on more than three devices, you must obtain a new license.
What could be the reasons for a failed activation of my BurpGPT Pro?
There could be various reasons why you're facing issues with the licensing system:
  • It's possible that the licensing system endpoints aren't being reached because you're connected through a VPN or a similar network. If that's the case, please try again after disconnecting the VPN.
  • Alternatively, the problem could be with your Java version, and you might need to use Java 14 or a later version.
  • Lastly, it's also possible that the .zip file you downloaded is corrupt, causing issues with the licensing system. In that case, please download the file again and try once more.
Can false positive outcomes be generated by BurpGPT editions?
Most software that automates processes is prone to generating false positives. This is even more true of LLMs, whose output quality is primarily determined by the prompt and the supporting data provided to the models.
Does the local LLM feature, exclusive to the Pro edition, share any data with OpenAI or any third party?
When you use the local LLM feature with the pre-built model type options from the Hugging Face Transformers library, it downloads the pre-trained model and associated files from the Hugging Face Model Hub. However, this process does not send any of your data to Hugging Face or any other external service. The model generation itself is performed locally on your machine, so your text inputs and generated text stay within your system. To summarise, using the local LLM feature will only result in downloading - and then caching - the model files from the Hugging Face Model Hub, and no data will be sent to external services during the actual text generation process.
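If you want an extra guarantee that nothing is fetched after the initial download, the Hugging Face libraries honour documented offline environment variables. Setting these before launching Burp Suite (assuming BurpGPT inherits the Burp process environment, which is typical for extensions) forces the libraries to use only the local cache:

```shell
# Force Hugging Face tooling to use only locally cached model files;
# any attempt to reach the Hub will then fail fast instead of going online.
export HF_HUB_OFFLINE=1
export TRANSFORMERS_OFFLINE=1
```

With these set, text generation continues to work from the cached files, while any accidental network request to the Model Hub is refused.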
The server is unable to start and is displaying an error message indicating that port 3000 is already in use. What steps can I take to resolve this issue?
When you encounter an error message stating that port 3000 is already in use and the server fails to start, it means that another service is currently using that port, or that a previous instance of BurpGPT was not shut down gracefully. To resolve this issue, run the following commands to kill the stale process before restarting BurpGPT:
  • On Windows: Get-Process -Id (Get-NetTCPConnection -LocalPort 3000).OwningProcess | Stop-Process -Force
  • On Linux/macOS: sudo lsof -t -i :3000 to find the process ID, then kill -9 <PID> to force-kill the process.
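Before restarting BurpGPT, you can confirm the port is actually free with a quick standard-library check. This is an illustrative sketch, not part of BurpGPT; it assumes the server binds to localhost on the default port 3000 mentioned above:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        # connect_ex returns 0 when the connection succeeds, i.e. the
        # port is still occupied by a listening process.
        return s.connect_ex((host, port)) == 0

if port_in_use(3000):
    print("Port 3000 is still occupied - kill the old process first.")
else:
    print("Port 3000 is free - safe to restart BurpGPT.")
```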
What is the process for utilising a custom-trained model in BurpGPT Pro?
To use a custom-trained model in BurpGPT Pro edition, your model should be based on a pre-trained model from the Hugging Face library. Once you have created your custom model, you can simply enter the path to the folder containing all the required files (such as config.json, vocab.json, tokenizer.json, etc.) in the "LLM directory" field of the Local LLM tab. After this, all of the requests, responses, and prompts will be processed using your custom-trained model. It's worth noting that BurpGPT Pro doesn't currently support loading non-Hugging Face models directly.
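Before pointing the "LLM directory" field at a folder, it can help to sanity-check that the folder contains the files listed above. The helper below is hypothetical (not part of BurpGPT); the file names follow the Hugging Face convention, and the exact tokenizer files vary by model:

```python
from pathlib import Path

# config.json is always present in a Hugging Face model directory;
# tokenizer files vary by model, so at least one of them should exist.
REQUIRED = {"config.json"}
TOKENIZER_FILES = {"tokenizer.json", "vocab.json", "tokenizer_config.json"}

def validate_model_dir(path: str) -> list[str]:
    """Return a list of problems with a local model directory (empty = OK)."""
    folder = Path(path)
    if not folder.is_dir():
        return [f"{path} is not a directory"]
    present = {p.name for p in folder.iterdir()}
    problems = [f"missing required file: {name}"
                for name in sorted(REQUIRED - present)]
    if not present & TOKENIZER_FILES:
        problems.append("no tokenizer file found (e.g. tokenizer.json)")
    return problems
```

An empty result suggests the directory is at least structurally loadable; BurpGPT Pro still requires the model itself to be Hugging Face-compatible, as noted above.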

Looking for more answers? Visit our documentation site for a comprehensive FAQ section.