Enhance your Burp Suite with Large Language Models (LLMs)

Experience enhanced web security testing with BurpGPT, our Burp Suite extension that integrates OpenAI's LLMs for advanced vulnerability scanning and traffic-based analysis. The Pro edition also supports local LLMs, including custom-trained models, ensuring greater data privacy and more accurate results tailored to your needs.

Effortless security testing

Effortlessly integrate Burp GPT into your security testing workflows with user-friendly documentation.

Developed by AppSec specialists

Developed by application security experts, Burp GPT represents the cutting edge of web security testing.

Continuously evolving security

Burp GPT continuously improves based on user feedback, ensuring it meets evolving security testing needs.

%%{ init: { 'theme': 'base', 'themeVariables': { 'primaryColor': '#0d6efd', 'primaryTextColor': '#fff', 'primaryBorderColor': '#808080', 'lineColor': '#808080', 'secondaryColor': '#495057' } } }%%
graph TB
    A(Web Application) -- Traffic --> B{Burp Suite}
    B -- Analysed Traffic --> C{BurpGPT Pro}
    C -- AI Co-Pilot --> D(Web Traffic Analysis)
    E(Custom Trained Models) -- Integration --> C
    D -- Enhanced Report --> B

Burp GPT: unleash the power of LLMs.
The ultimate Burp Suite add-on.

Burp GPT is a robust tool developed to enhance the precision and efficiency of application security testing. Combining advanced language processing capabilities with an intuitive interface, it improves security testing for beginners and seasoned testers alike.

By leveraging the power of large language models (LLMs), you can perform sophisticated technical tasks, such as evaluating the cryptographic integrity of custom libraries or even detecting zero-days. With your imagination and the quality of your prompts as the only constraints, assess web applications using an AI co-pilot for web traffic analysis.

Burp GPT is a comprehensive tool for consultants and security experts alike. By employing local large language models (LLMs), a feature exclusive to the Pro edition, BurpGPT eliminates third-party data sharing and ensures client engagement confidentiality.

Additional Pro edition features include prompt libraries and support for custom-trained models. Have internal repositories of appsec data? Train your own model and harness your internal knowledge base.

Join the community of forward-thinking professionals who use Burp GPT. Elevate your application security testing and ensure privacy and compliance using large language models.

1k+

GitHub ⭐

100+

active users

120,000+

open-source LLMs supported via the Hugging Face Model Hub

Pricing


Community

£0/year

  • Compatible with Burp Suite Professional edition.
  • Maximise Burp's scanner efficiency by integrating LLMs.
  • Support for four (4) now-deprecated GPT-3 LLMs from OpenAI.
Pro

£79/year (reduced from £99)

Make the most of our time-limited promotional offer, available while we actively enhance and refine BurpGPT Pro.

  • Compatible with Burp Suite Professional edition.
  • Maximise Burp's scanner efficiency by integrating LLMs.
  • Support for the latest LLMs from OpenAI, including GPT-4.
  • Support for Azure OpenAI Service.
  • Support for local LLMs ensuring complete data privacy.
  • Leverage 120,000+ PyTorch open-source LLMs on Hugging Face Model Hub.
  • Compatibility with custom-trained LLMs based on Flax, PyTorch and TensorFlow.
  • Scan targeted traffic, skip the Burp-wide scanner, and reduce API usage costs.
  • Manage, store, and share prompts using the extensive prompt library.
  • Enhanced user interface and improved user experience.
  • ...

Frequently Asked Questions


To use the BurpGPT Community and Pro editions, you must have Burp Suite Professional installed on your system.
A single BurpGPT Pro license allows activation on a maximum of three devices. If you require activation on more than three devices, you must obtain a new license.
There could be various reasons why you're facing issues with the licensing system:
  • It's possible that the licensing system endpoints aren't being reached because you're connected through a VPN or a similar network. If that's the case, please try again after disconnecting the VPN.
  • Alternatively, the problem could be with your Java version, and you might need to use Java 14 or a later version.
  • Lastly, it's also possible that the .zip file you downloaded is corrupt, causing issues with the licensing system. In that case, please download the file again and try once more.
Most software that automates processes is prone to generating false positives. This is even more true of LLMs, since their output quality is primarily determined by the prompt and the supporting data given to the models.
When you use the local LLM feature with the pre-built model type options from the Hugging Face Transformers library, it downloads the pre-trained model and associated files from the Hugging Face Model Hub. This process does not send any of your data to Hugging Face or any other external service: generation is performed locally on your machine, so your text inputs and generated text stay within your system. In short, the local LLM feature only downloads and then caches the model files from the Hugging Face Model Hub; no data is sent to external services during the actual text generation process.
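The download-once, generate-locally pattern described above can be sketched directly with the Hugging Face Transformers library. This is a minimal illustration of the workflow, not BurpGPT's internal code; the `sshleifer/tiny-gpt2` model is chosen purely to keep the download small.

```python
# Sketch of the local-LLM pattern: the from_pretrained() calls download and
# cache the model files from the Hugging Face Model Hub on first use; the
# generate() call afterwards runs entirely on the local machine.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sshleifer/tiny-gpt2"  # tiny model, used only for illustration
tokenizer = AutoTokenizer.from_pretrained(model_id)        # cached locally after first run
model = AutoModelForCausalLM.from_pretrained(model_id)     # cached locally after first run

prompt = "Analyse this HTTP response for vulnerabilities:"
inputs = tokenizer(prompt, return_tensors="pt")

# No data leaves the machine here; inference is fully local.
output = model.generate(**inputs, max_new_tokens=16,
                        pad_token_id=tokenizer.eos_token_id)
text = tokenizer.decode(output[0], skip_special_tokens=True)
print(text)
```

Subsequent runs hit the local cache (under `~/.cache/huggingface` by default), so generation works even without network access once the files are present.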
When you encounter an error message stating that port 3000 is already in use and the server fails to start, either another service is currently using that port or a previous instance of BurpGPT was not shut down gracefully. To resolve this, run the following commands to kill the process holding the port before restarting BurpGPT:
  • On Windows: Get-Process -Id (Get-NetTCPConnection -LocalPort 3000).OwningProcess | Stop-Process -Force
  • On Linux/Mac: sudo lsof -t -i :3000 to find the process ID, then kill -9 <PID> to force-kill the process.
To use a custom-trained model in BurpGPT Pro edition, your model should be based on a pre-trained model from the Hugging Face library. Once you have created your custom model, you can simply enter the path to the folder containing all the required files (such as config.json, vocab.json, tokenizer.json, etc.) in the "LLM directory" field of the Local LLM tab. After this, all of the requests, responses, and prompts will be processed using your custom-trained model. It's worth noting that BurpGPT Pro doesn't currently support loading non-Hugging Face models directly.
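Concretely, a model directory of the kind the "LLM directory" field expects can be produced with the Transformers `save_pretrained` method. This is a hedged sketch: the base model, the output path, and the elided fine-tuning step are all illustrative placeholders.

```python
# Sketch: producing a local directory containing the files mentioned above
# (config.json, tokenizer files, model weights) from a Hugging Face base model.
from transformers import AutoModelForCausalLM, AutoTokenizer

base = "sshleifer/tiny-gpt2"  # placeholder: any Hugging Face pre-trained base model
model = AutoModelForCausalLM.from_pretrained(base)
tokenizer = AutoTokenizer.from_pretrained(base)

# ... fine-tune `model` on your internal appsec data here ...

out_dir = "./my-appsec-llm"  # hypothetical path to enter in the "LLM directory" field
model.save_pretrained(out_dir)      # writes config.json and the model weights
tokenizer.save_pretrained(out_dir)  # writes the tokenizer/vocabulary files
```

The resulting folder is self-contained, so it can be pointed at directly without any further conversion, provided the model remains Hugging Face compatible.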