Web crawlers really need an option to treat every "?something" URL as identical to its base URL and skip it...
Anyone know if you can use #Burpsuite or similar to force an HTTP 301/302 redirect of them back to the base URL?
Me, pretty much every week using Burp Suite for years: It would be great to have a Burp internal task manager to figure out what is burning a full CPU while no requests are going through it.
Meanwhile Burp devs: AI! AI! AI! AI!
Cybersecurity cert prep: Lab 13 (Path Traversal) — sending Python requests through a Burp Suite proxy
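
For anyone setting up the same thing: routing `requests` traffic through Burp is just a proxies dict pointing at Burp's proxy listener (127.0.0.1:8080 by default; adjust to your config). A minimal sketch:

```python
import requests

# Burp's default proxy listener; change if you moved it in Proxy settings.
BURP_PROXY = "http://127.0.0.1:8080"

session = requests.Session()
session.proxies = {"http": BURP_PROXY, "https": BURP_PROXY}
# Either install Burp's CA certificate, or disable verification for lab traffic only.
session.verify = False  # lab use only; never in production

def fetch(url: str) -> requests.Response:
    """Send a request that Burp can intercept, log, and replay."""
    return session.get(url, timeout=10)
```

Every call made through `session` then shows up in Burp's HTTP history for tampering.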

Cybersecurity cert prep: Lab 14 (Path Traversal) — null byte attacks, console→Python (sys), and traversal protection

Cybersecurity cert prep: Lab 23 (JWT) — elevate to admin using JWT toolkit, rockyou.txt, and Burp Suite

Cybersecurity cert prep: Lab 25 (Blind SSRF) — Burp Suite Collaborator

Cybersecurity cert prep: Lab 4 (SSRF) — delete a user. Python, Burp Suite

Cybersecurity cert prep: Lab 6 (SSRF) — bypassing blacklists via encoding. Python & Burp Suite
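
The encoding bypasses in that lab rely on equivalent spellings of the blocked host. A sketch generating a few classic variants of 127.0.0.1/localhost — whether each one works depends entirely on the target's URL parser:

```python
import ipaddress

def loopback_spellings() -> dict[str, str]:
    """Equivalent ways to write 127.0.0.1/localhost that a naive string blacklist may miss."""
    n = int(ipaddress.IPv4Address("127.0.0.1"))
    # Percent-encode every character of "localhost".
    single = "".join(f"%{ord(c):02x}" for c in "localhost")
    return {
        "decimal": str(n),                             # http://2130706433/
        "hex": hex(n),                                 # http://0x7f000001/
        "dotted_short": "127.1",                       # abbreviated form some resolvers accept
        "url_encoded": single,                         # %6c%6f%63...
        "double_encoded": single.replace("%", "%25"),  # for filters that decode once before checking
    }
```

Feeding each candidate into the stock-check URL parameter (via Burp Repeater or a requests loop) quickly shows which decodings the filter performs.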

Cybersecurity cert prep: Lab 9 (Path Traversal) — build a vulnerable Flask server and test it with Burp Suite
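
A deliberately vulnerable Flask target for this kind of lab can be very small — my own sketch, not the lab's code: the `filename` parameter is joined into the path unchecked, so `../` sequences escape the base directory. Run it locally and browse to it through Burp's proxy.

```python
from flask import Flask, request

def create_app(base_dir: str) -> Flask:
    """Intentionally vulnerable image server for lab use only."""
    app = Flask(__name__)

    @app.route("/image")
    def image():
        filename = request.args.get("filename", "")
        try:
            # Vulnerable on purpose: no normalization, no containment check.
            with open(base_dir + "/" + filename, "rb") as f:
                return f.read()
        except OSError:
            return "not found", 404

    return app

# To expose it to Burp: create_app("/var/www/images").run(port=5000)
```

Then `/image?filename=../../../etc/passwd` in Repeater demonstrates the traversal, and the `safe_resolve`-style containment check is the fix to test against.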
