Securing sensitive pages from SearchBots

HackerOne #3986
Target: HackerOne
Target Module:
Type: Best Practice
Payload:
Original: Link
CVE:
Archive Screenshot

HackerOne did not disallow search engine spiders from cataloging pages where session-based URL parameters were being used.
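
As a hypothetical illustration (the report does not name the affected pages or parameter names), the problem class looks like this: a link such as

    https://example.com/invite?session_token=3f9a2c71

becomes crawlable the moment it is linked from any indexed page, and the search engine then stores and serves the URL, token included.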

How To Perform

  1. Try different Google dorks to find indexed URLs containing sensitive information, tokens, CSRF parameters, etc. (a query-builder sketch follows this list):
    1. site:example.com inurl:&
    2. site:example.com inurl:token
  2. Ensure that URL paths carrying tokens are disallowed from crawling in robots.txt (see the example excerpt after the sketch below).
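
The dork queries from step 1 can be generated mechanically. The following is a minimal Python sketch; the target domain and the extra inurl:csrf dork are assumptions for illustration (suggested by the "CSRF parameters" wording above), not part of the original report.

    import urllib.parse

    # Hypothetical target; substitute the host that is in scope for your test.
    TARGET = "example.com"

    # The dorks from step 1, plus an inurl:csrf variant (an assumed addition).
    DORKS = [
        f"site:{TARGET} inurl:&",
        f"site:{TARGET} inurl:token",
        f"site:{TARGET} inurl:csrf",
    ]

    for dork in DORKS:
        # Emit a ready-to-open Google search URL for each dork.
        print("https://www.google.com/search?q=" + urllib.parse.quote(dork))

Each printed URL can be opened in a browser to review what the search engine has already indexed for that pattern.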
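
For step 2, a minimal robots.txt excerpt might look like the following; the path and parameter names are hypothetical and must be adapted to the directories that actually carry tokens. The * wildcard in Disallow rules is honored by major crawlers such as Google and Bing, but is not part of the original robots.txt standard.

    # Hypothetical excerpt: keep token-bearing URLs away from compliant crawlers.
    User-agent: *
    Disallow: /reset-password/
    Disallow: /*?*token=

Bear in mind that robots.txt only asks compliant crawlers not to fetch those URLs: it does not remove pages that are already indexed (that requires a noindex signal the crawler can still see, or the search engine's removal tools), and the listed paths are readable hints to anyone who opens the file. The more robust fix is to keep session tokens out of URLs altogether.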