Securing sensitive pages from SearchBots

From Hackipedia
Hackerone #3986
Target: Hackerone
Target Module:
Type: Best Practice
Original: Link
Archive Screenshot

Hackerone did not disallow search-engine spiders from indexing pages where session-based URL parameters were being used.

How To Perform

  1. Try different Google dorks to find URLs containing sensitive information, tokens, CSRF parameters, etc.:
    1. inurl:&
    2. inurl:token
  2. Ensure that URL paths carrying tokens are disallowed for crawlers in robots.txt
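The robots.txt check in step 2 can be verified programmatically. The sketch below uses Python's standard-library robots.txt parser against a hypothetical rule set; the `/password_reset/` path is an example, not Hackerone's actual directory layout.

```python
# Sketch: confirm that a token-bearing path is disallowed by robots.txt,
# using the stdlib parser. The rules and paths here are hypothetical.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /password_reset/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A URL under the token-carrying directory should not be crawlable,
# while ordinary pages remain fetchable.
print(parser.can_fetch("*", "/password_reset/abc123"))  # False
print(parser.can_fetch("*", "/about"))                  # True
```

Note that robots.txt is advisory only: it keeps well-behaved crawlers from cataloging the URLs, but it does not protect the pages themselves, so session tokens should still never appear in URLs in the first place.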