Scraper/Parser
Learn about the scraper: its specifications, how to set it up, and other important details.
Valid or Filtered?
The valid links are all links scraped from the search engine(s). They are not sorted and contain many duplicates, links without parameters, and other useless entries.
We always recommend downloading the filtered links if you want to work efficiently, save time, and get high injectable rates.
The filtered links are links that have been cleaned of duplicates, bad links, and empty parameters.
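As a rough illustration only (not the tool's actual implementation), filtering can be thought of as deduplicating the raw list and dropping any link that carries no query parameters:

```python
from urllib.parse import urlparse

def filter_links(links):
    """Keep only unique links that carry query parameters (e.g. ?id=1)."""
    seen = set()
    filtered = []
    for link in links:
        if not urlparse(link).query:
            continue  # raw link with no parameters: not useful for injection testing
        if link in seen:
            continue  # exact duplicate of a link we already kept
        seen.add(link)
        filtered.append(link)
    return filtered
```

For example, `filter_links(["http://a.com/page?id=1", "http://a.com/page?id=1", "http://b.com/"])` keeps only the single parameterized link.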
Page count
This feature works with all search engines except Google, since we already grab as many pages as possible there.
URL check
This setting ensures you don't get links you already scraped before; it is essentially a local AntiPublic system.
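A minimal sketch of such a local history check, assuming a hypothetical plain-text file (`scraped_urls.txt`) that records every link from previous runs:

```python
from pathlib import Path

HISTORY = Path("scraped_urls.txt")  # hypothetical local history file, one URL per line

def new_links_only(links):
    """Return only links never seen in previous runs, then record them."""
    seen = set(HISTORY.read_text().splitlines()) if HISTORY.exists() else set()
    fresh = [link for link in links if link not in seen]
    with HISTORY.open("a") as f:
        f.writelines(link + "\n" for link in fresh)
    return fresh
```

On a second run, any link already in the history file is silently dropped.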
Domain check
The domain check is here to clean up domain duplicates: if you scrape, for example, two links on the same domain, such as example.com/page?id=1 and example.com/shop?cat=2, it will only keep one of them.
Remove duplicates
This setting backs up the domain check when that check is disabled: it removes exact duplicate links and filters them out.
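Exact deduplication, as opposed to the per-domain check, only drops identical lines. A one-line sketch that preserves the original order:

```python
def remove_duplicates(links):
    """Drop exact duplicate lines while preserving the original order."""
    return list(dict.fromkeys(links))  # dict keys are unique and insertion-ordered
```

Unlike the domain check, two different links on the same domain both survive here.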
Keep unfiltered
This feature is only useful if you know what to do with raw links that have no parameters. Do not enable it if you only need links with parameters.