WebSlayer help:

WebSlayer introduction

WebSlayer is a tool designed to help web application security testers perform brute force attacks against any part of an application. It was originally designed as a frontend for Wfuzz; over time more options and functions were added, and it now includes a Payload Generator, an Encoding interface and a Raw request generator. WebSlayer is a very flexible web application bruteforcer, with a great interface for working with the results. It can be used to perform login bruteforcing, resource location prediction, parameter bruteforcing, header bruteforcing and authentication bruteforcing (NTLM, Basic). The idea behind the tool is to place a keyword (FUZZ) wherever you want to perform the bruteforce; this keyword is replaced by each payload selected by the user.
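
For example, to discover resources you could place the keyword in the path of the request:

    http://target.example/FUZZ

With a dictionary payload containing admin, backup and old, WebSlayer would request /admin, /backup and /old and record each response (target.example is a placeholder host, not part of the tool).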

Interface description

The interface is divided into seven tabs. The first one is for setting up the request to be bruteforced and the options for the attack.

In this tab you can also select the type of payload to use: a dictionary, a numeric range, or a payload imported from the Payload Generator (see Payload selection below).

You can also inject the payload into all the parameters of the URL, the POST data or the headers, without needing to use the FUZZ keyword.
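
For example, given the request http://target.example/page.php?id=1&view=full (a made-up URL), injecting the payload into all the URL parameters would fuzz the values of id and view in turn, with no FUZZ keyword written in the request itself.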

Authentication:

Here you can set the username and password if the application requires them, or if you need to bruteforce the authentication. It supports Basic, NTLM and Digest (Digest is not fully tested). The format is username:password.
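
For example, to bruteforce the password of a known account, the keyword can be used in the password position, e.g. admin:FUZZ, so that each payload entry is tried as the password for admin.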

Payload selection:

Here you can select or set the payload. For a dictionary you should choose a file; for a Range you should set the range, such as 1-10; and if you selected Payload, just press Import from Payload Generator (after you have created a valid payload). For all payload types you can select an encoding to apply.
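
For example, a Range of 1-10 generates the numbers 1 through 10 as payloads, which is handy for fuzzing numeric parameters; any encoding you select is applied to each payload before the request is sent.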

Options:

With these options you can set an HTTP proxy, choose the number of threads, enable anonymous browsing and configure some filters.

Anonymous browsing: this is a temporary workaround that prevents the web browser from loading the request directly when you are using a proxy.

Ignore Codes (Filters): these filters control the output shown in the results table. Since the tool presents every response in a table, leaving out the uninteresting ones keeps the interface snappy and the results less cluttered.
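
For example, entering 404 in Ignore Codes keeps every 404 response out of the results table, which in a resource discovery run is usually the bulk of the noise.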

Resource location prediction:

These options apply when doing resource location prediction (directory or file discovery).

Recursion level: sets how many levels deep WebSlayer will go. For example, 3 levels is /admin/console/test/.

Non-standard code detection: this option is very useful for identifying custom responses. When it is activated and a custom "not found" code is detected, that code is added to the ignore code list. If the code is 200, WebSlayer checks with another file, compares the character lengths, and then uses that character length in the ignore filter; in this way all uninteresting responses are discarded.

Extensions: these extensions are added to every word in the dictionary. Example: php,txt,bak.
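
As a rough illustration of the detection idea, here is a minimal Python sketch (not WebSlayer's actual code; the requests library, the base URL and the random resource names are assumptions):

    import uuid
    import requests

    def detect_not_found(base_url):
        # Request two resources that almost certainly do not exist.
        r1 = requests.get(base_url + "/" + uuid.uuid4().hex)
        r2 = requests.get(base_url + "/" + uuid.uuid4().hex)
        if r1.status_code != 200:
            # The server uses a custom "not found" status code:
            # add it to the ignore code list.
            return ("code", r1.status_code)
        if len(r1.text) == len(r2.text):
            # The server answers 200 even for missing resources:
            # filter by the response's character length instead.
            return ("length", len(r1.text))
        return None

WebSlayer performs this check automatically when the option is enabled; the sketch only shows why comparing character lengths lets a server that answers 200 for everything still be filtered.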

Basic usage

Working with results

coming soon...