Friday, May 11, 2012

Testing CSRF aware webapps with Burp

Nowadays, it is not uncommon to face websites protected with some variant of the synchronizer token pattern to mitigate the risk of Cross-Site Request Forgery (CSRF) attacks.

The basic functioning of these webapps is to generate a random token that is submitted with each HTTP request; the server then checks that the token is present and valid before serving the request.
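The pattern can be sketched as follows; the `TokenStore` class and its method names are illustrative, not taken from any particular framework:

```python
import secrets

class TokenStore:
    """Minimal per-session synchronizer token store (illustrative only)."""
    def __init__(self):
        self.valid = set()

    def issue(self):
        # Generate a fresh random nonce and remember it as valid.
        token = secrets.token_hex(16)
        self.valid.add(token)
        return token

    def check(self, token):
        # A token is accepted once, then discarded (single use).
        if token in self.valid:
            self.valid.discard(token)
            return True
        return False

store = TokenStore()
t = store.issue()
print(store.check(t))    # True: first use is accepted
print(store.check(t))    # False: replaying the same token is rejected
```

The single-use behaviour in `check` is what breaks naive request replay, as described below.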

Shown below is a URL including its associated token:

Although these protections effectively mitigate the risk of CSRF, they make it almost impossible to analyse an application in an automatic or semi-automatic way, as this involves requesting a URL with a given token several times, and those repeated requests are discarded by the web server. Furthermore, a common reaction is to close the established HTTP session.

The only way to request a URL several times is to regenerate the associated token. One possible approach is to use the Referer header to walk backwards, identifying all the requests involved in the creation of the token, and then request them again in order to regenerate it.

Shown below is an example where a user has to click through two menus in order to reach the "inject.jsp" page:

  1. http://localhost:8080/injection_CSRF/index.jsp
  2. http://localhost:8080/injection_CSRF/index2.jsp?org.apache.catalina.filters.CSRF_NONCE=2AC1162C4CA176122F2BAA591B1F360E
  3. http://localhost:8080/injection_CSRF/inject.jsp?id=1&org.apache.catalina.filters.CSRF_NONCE=1EB3952B5BB391C7936A862DF8940DF0
In order to scan the third request automatically, the two previous requests must be performed sequentially, as each of them contains a newly generated token needed for the next request, forming a token chain.
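The chain replay can be sketched in Python. Only the nonce parameter name comes from the Tomcat filter above; the regex, the `replay_chain` helper and the injected `fetch` callable are illustrative assumptions:

```python
import re

NONCE_PARAM = "org.apache.catalina.filters.CSRF_NONCE"  # Tomcat's parameter name

def extract_nonce(body):
    # Pull the freshest token out of a link found in the returned page.
    m = re.search(re.escape(NONCE_PARAM) + r"=([0-9A-Fa-f]+)", body)
    return m.group(1) if m else None

def replay_chain(urls, fetch):
    """Re-request every URL in the chain, carrying the newest token forward.

    `fetch` is any callable mapping a URL to its response body (e.g. built
    on urllib or on Burp's extender API); injecting it keeps this testable.
    """
    nonce = None
    for url in urls:
        if nonce is not None:
            sep = "&" if "?" in url else "?"
            url = url + sep + NONCE_PARAM + "=" + nonce
        nonce = extract_nonce(fetch(url))
    return nonce  # token valid for the next (target) request
```

Calling `replay_chain` with the first two URLs of the example yields a fresh token for "inject.jsp", which can then be tested repeatedly by regenerating the chain before each attempt.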

A POC in the form of a Burp Suite plugin has been developed to verify this approach; it can be downloaded here. The plugin has been successfully tested against a simple vulnerable application using the Tomcat CSRF protection filter (all the requests in the token generation chain must be replayed before testing a given URL).

It should be noted, however, that this code is a POC and requires further development in order to work against real environments (any link to a webapp with this behaviour is appreciated).

Monday, August 8, 2011

Wfuzz 2.0 released!

Hi All!

After Christian's presentation at the BlackHat 2011 Tools Arsenal, I'm pleased to announce a new version of Wfuzz! It is now more flexible, dynamic and extensible than ever!

Wfuzz is a tool designed for bruteforcing web applications. It can be used to find unlinked resources (directories, servlets, scripts, etc.), bruteforce GET and POST parameters to check for different kinds of injections, bruteforce form parameters (user/password), fuzzing, etc.

Highlights in this version:

- Infinite payloads. You can now define as many FUZnZ words as you need.
- Multiple encoders per payload. You can now define as many encoders as you need for each payload independently.
- Payload combination. You can now combine your payloads in different ways by specifying different iterators.
- Increased flexibility. You can now define in an easy way new payloads, iterators, encoders and output handlers and they will be part of wfuzz straight away.
- Baseline support. You can now define a default value for each payload and compare the results against it.

Other new features include:

- New payloads
- New encoders
- Magictree output
- Support for multiple proxies
- Time delay between requests
- Follow HTTP redirects
- Fuzz within HTTP methods
- HTTP HEAD scan
- SOCKS4/SOCKS5 support

More detailed examples can be found in the README and on the Google Code project page!

Stay tuned! We have a lot of improvements and ideas coming up!

Sunday, July 31, 2011

Blackhat Arsenal USA

Hi all, we are proud to announce that we are going to present at Blackhat Arsenal USA 2011.

We are presenting Wfuzz and Webslayer 2.0 on Wednesday, and theHarvester + Metagoofil 2.0 on Thursday, both days at 11:15.

If you want to say hello pass by our pod!

See you there


Wednesday, June 22, 2011

Scanning ports through SSH Port Forwarding

In one of our latest penetration tests we faced an SSH server based on Maverick SSHTools.

The funny thing is that this server was implemented by copy & pasting the example from the web, which had the Port forwarding feature enabled.

After running a bruteforce attack, we found that the admin account had the password "admin" (strong password policy, btw), but when we tried to log in there was no shell; the server echoed everything we typed. So we went for the port forwarding option: we forwarded some ports to interesting services, like Terminal Server on the same machine, and it worked. Then we thought it would be great to be able to scan the internal network through this port forwarding feature, and that's how we came up with the SSHscan tool. It allows you to scan an internal network through an SSH server with port forwarding enabled, creating a port forward on localhost for every open port detected in the internal network range.
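The core of the idea can be sketched with paramiko's direct-tcpip channels. This is a rough illustration under assumed function names, not SSHscan's actual code:

```python
def parse_ports(spec):
    """Expand a port spec like "80,443,8000-8005" into a sorted list."""
    ports = set()
    for part in spec.split(","):
        if "-" in part:
            lo, hi = part.split("-")
            ports.update(range(int(lo), int(hi) + 1))
        else:
            ports.add(int(part))
    return sorted(ports)

def scan_through_ssh(ssh_host, username, password, target, ports):
    """Probe `target`'s ports by opening direct-tcpip channels over SSH.

    If a channel opens, the SSH server could reach the port, so it is
    open; an exception means closed or filtered. Requires paramiko.
    """
    import paramiko  # deferred so the helper above works without it
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(ssh_host, username=username, password=password)
    transport = client.get_transport()
    open_ports = []
    for port in ports:
        try:
            chan = transport.open_channel(
                "direct-tcpip", (target, port), ("127.0.0.1", 0))
            chan.close()
            open_ports.append(port)
        except Exception:
            pass  # channel rejected: closed or filtered
    client.close()
    return open_ports
```

The nice part of this trick is that the scan originates from the SSH server itself, so internal hosts unreachable from the outside become visible.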

This tool is not one that can be used in every engagement, but when you have the opportunity and the need, it will come in handy.

The tool has been included in the edgeSSH kit, where we will collect all our SSH-related scripts; at the moment only bruteSSH, an SSH login bruteforcer, and scanSSH are included in the kit.

You can download the code here.

Command line options:

       -h: target host
       -u: username
       -p: password
       -l: targets lists to scan
       -t: threads
       --remote-host: host to scan
       --remote-ports: port list to scan
       --default-ports: scan default ports
       --all-ports: scan all 65535 ports
       --keep-tunnels: Forward all open ports

Examples:

       -h -u root -p password -l list.txt
       -h -u root -p password --remote-host --remote-ports 80,443
       -h -u root -p password --remote-host --default-ports

Enjoy Edge-Security

Friday, July 9, 2010

OWASP VI Spain Meeting -2010: And still bruteforcing

Hi all, on the past 19th of June I presented at the OWASP VI Spain Meeting a review of bruteforce attacks on web applications. This is an old technique that is still useful to attackers, and I showed with examples that it is present in many attacks affecting big companies like Facebook, Yahoo, AT&T, Tuenti, etc. I also presented the latest version of Webslayer, a tool to perform all kinds of bruteforce attacks on web applications.

You can find the presentation here:

And also the video of the talk (Spanish) here.

Here is a picture of the conference showing the Webslayer results interface:

The next version will add some requested features, like multiple proxy support, delays between requests, and many more.

Stay tuned and enjoy...


Friday, May 14, 2010

Massive Web Application discovery with Wfuzz

Last week I had to review around 40 websites for a penetration test in a short period of time, so the first thing I wanted was to search for directories and files on the web servers. So how can I automate the full scan with Wfuzz? We can use a command like this:

$ wfuzz -c -z file -f urllist.txt,dictionary.txt --html --hc 404 http://FUZZ/FUZ2Z 2> results.html

The first FUZZ will be replaced with the content of urllist.txt, where you should have the website addresses in the format "", and the second FUZ2Z will be replaced with the dictionary; in my case I used big.txt.
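The pairing can be illustrated in Python; the sample hosts and words below are made up, and wfuzz does the equivalent expansion internally:

```python
from itertools import product

# Hypothetical stand-ins for urllist.txt and dictionary.txt
hosts = ["site1.example", "site2.example"]
words = ["admin", "backup", "test"]

# Every host is paired with every dictionary entry, just as FUZZ and
# FUZ2Z combine, so 40 sites times a big wordlist multiplies quickly.
targets = ["http://%s/%s" % (h, w) for h, w in product(hosts, words)]
print(len(targets))  # 2 hosts x 3 words = 6 requests
```

This also shows why filtering with --hc 404 matters: without it, the results file would be flooded by the misses from that cross product.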

Soon I will release an update of Webslayer and will show how to do this with it.



Monday, January 11, 2010

Security Ezines 2010

Hi all, in this brief post I would like to share some new ezines about security that were released this year. The first one is called Into the Boxes, and it is focused on forensics and incident response; it's a joint effort from Harlan Carvey and Don (securityripcord), and this ezine looks promising. You can download the first issue here.

The other ezine is the one launched by Hack In The Box (HitB). This magazine has a very professional look and a lot of articles; it can be compared with Hakin9 magazine, but free. They relaunched the ezine this year. You can download the first issue here.

It's cool to see fresh initiatives for sharing knowledge :)

I would like to see a Kindle version of them ;)