|\ __________                          __   __                         __
         | |   __     |          _____ __    __\__/_|  |_ __ ___   _____   ___ |  |\_____     
         | |  /  \    |         /  _  \  \  /  /  |_    _|  /   \ /  _  \ /  _ \  |/  _  \    
         | |  \  /___ |        /  /_\  \  \/  /|  | |  |\|   /\  \  / \  \  / \   |  /_\  \   
         | |__/    _ \|        |  _____||    |\|  | |  | |  |\|  |  |\|  |  |\|   |  _____|\  
         | |___/\  \\_\        \  \____/  /\  \|  | |  | |  | |  |  \_/  /  \_/   |  \___ \|  
         | |    /   \_|         \_____/__/ /\__\__| |__| |__| |__|\_____/ \____/__|\_____/\   
         | |   / / \___|         \____\__\/  \__\__\|\__\|\__\|\__\\____\/ \___\\__\\____\/   
         | |__/_/_____|     
         |/                

Last changed: 23.03.2019

Cracking Passwords


If you come across password hashes during a penetration test, you can check for weak passwords by running a password cracking tool. Good wordlists combined with good rules can reduce the time needed for cracking significantly.
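
A typical run that combines a wordlist with a rule file could look like this (a generic sketch, hash type and file names are placeholders; best64.rule ships with hashcat):

hashcat -a 0 -m 1000 ntlm.hashes wordlist.txt -r rules/best64.rule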

In my crackstuff repository you can find some scripts and rule files which can help create good password candidates.

The following table compares the cracking speed of various hashing algorithms relative to SHA1:

algorithm     relative speed
NTLM          540%
MD5           280%
SHA1          100%
SHA256         40%
NetNTLMv2      20%
WPA2            0.005%
DCC2            0.004%
SHA512CRYPT     0.002%
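
The ratios of course depend on the hardware. To measure the speed on your own machine, hashcat's benchmark mode can be used; -b alone benchmarks a default selection of modes, -m limits it to a single algorithm such as NTLM (1000):

hashcat -b -m 1000
hashcat -b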

The 'Crack Me If You Can' challenges can be used to check your capabilities.

hash cracking


brute force

john --incremental=Alnum <hash_file> --format=raw-md5
hashcat -a 3 -m 2100 dcc2.hash -i ?l?l?l?l?l?l?l?l
hashcat -a 3 -m 5600 netntlmv2.hash <hcmask_file>

wordlist

john --wordlist=/usr/share/wordlists/rockyou.txt <hash_file> --format=nt
hashcat -a 0 -m 2500 wpa.hccap big.txt -r rules/best64.rule

hybrid

hashcat -a 6 -m 1800 linux_sha512.hash wordlist.txt mask.hcmask
hashcat -a 7 -m 1000 ntlm.hash mask.hcmask wordlist.txt

show results

You can use standard Linux tools to print a nice table of the results.

join -i -t: -14 -21 -o1.1,2.2 <(sort -t: -k4 ntlm.pwdump) <(sort ntlm.pot) | sort | column -t -s:

brute force


To reduce the number of guesses during brute force cracking, hashcat supports charset masks. These can be generated with the tool mask_extractor.py from my GitHub repository; to evaluate the quality of a mask, the repository also contains the tool mask_eval.py.

./mask_extractor.py <found.pwds> | sort | uniq -c > extracted.masks
./mask_eval.py extracted.masks | sort -rn | head -n 50 > best50.hcmask
hashcat -a3 -m0 hashes.txt best50.hcmask

If password policies are known, the masks should be checked against them. For example, the following one-liner filters for masks containing at least three different charset groups.

for i in $(cat file.hcmask); do if [ "$(echo "$i" | grep -o '[lusd]' | sort -u | wc -l)" -gt 2 ]; then echo "$i"; fi; done

The tool princeprocessor generates password candidates by chaining together words from a given wordlist. As combinations of words from different languages are usually unwanted, the wordlist should only contain a single language.

princeprocessor < english.txt | hashcat -m 1000 ntlm.hash
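
If the hashcat pp64 build of princeprocessor is used, the candidate length can be limited to the interesting range (a sketch assuming a minimum password length of 8; check --help for the exact option names of your build):

princeprocessor --pw-min=8 --pw-max=16 < english.txt | hashcat -m 1000 ntlm.hash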

wordlist creation


automated tools

cewl -w ./filename.cewl "http://en.wikipedia.org/wiki/Password_cracking" -d 0
crunch 12 12 -t SP-7%3FA1%%% -o arcadyan.lst
grep -E '\w+' -ho text_files/* | sort -u > unique.words

spellchecking lists

The Linux spellchecker aspell ships good wordlists for many languages.

aspell dump master -d en_US | cut -d '/' -f 1 > english_US.txt
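
The installed dictionaries can be listed with aspell as well, and other languages are dumped the same way (German shown as an example, assuming the de_DE dictionary is installed):

aspell dicts
aspell dump master -d de_DE | cut -d '/' -f 1 > german.txt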

postprocessing

grep -P '^.{1,5}$' wordlist.txt > shorts.txt
hashcat -a1 --stdout shorts.txt -j '$ ' shorts.txt > comb_shorts.txt
hashcat --stdout comb_shorts.txt -r mega.rule > mega_comb_shorts.txt

charset statistics

awk -vFS="" '{for(i=1;i<=NF;i++)w[$i]++}END{for(i in w) print i,w[i]}' wordlist | sort -rn -k2
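
The most frequent characters can then be used as a hashcat custom charset for mask attacks (a sketch; the charset string is only an example, adapt it to your statistics):

hashcat -a 3 -m 1000 ntlm.hash -1 'ae1ionrs09' '?1?1?1?1?1?1?1?1'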

cleanup wordlist

Depending on the scenario it can make sense to remove lines containing non-printable or non-ASCII characters, or lines exceeding a certain length.

grep -ax '.*' rockyou.txt > rockyou.printable
sed '/.\{50\}/d' rockyou.printable > rockyou.p.trimmed
sort -u rockyou.p.trimmed > rockyou.p.t.sorted

If you want to keep the original order or sort by line length, use awk.

awk '!x[$0]++' file > unsorted_unique
awk '{ print length, $0 }' file | sort -n -s | cut -d" " -f2- > line_length

Sometimes a ruleset is optimized for a certain wordlist format, for example first character uppercase and the rest lowercase.

sed -i 's/^\(.*\)$/\L\1/' wordlist.txt
sed -i 's/^\(.\)/\U\1/' wordlist.txt
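
If the wordlist should stay untouched, the same formatting can be applied on the fly with the hashcat rule function c, which capitalizes the first character and lowercases the rest (a minimal sketch):

echo 'c' > capitalize.rule
hashcat --stdout wordlist.txt -r capitalize.rule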

wordlist rules


To create and evaluate wordlist rules I use a list of leaked passwords like rockyou.txt and a smaller wordlist. Hashcat offers a debug mode which logs every time a rule hits a password.

benchmark rules

hashcat -m 99999 <passwords> <wordlist> -r <rules> --potfile-path tmp.pot --debug-mode 1 --debug-file tmp.matched -o /dev/null
sort tmp.matched | uniq -c | sort -rn > result.txt
cut -b 9- result.txt | head -n 100 > my_best100.rule

Hashcat offers different debugging modes which log the following information into the debug file.

debug-mode   logged to debug-file
1            <matching_rule>
2            <matching_word>
3            <matching_word>:<matching_rule>
4            <matching_word>:<matching_rule>:<found_password>

With gnuplot the results can be plotted into a graph.

gnuplot -p -e 'plot "result.txt"'

grep regex to count rules

grep -P "^(((l|u|c|C|t|T.|r|d|p.|f|\{|\}|\\\$.|\^.|\[|\]|D.|x..|O..|i..|o..|\'.|s..|@.|z.|Z.|q|X...|4|6|M|k|K|\*..|L.|R.|\+.|\-.|\..|\,.|y.|Y.|E|e.) ?){1}\$)" all.rules

If you want to filter your rules for a certain password complexity, the crackstuff repository contains the script complexity_filter.sh.

complexity_filter.sh all.rules > complex.rules

non ascii charsets


If your target lies outside the English-speaking countries you may need to try non-ASCII characters in passwords. This will only be successful if you convert your wordlist into the same encoding scheme that was used for the targeted hash. This conversion can also be done on the fly by hashcat.

iconv -f utf-8 -t cp1252 -o german.cp1252 german.utf8
hashcat -m 1000 hash.txt wordlist.utf8 --encoding-from=utf8 --encoding-to=iso8859-1

The following command replaces umlauts in a German wordlist and drops any lines that still contain non-ASCII characters.

sed 's/Ä/Ae/g; s/Ö/Oe/g; s/Ü/Ue/g; s/ä/ae/g; s/ö/oe/g; s/ü/ue/g; s/ß/ss/g' german.txt | grep -vP '[\x80-\xff]'

Since NTLM is just MD4 over the UTF-16LE encoded password, you can bypass hashcat's internal NTLM charset handling by cracking the hash as plain MD4 (-m 900) and letting hashcat encode the wordlist to UTF-16LE.

hashcat -m 900 hash.txt wordlist.utf8 --encoding-to utf16le

To create an NTLM hash the following Perl one-liner can be used.

perl -e 'use utf8; use Encode; use Digest::MD4 "md4_hex"; print md4_hex(encode ("UTF-16LE", "password"))."\n";'

rainbowtables


For unsalted hashes rainbow tables can be precomputed. On Windows the GUI tool ophcrack can be used, on Linux there is RainbowCrack. The rtgen arguments are hash algorithm, charset, minimum and maximum plaintext length, table index, chain length, chain count and part index.

rtgen ntlm alpha 1 4 0 1000 1000 0
rtsort .
rcrack . -h 0cb6948805f797bf2a82807973b89537
rcrack . -ntlm <pwdump file>

online cracking


Sometimes you cannot extract the hash to crack it offline on your own machine. Instead an online attack has to be executed by automatically trying to log in with credentials from a file.

hydra

hydra -l user -P wordlist ssh://<target>
hydra -L userlist -P wordlist <target> https-post-form "login.php:user=^USER^&pass=^PASS^:Error message"
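
To keep the noise down and avoid account lockouts the number of parallel tasks can be reduced; -f stops after the first valid login (a sketch using standard hydra flags):

hydra -l user -P wordlist -t 4 -f ssh://<target>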