Last changed: 18.08.2021
Cracking Passwords
If you come across password hashes during a penetration test, you can check for weak passwords by running a password cracking tool. Good wordlists combined with good rules can significantly reduce the time needed for cracking.
In my crackstuff repository you can find some scripts and rule files that help create good password candidates.
The following table compares the cracking speed of various hashing algorithms relative to SHA1:
algorithm | relative speed | time factor |
---|---|---|
NTLM | 540% | 0.19 |
MD5 | 280% | 0.36 |
SHA1 | 100% | 1 |
SHA256 | 40% | 2.5 |
NetNTLMv2 | 20% | 5 |
WPA2 | 0.005% | 20000 |
DCC2 | 0.004% | 25000 |
SHA512CRYPT | 0.002% | 50000 |
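The time factor is simply the inverse of the relative speed: an algorithm computed at 540% of SHA1's speed takes 100/540 ≈ 0.19 times as long to attack. A quick awk sanity check of the table values:

```shell
# time factor = 100 / relative speed (in percent)
awk 'BEGIN { printf "%.2f\n", 100/540 }'    # NTLM -> 0.19
awk 'BEGIN { printf "%.0f\n", 100/0.002 }'  # SHA512CRYPT -> 50000
```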
The 'Crack Me If You Can' challenges can be used to check your capabilities.
hash cracking
To run hashcat you will need the proprietary graphics drivers (on Arch Linux, the packages nvidia and cuda for NVIDIA cards, or opencl-amd for AMD).
brute force
john --incremental=Alnum <hash_file> --format=raw-md5
john --mask=password?d?d?d <hash_file>
hashcat -a 3 -m 2100 dcc2.hash -i ?l?l?l?l?l?l?l?l
hashcat -a 3 -m 5600 netntlmv2.hash <hcmask_file>
wordlist
john --wordlist=/usr/share/wordlists/rockyou.txt <hash_file> --format=nt
hashcat -a 0 -m 2500 wpa.hccap big.txt -r rules/best64.rule
With the parameter -O you can speed up hashcat, but be aware that password candidates with more than 15 characters are skipped this way.
hybrid
hashcat -a 6 -m 1800 linux_sha512.hash wordlist.txt mask.hcmask
hashcat -a 7 -m 1000 ntlm.hash mask.hcmask wordlist.txt
show results
You can use standard Linux tools to print a nice table of the results.
join -i -t: -14 -21 -o1.1,2.2 <(sort -t: -k4 ntlm.pwdump) <(sort ntlm.pot) | sort | column -t -s:
wordlist creation
automated tools
cewl -w ./filename.cewl "http://en.wikipedia.org/wiki/Password_cracking" -d 0
crunch 12 12 -t SP-7%3FA1%%% -o arcadyan.lst
grep -E '\w+' -ho text_files/* | sort -u > unique.words
spellchecking lists
The Linux spellchecker aspell ships good wordlists for different languages.
aspell dump master -d en_US | cut -d '/' -f 1 > english_US.txt
cleanup wordlist
Depending on the scenario it may be useful to remove non-printable characters or lines exceeding a certain length.
grep -ax '.*' rockyou.txt > rockyou.printable
sed '/.\{50\}/d' rockyou.printable > rockyou.p.trimmed
sort -u rockyou.p.trimmed > rockyou.p.t.sorted
If you want to keep the original order or sort by line length, use awk.
awk '!x[$0]++' file > unsorted_unique
awk '{ print length, $0 }' file | sort -n -s | cut -d" " -f2- > line_length
Sometimes a ruleset is optimized for a certain wordlist format; a common one is all-lowercase words.
sed -i 's/^\(.*\)$/\L\1/' wordlist.txt
benchmark wordlists
Hashcat offers different debug modes, which log the following information to the debug file.
debug-mode | debug-file content |
---|---|
1 | <matching_rule> |
2 | <matching_word> |
3 | <matching_word>:<matching_rule> |
4 | <matching_word>:<matching_rule>:<found_password> |
hashcat -m 99999 <passwords> <wordlist> -r <rules> --potfile-path tmp.pot --debug-mode 2 --debug-file tmp.matched -o /dev/null
sort tmp.matched | uniq -c | sort -rn > result.txt
cut -b 9- result.txt | head -n 10000 > best_words.txt
wordlist rules
To create and evaluate wordlist rules I use a list of leaked passwords like rockyou.txt and a smaller wordlist.
benchmark rules
hashcat -m 99999 <passwords> <wordlist> -r <rules> --potfile-path tmp.pot --debug-mode 1 --debug-file tmp.matched -o /dev/null
sort tmp.matched | uniq -c | sort -rn > result.txt
cut -b 9- result.txt | head -n 100 > my_best100.rule
With gnuplot the results can be plotted into a graph.
gnuplot -p -e 'plot "result.txt"'
grep regex to count rules
grep -P "^(((l|u|c|C|t|T.|r|d|p.|f|\{|\}|\\\$.|\^.|\[|\]|D.|x..|O..|i..|o..|\'.|s..|@.|z.|Z.|q|X...|4|6|M|k|K|\*..|L.|R.|\+.|\-.|\..|\,.|y.|Y.|E|e.) ?){1}\$)" all.rules
If you want to filter your rules for a certain password complexity, the crackstuff repository contains the script complexity_filter.sh.
complexity_filter.sh all.rules > complex.rules
non ascii charsets
If your target lies outside the English-speaking world, you may need to try non-ASCII characters in passwords. This will only be successful if you convert your foreign wordlist into the same encoding used by the targeted hash. hashcat can also do this conversion on the fly.
iconv -f utf-8 -t cp1252 -o german.cp1252 german.utf8
hashcat -m 1000 hash.txt wordlist.utf8 --encoding-from=utf8 --encoding-to=iso8859-1
The following command replaces umlauts in a German text.
cat german.txt | sed 's/Ä/Ae/g' | sed 's/Ö/Oe/g' | sed 's/Ü/Ue/g' | sed 's/ä/ae/g' | sed 's/ö/oe/g' | sed 's/ü/ue/g' | sed 's/ß/ss/g' | grep -Px '[\x00-\x7f]*'
NTLM is MD4 over the UTF-16LE encoded password. To crack NTLM hashes of such passwords you can therefore perform the charset conversion manually by using MD4 (-m 900) with UTF-16LE encoded password candidates.
hashcat -m 900 hash.txt wordlist.utf8 --encoding-to utf16le
To create an NTLM hash, the following Perl script can be used.
perl -e 'use utf8; use Encode; use Digest::MD4 "md4_hex"; print md4_hex(encode ("UTF-16LE", "password"))."\n";'
show active codepage in windows
chcp
show active encoding in linux
locale
transfer from windows encoding to linux encoding
cat wordlist.win | iconv -f cp850 > wordlist.utf8
brute force masks
To reduce the number of guesses during brute force cracking, hashcat can use charset masks. These can be generated with the tool mask_extractor.py from my github repository. To evaluate the quality of a mask, the repository contains the tool mask_eval.py.
./mask_extractor.py <found.pwds> | sort | uniq -c > extracted.masks
./mask_eval.py extracted.masks | sort -rn | head -n 50 > best50.hcmask
hashcat -a3 -m0 hashes.txt best50.hcmask
If password policies are known, masks should be checked against them. For example, the following loop keeps only masks containing at least three different charset groups.
while read -r i; do [ "$(echo "$i" | grep -o '[lusd]' | sort -u | wc -l)" -gt 2 ] && echo "$i"; done < file.hcmask
The tool princeprocessor generates password candidates by chaining words from a given wordlist. Since combinations of words from different languages are usually unwanted, the wordlist should contain only a single language.
princeprocessor < english.txt | hashcat -m 1000 ntlm.hash
charset statistics
awk -vFS="" '{for(i=1;i<=NF;i++)w[$i]++}END{for(i in w) print i,w[i]}' wordlist | sort -rn -k2
rainbowtables
For unsalted hashes rainbowtables can be precomputed. On Windows the GUI tool ophcrack can be used; on Linux there is rainbowcrack.
rtgen ntlm alpha 1 4 0 1000 1000 0
rtsort .
rcrack . -h 0cb6948805f797bf2a82807973b89537
rcrack . -ntlm <pwdump file>
online cracking
Sometimes you cannot copy the hash to crack it offline on your computer. Instead, an online attack has to be executed by automatically trying to log in with credentials from a file.
hydra
hydra -l user -P wordlist ssh://<target>
hydra <target> https-post-form "/login.php:user=^USER^&pass=^PASS^:Error message" -L userlist -P wordlist
generating passwords
There are multiple options to make your passwords harder to crack. If you can choose the encryption or hashing method, select the strongest one. The password itself should not be included in a wordlist and should not be derivable from one by simple rules, so you could generate it from random characters or words.
openssl rand 400 | tr -dc 'A-Za-z0-9' | sed 's/.\{4\}/&./g' | head -c 19; echo
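For the word-based approach, a passphrase can be built from randomly chosen dictionary words. The inline word list below is only a stand-in; in practice you would feed shuf a large wordlist such as /usr/share/dict/words.

```shell
# sketch: join four random words with dashes; the inline list is a
# placeholder -- use a real dictionary file for actual passwords
shuf -e correct horse battery staple alpine falcon marble thunder -n 4 | paste -sd '-'
```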
Password managers like KeePassXC can be used to store passwords that are difficult to remember.