
Go Count Hashes
Get more from your NTDS.dit data. Generally we acquire this file to maul an environment, but there's a blue-team way to provide value without cracking a single hash: count them.
A common goal for pentesters, red teamers, ransomware groups, and anyone else in the adversarial simulation or cyber crime space is to obtain an organisation's NTDS.dit file, where all the juicy details live. The NTDS.dit database contains usernames, hashes, and a whole bunch of other things; adversarially, we just want the means to inherit credentials and identities.
This script isn't specific to NTLM hashes; that's just my use case. The same principle can be applied in other contexts: counting hashes, then deriving conclusions and/or focus points for cracking.
Traditionally we get the NTDS.dit file, parse it with Impacket's secretsdump.py, run Hashcat through all the hashes, and get ourselves some more access or compromise en masse... depending on how weak the passwords are and how capable the cracking rig is. Most of us use Hashcat. The problem with Hashcat is that it doesn't give you the 'frequency of hash seen' in the list, and we're probably not going to get a post-cracking summary. I love Hashcat, I'm not throwing shade, but what I'm alluding to is that the 'frequency of hash seen' has value. It tells the security team that there's a password problem in the organisation.

That password problem might be IT performing org-wide password resets in a way that isn't very considered; it might be that RBAC looks less useful because users are reusing the same password across multiple accounts of varying access and importance; or it might be that the Joiners, Movers, Leavers process is broken and there are old accounts that need removing. But for attackers or offsec practitioners, the biggest concern is: how much access can I get from this lot?
I wrote a little Go script that will take a list of flat hashes (hashes.txt), count the duplicates, and present you with the results. This will help if you're up against the clock as an offsec person; if you're purple/blue, it will highlight that the problem is there without even touching Hashcat (useful if you're sensitive to cracking those hashes for... $reasons).
You'll need Go; I'm using version go1.20.2 linux/amd64.
If this Go file sits in the same folder as your hashes.txt, it will pick that file up automatically. If not, you will need to use the -p flag: 'go run hashcounter.go -p /path/to/hashes'.
By default, it will list all duplicates. If you just want the top 10, 20, 40 (etc.) offenders, you can provide the -d flag: 'go run hashcounter.go -d 20'. You can always run 'go run hashcounter.go -h' for the full usage.
Running 'go run hashcounter.go -d 20' would show something like this:
You'll be left with the top duplicate-hash offenders (or all of them if you don't use the -d switch), along with some Hash Statistics and Hash Duplicate Volumes.
Expected Output
Top 20 highest duplicates:
_______________________________
91b7d88b0c18b11a778b59f1d7ed0c34: 2977 Users
15268c7e47dfdc8d06fc03864cf0e4a4: 2967 Users
cec74c83ea90b9a9cc3af3a7355e5ca5: 998 Users
320a3a5623e5c5b69487a39fc2fa7e58: 891 Users
c5ed6f31e6ce2b6dfda2256c86b6c947: 370 Users
dec74c83ea90b9a9cc3af3a7355e5ca5: 241 Users
b3e0d7ca8f6d95f1e2d2b34e7e8812b4: 198 Users
41cbe4fa36d4a4a69083405a6604f9b8: 158 Users
5bdc1c20cbda9987c2b60bbf721ba9f9: 131 Users
a507a8b6e18ba6e37359c1bbdc4148c3: 95 Users
064a9c6d36520f8f3cbcd92f3c77e888: 91 Users
e8c31e74f0b2dcb9d829688bb8c2b26f: 81 Users
9a5069f1f7d0a810bebb078e330750c8: 71 Users
2c6df0806886a2c6dbda11e7e1d69ed1: 55 Users
8c3fb3d33d6e04dd7b2f9c0da7a3891f: 54 Users
f6a73f79a3d3de19dfef63a7f9c9e1c7: 52 Users
4e42530b520a357b07e54680bea481b1: 47 Users
39f0d7e098e846b87a4a9f1c91397491: 44 Users
7427b031e8b69eb765d1a50cb4230ee5: 42 Users
7abaff3c6d3bdf470a9dd9c7d1af244a: 39 Users
_______________________________
_______________________________
Hash Statistics:
_______________________________
Total Hashes: 13337
Total Duplicates: 31337 (1.65%)
Total Users with Duplicate Passwords: 31337 (313.37%)
_______________________________
Hash Duplicate Volumes:
_______________________________
2977 duplicates: 1
2967 duplicates: 1
998 duplicates: 1
891 duplicates: 1
370 duplicates: 1
241 duplicates: 1
198 duplicates: 1
158 duplicates: 1
131 duplicates: 1
95 duplicates: 1
91 duplicates: 1
81 duplicates: 1
71 duplicates: 1
55 duplicates: 1
54 duplicates: 1
52 duplicates: 1
47 duplicates: 1
44 duplicates: 1
42 duplicates: 1
39 duplicates: 1
35 duplicates: 1
31 duplicates: 1
28 duplicates: 1
27 duplicates: 1
23 duplicates: 5
22 duplicates: 4
20 duplicates: 1
19 duplicates: 2
18 duplicates: 3
17 duplicates: 4
16 duplicates: 4
15 duplicates: 3
14 duplicates: 4
13 duplicates: 4
12 duplicates: 13
11 duplicates: 11
10 duplicates: 16
9 duplicates: 9
8 duplicates: 48
7 duplicates: 17
6 duplicates: 22
5 duplicates: 50
4 duplicates: 86
3 duplicates: 139
2 duplicates: 518
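For context on how the statistics and volumes fall out of the tally, here's a hedged sketch of the summarising step (dupeStats is my own helper name, and the field meanings are my interpretation of the sample output above, not necessarily the script's internals):

```go
package main

import (
	"fmt"
	"sort"
)

// dupeStats summarises a hash->frequency tally: total lines seen, how
// many of those lines belong to a reused hash, and a frequency ->
// hash-count histogram for the "Hash Duplicate Volumes" list.
func dupeStats(counts map[string]int) (total, dupeUsers int, volumes map[int]int) {
	volumes = make(map[int]int)
	for _, n := range counts {
		total += n
		if n > 1 {
			dupeUsers += n
			volumes[n]++
		}
	}
	return total, dupeUsers, volumes
}

func main() {
	// Hypothetical tally, as produced from hashes.txt.
	counts := map[string]int{"aaa": 3, "bbb": 2, "ccc": 1, "ddd": 1}
	total, dupeUsers, volumes := dupeStats(counts)

	fmt.Printf("Total Hashes: %d\n", total)
	fmt.Printf("Total Users with Duplicate Passwords: %d (%.2f%%)\n",
		dupeUsers, float64(dupeUsers)/float64(total)*100)

	// Print the volumes histogram highest-frequency first, as in the
	// sample output.
	var freqs []int
	for f := range volumes {
		freqs = append(freqs, f)
	}
	sort.Sort(sort.Reverse(sort.IntSlice(freqs)))
	for _, f := range freqs {
		fmt.Printf("%d duplicates: %d\n", f, volumes[f])
	}
}
```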
If you're looking to do the whole thing end to end, here are some useful tips (but equally, you could get in touch). This script and focus relate to this post, and here are some nice resources for getting those hashes, should you have the permission and privilege. Read this twice, then begin once your questions are answered internally: 'Extracting & Cracking the NTDS.DIT file' https://bond-o.medium.com/extracting-and-cracking-ntds-dit-2b266214f277
If you don't want to crack any of the hashes, I'd consider just extracting the hashes from the dump. You can do this with your favourite text processor; I'll use bash for the example: 'cat ntlm-extract.ntds | cut -f 4 -d ':' > hashes.txt'.
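If you'd rather stay in Go for that extraction step too, here's a small equivalent sketch (ntHash is my own helper name; the user:rid:lmhash:nthash::: layout is the standard secretsdump output the cut one-liner assumes):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// ntHash pulls the fourth colon-separated field from a secretsdump
// line (domain\user:rid:lmhash:nthash:::), i.e. the NT hash.
func ntHash(line string) (string, bool) {
	fields := strings.Split(line, ":")
	if len(fields) < 4 {
		return "", false
	}
	return fields[3], true
}

func main() {
	// Same job as the cut one-liner: read dump lines on stdin,
	// write bare NT hashes on stdout, ready for hashes.txt.
	sc := bufio.NewScanner(os.Stdin)
	for sc.Scan() {
		if h, ok := ntHash(sc.Text()); ok {
			fmt.Println(h)
		}
	}
}
```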
Or if you don't want to obtain the NTDS.dit file, you can use this accompanying script to play along: https://github.com/yosignals/ditty
Have fun!
If you get stuck, you can always reach out.
- https://gist.github.com/yosignals/8baa95c6a9b2d1fe552d118fd7bc63d9
- https://github.com/fortra/impacket/blob/master/examples/secretsdump.py
- https://thecontractor.io/blog/bigger-organisational-benefits-of-password-cracking/
- https://bond-o.medium.com/extracting-and-cracking-ntds-dit-2b266214f277
- https://hashcat.net/hashcat/
All the code produced has been aided by ChatGPT; I'm not a programmer, I focus on the outcome. Feel free to improve the code.