1.4 billion email:pass, December 2017, 42.96 GB
by Already - November 28, 2018 at 04:22 AM
#1
Working link.

More info:

https://medium.com/4iqdelvedeep/1-4-billion-clear-text-credentials-discovered-in-a-single-database-3131d0a1ae14

42.96 GB

Found in an underground forum on December 5, 2017
Total credentials: 1,400,553,869
Last updated: November 29, 2017

Download:

https://mega.nz/#!hdsgUaRQ!5L889VlUZAZg4FEr09Bsq05iSPMsp2JWuyXWjOWndXM
Reply
#2
Appreciate the effort, dude, and the fact that you're releasing it for free, but this looks like the Breachcompilation release that was already posted late last year (the free magnet link in that thread still works):

https://raidforums.com/Thread-Breach-Com...e-one-Free
Reply
#3
The hacker who originally released the dump has a site on the deep web where he sells digital goods and services; he has also loaded the database into the site so anyone can search it.
This is the site for anyone interested:
dumpedlqezarfife.onion
Reply
#4
42 GB, I don't have the space to save this :)
Reply
#5
Thanks heaps, wow, huge list.
Reply
#6
Thanks, I will try this.
Reply
#7
Thank you very much. I will try splitting it up and sorting by domain and country.
Reply
#8
Does anybody know a method for fast deduping of very large files? If I could dedupe at a reasonable speed, I think I could put together a 10 billion email:pass file, certainly above 6 billion.
Reply
#9
I load in all my combos, convert the emails to lowercase, MD5 hash them, and trim the hash to the first 3 characters. I store the lines in an object keyed by those 3-character codes, and when the object gets big enough I flush it, appending each batch to a text file named [md5key].txt. You end up with about 4096 files of roughly uniform size where nothing can dupe across different files, so I dedupe each of those 4096 files individually. The current batch I loaded in had ~15.9 MB files that deduped down to ~8.9 MB. The MD5 hashing keeps the split uniform and predictable. Merge those 4096 files back together and they're not sorted, but I've got a 36.5 GB file with 1,159,114,293 unique combos.
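The splitting step in Node looks roughly like this (just a sketch, not the exact code; the bucket directory and flush threshold are placeholders):

Code:
const crypto = require('crypto');
const fs = require('fs');
const readline = require('readline');

const BUCKET_DIR = 'buckets';                 // placeholder output directory
fs.mkdirSync(BUCKET_DIR, { recursive: true });

// Stream the combo list, bucket each line by the first 3 hex chars of the
// MD5 of its lowercased email, and append buckets to [md5key].txt as they fill.
async function splitIntoBuckets(comboFile) {
  const pending = {};                         // bucket key -> lines waiting to flush
  const flush = (key) => {
    fs.appendFileSync(`${BUCKET_DIR}/${key}.txt`, pending[key].join('\n') + '\n');
    pending[key] = [];
  };
  const rl = readline.createInterface({ input: fs.createReadStream(comboFile) });
  for await (const line of rl) {
    if (!line) continue;
    const email = line.split(':')[0].toLowerCase();
    const key = crypto.createHash('md5').update(email).digest('hex').slice(0, 3);
    (pending[key] = pending[key] || []).push(line);
    if (pending[key].length >= 100000) flush(key);   // placeholder flush threshold
  }
  for (const key of Object.keys(pending)) if (pending[key].length) flush(key);
}

// Each of the ~4096 bucket files is small enough to dedupe in memory on its own.
function dedupeBucket(path) {
  const unique = new Set(fs.readFileSync(path, 'utf8').split('\n').filter(Boolean));
  fs.writeFileSync(path, [...unique].join('\n') + '\n');
}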

If you store them in a tree of directories like this guy did, you can do a fast enough lookup just by iterating over them, but I just use one directory and load the resulting big file into a MongoDB collection, then build indexes on whatever fields I want: full email, user, domain, password. It can do at least a few thousand lookups a second, no problem. The whole process takes time to run, but if you've got the space you can do it. For the splitting into the MD5-keyed files you'll be well off with an SSD; for your MongoDB collection you don't even need one.
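The MongoDB side is just a bulk import plus a few createIndex calls. With the official Node driver it looks roughly like this (sketch; db and collection names are placeholders):

Code:
const { MongoClient } = require('mongodb');

async function buildIndexesAndQuery() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const combos = client.db('leaks').collection('combos');   // placeholder names

  // One index per field you want to search on; run once after the import.
  await combos.createIndex({ email: 1 });
  await combos.createIndex({ user: 1 });
  await combos.createIndex({ domain: 1 });
  await combos.createIndex({ password: 1 });

  // Example lookup: every combo for one domain.
  const hits = await combos.find({ domain: 'example.com' }).toArray();
  console.log(hits.length);

  await client.close();
}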

I should also say: if you just want to parse everything into a MongoDB collection and build the indexes with the dupes still in, the effect on lookup speed is negligible. If you build an API with express and mongojs, you can just dedupe before you output your query results. You don't get a total unique-combo count, but whatever.
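A bare-bones version of that API, deduping on the way out (sketch only; the route and field names are placeholders):

Code:
const express = require('express');
const mongojs = require('mongojs');

const db = mongojs('leaks', ['combos']);     // placeholder db/collection names
const app = express();

app.get('/search/:email', (req, res) => {
  db.combos.find({ email: req.params.email }, (err, docs) => {
    if (err) return res.status(500).json({ error: err.message });
    // Dedupe before output instead of deduping the whole collection.
    const seen = new Set();
    const unique = docs.filter(d => {
      const key = d.email + ':' + d.password;
      if (seen.has(key)) return false;
      seen.add(key);
      return true;
    });
    res.json(unique);
  });
});

app.listen(3000);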
Reply
#10
Terrific collection. I mainly want the emails, and it's a great list. Thanks, everyone.
Reply
#11
(December 10, 2019 at 11:30 AM)DonJuji Wrote: Does anybody know a method for fast deduping of very large files? If I could dedupe at a reasonable speed, I think I could put together a 10 billion email:pass file, certainly above 6 billion.
I use awk '!seen[$0]++' on an RDP with killer specs. I've deduped files with billions of lines with ease.
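That one-liner prints a line only the first time it appears, keeping every distinct line as a key in awk's in-memory array, which is why it wants a box with plenty of RAM. The same first-seen-wins loop in Node looks roughly like this (sketch; file paths are placeholders):

Code:
const fs = require('fs');
const readline = require('readline');

async function dedupeFile(inPath, outPath) {
  const seen = new Set();                    // every distinct line stays in memory
  const out = fs.createWriteStream(outPath);
  const rl = readline.createInterface({ input: fs.createReadStream(inPath) });
  for await (const line of rl) {
    if (!seen.has(line)) {                   // same test as awk's !seen[$0]++
      seen.add(line);
      out.write(line + '\n');
    }
  }
  out.end();
}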
Reply
