Does anyone know of any off-the-shelf tool (online or offline) to find duplicates across several DNS blocklists and merge them into one?
Context: I am running AdGuard on a GL.iNet router with ~10 blocklists, some of them pretty huge, and most of the time when the lists are updated the router comes to a halt while doing so, so I often end up having to reboot it the old power-off-and-on way.
I would rather download the lists myself from time to time and merge them into one file, but with the duplicates removed somehow.
If I’m understanding you correctly, you could use a shell script for this. Use wget to download the lists, then combine them into a single large file, and finally create a new file with no duplicates using awk '!visited[$0]++':
wget URL1 URL2 URL3
cat *.txt > all.txt  # this overwrites all.txt
awk '!visited[$0]++' all.txt > no_duplicates.txt
When no tool is available, bash to the rescue. Thank you for this, it seems actually simpler than I thought :)
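If you want something you can rerun from time to time, here's a rough sketch of the same idea wrapped in a small script - the URLs and file names are just placeholders:

#!/bin/sh
# rough sketch: download a few blocklists, merge them, and drop duplicate lines
# the URLs below are placeholders - swap in your real lists
OUT=merged_blocklist.txt
URLS="https://example.com/list1.txt https://example.com/list2.txt"

: > combined.tmp
for url in $URLS; do
    # -q quiet, -O - write to stdout so we don't depend on the downloaded file names
    wget -q -O - "$url" >> combined.tmp
    # make sure each list ends with a newline before appending the next one
    echo "" >> combined.tmp
done

# keep only the first occurrence of every line
awk '!visited[$0]++' combined.tmp > "$OUT"
rm combined.tmp

Point AdGuard at the resulting single file instead of the ten separate lists and it only has to load one.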
I doubt you’ll find something off the shelf for this. I wrote a PowerShell script that deduplicates lists and also does a pass over the results to convert any blocks to CIDR notation. If you’re interested I’ll share it.
But honestly you could probably have ChatGPT whip this up for you in your language of choice. It’s pretty straightforward.
I’d like to see your script.
Sorry it took a while, I’m currently on vacation! But I had some time to reread it and sanitize it for public sharing. Here you go:
ok yikes, Lemmy really didn’t like me pasting all that code even in a code block. I’ll have to put it up somewhere else, stand by.
Hopefully this works better: Pastebin link
Isn’t there a tool developed by the AdGuard team to handle exactly this?
Just looked through my files; check out this tool, it does exactly what you want: https://github.com/AdguardTeam/HostlistCompiler
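I haven’t used it recently, but going from what I remember of the README, you install it from npm and point it at a JSON config listing your sources and the transformations to apply (including a dedupe pass). Something roughly like this - the field and transformation names are from memory, so double-check them against the repo:

npm install -g @adguard/hostlist-compiler

# write a minimal config (field/transformation names from memory - check the repo README)
cat > configuration.json <<'EOF'
{
  "name": "My merged blocklist",
  "sources": [
    { "source": "https://example.com/list1.txt" },
    { "source": "https://example.com/list2.txt" }
  ],
  "transformations": ["RemoveComments", "Deduplicate", "Validate"]
}
EOF

# compile everything into a single deduplicated list
hostlist-compiler -c configuration.json -o merged.txt

The output is one compiled list you can then feed to AdGuard.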
Thank you, this looks promising.
AFAIK Pi-hole does parse and then merge the lists into a single blocklist.
Update: Never mind, they do it by design (assuming this statement is still correct): https://github.com/pi-hole/pi-hole/issues/2013#issuecomment-817901839
What you could also do is manually combine the text files in a text editor like Notepad++ and deduplicate from there (Notepad++ can do it natively).
Try an already combined list: https://oisd.nl/
If you’re looking for blocklists, I use /u/Wally3k’s lists as well as the /u/LightSwitch05 “Developer Dan” lists.
I no longer personally use the OISD lists, as the maintainer tells you not to use any lists other than theirs, which makes it difficult to impossible to use the groups feature. Instead, I use a mix of lists and regex blocks. Nor do I recommend the “Quantum” blocklist that’s been going around - here’s why
I also suggest these regex blocks
Make sure you read what the different symbols mean in Wally’s blocklists before applying every one of them. If you stick with the check-marked lists you should find that they block ads without too many false positives.
More blacklisted items doesn’t mean more items blocked; oftentimes adding too many lists will break legitimate websites.
If you want to, you can re-evaluate the added lists after 14-30 days using this tool (not supported by the Pi-hole devs) to audit which lists are actually used. I’ve run this tool and discovered that several lists I added weren’t doing anything at all (if you need help with this tool, please use the GitHub page to discuss).
With the release of v5, memory usage has been reduced when using additional blocklists. Also note that with v5, lists are no longer “deduped”.