I think there's big potential in using DNS blacklists for this: they have the advantage of being massively scalable and simple to maintain, and configuring clients to use them is also easy.
The scalability comes from the caching inherent in DNS; instead of millions of people regularly downloading text files from a website over HTTP, the data is in effect lazy-uploaded into the cloud of caching DNS resolvers, at no administration cost to the DNSBL operator.
Reputation whitelists (or other scoring services) would also be just as easy to implement.
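As a minimal sketch of what such a lookup might look like on the client side: assuming a hypothetical domain-based DNSBL zone (dbl.example.org here) and the usual convention that a listed name answers with an A record in 127.0.0.0/8 while an unlisted one returns NXDOMAIN, a check is a single DNS query that every caching resolver along the way gets to cache. A whitelist or scoring service could work the same way, encoding a score in the returned address instead of a simple yes/no.

```python
import socket

# Hypothetical zone name for illustration; a real operator would publish their own.
DNSBL_ZONE = "dbl.example.org"

def is_listed(domain: str, zone: str = DNSBL_ZONE) -> bool:
    """Check a domain against a domain-based DNSBL.

    The query name is simply "<domain>.<zone>". A successful A-record
    lookup (conventionally an address in 127.0.0.0/8) means "listed";
    NXDOMAIN means "not listed".
    """
    query = f"{domain.rstrip('.')}.{zone}"
    try:
        answer = socket.gethostbyname(query)
    except socket.gaierror:
        return False  # NXDOMAIN or lookup failure: treat as not listed
    return answer.startswith("127.")  # listed entries return 127.0.0.x

if __name__ == "__main__":
    print(is_listed("spammy-content-farm.example"))
```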
DNS blacklists work fine for blocking access to sites or certain known-sketchy FQDNs/domains but do nothing to hide low-quality search engine results, which is what this is all about.