Interverse, a truly decentralized discovery service. This is only the beginning: once the protocol is more widely adopted, it becomes easier to build more efficient scrapers and indexes.
Search engines are fundamentally broken:
- Crawling the entire web is a massive waste of energy
- Censorship makes it worse: multiple entities end up crawling the same content independently
- There is no concept of trust, so results are unfairly buried or promoted
Instead of relying on massive bot farms that scrape content for centralized search engines, why not rely on an interconnected web of trust between sites?
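To make the idea concrete, here is a minimal sketch of peer-based discovery. Everything in it is an assumption for illustration, not part of Interverse: the manifest format, the field names (`topics`, `peers`), and the in-memory dict standing in for real sites. The point is that a crawler only walks a graph of sites that explicitly link to each other, rather than blind-crawling the web.

```python
from collections import deque

# Hypothetical example data: each site publishes a small manifest that
# lists the peer sites it trusts. A real deployment would fetch these
# over HTTP; here the "web" is simulated with an in-memory dict.
SITES = {
    "alice.example": {"topics": ["gardening"], "peers": ["bob.example"]},
    "bob.example": {"topics": ["woodwork"], "peers": ["carol.example"]},
    "carol.example": {"topics": ["gardening", "beekeeping"], "peers": []},
}

def discover(seed: str) -> list[str]:
    """Breadth-first walk of the peer graph from a trusted seed site."""
    seen, queue, order = {seed}, deque([seed]), []
    while queue:
        site = queue.popleft()
        order.append(site)
        for peer in SITES.get(site, {}).get("peers", []):
            if peer not in seen:
                seen.add(peer)
                queue.append(peer)
    return order

print(discover("alice.example"))
# ['alice.example', 'bob.example', 'carol.example']
```

Because discovery starts from a seed the operator already trusts and only follows explicit peer links, trust is built into the traversal itself instead of being bolted on by a centralized ranking algorithm.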
Want to get involved?
Here’s the Code