
Sometimes a book doesn’t just change how you think—it changes what you build.
I was about halfway through Safiya Noble’s Algorithms of Oppression: How Search Engines Reinforce Racism when my brain switched into what I call “build a better mousetrap” mode. Noble’s meticulous documentation of how Google’s algorithms perpetuate racism, sexism, and the marginalization of vulnerable communities wasn’t just academic criticism. It was a blueprint for what not to do.
And that’s when the question hit me: What if you could use Islamic ethical principles as the technical specification for a search algorithm?
The Insight That Sparked Everything
Here’s what clicked: Islamic ethics, when properly applied (not cultural interpretations, but the actual foundational principles), explicitly prohibit the exact behaviors Noble documents as algorithmic oppression:
- Sexual exploitation? Haram.
- Economic predation? Haram (riba).
- Racism and discrimination? Haram.
- Marginalization of the vulnerable? Haram.
- False witness and spreading misinformation? Haram.
I wasn’t thinking about building a “Muslim search engine.” I was realizing that a comprehensive ethical framework already existed that, when applied as filtering criteria, would inherently solve the problems Noble identified.
This wasn’t niche. This was a better mousetrap.
The Decision to Build
I’ve been in tech long enough to know that “I should build that” and actually building it are separated by an ocean of complexity. But I also knew I had the pieces:
- A background in marketing and communications (an MA from Georgetown, plus years in nonprofit digital strategy)
- Enough Python knowledge to be dangerous
- Access to AI tools (OpenRouter, Claude) that could handle the heavy lifting
- Most importantly: a clear vision of what “good” looks like
So on November 11, 2025, I opened a new conversation with Claude and said, “Let’s play a game… There is an endpoint to this. We will start here… what is a search engine like Google or Yahoo?”
That conversation became the blueprint for NotHere.one.
Building the Five-Tier Filter
The technical architecture came together over two months (with a one-month pause while I handled day-job responsibilities). The system I built uses five filtering tiers:
Tier 1: Hard Blocks
Some content doesn’t deserve a platform. Payday loans, adult content, gambling sites, hate groups identified by the SPLC and CAIR, deliberate misinformation (flat earth, Holocaust denial): these never make it into the index. Not censorship. Standards.
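For the technically curious, a hard block boils down to a set-membership test that runs before anything else. Here’s a minimal Python sketch; the category names and data shapes are illustrative, not the production blocklist:

```python
# Minimal sketch of a Tier 1 hard-block check.
# Category names are illustrative, not NotHere.one's actual taxonomy.
BLOCKED_CATEGORIES = {
    "payday_loans", "adult_content", "gambling",
    "hate_group", "deliberate_misinformation",
}

def passes_tier_one(page_categories: set[str]) -> bool:
    """Reject a page outright if it matches any hard-blocked category."""
    return page_categories.isdisjoint(BLOCKED_CATEGORIES)

# A page tagged as a payday lender never reaches the index.
assert not passes_tier_one({"payday_loans", "finance"})
```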
Tier 2: Quality & Authority
Domain age, backlink analysis, and content quality. The technical stuff that separates signal from noise.
Tier 3: Equity Boosts
Here’s where it gets interesting: minority-owned businesses get a +15 ranking boost. Women-owned? +15. Veteran-owned? +15. B-Corp certified? +10. These can stack, so a business owned by a woman veteran gets +30 before we even look at content quality.
This isn’t affirmative action. This is algorithmic justice. We’re deliberately counteracting the systemic bias that buries these voices in mainstream search results.
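To make the stacking concrete, here’s a simplified sketch of how the boosts combine. The attribute names are illustrative placeholders; only the point values come from the system as described above:

```python
# Tier 3 boost stacking, using the point values described above.
# Attribute names are illustrative placeholders.
EQUITY_BOOSTS = {
    "minority_owned": 15,
    "women_owned": 15,
    "veteran_owned": 15,
    "b_corp_certified": 10,
}

def equity_boost(attributes: set[str]) -> int:
    """Sum every boost a business qualifies for; boosts stack."""
    return sum(pts for attr, pts in EQUITY_BOOSTS.items() if attr in attributes)

# A business owned by a woman veteran: +15 +15 = +30.
assert equity_boost({"women_owned", "veteran_owned"}) == 30
```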
Tier 4: Media Literacy Layer
When content makes extraordinary claims, contradicts scientific consensus, or shows patterns of statistical manipulation, users see a context box with fact-checking resources. And here’s the gamification part: click that context box? You earn Media Literacy Points. We’re turning critical thinking into a game.
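Stripped down, the Tier 4 logic is a flag check plus a reward hook. A rough sketch, with the flag names and point value as illustrative assumptions:

```python
# Sketch of the Tier 4 decision: attach a context box when a result
# trips a credibility flag, and award points when the user opens it.
# Flag names and the point value are illustrative assumptions.
CREDIBILITY_FLAGS = {
    "extraordinary_claim",
    "contradicts_scientific_consensus",
    "statistical_manipulation",
}

def needs_context_box(page_flags: set[str]) -> bool:
    """Show a context box if the page trips any credibility flag."""
    return not page_flags.isdisjoint(CREDIBILITY_FLAGS)

def award_media_literacy_points(score: int, opened_box: bool, points: int = 5) -> int:
    """Points accrue only when the user actually engages with the context."""
    return score + points if opened_box else score
```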
Tier 5: Islamic Alignment
I processed 29,240 entries from the Quran and Hadith corpus through AI to extract 509 themes across four categories: core values, social ethics, halal (encouraged), and haram (prohibited). Every page gets scored against this framework. A medical article discussing alcohol? That’s nuanced context, and it stays in the index. A bar’s promotional site? Filtered out.
Cost for the entire theme extraction: less than $60. Cost for ongoing AI analysis: less than a penny, because I only run expensive checks on flagged content.
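That penny-level cost comes from a cheap-first, expensive-later pipeline: a free lexical pass over the extracted themes runs on every page, and the paid AI call fires only when that pass flags something ambiguous. Here’s a simplified sketch using OpenRouter’s OpenAI-compatible endpoint; the model choice, prompt, and function names are illustrative assumptions, not the production code:

```python
# Sketch of the cheap-first, expensive-later scoring gate.
# Model choice, prompt, and names are illustrative assumptions.
import requests

def cheap_theme_scan(text: str, haram_terms: set[str]) -> bool:
    """Free lexical pass: does the page mention any prohibited theme at all?"""
    return not set(text.lower().split()).isdisjoint(haram_terms)

def expensive_context_check(text: str, api_key: str) -> str:
    """Paid LLM pass: is a flagged mention nuanced (medical, scholarly) or promotional?"""
    resp = requests.post(
        "https://openrouter.ai/api/v1/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "model": "anthropic/claude-3.5-haiku",  # illustrative model choice
            "messages": [{
                "role": "user",
                "content": (
                    "Does this text discuss a prohibited topic in a medical, "
                    "educational, or scholarly context, or does it promote the "
                    "topic? Answer NUANCED or PROMOTIONAL.\n\n" + text[:2000]
                ),
            }],
        },
        timeout=30,
    )
    return resp.json()["choices"][0]["message"]["content"].strip()

def score_page(text: str, haram_terms: set[str], api_key: str) -> str:
    if not cheap_theme_scan(text, haram_terms):
        return "PASS"  # no flags means no API spend
    return expensive_context_check(text, api_key)
```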
The Results
As of today, NotHere.one has indexed over 40,000 pages with 1.2 million URLs queued for processing. The crawler runs continuously on Railway, automatically reseeding when the queue empties. The system is live.
Health metrics show that 98.6% of crawled pages are indexable (i.e., they pass basic quality checks), and 79% score “high quality” (composite score above 70).
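Those two numbers fall out of simple counts over the index. A simplified sketch, with illustrative field names and the 70-point threshold from above:

```python
# Sketch of the health-metric computation; field names are illustrative.
def health_metrics(pages: list[dict]) -> dict:
    """Share of pages that are indexable, and share scoring 'high quality'."""
    if not pages:
        return {"indexable_pct": 0.0, "high_quality_pct": 0.0}
    total = len(pages)
    indexable = sum(1 for p in pages if p["passes_basic_checks"])
    high_quality = sum(1 for p in pages if p["composite_score"] > 70)
    return {
        "indexable_pct": round(100 * indexable / total, 1),
        "high_quality_pct": round(100 * high_quality / total, 1),
    }
```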
More importantly, the system works as intended. Search for “Islam,” and you get Islamic scholarship, not terrorist associations. Search for women’s health, and you get medical resources, not exploitative content. The algorithm serves users instead of surveilling them.
What This Really Means
I’m not trying to replace Google. That’s not the point. What I’ve proven is that you can build a search infrastructure that refuses to profit from harm. That algorithmic oppression isn’t inevitable—it’s a choice.
Noble’s book documents how tech companies chose profits over people, allowing their algorithms to reinforce every societal bias while claiming algorithmic neutrality. What NotHere.one demonstrates is that you can choose differently.
The name says it all. When someone searches for payday loans or hate group propaganda or anti-vaccine conspiracy theories, they get a simple message: “Not Here.”
Exploitation? Not here.
Oppression? Not here.
Deception? Not here.
The Bigger Picture
I built this as a one-person project in my spare time over two months, for less than $100 in API costs. I’m not a Google engineer. I don’t have venture capital. I just read a book that made me angry enough to build something better.
If I can do this, imagine what a team could build. Imagine what properly funded infrastructure could accomplish. Imagine if the major tech companies actually gave a damn about the communities their algorithms harm.
NotHere.one isn’t finished—there’s still work to do on the frontend, more refinement needed in the scoring algorithms, and plans for mobile apps. But it’s live. It’s functional. And it proves that another way is possible.
Try It Yourself
The search engine is live, though still in beta. The owl mascot (we call him Scout) is there to guide users toward good information and away from the noise. Every search is an opportunity to practice media literacy. Every result shows transparency tags explaining why it ranked.
This is what happens when you let Islamic ethics—or any comprehensive ethical framework—drive your technical decisions instead of just slapping a “values statement” on a Wall Street algorithm.
Safiya Noble documented the problem with meticulous precision. I built one small piece of the solution. What will you build?
About This Project:
NotHere.one is a values-based search engine that uses Islamic ethical principles to filter exploitative content while amplifying marginalized voices. Built by B.C. Dodge, it’s currently in beta with over 40,000 indexed pages. The project was inspired by Safiya Noble’s “Algorithms of Oppression” and developed over eight weeks in late 2025.
Technical Stack: Python, PostgreSQL, Redis, Flask, Railway.app, OpenRouter API
Comments
What are your thoughts on algorithmic justice and values-based search? Have you read “Algorithms of Oppression”? What would you build differently? Share in the comments below.
