The Justice Mirror Conundrum
AI now gives ordinary people access to powerful investigative tools. Public records, property transfers, court filings, genealogies, and financial histories can all be analyzed at scale. This opens the door to surfacing long-buried injustices—land theft, exclusion, exploitation, erased contributions. Patterns that were once too complex or buried too deep can now be uncovered with a prompt.
For many, this feels like long-overdue progress: the ability to expose harm no longer rests solely with governments or academics. But turning on that spotlight comes with a price. AI does not draw moral lines between perpetrators, bystanders, and beneficiaries. The same data that uncovers stolen land or suppressed voices might also reveal how your own family, workplace, or neighborhood quietly profited. The lines blur fast.
What happens when the tools you use to seek justice for others bring uncomfortable truths about your own story?
The conundrum:
If you want AI to surface hidden injustices and hold others accountable, are you also willing to let it judge you by the same standard—or does justice lose meaning when we only aim it outward?
This episode is curated by Brian using ChatGPT, Perplexity Pro, and Google NotebookLM. Intro: Brian. Hosts: AI.