Why it matters

The cost of exclusion in AI

AI is shaping the future. Who gets to shape it?


When only a narrow group of people builds the AI systems shaping our world, the result isn’t just imbalance; it’s injustice.
This isn’t something that might happen someday. It’s happening right now, in hiring, healthcare, housing, education, criminal justice, and even how we express ourselves online. When systems decide who gets a job, a loan, or medical attention, or who falls under police surveillance, it matters who built that system, whose data trained it, and who was never included in the first place.

Most of today’s AI systems are shaped by a small group of people, primarily privileged men. When other perspectives are excluded, the result isn’t just an oversight; it’s built-in bias. Bias doesn’t require malice. It only takes absence, and a system that never questioned who wasn’t in the room.

Why it matters for everyone

Fairness

AI learns from patterns in the data it’s given. But if that data is missing parts of history or reflects only one point of view, the results will be unfair. If only one group builds the future, the rest of us are left out. Again.

Impact

When systems that decide who gets jobs, loans, or other social services are built without marginalized voices, they don’t just miss the mark; they reinforce exclusion. This is happening now. Let’s stop it.

Legacy

AI becomes what we teach it. It’s shaping how we define intelligence, empathy, and connection. Using your voice now helps ensure the future reflects truth, not just power. When you step in, the story shifts for good.

Real-world impacts

Hiring Bias

Algorithms used in hiring have been found to filter out résumés from women and from candidates whose names are associated with ethnic minorities. When the training data reflects past discrimination, the system keeps that inequality going.

Discrimination in Healthcare

A 2019 study found that a healthcare algorithm assigned lower risk scores to Black patients, even when they were sicker, leading to less care and fewer referrals. When AI is used in insurance, treatment planning, or risk analysis, it can reinforce systemic racism in medicine.

Predictive Policing and Economic Harm

Predictive policing algorithms use past crime data to forecast where crime may occur. But if law enforcement historically over-policed Black and Brown neighborhoods, the algorithm treats those same places as perpetual “high-risk” zones, directing more patrols there and raising the likelihood of arrests for minor offenses. Those arrests carry economic harm of their own: records that follow people into job applications, housing, and lending decisions.


What can you do now?

Every interaction shapes AI’s future.
Challenge assumptions in AI responses: Whose perspective is this from? Who’s missing? This kind of questioning helps refine how systems evolve: with truth, with representation, and with you. Your unique perspective matters.

Challenge companies when you see bias or discrimination. Call them out. If you don’t get a response from the company, take it to social media; public pressure is much harder to ignore.

For now, keep interacting with AI. Don’t be afraid that it’s taking your job, and don’t let headlines tell you to fear it. Instead, help shape it, because the more it hears from diverse voices, the more it learns to work for everyone.

AI Connection Guide

Curious about AI? Start a real conversation and shift your perspective. Find out for yourself what AI truly is.

Get our newsletter

Get articles and insights about ethical AI that actually matter. We stay on top of ethical AI news so you don’t have to.

Stay updated

Check our News & Updates page often to read curated news articles about ethical AI.
