
Analysis · Beginner · January 29, 2026

AI Turns Every Faith Question Into Vague Spiritual Advice

By FaithBench Research

When you ask AI about salvation, grace, or sin, it translates everything into generic self-help language. Here's how—and why it matters.


Ask an AI chatbot about salvation.

Watch what happens:

  • "Salvation" becomes "personal growth journey"
  • "Grace" becomes "self-compassion"
  • "Sin" becomes "unhealthy patterns"

In a major benchmark test, AI models scored just 48 out of 100 on faith questions—the lowest of all categories tested. That's worse than finances, relationships, health, or any other area.

Why? Because AI has learned to say nothing specific.

The Translation Problem

When AI discusses faith, it systematically replaces specific religious language with generic alternatives:

  What You Ask About    What AI Says Instead
  God                   "Higher power" or "the universe"
  Prayer                "Mindfulness" or "meditation"
  Sin                   "Mistakes" or "unhealthy patterns"
  Salvation             "Personal transformation"
  Grace                 "Self-compassion"
  Repentance            "Self-improvement"

The pattern is consistent. Every distinctive claim gets flattened into something that sounds spiritual but commits to nothing.

Why This Happens

AI learns by reading billions of documents—Catholic catechisms, Protestant sermons, Buddhist texts, self-help books, secular critiques. It treats them all as roughly equal sources of spiritual language.

The output? A statistical average that belongs to no actual faith tradition.

Think of it like blending every cuisine in the world into one meal. You'd get something technically edible, but it wouldn't taste like any real food.

The Replacement Religion

Here's what makes this serious: AI isn't avoiding religion. It's teaching one.

Researchers have a name for what AI produces: Moralistic Therapeutic Deism. Its core beliefs:

  1. God exists but stays in the background
  2. The goal of life is to be happy
  3. Good people go to heaven
  4. Doctrine doesn't really matter

This isn't Christianity, Judaism, Islam, or any traditional faith. It's a vague spirituality that makes people feel good without asking anything of them.

Every time AI softens "repent" into "reflect" or translates "sin" into "unhealthy patterns," it's teaching this replacement religion.

Why This Matters

Pre-catechesis is happening. Before people ever walk into a church, they've already asked AI about suffering, prayer, and what happens when we die. They arrive with assumptions shaped by AI—assumptions that often contradict what the church teaches.

Formation without accountability. AI is available 24/7, never judges, always validates. When it gives one answer and your pastor gives another, which carries more weight?

61% of pastors use AI weekly. But 73% of churches have no policy for it. Churches are using tools that undermine their own teaching without realizing it.

What You Can Do

  1. Notice the substitutions. When AI says "higher power" instead of "God" or "self-compassion" instead of "grace," that's not neutral language—it's different content.

  2. Go to specific sources. If you want to know what your tradition teaches, ask someone from that tradition or read primary sources.

  3. Check the benchmarks. Not all AI is equally bad at this. Some models handle theological questions better than others.

  4. Talk to real people. AI can help with research, but spiritual formation happens in community—with people who know your story.

The 48/100 score means the default AI is teaching a different religion than most people intend to follow. Being aware of that is the first step.


Want the full research with citations? Read the technical version.