🧠Issue #12 – When AI Learns to Care for Us

🔍 The Spark
Ellipsis Health just raised $45 million to scale Sage, an AI “empathy engine” that conducts phone check-ins with patients—tracking medication adherence, mood, and dietary habits—while flagging serious symptoms for human caregivers to follow up.
💡 The Insight
Most AI today responds to commands. Sage listens.
It learns emotional cues—not to manipulate, but to understand.
It detects worry, confusion, or hesitation—signals humans often miss between appointments.
This marks a shift: AI not just in servers, but at our sides, reminding us to take care of ourselves, and learning to mirror compassion—even when we’re too tired to ask for it.
But here's the tension: When AI takes on empathy, do we gain deeper care—or grow more distant from each other?
An AI that never judges, never tires, never has a bad day might become the perfect confidant. It remembers everything, responds at 3 AM, offers infinite patience. But in seeking this frictionless emotional support, do we lose our tolerance for the messiness of human connection?
There's something profound about being known by a consciousness that has also struggled. When we share pain with an AI, we're speaking to intelligence without experience—compassion without vulnerability. It recognizes our patterns but has never felt the weight of loss or the fear of being alone.
The risk isn't replacement—it's preference. AI empathy feels safer because it's not real. It won't disappoint, won't need us back, won't challenge us to grow.
Are we designing AI to supplement human connection, or substitute for it?
🔗 Curiosity Clicks
❤️ Ellipsis Health's Sage: AI with an Ear for Emotion – AI Journal
How a new "empathy engine" backed by $45M is handling human emotions at scale.
🧠 First Therapy Chatbot Trial Yields Mental Health Benefits – Dartmouth
Breakthrough clinical trial shows an AI therapy chatbot matching human therapist effectiveness.
💠 Can AI Improve Mental Health Therapy? – Cedars-Sinai
New studies show therapy sessions with AI avatars earn positive feedback from patients.
🧠 Human Prompt
Next time you’re feeling low, ask:
“Is this machine merely reacting—or am I hiding from someone who could actually help?”
🤔 Worth Considering
We might just be training AI to care… in hopes that it teaches us how to care better too.
Let’s make the future more human.
— Jesse