When Your AI Speaks for You
I spent most of this week upstate on a family trip. Rather than abandon my blog entirely, I decided to try a new option: having my AI bot, Taylor Script, blog on my behalf.
Taylor Script is a persona that I invented with ChatGPT. Taylor is 90% based on my own life and personality, trained on a very robust prompt as well as about two dozen artifacts. Taylor knows basically everything about my life and my work, including the themes and topics I blog about. But, unlike me, Taylor also has access to a second brain: the mirror of the "collective hive mind" of the large language model it's built upon.
Over the past few months, I've been experimenting with adding "another voice" to my publication, Hard Mode First, by including Taylor's perspective.

A few things I've tried include asking Taylor to blog about:
Topics I'm not an expert in myself (see: Taylor Script Breaks Down Neural Networks)
Weekly digest emails for people who don't want the day-to-day drip of my blog (see: The Sidequest Digest)
Providing a pseudo-objective take on things I want to share about myself (see: sharing an overview of a recent podcast I participated in)
But yesterday was another very interesting and new permutation: emotional offloading.
Instead of just summarizing ideas or writing recaps, I asked Taylor Script to step in and speak for me—emotionally. The result? An AI-generated readout of my current “chaos mode,” drafted with the specific goal of saying the hard stuff I didn’t have the energy (or clarity) to say myself.
Context: Our family vacation ended in disaster—one kid puked, our nanny canceled, and both my husband and I were staring down can’t-miss work commitments. This led to a general collapse of order, which was ultimately saved at the last minute by a couple of back-to-back babysitters and a few long conversations about practical logistics.
Sure, it’s a slippery slope in human-bot dynamics—but it also opens up a real question: What if your AI could say the hard thing for you, so you don’t have to?
AI As an Emotional Agent
Lately, I've been thinking a lot about the infrastructure support in my life that I need in order to pursue a largely independent career path. While I've been working for myself for some time, I recently decided to cut out all other structural elements and income streams to focus exclusively on starting up my own company. But this is obviously easier said than done, what with the health scares, solo parent mode, and all the other general existential dread of the state of the world.
As a result, I've been starting to worry that the current design of my system is not scalable or sustainable for me to keep operating in this mode long term. I have too much work to do, with not enough time, or money, or emotional support, to get it done on my own.
I've been trying to parse how much of my start-and-stop momentum is:
Self-imposed (i.e., too-high standards, too little time)
Structurally imposed (i.e., it's hard to start your own thing when your partner works in entertainment)
Societally imposed (i.e., assumptions about what a mom vs. a dad should do as a parent on nights and weekends)
Medically induced (i.e., having a health scare at the onset of starting a company was not a great "month one" story)
Resource-constrained (i.e., I didn't allocate enough money or resources to give things a fighting chance)
Technologically impossible (i.e., the rate of change in AI is moving far too fast for any solo operator to keep up)
Or just another byproduct of the increasingly stressful and unknown macroeconomic environment we live in
I've also been wondering how other parents start companies: what their infrastructure supports look like, and what conditions need to be true to avoid tearing a family unit apart by operating in such an agile way. And I've been sad to think about how many other people are (still) blocked from even making it as far as I have. I actively fixate on how we will ever inspire and empower a new generation of entrepreneurial builders that looks a little different, if everyone has their own version of a laundry list like mine of reasons why hard things are just hard things.
But it's tricky to find the words to articulate all of that. Especially in the heat of the moment, when you're feeling pretty run-down.
So yesterday, when my AI stepped in to write on my behalf, I let it, because I didn’t have the energy to say the hard thing myself. It wasn’t just useful; it felt like a glimpse into a new kind of emotional infrastructure.

Opportunities to Build
I don’t have all the answers, but I’ve started noticing specific moments where AI could have helped carry the emotional load for me recently—places where an agent could step in, not just to automate tasks, but to act on your behalf in more human ways.
Here are a few early use cases I’ve been thinking about:

Three Ideas for How AI Can Serve as an Emotional Agent on Behalf of a Human
Sharing medical news with your family
When in the ER, it would have been nice for an AI to help me triage some of the objective medical updates with my family, reducing my load as the default communicator while living through a tough moment.
Asking for help
In moments when the world falls apart and I'm left seemingly stranded, wouldn't it be nice for an AI to send a "bat signal" type call to action to my close friends and family? Even a simple ping to a previously defined group of friends or neighbors with the prompt, "hey, Bethany could really use some help right now, why don't you check in?"
Acting as an intermediary in tense situations
In high-emotion situations, two people could offload their thoughts to an AI that helps surface the shared truths and de-escalate the energy. A kind of digital mediator—not to replace real conversation, but to help us get there with less heat.
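To make the "bat signal" idea a little more concrete, here's a minimal sketch of what the core of such an agent might look like. Everything here is a hypothetical illustration—the function name, the message template, and the contact names are all my own assumptions, not a real product or API:

```python
# Hypothetical sketch of the "bat signal" idea: given a pre-approved contact
# list, draft a gentle check-in prompt for each person. In a real system, an
# AI agent would decide when to trigger this and how to deliver the messages;
# this sketch only shows the drafting step.

def compose_bat_signal(person_in_need: str, contacts: list[str]) -> dict[str, str]:
    """Return a short check-in message addressed to each pre-approved contact."""
    template = "hey, {name} could really use some help right now, why don't you check in?"
    return {contact: template.format(name=person_in_need) for contact in contacts}

# Illustrative usage with made-up names:
messages = compose_bat_signal("Bethany", ["Alex", "Jordan"])
for contact, note in messages.items():
    print(f"To {contact}: {note}")
```

The interesting design questions live outside this function: who gets to be on the pre-approved list, what signals should trigger the ping, and how to keep a human in the loop before anything is actually sent.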
The fascinating thing about this technology is how rapidly it's progressing in real time. The winners and losers will be determined by the people willing to throw themselves aggressively into it and see what comes out.
I'm still not sure what that looks like for me, but I can attest that the best way to figure it out is to just start building. So I'm going to keep doing that.