XOXO, Your AI Dream Girl
I remember the exact moment I realized my gender was costing me money.
It was 2 AM on a Tuesday. I’d been watching YouTube videos that started with “If I Was Starting My Business Over in 2026, I Would…”
My eyes were tired. My back hurt. But I was taking notes. A lot of them.
Then I checked my LinkedIn.
A man I’d never met had posted the exact same insight I’d shared two weeks earlier.
Word for word, same structure, same conclusion. His post? 4,000 likes, 200 comments.
Mine? 47 views. Mostly from friends being supportive.
Same idea. Same platform. Different chromosome. Different result.
I sat there and thought: What would happen if I just changed my name to “Chris” instead of “Christine”?
What if I rewrote my bio to sound more like him?
What if I stopped being myself and started being the version the algorithm rewards?
Turns out, I wasn’t the only one asking that question.
Felice Ayling and other women decided to test what we’d all been feeling.
They switched their LinkedIn profiles to present as male. Changed pronouns.
Used male names or gender-neutral avatars.
Engagement jumped 300%. Some saw 400%. One woman watched her impressions quadruple overnight. Not because she said anything different. Not because her expertise suddenly improved. But because the algorithm thought a man was saying it.
The response from male colleagues was predictable. “Cherry-picked data.” “Confirmation bias.” “Maybe your content just got better.”
But here’s the thing about gaslighting: it only works if you’re alone. And these women weren’t alone. They were a pattern.
When someone suggested the men run the reverse experiment (switch to female presentation and see what happens), the conversation got real quiet real fast. Nobody wanted to risk the hit to their reach.
Which tells you everything about what they actually believe.
While we were debating LinkedIn algorithms, former NBA star Matt Barnes got extorted for $61,000 by a woman who doesn’t exist.
Not a scammer with a fake profile. An actual AI-generated woman. Someone (probably multiple people, probably men) created her from scratch. Gave her a face designed to trigger certain responses. A voice calibrated for intimacy. A personality optimized for extraction.
They studied what Matt Barnes needed to hear. What he wanted to feel. What he’d pay for. Then they delivered it, pixel-perfect and profitable.
Men everywhere are roasting him. The internet is laughing. Another rich man got played by his own ego and loneliness.
But I keep thinking about that 2 AM moment. About choosing between authenticity and visibility. About performing a version of yourself that pays better than the real one.
Matt Barnes didn’t create the market that made his AI woman profitable.
He just paid retail price for what the rest of us buy wholesale every single day.
And then, while we were all processing the Matt Barnes saga and the LinkedIn experiments, Hollywood quietly announced they’re replacing actresses with AI.
Her name is Tilly Norwood. She’s 23, brunette, “versatile.” She doesn’t exist.
She’s an AI actress created by a tech company pitching her to studios as the future of entertainment. The sales pitch is explicit: Tilly is cheaper than real actresses. More reliable. She won’t age. Won’t get pregnant. Won’t demand equal pay. Won’t report harassment. Won’t have creative opinions about her character.
She’s the perfect woman, if your definition of perfect is “completely controllable and infinitely exploitable.”
I saw this on Sunday morning, coffee going cold in my hands.
I thought about every actress I know fighting for work past 40.
Every woman who’s been told she’s “too difficult” for pushing back on a director’s vision.
Every harassment case that got quietly settled.
Every salary negotiation that went nowhere.
And I thought: They’re not building Tilly because there’s a shortage of talented women. They’re building her because real women have needs. Boundaries. Opinions. Lives.
They’re building her because treating actual women fairly is apparently harder than creating synthetic ones from scratch.
Once you see it, you can’t stop seeing it.
Men create AI women to extract money from other men. The OnlyFans economy. The girlfriend apps. Matt Barnes’s lady. Now Hollywood’s Tilly Norwood.
Women create male personas to get basic professional respect. The LinkedIn experiments. The female authors using male pen names. Every woman in tech who learned to write emails without exclamation points, without “just,” without any marker that might reveal her gender.
Both are responding to the same broken system. Both are saying: The authentic version of me doesn’t perform as well as the fictional one.
The only difference is motivation. Men are performing femininity for profit. Women are performing masculinity for survival. And we’re pretending these are unrelated phenomena.
What I Learned Building in the Margins
I’ve been building apps with AI for twelve months now. Not a long time in tech years. An eternity in AI years.
Every single project taught me the same lesson in different ways: The algorithm doesn’t care who you are. It cares what performs.
When I write technical content under my own name, I get modest engagement. When I reframe the exact same insight using more assertive language, stripping out the “I think” and “maybe” and “in my experience,” it performs better.
Not because the information improved. Because I code-switched into what sounds like authority.
Here’s what makes my stomach hurt: I’m good at this now. I can write two versions of the same post. The authentic one, full of qualifiers and questions and collaborative language. And the one that performs, full of declaratives and certainty and masculine linguistic patterns.
Guess which one pays my bills?
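The mechanical half of that rewrite is depressingly easy to automate. Here’s a minimal sketch, assuming a deliberately tiny hedge list; the `deperform` name and the qualifier patterns are mine, not a real tool, and real code-switching is far subtler than find-and-replace, which is part of the point:

```python
import re

# Hypothetical hedge list: the phrases I strip when rewriting
# "collaborative" prose into what the algorithm reads as authority.
QUALIFIERS = [
    r"\bI think\b",
    r"\bmaybe\b",
    r"\bin my experience\b",
    r"\bjust\b",
]

def deperform(text: str) -> str:
    """Strip hedging phrases and tidy the leftover punctuation."""
    for q in QUALIFIERS:
        text = re.sub(q, "", text, flags=re.IGNORECASE)
    text = re.sub(r"\s+([,.!?])", r"\1", text)   # " ," -> ","
    text = re.sub(r"\s{2,}", " ", text).strip()  # collapse doubled spaces
    text = re.sub(r"^[,\s]+", "", text)          # drop leading comma debris
    return text[:1].upper() + text[1:] if text else text

print(deperform("I think, in my experience, this approach just works."))
# The hedged sentence comes out flat and declarative.
```

Twenty lines to erase the linguistic markers I spent a lifetime acquiring.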
The Hollywood Thing Is the Same Thing, Just Louder
When Hollywood creates Tilly Norwood, they’re not doing anything the rest of us aren’t already doing. They’re just being more explicit about it.
They looked at the economics and realized: synthetic women are more profitable than real ones. They’re available 24/7. They don’t age out of roles. They can’t report you to HR. They can’t negotiate for fair pay. They can’t exist as full humans with needs beyond your production schedule.
Every studio executive who green-lit Tilly made the same calculation the OnlyFans operators made, the same calculation the LinkedIn algorithm makes, the same calculation I make every time I rewrite a post to sound more “authoritative.”
The market rewards performed identity over authentic identity. Always has. AI just made the performance more efficient.
And here’s what keeps me up at night: I understand why it works.
The Part Where I Admit My Complicity
I could pretend I’m above this. That I only write in my authentic voice, damn the metrics.
But that would be a lie.
Last month, I rewrote my LinkedIn headline. Removed some warmth. Added some edge. Changed “helping women learn to code” to “building AI literacy for next-gen technologists.” Same mission. Different framing.
My profile views doubled.
I felt sick about it for three days. And I kept the new headline anyway. Because the opportunities I needed materialized. Because survival isn’t betrayal, even when it feels like it.
This is the trap we’re all in. The authentic version doesn’t get seen. The performed version pays rent.
Matt Barnes performed availability and got an AI girlfriend who cost him $61,000. Women on LinkedIn perform masculinity and get career opportunities. Hollywood performs concern about representation and builds synthetic women instead of hiring real ones.
We’re all in costume. The only question is who’s profiting from the performance.
What the Algorithm Learned From Us
Here’s the thing about AI that nobody wants to face: it’s not making these decisions in a vacuum.
The algorithm that boosts male-coded content over female-coded content? We trained it. Every click, every engagement, every time we paid more attention to assertive language over collaborative language, we taught it what we value.
The market that makes AI girlfriends profitable? We created it. Every time we designed systems where women’s value is tied to availability and appearance rather than autonomy and expertise, we built the foundation for synthetic replacements.
The Hollywood executives creating Tilly Norwood? They’re responding to decades of data showing audiences will pay for the fantasy of women without the reality of women’s needs.
We built this. All of us. Through a million small choices about who we take seriously, what content we engage with, which performances we reward.
And now the algorithm is just scaling our biases to industrial efficiency.
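To make that feedback loop concrete, here’s a toy simulation. Every number in it is invented for illustration, not drawn from any real platform, but the mechanism is the one described above: if historical engagement is biased toward assertive, male-coded phrasing, then any ranker that optimizes clicks inherits the bias automatically.

```python
import random

random.seed(0)

def simulate_clicks(assertive: bool, quality: float) -> int:
    """1,000 impressions; click probability mixes quality with a bias."""
    base = quality                    # what we'd like to reward
    bias = 0.2 if assertive else 0.0  # what history actually rewards
    return sum(random.random() < min(base + bias, 1.0) for _ in range(1000))

posts = [
    {"author": "A", "assertive": True,  "quality": 0.30},
    {"author": "B", "assertive": False, "quality": 0.35},  # the better post
]
for p in posts:
    p["clicks"] = simulate_clicks(p["assertive"], p["quality"])

# A click-optimizing ranker just sorts by observed engagement...
ranked = sorted(posts, key=lambda p: p["clicks"], reverse=True)
# ...so the lower-quality but assertive post rises to the top.
print([p["author"] for p in ranked])
```

No one coded “prefer men” anywhere in that ranker. It only ever saw clicks.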
The Conversation I Had With My Younger Self
Sometimes I imagine going back to that 12-year-old building websites for her friends. The one who thought technology was a place where creativity and logic could coexist. Where making things beautiful and making things work were the same skill.
I want to warn her. Tell her that some of the boys in her CS class will grow up to create AI women because real women are too complicated. That some of the girls will have to pretend to be men just to get their ideas heard. That Hollywood will build synthetic actresses while real ones fight for basic dignity.
But I also want to tell her: You were right about something important.
The skills they dismissed as “not real coding” (the empathy, the user understanding, the ability to build things that actually serve human needs), those are the skills that matter most in the age of AI.
Because the algorithm can generate code. It can create synthetic faces. It can perform any identity we program it to perform.
But it can’t understand what it means to be human. What it costs to survive in systems designed to extract from you. What it takes to build something that serves rather than exploits.
That understanding? That’s still ours. For now.
What This Means for All of Us
I don’t have a tidy conclusion. I don’t even have a clear solution.
What I have is this: We built a world where being yourself is economically irrational. Where men make more money creating fake women than most real women make being themselves. Where women get more opportunities pretending to be men. Where Hollywood would rather build synthetic actresses than pay real ones fairly.
And we’re all complicit. Every time we engage with a bot. Every time we code-switch for professional survival. Every time we reward the performance over the person.
Matt Barnes didn’t get scammed because he’s uniquely foolish. He got scammed because he wanted connection and the market delivered a simulation of it. That’s not stupidity. That’s late capitalism.
The women changing their LinkedIn profiles aren’t betraying some feminist ideal. They’re surviving in a system that penalizes authenticity. That’s not weakness. That’s rational economic behavior.
Hollywood creating Tilly Norwood isn’t some unprecedented evil. It’s just the logical conclusion of an industry that’s always valued the appearance of women over the reality of women’s humanity.
We can be mad about all of it. We should be mad about all of it.
But we also need to reckon with how we got here. What we keep rewarding. What we keep building. What we keep teaching the algorithm about who deserves to be seen, heard, valued.
The Question I Can’t Stop Asking
Late at night, when I’m the only one awake, I think about authenticity.
What version of me performs best online? And how far is that from who I actually am?
How much of my digital presence is strategy versus truth?
When I choose my words, curate my posts, select which projects to share, how much of that is “being myself” and how much is “being the version of myself that the algorithm rewards”?
I don’t know if these questions have good answers. I don’t even know if they should.
What I know is this: Matt Barnes got finessed by an AI lady facade. Women are code-switching their gender on LinkedIn to get basic professional respect. Hollywood is building synthetic actresses while real women fight for equal pay.
And all three of these things are the same story, told in different keys.
The story of a world where performance pays better than presence. Where synthetic versions of humanity are more valuable than actual humans. Where everyone’s in costume and nobody wants to admit we’re all at the same party.
The show’s not stopping. The algorithm’s not unlearning. The market’s not correcting.
The only question is whether you’re going to keep pretending you’re not on stage.
Because I’m done pretending.
I’m a woman building apps with AI. My authentic voice gets modest engagement. My performed voice pays my bills. I switch between them based on what I need that day. Survival. Visibility. Sometimes just the satisfaction of being seen.
Matt Barnes and I aren’t that different. He paid $61,000 for a simulation of connection. I pay with my authenticity every time I rewrite a post to sound more like what performs.
We’re both trying to make it work in systems that were never designed for our fullest selves.
The difference is, I know I’m performing.
And I’m tired of it.
But not tired enough to stop. Not yet. Not when rent is due and the algorithm is watching and the next woman who needs to learn vibe coding is searching LinkedIn for someone who sounds like they know what they’re talking about.
So I’ll keep building. Keep code-switching. Keep performing whatever version of myself opens doors for the women coming behind me.
And I’ll keep asking: What would it look like if we didn’t have to? What would it feel like to show up fully, authentically, without calculation?
I don’t know. But I know I want to find out.
First, though, we have to admit we’re all wearing masks.
Even the ones we can’t see.
Especially those.

Wow. Okay. Deep breaths.
AI Meets Girlboss is a pink, feminine brand in a space that rewards male-coded voices. 🩷🦩 A space where the bestsellers are 90% male.
Yes, I chose that intentionally. Maybe it won’t top the bestseller charts. Maybe it won’t fit the performance the market rewards, but it’s real. And I’d rather show up as myself than shrink into a persona that converts better.
I wonder what the results would be if the same experiment were run across race instead of gender: a Black, brown, or Asian man switching his profile to present as a white man. I posit the results would be similar.