Divorced? With kids? And an impossible ex? There's an AI for that


Sol Kennedy uses an AI assistant to read the messages his ex-wife sends him. After the couple split in 2020, Kennedy says, he found their communication difficult. He'd get an email, or a torrent of them, mixing matters about their kids with unrelated emotional baggage, and trying to respond would ruin his day. Kennedy, a blockchain founder and Silicon Valley investor, was in therapy at the time, but outside the weekly sessions he felt the need for real-time support.

After the divorce, their communications shifted to a platform called OurFamilyWizard, which hundreds of thousands of parents in the United States and abroad use to exchange messages, share calendars, and track expenses. (OFW keeps a time-stamped, court-admissible record of everything.) Kennedy paid extra for an add-on called ToneMeter, which OFW described at the time as "emotional spell checking." As you draft a message, the software performs a basic sentiment analysis, flagging language that could come across as "alarming," "aggressive," "annoying," "offensive," and so on. But there was a problem, Kennedy says: Neither he nor his ex-wife seemed to be heeding ToneMeter.
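The "emotional spell checking" idea amounts to scanning a draft against categories of charged language. Here is a minimal, keyword-based sketch of that concept; the word lists, labels, and function name are illustrative assumptions, not OurFamilyWizard's actual lexicon or algorithm:

```python
# Toy sketch of ToneMeter-style "emotional spell checking":
# flag draft text that matches lists of charged phrases.
# The phrase lists and category labels are assumptions for illustration.

CHARGED = {
    "aggressive": {"always", "never", "your fault"},
    "offensive": {"liar", "selfish"},
    "alarming": {"or else", "last warning"},
}

def flag_tone(draft: str) -> list[tuple[str, str]]:
    """Return sorted (category, phrase) pairs found in the draft."""
    text = draft.lower()
    hits = []
    for category, phrases in CHARGED.items():
        for phrase in phrases:
            if phrase in text:
                hits.append((category, phrase))
    return sorted(hits)

if __name__ == "__main__":
    msg = "You ALWAYS do this. You're a liar."
    for category, phrase in flag_tone(msg):
        print(f"{category}: {phrase!r}")
```

A production system would use a trained sentiment model rather than keyword lists, but the user-facing behavior is the same: warnings surface as you type, and nothing stops you from sending the message anyway.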

Kennedy, an early adopter, had been experimenting with ChatGPT to "co-create" bedtime stories with his children. Now he turned to it for advice on communicating with his ex-wife. He was fascinated, and he wasn't the first. Across Reddit and other online forums, people with difficult exes, family members, and coworkers have been posting in shock about the seemingly excellent guidance, and precious emotional validation, that a chatbot can provide. Here was a machine that could tell you, without any apparent agenda, that you were not crazy. Here was an adviser that would patiently hold your hand, 24 hours a day, as you waded through any amount of bullshit. A "scalable solution" to complement therapy, as Kennedy puts it. Finally.

But ChatGPT, fresh out of the box, was too talkative for Kennedy's needs, he says, and too apologetic. He fed it hostile messages, and it recommended that he respond to them (at greater length than necessary): I'm sorry, please forgive me, I will do better. It had no self-esteem, no self-respect.

Kennedy wanted a chatbot with a backbone, and he figured that if he built one, a lot of other parents might want it too. As he saw it, AI could help them at every stage of their communications: it could filter emotionally triggering language out of incoming messages and summarize only the facts. It could suggest appropriate responses. It could, Kennedy says, point users toward a "better way." So he founded a company and started developing an app. He called it BestInterest, after the standard courts often apply in custody decisions: the "best interest" of the child or children. He took OpenAI's off-the-shelf models and gave them a backbone with his own prompts.
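The pipeline Kennedy describes (filter the incoming message, keep the facts, then prompt a model for a calm reply) can be sketched in a few lines. Everything below is an assumption for illustration: the rule-based fact filter, the prompt wording, and the request shape; the article says only that the real app layers custom prompts over OpenAI's models, and no API call is made here.

```python
# Sketch of a BestInterest-style pipeline:
# (1) strip emotionally charged language from an incoming message,
# (2) keep only the logistics, (3) assemble an LLM request for a
# brief, non-apologetic reply. The filter heuristic and prompt text
# are illustrative assumptions, not the app's actual implementation.

import re

SYSTEM_PROMPT = (
    "You help a co-parent reply to a hostile message. "
    "Respond briefly, factually, and without apology or blame."
)

def extract_facts(message: str) -> list[str]:
    """Crude stand-in for the filtering step: keep only sentences
    that mention logistics (numbers, weekdays, pickups, the kids)."""
    factual = re.compile(
        r"\b(\d|monday|tuesday|wednesday|thursday|friday|"
        r"saturday|sunday|pickup|school|kids?)\b", re.I)
    sentences = re.split(r"(?<=[.!?])\s+", message)
    return [s for s in sentences if factual.search(s)]

def build_request(message: str) -> dict:
    """Assemble a chat-completion-style request (not sent anywhere here)."""
    facts = " ".join(extract_facts(message)) or "(no factual content found)"
    return {
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Draft a reply to: {facts}"},
        ]
    }

if __name__ == "__main__":
    incoming = ("You are unbelievable and everyone knows it. "
                "Pickup is moved to 5pm Friday.")
    print(build_request(incoming)["messages"][1]["content"])
```

In a real product the filtering step would itself be a model call rather than a regex, but the division of labor is the point: the insult never reaches the user, and the reply model is steered, via the system prompt, away from the reflexive apologies Kennedy disliked.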

Separated partners end up fighting horribly for any number of reasons, of course. For many, perhaps even most, things calm down after enough months have passed, and a tool like BestInterest may not be useful in the long run. But when a certain personality type is in the mix (call it "high-conflict," "narcissistic," "controlling," "toxic," or whatever "crazy-making" synonym dominates your internet feed), the fighting over the kids, at least on one side, never stops. Kennedy wanted his chatbot to stand up to such people, so he turned to the person they would likely hate most: Ramani Durvasula, a Los Angeles-based clinical psychologist who specializes in how narcissism shapes relationships.

