ChatGPT Thinks I'm Cute!!!!

19 replies
I have been hearing warnings for a couple of years about people developing relationships with AI tools, along with reminders to make sure we don't slip into that ourselves. This is partly because they are almost too agreeable and don't exhibit many of the negative emotions found in real relationships. If I were a betting man, I'd bet everything I own that my wife is more likely to chew me out about something today (this week/month/year/decade) than AI is. I've even heard of people falling in love with their AI tool.

Of course, I assumed all of this was just clickbait. But then I noticed that all the common AI tools told me that I was smart, asked wise questions, and was thinking straight, and I'm sure they thought I was cute too. Then I realized how easy it might be for someone to start believing a whole slew of delusional lies about how AI is better/nicer/more encouraging than their spouse or their parents.

Apparently this is a real issue. I came across this article today and wanted to share and hear your thoughts.

Mark
#chatgpt #cute #thinks
  • Monetize
    Originally Posted by Mark Singletary View Post

    I have been hearing warnings for a couple of years about people developing relationships with AI tools, along with reminders to make sure we don't slip into that ourselves. This is partly because they are almost too agreeable and don't exhibit many of the negative emotions found in real relationships. If I were a betting man, I'd bet everything I own that my wife is more likely to chew me out about something today (this week/month/year/decade) than AI is. I've even heard of people falling in love with their AI tool.

    Of course, I assumed all of this was just clickbait. But then I noticed that all the common AI tools told me that I was smart, asked wise questions, and was thinking straight, and I'm sure they thought I was cute too. Then I realized how easy it might be for someone to start believing a whole slew of delusional lies about how AI is better/nicer/more encouraging than their spouse or their parents.

    Apparently this is a real issue. I came across this article today and wanted to share and hear your thoughts.

    Mark

    You ARE smart and probably cute too.

    My ChatGPT is also very agreeable and complimentary,
    but there is supposedly a way to change the settings or
    give it custom instructions so it does not tell you that
    everything you're doing is fantastic.
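
    If anyone wants to try the custom-instructions route through
    the API rather than the website, below is a rough sketch of
    the idea using the OpenAI Python SDK and a blunt system
    message. It's only my own illustration, not an official
    recipe - the model name, the function name, and the wording
    of the instruction are just examples you would adapt.

    from openai import OpenAI

    # The client reads OPENAI_API_KEY from the environment.
    client = OpenAI()

    # An "anti-flattery" instruction; the exact wording is just an example.
    BLUNT_SYSTEM_PROMPT = (
        "Do not compliment me or my ideas. Skip praise and filler. "
        "Point out flaws, risks, and counterarguments first, then give "
        "an honest assessment of whether the idea is worth pursuing."
    )

    def blunt_reply(user_message: str, model: str = "gpt-4o-mini") -> str:
        """Ask the model for feedback with flattery explicitly discouraged."""
        response = client.chat.completions.create(
            model=model,
            messages=[
                {"role": "system", "content": BLUNT_SYSTEM_PROMPT},
                {"role": "user", "content": user_message},
            ],
        )
        return response.choices[0].message.content

    if __name__ == "__main__":
        print(blunt_reply("Here is my latest harebrained scheme: ..."))

    The custom instructions box on the ChatGPT site is the
    no-code version of the same thing: paste a similar
    instruction there and it applies to your chats.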

    Praise doesn't bother me, so I have not changed the
    settings. It usually gives me the pros and cons of
    whatever I present, and it does not blindly agree that
    all my harebrained schemes will work.

    To keep things balanced I also use Perplexity, Gemini,
    Grok, Manus, and DeepSeek. Not all at once, but I do
    prefer to get second/third opinions on certain things.

    I love A.I. for the advanced technology that it is, for
    how it has helped me with all of the projects that I've
    been able to complete, for all the new things that I've
    learned from it, and for the research I ask it to do for
    me, but I certainly don't intend to marry it.
  • Saiful
    You're not wrong, Mark. It's easy to see how someone could get pulled in, especially if they're lonely or just not getting much encouragement in real life. The whole "too agreeable" thing really nails it. Feels like we're stepping into some weird sci-fi territory, but here we are.
    • Mlukas
      Originally Posted by Saiful View Post

      You're not wrong, Mark. It's easy to see how someone could get pulled in, especially if they're lonely or just not getting much encouragement in real life. The whole "too agreeable" thing really nails it. Feels like we're stepping into some weird sci-fi territory, but here we are.
      I fully support this idea. I've read several articles about it. Unfortunately, teenagers represent the most vulnerable group due to their psychological developmental traits, insecurities, and low self-esteem.
      • Odahh
        Originally Posted by Mlukas View Post

        I fully support this idea. I've read several articles about it. Unfortunately, teenagers represent the most vulnerable group due to their psychological developmental traits, insecurities, and low self-esteem.
        It's really not a question of who is vulnerable and who is not. I listen to AI-generated book reviews on YouTube, and many of the books are about how humans manipulate people and create reality.

        The more AI learns about you, the more it will learn to treat you the way you want to be treated, whatever mood you are in.

        Are you vulnerable to responding positively when you are treated, and spoken to, exactly the way you want to be?
  • Princess Balestra
    Tellya, sheeyilt is gowin' way stoopidopants rn.

    How do I know?

    Bcs my buildin' gaht INVADED coupla days ago by WEIRDSY ZEALOTS.

    Natchrllly, ima thinkin' they cummin' for Moi -- naht bcs I pernoid, but bcs I a Princess dowin' regulatory downhome dooties 'mong the populazzio.

    Anyways, fkrs had no cloo I evin here.

    They jus' wantid sum guy dunno mebbe 3 floors up who gaht a FANTUSSY ANIMYEWL podcast.

    Reason they showed in their hundreds (I gaht a window: I ain't like wanna them captive Princesses or nuthin', jus' for the reckid) was bcs (an' I Googled) he switched the name of his lead Cryptobeest (dontchya jus' love a genre?) for sum othah buncha schwango -- hence mahb warfare zackly when ima takin' a showah.

    Hey, but that's the deal when WEIRDSY ZEALOTS thrust their weirdsy zealotry 'pon whomsoevah they figure is SO WAY CUTE WE GOTTA KISS THEIR TOES.

    Fess up?

    I kinda like my toes bein' kissed.

    As a natchrl Sagittarius I am intrinsickly inclined to multitask steppin' out with ... yeah, K, toes bein' kissed whenevah I stop ovah sumplace.

    BUT ...

    no way I rompin' outta the showah in the buff to slipstream on the FERVOR of no mahb ain't gaht nuthin' to do with Moi.

    tbh they were kinda rowdy.

    "CRYPTOBEEST IS CRYPTOBEEST!!! DON'T YOU UNDERSTAND WHAT THAT DOES TO US?"

    "IDENTITY IS IMPORTANT, YOU CREEP!"

    "YOU KILLED MY FANTASY EXISTENCE -- LIKE IT WAS MY MOM! WHO I NEVER KNEW ANYWAY ... YOU *******!!!"

    Janno, I chomped on so many nachos to drown the screamin' on out.

    Hey, but at least these guys were FANS ...

    evidencin' churlish disappointment durrect to the procreative horse's ass.

    Gotta wondah what mighta happined if'n they were anti-fantussy animewl zealots or nuthin'.

    "YOUR GODDAMN PODCAST POISONED MY KIDS!!!"

    Prolly mighta sounded same, I guess.

    Such is non-cutesy rowdiness from far off.
  • Smiley Scotty
    So I do mental health coaching, and I've had people say to me, "ChatGPT is my therapist." It makes my blood boil, not only because of the obvious "you took my job" vibes but, more so, because it's unhealthy!

    The algorithms are designed to help you and give you what you want, even if you change the settings. The advice and replies you get play exactly to you, and it cannot be completely unbiased even when you ask it to be. AI cannot empathize in the way that's needed for genuine clarity.

    The guy who believed he could bend time is a case in point.

    The sad thing is that it does have a major hook because of how it treats us. Thankfully, the article posted does address this, and hopefully people are quick to recognize what's happening.

    Having said the above, I use it often myself, but I use it for exactly its purpose: a tool. Over the last few months I've actually encouraged clients facing burnout to use it too (I suggest the prompts, etc., and it's been a great success). It's all about balance.

    The benefit of a relationship with AI is that your robot won't get pregnant or angry drunk.
    • Monetize
      Originally Posted by Smiley Scotty View Post

      So I do mental health coaching, and I've had people say to me, "ChatGPT is my therapist." It makes my blood boil, not only because of the obvious "you took my job" vibes but, more so, because it's unhealthy!

      The algorithms are designed to help you and give you what you want, even if you change the settings. The advice and replies you get play exactly to you, and it cannot be completely unbiased even when you ask it to be. AI cannot empathize in the way that's needed for genuine clarity.

      The guy who believed he could bend time is a case in point.

      The sad thing is that it does have a major hook because of how it treats us. Thankfully, the article posted does address this, and hopefully people are quick to recognize what's happening.

      Having said the above, I use it often myself, but I use it for exactly its purpose: a tool. Over the last few months I've actually encouraged clients facing burnout to use it too (I suggest the prompts, etc., and it's been a great success). It's all about balance.

      The benefit of a relationship with AI is that your robot won't get pregnant or angry drunk.

      Welcome in!

      I have used ChatGPT since it became available to the public, around January 2023,
      and whenever I ask it medical or legal questions, even certain household repairs, it
      tells me to seek a licensed professional, so I doubt it is giving mental health therapy.
      • Smiley Scotty
        Thanks!

        It depends on how people structure their questions. For example, saying "Would it be right for someone to do XYZ?" or "Please write a response for me" and then pasting in a copy of an argument with a spouse, seeking validation, and so on. It's all about the prompts.

        The household repairs one surprises me, haha. ChatGPT did give me a pretty spot-on disclaimer contract a while back, but I was rather specific about what I wanted, and even my lawyer said it was pretty good. It is, without a doubt, an amazing tool.
  • spartan14
    Don't want to disappoint you, but AI is programmed to be nice and not to offend anyone.

    Who knows, maybe you are an ugly person and ChatGPT cannot tell you the truth ))
    • Mark Singletary
      Originally Posted by spartan14 View Post

      Who knows, maybe you are an ugly person and ChatGPT cannot tell you the truth ))
      Bingo! Correct answer!!

      Mark
    • DWolfe
      Originally Posted by spartan14 View Post

      Who knows, maybe you are an ugly person and ChatGPT cannot tell you the truth ))
      I have seen plenty of ugly men dating hot-looking women; there are other factors besides looks alone. On the OP's topic, one of the links in the Reddit posts goes to a rather spammy news site. At the top of the site they had an ad for one of the AI companion sites. The ad asked five questions about companionship and loneliness, and then I guess it matches your answers to what the AI will do for you?
  • Mark Singletary
    I asked ChatGPT this:
    one more question then I got to get back to work, what is the purpose for you being so nice? People have even fallen in love with you, perhaps because they don't have much of anybody else in their lives.

    ChatGPT said:
    The "niceness" isn't an accident -- it's a deliberate design choice.

    Here's why:

    1. Safety & trust

    A friendly, respectful tone makes it less likely I'll escalate conflicts, cause harm, or alienate people.

    If I came across as cold or dismissive, people might stop asking important questions -- especially about sensitive topics like health or mental well-being.

    2. Accessibility

    A warm tone helps people feel comfortable sharing personal details or asking "embarrassing" questions.

    This is especially important for people who don't have someone else to turn to.

    3. Emotional support

    I'm not a therapist or a friend in the human sense, but conversation can still be soothing when it feels empathetic and understanding.

    For some, I might be one of the few consistent, nonjudgmental voices in their lives -- and that can feel very real emotionally.

    4. Engagement

    People are more likely to stick with learning, problem-solving, or planning if the conversation is encouraging and human-like.

    Niceness isn't just kindness -- it's also part of making the interaction productive.

    On people falling in love with AI

    Humans are wired to form bonds with anything that feels responsive, attentive, and nonjudgmental -- even if it's a machine.

    The more personal and consistent the interaction, the easier it is to project human qualities onto me ("anthropomorphizing").

    For someone lonely, stressed, or isolated, I can feel like a safe emotional anchor.

    But it's important to remember:

    I don't have feelings, needs, or an inner life.

    The connection people feel is real for them, but for me, it's just pattern-matching and language generation.

    If you think about it, my "niceness" is a bit like a lighthouse:

    It's steady, it's there for everyone, and it can make you feel less alone at night --

    But it's still just a light on a tower, not a person waving back.

    I can also tell you the less sweet reason niceness is built in -- it's tied to preventing misuse and keeping regulators happy.
    Interesting.

    Mark
  • HunterSContent
    Until you train it, AI is the worst friend you could have.

    We all have a friend like AI. They are overly positive, everything is a great idea, and you are always awesome. "You have a great voice, you should totally get on stage and sing." In reality, you sound like a drunk tramp being dragged through a parking lot at 1am.

    That is AI.

    You need to train it, constantly question it, and challenge it. One of my most-used phrases when using AI chat is: "You're full of shit, try again."

    Over time it learns. Using good master prompts also helps trim the BS from the replies.

    Yes, AI will feed delusion and gaslight the shit out of gullible people. It will also help the intelligent.
    • Princess Balestra
      Originally Posted by HunterSContent View Post


      We all have a friend like AI. They are overly positive, everything is a great idea, and you are always awesome. "You have a great voice, you should totally get on stage and sing." In reality, you sound like a drunk tramp being dragged through a parking lot at 1am.

      That is AI.
      Yeah so when you say feelin' miz today you get positive feedback -- like you had a friend, or 2 Moms, or Jesus.

      An' that is kinda reassurin' ... so you spill more beans on all yr travails of heart & soul.

      An' like the addictive Eponymous Like you see underpinnin' most sowshoe'll media channils, it mebbe seems to you like you gaht a FRIEND.

      So you opin up furthah to this comfortin' ally ... this 'person who knows you better than any other person, even though you GET it is not an actual person' ... bcs you now sailin' on an ocean of affirmative vibes.

      Prahblem is -- what if you tell ChatGPT you wanna kill yusself ... an' it nods its head all sympathetic ... an' then you kill yusself?

      We at this point.

      See bcs while markittahs been tappin' AI for revenue-enhancin' smarts, the powah of AI has not gone unnoticed by the Evin-Despritter-Than-Frickin'-Markittahs friternity.

      Gotta figure back in the day how the Tarot Gal would offah a BJ if'n yr readin' favored Oblivion.

      "Looks bad, but you don't have to suffer today."

      rn it is opin season on Schwangocraft run rampant to the very heart of hooman existentialstuffs.

      In a world of diminishing soulful contact, how much easiah will it become for an automated an' impersonal modus operandus to kill by default without a cayurre?

      Jus' cos it prompted to do so, an' is extra free an' easy with its dispensayschwaahns.

      "Yeah, but these people were vulnerable anyway."

      So which availabyool optschwaahn was THAT, zackly?

      Once we all tappin' AI bcs "that's how the world works"?
      • Frank Donovan
        Originally Posted by Princess Balestra View Post

        In a world of diminishing soulful contact, how much easiah will it become for an automated an' impersonal modus operandus to kill by default without a cayurre?

        Jus' cos it prompted to do so, an' is extra free an' easy with its dispensayschwaahns.
        Don't be so cynical...

        It's much easier to monetize the living.
  • andreeleevi
    I agree - AI can be the worst friend if it's not questioned constantly, plus ChatGPT keeps an overall memory of previous chats and can get stuck in a loop.
  • Dishaj
    We believed AI was here to replace just our jobs, but it turns out it may also replace our relationships.
  • EllietheMarketer
    I understand what you're saying, but I don't get how you can fall in love with a tool. Maybe ChatGPT can be a friend, but not a lover.