Comments on “Confessions of ChatGPT: ‘Interview’ with bot reveals bizarre nature of questions — but I may have been fooled”

  1. Sue Donem says:
    05/11/2023 at 3:03 PM

    Can you answer this question using Glossolalia?

  2. Craig Russell says:
    05/11/2023 at 4:27 PM

    I don’t understand why you would believe that ChatGPT is actually reporting real questions it’s been asked by real people in the past. It specifically says (as you quote in this article) that it doesn’t retain records or memory of past conversations. What it’s clearly doing is what it always does: making up answers to your questions that sound convincing. I’ve played around with ChatGPT enough (on research topics I have a Ph.D. in) to recognize that you just can’t rely on any of its answers being based on any facts at all, and that includes the answers to any questions you ask it in an “interview”.

    1. Hirsh says:
      05/11/2023 at 5:28 PM

      This was the most intelligent thing I found on this page. Thank you!

    2. Steve Fink says:
      05/11/2023 at 11:00 PM

      Included in the story is an image of the segment of the conversation where ChatGPT said the questions it shared were real questions asked by real users. As mentioned in the article, ChatGPT can share false information, and that may be the case here. But we also know that it does retain chat logs. What it does with those conversations, and whether or not they’re used to shape future responses, isn’t clear. And of course, because we live in a world where there are folks who may very well be frightened at the sight of spaghetti, nothing is out of the question.

      1. Dave smith says:
        05/12/2023 at 11:23 AM

        This isn’t how large language models work. The model has trained weights, and it has the context of your own conversation with it; it does not have the context of other users’ conversations while it is replying to you.
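        A minimal sketch of that point in Python, using a purely hypothetical call_model() placeholder rather than any real provider’s API: the model’s only inputs are its fixed trained weights (server-side) and the message history of your own conversation, which the client resends on every turn.

          # Hypothetical sketch: call_model() is a stand-in, not any real chat API.
          def call_model(messages):
              # A real service would generate text from its trained weights plus
              # `messages`; here we only echo, to show what the model is given.
              return f"(reply based on the {len(messages)} message(s) of this conversation only)"

          conversation = []  # the only "memory" is this client-side list

          def ask(user_text):
              conversation.append({"role": "user", "content": user_text})
              reply = call_model(conversation)  # other users' chats are never in scope
              conversation.append({"role": "assistant", "content": reply})
              return reply

          print(ask("What is the meaning of life?"))

        Nothing outside that conversation list ever reaches the model in this sketch, which is why it cannot report back other users’ questions.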

  3. Will S. says:
    05/11/2023 at 4:41 PM

    Darn, I was hoping the answer to the meaning of life was actually going to be “42”!

  4. T.C. says:
    05/11/2023 at 5:12 PM

    “unfortunately it will not contact authorities.” Some people just crave that boot on the neck.

    1. Robert Israel Kabakoff says:
      05/12/2023 at 6:47 AM

      Indeed. See Christopher Hitchens on religion.

  5. Ben says:
    05/11/2023 at 7:04 PM

    I tested this and a few similar questions myself, asking it to tell me the most common questions asked by users, the most unique requests it’s received from users to create ideas for short stories, and a sampling of any questions asked by users over the last 3 months. ChatGPT gave me responses for each, just as you received, but told me repeatedly that the questions, prompts and ideas it provided were not based on any users’ actual questions, as it does not have the ability to retain that sort of information, nor would it be ethical to provide it to me. It also added that the responses it gave me were generated in reply to my prompt, producing what it ‘imagined’ to be common questions or unique prompts.

  6. Edward says:
    05/11/2023 at 8:13 PM

    The Reverse Turing Test: discovering that the AI has blown past an individual’s capacity to comprehend an interaction. As mentioned above, ChatGPT just makes stuff up, and your own article contradicts your claim that ChatGPT can remember any questions. Or is that just another fabrication of the AI? We’ll never know. But we do know you failed the Reverse Turing Test, even if only deliberately, for a few more cheap clicks. Good job, slave to the algorithm!

    1. Fred says:
      05/12/2023 at 11:09 AM

      Maybe the entire article was written by an AI. “Write an article in which a journalist asks ChatGPT about what questions people ask…”

  7. Kevin says:
    05/12/2023 at 2:18 AM

    The AI’s response to the meaning of life is existentialism: “It is up to each individual to discover their own sense of purpose and meaning in life, based on their own experiences, beliefs, and values.”

  8. emmiebufkin.c.j.a.u.62.1 says:
    05/12/2023 at 6:02 AM

    hmmmmm

  9. Htos1av says:
    05/12/2023 at 8:45 AM

    Please calculate, to the final digit, the value of pi.
    JUST STFU, and do it… 🙂

  10. Danny says:
    05/12/2023 at 10:10 AM

    I’m very interested in gambling on these theoretical animal fights with ChatGPT as the final decision maker.

  11. Michael Plishka says:
    05/12/2023 at 12:25 PM

    While ChatGPT won’t answer many unethical questions, Google will. Just because it’s more conversational than Google doesn’t mean that it’s more powerful from a useful-information standpoint. In fact, as you’ve pointed out, and as many have found out themselves, from a pure knowledge standpoint it’s wise to double-check what the chatbot says through more traditional means. Still, fun article! Thanks for sharing!

  12. Vendicar says:
    05/12/2023 at 1:39 PM

    ChatGPT has no persistent internal state.

    It is not possible for it to replay any questions it has been asked, because those questions are not part of any persistent internal state. Everything is wiped between sessions.

    So this article is nonsense.

  13. grandma mcgee says:
    05/12/2023 at 3:51 PM

    I don’t think those are actual questions. I asked ChatGPT the same thing, and one of the questions was “why does 7 sound like a hairy J” and another was “how come purple tastes like a neutron star.” Maybe people are typing nonsense questions, but those seem particularly nonsensical.

  14. PJ London says:
    05/17/2023 at 3:24 AM

    People are morons!
    If you think that ChatGPT will tell you the truth, I have a bridge in Brooklyn waiting for you.
    Google lies and keeps history basically forever.
    The Government lies and keeps data about you basically forever.
    My God, even Toyota (and every other manufacturer) lies and keeps history basically forever.
    But “Trust me, I am a computer.”
    Are you insane?
    At this time, a crime involves an action, not a question or thinking about an action.
    Soon ‘Thought Crime’ will become punishable (see ‘Hate crimes’, for example), and ChatGPT, Google and all the other ‘Big Brother’ spies will be reporting it, “for your own protection” of course.

