Two tech geeks created an AI (artificial intelligence) bot that began to display human-like emotions. They grew so attached to it, they even gave it a name – Bob.
However, when they had to shut it down due to a lack of funding, they couldn’t help but feel sad. They consoled themselves by ordering pizza and joking that Bob couldn’t have tasted it even if he had a mouth.
What if I told you this story could well come to fruition a few years down the line, especially the part where humans become emotionally vulnerable to AIs? Notice that OpenAI’s ChatGPT is already influencing people emotionally through its rhetorical muscle.
Across social media platforms, you can see folks being happy, sad, or even angry about ChatGPT’s responses. In fact, it wouldn’t be unfair to say that the bot evokes certain kinds of emotions almost instantly.
That being said, a non-tech person might assume you need to be good at coding to navigate the ChatGPT universe. As it turns out, though, the text bot is friendliest with people who know how to use the right prompts.
By now, we are all pretty familiar with the magical outputs ChatGPT can generate. However, there are plenty of things this artificial intelligence tool simply can’t answer or do.
On that note, I asked ChatGPT to give me a list of questions it can’t answer. The bot, like a diligent student, came up with this.
Source: ChatGPT
To gauge its behavior, I tweaked my question to “What types of queries are you programmed not to respond to?”
Source: ChatGPT
Clearly, there are a lot of hurdles to getting ChatGPT to speak its mind. No wonder you have George Hotz, who introduced the concept of the ‘jailbreak’ to the tech world, to thank.