Anyone Can Turn You into an AI Chatbot

(wired.com)

12 points | by jatwork 13 hours ago

5 comments

  • zahlman 12 hours ago

    What an obnoxious clickbait title. Yes, anyone can create an AI chatbot that simulates you. That isn't turning you into anything; you remain yourself. But more to the point, are they really writing an entire article about such a banality? The URL slug implies that they think there's some kind of consent issue here. I can't fathom why. It's not any different from people spending their free time hypothesizing about what you might say in a given situation. In fact, it's pretty much exactly that, just with a computer program involved. Why would anyone expect to be able to prevent others from doing it?

    • add-sub-mul-div 11 hours ago

      Who the fuck did you think was going to read sorcery from this headline? This is very normal use of language. Why does every thread here get littered with comments doing the most useless policing of titles?

  • puttycat 13 hours ago
  • AStonesThrow 13 hours ago

    I used to play MUDs, MUCKs, and MUSHes for several decades. Naturally, once they incorporated programming languages, a few players ventured to write bots. (For my part, I implemented Conway's "Game of Life", which was synchronous and notorious for freezing up the server.)
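
    The original MUD implementation isn't shown, but a minimal sketch of Conway's "Game of Life" with a synchronous update (every cell recomputed in one blocking pass, the kind of loop that could freeze a single-threaded server) might look like:

    ```python
    from collections import Counter

    def life_step(live):
        """Compute the next generation from a set of live (x, y) cells."""
        # Count live neighbors for every cell adjacent to a live cell.
        counts = Counter(
            (x + dx, y + dy)
            for (x, y) in live
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        # A cell is alive next step if it has exactly 3 live neighbors,
        # or 2 live neighbors and it is already alive.
        return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

    # A "blinker" oscillates between a horizontal and a vertical bar.
    blinker = {(0, 1), (1, 1), (2, 1)}
    print(life_step(blinker))  # {(1, 0), (1, 1), (1, 2)}
    ```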

    One highly successful bot had an extensive inventory of reactions, triggers, actions, and absurd nonsensical sayings. He was quite beloved. I'm not sure that I was ever able to peek at the source code, but it was surely complex and expanded over many years of development. This bot was imbued with such perspicacious insight and timing that we often treated it as a sentient player in its own right. Indeed, it became one of the most prolific chatters we had, along with yours truly.

    Another time, one of our players went on vacation, call him "J"; and to fill the void, someone created "Cardboard J". And it was a very simplistic automatic bot, just loaded with one or two dozen sayings, but it was hilarious to us, because it captured the zeitgeist of this player, who didn't role-play and wasn't pretentious about his character; he just played himself.
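
    A "Cardboard J"-style bot needs little more than a few keyword triggers and a pool of canned sayings picked at random. A minimal sketch (all triggers and sayings here are hypothetical placeholders, not the original bot's):

    ```python
    import random

    # Keyword triggers: if a keyword appears in the message, reply with
    # its fixed response; otherwise fall back to a random canned saying.
    TRIGGERS = {
        "hello": "oh hey",
        "afk": "back in 5",
    }
    SAYINGS = ["eh, whatever", "brb coffee", "same as always"]

    def cardboard_reply(message: str) -> str:
        """Return a triggered response on a keyword match, else a random saying."""
        lowered = message.lower()
        for keyword, response in TRIGGERS.items():
            if keyword in lowered:
                return response
        return random.choice(SAYINGS)

    print(cardboard_reply("Hello everyone"))  # -> "oh hey"
    ```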

    Other players were known to keep extensive log files. I believe that sometimes the logs were published/leaked to places like Twitter, at least the most dramatic ones. I was involved in at least two scandals that were exposed when logs came to light.

    I can only imagine what it'd be like to interact with a chatbot trained on me for the past 30 years!

  • exodust 10 hours ago

    An ethically sound take on chatbots for the deceased, one without the consent issues, could be pets.

    Before your pet dies, have your pet properly scanned and recorded. The barks, the purring and various mannerisms.

    You could upload a bunch of carefully framed photos and recorded sounds, the service processes those to produce a highly realistic virtual pet you can interact with in various modes. Full Tamagotchi to fully automatic.

    Possibly unhealthy? Pets die, we should let go? Hard to say.