Weizenbaum published this sample exchange in a journal article that explained how the chatbot worked. It looked at the user input and applied a set of rules to generate a plausible response. He called the program Eliza, after Eliza Doolittle in Pygmalion. The cockney flower girl in George Bernard Shaw’s play uses language to produce an illusion: she elevates her elocution to the point where she can pass for a duchess. Similarly, Eliza would speak in such a way as to produce the illusion that it understood the person sitting at the typewriter. “Some subjects have been very hard to convince that Eliza (with its present script) is not human,” Weizenbaum wrote. In a follow-up article that appeared the next year, he was more specific: one day, he said, his secretary requested some time with Eliza. After a few moments, she asked Weizenbaum to leave the room. “I believe this anecdote testifies to the success with which the program maintains the illusion of understanding,” he noted.

Eliza isn’t exactly obscure. It caused a stir at the time – the Boston Globe sent a reporter to go and sit at the typewriter and ran an excerpt of the conversation – and remains one of the best known developments in the history of computing. More recently, the release of ChatGPT has renewed interest in it. In the last year, Eliza has been invoked in the Guardian, the New York Times, the Atlantic and elsewhere.

The reason that people are still thinking about a piece of software that is nearly 60 years old has nothing to do with its technical aspects, which weren’t terribly sophisticated even by the standards of its time. Rather, Eliza illuminated a mechanism of the human mind that strongly affects how we relate to computers.

Early in his career, Sigmund Freud noticed that his patients kept falling in love with him. It wasn’t because he was exceptionally charming or good-looking, he concluded. Instead, something more interesting was going on: transference. Briefly, transference refers to our tendency to project feelings about someone from our past on to someone in our present. While it is amplified by being in psychoanalysis, it is a feature of all relationships. When we interact with other people, we always bring a group of ghosts to the encounter. The residue of our earlier life, and above all our childhood, is the screen through which we see one another.

This concept helps make sense of people’s reactions to Eliza. Weizenbaum had stumbled across the computerised version of transference, with people attributing understanding, empathy and other human characteristics to software. While he never used the term himself, he had a long history with psychoanalysis that clearly informed how he interpreted what would come to be called the “Eliza effect”.

As computers have become more capable, the Eliza effect has only grown stronger. Take the way many people relate to ChatGPT. Inside the chatbot is a “large language model”, a mathematical system that is trained to predict the next string of characters, words, or sentences in a sequence. What distinguishes ChatGPT is not only the complexity of the large language model that underlies it, but its eerily conversational voice. As Colin Fraser, a data scientist at Meta, has put it, the application is “designed to trick you, to make you think you’re talking to someone who’s not actually there”.

But the Eliza effect is far from the only reason to return to Weizenbaum. His experience with the software was the beginning of a remarkable journey. As an MIT professor with a prestigious career, he was, in his words, a “high priest, if not a bishop, in the cathedral to modern science”. But by the 1970s, Joseph Weizenbaum had become a heretic, publishing articles and books that condemned the worldview of his colleagues and warned of the dangers posed by their work. Artificial intelligence, he came to believe, was an “index of the insanity of our world.”

Today, the view that artificial intelligence poses some kind of threat is no longer a minority position among those working on it. There are different opinions on which risks we should be most worried about, but many prominent researchers, from Timnit Gebru to Geoffrey Hinton – both ex-Google computer scientists – share the basic view that the technology can be toxic. Weizenbaum’s pessimism made him a lonely figure among computer scientists during the last three decades of his life; he would be less lonely in 2023.

There is so much in Weizenbaum’s thinking that is urgently relevant now. Perhaps his most fundamental heresy was the belief that the computer revolution, which Weizenbaum not only lived through but centrally participated in, was actually a counter-revolution.
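Weizenbaum described Eliza as applying a set of rules to the user’s input to generate a plausible response. A minimal sketch of that kind of pattern-matching chatbot might look like the following; the patterns and canned replies here are invented for illustration and are not Weizenbaum’s original DOCTOR script:

```python
import re

# Each rule pairs a regular-expression pattern with a response template.
# These example rules are hypothetical, not from the original program.
RULES = [
    (re.compile(r"\bI need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
]


def respond(user_input: str) -> str:
    """Apply the first matching rule; fall back to a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            return template.format(*match.groups())
    return "Please go on."


print(respond("I am worried about my job"))
# -> How long have you been worried about my job?
```

Echoing fragments of the user’s own words back as questions is what made the illusion of understanding so effective: the conversation’s content is supplied almost entirely by the human.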