1972

PARANOID PARRY

“The best performance overall in HMC (human-machine conversation) has almost certainly been Colby’s PARRY program, since its release on the [Arpanet] around 1973,” wrote AI scientists Yorick Wilks and Roberta Catizone in 1999. “It was robust, never broke down, always had something to say and, because it was intended to model paranoid behavior, its zanier misunderstandings could always be taken as further evidence of mental disturbance. . . .”

In 1972, psychiatrist Kenneth Colby (1920–2001) developed PARRY, a computer program intended to emulate a person with paranoid schizophrenia. Specifically, PARRY was developed to test theories of paranoid thinking: the AI appeared to harbor delusions about the Mafia, and its knowledge representation encoded feelings of inadequacy and defensive reactions to certain interview questions (people with paranoid schizophrenia are highly suspicious of others’ motivations). Colby hoped that PARRY could be used to teach students, and he also believed that paranoid patients’ sentences were produced by an underlying organized structure of rules that could be taught to computers, studied, and used to treat patients.

PARRY handled conversations by assigning weights to various dialogue inputs. Interestingly, psychiatrists who interacted with PARRY via text failed to perceive that they were interviewing a computer program, and they also could not identify which “patients” were human and which were computer programs. Perhaps PARRY had come close to passing the Turing test, at least in a special setting (i.e., interacting with a simulated person who is not sane). PARRY was also available on the Arpanet (a predecessor of the Internet), where it engaged in more than 100,000 sessions, including sessions with the ELIZA psychotherapist.
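The idea of weighting dialogue inputs can be illustrated with a minimal sketch. This is a hypothetical toy in the spirit of PARRY, not Colby’s actual program: keywords in the interviewer’s input (invented here for illustration) raise internal emotion levels such as fear or anger, and the dominant emotion selects the reply.

```python
# Hypothetical sketch of weight-based dialogue handling in the spirit
# of PARRY (not Colby's actual code). Keywords adjust internal emotion
# levels; the dominant emotion picks a canned paranoid response.

AFFECT_WEIGHTS = {            # assumed keyword -> (emotion, weight)
    "mafia": ("fear", 3),
    "police": ("fear", 2),
    "crazy": ("anger", 3),
    "hospital": ("mistrust", 1),
}

REPLIES = {                   # reply chosen by the dominant emotion
    "fear": "You can't trust anyone. They are watching me.",
    "anger": "I am perfectly sane. Why do you ask that?",
    "mistrust": "Why do you want to know?",
    "neutral": "Go on.",
}

def respond(text, levels=None):
    """Update emotion levels from keywords in `text`; return a reply."""
    levels = dict(levels or {"fear": 0, "anger": 0, "mistrust": 0})
    for word in text.lower().split():
        if word in AFFECT_WEIGHTS:
            emotion, weight = AFFECT_WEIGHTS[word]
            levels[emotion] += weight
    dominant = max(levels, key=levels.get)
    key = dominant if levels[dominant] > 0 else "neutral"
    return REPLIES[key], levels
```

A question mentioning the Mafia would raise the fear level and draw a suspicious reply, while a neutral remark yields a noncommittal one; carrying `levels` across turns lets suspicion accumulate over a session, loosely mirroring how PARRY’s simulated paranoia escalated.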

In 1989, Colby started a company, Malibu Artificial Intelligence Works, that marketed a therapy program for depression. This program would be used by the US Navy and Department of Veterans Affairs and was distributed to people who (controversially) used it without consulting a trained psychiatrist. Colby told one skeptical journalist that his depression program could be better than human therapists because “after all, the computer doesn’t burn out, look down on you, or try to have sex with you.”

SEE ALSO Turing Test (1950), ELIZA Psychotherapist (1964), Ethics of AI (1976)