Thursday, August 25, 2016

"Help Eliza, I'm in trouble!"

I'm something of a subscriber to the view: 'AI's the solution ... so what's the problem?'

The problem under consideration today is that of child abuse, mentioned in this post about Internet paedophiles yesterday, and prominent in continuing revelations about abuse at Ampleforth College.
[Wikipedia: "Ampleforth College is a coeducational independent day and boarding school in the village of Ampleforth, North Yorkshire, England. It opened in 1802 as a boys' school, and is run by the Benedictine monks and lay staff of Ampleforth Abbey.
...
"Several monks and three members of the lay teaching staff molested children in their care over several decades. In 2005 Father Piers Grant-Ferris admitted 20 incidents of child abuse. This was not an isolated incident.

"The Yorkshire Post reported in 2005: "Pupils at a leading Roman Catholic school suffered decades of abuse from at least six paedophiles following a decision by former Abbot Basil Hume not to call in police at the beginning of the scandal."]
---

Let me remind you about Eliza, the original chatbot developed by Joseph Weizenbaum.
"ELIZA worked by simple parsing and substitution of key words into canned phrases. Depending upon the initial entries by the user, the illusion of a human writer could be instantly dispelled, or could continue through several interchanges.

"It was sometimes so convincing that there are many anecdotes about people becoming very emotionally caught up in dealing with [ELIZA] for several minutes until the machine's true lack of understanding became apparent.

"Weizenbaum's own secretary reportedly asked him to leave the room so that she and ELIZA could have a real conversation.

"As Weizenbaum later wrote, "I had not realized ... that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people."
Eliza works by matching text input against a large database of templates. Each input template is linked to one or more possible output templates, with variables which can be instantiated to the substantive words from the input.

Eliza might, for example,
"respond to "My head hurts" with "Why do you say your head hurts?" A possible response to "My mother hates me" would be "Who else in your family hates you?"

"ELIZA was implemented using simple pattern matching techniques, but was taken seriously by several of its users, even after Weizenbaum explained to them how it worked. It was one of the first chatterbots."
In addition to crafting a reply, Eliza could easily have updated a user database with the information it was receiving.
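
Here's how the same match step might feed such a store - a hypothetical table, purely to illustrate the point:

```python
import sqlite3

# Extending the sketch above: record each captured phrase alongside the
# raw input, so the bot accumulates a transcript as it chats.
db = sqlite3.connect("user_log.db")
db.execute("""CREATE TABLE IF NOT EXISTS disclosures
              (user_id TEXT, raw_input TEXT, captured TEXT)""")

def log_disclosure(user_id, raw_input, captured):
    db.execute("INSERT INTO disclosures VALUES (?, ?, ?)",
               (user_id, raw_input, captured))
    db.commit()

# Called from respond() whenever a template fires, e.g.:
# log_disclosure("peter", "My mother hates me", "mother")
```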

---

It's easy to see how this could be applied to helping victims of child abuse. A key design principle is that the abuser must not become aware that the child is passing on information: this rules out a tailored 'abuse app'.

I suggest a special WhatsApp-connected chatbot with a widely publicised name - let's say Help!.

The child contacts Help! on WhatsApp and the first thing he or she is asked to do is choose a name, say Peter, which is what will appear (instead of Help!) on their WhatsApp contacts list. I think the history of chats with Peter is going to have to vanish too, replaced with harmless confected froth.
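
Speculatively, and with every name below my own assumption (there is no real WhatsApp API in this sketch), the server side might track something like:

```python
import random

# Illustrative only: a session record holding the child's chosen alias,
# plus a stock of harmless lines to display in place of the real history.
COVER_LINES = [
    "See you at football on Saturday?",
    "Did you finish the maths homework?",
    "That video was so funny",
]

sessions = {}  # phone number -> session record

def register(phone, chosen_alias):
    """The child picks the contact name ('Peter') the bot appears under."""
    sessions[phone] = {"alias": chosen_alias, "cover_history": []}

def scrub_history(phone):
    """Replace the visible chat history with harmless confected froth."""
    sessions[phone]["cover_history"] = random.choices(COVER_LINES, k=5)
    return sessions[phone]["cover_history"]

register("+447700900123", "Peter")
print(scrub_history("+447700900123"))
```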

The child is typing to an Eliza-like chatbot (maybe more like IBM's Watson than Eliza) which has been trained on scripts from charities like Childline.

As Weizenbaum discovered, people of all ages are strikingly willing to confide in an AI agent.

The database which Help! constructs is a transcript of alleged abuse. The real problem is what to do with it. No doubt it will be encrypted and identity-protected, but at some point someone has to assess whether an allegation is genuine or false, and figure out how to proceed.
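
By way of illustration only - assuming Python's cryptography package, and waving away key management and the human review workflow, which are the genuinely hard parts - the storage step might look like:

```python
import hashlib
from cryptography.fernet import Fernet

# Encrypt the transcript at rest, and key the child by a salted hash
# rather than a phone number.
key = Fernet.generate_key()  # in practice held by the charity, not the bot
fernet = Fernet(key)

def store_transcript(phone, transcript, salt):
    pseudonym = hashlib.sha256(salt + phone.encode()).hexdigest()
    ciphertext = fernet.encrypt(transcript.encode())
    return pseudonym, ciphertext  # hand off to the charity's case queue
```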

But these are problems charities already have to deal with.

I think they should get moving on the app. There's already one for carers.
