Peter was quick to point out to those in the audience from academia that he wasn’t doing ‘proper AI’; this was a game, and he was interested in results.
Most games don’t care about the player; Peter asked whether you can feel loved by a game. In part he proposed achieving this through a great story, recognising that good character interactions with NPCs can generate a feeling of being cared for, and the team hopes to achieve that result through AI. The game (Fable 2) has a better memory of the player’s presence and actions. You can also gain a family and children, giving long-term engagement.
The core of the discussion was their choice of a dog as the AI caring presence. The reason was quite basic: it’s much simpler to do really well. Peter almost immediately came out with the brilliant statement that ‘…the tongue engine isn’t quite right…’.
The dog has three basic rules: first, I will not aggravate you; second, I will try to help; third, I will look after myself. It has great impact through use of the controller, but you have no direct control over the dog. It acts as an emotional cue and guide (there is no mini-map, and you don’t need one). Peter started to think of the in-game presence as ‘us’. You have the ability to hurt your dog and walk away, but it will faithfully try to return to you. In the short demo walkthrough, Peter engaged a couple of small baddies but got the controller a bit wrong and shot his dog by mistake. Instead of healing the dog, which he could do, he switched the point of view to the dog and showed himself walking away from the limping, whimpering dog that was trying to follow its master. It was a surprisingly powerful demonstration of how blatantly you can manipulate emotion and get away with it (and I don’t even own a dog).
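A priority-ordered rule system like the dog’s three rules might look something like the sketch below. All names and conditions here are my own illustrative assumptions, not Lionhead’s actual code; the only thing taken from the talk is the ordering: don’t aggravate the player, then help, then self-care.

```python
def choose_action(world):
    """Evaluate the dog's rules in priority order; return the first action
    that applies. `world` is a dict of hypothetical sensor flags."""
    # Rule 1: I will not aggravate you -- suppress behaviour that would
    # intrude while the player is occupied.
    if world.get("player_busy"):
        return "stay_quiet"
    # Rule 2: I will try to help -- flag threats or points of interest,
    # standing in for a mini-map.
    if world.get("threat_nearby"):
        return "bark_at_threat"
    if world.get("treasure_nearby"):
        return "lead_to_treasure"
    # Rule 3: I will look after myself -- only when nothing above applies.
    if world.get("injured"):
        return "limp_and_whimper"
    return "follow_player"
```

The point of the ordering is that self-preservation never overrides serving the player, which is exactly what makes the wounded-dog demo so affecting: the injured dog still tries to follow.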
More prosaically, Peter noted that possibly the easiest way to generate empathy is for NPCs to remember you and your actions. How many times have we gone into a shop only to be faced with the same four default options of buying, selling, chatting, or leaving?
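The idea of an NPC that remembers the player can be sketched very simply. This is a hypothetical illustration of the principle, not how Fable 2 implements it; the class, event names, and greetings are all invented for the example.

```python
class Shopkeeper:
    """Toy NPC that remembers player actions and varies its greeting."""

    def __init__(self, name):
        self.name = name
        self.memory = []  # remembered player actions, most recent last

    def witness(self, event):
        """Record something the player did in or near the shop."""
        self.memory.append(event)

    def greeting(self):
        """Greet the player based on remembered actions, instead of
        always offering the same default menu."""
        if "theft" in self.memory:
            return "I'm watching you this time."
        if self.memory.count("purchase") >= 3:
            return "Welcome back, my favourite customer!"
        if self.memory:
            return "Good to see you again."
        return "Welcome, stranger."
```

Even this trivial amount of state is enough to break the “same four options” feeling the talk complained about.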