Readings: Simulating People

Thinking About People: Designing Games for Social Simulation, a GDC talk by Mitu Khandaker-Kokoris

Chapter 3: “Face”, an excerpt from Better Game Characters by Design: A Psychological Approach by Katherine Isbister, an industry-oriented textbook.

Chapter 2: “The Eliza Effect”, from Expressive Processing by Noah Wardrip-Fruin.
A critical and technical analysis of the famous chatbot.

5 comments

  1. Ralph

    All of the readings in some way address the issue of using technically limited AI to simulate complex social situations. They all advocate embracing this limitation in some way to reveal insights about social interaction more expressively than mimesis. I agree, and the fact that this viewpoint has to be presented as a counterpoint to the conventional attitude highlights the problematic overvaluing of technological over artistic achievement that is somewhat characteristic of games as an industry. No one finds a photograph inherently more interesting than a painting by virtue of its higher fidelity to visual realism. No gamer finds a realistic physics simulator like the Havok Engine as entertaining as Goat Simulator. It should be rather obvious that no one would find a perfect social simulator as engaging as abstracted, dramatized systems like Crusader Kings 2.

  2. Rachel M

    These readings all touch on how the kinds of emotions that are easily achievable, and the intensity of those emotions, differ depending on the kind of social strategy a developer uses. The unconscious social reactions in the player, like the facial feedback hypothesis, are the most interesting parts of these strategies for me. Part of the ineffable magic of a good work is manipulating visual design, as well as language, tone, mechanics, and dialogue, in ways subtle enough that the player does not notice the manipulation. The Eliza Effect is an example of how this can go south and how the “Autonomous Story and Authored Story” spectrum can tilt. In games, I think mechanics are the chief mode of metaphor, and using a mechanic to instill emotion or meaning is why the medium of games is chosen over another. This is what allows the flavor of visual design and language to have subtle weight: when there is so much occurring at once, a player can choose to let the psychological effects of the game’s atmosphere work on them. For example, dys4ia relies heavily on metaphoric mechanics, rather than visual design, to tell its story and deliver its meaning. This makes the game’s visual qualities “magic” in that they work as disorienting atmosphere without having to be actively contemplated to be understood.

  3. Gregory Rose

    The industry article on facial expressions reminded me of part of a talk by Shawn Achor at Google about how we spread emotions socially — https://youtu.be/Muce2TxDlMw?t=23m40s (relevant part 23:40-36:48). In a sense, others’ facial expressions convey their emotions, and our own emotions become substantially affected by the average of those around us. So, facial expressions build not just relationships between individual people but can transmit the mood of a large community of people.

  4. kewagner

    Using an AI to simulate complex social situations induces the Eliza effect. This is why narrated stories — whether they are narrated by the user, as in The Sims, or consist of predefined Twine branches (or any other kind of branching) — are more successful. The lack of autogeneration according to an algorithm and, more importantly, the limited time, scope, storyline, and expectations create a more vivid impression of a world than a shallow space that doesn’t let you get into the details. Additionally, having more information about a character does not increase depth of character (and therefore empathy for said character). Rather, we follow a character through a story arc and gauge personality through their reactions to situations. These reactions are transmitted in the form of emotion. TL;DR: inducing emotion is more important than adding information. This concept can be scaled for worlds, characters, …

  5. mfinn

    As I went over the readings this week, I couldn’t help but apply the HomePlay III example game Façade as a frame of reference. The game is visibly advanced compared to the early text-based program ELIZA studied in Chapter 2 of Expressive Processing by Noah Wardrip-Fruin. You are placed in a navigable, elaborate 3D environment and interact within the home of a couple whose relationship is apparently rocky. Throughout gameplay, the couple’s facial expressions are somewhat uncompromising, though this trait works in accordance with their strained relationship. What is comparable, however, is the AI-based feedback players receive from their contributions to the script of the game. You can speak to the couple within the game and expect a relevant reply. There is, of course, a limit to what can unfold within the developing script. As a player, you are able to type whatever you want and the couple will respond accordingly. If you begin to direct the conversation outside the bounds allotted by the software, however, the couple will redirect the conversation to something the program understands.
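The redirect behavior mfinn describes is essentially the keyword-transformation trick that Wardrip-Fruin dissects in ELIZA: match a few surface patterns in the input, transform them into canned replies, and fall back to a topic-steering line when nothing matches. A minimal sketch of that mechanism (the rules and the redirect line here are invented for illustration, not taken from Façade or ELIZA):

```python
import re

# ELIZA-style responder: a handful of keyword rules plus a fallback that
# "redirects" any input the program does not understand.
# All patterns and replies below are hypothetical examples.
RULES = [
    (re.compile(r"\bI am (.+)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bI feel (.+)", re.I), "Tell me more about feeling {0}."),
    (re.compile(r"\b(mother|father|family)\b", re.I),
     "How does your family figure into this?"),
]

REDIRECT = "Let's get back to the two of us. What do you think of the apartment?"

def respond(line: str) -> str:
    """Return a transformed canned reply if a rule matches, else redirect."""
    for pattern, template in RULES:
        m = pattern.search(line)
        if m:
            return template.format(*m.groups())
    return REDIRECT
```

Even a few rules plus a redirect can sustain a surprising impression of understanding, which is exactly the gap between surface behavior and internal process that the Eliza effect names.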
