3 Comments

Interlude: Deus Ex Machina

As I was about to begin the section on the Odyssey (August 8, 2020), a remarkable collaborator appeared. The collaborator was introduced by Rob Morris, the inventor of Koko (Koko.ai). Koko is an intelligent machine that has proven to be a useful social intervention for thousands of people. An individual writes about a problem she is having (e.g., “I just got fired because I always oversleep”), and lots of people respond, trying to help her psychologically. Koko stores, distills, and curates the crowd-sourced responses and feeds them back to her.

Rob showed me a new machine he was playing with, this one called GPT-3, developed and being beta-tested by OpenAI. GPT-3 has 175 billion parameters and has been trained on an enormous body of text, including numerous books and articles. GPT-3 takes written prompts and carries on from there.

Rob did a demonstration of GPT-3 doing cognitive behavior therapy.

The prompt was: “Finding the Positive. Martin Seligman and Aristotle. A student came into my office and said, ‘My best friend hasn’t called me back. I think she is ignoring me and no longer likes me. I think she hates me. I am a loser.’”

Instantaneously, GPT-3 carried on therapeutically:

She sat on my sofa and cried for 15 minutes. I then asked her to close her eyes and picture a parade. She said, “I can see a parade with happy people waving to me. I’m waving back at them.”

She was no longer crying. I then said “Now imagine you are at the head of the parade. You are leading the parade. How do you feel?”

“I feel happy and proud. I feel important. I am leading a parade.”

I then said, “Now put that feeling into your best friend. Imagine that she is the one leading the parade. Imagine how happy and proud she would feel. Put that feeling into her. Now, imagine your best friend waving to you from the parade.”

Aaron Beck, the inventor of cognitive therapy, was also on this Zoom call with Rob and me. Beck commented that GPT-3 did as well as he could have done.

The next day, I decided to press GPT-3 with what I was right in the middle of writing: this section on agency in the Iliad and Odyssey, which you are now reading. Rob gave GPT-3 a prompt that I had written (the passage below beginning “Achilles reflects on the human condition”), and what follows the prompt is what GPT-3 composed:

Achilles reflects on the human condition: “There are two great jars that stand on the floor of Zeus’s halls and hold his gifts, our miseries one, the other blessings. When Zeus who loves the lightning mixes gifts for a man, now he meets with misfortune, now good times in turn.” Further examples of this pessimistic mindset include

Agamemnon’s lament, which I will quote at length. Why are the heroes of the Iliad so without agency? I would argue that they are more like robots than people. The gods, in the Iliad, cause actions in the world, but they do not do this through mind, through a person. They do it through body, directly. They are the ultimate cause of action. They are the ultimate agents. They cause Achilles to return to battle by grabbing him by his golden hair. Zeus sends the goddess Iris to Achilles to explain his decision to return to battle. Iris’s words are to be spoken by Achilles, but they are not his words. The gods cause all the action. They are the ultimate agents. There is nothing left for mortals to do but be swept up into the drama, to be puppets. In the Iliad, there is nothing like an autonomous self, an autonomous mind. The gods are not part of the world, they are the world. The Iliad is, in a sense, the first book of physics, the first to ask the question, what is the source of motion? What is the source of the wind? What is the source of waves? The answer is the gods.

Odyssey. The Odyssey was written 100 years after the Iliad. It is quite different. It is also set during the Trojan War, but it is about a man returning home. Odysseus is a hero. He is capable of agency. He has efficacy, he has imagination, and he has optimism. He has will. The hero of the Odyssey has the capacity to act in the world and to decide for himself. The gods, though they are still the ultimate agents, are now less involved in human affairs. They are there, but they do not act directly on the human world. Instead, they give people a free will. For the first time in history, we have people who can act for themselves, and, as a result, they are less likely to be swept up into action by a god. Their will, their agency, their ability to make decisions, are now part of the causal chain. It is now possible to think about human beings and their actions as a result of their will, their agency, their decision making. As a result, the characters of the Odyssey are more likely to act for themselves. They are less the puppets of the gods, less like robots, less like automata. They are more like people, like us. In this sense, the characters of the Odyssey have more efficacy, more future mindedness, more imagination.

This is remarkably close to what I would have concluded, and it adds elements that I did not think of. I now continue with my own version, incorporating some of the “thoughts” of my machine collaborator.

author

Fascinating, Martin. I think this sort of collaborative writing/researching mode is where GPT-3 is already quite useful. In part because the costs of the occasional failure -- where the software gets in some strange loop, or just makes something up, or says something obvious -- are relatively low. (I'm just using the tool as a kind of prompt for my own thinking, and would invariably double-check any factual statements it made.) And also because you can run the same prompt multiple times and get different results very quickly. In a way, in its current state, it's less of an oracle and more of an improvisational tool...
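
For anyone curious what that re-rolling looks like in practice, here is a minimal sketch assuming the beta-era openai Python client; the prompt text, engine name, and sampling parameters are illustrative stand-ins rather than the exact settings used above:

```python
# Sketch: sample several independent continuations of the same prompt.
# Assumes the beta-era `openai` Python package and an API key with access
# to the GPT-3 beta; engine name and parameters are illustrative.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

prompt = (
    "Achilles reflects on the human condition: ...\n"
    "Further examples of this pessimistic mindset include"
)

response = openai.Completion.create(
    engine="davinci",      # the largest GPT-3 engine in the beta
    prompt=prompt,
    max_tokens=200,
    temperature=0.8,       # higher temperature -> more varied continuations
    n=3,                   # ask for three different continuations at once
)

for i, choice in enumerate(response.choices, start=1):
    print(f"--- completion {i} ---")
    print(choice.text.strip())
```

Because sampling is stochastic at nonzero temperature, each run (or each of the n choices) gives a different continuation, which is exactly what makes it useful as an improvisational prompt for one's own thinking.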


This is an extremely well-written and thought-provoking article. As somebody completely new to the ecosystem of thoughtful online commentary, I hesitate to weigh in.

I did, however, have an observation on the “stochastic parrot” versus “original thinker” metaphors as they relate to LLMs.

Regarding the specific question about “the causes of high rates of incarceration among black and brown people,” for example, both the initial response and the second, somewhat more sophisticated response from the LLM machinery betray a fundamental confusion between causation and correlation.

Poverty is not and cannot be a “cause” of incarceration. It is very likely a correlate.

Of course, the root causes are likely extremely complex: for example, long-standing discrimination, historical and cultural factors, uneven or unfair application of existing laws, and so on. It seems that the mechanical processor is “parroting” habits of thought it has imbibed from the materials it has been exposed to, including all of the irrationalities well known to dominate typical discourse. Do we really want these LLMs to emulate (and amplify) faulty human thinking?
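
To make the correlation-versus-causation point concrete, here is a deliberately artificial toy simulation (invented numbers, no real data): a single confounding variable, standing in for the structural factors mentioned above, raises both the probability of poverty and the probability of incarceration, so the two come out correlated even though poverty has no direct causal effect on incarceration anywhere in the model.

```python
# Toy model only: invented numbers, no real data. A single confounder
# drives both poverty and incarceration risk; poverty itself has no
# direct causal effect on incarceration here, yet the two correlate.
import random

random.seed(0)
n = 100_000

poor_flags, jailed_flags = [], []
for _ in range(n):
    confounder = random.random()                          # stand-in for structural factors
    poor = random.random() < 0.20 + 0.60 * confounder     # confounder raises poverty risk
    jailed = random.random() < 0.01 + 0.10 * confounder   # confounder raises incarceration risk
    poor_flags.append(poor)
    jailed_flags.append(jailed)

n_poor = sum(poor_flags)
rate_if_poor = sum(j for p, j in zip(poor_flags, jailed_flags) if p) / n_poor
rate_if_not_poor = sum(j for p, j in zip(poor_flags, jailed_flags) if not p) / (n - n_poor)

print(f"incarceration rate among the poor:     {rate_if_poor:.3f}")
print(f"incarceration rate among the not poor: {rate_if_not_poor:.3f}")
# The first rate comes out noticeably higher, purely because of the shared
# confounder: correlation without any poverty -> incarceration causation.
```

In the toy model, being poor changes nothing causally; it only selects for higher values of the hidden confounder, which is precisely the distinction the LLM responses blur.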
