ChatGPT: Hammer or Screwdriver?
For complex systems, one simple story does not make all other stories false. This feature can apply to the purpose of the system as well as its description.
Continuing to explore my many stories about complex systems. See this first post for an introduction.
For complex systems, one simple story does not make all other stories false. This feature can apply to the purpose of the system as well as its description. The convention is to take the purpose of a system to be whatever its designer declared it to be. Some truth in that. Another story with some truth in it is that the purpose of a system is what it is observed to do. Some truth in that too.
This framing was first noted by an early systems scientist, Stafford Beer. It was a perspective successful enough to earn its own acronym, POSIWID (the Purpose Of a System Is What It Does). Sadly, that perspective has not moved much beyond the complex systems community. I say sadly because it removes a lot of frustration when dealing with complex systems.
In the "designer has the last word" perspective, any deviation from the plan is seen as a failure. In the perspective where the system's own story has some truth to it, deviations from the plan are seen as a normal feature of a complex system. The system is successful, just not quite in the way the designer planned.
In my experience, treating deviation as a success can lead to new opportunities. I once started writing a Scrabble-playing program on an early personal computer, and ended up writing the first personal computer spelling checker. I noticed what the system was doing effectively (creating a compressed dictionary), rather than insist it do what I had originally intended (play Scrabble).
When the deviation is negative from the design point of view, the impulse is to treat the system as broken. For simple systems, this is an effective strategy. Logically deduce which component is not working as intended, and replace it. For complex systems, this can be a frustrating exercise. Sometimes it is better to treat the system as successful, but not quite in the way you intended.
ChatGPT is a good example of this feature of complex systems. My go-to story about ChatGPT is that it is a storyteller. Its purpose is to mimic a human telling plausible stories about what it read on the Internet. It does this with 99.99% accuracy. Despite this success story having a lot of truth to it, many stories about ChatGPT focus on failure. I would compare that to building an effective screwdriver, then complaining that it fails as a hammer.
The corporations footing the bill for ChatGPT would like it to be a different kind of system: a database lookup engine that only returns approved facts, or a spreadsheet program that does reliable calculations. It seems kind of obvious to me that a system that tells you stories about what it read on the Internet is going to be a bit liberal with what counts as a fact, or a bit hazy on mathematics. An accurate reflection of the source material.
Maybe not what the sponsors wanted, but a success in its own terms. In some contexts, ChatGPT can be more successful in telling you about the world than a database lookup engine or a maths processor. Those kinds of contexts often occur when talking about complex systems, the topic of these posts. I’m impressed that I can give ChatGPT this post, and get it to rewrite it in many different styles and viewpoints. That is a valuable tool for me. A system that only ever gave the one “correct” version would be less valuable to me.
Maybe the ChatGPT system can be forced into something more corporate. Bolting on external checking or multi-step processing can move in that direction (a rough sketch of the idea follows below). I also believe in emergent properties of complex systems, so I am prepared to be surprised; bigger and better versions might yet learn logic and planning. The screwdriver may yet learn to be a hammer. That said, I tend towards the camp that says LLMs need some serious design changes before they get good at logic and planning (see Yann LeCun).
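To make "bolting on external checking" a little more concrete, here is a minimal Python sketch, not anyone's actual product architecture. The `llm_suggest` callable is a hypothetical stand-in for any language-model call; the point is only that the storyteller's plausible answer gets overruled by a deterministic calculator.

```python
# Sketch of external checking: the model proposes, a calculator verifies.
import ast
import operator

SAFE_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
}

def safe_eval(expr: str) -> float:
    """Evaluate simple arithmetic without trusting the model's answer."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.BinOp) and type(node.op) in SAFE_OPS:
            return SAFE_OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        raise ValueError("unsupported expression")
    return walk(ast.parse(expr, mode="eval"))

def checked_answer(question: str, expression: str, llm_suggest) -> str:
    """Ask the model, then correct it with the external check if they disagree."""
    claimed = llm_suggest(question)     # the plausible story
    verified = safe_eval(expression)    # the reliable calculation
    if str(verified) != claimed.strip():
        return f"{verified} (model said {claimed!r}, corrected by calculator)"
    return claimed

# Example with a deliberately wrong "model":
print(checked_answer("What is 17 * 23?", "17 * 23", lambda q: "389"))
```

The design point is the division of labour: the language model stays a storyteller, and the reliability is supplied by a separate component that never tells stories.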
So, a takeaway from this post. Complex systems have many stories, many with some truth in them. For human-designed systems, the designer's story is clearly a compelling one. But it is not the only one, or even the one with the most truth in it. As a systems designer, I am very familiar with systems I designed taking on a life of their own, going in directions I didn't expect. A much more interesting world than one in which everything goes according to plan.
See the first post for an introduction to this series. See the home page for previous posts on the topic. Of particular relevance is this one on Emergent Properties. For more about the author, see the About page.