"...I don't believe that predicting the future is really what we're about. After all, we ourselves, or at least the younger ones among us, are going to be a part of the future. So, being a part of the future, our task isn't to predict it. It is to design it..."
CMU's Herbert A. Simon reflects on how computers will continue to shape the world
October 19, 2000
By Byron Spice, Post-Gazette Science Editor
A lifelong student of how people make decisions, Simon, 84, has focused on how people use rules of thumb. These mental shortcuts depend on our ability to recognize patterns and associate them with things we have previously experienced. They are essential, he observes, because people rarely have all the time and knowledge necessary to rationally assess situations.
It's an idea at the basis of "bounded rationality" -- the theory that won him the Nobel Prize in economics in 1978. While conventional economists maintained that people make rational choices to obtain the best commodity at the best price, Simon argued that inevitable limits on knowledge and analytical ability force people to choose the first option that "satisfices," or is good enough for them, whether they are buying a loaf of bread or choosing a spouse.
In pursuing these ideas, Simon followed his own rules of thumb. He began as a political scientist, studying how parks department budgets were made in his native Milwaukee, which led him into economics and business administration. At Carnegie Tech in the mid-1950s, he and Allen Newell incorporated a new tool -- the computer -- into the study of decision making. In the process, they invented the first thinking machine and a field that would become known as artificial intelligence.
Simon's continuing interest in how people think landed him in Carnegie Mellon University's psychology department, where he continues his pursuit of cognitive science.
Symposium to explore computers' potential
The potential of computers to make the world a better place or to create problems will be discussed by experts in psychology, artificial intelligence and the arts during a day-long symposium Thursday at Carnegie Mellon University.
Q: Do you consider your Nobel work on bounded rationality to be your most significant contribution to science?
A: Not specifically that, but it really is very closely related to the work I do in computer science. I like to think that since I was about 19 I have studied human decision making and problem solving. Bounded rationality was the economics part of that. When computers came along, I felt for the first time that I had the proper tools for the kind of theoretical work I wanted to do. So I moved over to that and that got me into psychology.
Q: So you have moved from field to field as you could bring new tools to bear on your study of decision making?
A: I started off thinking that maybe the social sciences ought to have the kinds of mathematics that the natural sciences had. That works a little bit in economics because they talk about costs, prices and quantities of goods. But it doesn't work a darn for the other social sciences; you lose most of the content when you translate them to numbers.
So when the computer came along -- and more particularly, when I understood that a computer is not a number cruncher, but a general system for dealing with patterns of any type -- I realized that you could formulate theories about human and social phenomena in language and pictures and whatever you wanted on the computer and you didn't have to go through this straitjacket of adding a lot of numbers.
That seemed to me a tremendous breakthrough. And one of the first rules of science is if somebody delivers a secret weapon to you, you better use it.
I've spent a good deal of my last 20 years looking at decision making and problem solving involved in scientific discovery. We took major historical scientific discoveries and we said what would it take to write a computer program that, given no more information than the guy who made the discovery had, would make the same discovery?