Now, as I think about it calmly, I can figure it out. It did exactly what I was trying to do. I wanted a machine that could understand man and that's all. “Do you understand me?” “Oh, yes!” answers the listener, and both go about their business, happy with each other. In conversation it's much easier. But in experiments with computers I shouldn't have confused understanding with agreement. That's why (better late than never) it's important to figure out what understanding is.
There is practical, or goal, understanding. You put in a program; the computer understands it and does what is expected of it. “Attack, Prince!” and Prince grabs the pants cuff of a passerby. “Gee!” and the horses turn to the right. “Haw!” and they go left. This kind of primitive understanding of the gee-haw type is accessible to many living and inanimate systems. It is controlled by achievement of the goal, and the more primitive the system, the simpler the goal must be and the more detailed the programmed task.
But there is another understanding: mutual understanding. A complete transferral of your information to another system. And for this, the system receiving the information must not be any simpler than the system giving the information. I didn't give the computer a goal. I was waiting for it to finish building itself and making itself more complex. But it never finished — and that's natural. Its goal became the complete understanding of my information, not only verbal, but all of it. (The goal of a computer — that's another loose concept that shouldn't be played with. Simply put, information systems behave according to certain laws that somewhat resemble the rudiments of thermodynamics. In my system the sensors, crystal units, and TsVM-12 had to reach an informational equilibrium with the environment — just as an iron ingot in the oven must achieve temperature equilibrium with the coals. This equilibrium is mutual understanding. And it cannot be achieved on the level of circuitry nor on the level of simple organisms.)
And that's how it happened. Only man is capable of mutual understanding with man. And for good mutual understanding, a close friend. My double was the product of informational equilibrium between the computer and me. But, incidentally, the pointers on the informational scales never did match up. I wasn't in the lab then and didn't meet face to face with my newly hatched double. And later everything went differently for us anyway.
In a word, it was horrifying how poorly I had run the experiment. The only point in my favor was that I had finally thought of setting up the feedback mechanism.
An interesting thought: if I had run the experiment strictly, logically, throwing out dubious variants, would I have gotten the same results? Never in my life! I would have come up with a steady, sure-fire Ph.D. thesis, and nothing more. In science, mostly mediocre things happen — and I was prepared for mediocrity.
So everything was all right? Why does sadness gnaw at me? Why do I keep harping on my mistakes? I succeeded. Because it didn't go by the rules? Are there any rules for discoveries? Much happens by accident that you can't put down to your scientific vision. What about Galvani's discovery, or X-rays, or radioactivity, or electronic emissions, or any discovery that is the basis of some science or other and is related to chance? I still don't understand a lot of it? That's the situation with many scientists. Nothing to be upset about. Then why this self-torture?
I guess the problem is something else: you can't work that way now. Science has become very serious now, not like in the days of Galvani and Roentgen. This is the way, without thinking, that you can come up with a force that can destroy the whole world instantly — with a brilliant experimental proof….
My double came out of the bathroom rosy pink and in my pajamas and settled in front of the mirror to comb his hair. I stood behind him. Two identical faces stared out from the mirror. Only his wet hair was darker.
He took out the electric razor from the closet and plugged it in. I watched him shave and almost felt that I was visiting him; his behavior was so casual and at-home. I couldn't resist speaking up:
“Listen, do you at least realize how unusual this situation is?”

“What? Don't bother me!” He was obviously beyond being interested in the fact.
The graduate student put down the diary and shook his head: well, Valentin the Original didn't know people very well.
He had also been in shock. He had come to his senses in the tank understanding everything: where he was and how he had gotten there. Actually, his discovery began then. And his insolence was only a cover-up. He was searching for a mode of behavior that would keep him from being reduced to a lab guinea pig.
He picked up the diary.
“But you appeared from a machine, not from a mother's womb! From a machine, do you understand?”