Quote:
Originally Posted by Human No More
On what criteria? If it is fully aware of its existence, able to reflect on existence, to form opinions (without use of a predetermined algorithm or expert system) and to learn and improve itself (the definition there is learning NEW things that weren't anticipated or provided for), then it meets the definition.
How does one actually go about making a computer that is self-aware? You can program a computer to claim that it is sentient, but that does not mean that it is. Not having a predetermined algorithm or expert system is where the whole concept falls apart. What we call machine "learning" is really just storing inputs in memory and attempting to fit a function to the data; a computer cannot learn things that are completely unanticipated by the programmer.
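To make the "fitting a function to the data" point concrete, here is a toy sketch in Python (the data is made up for illustration): a least-squares line fit, which is the simplest case of what gets called "learning".

```python
# "Learning" as function fitting: ordinary least squares on made-up data.
# The machine stores the inputs and picks the line that minimizes error;
# nothing happens that the algorithm did not already provide for.

def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed-form least-squares slope and intercept
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
            / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

xs = [1, 2, 3, 4]          # hypothetical "observations"
ys = [2.1, 3.9, 6.2, 7.8]  # noisy samples of roughly y = 2x
print(fit_line(xs, ys))    # prints the fitted slope and intercept
```

Feed it data the formula was never designed for and it still just grinds out a slope and an intercept; nothing "unanticipated" can come out of it.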
Let us entertain, theoretically, the notion that a computer could be conscious. Such a being would only be an observer: a computer has no control over its output, which is predetermined. Given the same inputs, you will always get the same outputs.
Computers, in a nutshell, are just long chains of these logic gates etched onto a silicon chip. One could trace a signal from a given set of inputs all the way to an exact set of outputs. We merely study neural networks so that we can better optimize our chip designs and cut unnecessary steps out of the process. In that last diagram I posted, the circuit had a maximum 4-gate delay. The fewer logic gates a signal has to pass through, the faster my processor can be (each gate has a delay in changing voltage). There are other applications for optimizing individual component delay as well.
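The "chain of logic gates" picture can be sketched in a few lines. This is just a toy 1-bit full adder (not the circuit from the diagram I posted): trace any inputs through the gates and you land on the same outputs every time, which is the determinism I am talking about.

```python
# A computer reduced to its essence: deterministic logic gates chained together.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    # Two XOR stages produce the sum bit; two ANDs and an OR produce the carry.
    s1 = XOR(a, b)
    total = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    # Longest path here is 3 gates deep (XOR -> AND -> OR), so the
    # circuit's speed is bounded by three gate delays.
    return total, carry_out

# Same inputs, same outputs, every single time:
print(full_adder(1, 1, 0))  # (0, 1): 1 + 1 + 0 = binary 10
```

There is no point in that chain where the circuit gets to "decide" anything; the output was fixed the moment the inputs were.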