Biological evolution is linear; technological evolution is exponential. Some leaders in the field say we are no more than twenty years away from the point where AI surpasses human ability.
Human ability to do what? Figure taxes, maybe. Drive a car, maybe. Paint the ceiling of the Sistine Chapel, no. Reproduce, no. Program itself, no. Write a good novel, no.
I don't know; think about how far we have come over the last 50 years compared to human existence as a whole. Hell, I think Apollo 11 had a 64 KB computer, and my phone has a better processor than the one that put fuckers on the moon. That's crazy.
Technology will continue to advance. But to take over from humans and supplant them . . . not a chance.
The first computer I ever owned, a Packard Bell 386 with a 100-megabyte hard drive and 2 megabytes of RAM, was a mainframe supercomputer compared to what they had on Apollo 11. And I wouldn't have depended on it to get me to the moon and back.
Well, Paul Allen agrees with you, and that is pretty substantial. But there are others, like Ray Kurzweil and Elon Musk, who say the singularity is on the horizon. The consensus of AI experts is that it will occur by 2040 if we do not develop a new standard of ethics for AI.
Twenty years? I just don't see it. The title of this thread is "Artificial Intelligence - The End of Humanity?" Not . . . gonna . . . happen. Not in 20 years. There has to be a bar to be crossed that people agree on. What actually constitutes "intelligence"? I think that the ability to calculate better than a human does not constitute or substitute for human intelligence.
I think there need to be two points to cross. The first is self-awareness. The second is independence. No matter how intelligent or even self-aware something is, it can't do much if it can be isolated and cut off by pulling the plug. It can't run or hide. Frankly, unless there is some sort of miracle, I don't see self-awareness anytime soon. Hell, we can barely define it, much less quantify it. How can we program it? Just mass memory and processing power do not self-awareness make.
I think that intelligence goes far beyond self-awareness. A snail may have a degree of self-awareness, but to me intelligence is the array of virtues that set humans apart from other animals. Humans can teach themselves through observation and experimentation, develop a language, establish a social order. We can create. We can imagine. Not just exist in self-awareness. We create art, science, crafts, literature, and products. It is difficult for me to understand how a computer can attain this advanced intelligence when it has no fingers, no legs, no nose, no ears, no eyes, and no brain. It cannot interact with the real world sufficiently to develop intelligence, as I measure it. Computers are advanced calculators, mere tools for human intelligence to exploit.

Independence is indeed another vital virtue if computers are to "take over". They are not only totally dependent on human manufacture and programming to function, but they are also completely dependent on the electric power grid and the internet, also manufactured, installed, and maintained by humans. Moreover, they cannot reproduce. If a meteor hit the planet in 20 years and human life went extinct within a year, the machines would all die as soon as the power went out.