How great is the risk of dying in a machine uprising?
No, this is unlikely to happen before December 21, 2012 - unless, of course, Apple tomorrow releases some piece of metal called the iGod.

But in the future the chances are there, a group of scientists led by Huw Price of the University of Cambridge (UK) optimistically believes, and it sees the preconditions already today. The mass development of autonomous combat robots, still only at the design stage, may well result in the total robotization of war, with the system subsequently slipping out of control. Meanwhile, the exploitation of this story by an illiterate pop culture shields the situation from thorough study by the military - or by anyone else. Although, the researchers note, here one should speak not of foresight but of a statement of fact: friendly fire from UAVs is nothing new, it is merely limited in scale so far. Strictly speaking, the reflashing of drones by the enemy in the middle of combat is not something unreal, and from there to the rebellion of the machines is half a step.

"That is why we create this research centre (Centre for research of threats to the mankind, CSER) - to attract the attention of scientific community to the problem," said Mr. Price in interview Bi-bi-si. Although scientists are concerned and other threats (global warming and other), the analysis of the risk of death of people in the fight against intelligent machines seems, perhaps, the most innovative theme raised from newborn institution.

The main problem, as they see it, lies in the future. By 2030 the mental capabilities of artificial intelligence should surpass those of humans, we are assured (Moore's law is "relentless"). And such an authority as Hans Moravec of the Robotics Institute at Carnegie Mellon University (USA) says outright: "Ultimately, the robots will surpass us. Humanity clearly faces extinction." In general, it is much as in the book that could often be found in the knapsack of a German soldier killed in the First World War: "Man is a rope stretched between the animal and the Superman... How wonderful that man is a transitional stage and not a dead end." Only instead of the Nietzschean Übermensch we are, of course, a rope to the robots.
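
For what it is worth, the arithmetic behind such projections is a plain exponential extrapolation. Below is a minimal Python sketch, assuming (purely for illustration) a doubling period of about two years and a 2012 baseline; neither figure is taken from the article.

    # Toy extrapolation of a "relentless" Moore's law: compute doubles
    # roughly every two years. The baseline and doubling period are
    # illustrative assumptions, not figures from the article.
    def compute_after(years, start=1.0, doubling_period=2.0):
        """Relative compute after `years` of exponential doubling."""
        return start * 2 ** (years / doubling_period)

    for year in (2012, 2020, 2030):
        factor = compute_after(year - 2012)
        print(f"{year}: ~{factor:,.0f}x the 2012 level")

Under that naive assumption, 2030 comes out at roughly 500 times the 2012 level; whether raw compute translates into "mental capabilities" is exactly what the sceptics below dispute.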

The main argument of the supporters of such views remains Fermi's famous cafeteria exclamation: "Where is everybody?" On this foundation one can expect a victory of the robots: as soon as AI learns to design new robots better than human engineers can, our species is doomed, for the more progressive it becomes, the more it relies on AI, and the latter will therefore lose any need for human support.

After the total robotization that awaits every civilization above a certain level, the Great radio silence of the Universe becomes obvious: the robots are hardly interested in contact with thinking pieces of meat that are a priori inferior even to their own extinct developers - because otherwise those civilizations would long since have had robots of their own, which would immediately have put an end to the pseudo-sapients so recklessly spending resources on their invented consumer needs. Accordingly, there is no Fermi paradox, just as there are no highly developed biological civilizations.

Scary? Well then, let us recall an alternative theory. We have all heard of Roger Penrose - the well-known Penrose mechanism, twistor theory, the strong cosmic censorship hypothesis... But this many-sided mind made an interesting attempt to criticize such fears quite some time ago. He believes the "Terminators" will never conquer the Earth, for everything people have produced so far can be called "artificial intelligence" only in mockery. He holds that cell organelles are to some extent controlled by membrane proteins through the periodically renewed collapse of entangled quantum states. Therefore the human mind - and, apparently, any consciousness in principle - is quantum: it cannot be explained or modelled within classical mechanics, and can be accounted for only with the help of quantum mechanics. And any attempts to reproduce it without the mechanisms of superposition and quantum entanglement (on which, according to Penrose, the activity of the brain is based) are doomed.

For a long time these views were criticized, or rather dismissed out of hand, on the pretext that quantum states decohere before they can reach a level sufficient to influence neural processes. Recently, however, it has become clear that at least in birds decoherence is slow enough that, thanks to quantum processes in their eyes, they can literally see the lines of the Earth's magnetic field in real time. Of course, birds, unlike people, may be far more primitive and evolutionarily less advanced. Yet if Penrose is right, we have a chance of survival. Large quantum computers (QC) are believed to be either impossible or extremely difficult to create artificially - and if they are possible at all, they will be extremely difficult to control, which would make building robots equipped with QC pointless. But then, why are we telling you this: go to the mirror and look into it. When was the last time you promised yourself you would start jogging in the morning?

Based on materials from Discovery News.