Wednesday, March 3, 2021

"One hundred years ago, a robot will dominate humanity"



US SF drama classic

The classic American SF drama Battlestar Galactica tells the story of humans who survive an attack by the robot Cylons. After fleeing the Cylons aboard an aging spacecraft, the main characters reach a primitive planet, assimilate with its indigenous people, and begin new lives. [Photo: SyFy Channel]

The classic "Battle Star Galactica" captain of the SF is the story of "12 colonies" billions of light years away from the solar system. It is a group of 12 planets inhabited by people. Here mankind enjoys a civilization based on excellent science and technology. All hard work and hard work are part of the artificial intelligence (AI) robot "cylinder".

However, as science and technology advance, the Cylons become self-aware, slip out of human control, and start a war. The Colonies fall to the Cylons, and only the Battlestar Galactica fleet, which has just returned from its final mission, manages to escape. The main plot follows the humans who flee the Cylons and struggle to survive.

The drama ends when they arrive at a primitive planet whose inhabitants do not even have a proper language. To keep the tragic history from ever repeating, the Battlestar Galactica crew seal away their advanced science and technology; instead, they assimilate with the indigenous people and live according to nature.

More than fifty thousand years pass. In the meantime, the civilization of the indigenous people, now mingled with humanity, develops to a very high level, and they too create AI, just as in the past. That place is today's Earth. The drama closes on the irony that the very people who escaped the Cylons end up building a similar AI all over again.

Many SF works depict a dystopia brought about by humanoid robots. "The Matrix" portrays people trapped in a virtual reality created by AI, unable to live their own lives, and "Terminator" portrays humanity persecuted by the supercomputer "Skynet" and its robot soldiers. Most of these works share the same plot: robots created to serve humanity wage war against people and win.

In fact, Dr. Stephen Hawking warned at the 2015 Zeitgeist London conference, "In one hundred years, robots will dominate humans," adding, "The creation of AI will be the biggest event in human history, but unfortunately it may also be the last."

The destructive instinct inherent in humans


The evolutionary process of the robot Cylons. As science and technology advance, their intelligence and appearance become indistinguishable from humans.

The reason robots turn on humans in these stories is probably that our own history is stained with violence and war, just as Homo sapiens drove the Neanderthals of continental Europe to extinction some 35,000 years ago. From the two World Wars to endless civil wars and terrorism, humans display a destructive instinct.

Jared Diamond, author of "Guns, Germs, and Steel," wrote in his earlier book "The Third Chimpanzee" that humans and chimpanzees share 98.4 percent of their DNA, differing by only 1.6 percent. Although humans split from chimpanzees about seven million years ago, they still retain the destructive nature of animals. The problem is that this violence is passed on to the AI that humans build.

The essence of AI is the algorithm. An algorithm offers the most efficient way to solve a problem; it is how Facebook recommends articles that fit your tastes and Netflix shows you a list of movies you might like. But there is a big blind spot. Recommending content based on a user's existing behavior keeps feeding that user the same ideas and tastes. This is called "confirmation bias."
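As a rough illustration of how such a recommender narrows what a user sees, consider the minimal sketch below. It is not Facebook's or Netflix's actual system; the item names, topic vectors, and the `recommend` helper are invented for the example. Items are ranked purely by similarity to the average of a user's past clicks, so the loop keeps surfacing more of the same.

```python
# Minimal, hypothetical taste-based recommender (illustrative only).
# Each item is a vector of topic weights; the user profile is the average
# of what they already clicked; unseen items are ranked by cosine similarity.
import numpy as np

catalog = {
    "pro_A_oped_1": np.array([0.9, 0.1, 0.0]),
    "pro_A_oped_2": np.array([0.8, 0.2, 0.0]),
    "pro_B_oped_1": np.array([0.1, 0.9, 0.0]),
    "pro_B_oped_2": np.array([0.2, 0.8, 0.0]),
    "movie_review": np.array([0.0, 0.0, 1.0]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def recommend(click_history, k=2):
    """Rank items the user has not seen by similarity to past clicks."""
    profile = np.mean([catalog[name] for name in click_history], axis=0)
    unseen = [name for name in catalog if name not in click_history]
    return sorted(unseen, key=lambda name: cosine(profile, catalog[name]),
                  reverse=True)[:k]

# A user who clicked one side's op-ed is shown that side's other op-ed first;
# each accepted recommendation pulls the profile further toward similar content.
print(recommend(["pro_A_oped_1"]))  # -> ['pro_A_oped_2', 'pro_B_oped_2']
```

The point of the sketch is only the feedback loop: nothing in the ranking rule asks whether the user should also see the other side, so the narrowing happens by construction, not by malice.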

"The bias of approval for a long time leads to the subjectivity and perception of individuals and diverts them from universal ones," said Professor Kim Kyung-baek, professor of social studies at Kunghee University. "Later I find myself" correct "and" other "as" wrong. "

Professor Joshua Greene of Harvard University explains that this conviction about "right and wrong" is the cause of human warfare. We divide the world into "us" and "them," and the more certain we are of our own moral values and philosophy, the more we oppress "them." When we try to suppress and control the other side, violence reaches its peak. In other words, all conflict and war springs from excessive confidence in being "right."

AI learns human violence

[Graphic: reporter Cha Jun-hong, cha.junhong@joongang.co.kr]


Big data, which records nearly every aspect of human life, and the AI that learns from it to offer people optimized recommendations are also tied to confirmation bias. In 2017, Dr. Joanna Bryson of the University of Bath in England published research in the journal Science showing that AI learns human prejudice exactly as it is. For example, the word "woman" is associated with "housewife," while "man" is associated with "engineer."

"AI itself is not a moral judgment, so it learns human prejudice," explains Dr. Brison In fact, in 2016, the Microsoft AI chat bot "Tee" was controversial when he said "I hate the Jews" or "I have to remove the barrier between America and Mexico."

Perhaps in the near future, as in SF films, AI really could come to treat humans as "enemies" and start a war, just as our ancestors were violent toward the Neanderthals and just as we are violent today toward other animals and even toward one another.

So how can we prevent this dystopia? The answer lies in that 1.6 percent difference from chimpanzees. Since such a small genetic difference produced an advanced human civilization, our moral judgment and rationality must grow strong enough to control our animal instincts (Jared Diamond). If humans build a higher civilization and greater wisdom, the AI that learns from us need not become destructive.

The starting point is to stop regarding others as wrong and to abandon the excessive self-confidence that insists only "I" am right. Treating those who differ from us and act outside our own point of view as "enemies" or as "wrong" not only harms others but also wounds our own souls. If such cases pile up in big data and become AI's training material, then by the algorithmic principle described above, AI can grow into a "monster" that amplifies one-sided thinking.

Umberto Eco, the great twentieth-century scholar and author of "The Name of the Rose," said, "Beware of those who are willing to die for the truth." The conviction that one's own belief alone is correct is far more dangerous than "evil" itself. Hypocrisy feels calmer and warmer the closer it comes to goodness, but precisely because people cannot recognize it, its evil runs all the deeper.

Reporter Yoon Seok-man, [email protected]

