Jiaqi Hu: INTERVIEW WITH THE AUTHOR

1. You have studied human issues for almost 40 years. What motivates you to never give up, Hu Jiaqi?

When I was in primary and middle school, it was the era of the Cultural Revolution, during which China’s education system was brought to a virtual halt. I also grew up in a remote village, so I knew little about science and technology before I entered university in 1979. There I suddenly came into contact with a great deal of knowledge I had never encountered before. At that time I learned that the former Soviet Union had detonated a nuclear bomb that released energy equivalent to roughly 70 million tons of TNT, and that it was possible, both technologically and theoretically, to build nuclear weapons with yields of hundreds of millions of tons of TNT or more. The power of such a bomb would equal that of a trainload of high explosive long enough to circle the earth. The destruction would be unimaginably terrible. All of this made me wonder: if science and technology kept developing, might humans become extinct one day? If it took science and technology thousands or tens of thousands of years to drive humans to extinction, it might not matter much, but what if it happened in the near future? Is there any issue more important than that? I quickly realized the question was worth studying, and I have studied it for almost 40 years since then. As the research has gone on, I have become increasingly convinced that the issue is serious, vital and pressing.



2. You think the continued development of science and technology will definitely result in human extinction in the near future. On what basis?

Nuclear weapons are in fact a terrible means of destruction, but they would not cause human extinction, because a nuclear explosion releases its energy at a single point. Some scientists suggest that if all the nuclear weapons in the world were detonated, a nuclear winter would follow in which billions of people would die. Even so, some would survive in the end.

According to today’s scientific theory, weapons more terrible than nuclear weapons can be developed. For example, biotoxins produced with transgenic technology could target humans’ vital organs; their destructive power would exceed that of nuclear weapons. Another example is AI, which attracts so much of our attention at present. If the technology is used to develop intelligent machines for killing people, they could slip out of human control and decide on their own whom to kill. Worse still, such machines could be self-replicating. Isn’t that terrifying? Moreover, AI will sooner or later acquire consciousness similar to ours; that is, it will eventually possess human-like thinking ability. A machine with human-like thinking but a response speed tens of thousands or even hundreds of millions of times faster than our brains would be far more capable of handling complicated problems than we are. As we know, higher organisms look down on lower ones, and may even treat them as food. Once AI really reaches that stage, it will be absolutely impossible for humans to control it.

Many lines of scientific and technological development could, in theory, lead to human extinction. What is more alarming is that humans have pursued science and technology with real enthusiasm only since the Industrial Revolution, a little over 200 years ago, and in that time have taken it from a very low level to an astonishing one. Many more such 200-year spans lie ahead of us. Simple reasoning therefore suggests it will not take long for science and technology to drive humans to extinction. My conclusion: science and technology will cause human extinction within 200 to 300 years, or even within this century.



3. If science and technology really have the potential to drive humans to extinction, what should we do?

My first conclusion is that the continued development of science and technology will drive humans to extinction before long. It follows that if we want to avoid extinction, we must limit that development. The logic is very simple: since continued development will certainly lead to extinction soon, we must strictly limit the development of science and technology if we want humanity to survive.



4. Science and technology are involved in nearly everything around us. Does your view negate science and technology entirely?

Absolutely not. Science and technology is a double-edged sword: sometimes it benefits humans, and sometimes it can destroy them. As its power to benefit us grows, so does its power to destroy us. What worries me is that its limitless development will lead to human extinction and eventually make humanity disappear completely. When I say we must strictly limit the development of science and technology, I mean that limitation must apply to it as a whole. Achievements that are definitely safe and mature, however, should be spread around the world rather than restricted, because without science and technology it would be difficult for humans not only to do many things but even to survive.



5. So what should we do to limit the development of science and technology?

I don’t think it is possible in today’s society. Humanity’s present social formation is a society of nation-states, and the state is the highest power within it. Countries compete for various benefits, and competition between countries constantly leads to wars in which people die. Both economic and military competition are, in essence, competition in science and technology, because science and technology are the primary productive forces. No country, for the sake of its own survival and development, will limit the development of science and technology for the benefit of humanity, because a country that loses the competition will perish. I believe that only when human unification is achieved can the power of a world government strictly limit the development of science and technology. Countries stand in a competitive relationship with one another, whereas a world government would belong to the whole world and would consider and handle this issue from the perspective of humanity.



6. Humans have always been governed separately. Is it really possible to achieve human unification?

Human unification now seems very difficult to achieve because it would affect many people’s interests. But suppose aliens invaded: would we not unify to fight a common enemy? If we did not, we would be destroyed. In fact, we already face a huge threat of complete destruction, namely that the development of science and technology will soon drive humans to extinction. Why does unification seem so difficult? Because this threat has not yet been generally recognized. If we can spread this idea around the world and help everyone truly understand that humanity has a future only if we unify to limit the development of science and technology, and that otherwise we will be completely destroyed, then once the idea is widely accepted, human unification will become a realistic prospect.



7. Even if human beings achieve unification, how can we guarantee that, in such a changeable society, humanity will not one day divide again and resume the reckless development of science and technology?

This is a very good question, and it is also the most difficult problem we face. My research answers it as follows. If human beings achieve unification, we must impose strict restrictions on science and technology. But that is only one aspect; the other, more important aspect is to permanently seal up the advanced science and technology that could harm humanity, especially the underlying scientific theory, and eventually let it be forgotten. Then, even if human society were to divide again and science and technology were developed anew, it would take a long time to reach the same stage; if humans awakened again in time to impose strict restrictions and seal up the technology, science and technology might never reach the stage at which it could cause human extinction.

My research also contains a series of designs for the future society. For example, the future society should be peaceful and friendly rather than highly competitive as today’s is. A non-competitive society is not only conducive to people’s happiness but can also restrain the dizzying development of science and technology. I have likewise drawn up designs for its various systems, including the political system. I divide political systems into two types, centralized and decentralized, and it is hard to say which is better. The merit of a decentralized system is its checks and balances: power cannot expand without constraint, and the transfer of power is relatively orderly; but its policy execution and policy continuity leave something to be desired. The merits of a centralized system are strong enforcement and good policy continuity; but over-concentrated power can lead to dictatorship and to absurd, irrational behavior by the ruler that is difficult to stop. In my design for the political system of the future society, I try to combine the advantages of both types, so that the future society preserves checks and balances and smooth transitions of power while also enjoying strong enforcement and good policy continuity. I believe these designs will contribute to the long-term stability of the future society and the long-term unification of the human world.



8. Who do you think can play the biggest role in saving humanity?

Saving humanity is a matter for all of mankind; we have to rely on ourselves. I think the people most capable of acting are national leaders. I have therefore written many times to the leaders of major countries such as China, the United States and Russia to make my appeal. As a scholar, what I can do is spread my research findings.



9. You have said that you encountered incomprehension and many obstructions while studying human issues over the years. And isn’t there a contradiction in the fact that your own company works on science and technology as well?

My book Saving Humanity was published over a decade ago; after only two days, its publication was ordered to stop. Over these years I have delivered speeches at schools and institutions, published many articles, and written to many national leaders, influential research institutions and scholars to express my views, but few people agree with me. The truth is that humanity does not have much time. Every step we take today pushes us closer to the abyss of extinction, which is why I am so anxious.

Science and technology have developed rapidly over the past ten-odd years. Recent achievements in AI in particular have made many scientists aware of the extinction risks this technology might bring, and some have even called for the rational development of AI, but they remain a minority. People do not realize that it is not any one technology that will make us extinct; it is continued development as a whole that certainly will. Without AI, other technologies would drive humans to extinction. It is time for us to take action.

The last thing I want to say is that larger companies all conduct scientific and technological research in their own fields. Even if my company did not conduct such research, other companies still would. And if our company can go further in science and technology, we will be able to lend a hand when humanity encounters unexpected crises in the future.
