Existential risks of artificial intelligence in the context of the axiology of transhumanism (based on the works of N. Bostrom)

Authors: Nazarova Yu.V., Kashirin A.Yu.

Journal: Общество: философия, история, культура (Society: Philosophy, History, Culture) @society-phc

Section: Philosophy

Issue: No. 11, 2023.

Open access

The calls to limit the development of artificial intelligence technologies, recently voiced by well-known developers of these very technologies, stem from fears of existential risks, one of which, according to the philosopher N. Bostrom, is superintelligence arising from the further development of artificial intelligence. Although the problem of artificial intelligence has been widely studied in foreign philosophical literature, the question of the values underlying new technologies remains practically unexplored. The article substantiates that Bostrom's theory of the risks of superintelligence rests on axiological grounds: artificial intelligence, according to the philosopher, can pose a threat to transhumanist values. At the same time, it is precisely these values that are supposed to form the basis for the development of artificial intelligence in order to prevent a catastrophe. The aim of the article is to analyze the ethico-philosophical values of transhumanism and their role in the development of artificial intelligence technologies. It is noted that current AI projects are, in fact, developing precisely in the spirit of the philosophy of transhumanism. The scientific novelty of the article lies in its ethical and philosophical analysis of the transhumanist values that N. Bostrom proposes to rely on in the development of AI. As a result, the ethical meanings of the development of AI are assessed in terms of N. Bostrom's concept of transhumanist values.


Keywords: ethics of artificial intelligence, ethics, artificial intelligence, axiology of N. Bostrom's transhumanism, post-human transition, post-human, existential risk, superintelligence

Short address: https://sciup.org/149144731

IDR: 149144731   |   DOI: 10.24158/fik.2023.11.5

Research article