Pope Francis said the use of artificial intelligence in science and medicine must be based on ethical standards that put humanity and the pursuit of the common good first.
“The ethical development of algorithms, or ‘algorithmic ethics,’ can become a bridge that allows these principles to be concretely incorporated into digital technologies through effective interdisciplinary dialogue,” the Pope said in a message to participants in the general assembly of the Pontifical Academy for Life on February 28th.
He further stated that “human rights represent an important convergence point in the search for common ground.”
The Pontifical Academy for Life sponsored a major workshop, “Robot Ethics: Man, Machine, Health,” at the Vatican on February 25th and 26th, focusing in particular on the use of robots and artificial intelligence in medicine and health care.
The workshop was followed by the academy’s general assembly from February 26th to 28th, which examined the implications, challenges, and safeguards involved in the use of artificial intelligence, particularly its effects in the areas of ethics, law, and health care.
On the last day of the general assembly, leaders from Microsoft and IBM, two of the world’s leading developers of AI software, along with representatives of the European Parliament and the Food and Agriculture Organization of the United Nations, signed a charter calling for an ethical framework and guidelines for the field of artificial intelligence.
In his message, the Pope thanked the participants for grappling with the impact of artificial intelligence on medicine, the economy, and society, fields where important decisions are already “the result of human will and a series of algorithmic inputs.”
“Personal action has now become a point of convergence between genuinely human input and automated calculation, so that it has become increasingly complex to understand its purpose, foresee its effects, and define the contribution of each factor,” he said.
Learning to adapt to the benefits and potential pitfalls of new technology is nothing new, he said, pointing to inventions such as the steam engine, electricity, and the printing press, which revolutionized the way humans store and share information.
However, the Pope said that while today’s advances in science and medicine continue to “instill a sense of limitless possibility,” they are also “blurring boundaries that were previously thought to be clearly demarcated: for example, between inorganic and organic matter, between the real and the virtual, between stable identities and events in constant interconnection.”
But dangers such as algorithms used to “extract data that allow mental and relational habits to be controlled” risk undermining the immense possibilities that new technologies offer, which stand before us as a gift from God, “a resource that can bear good fruit,” the Pope said.
The Pope also expressed concern about the increasing use of artificial intelligence in the biological sciences, saying that the growing correlation and integration between these technologies and life as it is experienced cannot be ignored.
“The ethical questions that arise from the way these new devices regulate the birth and destiny of individuals require a new approach to preserving the humanity of our shared history,” the Pope said.
He also said that the Catholic Church’s social teaching could make a “significant contribution” to the goal of developing ethical standards that protect “the dignity of the individual, justice, subsidiarity and solidarity.”
“These are expressions of our determination to serve every individual and all peoples with integrity, without discrimination or exclusion,” the Pope said. “The complexity of the technology world requires an increasingly clear ethical framework to make this effort truly effective.”