Message for the 58th World Day of Social Communications
(ANS – Vatican City) – On the occasion of the feast of Saint Francis de Sales, Patron of
Journalists, the Holy See Press Office has published the Holy Father’s Message for the 58th World
Day of Social Communications, which this year will be celebrated in many countries on May 12,
2024. The theme chosen by Pope Francis this year, previously announced on September 29, 2023, is a topical one: “Artificial intelligence and wisdom of the heart: for fully human communication.”
Dear brothers and sisters!
The development of systems of artificial intelligence, to which I devoted my recent Message for the
World Day of Peace, is radically affecting the world of information and communication and,
through it, certain foundations of life in society. These changes affect everyone, not merely
professionals in those fields. The rapid spread of astonishing innovations, whose workings and
potential are beyond the ability of most of us to understand and appreciate, has proven both
exciting and disorienting. This leads inevitably to deeper questions about the nature of human
beings, our distinctiveness, and the future of the species homo sapiens in the age of artificial
intelligence. How can we remain fully human and guide this cultural transformation to serve a
good purpose?
Starting with the heart
Before all else, we need to set aside catastrophic predictions and their numbing effects. A century ago, reflecting on technology and humanity, Romano Guardini urged us not to reject “the new” in an attempt to “preserve a beautiful world condemned to disappear.” At the same time, he
prophetically warned that “we are constantly in the process of becoming. We must enter into this
process, each in his or her own way, with openness but also with sensitivity to everything that is
destructive and inhumane therein”. And he concluded: “These are technical, scientific, and
political problems, but they cannot be resolved except by starting from our humanity. A new kind of
human being must take shape, endowed with a deeper spirituality and new freedom and
interiority”.
At this time in history, which risks becoming rich in technology and poor in humanity, our
reflections must begin with the human heart. Only by adopting a spiritual way of viewing reality,
only by recovering a wisdom of the heart, can we confront and interpret the newness of our time
and rediscover the path to a fully human communication. In the Bible, the heart is seen as the place
of freedom and decision-making. It symbolizes integrity and unity, but it also engages our emotions,
desires, and dreams; it is, above all, the inward place of our encounter with God. Wisdom of the
heart, then, is the virtue that enables us to integrate the whole and its parts, our decisions and their
consequences, our nobility and our vulnerability, our past and our future, our individuality and our
membership within a larger community.
This wisdom of the heart lets itself be found by those who seek it and be seen by those who love it; it anticipates those who desire it, and it goes in search of those who are worthy of it (cf. Wis 6:12-16). It accompanies those willing to take advice (cf. Prov 13:10) and those endowed with a docile and listening heart (cf. 1 Kg 3:9). A gift of the Holy Spirit, it enables us to look at things with God’s eyes, to see connections, situations, events and to uncover their real meaning. Without this kind of wisdom, life becomes bland since it is precisely wisdom – whose Latin root sapere is related to the noun sapor – that gives “savour” to life.
Opportunity and danger
Such wisdom cannot be sought from machines. Although the term “artificial intelligence” has now
supplanted the more correct term, “machine learning”, used in scientific literature, the very use of
the word “intelligence” can prove misleading. No doubt, machines possess a limitlessly greater
capacity than human beings for storing and correlating data, but human beings alone are capable
of making sense of that data. It is not simply a matter of making machines appear more human but
of awakening humanity from the slumber induced by the illusion of omnipotence, based on the
belief that we are completely autonomous and self-referential subjects, detached from all social
bonds and forgetful of our status as creatures.
Human beings have always realized that they are not self-sufficient and have sought to overcome
their vulnerability by employing every means possible. From the earliest prehistoric artifacts, used
as extensions of the arms, and then the media, used as an extension of the spoken word, we have
now become capable of creating highly sophisticated machines that act as a support for thinking.
Each of these instruments, however, can be abused by the primordial temptation to become like
God without God (cf. Gen 3), that is, to want to grasp by our own effort what should instead be
freely received as a gift from God, to be enjoyed in the company of others.
Depending on the inclination of the heart, everything within our reach becomes either an
opportunity or a threat. Our very bodies, created for communication and communion, can become
a means of aggression. So, too, every technical extension of our humanity can be a means of loving
service or of hostile domination. Artificial intelligence systems can help to overcome ignorance and
facilitate the exchange of information between different peoples and generations. For example, they
can render accessible and understandable an enormous patrimony of written knowledge from past
ages or enable communication between individuals who do not share a common language. Yet, at
the same time, they can be a source of “cognitive pollution,” a distortion of reality by partially or
completely false narratives, believed and broadcast as if they were true. We need but think of the
long-standing problem of disinformation in the form of fake news, which today can employ
“deepfakes”, namely the creation and diffusion of images that appear perfectly plausible but are in fact false (I, too, have been an object of this), or of audio messages that use a person’s voice to say things
which that person never said. The technology of simulation behind these programmes can be useful
in certain specific fields, but it becomes perverse when it distorts our relationship with others and
with reality.
Starting with the first level of artificial intelligence, that of social media, we have experienced its ambivalence: its possibilities but also its risks and associated pathologies. The second level, generative artificial intelligence, unquestionably represents a qualitative leap. It is important,
therefore, to understand, appreciate, and regulate instruments that, in the wrong hands, could lead
to disturbing scenarios. Like every other product of human intelligence and skill, algorithms are
not neutral. For this reason, there is a need to act preventively by proposing models of ethical
regulation to forestall harmful, discriminatory, and socially unjust effects of the use of systems of
artificial intelligence and to combat their misuse for the purpose of reducing pluralism, polarizing
public opinion or creating forms of groupthink. I once more appeal to the international community
“to work together in order to adopt a binding international treaty that regulates the development
and use of artificial intelligence in its many forms.” At the same time, as in every human context,
regulation is itself not sufficient.
Growth in humanity
All of us are called to grow together in humanity and as humanity. We are challenged to make a
qualitative leap in order to become a complex, multiethnic, pluralistic, multireligious, and
multicultural society. We are called to reflect carefully on the theoretical development and the
practical use of these new instruments of communication and knowledge. Their great possibilities
for good are accompanied by the risk of turning everything into abstract calculations that reduce individuals to data, thinking to a mechanical process, experience to isolated cases, goodness to profit, and, above all, deny the uniqueness of each individual and his or her story. The
concreteness of reality dissolves in a flurry of statistical data.
The digital revolution can bring us greater freedom, but not if it imprisons us in models that
nowadays are called “echo chambers”. In such cases, rather than increasing a pluralism of
information, we risk finding ourselves adrift in a mire of confusion, prey to the interests of the
market or of the powers that be. It is unacceptable that the use of artificial intelligence should lead
to groupthink, to a gathering of unverified data, to a collective editorial dereliction of duty. The
representation of reality in “big data”, however useful for the operation of machines, ultimately
entails a substantial loss of the truth of things, hindering interpersonal communication and
threatening our very humanity. Information cannot be separated from living relationships. These
involve the body and immersion in the real world; they involve correlating not only data but also
human experiences; they require sensitivity to faces and facial expressions, compassion, and
sharing.
Here, I think of the reporting of wars and the “parallel war” being waged through campaigns of
disinformation. I think, too, of all those reporters who have been injured or killed in the line of duty
in order to enable us to see what they themselves had seen. For only by such direct contact with the
suffering of children, women and men can we come to appreciate the absurdity of wars.
The use of artificial intelligence can make a positive contribution to the communications sector,
provided it does not eliminate the role of journalism on the ground but serves to support it; provided, too, that it values the professionalism of communication, making every communicator more aware of his or her responsibilities, and enables all people to be, as they should, discerning
participants in the work of communication.
Questions for today and for the future
In this regard, a number of questions naturally arise. How do we safeguard professionalism and
the dignity of workers in the fields of information and communication, together with that of users
throughout the world? How do we ensure the interoperability of platforms? How do we enable
businesses that develop digital platforms to accept their responsibilities with regard to content and
advertising in the same way as editors of traditional communications media? How do we make
more transparent the criteria guiding the operation of algorithms for indexing and de-indexing and
for search engines that are capable of celebrating or canceling persons and opinions, histories, and
cultures? How do we guarantee the transparency of information processing? How do we identify
the paternity of writings and the traceability of sources concealed behind the shield of anonymity?
How do we make it clear whether an image or video is portraying an event or simulating it? How
do we prevent sources from being reduced to one alone, thus fostering a single approach developed
on the basis of an algorithm? How, instead, do we promote an environment suitable for preserving
pluralism and portraying the complexity of reality? How can we make sustainable a technology so
powerful, costly, and energy-consuming? And how can we make it accessible also to developing
countries?
The answers we give to these and other questions will determine whether artificial intelligence will end up creating new castes based on access to information, thus giving rise to new forms of exploitation and inequality, or whether it will lead to greater equality by promoting correct information and a greater awareness of the epochal change we are experiencing, making it possible to acknowledge the many needs of individuals and of peoples within a well-structured and pluralistic network of information. If, on the one hand, we can glimpse the spectre of a new form of slavery, on
the other, we can also envision a means of greater freedom; either the possibility that a select few
can condition the thought of others, or that all people can participate in the development of
thought.
The answer we give to these questions is not pre-determined; it depends on us. It is up to us to
decide whether we will become fodder for algorithms or will nourish our hearts with that freedom
without which we cannot grow in wisdom. Such wisdom matures by using time wisely and
embracing our vulnerabilities. It grows in the covenant between generations, between those who remember the past and those who look ahead to the future. Only together can we increase our capacity
for discernment and vigilance and for seeing things in the light of their fulfillment. Lest our
humanity lose its bearings, let us seek the wisdom that was present before all things (cf. Sir 1:4): it
will help us also to put systems of artificial intelligence at the service of a fully human
communication.