Open Letter Urges Action To Address AI, And Other Global Risks
Open letter from ‘The Elders’ group urges world leaders to address existential risks of artificial intelligence and climate change
The non-governmental group ‘The Elders’ this week published an open letter about the existential risks facing the world.
The group, founded by former South African President Nelson Mandela and including former political leaders and public figures such as Ban Ki-moon, Helen Clark, Jimmy Carter, Richard Branson, Annie Lennox, Charles Oppenheimer and Gordon Brown, called “on world leaders to work together to address these existential threats more decisively.”
They identified the escalating dangers of the climate crisis, pandemics, nuclear weapons and ungoverned AI as the existential risks facing the world.
Open letter
“Our world is in grave danger,” the letter states. “We face a set of threats that put all humanity at risk. Our leaders are not responding with the wisdom and urgency required.”
The letter was published Thursday and shared with governments around the world.
“The signatories of this letter call on world leaders to work together to address these existential threats more decisively,” it stated.
“In a year when half the world’s adult population face elections, we urge all those seeking office to take a bold new approach. We need long-view leadership from decision-makers who understand the urgency of the existential threats we face, and believe in our ability to overcome them.”
The letter calls on governments and leaders to:
- Think beyond short-term political cycles and deliver solutions for both current and future generations.
- Recognise that enduring answers require compromise and collaboration for the good of the whole world.
- Show compassion for all people, designing sustainable policies which respect that everyone is born free and equal in dignity and rights.
- Uphold the international rule of law and accept that durable agreements require transparency and accountability.
- Commit to a vision of hope in humanity’s shared future, not play to its divided past.
“These principles of long-view leadership can inform urgent changes in policy,” the letter states. “Governments can get to work now to agree how to finance the transition to a safe and healthy future powered by clean energy, relaunch arms control talks to reduce the risk of nuclear war, save millions of lives by concluding an equitable pandemic treaty, and start to build the global governance needed to make AI a force for good, not a runaway risk.”
The letter urged leaders to change direction.
“The biggest risks facing us cannot be tackled by any country acting alone,” they warned. “Yet when nations work together, these challenges can all be addressed, for the good of us all.”
“Despite the seriousness of these existential threats, hope remains,” the letter states. “Our best future can still lie ahead of us. We call on leaders to take the long view, and show the courage to lead us to that better future.”
Other calls
There have been other open letters and warnings about the dangers of AI before.
One of the most recent was an open letter in March 2023, when a group of artificial intelligence (AI) experts and executives banded together to urge a six-month pause in developing more advanced systems.
That letter had been signed by Steve Wozniak, co-founder of Apple; Elon Musk, CEO of SpaceX, Tesla and Twitter; researchers at DeepMind; AI heavyweight Yoshua Bengio (often referred to as one of the “godfathers of AI”); and Professor Stuart Russell, a pioneer of research in the field.
Elon Musk and Steve Wozniak, along with others including the late Professor Stephen Hawking, had previously warned about the dangers of AI.
Indeed, Professor Hawking warned that artificial intelligence could spell the end of life as we know it on Planet Earth.