Suffering risks
Suffering risks, or s-risks, are risks involving an astronomical amount of suffering, far more than all of the suffering that has occurred on Earth thus far.[2][3] They are sometimes categorized as a subclass of existential risks.[4]
Sources of possible s-risks include embodied artificial intelligence[5] and superintelligence,[6] as well as space colonization. Space colonization could potentially lead to "constant and catastrophic wars"[7] and to an immense increase in wild animal suffering, by introducing wild animals, who "generally lead short, miserable lives full of sometimes the most brutal suffering", to other planets, either intentionally or inadvertently.[8]
Steven Umbrello, an AI ethics researcher, has warned that biological computing may make system design more prone to s-risks.[5]
Induced amnesia has been proposed as a way to mitigate s-risks in locked-in conscious AI and certain AI-adjacent biological systems like brain organoids.[9]
References
- ^ Bostrom, Nick (2013). "Existential Risk Prevention as Global Priority" (PDF). Global Policy. 4 (1): 15–31. doi:10.1111/1758-5899.12002. Archived (PDF) from the original on 2014-07-14. Retrieved 2024-02-12 – via Existential Risk.
- ^ Daniel, Max (2017-06-20). "S-risks: Why they are the worst existential risks, and how to prevent them (EAG Boston 2017)". Center on Long-Term Risk. Archived from the original on 2023-10-08. Retrieved 2023-09-14.
- ^ Hilton, Benjamin (September 2022). "'S-risks'". 80,000 Hours. Archived from the original on 2024-05-09. Retrieved 2023-09-14.
- ^ Baumann, Tobias (2017). "S-risk FAQ". Center for Reducing Suffering. Archived from the original on 2023-07-09. Retrieved 2023-09-14.
- ^ Umbrello, Steven; Sorgner, Stefan Lorenz (June 2019). "Nonconscious Cognitive Suffering: Considering Suffering Risks of Embodied Artificial Intelligence". Philosophies. 4 (2): 24. doi:10.3390/philosophies4020024. hdl:2318/1702133.
- ^ Sotala, Kaj; Gloor, Lukas (2017-12-27). "Superintelligence As a Cause or Cure For Risks of Astronomical Suffering". Informatica. 41 (4). ISSN 1854-3871. Archived from the original on 2021-04-16. Retrieved 2021-02-10.
- ^ Torres, Phil (2018-06-01). "Space colonization and suffering risks: Reassessing the "maxipok rule"". Futures. 100: 74–85. doi:10.1016/j.futures.2018.04.008. ISSN 0016-3287. S2CID 149794325. Archived from the original on 2019-04-29. Retrieved 2021-02-10.
- ^ Kovic, Marko (2021-02-01). "Risks of space colonization". Futures. 126: 102638. doi:10.1016/j.futures.2020.102638. ISSN 0016-3287. S2CID 230597480.
- ^ Tkachenko, Yegor (2024). "Position: Enforced Amnesia as a Way to Mitigate the Potential Risk of Silent Suffering in the Conscious AI". Proceedings of the 41st International Conference on Machine Learning. PMLR. Retrieved 2024-06-11.
Further reading
- Baumann, Tobias (2022). Avoiding the Worst: How to Prevent a Moral Catastrophe. Independently published. ISBN 979-8359800037.
- Metzinger, Thomas (2021-02-19). "Artificial Suffering: An Argument for a Global Moratorium on Synthetic Phenomenology". Journal of Artificial Intelligence and Consciousness. 8: 43–66. doi:10.1142/S270507852150003X. ISSN 2705-0785.
- Minardi, Di (2020-10-15). "The grim fate that could be 'worse than extinction'". BBC Future. Retrieved 2021-02-11.
- Baumann, Tobias (2017). "S-risks: An introduction". Center for Reducing Suffering. Retrieved 2021-02-10.
- Althaus, David; Gloor, Lukas (2016-09-14). "Reducing Risks of Astronomical Suffering: A Neglected Priority". Center on Long-Term Risk. Retrieved 2021-02-10.