Vatican creates its own guidelines against misuse of AI in ChatGPT era
NEW DELHI: Amid calls to regulate artificial intelligence (AI) and create guardrails against its misuse, the Vatican has issued its own handbook for navigating the ethics of AI technology in the ChatGPT era.
Pope Francis has teamed up with US-based Santa Clara University’s Markkula Center for Applied Ethics to create guidelines for the use of AI.
They have formed a new organisation called the Institute for Technology, Ethics, and Culture (ITEC), which released a handbook titled ‘Ethics in the Age of Disruptive Technologies: An Operational Roadmap’.
The handbook is meant to guide the tech industry through the ethical issues in AI, machine learning, encryption and more, reports Gizmodo.
Rather than wait for governments to set rules for industry, the ITEC hopes to provide guidance for people within tech companies who are already wrestling with AI’s most difficult questions.
“There’s a consensus emerging around things like accountability and transparency, with principles that align from company to company,” said Ann Skeet, Senior Director of Leadership Ethics at the Markkula Center, and one of the handbook’s authors.
The handbook spells out one anchor principle for companies, which is further broken down into seven guidelines, such as “Respect for Human Dignity and Rights” and “Promote Transparency and Explainability”.
Those seven guidelines are in turn broken down into 46 specific, actionable steps, complete with definitions and examples.
For example, the principle “Respect for Human Dignity and Rights” includes a focus on “Privacy and confidentiality”.
The handbook calls for a commitment to “not collect more data than necessary”, and says “collected data should be stored in a manner that optimises the protection of privacy and confidentiality”.
Companies should consider specific protections for medical and financial data, and focus on responsibilities to users rather than on legal requirements alone, it said.