An article published by EPFL on December 22nd recaps the International Risk Governance Center (IRGC)’s virtual conference, Governance Of and By Digital Technology, which explored the governance of digital technologies as well as the potential risks of decision-making algorithms.
Notably, during the conference, EPFL professor Bryan Ford, head of the Decentralized and Distributed Systems Laboratory (DEDIS) in the School of Computer and Communication Sciences, argued that while the cautious use of powerful artificial intelligence (AI) technologies can play many useful roles in low-level mechanisms across many application domains, AI has no legitimate role to play in defining, implementing, or enforcing public policy.
“Matters of policy in governing humans must remain a domain reserved strictly for humans. For example, AI may have many justifiable uses in electric sensors to detect the presence of a car – how fast it is going or whether it stopped at an intersection – but I would claim AI does not belong anywhere near the policy decision of whether a car’s driver warrants suspicion and should be stopped by Highway Patrol,” Ford said in the article.
“Because machine learning algorithms learn from data sets that represent historical experience, AI-driven policy is fundamentally constrained by the assumption that our past represents the right, best, or only viable basis on which to make decisions about the future. Yet we know that all past and present societies are highly imperfect, so to have any hope of genuinely improving our societies, governance must be visionary and forward-looking.”
> Read the full article: “Crossing the artificial intelligence thin red line?” 22.12.20. Tanya Petersen, EPFL IC.
> Read more about the online conference, Governance Of and By Digital Technology, which was organized by the IRGC and took place on November 18th. The event was hosted within the framework of the IRGC’s work with the EU Horizon 2020 TRIGGER project, which aims to support European institutions in the area of digital governance.