Large language models (LLMs) like ChatGPT are capable of generating text on an endless range of topics. In this video, Taylor Woodcock asks ChatGPT to discuss some of the most important legal issues surrounding the use of artificial intelligence (AI) in the military. We cover legal interpretation, International Humanitarian Law, responsibility, and more.
Taylor Woodcock is a researcher in the Designing International Law and Ethics into Military Artificial Intelligence (DILEMA) project, which explores the legal aspects of military applications of AI. While autonomous weapons systems often grab the headlines when it comes to military AI, software-based applications of AI can also have a major impact on the military. These applications range from AI-powered decision-support systems for intelligence and targeting to coordination and planning tools that are more similar to ChatGPT than to autonomous weapons.
Links and resources:
Read more by Taylor on algorithmic decision-support systems (DSS) in warfare: tiny.cc/MHCTY
The full scenarios can be found in Magdalena Pacholska's paper "Military Artificial Intelligence and the Principle of Distinction: A State Responsibility Perspective", published in the Israel Law Review: tiny.cc/scenarioMP
DILEMA Project: www.asser.nl/DILEMA
------
Stay up to date with research, activities, and events at the Asser Institute: Newsletter: tiny.cc/AsserTodayYT
0:00 - 0:53 Introduction
0:54 - 1:51 Legal issues
1:52 - 3:03 Scenario description
3:04 - 4:41 Distinction, hostile intent, adversarial attacks
4:42 - 5:13 Variant scenario - optical confirmation from sentry
5:14 - 5:32 Meaningful human control and decision support
5:33 - 6:47 What happens when we make mistakes
6:48 - 7:11 Variant scenario - Fully autonomous weapons
7:12 - 7:40 Responsibility of humans
7:41 - 8:16 Propose a PhD research question
8:17 - 9:56 Conclusions
Special thanks to Yasmine Maatallah for her help.
Can ChatGPT solve legal problems related to military AI?