South Korea's police to use AI in crime investigations
South Korea's police will use artificial intelligence to detect deepfakes and help prevent crimes, Yonhap News reports.
The National Investigation Office stated that the program will determine the authenticity of a video within 5-10 minutes and generate results that can be used in investigations.
The police say the software can determine whether a video is genuine with about 80% accuracy.
Law enforcement plans to use the results to guide investigations rather than as direct evidence. The new software has analyzed approximately 5.2 million pieces of data from around 5,400 Koreans and related figures, and it is built on a state-of-the-art artificial intelligence model so it can respond to new types of videos it was not trained on.
To avoid mistakes in detecting fakes, the system's results will be cross-checked with artificial intelligence experts, with the primary focus on videos related to elections.
What is a deepfake
A deepfake is a video or audio recording in which a person's face or voice has been replaced with someone else's using artificial intelligence. It can look very realistic but is not genuine.
Development of artificial intelligence
Earlier, we reported that Microsoft updated its AI assistant Copilot, which can now hold conversations with users on personal topics.
Google's AI, Gemini, has temporarily suspended image generation.
We also reported that Google, Meta, TikTok, and other companies intend to label deepfakes during elections.