Terrorist live video is very difficult to track with artificial intelligence, says Facebook's AI chief



When Mark Zuckerberg was asked how Facebook would rid its platform of terrorism-related content, he said artificial intelligence would handle it all. But Facebook's chief AI scientist Yann LeCun says that screening such content with AI, especially live video, will take years.
Speaking at an event at Facebook's AI research lab in Paris last week, LeCun said that using AI to monitor Facebook's live video is still years away.
The problem is far from solved, he said.
LeCun was recently awarded the Turing Award, which is often described as the Nobel Prize of computing.



Live video is a pressing issue. The terrorist behind the Christchurch mosque attack live-streamed the shooting on Facebook. Fewer than 200 people watched the stream live, but the footage was then downloaded and spread across the internet.

Suicides broadcast over live video are also on the rise.
AI can already remove such content, but only after humans have reported it; there is currently no system that can detect and delete it automatically.
The main problem is the lack of training data, LeCun said in Paris. Thankfully, he added, there are not many examples of one person shooting another to train on.
One possible solution is to train the AI on scenes from movies, but a system trained that way would also end up removing film and video game footage shared on Facebook.
Companies like Facebook want to build automated systems that support human moderators, but human moderation comes with its own problems.
