The popular South Korean AI chatbot Lee Luda has been removed from Facebook after being reported for racist comments and spreading intolerance toward minority groups.
It seems that a person leaves a mark on everything he creates, and so transfers his own prejudices to artificial intelligence systems. Scatter Lab, the company that created Lee, points out in its statement that its representatives "do not agree with the statements of the chatbot" and that its comments do not reflect the company's views. Although Lee was built with code meant to prevent language out of line with the values of civilized society, she learned from interacting with ordinary people, and in the process she seems to have encountered those who enjoy spreading intolerance.
Lee is not the first chatbot to learn hate speech from human beings: back in 2016, for example, Taylor Swift threatened Microsoft with a lawsuit over the views of its racist bot named Tay.
Source: PC Gamer
*The article has been translated based on the content of PC Press by pcpress.rs.