A black box without empathy | Artificial intelligence
This article tackles the importance of discussing how algorithms handle sensitive social issues such as suicide. Although social media platforms like Facebook have brought positive changes to society, when it comes to suicide prevention these platforms aren't smart enough.
Security researcher Anivar Aravind said: "Responding to such searches should not be an engineering decision. It needs a social or psychological consultation, which is absent in most tech companies. These algorithms are black boxes: except for the company, nobody knows how the product is programmed. The output of the algorithm reflects the sensibility of the product manager who wrote the program, supplemented by the human bias of the developer or company… If a search for suicide or killing is detected, the systems should identify it, and the response should be backed by human decision."
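The human-in-the-loop approach Aravind describes could be sketched as a simple routing step in front of the ranking algorithm. The keyword list, function name, and routing labels below are illustrative assumptions for this article, not any platform's actual pipeline:

```python
# Hypothetical sketch of human-in-the-loop routing for sensitive searches.
# The term list and labels are illustrative assumptions, not a real system.

SENSITIVE_TERMS = {"suicide", "kill myself", "self-harm"}

def classify_query(query: str) -> str:
    """Route sensitive queries to human review instead of automated ranking."""
    q = query.lower()
    if any(term in q for term in SENSITIVE_TERMS):
        # Escalate: e.g. show helpline information and queue for a person.
        return "human_review"
    # Otherwise, normal algorithmic handling applies.
    return "automated"

print(classify_query("how to tie a knot"))      # automated
print(classify_query("I want to kill myself"))  # human_review
```

Even a crude filter like this illustrates the point of the quote: the decision of what to show is handed to a person, not left to the opaque ranking logic alone.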
So how do you think these platforms can be made more sensitive?