> But yes, if the algorithm is suggesting pro-suicide content then the developers are morally, if not legally, responsible.
if she hadn't searched for it herself, it wouldn't have suggested it. the parents (understandably) want to blame someone for it. politics is emotion, not logic.
i mean, it kind of is. do they similarly blame chrome for showing suicide content in her autocomplete? this would never have happened if the girl hadn't searched for it herself. and if this suicide content is so powerful, how come everyone who reads it isn't killing themselves? i suspect that to search for suicide content, she may well have already been suicidal. a bold accusation, i know.