In his essay “Mining Student Data Could Save Lives,” Morris suggests that by analyzing students’ digital activities, we could catch the oft-ignored signs of a future attack and take action before any lives are lost. At first glance, this seems like a perfect method to deter violence on campus. Sure, the students’ privacy is somewhat compromised, but the lives that could be saved are certainly worth the sacrifice, aren’t they? However, even if we could justify the morality and ethics of such a system, there are some logical faults in this data-powered “crystal ball.”
After a mass shooting, we often look at the evidence and wonder how no one noticed the signs – they seem so obvious. However, this is a classic example of hindsight bias: our tendency to see events that have already occurred as more predictable than they actually were. While some signs are indisputably concerning, such as outright threats and manifestos, many are not. Some may be subtle, standing out only in the context of the attack. Others may be difficult to gauge in severity and sincerity, especially since people tend to be emboldened on the internet. Many indicators have perfectly innocent, plausible explanations, and innocent behavior can seem sinister depending on one’s perspective. Finally, there is a risk that those who design the system will build their personal biases into it, unfairly targeting certain groups.
How do we handle this ambiguity? Do we err on the side of false positives and discrimination, or do we lean toward giving the benefit of the doubt, even at the risk of some attackers slipping through? And if a student is identified as a threat, how do we intervene, discipline, or serve justice when no crime has been committed? Perhaps there are other ways to prevent these violent acts, such as limiting students’ access to deadly weapons, building a strong community that prioritizes student care, and working to undo the societal norms, standards, and pressures that contribute to violence. Since there are many less inflammatory options, we ought to pursue them before turning to a faulty and unethical system of constant surveillance.