Morris argues that universities should mine students' data to identify and prevent potentially threatening behaviors that could harm other students or faculty. He compares data mining to a crystal ball that universities could use to ensure the safety of those on campus. Additionally, he raises a potential objection: FERPA could block this kind of mining because of privacy rights and restrictions on releasing confidential student records. To counter this, he cites the case of Virginia Tech, which added a clause that would make this kind of prevention possible. Morris also points to the data mining that cookies perform for online sellers to better tailor advertising.

I agree with the sentiment that Morris presents. Oftentimes we must sacrifice some privacy for the sake of health and security. However, I think his crystal ball analogy veers into far-fetched sensationalism. It is not the data mining itself but the algorithm that interprets the data, predicts future behavior, and draws conclusions that acts as the crystal ball. There also remains the question of how reliable these algorithms actually are. Predicting human behavior is difficult, and if an algorithm is wrong, a student's life could be ruined simply because they were doing research for a criminal psychology class.

Additionally, to strengthen Morris's argument, I would point to the impersonal aspect of this technology. Since these algorithms are already run by machines, the data could be encrypted and processed entirely by machines, so that no human would ever see or interpret it. That way, only machines would run the algorithms, and the only data a person would ever examine would be the data flagged as posing a risk. This would help maintain privacy while increasing campus security.
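
To make that proposal concrete, here is a minimal sketch of what such a machine-only pipeline might look like, assuming Python, the `cryptography` package's Fernet cipher, and a crude keyword count standing in for a real predictive model; the function names, risk terms, and threshold are all hypothetical, not part of Morris's article. Records stay encrypted at rest, are decrypted only inside the automated scorer, and only records that cross the risk threshold are ever surfaced for human review.

```python
# Hypothetical sketch: records are encrypted, scored by a machine,
# and only flagged records are escalated to a human reviewer.
from cryptography.fernet import Fernet  # assumes the `cryptography` package is installed

KEY = Fernet.generate_key()
cipher = Fernet(KEY)

# Placeholder risk terms; a real system would use a trained model.
RISK_TERMS = {"weapon", "threat", "attack"}
THRESHOLD = 2

def encrypt_record(text: str) -> bytes:
    """Store student activity encrypted so no person can read it directly."""
    return cipher.encrypt(text.encode())

def risk_score(token: bytes) -> int:
    """Decrypt only inside the automated scorer and count risk terms."""
    text = cipher.decrypt(token).decode().lower()
    return sum(term in text for term in RISK_TERMS)

def screen(tokens: list[bytes]) -> list[bytes]:
    """Return only records at or above the threshold; everything else
    is never examined by a human."""
    return [t for t in tokens if risk_score(t) >= THRESHOLD]

if __name__ == "__main__":
    records = [
        encrypt_record("notes for criminal psychology class"),
        encrypt_record("planning an attack with a weapon"),
    ]
    flagged = screen(records)
    print(f"{len(flagged)} of {len(records)} records escalated for review")
```

The design choice this sketch illustrates is the gate in `screen`: decryption happens only inside the automated scoring step, so a benign record, like the criminal psychology notes above, is never exposed to anyone, while flagged records are the only ones that reach a human.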