Lesson Plan: Privacy & Ethics

Day 1: 

Readings: Prior to class, students will read “Big Data: The End of Privacy or a New Beginning?” by Ira S. Rubinstein. Because the material for this unit is so extensive, we will take a two-part approach, splitting the unit across two lectures, to cover all that we can. Our first lecture will address privacy in a big data world. Rubinstein’s article serves as an effective introduction to doctrine, discourse, and application, and we hope it will encourage productive discussion and help students connect the concepts.

In Class:

  1. Group Activity: We’ll kick things off with an activity to get the class engaged early. The directions are as follows: everyone writes a “secret” (anything from a fun fact to their deepest, darkest secret) on a slip of paper, with the caveat that they don’t have to share it if they don’t want to. Then we ask who is willing to make their secret public in exchange for a high five (simulating the “sale” users make with Facebook or Twitter), and we post whatever they wrote on our wall. We then up the ante by offering a Jolly Rancher. At the end, we announce that we’re going to collect the slips of paper anyway, metaphorically showing that these apps collect data even when you don’t want them to. (We won’t actually collect them, in case someone did write their deepest, darkest secret.)
  2. Activity Reflection: We’ll discuss the real-world applications of our activity with the class, using the terms and conditions of Facebook’s Messenger app as our primary example. Other examples we will tie in include Angry Birds, Candy Crush, Doodle Jump, and Pandora.
  3. Key Terms: Before going any further into the class material, we will briefly review some key terms to give the class an idea of the direction we’re headed. We will define privacy as explained by the Oxford English Dictionary. From there, we will segue into an oddly humorous four-minute video that situates the ideology of privacy in a big data context. Once the video has concluded, we will go over Fair Information Practices, the European Union Data Protection Directive, and Personal Data Services; understanding each of these is integral to the assigned reading.
  4. Visualization: Once we’ve defined some key terms, we’ll briefly look at an infographic provided by the Pew Research Center. The infographic, which presents statistics on social media management, should make the material more tangible and familiar, given how pervasive social media is.
  5. Discussion: Having covered the background information, we’ll jump into the first of two discussions. The questions are designed to be relevant enough to encourage the kind of participation that fosters a healthy discussion. The class will be broken into six groups, each assigned one question to consider and discuss. After a few minutes, we will reconvene as a class and review what was discussed.
  6. Reading Review: Once we’ve wrapped up the first discussion, we’ll present the material from the Rubinstein article. We’ll outline the threats big data poses to privacy, the “database of ruin,” and contemporary proposals for addressing big data.
  7. Discussion: After the literature has been reviewed, we’ll jump into our second and final discussion for the day. The class will once again be broken into six groups, each assigned one question; this time, however, the questions will be geared more toward the reading. After a few minutes, we’ll come back together as a class and share our thoughts and opinions.
  8. Closing Activity: We’ll conclude our lecture on privacy in a big data world with another class activity. The directions are as follows: Break into groups. Using the materials provided (a blank sheet of paper and markers), create a Venn diagram with three circles: “information you prefer to keep private,” “information you generally make public,” and “information that can be useful to third parties.” Produce one Venn diagram per group. The object of this activity is to get the class thinking about the value of their data and who does have access to it versus who should have access to it. We will collect the diagrams as class dismisses and share the combined results at the beginning of Day 2.

Homework: Read “Hacker Politics and Publics” by Gabriella Coleman and “The NSA and Edward Snowden: Surveillance in the 21st Century” by Joseph Verble. Come to class prepared to discuss.

*The specific definitions of key terms, questions for discussion, and notes on the readings can be found in the Prezi.*


Day 2: 

Readings: Prior to class, students will read “Hacker Politics and Publics” by Gabriella Coleman and “The NSA and Edward Snowden: Surveillance in the 21st Century” by Joseph Verble. In the same vein as our first lecture, our second will address the ethical implications of privacy in a big data world. Coleman’s and Verble’s articles will introduce the class to some of the major themes, players, and situations that illustrate the not-so-black-and-white nature of big data ethics.

In Class:

  1. Recap: Using Lucidchart, we will combine the information from the closing activity of the previous class into one big Venn diagram (a scripted alternative is sketched after this list). What we hope to drive home is that there is a lot of overlap between private data, public data, and data that can be useful to third parties. Hopefully this will lead students to look at their data in ways they hadn’t previously.
  2. Key Terms: To start off the second half of our unit, we will briefly review some key terms to give the class an idea of the framework we’re working with. We will define ethics and hacking as explained by the Oxford English Dictionary. Alongside our definitions, we will show a graphic that visualizes where ethical position fits among questions such as “What can be done technically?”, “What would an organization like to do?”, and “What can be done legally?” Understanding each of these terms and questions is integral to the assigned reading.
  3. Discussion: Following our introduction of key terms and concepts, we’ll jump into the first of two discussions. The class will be broken into four groups and given a list of questions organizations should consider in order for their big data collection to be ethical. After a few minutes, during which the groups discuss how satisfied they are with the list, we will reconvene as a class and review what was discussed. We will ask each group to share with the rest of the class which questions it would keep, amend, and/or add.
  4. Key Players: Once we’ve wrapped up the first discussion, we’ll take a look at three important figures in the grand scheme of our topic: Julian Assange, Edward Snowden, and the hacktivist group Anonymous. Providing the class with some context and background on each of these, while putting faces to names, will set up the review of the assigned readings.
  5. Reading Review: Following the key players overview, we’ll present the material from the Coleman and Verble articles respectively. We’ll outline hackers and their role in the techno-political climate, and we’ll also discuss the NSA and government surveillance.
  6. Discussion: After the literature has been reviewed, we’ll jump into our second and final discussion for the day. The class will once again split into four groups; this time, however, each group will be assigned a number corresponding to a question it must answer based on the content of the readings. After a few minutes, we’ll come back together as a class and share our thoughts and opinions.
  7. Debate: We’ll conclude the second half of our unit on privacy and ethics with a class debate that should put both in perspective. The class will be split in two, with one half assigned the pro-data-monitoring position and the other half the opposing position. From there, the class will debate three questions, and a “winner” will be chosen based on effort and soundness of argument.
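
For the Day 2 recap above, we plan to redraw the combined diagram by hand in Lucidchart. As an optional alternative, the minimal sketch below shows how an instructor could script the combined diagram instead; it assumes the third-party Python packages matplotlib and matplotlib-venn, and the region counts are placeholders to be replaced with the actual tallies from the collected Day 1 diagrams.

```python
# A minimal sketch (assuming the matplotlib and matplotlib-venn packages are
# installed) for building the combined class Venn diagram from the Day 1
# activity. The region counts below are placeholders, not real class data.
import matplotlib.pyplot as plt
from matplotlib_venn import venn3

# Tallies of items the groups placed in each region, in venn3's expected order:
# (private only, public only, private & public,
#  third-party only, private & third-party, public & third-party, all three)
region_counts = (4, 6, 2, 1, 3, 5, 7)

venn3(subsets=region_counts,
      set_labels=("Kept private", "Made public", "Useful to third parties"))
plt.title("Combined class Venn diagram (Day 1 activity)")
plt.savefig("class_venn.png")
```

Either way, the point of the recap is the same: showing students how much of what they consider private overlaps with what is public and what is valuable to third parties.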

Homework: 

Blog Post: 

As we discussed in class, many apps access nearly every aspect of our phones without our realizing it, even though we technically agreed to it. Explore the “Terms and Conditions” of an app that has not been previously discussed in class. Look through the user agreement and identify at least three methods the app uses to collect data that users would not typically expect. What do you think the app is using that information for? Why might the information be useful to other parties? Is it ethical for the app to gather that information, knowing that, as stated in the Ira Rubinstein article, most people don’t read the Terms and Conditions? Also, should this be considered hacking?

Further, think about privacy and ethics by putting yourself in the shoes of one of the famous hackers. Is it your civic duty to reveal corruption with the data you find? Would you do it? All of the figures we discussed in class now have a far more limited quality of life than they had before; is it worth it? With the Panama Papers data hack in mind, read up on the current event and decide what you believe to be ethical in this case. Was it ethical to hack that data in the first place? And is it ethical to put people on trial based on data that is, in a way, stolen?
Keeping the unit readings in mind, write a 400-500 word response addressing each of these questions, and pose some of your own.

*The specific definitions of key terms, questions for discussion, notes on the readings, and debate items can be found in the Prezi.*