"Technology Science" (Latanya Sweeney, Chief Technologist, FTC)

Latanya Sweeney is using the Tech@FTC blog to facilitate exploration and ignite brainstorming on FTC-related topics among scholars interested in contributing innovative solutions at the intersection of technology, business, and policy. Her latest blog post is titled "Technology Science" and can be read on FTC.gov as well as here below:

Technology Science

By: Latanya Sweeney, Chief Technologist | May 2, 2014 11:02AM

Does enjoying a camcorder, a new computer, or a football game mean you have to risk personal harms like loss of privacy? Sometimes we enjoy advances in technology while keeping protections like privacy. How can we do so more often?

Before I go any further, let me advise you that I am solely responsible for this blog’s content, characterizations, ideas and choice of topic. This blog does not necessarily reflect the views of the Federal Trade Commission (FTC) or any of its Commissioners. The goal of this blog post is to spark discussion and debate.

In 1983 Sony introduced its camcorder, a mobile device for consumers to record video with sound [1]. In the United States, you can usually video a person in public without his knowledge or consent, but if you want to record his conversation, electronic wiretapping laws dictate whether you must get his permission, and requirements vary among states. The camcorder had no mute button, making it impossible for consumers to capture video without sound. Replicating that design today, most mobile phones and digital cameras still record video with sound and provide no mute option. Consequently, people routinely shoot video that may violate wiretap laws, and in some states people have faced felony charges as a result [2, 3].

For a fraction of a penny, a mute button would have harmonized camcorders with existing laws. In today's devices, a software option instead of a physical change would do the trick (see the sketch below). However, broad adoption of these devices, with no mute option, seems to be changing American social and legal norms. In a 2007 case, a political activist reportedly violated a wiretapping statute by recording video of a Boston University police sergeant during a political protest [2]. Today, legal challenges are moving in favor of individuals recording law enforcement officers in public [3]. A news article describes a case of a parent wiring his special needs child and capturing insults hurled at the child by a bus driver and teachers [4]. Last year Pennsylvania passed a new law allowing video cameras mounted on school buses to record video and sound [5]. Did the technical design of mobile sound-recording devices change societal norms?
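To make the software-mute point concrete, here is a minimal, purely hypothetical sketch of such an option in a device's recording software; the function and flag names are invented for illustration and do not describe any vendor's actual API:

```python
# Hypothetical sketch of a software mute option for a recording device.
# All names are invented; this is not any manufacturer's actual API.
def start_recording(record_audio: bool = True) -> dict:
    """Begin a capture session; the audio channel can be disabled in software."""
    return {"video": True, "audio": record_audio}

# A consumer in a two-party-consent state could simply record video only:
session = start_recording(record_audio=False)
print(session)  # {'video': True, 'audio': False}
```

A one-line software switch like this is all the "mute button" amounts to, which is why the change would cost a fraction of a penny.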

In January 1999, Intel wanted to enhance the security of online communications by embedding an unalterable unique number in the processor of personal computers [6]. Their idea was that when a computer engaged in communication over the Internet, the communication would include the processor's unique identifier, thereby associating the computer's actions with its physical processor. Sales and usage records could further relate the processor to an owner and location. Within two months of the announcement, privacy and consumer groups filed complaints with the FTC [7]. Their complaints argued that a unique number could also allow unknown observers to track computer use across the Internet even if the user was not being malicious or doing anything wrong. Supplemental complaints explained that surreptitiously tracking the movements of individuals on the Internet violated expectations of anonymity, fairness, and control over personal information [8].

By April 2000, Intel acquiesced and abandoned its original technological approach due to privacy concerns. Today, mobile phones have unique numbers – media access control or "MAC" addresses – used for networked communication beyond the phone number. Retailers and others are experimenting with using these unique identifiers to track the whereabouts of mobile phones in physical space (see earlier post). Is this a case of social norms shifting about tracking, or of society not yet being aware of this aspect of wireless mobile communications?

In January 2001, police in Tampa, Florida tested facial recognition technology during Super Bowl XXXV, scanning the faces of people in crowds and comparing them with images in a database of digital mug shots. A few months later, a council member in Jacksonville, Florida introduced legislation banning the use of facial recognition technology by the Sheriff's Office and other city agencies [9]. Other city councils and legislatures considered similar legislation, but the events of September 11, 2001 dramatically reversed the projected demise of facial recognition technology.

A decade later, facial recognition technologies have flourished. Contexts range from online social networks and mobile apps to digital signs. According to a 2012 FTC report, uses of facial recognition now include determining an individual's age range and gender in order to deliver targeted advertising; assessing viewers' emotions to see whether they are engaged in a video game or a movie; and matching faces to identify anonymous individuals in images [10]. Is facial recognition now harmonized with society, or might planned uses risk future disruption and abandonment?

As a final example, consider the case of credit card purchases over the Internet. In 1994, almost as soon as web browsers became available, retail merchants sought to sell items using the Internet, spawning the phrase electronic commerce (or “e-commerce”). The problem was how to buy goods online without allowing an eavesdropper to learn credit card information. Early attempts were awkward and clumsy. Some merchants required shoppers to download and use special programs before they could transmit their credit card numbers [11]. Other merchants required part of the transaction to be conducted by telephone. Computer scientists already had strong encryption methods; the biggest problem was the lack of a standard for incorporating encryption technology into web browsers. Finally, in 1994 developers at Netscape introduced the Secure Sockets Layer (SSL) protocol as a cryptographic means of securing data transfers over the Internet [12]. By 1996, Netscape announced the first web browser that included SSL, letting users send encrypted messages to Web servers that supported it. The process was seamless. SSL became the standard immediately and remains the fundamental means of securing transactions over the Web. In 2006, U.S. shoppers spent $211 billion online using SSL [13]. (For a recent SSL update, see [14].) Is this a case where society enjoys utility with protection?
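For readers curious what that seamless protection looks like in code, here is a minimal sketch using Python's standard ssl module to wrap an ordinary TCP connection in TLS, the modern successor to SSL; the host name and request are placeholders:

```python
import socket
import ssl

# TLS (SSL's successor) wraps an ordinary TCP socket so that everything sent
# through it is encrypted; an eavesdropper sees only ciphertext, never the
# credit card number.
context = ssl.create_default_context()  # also verifies the server's certificate

with socket.create_connection(("example.com", 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="example.com") as tls_sock:
        # From here on, the application reads and writes as usual -- the
        # "seamless" property that helped SSL win immediate adoption.
        tls_sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        print(tls_sock.recv(4096)[:200])
```

The application code is nearly identical to what it would be over a plain socket, which is exactly why shoppers and merchants could adopt SSL without changing how they worked.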

In the historical cases above, technology designs and uses clashed with societal rules, public concerns, and user acceptance. Outcomes included revised laws, technology disruptions, inconsistency, and an uncertain future. The exception is SSL, a technical innovation that resolved a clash. How can we do better? Is technology the solution, as the SSL example might suggest? Do we need other kinds of knowledge too?

The costs of technology-society clashes are high to businesses and to society, so it is not surprising that others have previously sought cross-disciplinary approaches.

Privacy By Design

Privacy by Design is an initiative that started in the 1990s to encourage technology developers and technology and data businesses to integrate privacy throughout the development of emerging technologies. Originating in 1995 [17], privacy by design has seven foundational principles focused on privacy [18] that seem to generalize to the full spectrum of social issues. The first principle encourages brainstorming about and assessing possible outcomes early, before a clash occurs. The other principles encourage:

  • setting default options on technology to the most protective available for society (see the sketch below);
  • assessing societal issues throughout the design process;
  • pursuing win-win options that deliver the benefits of new technologies without trading utility for societal harm;
  • practicing proper data security and management throughout the life cycle of any personal data collected;
  • keeping technology and business practices transparent; and
  • maintaining a user-centric focus.
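As a toy illustration of the most-protective-default principle, consider a settings object whose every field defaults to the safest value available; the fields and values below are invented purely for illustration:

```python
from dataclasses import dataclass

# Hypothetical illustration of "privacy as the default setting": every field
# defaults to the most protective value, so a user who never opens the
# settings screen is protected automatically.
@dataclass
class TrackingSettings:
    share_location: bool = False     # off unless the user opts in
    personalized_ads: bool = False   # off unless the user opts in
    retain_history_days: int = 0     # keep nothing unless the user opts in

print(TrackingSettings())  # protective out of the box
```

The point is structural: protection should not depend on the user finding and flipping the right switches.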

The onus is on technology and data companies to operationalize the privacy by design principles within their organizations, specific to their technologies or personal data uses. How has uptake been? It is not clear, but one of my earlier posts provides an anecdotal example. I examined billing practices related to in-app purchases on mobile phones. Ann Cavoukian, then Privacy Commissioner of Ontario, Canada, and Marilyn Prosch issued an advisory report aimed at engineering privacy into mobile technologies [19]. The industry did not seem to adopt the recommendations, even though they reportedly came from an expert panel of top executives spanning many facets of the mobile communications industry. Had the industry adopted them, the issue reported in the blog post about in-app purchases might not have occurred.

Add Technologists and Policy Experts

By the end of the 1990s, technology and data-rich businesses began hiring policy and legal experts, and policy groups sought technologists. While organizations added experts at all levels, senior-level appointments are likely the best indicators of purpose and training.

Technology companies and information-intensive businesses were among the first organizations to hire Chief Privacy Officers (or CPOs). A primary goal of a chief privacy officer is to make sure the company complies with privacy laws and regulations. The job is critical to reducing a company's legal liability in the face of increased regulatory privacy efforts, so most chief privacy officers are lawyers. Harriet Pearson may have been the first Chief Privacy Officer at a Fortune 500 company when IBM hired her in 2000 [20]. By 2005, reportedly 74 of the Fortune 500 companies had chief privacy officers [21]. Today, the International Association of Privacy Professionals, an international professional organization that certifies and offers resources to privacy managers, boasts more than 14,000 members in 83 countries [22]. The chief privacy officer should have enough power to affect business strategy and operating procedures, and thereby make privacy concerns an integral part of new systems and products. Nonetheless, at least one article posits that chief privacy officers are not advocates for individual rights but must represent the interests of their organizations [23].

Recently, policy organizations began appointing Chief Technology Officers (or CTOs) in order to gain a better understanding of emerging technologies and to improve responses to technology-society clashes. An information technology website describes the job of a chief technology officer in a technology business as overseeing technological innovation, managing research and development, identifying opportunities and risks, and creating relevant policy [24]. The job usually requires deep technical knowledge. In policy organizations, however, the chief technology officer does not have the same kinds of responsibilities, so does the chief technology officer need the same deep understanding of technology?

In 2009, the White House appointed Aneesh Chopra as the first Chief Technology Officer of the United States [25]. A news venue described the job as working with applied technology (“as is” or with minor modifications) to achieve national priorities [26]. Many praised his appointment [27], and he may be a remarkable, accomplished technologist, but does his background generalize for training other CTOs? Reportedly Mr. Chopra has an undergraduate degree in public health, health policy and management and a graduate degree in public policy, but he had previously worked in a similar information technology position [28].

All FTC chief technologists to date have had doctorates in computer science. The FTC consists overwhelmingly of lawyers and economists who work to prevent business practices that are anticompetitive, deceptive, or unfair to consumers. The FTC appointed Ed Felten as its first Chief Technologist in 2010, followed by Steven Bellovin in 2012, and me this year. In general, computer scientists can create new technologies (“what could be”), whereas information technologists tend to accept technology “as is”. In a policy organization, a chief technology officer represents the interests of his organization, of course, but his training may forecast the depths at which he can engage technology and offer technical leadership.

Teach Information Technology and Public Policy

Several universities have embarked on studies that combine technology, law, and policy. They tend to examine policy issues related to information technology, such as privacy, broadband, intellectual property, free speech, competition, telecommunications, and management of the Internet. Students usually have a background in either technology (computer science, engineering) or the social sciences (especially economics, politics, and sociology). There may be no academic programs dedicated to this area, but schools offering a course of this type include Princeton [29], the University of Washington [30], the University of Maine [31], and Boise State University [32]. Additionally, scholars have written about the policy implications of information technology (e.g., [33, 34]).

These courses and publications seem multidisciplinary in nature. In multidisciplinary research, investigators from two or more disciplines work on a common problem, but do so without altering their disciplinary approaches or developing a shared methodology. Think tanks offer good examples. Industry and the military often fund multidisciplinary research as a means to solve narrowly defined problems quickly. Problems usually subdivide along disciplinary lines, and researchers within each discipline then work somewhat alone on their sub-problems. In multidisciplinary courses, students often take up problems from another discipline's perspective for educational purposes. Multidisciplinary training of this kind seems to be a way to increase technical awareness among lawyers and policy awareness among technologists, but it may fall short of developing unified knowledge.

A Call for an Interdisciplinary, Scientific Approach

Perspectives of law, policy, and business are already represented in technology-society clashes, so I will not take time to discuss the importance of knowledge from these areas; I will assume it is understood. Perhaps a blend of computer, social, and political sciences can help.

My PhD is in Computer Science and my research mission at Harvard is a unique interdisciplinary exploration to create and use technology to assess and solve societal, political and governance problems. During my few months at the FTC, I have already blogged about some current issues and brainstormed on some lightweight alternative approaches that blend disciplines, so I will draw on these as simple examples of what other disciplines might contribute.

How can technology help? In the camcorder and Intel processor cases above, by the time the clash occurred, it was too late to change the technical designs of the shipping products, leaving society in a take-it-or-leave-it position. Society split the score evenly between those two cases – it took the camcorder and left the processor identifier – anecdotally suggesting the odds are 50-50. However, emerging uses, such as mobile phone tracking and applications of facial recognition, provide ample opportunities for exploring technical alternatives that may improve the odds of acceptance.

As an example, in an earlier post, I described how some retail stores are experimenting with tracking probes from mobile phones as consumers roam around stores. The conventional approach is an “opt-out registry”: participating stores post signs to alert consumers that they are tracking mobile phones, with instructions for opting out – the consumer visits a website and enters the unique number associated with his phone so that the store removes his information once captured (see the sketch below). I engaged in brainstorming activities to offer some alternative approaches. One alternative fused app technology with existing loyalty programs to inspire “loyalty apps”, so that participating consumers may get rewards for participation. Another suggested a store might host a special Wi-Fi network for direct communication between the store and in-store consumers using mobile phones; instead of Internet access, the store could offer just-in-time specials based on the consumer's physical location in the store. A third approach described how consumers could control the probes that emit from their mobile phones to decide whether to participate. These nuanced approaches used technical insight harmonized with business models. I am not claiming that any of them necessarily poses the best solution, nor that only those listed are worthy of consideration. Instead, I am using them as simple exemplars to show how technical insight helped uncover nuanced solutions that might otherwise go unconsidered.
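To ground the opt-out registry idea, here is a minimal sketch of how a store might filter captured phone identifiers against such a registry. The names and data are invented, and the protective touch of storing only hashes is my own illustrative addition, not something the post prescribes:

```python
import hashlib

# Registry of (hashed) MAC addresses of consumers who opted out.
OPT_OUT_REGISTRY = {hashlib.sha256(b"aa:bb:cc:dd:ee:ff").hexdigest()}

def record_probe(mac_address: str, location: str, log: list) -> None:
    """Log a probe observation unless the phone's owner has opted out."""
    digest = hashlib.sha256(mac_address.lower().encode()).hexdigest()
    if digest in OPT_OUT_REGISTRY:
        return  # consumer opted out: drop the observation entirely
    log.append((digest, location))  # store only the hash, never the raw MAC

observations: list = []
record_probe("AA:BB:CC:DD:EE:FF", "aisle 7", observations)   # dropped
record_probe("11:22:33:44:55:66", "entrance", observations)  # kept
print(observations)
```

Even in this toy form, the design choice is visible: the registry only protects consumers if the store faithfully checks it against everything it captures.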

Scientific know-how can help too. Scientific methods consist of naturalistic observation, surveys, interviews, case studies, and experimentation – the kinds of methods popularized in the social sciences. While much of computer science relies on engineering and algorithmic methods, there have been studies using social science methods to assess the context in which a technology operates. Topics include: inequitable Internet access (“the digital divide”) [35], the social interactions of people using the Internet [36], privacy tradeoffs on the Internet [37], how teenagers use the Internet [38], and discrimination in online ad delivery [39].

In another earlier post, I described prior work that discovered data sharing arrangements of personal health data. My team at Harvard leveraged public records requests to learn about flows of personal health information that would otherwise remain hidden. We compiled a diagram that detailed flows of data sharing, and then used the diagram to identify a possible privacy risk. Conducting an experiment allowed us to quantify the risk, which led to real-world policy improvements, including changes to law. Critical activities relied on survey and experimentation, which are scientific methods.
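A flow diagram like the one we compiled can be treated as a directed graph and queried mechanically for risk. The sketch below, with invented entities and edges, asks where data originating at a hospital can ultimately travel – the kind of question that surfaces a privacy risk:

```python
# Toy sketch: data-sharing arrangements as a directed graph. Entities and
# edges are invented for illustration; they are not the study's actual data.
flows = {
    "hospital": ["state agency"],
    "state agency": ["researcher", "data broker"],
    "data broker": ["marketer"],
}

def reachable(source: str, graph: dict) -> set:
    """Return every entity that data from `source` can flow to."""
    seen, stack = set(), [source]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

print(reachable("hospital", flows))
# e.g. {'state agency', 'researcher', 'data broker', 'marketer'} (set order varies)
```

A recipient several hops from the hospital – here, the marketer – may never appear in any single sharing agreement, which is why drawing the whole diagram matters.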

Studies using social science methodology might have been able to forecast societal reactions to the camcorder, the Intel processor, and facial recognition technology early, before the clashes. As new uses of facial recognition technology unfold, opportunities exist to infuse scientific knowledge and navigate toward better outcomes.

Possible solutions and new scientific knowledge must be vetted. To avoid a lengthy discussion of ways to do so, let me share one possibility with you. Michael Smith, James Waldo and I teach students at Harvard to identify nuanced solutions and vet possibilities using threat modeling and risk assessment tools. We have done so for years in our popular “Privacy and Technology” course, and more recently in my “Data Science to Save the World” course. Threat modeling involves brainstorming about issues that may occur and ways to address them. A risk assessment assigns relative values to possible threats and solutions in order to compare costs, likelihoods, and benefits. To achieve optimal outcomes across all stakeholder interests, risk assessments are best done globally or in a scientific community. Why? Consider the camcorder case. A risk assessment done solely from the company's perspective might have identified the conflict beforehand and correctly concluded that it would take years to unfold and therefore posed no penalty to the company. I am not saying that Sony or any other camcorder manufacturer actually made such a determination – in fact, it is unlikely that they did – but I am using this hypothetical to show why threat modeling and risk assessment are best done considering the interests of all stakeholders instead of just one.
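As a toy illustration of the arithmetic, a risk score can be computed as likelihood times impact, summed over whichever stakeholders are in scope. The threats and numbers below are invented solely to show how a company-only assessment can diverge from a global one:

```python
# Toy risk assessment: expected risk = likelihood x summed stakeholder impact.
# Threat names, likelihoods, and impact scores are invented for illustration.
threats = {
    "wiretap-law clash": {"likelihood": 0.6,
                          "impact": {"manufacturer": 1, "consumers": 8, "society": 7}},
    "lost sales":        {"likelihood": 0.2,
                          "impact": {"manufacturer": 9, "consumers": 1, "society": 1}},
}

def expected_risk(threat: dict, stakeholders: list) -> float:
    """Likelihood times total impact across the chosen stakeholders."""
    return threat["likelihood"] * sum(threat["impact"][s] for s in stakeholders)

for name, t in threats.items():
    print(f"{name}: company-only={expected_risk(t, ['manufacturer']):.1f}, "
          f"all stakeholders={expected_risk(t, ['manufacturer', 'consumers', 'society']):.1f}")
```

Scored from the manufacturer's seat alone, the wiretap clash looks negligible (0.6); scored globally, it dominates (9.6) – the asymmetry the hypothetical above describes.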

Finally, there is the decision-making apparatus itself. The power of different stakeholders varies among situations. Once empowered by the particulars of a clash, a stakeholder can have more say in the outcome. No one group – policy makers, enforcers, businesses, or advocates – has global authority, and limited perspectives can bias results. An enforcement agency could not require a camcorder manufacturer to add a mute button, but if a policy organization had scientific-legal knowledge of the pending impact, it could offer advice to the company and educate consumers.

Technology Science

A science is the intellectual and practical study of phenomena to produce an organized body of knowledge. Usually when we think of “science”, we think of naturally occurring phenomena like the stars and sky or the human body. In technology-society clashes, the phenomenon is the organic relationship between a use of a technology and related businesses, individuals, and societies. This mix generates its own wonder, worthy of study. Such study seems critical to navigating a future with fewer disruptions to technology investments, better informed policy decisions, and less unwanted erosion of societal norms.

Perhaps I am talking about a new discipline, which I might coin Technology Science. A discipline is a corpus of knowledge and a system of problem-solving methods, techniques, and skills. Examples of disciplines include anthropology, psychology, and zoology. The life of a discipline is dynamic. Research often creates specialized fields of study that evolve into disciplines. Once a discipline solves all of its key problems, or its problems are no longer valued, it may dissolve. The knowledge and know-how of disciplines do not die, however; newer or broader disciplines absorb them.

A discipline often seeks to answer an over-arching question. In the case of Technology Science, that question seems to be: “How does society enjoy the benefits of new technologies without societal harms?” Suggested activities in Technology Science include:

  • Conducting scientific experiments that provide useful knowledge to decision-makers
  • Navigating emerging technologies and early technical designs to avoid clashes
  • Exploring techno-business-policy alternatives when clashes occur
  • Constructing technological solutions to improve adoption of other technologies
  • Improving decision-making about emerging technologies and uses
  • Developing forensic-like tools for investigations and policy enforcement
  • Building assessment tools and instruments
  • Appraising trust relationships between technology and individuals
  • Weaving policy and technology together
  • Forecasting societal outcomes
  • Formulating methods for validating efforts and principles for generalizing results
  • Transferring vetted practices and standards to real-world practice
  • Educating students, area experts and the public

Interdisciplinary and multidisciplinary ways of conducting research play important roles in the creation and development of disciplines. In interdisciplinary research, researchers from multiple disciplines synthesize their disciplinary approaches to better identify, formulate, and solve a shared problem. Typically, the resulting approach leads to changes in problem solving in the originating disciplines and/or introduces a new field. In fact, interdisciplinary research is fundamental to the creation of most new fields. There are numerous examples. Bioinformatics combines molecular biology with computer science. Other disciplines that began as interdisciplinary research include biochemistry and biomedical engineering. Even computer science itself started as an interdisciplinary effort that fused electrical engineering and mathematics. In comparison to multidisciplinary efforts, researchers conducting interdisciplinary research tend to focus on all aspects of the same problem jointly, even though they do so from different approaches initially.

Technology Science would be an interdisciplinary pursuit. A full-spectrum interdisciplinary approach that includes computer science and technology governance seems ideal. The term “governance” here is not limited to government or public policy; it includes all the ways technology and data are governed (including by technology design). Likewise, those doing the governing are not limited to governments, but include other entities like companies and markets.

What Do You Think?

Is there a new discipline on the horizon? Can some of the earlier approaches work better? This inquiring mind wants to know what you think. Perhaps you have your own approach to describe, a technology-society clash to discuss, or a comment to make.

To help build competency and knowledge in this area, the FTC is launching new Fellowship programs across levels of academic training and disciplinary boundaries. The first program, Summer Research Fellowship in Technology and Data Governance (www.ftc.gov/summerfellows), starts this summer at the FTC.

References

1. About Sony Videocamera. Sony Corporation. http://www.sony.net/SonyInfo/CorporateInfo/History/sonyhistory-f.html

2. Miles T. BU protester fined, could face jail time. The Daily Free Press. December 6, 2007. http://dailyfreepress.com/2007/12/06/bu-protester-fined-could-face-jail-time/

3. Hudson D. Good Cop, Bad Citizen? As Cellphone Recording Increases, Officers Are Uneasy. ABA Journal. March 1, 2012. http://www.abajournal.com/magazine/article/good_cop_bad_citizen_as_cellphone_recording_increases_officers_are_uneasy

4. Mulvihill G. Wire-Tapping Disabled Children At School A New Trend?. Associated Press. Huffington Post. April 25, 2012. http://www.huffingtonpost.ca/2012/04/25/disabled-needs-children-wire-tapping_n_1453822.html

5. New State Law Allows School Buses To Record Sound. CBS News Pittsburgh. March 24, 2014. http://pittsburgh.cbslocal.com/2014/03/24/new-state-law-allows-school-buses-to-record-sound/

6. Gelsinger P. Intel Keynote Address. RSA Data Security Conference and Expo 1999. January 20, 1999. Archived: http://web.archive.org/web/19990421114426/http://intel.com/pressroom/archive/speeches/pg012099.htm (Original: www.intel.com/pressroom/archive/speeches/pg012099.htm)

7. Center for Democracy and Technology, Consumer Action, Privacy Rights Clearinghouse, and Private Citizen, Inc. Complaint and Request for Injunction, Request for Investigation, and for Other Relief. Before the Federal Trade Commission, Washington, DC 20580. February 26, 1999. Archived: http://web.archive.org/web/20010304191327/http://www.cdt.org/privacy/issues/pentium3/990226intelcomplaint.shtml (Original: www.cdt.org/privacy/issues/pentium3/990226intelcomplaint.shtml)

8. Center for Democracy and Technology, Consumer Action, Privacy Rights Clearinghouse, and Private Citizen, Inc. Supplement to Intel Complaint: Potential Harm to Individuals. Before the Federal Trade Commission, Washington, DC 20580. April 8, 1999. Archived: http://web.archive.org/web/20090512005702/http://www.cdt.org/privacy/issues/pentium3/990408intelcomplaintsupp.shtml (Original: www.cdt.org/privacy/issues/pentium3/990408intelcomplaintsupp.shtml)

9. Galnor M. Police Snooper Camera Fight Still Alive; Councilwoman Wants Facial Recognition Ban. The Florida Times-Union. August 31, 2001. Archived: http://www.questia.com/read/1G1-77759652/police-snooper-camera-fight-still-alive-councilwoman

10. Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies. Staff Report. U.S. Federal Trade Commission. October 2012. http://www.ftc.gov/sites/default/files/documents/reports/facing-facts-best-practices-common-uses-facial-recognition-technologies/121022facialtechrpt.pdf

11. Gilbert A. E-commerce turns 10. CNET News.com. August 11, 2004. http://news.cnet.com/E-commerce-turns-10/2100-1023_3-5304683.html

12. Hickman K. The SSL Protocol. Netscape Communications Corp. February 9, 1995. http://tools.ietf.org/html/draft-hickman-netscape-ssl-00

13. The State of Retailing Online 2006. A Shop.org study by Forrester Research. National Retail Federation. 2006. http://www.nrf.com/Attachments.asp?id=9510

14. Earlier this month, the public learned of a vulnerability in the popular library OpenSSL, which provides SSL communications in popular online applications such as financial transactions and email. Dubbed the Heartbleed Bug, it allows attackers to eavesdrop on communications, steal data directly from online services, and impersonate services and users [15]. The Heartbleed Bug is not endemic to SSL, but to this specific implementation of SSL. As long as the vulnerable version of OpenSSL is in use, abuses can occur. System administrators are installing fixes widely, but abuses can occur on any system not updated with the fix [16].

15. OpenSSL Security Advisory: TLS heartbeat read overrun (CVE-2014-0160). April 7, 2014. https://www.openssl.org/news/secadv_20140407.txt

16. The Heartbleed Bug. April 2014. http://heartbleed.com/

17. Hes R. Privacy Enhancing Technologies: white paper for decision-makers. Dutch Data Protection Authority. December 2004. (Earlier reference: Hes R and Borking J. Privacy enhancing technologies: the path to anonymity. September 1998.) http://www.dutchdpa.nl/downloads_overig/PET_whitebook.pdf

18. Cavoukian A. 7 Foundational Principles of Privacy by Design. Information and Privacy Commissioner/Ontario.

19. Cavoukian A and Prosch M. The Roadmap for Privacy by Design in Mobile Communications: A Practical Tool for Developers, Service Providers, and Users. December 2010. (Also appears in Future Challenges in Security and Privacy for Academia and Industry. 26th IFIP TC 11 International Information Security Conference. SEC 2011. Lucerne, Switzerland.) http://www.privacybydesign.ca/index.php/paper/the-roadmap-for-privacy-by-design-in-mobile-communications-a-practical-tool-for-developers-service-providers-and-users/

20. IBM Names Harriet P. Pearson as Chief Privacy Officer. IBM Press Release. November 29, 2000. http://www-03.ibm.com/press/us/en/pressrelease/1464.wss

21. Shalhoub Z. Analysis of Industry-Specific Concentration of CPOs in Fortune 500 Companies. Communications of the ACM. v52(4) April 2009.

22. About the IAPP. International Association of Privacy Professionals. https://www.privacyassociation.org/about_iapp

23. Hasson J. 3 principles for chief privacy officers. FCW. September 5, 2005. http://fcw.com/articles/2005/09/05/3-principles-for-chief-privacy-officers.aspx

24. Chief Technology Officer. TechTarget SearchCIO. http://searchcio.techtarget.com/definition/Chief-Technology-Officer-CTO

25. White House Profile: Aneesh Chopra. The White House Blog. http://www.whitehouse.gov/blog/author/Aneesh%20Chopra

26. MacNeil J. 1st Chief Technology Officer of the US is named, April 18, 2009. EDN Network. April 18, 2014. http://edn.com/electronics-blogs/edn-moments/4412327/1st-Chief-Technology-Officer-of-the-US-is-named--April-18--2009

27. O'Reilly T. Why Aneesh Chopra is a Great Choice for Federal CTO. Radar. April 18, 2009. http://radar.oreilly.com/2009/04/aneesh-chopra-great-federal-cto.html

28. Aneesh Chopra: About. Facebook. https://www.facebook.com/apchopra/info

29. Felten E. WWS 586F / COS 586: Information Technology and Public Policy: Reading List. Princeton University. Spring 2010. https://www.cs.princeton.edu/courses/archive/spring10/cos586/reading.html

30. Lazowska E and Maurer S. CSE P 590TU: Information Technology & Public Policy. University of Washington. 2004. http://courses.cs.washington.edu/courses/csep590/04au/

31. Information Technology and Public Policy. University of Maine. https://umaine.edu/pubadmin/graduate-programs/

32. PUBADM 512 Information Technology and Public Policy. Boise State. https://sspa.boisestate.edu/publicpolicy/course-descriptions/

33. Lessig L. Codev2. http://codev2.cc/

34. Janisch H. Information and Public Policy: Regulatory Implications for Canada. Osgoode Hall Law Journal. v20(3). September 1982. http://digitalcommons.osgoode.yorku.ca/cgi/viewcontent.cgi?article=1989&context=ohlj

35. Exploring the Digital Divide: Charting the Terrain of Technology Access and Opportunity. Computer Science and Telecommunications Board (CSTB) and the Division of Behavioral and Social Sciences and Education (DBASSE). 2001. Archived: http://web.archive.org/web/20061202060703/http://www7.nationalacademies.org/cstb/wp_digitaldivide.pdf (Original: www.nationalacademies.org/cstb/whitepaper_digitaldivide.html)

36. Rice R and Katz J. Social Consequences of Internet Use: Access, Involvement, and Interaction. The MIT Press, Cambridge. 2002.

37. Acquisti A and Grossklags J. Privacy and Rationality in Decision Making. IEEE Security and Privacy. January/February 2005. Pages 24-30. http://csis.pace.edu/~ctappert/dps/d861-09/team2-3.pdf

38. Boyd D. It's Complicated: the social lives of networked teens. Yale University Press. 2014.

39. Sweeney L. Discrimination in Online Ad Delivery. Communications of the ACM. v56(5). May 2013. Pages 44-54. http://cacm.acm.org/magazines/2013/5/163753-discrimination-in-online-ad-delivery/fulltext