“privacy as an individual right”
- privacy as control of information: controlling what private information we share with others
- free choice with alternatives and informed understanding of what’s offered
- control over personal data collection and aggregation
- privacy as autonomy: your agency to decide what’s valuable
- autonomy over our own lives, and our ability to lead them
- do you have agency?
“privacy as a social group”
- privacy as social good: social life would be severely compromised without privacy
- privacy allows social relationships to form and function
- privacy as a display of trust: privacy enables trusting relationships
- “fiduciary”: a party trusted to act in your interest, e.g. an intermediary between you and a company
- “should anyone who has access to personal info have a fiduciary responsibility?”
key trust questions
- who/what do we trust?
- what do we do if trust isn’t upheld?
- how to approach building trust
trust
trust: to stop questioning the dependability of something
- intentions
- dependence
- extensions of agency
We mostly don’t trust software; instead, we trust the people that developed the software.
accountability
many parties are accountable in this chain:
- hardware designer (Intel)
- OS developer (iOS, etc.)
- app developer
- users
stakeholders
- direct stakeholders (operators, technicians, etc.)
- indirect stakeholders: patients
purchase = long-term support: what do you do to get it fixed/repaired?
time
- support duration
- obsolescence (how long until support ends)
- products/services may not be guaranteed forever
- problems with halting use: requires deleting the entire pentagram account
meltdown vulnerability
Meltdown: a hardware vulnerability that allows a user program to read kernel-level pages of system memory.
potential ways of fixing a vulnerability/violation of trust:
https://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_ReflectionsonTrustingTrust.pdf
loss of privacy
aggregation
Through the loss of privacy, individually innocuous pieces of information can be built up piecemeal into a profile of somebody.
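A minimal sketch of how aggregation works in practice: two datasets that each seem harmless on their own can be linked on shared quasi-identifiers (here, hypothetical ZIP code and birth date fields) to re-identify an “anonymized” record. The data and field names below are invented for illustration.

```python
# Two innocuous-looking datasets (hypothetical) that share quasi-identifiers.
voter_roll = [
    {"name": "Alice Smith", "zip": "02139", "birth": "1985-04-12"},
    {"name": "Bob Jones", "zip": "02139", "birth": "1990-07-01"},
]
# "Anonymized" medical records: names removed, quasi-identifiers kept.
medical = [
    {"zip": "02139", "birth": "1985-04-12", "diagnosis": "asthma"},
]

def link(records_a, records_b, keys):
    """Join two datasets on shared quasi-identifier fields."""
    matches = []
    for a in records_a:
        for b in records_b:
            if all(a[k] == b[k] for k in keys):
                matches.append({**a, **b})
    return matches

profiles = link(voter_roll, medical, keys=["zip", "birth"])
print(profiles)  # Alice's "anonymous" medical record is re-identified
```

Neither dataset alone reveals a diagnosis tied to a name; only the aggregation does.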
exclusion
Not knowing, understanding, or controlling how our information is being used.
secondary use
Using information, without permission, for purposes other than those it was collected for.
trust
Trust exposes people to the risk of being betrayed or let down. Differential privacy is one technique used to anonymize information. For operating systems especially, each bug can have a massive impact because it affects billions of users.
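The differential privacy idea mentioned above can be sketched as the standard Laplace mechanism: a counting query has sensitivity 1, so adding Laplace noise with scale 1/ε makes the released count ε-differentially private. This is a minimal illustration, not a production implementation; the dataset and epsilon value are made up.

```python
import random

def dp_count(records, predicate, epsilon):
    """Return a differentially private count of matching records.

    A counting query has sensitivity 1, so Laplace noise with
    scale 1/epsilon yields epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    scale = 1.0 / epsilon
    # The difference of two iid exponentials is Laplace-distributed.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_count + noise

# Hypothetical data: ask "how many people are 30 or older?" privately.
ages = [23, 31, 45, 52, 29, 38, 61, 27]
print(dp_count(ages, lambda a: a >= 30, epsilon=0.5))
```

Smaller ε means more noise and stronger privacy; no single individual’s presence or absence changes the answer’s distribution by much.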
“trust means to stop questioning the dependability of something; you become vulnerable to it”
trusting software is the task of extending your own AGENCY to a piece of software: “agential gullibility”.
examples:
- iOS bug: alarms didn’t go off
- PrintNightmare: printing caused remote code execution
- 2017 macOS bug: admin access without a password
- EternalBlue (exploited by WannaCry)
key points
- trust between different stakeholders is intertwined
- trust is about extending agency
- trust emerges through various pathways
- we can design ways to partially substitute the need for trust
pathways to trust
trust by assumption
- trust absent any clues to warrant it, due to timing
- trust because there is imminent danger
trust by inference
- trust based on information you had before
- brands
- affiliation
- past performance
- trust in prior version of software
trust by substitution
- trusting something, but having a fallback plan
- trust a system because there would be a backup system protecting you
scales of trust
scale of impact
- a bug in an OS can be tremendously bad
- “root access”: privileged access
scale of longevity
- people may be on very, very old OSes
- this requires keeping older OSes secure against modern threats