Sample Privacy Risk Assessment: Example and Explanation





Privacy impact assessments (PIAs) are a tool for identifying and reducing privacy risks. A PIA can reduce the risk of harm to individuals by preventing the misuse of their personal information. PIAs are an integral part of a privacy by design (PbD) approach and are used to design more efficient and effective processes for handling personal data. The use of PIAs is not new; the process has been used by companies, other entities, and governments for over forty years. The PIA was created by the United States Office of Technology Assessment. The U.S. Office of Management and Budget (OMB) publishes guidance on federal agencies' implementation of the privacy provisions of the E-Government Act of 2002, including when to conduct a PIA.

Under the GDPR, PIAs (termed data protection impact assessments, or DPIAs) have become a centerpiece and are mandatory in certain situations. A PIA must be completed if either of the following applies:
  • The data controller, or a data processor acting on the controller's behalf, is using new technologies to process data; or
  • The processing is likely to result in a high risk to the rights and freedoms of individuals. Examples include: systematic and extensive processing activities (profiling); large-scale processing of special categories of data; processing of personal data relating to criminal convictions or offenses; or large-scale, systematic monitoring of public areas (CCTV).
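For illustration only, the trigger conditions above can be sketched as a simple checklist. This is a rough sketch, not legal advice; the class and field names are hypothetical, and a real determination would require legal analysis of the specific processing activity.

```python
from dataclasses import dataclass

@dataclass
class ProcessingActivity:
    """Hypothetical summary of a processing activity's risk factors."""
    uses_new_technologies: bool
    systematic_profiling: bool
    large_scale_special_categories: bool
    criminal_conviction_data: bool
    large_scale_public_monitoring: bool

def pia_required(activity: ProcessingActivity) -> bool:
    """Rough check of the GDPR trigger conditions described above."""
    likely_high_risk = (
        activity.systematic_profiling
        or activity.large_scale_special_categories
        or activity.criminal_conviction_data
        or activity.large_scale_public_monitoring
    )
    return activity.uses_new_technologies or likely_high_risk

# PRYER (described below) processes special categories of data
# (sexual orientation, religion) via a new app, so a PIA is required.
pryer = ProcessingActivity(
    uses_new_technologies=True,
    systematic_profiling=False,
    large_scale_special_categories=True,
    criminal_conviction_data=False,
    large_scale_public_monitoring=False,
)
print(pia_required(pryer))  # True
```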
I have developed a hypothetical situation in which a PIA would be required under the GDPR. A new Silicon Valley startup named Preppy is going to launch an app called PRYER. The app will be a peer review site where people can sign up and review a family member, friend, or co-worker. The app will be available on both Android and iTunes in the U.S., Canada, and the EU. On the app, the reviewer will be able to list the reviewed person's full name, email address, social media URLs, sexual orientation, and religion, and upload their picture. Practically anyone can create a review about someone, and the individual's consent is not required. The poster's/reviewer's name will show as anonymous, and their personal information will not be posted on the site, pursuant to the app's privacy notice. PRYER stores only the reviewer's login information (email and password) so that they can use the app. Posting to the site is free; however, people can pay a $5 subscription fee to read and access the reviews.

Preppy's position is that it is no different from Yelp and is not a controller of the data, since it merely processes the information the reviewer/poster provides. Preppy does not have a data protection or privacy officer. In fact, it has no policies or procedures for data management, or even for most of its internal processes. Preppy therefore sits at the ad hoc level of the privacy maturity model and has no formal basis for international data transfers (such as BCRs, model clauses, Privacy Shield, codes of conduct, or certifications). The following risk assessment measures the risks to the data subject (the person being reviewed) under the General Data Protection Regulation (GDPR).

A PIA generally requires a more in-depth analysis, but for purposes of this article I will provide a clear and concise one. The figure below shows a more formal risk assessment evaluation.

STEPS

1. Is a PIA required under the GDPR? YES 

2. Review the flow of information 
  • Personally Identifiable Information (PII) shared on the app: name, email, social media URLs, telephone number, sexual orientation, religion, criminal background, and picture (see processing of sensitive or special categories of data).
  • Recipients of the data: subscribers of the app.
  • Data Storage: Data will be stored indefinitely. 
  • Processing information: All data is accessible through the company's IT application, hosted within the company network.
  • Access: Personnel who need access to perform job duties, including but not limited to: IT tech support, HR, information security, privacy, legal, audit, call center and associated data processors.
3. What are the risks?  
  • Harm to the Reputation of the Data Subject 
  • Identity theft (if too much PII is shared)
4. What is the lawful basis for processing?
  • Consent from the data subject (DS) - Not likely 
  • Contract with DS - No 
  • Legal Obligation - No 
  • Vital Interests - No 
  • Public Interest - No
  • Legitimate Interest - A stretch, but perhaps the only basis Preppy can claim 
5.  Solutions
  • Continue with the roll out of the app and provide data subjects with a mechanism to opt-out and request either correction or deletion of their PII.
  • Consult the Article 29 Working Party and/or an individual Data Protection Authority for further advice.
  • Limit the app to U.S. users and subjects.
  • Kill the app altogether.
6.  Plan 
  • Summarize the chosen solution and implement the finding in the app's product life cycle. 
  • Update the privacy notices and policies according to the solution selected.


