General Data Protection Regulation (GDPR)

What is GDPR?

GDPR stands for the General Data Protection Regulation, a set of rules and regulations that came into force on 25 May 2018. These regulations apply to any organisation, inside or outside of the EU, that stores or processes personal data belonging to individuals in the EU.

Privacy by Design and Privacy by Default

Data protection by design and by default is a requirement under GDPR that applies to all processing of personal data, including international transfers. The organisation collecting the data should select the processes that best protect users. Privacy by design and privacy by default require suitable technical and organisational measures to implement GDPR principles and individual rights, and these should be built into all software architecture.
The seven foundational principles of privacy by design:
  1. Privacy must be proactive and preventive, not reactive or remedial. Anticipate privacy issues before they reach the user.
  2. Privacy must be the default setting. Users should not have to take action to secure their privacy, and consent for data sharing should not be assumed.
  3. Privacy must be embedded into design. It must be a core function of the product or service, not an add-on.
  4. Privacy must be positive-sum and should avoid dichotomies. For example, PbD sees an achievable balance between privacy and security, not a zero-sum game of privacy or security.
  5. Privacy must offer end-to-end lifecycle protection of user data. Engage in proper data minimization, retention and deletion processes.
  6. Privacy standards must be visible, transparent, open, documented and independently verifiable. Processes must stand up to external scrutiny.
  7. Privacy must be user-centric. Users should have granular privacy options, maximized privacy defaults, detailed privacy information notices, user-friendly options and clear notification of changes.

Who does GDPR apply to?

GDPR applies to all businesses, organizations, sectors, situations, and scenarios, regardless of a business’s size, head count, or financial turnover.

It applies to all businesses irrespective of where the business is located. If the business deals with Europe, or collects data on European users, it must protect their data in full accordance with the regulation as if it were set up in Europe.

It applies to all personal data collected, processed, and retained about persons within the European Union, regardless of citizenship or nationality.

Who is the Data Protection Officer?


Under GDPR, companies processing certain kinds of data must appoint a Data Protection Officer (DPO), a named individual who is accountable and responsible by law for an organization’s privacy compliance, including PbD. This requirement applies regardless of a company’s size, which means that even the tiniest business engaged in certain kinds of data processing must appoint a DPO.

Organisations are encouraged to appoint a DPO voluntarily, regardless of the nature of their work. The DPO acts as the health and safety officer for privacy: the “good cop” who keeps development processes legally compliant and the “bad cop” when practices slip.

What is personal data according to GDPR?

Personal data is any information relating to an identified or identifiable natural person, in other words personally identifiable information attached or related to a person. This can be a single piece of information or multiple data points combined to create a record. Note: any other form of system-generated data that identifies a natural person also counts; the data users generate within your app is personal data.

Privacy by Design - Compliance for Apps

PbD compliance means factoring in data privacy by default. Let us look at the compliance steps for app developers at each stage of an app’s life.

Privacy by Design at the app’s initial design stage

  • Create a privacy-impact assessment (PIA) template for your business to use for all functions involving personal data.
  • Review contracts with partners and third parties to ensure the data you pass on to them is being processed in accordance with PbD GDPR.
  • Do not request unnecessary app permissions that invade privacy, such as access to contacts or to the microphone.
  • Audit the security of your systems.

Privacy by Design throughout its lifecycle

  • Minimise the amount of collected data.
  • Minimise the amount of data shared with third parties.
  • Where possible, pseudonymize personal data (see the sketch after this list).
  • Revisit contact forms, sign-up pages and customer-service entry points.
  • Enable the regular deletion of data created through these processes.
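One common way to pseudonymize identifiers is to replace them with a keyed hash, so records can still be linked internally without exposing the raw value. The sketch below is a minimal illustration assuming a secret key held separately from the data; it is not a technique prescribed by the GDPR text itself.

```python
import hashlib
import hmac

# The key must be stored separately from the pseudonymized data (e.g. in a
# secrets manager); if key and data live together, the data is effectively
# still identifiable.
SECRET_KEY = b"replace-with-a-key-from-your-secrets-manager"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a keyed hash.

    The same input always maps to the same token, so records can still be
    joined internally, but the original value cannot be recovered from the
    token without the key.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Example: store the token instead of the raw email in analytics tables.
token = pseudonymize("alice@example.com")
print(token)
```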

Privacy by Design throughout the user’s engagement with the app

  • Provide clear privacy and data-sharing notices.
  • Embed granular opt-ins throughout those notices.
  • Don’t require social media registration to access the app.
  • Don’t enable social media sharing by default.
  • Separate “consent for essential third-party data sharing” from “consent for analytics and advertising”.
  • Periodically remind users to review and refresh their privacy settings.

Privacy by Design after the end of engagement and mothballing

  • Allow users to download and delete old data.
  • Delete the data of users who have closed their accounts.
  • Delete all user data when the app’s life comes to an end.

Privacy impact assessment

A privacy impact assessment (PIA) is the process of documenting the issues, questions and actions required to implement a healthy PbD process in a project, service or product. PIAs are a core requirement of GDPR, and in the event of a data protection issue, the PIA will determine the shape of the engagement with a regulatory authority. A PIA should be done when starting a new project, and a PIA evaluation should be run on existing ones.

Doing a PIA

A PIA works through the following steps in sequence:
  1. Identify the source and type of information gathered: who collects the information, by what method, and for what purpose?
  2. Determine how and when the information will be used. Does the person have authorization? How will authorized users be identified?
  3. Identify security controls, including how data is retained, transferred or shared. What is the format and location of the data?
  4. Establish how, when and why sensitive information will be disclosed, by whom and to whom.
  5. Establish how and when sensitive information will be destroyed, and by whom.

PIA - data collection - details

  • For how long is data stored, and when is the data deleted?
  • Is the data collection and processing specified, explicit, and legitimate?
  • What is the process for granting consent for the data processing?
    • Is consent explicit and verifiable?
  • If not based on consent, what is the legal basis for the data processing?
  • Is the data minimised to what is explicitly required?
  • Is the data accurate and kept up to date?
  • How are users informed about the data processing?
  • What controls do users have over the data collection and retention?
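To keep these answers documented and reviewable, the checklist above can be captured in a structured PIA record. The structure below is a hypothetical example for illustration, not a format mandated by GDPR.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PiaDataCollectionRecord:
    """Hypothetical record capturing the data-collection answers of a PIA."""
    data_category: str                 # e.g. "account email address"
    purpose: str                       # why the data is collected
    legal_basis: str                   # "consent", "contract", "legitimate interest", ...
    consent_mechanism: Optional[str]   # how explicit, verifiable consent is obtained
    retention_period_days: int         # how long the data is stored before deletion
    minimised: bool                    # is collection limited to what is required?
    user_controls: str                 # how users can view, correct or delete the data

record = PiaDataCollectionRecord(
    data_category="account email address",
    purpose="account login and password recovery",
    legal_basis="contract",
    consent_mechanism=None,
    retention_period_days=730,
    minimised=True,
    user_controls="editable and deletable from the account settings page",
)
print(record)
```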

PIA - technical and security measures

  • Is the data encrypted?
  • Is the data anonymized or pseudonymized?
  • Is the data backed up?
  • What are the technical and security measures at the host location?

PIA - Personnel

  • Who has access to the data?
  • What data protection training have those individuals received?
  • What security measures do those individuals work with?
  • What data breach notification and alert procedures are in place?
  • What procedures are in place for government requests?

PIA - Legal

  • Are the obligations of all data processors, including subcontractors, covered by a contract?
  • If the data is transferred outside the European Union, what are the protective measures and safeguards?

PIA - risks

  • What are the risks to the data subjects if the data is misused, mis-accessed, or breached?
  • What are the risks to the data subjects if the data is modified?
  • What are the risks to the data subjects if the data is lost?
  • What are the main sources of risk?
  • What steps have been taken to mitigate those risks?

Security measures to be taken

  • Password hashing and salting (a sketch follows this list)
  • Data sandboxing
  • Pseudonymization and anonymization
  • Encryption at rest and in transit
  • Responsible disclosure
  • Staff training and accountability on data protection
  • Physical security of servers, systems and storage
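Passwords should never be stored in plain text; a salted, slow hash makes bulk cracking far more expensive if the database leaks. The sketch below uses only Python's standard library; in practice a dedicated password-hashing library (for example one implementing bcrypt or Argon2) is often preferred.

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Hash a password with a random per-user salt using PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)                       # unique salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 600_000)
    return salt, digest                         # store both; never store the password

def verify_password(password: str, salt: bytes, stored_digest: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 600_000)
    return hmac.compare_digest(candidate, stored_digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong password", salt, digest)
```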

Development

  • GDPR does not define a list of right or wrong programming languages, version control systems, or testing tools.
  • Whatever is used should be clearly defined and followed for each and every situation.
  • Standard tools and infrastructure can be modified at any time, as long as they are documented as the frameworks which must be used.

Coding standards

  • Create a list of approved standards and methodologies used for both coding and testing.
  • Disable unsafe or unnecessary modules, particularly in APIs and third-party libraries.
  • An audit of what constitutes an unsafe module should consider privacy by design, such as the unnecessary capture and retention of personal data, as well as security vulnerabilities.
  • Code reviews should include an audit for privacy by design principles, including mapping where data is physically and virtually stored, and how that data is protected, encrypted, and sandboxed.

Data management

  • Collect the absolute minimum amount of personal data required – on the front end and back end.
  • Do not link personal data with other data sets stored in a single location.
  • If aggregating data, remove personal and identifying data as much as possible.
  • Anonymisation, though not explicitly required, is recommended; however, the greater the risk that anonymized data could be indirectly identified or re-identified by aggregating it with additional data, the greater the risk of using it at all.
  • Plan data retention and deletion schedules (a minimal sketch follows this list).
  • Data should be deleted, either automatically or through user actions, when it is no longer needed.
    • Should be deleted from the main systems.
    • Should be deleted from archives and backups.
    • Should be removed from third parties to whom it is passed.
  • Not everything has to be deleted as a rule.
    • Ex: Keep all purchasing records for ten years for tax and auditing purposes, and document that fact and its rationale.
    • A user’s request for account deletion would not require you to delete their purchasing records from your tax and accounting records; the right to be forgotten is not the right to fly under the radar.
  • Personal data should not be visible in plain view, either on the front or back end.
  • All users of a system should not have universal access.
  • Data should also be encrypted in transit and at rest.
  • Design reviews should view flows of personal data through the eyes of an attacker, be it human or otherwise.
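The retention and deletion schedule mentioned above can be enforced mechanically: each data category gets a retention period, and anything older is purged by a regular job. The sketch below is a hypothetical illustration of that idea; the categories and periods are examples, not values mandated by GDPR.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention periods per data category; purchase records are kept
# longer for tax and auditing purposes, and that rationale is documented.
RETENTION_PERIODS = {
    "session_logs": timedelta(days=90),
    "support_tickets": timedelta(days=365),
    "purchase_records": timedelta(days=10 * 365),  # legal/tax retention
}

def is_expired(category: str, created_at: datetime, now: datetime | None = None) -> bool:
    """Return True if a record of the given category has passed its retention period."""
    now = now or datetime.now(timezone.utc)
    return now - created_at > RETENTION_PERIODS[category]

def purge_expired(records: list[dict]) -> list[dict]:
    """Keep only records still within their retention period.

    In a real system the same purge must also cover archives, backups and any
    third parties the data was passed to.
    """
    return [r for r in records if not is_expired(r["category"], r["created_at"])]

records = [
    {"category": "session_logs", "created_at": datetime(2020, 1, 1, tzinfo=timezone.utc)},
    {"category": "purchase_records", "created_at": datetime(2020, 1, 1, tzinfo=timezone.utc)},
]
print(purge_expired(records))  # only the purchase record survives
```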

Consent

  • Users have the right to be informed about the data flows containing their information, and their rights over them, in the most privacy-positive ways possible.
  • Provide good consent mechanisms and user controls.
    • Can be done in control panels, user dashboards, account settings, and privacy centers.
  • Consent choices must be granular.
    • Users must be able to invoke any aspect of control over their data at any time.
  • A user setting up an account for the first time should enjoy optimal privacy settings by default.
    • They should not have to opt-in to privacy, or switch off defaults to achieve it.
  • Consent must never be assumed through a lack of action; develop ways to alert users that they have not yet granted opt-in consent to any applicable choices and options. Neither of the following constitutes consent:
    • Failure to tick a box.
    • Mere creation of an account.
  • The back-end development process will also need to ensure timestamped documentation of the following (a sketch follows this list):
    • What consent a user gave.
    • How they gave it.
    • Whether or not they have withdrawn it.
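A minimal way to meet that documentation requirement is to store each consent decision as an append-only, timestamped record, never overwriting earlier decisions. The sketch below is a hypothetical illustration; the field names are assumptions, not a schema prescribed by GDPR.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentEvent:
    """One timestamped consent decision; records are appended, never edited."""
    user_id: str
    purpose: str          # e.g. "analytics", "marketing_emails"
    granted: bool         # True for opt-in, False for withdrawal
    method: str           # how consent was given, e.g. "settings page toggle"
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def current_consent(events: list[ConsentEvent], user_id: str, purpose: str) -> bool:
    """Return the user's latest decision for a purpose; the default is no consent."""
    relevant = [e for e in events if e.user_id == user_id and e.purpose == purpose]
    if not relevant:
        return False                      # consent is never assumed
    return max(relevant, key=lambda e: e.timestamp).granted

log: list[ConsentEvent] = []
log.append(ConsentEvent("alice", "analytics", granted=True, method="settings page toggle"))
log.append(ConsentEvent("alice", "analytics", granted=False, method="settings page toggle"))
print(current_consent(log, "alice", "analytics"))  # False: the withdrawal is honoured
```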

Access

  • Provide an interface for individual access rights (a minimal sketch follows this list):
    • Right to edit and correct information.
    • Right to download data.
    • Right to restrict processing.
    • Right to data deletion.
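These rights map naturally onto a small set of back-end operations. The sketch below is framework-free and purely illustrative; the function names and the in-memory store are assumptions, not part of any specific library or of GDPR itself.

```python
import json

# Hypothetical in-memory user store standing in for a real database.
USERS = {
    "alice": {"email": "alice@example.com", "country": "IE", "processing_restricted": False},
}

def export_data(user_id: str) -> str:
    """Right to download: return the user's data in a portable format."""
    return json.dumps(USERS[user_id], indent=2)

def correct_data(user_id: str, field_name: str, value: str) -> None:
    """Right to edit and correct information."""
    USERS[user_id][field_name] = value

def restrict_processing(user_id: str) -> None:
    """Right to restrict processing: flag the record so processing jobs skip it."""
    USERS[user_id]["processing_restricted"] = True

def delete_data(user_id: str) -> None:
    """Right to data deletion (subject to documented legal retention duties)."""
    del USERS[user_id]

print(export_data("alice"))
correct_data("alice", "country", "FR")
restrict_processing("alice")
delete_data("alice")
```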

Testing

  • Add privacy by design and data protection by default to the testing processes.
  • Penetration testing should be planned by default.
  • Privacy testing procedures should predict the ways unauthorized users would access actual data on your system.
    • Would a suspicious search for user data, or an alteration to a record, be logged as a security issue?
    • Is data stored in login cookies?
    • Could someone gain access to data by intentionally triggering an error?
    • Can users have plaintext or trivially weak passwords?
    • If you are applying privacy by design retroactively to an existing project, have you tested how easy it is to access legacy data?
  • Testing for data protection should also consider external alerts.
    • The public should have an option to alert you to data breaches, either potential or real.
  • Golden rule of GDPR: document it, or it didn’t happen. Test results, and the methodologies used to achieve them, need to be recorded and actioned as living documents.
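As an illustration of the kind of privacy testing described above, the sketch below uses Python's built-in unittest to check two of those scenarios: that one user cannot read another user's record, and that error messages do not leak personal data. The UserService class is a hypothetical stand-in, not a real library.

```python
import unittest

class UserService:
    """Tiny stand-in service used only to make the tests below runnable."""
    def __init__(self):
        self._users = {"alice": {"email": "alice@example.com"}}

    def get_profile(self, requester: str, target: str) -> dict:
        if requester != target:
            raise PermissionError("access denied")   # no cross-user reads
        return self._users[target]

    def handle_error(self, exc: Exception) -> str:
        return "Something went wrong."               # generic, no personal data

class PrivacyTests(unittest.TestCase):
    def test_unauthorized_access_is_refused(self):
        service = UserService()
        with self.assertRaises(PermissionError):
            service.get_profile(requester="mallory", target="alice")

    def test_error_messages_do_not_leak_personal_data(self):
        service = UserService()
        message = service.handle_error(KeyError("alice@example.com"))
        self.assertNotIn("alice@example.com", message)

if __name__ == "__main__":
    unittest.main()
```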

Privacy Information Notice - provided to end users

  • What data are you collecting?
  • Why are you collecting it, and is that reasoning legally justifiable?
  • Which third parties are you sharing it with?
  • What third-party data are you aggregating it with?
  • Where are you getting that information from?
  • How long are you keeping it?
  • How can the user invoke their rights?

Example: Granular permissions – Facebook
Users can choose which permissions they grant through a permission request flow. For example, if an app requests “Page and Groups” permission, people receive a request to grant those permissions.

They can also choose which Pages, Groups, or business assets they grant permissions for.

Example: List of 3rd Parties – Paypal

PayPal’s notice lists over 600 third-party service providers – https://www.paypal.com/uk/webapps/mpp/ua/third-parties-list


Contact us to find out how Techpearl can develop the best apps according to your requirements while adhering to the General Data Protection Regulation.
