
What is the Privacy, Ethical and Social Impact Assessment (PESIA) Framework?

The Privacy, Ethical and Social Impact Assessment (PESIA) represents a new way of assessing the risks related to the use of personal information. It builds on previous experience, such as the Privacy Impact Assessment (PIA).

Privacy Impact Assessment procedures have already been adopted in several countries, but they focus mainly on privacy issues. The PESIA aims to address a wider range of issues, taking into account the ethical dimension and the societal consequences of the use of data, particularly in contexts (for example, big data analytics) where data are used to make decisions that may affect individuals and groups, with potential discriminatory impacts that go beyond the traditional concerns of privacy and data protection.

What are the main differences between the PESIA model and the Data Protection Impact Assessment (DPIA) of the General Data Protection Regulation (GDPR) adopted by the European Union?

Under the GDPR, the assessment focuses mainly on risks related to data security and the unlawful use of data.

Moreover, the impact assessment described in the GDPR is focused on the individual dimension of data protection; it is not public, because all the documents related to the assessment remain inside the entity that carried it out (the private or public body acting as data controller); and, finally, it does not adopt a mandatory, participatory model, so the different stakeholders are not necessarily involved in the assessment process.


The PESIA that we propose adopts a different approach: it is focused on making the entire assessment public, giving users the chance to know the risks related to the potential uses of their personal information, and it aims to build the impact assessment on the engagement of the different stakeholders.

In this sense, the PESIA adopts a participatory process to identify the issues that may be relevant in terms of the societal consequences of the use of data. For this reason, the model we want to define differs both from the PIA and from the DPIA described in the GDPR.

What are the main challenges in developing the PESIA model?

Although we are at an initial stage of the project, we can say that the major challenge for this model of impact assessment is that, in taking into account the societal consequences of the use of data and the ethical values involved, we are dealing with variables that differ from one context to another.

When we talk about data security, or about data protection in terms of the legal protection of personal information, we can define general standards that are the same across countries and social contexts. But when we talk about societal values, these elements necessarily change from one context to another. In this sense, we have two goals: the first is to define a general framework that represents a baseline for the impact assessment; the second is to give this framework the flexibility to take into account the specific issues, forms, and aspects of the ethical and societal values that different communities may express. We imagine adopting solutions such as an ethics committee to give a voice to the people who may be affected by the use of their personal information.
