SMART
Under the General Data Protection Regulation (GDPR), we are required to carry out a data protection impact assessment (DPIA), known in Dutch as a gegevensbeschermingseffectbeoordeling. It is an instrument to map the privacy risks of the data processing in this project. A DPIA must in any case be carried out when an organization systematically and comprehensively evaluates personal aspects based on automated processing, including profiling, and bases decisions on this that affect people; processes special categories of personal data or criminal-law data on a large scale; or systematically monitors people on a large scale in a publicly accessible area (for example with camera surveillance).
A well-executed DPIA gives us insight into the risks that the processing poses to the data subjects, as well as into the measures that must be taken. We gladly make our report public in order to be transparent towards all data subjects.
Traffic flow optimization and improved emergency response
Traffic congestion occurs when travel demand exceeds capacity, and also when traffic incidents take place. Congestion increases vehicle idling and emissions and reduces economic opportunity. The SMART Mobility project will optimize the flows of passenger cars, shared vehicles, public transport and freight, as well as the flows of bicyclists and pedestrians, through corridors and across networks, in order to improve road performance and make the best use of the existing infrastructure, taking into account policy measures that support desired societal outcomes.
The data generated in the SMART Mobility project will also make it possible to analyze and measure the impact of busy public transport hubs, and how these hubs affect the city and its surroundings.
The SMART Mobility project will provide better situational awareness by offering access to real-time traffic data within a 4D environment, enabling fast and informed analyses:
• Planners will gain the knowledge to implement strategic changes that improve traffic flow and prevent traffic incidents;
• The location and cause of bottlenecks are identified quickly, so that real-time adjustments can be made to redirect traffic flows;
• First responders can react to emergencies faster, limiting damage and improving safety;
• Travelers are informed about possible delays and alternative travel options. By combining on-demand data for both public transport passengers and private vehicles, the entire system can be streamlined to better meet passengers' needs. Since data is collected for most parts of a trip, the system can then be streamlined within each city and, more importantly, also between cities or between a city and its surroundings.
______
Data Protection Impact Assessment (DPIA)
Eindhoven University of Technology
Name of service/department: Electrical Engineering
Name of management/controller:
Name of Processing activity Coordinator:
Name of Data Protection Officer: Bart Schellekens
Name of CISO: Martin de Vries
Name of Project Manager: Egor Bondarau
Introduction
Under the General Data Protection Regulation (GDPR), it is mandatory to carry out a Data Protection Impact Assessment (DPIA) if a data processing operation is likely to pose a high privacy risk to the people whose data are processed. A DPIA is an instrument to map out the privacy risks of a data processing operation in advance and, if necessary, to take measures to reduce those risks.
In accordance with the General Data Protection Regulation (GDPR), DPIAs are required in a number of data processing situations, including:
- systematically and comprehensively evaluating personal aspects based on automated processing, including profiling, and basing decisions on them that affect people;
- processing special categories of personal data or criminal-law data on a large scale; or
- systematically following people on a large scale in a publicly accessible area (for example with camera surveillance).
This document specifies the data protection impact assessment of the data processing activities performed by TU/e researchers within the SMART research project. The data in question are the video sequences taken by a CCTV camera at the road intersection in Helmond.
There is no mandatory method for performing a DPIA, but the GDPR requires that a DPIA in any case includes:
- A systematic description of the processing and its purposes, including, if applicable, the legitimate interests pursued by the controller;
- An assessment of the necessity and proportionality of the processing operations in relation to the purposes;
- An assessment of the risks to the rights and freedoms of data subjects; and
- The intended measures to address the risks. These measures may include safeguards, security measures and mechanisms to ensure the protection of personal data and demonstrate compliance with privacy legislation.
The processing of personal data in the context of camera surveillance for this particular research constitutes systematic and large-scale monitoring, so a DPIA has to be conducted. For this reason, and because TU/e wants to handle personal data consciously and carefully and to be transparent about this towards data subjects, a DPIA for this research project has been carried out.
This DPIA will be supplemented and, if necessary, reassessed when there are changes in the data processing.
A. Description of characteristics of processing activity
1. Proposal
The video processing activity is a part of a research project called SMART carried out by TU/e together with research departments of six industrial companies. The goal of the project is to research and develop an intelligent traffic control system that will be able to reduce the carbon footprint and traffic jam delays.
For this system, TU/e carries out the research and development of an innovative subsystem for traffic anomaly detection. The subsystem includes several algorithms for the detection of traffic accidents, objects on the road, improper driving, object throwing, actors falling, etc. To train the algorithms, a large amount of historical video data on daily traffic is required. For this, TU/e plans to use the cameras installed in Helmond at the intersection of Julianalaan and Aarle-Rikstelseweg. The videos captured by these cameras will undergo encryption, face anonymization and license plate anonymization by the TU/e researchers involved in the project, and will then be used for training the algorithms.
2. Personal data
The following actors appear in the videos: pedestrians, bicyclists and vehicles.
Category data subjects | Category personal data | Personal data | Type personal data | Source |
Pedestrian | Video data | Visual appearance of full body and face | Regular | Directly from the video camera |
Bicyclist | Video data | Visual appearance of full body and face | Regular | Directly from video camera |
Vehicle | Video data | License plate | Regular | Directly from video camera |
3. Processing activity
Data capturing and encryption
The data captured by the camera system is streamed to a local storage device. The local storage device encrypts the data to prevent data access by unauthorized parties. The encryption keys are known only to the TU/e employee responsible for this research project (the TU/e Researcher).
Privacy-critical data removal
In the next phase, the recorded data must be labeled to enable training of the traffic anomaly detection algorithms. Prior to labeling, the TU/e Researcher brings the local storage device to the TU/e server room, decrypts the data and runs an automated algorithm that removes all privacy-critical image details. The following objects are irreversibly removed from all image frames in the videos:
– All faces
– All vehicle license plates
Deletion of the original data
Only after the privacy-critical data has been removed is the original recorded data fully deleted from the local storage device.
Management of the anonymized data
The TU/e Researcher may then start labeling the videos, which now contain no personal data. The subsequent phases of training, validation and testing also use the same videos with no personal data.
It is important to mention that, although all research phases are based on this anonymized video data, the TU/e Researcher is not allowed to share the data with colleagues or with any external parties.
The TU/e Researcher is also not allowed to include snapshots of the anonymized videos in scientific publications.
Storage of the anonymized data
The collected anonymized data is stored on the ResearchDrive server for long-term storage. The TU/e Researcher may temporarily upload and keep the anonymized data on the local server-room server during AI experiments (model training). When an experiment is finished, the anonymized data on the local server is deleted.
4. Processing purposes
The main purpose is to research and develop an intelligent traffic control system that will be able to reduce the carbon footprint and traffic jam delays. In detail, the camera feeds are analyzed, and traffic participants, their type and their behaviour are detected by the AI system to be developed. These data form a scene that represents the current situation at the intersection. At each moment in time, this scene is read by the traffic light control system, which optimizes the green on/off signals such that the throughput of pedestrians, bicyclists and vehicles is maximized. This leads to a reduction of the carbon footprint and of traffic jams.
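As a purely illustrative sketch of this control loop, the snippet below reads a detected scene and selects the signal phase with the highest estimated throughput. The scene representation, phase set and throughput weights are hypothetical simplifications, not the actual system design.

```python
# Illustrative sketch only: select the signal phase that maximizes the
# estimated throughput for the scene detected at the intersection.
# All names, phases and weights below are hypothetical.
from dataclasses import dataclass

@dataclass
class Scene:
    pedestrians_waiting: int
    bicyclists_waiting: int
    vehicles_waiting: int

PHASES = ["vehicles_green", "bikes_and_pedestrians_green"]

def estimated_throughput(scene: Scene, phase: str) -> float:
    """Crude per-phase throughput estimate with illustrative weights."""
    if phase == "vehicles_green":
        return 1.0 * scene.vehicles_waiting
    return 0.8 * (scene.pedestrians_waiting + scene.bicyclists_waiting)

def select_phase(scene: Scene) -> str:
    """Pick the phase with the highest estimated throughput."""
    return max(PHASES, key=lambda p: estimated_throughput(scene, p))

# Example: many waiting bicyclists and pedestrians -> their phase goes green.
print(select_phase(Scene(pedestrians_waiting=6, bicyclists_waiting=9,
                         vehicles_waiting=4)))
```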
The data in question is required by the TU/e researchers to train our AI algorithm to detect traffic participants and their behaviour with high accuracy.
The purpose of each activity is given in the section above.
5. Parties involved
TU/e is collecting the personal data and is also the controller of the data collected.
Helmond Municipality is only involved in granting permission for collecting the data and installing the cameras in the Helmond neighbourhood.
The personnel of the Helmond Municipality do not have access to the data in any processing phase.
For TU/e, access and processing will be performed only by the two PhD researchers participating in the SMART project:
ir. PHD1, VCA, EE, TU/e
ir. PHD2, VCA, EE, TU/e
Name party | Role party | Functions/departments with access | (Categories of) personal data | Data processing agreement |
TU/e | Controller | Two PhD researchers from the Electrical Engineering department: ir. PHD1 ir. PHD2 | All mentioned personal data | N/A |
Helmond Municipality | Provider | Helmond does not have access to the data in any of the processing phases. Helmond only provides the permissions for collecting the data and for installation of the cameras. | N/A | N/A |
6. Interests in processing activities
Involved Parties | Interests |
TU/e | Conducting and publishing scientific research; educating the involved PhD student(s) to become doctor(s) of science; introducing an innovative traffic anomaly detection system to academia and industry. |
PhD Researchers | Research and develop advanced and innovative AI algorithms for detection of traffic anomalies. Publish in scientific conferences and journals. Obtain the doctor of science degree. |
Helmond Municipality | Testing the pilot system, which detects traffic anomalies and sends notifications that help improve traffic management and control. |
7. Processing activity locations
The video data recording is planned at 51.484399, 5.644002, the intersection of Julianalaan and Aarle-Rikstelseweg.
The rest of the processing activities will be performed in server room FLUX 6.194 only.
The anonymized data will be archived at ResearchDrive.
8. Processing activity technology and methods
The video is captured by the cameras and a connected edge device.
During the storage process, the original videos with personal data are encrypted on the fly and stored in encrypted format. We deploy AES-128 encryption, a strong and widely used encryption standard. AES is the only publicly available cipher approved by the NSA for protecting classified information. Content encrypted with the Advanced Encryption Standard with a 128-bit key cannot practically be decrypted by brute-force attacks alone.
The encryption key is known only to the two abovementioned TU/e researchers.
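For concreteness, the sketch below shows what such on-the-fly encryption of a recorded video chunk could look like, assuming the Python cryptography package and AES-128 in GCM mode. The file handling and in-memory key storage shown here are illustrative assumptions, not the actual deployment configuration.

```python
# Minimal sketch, assuming AES-128-GCM via the "cryptography" package.
# Paths and in-memory key handling are illustrative, not the real setup.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

KEY = AESGCM.generate_key(bit_length=128)  # known only to the researchers

def encrypt_chunk(plain_path: str, cipher_path: str) -> None:
    """Encrypt one recorded video chunk before it is written to disk."""
    aesgcm = AESGCM(KEY)
    nonce = os.urandom(12)  # must be unique per encrypted chunk
    with open(plain_path, "rb") as f:
        plaintext = f.read()
    ciphertext = aesgcm.encrypt(nonce, plaintext, None)
    with open(cipher_path, "wb") as f:
        f.write(nonce + ciphertext)  # store the nonce next to the ciphertext

def decrypt_chunk(cipher_path: str) -> bytes:
    """Decrypt a chunk in the TU/e server room; fails if data was tampered with."""
    aesgcm = AESGCM(KEY)
    with open(cipher_path, "rb") as f:
        blob = f.read()
    return aesgcm.decrypt(blob[:12], blob[12:], None)
```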
Every week, one of the abovementioned TU/e researchers comes to the intersection by car, opens the secure box and replaces the storage harddrive containing the video data with an empty harddrive. The video data is brought to TU/e server room FLUX 6.194, where the following steps are performed:
- The video data with personal data is copied from the harddrive to the TU/e server.
- The harddrive is formatted to ensure the original data is deleted from it.
- The original video data on the TU/e server is decrypted by the TU/e researcher.
- The human faces and car license plates are irreversibly removed (voided) by the DeepLabV3 ResNet-101 system, tuned for 100% detection recall at the cost of a high false-detection rate (meaning that some areas in the videos that are not faces will also be voided).
DeepLabv3 is a deep learning system that segments (locates) faces and license plates in images and allows them to be replaced with a one-color mask. In this way, the original pixels of faces and license plates are fully and irreversibly deleted.
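A minimal sketch of this voiding step is given below. It assumes a DeepLabV3 ResNet-101 network from torchvision that has been fine-tuned for two classes (face and license plate); the class indices and the zero-valued mask color are illustrative assumptions, not the exact project configuration.

```python
# Minimal sketch of the voiding step, assuming a DeepLabV3 ResNet-101
# model fine-tuned for "face" and "license plate" classes. Class indices
# and the solid mask color are illustrative assumptions.
import torch
from torchvision.models.segmentation import deeplabv3_resnet101
from torchvision.transforms.functional import to_tensor

FACE, PLATE = 1, 2  # hypothetical class indices of the fine-tuned model

model = deeplabv3_resnet101(num_classes=3)  # background + face + plate
model.eval()  # in practice, the fine-tuned weights would be loaded here

def void_frame(frame):
    """Irreversibly overwrite all face/plate pixels in an H x W x 3 frame."""
    x = to_tensor(frame).unsqueeze(0)   # 1 x 3 x H x W floats in [0, 1]
    with torch.no_grad():
        logits = model(x)["out"]        # 1 x 3 x H x W class scores
    labels = logits.argmax(dim=1)[0]    # H x W predicted class map
    mask = ((labels == FACE) | (labels == PLATE)).numpy()
    frame[mask] = 0                     # solid mask: original pixels are gone
    return frame
```

Because the voiding overwrites pixels in place, the anonymization cannot be undone from the stored videos.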
An example of an irreversibly face-voided image is presented below.
[Figure: police-released images of suspects with their faces removed]
- The original video data with personal data is deleted from the TU/e server. Only the video data with removed faces and license plates (LPs) remains on the server.
From this moment on, further processing steps do not have and do not use any information on human faces or license plates. We refer to the resulting no-face, no-LP videos as stripped data in the remainder of this section.
- The stripped data is further used to train the anomaly behaviour detection algorithms. The training is performed on 3 servers located in the same server room, FLUX 6.194.
- At the end of the training, the stripped data is archived to the ResearchDrive.
- The same stripped data can be used for training multiple times within the project duration (2022-2026).
- The scientific publications resulting from this research will not include images from the stripped data.
9. Legal and policy framework
The most relevant internal policy element is the TU/e Ethics Policy; obtaining ethical approval from the Ethical Review Board (ERB) for the study is necessary to start the research project.
The University does not have a formal Research Data Management policy, but Codes of Conduct around Research Integrity and Academic Practice do apply; of particular relevance is that a customary retention period of 10 years is followed for archiving relevant research data for the purpose of ensuring research integrity and verifiability of research results.
10. Retention period
The original videos with personal data (faces, license plates) are stored on the local harddrive in encrypted format for at most one month (until the TU/e researcher comes to the intersection and transports the harddrive to the TU/e server room).
The stripped (anonymized) video data is stored on the ResearchDrive server for 10 years after the PhD completion period, i.e. until 31.12.2035, in order to safeguard research integrity and verifiability of research results. During this period, both PhD researchers will use the stripped data in their experiments and in the preparation for the PhD defense. Within these 10 years, the data will also be used for succeeding research projects.
B. Processing activity legal assessment
11. Lawful basis
The lawful basis for processing the data is that the processing is necessary for the performance of a task carried out in the public interest (art. 6(1)(e) GDPR), resulting from art. 7.4, 7.7, 7.8 and 7.9 Wet hoger onderwijs en wetenschappelijk onderzoek. We see this study not only as important scientific research but also, taking into account that it can potentially be used to develop an intelligent traffic control system able to reduce the carbon footprint and traffic jam delays, as one of public interest.
12. Special categories of personal data
No special categories of personal data are processed.
13. Purpose limitation
The data are primarily collected and processed for one scientific purpose: to enable TU/e's smart algorithms to detect anomalies in traffic, optimize the traffic light control and, by this, reduce the carbon footprint and traffic jams. The data collection and processing serve to answer the abovementioned research question, while the long-term data retention serves to guarantee research integrity and verifiability of research results. We might also keep the aggregated and pseudonymized data for the purpose of research in a similar field.
14. Necessity and proportionality
The processing steps are directly dependent on one another, meaning that removal of any single processing step would disable the whole processing chain and hinder the research objective.
a. Proportionality: is the infringement on the data subjects' private life and the protection of their personal data proportionate to the objectives of the processing activity?
The impact of the research on the data subjects is very low. As the processing procedure explained above makes clear, the personal data (faces and license plates) will be removed automatically as soon as the data is obtained. The researchers will not even look at the original data before starting the personal data removal process.
Comparing the abovementioned impacts with the objectives (reduction of carbon footprint and traffic jams), we can conclude that the combined probability and impact are several factors lower than the benefits that can be provided by the designed smart traffic control system. This system can lower the carbon footprint, which has a large impact on global warming, and with that it can reduce the harm to people's health. Additionally, this system is designed to reduce traffic jams, which will potentially improve road safety, increase comfort and mobility for non-motorized travel, and reduce environmental impacts and vehicle emissions.
b. Subsidiarity: can the objectives of processing not reasonably be achieved in another way, less disadvantageous to the data subjects?
The current state-of-the-art methods in the domain of traffic behaviour analysis all incorporate video-based recordings and deep learning algorithms. No competing modality (induction loops, acoustic, radar, thermal or hyperspectral sensors) provides a sufficient amount of data to detect behavioural anomalies for all traffic actor types at once: pedestrians, bicyclists and vehicles.
15. Data subjects' rights
Because of the nature of the research, it is impossible to collect informed consent from all data subjects involved, as this would require a disproportionate effort. In that case, Article 14(5) of the GDPR states that the duty to inform data subjects can be dropped, in particular when processing for scientific research. Parties do, however, need to take appropriate measures to protect the rights of the data subjects; this is properly taken care of, considering that the personal data (faces and license plates) will be removed automatically as soon as the data is obtained.
The (data subject) rights to access, rectification and restriction of processing can also be dropped following article 44 Uitvoeringswet AVG (Netherlands GDPR implementation act), because provisions have been made to ensure that the data is only used for scientific research.
However, each metal box will carry a large sticker with a link to information on the purpose of the recording activities, together with an explanation of what will happen with the personal data of the data subjects: the videos captured by these cameras will undergo encryption, face anonymization and license plate anonymization by the TU/e researchers involved in the project, and will then be used for training the algorithms.
Below are the sticker photos with the link to the page explaining the project data processing.
[Sticker photos with the link to the data processing information page]
C. Description and assessment of risks for data subjects
Describe and assess the risks that each planned processing activity poses to the rights and freedoms of the data subjects. Take into account the nature, extent, context and objectives of the processing activity as described and assessed in sections A and B. Risks for the data controller are not part of this assessment.
16. Risks
The data collection and processing impose several potential risks. The data that are processed can include information about the data subjects' location, movement and activities. While the risks associated with the data foreseen to be collected appear fairly limited, the data may reveal the identity of people and their location if the data becomes accessible to persons who should not have access to this information (especially before anonymization of the data). We can only foresee this happening if the local drive gets stolen before it reaches the TU/e server; to prevent this, the local storage device is encrypted so that unauthorized parties cannot access the data. We assume that the chances of this issue occurring are very low. After the data has been anonymized, only low risks remain for the privacy of the data subjects. Before we proceed to the risks to the rights and freedoms of the data subjects, let us provide an analysis of the technical risks regarding data disclosure.
Please note that the technical risks presented in the table below are specified for a general situation, in other words, they do not consider the mitigation and security enforcement measures planned (secure locks, encryption, anonymisation).
Technical risks | Description | Likelihood | Impact | Reason |
Technical risk 1 | An intruder breaks into the metal box attached to the light pole and connects to the camera line directly. The intruder might be able to obtain live unencrypted and non-stripped data, in which all personal details of the captured actors are present. | Low | Medium | Even if no mitigation measures are taken to secure the box, the probability is still low (few people want to break into public infrastructure to see a standard video of the same intersection). |
Technical risk 2 | An intruder breaks into the metal box attached to the light pole and steals the harddrive on which the videos are captured. The thief would be able to see all videos captured within the current week. | Low | Medium | Even if mitigation measures are not taken, the motivation to break into and steal from public infrastructure to obtain hours of video of standard traffic at the intersection is unclear. The only motivation can be to obtain the harddrive itself, but then the intruder is not interested in the videos and will wipe the data from the harddrive for future use or resale. |
Technical risk 3 | During harddisk replacement at the intersection, or on the way back to TU/e, the PhD fellow is attacked by a robber who wants to obtain the harddisks. The robber would be able to see all videos captured within the current week. | Low | Medium | Even if mitigation measures are not taken, the motivation to commit this crime to obtain hours of video of standard traffic at the intersection is unclear. The only motivation can be to obtain the harddrive itself, but then the robber is not interested in the videos and will wipe the data from the harddrive for future use or resale. |
Technical risk 4 | Someone breaks into the server room at TU/e and steals the server or the harddrive with all the captured video data. The thief would be able to see all the video data previously recorded. | Low | Medium | Even if mitigation measures are not taken, the motivation to break into a server room and steal TU/e infrastructure to obtain hours of video of standard traffic at the intersection is unclear. The only motivation can be to obtain the harddrive or a server itself, but then the intruder is not interested in the videos and will wipe the data from the harddrive for future use or resale. |
Technical risk 5 | Someone hacks into the TU/e server or ResearchDrive and downloads all the captured videos. The attacker would be able to see all the video data previously recorded. | Medium | Medium | Without mitigation measures, hacking of computer servers is unfortunately common. Hackers search for any valuable information. |
Technical risk 6 | Someone hacks the computers of the abovementioned PhD researchers and obtains the AI models that were trained on the intersection video data. The attacker would not be able to see any video or personal data, since the AI models are software that contains no video or imaging data at all. | Medium | Low | Quality AI models are valuable, but the models do not contain any of the captured personal data. |
Technical risk 7 | PhD researchers, who have access to the data, share the data with a third party. The third party would be able to see the captured video data. | Low | Medium | The motivation for a TU/e employee to ruin their academic career by sharing the data is unclear. |
Knowing the technical risks, we can proceed to the risks and consequences to the rights and freedoms of the data subjects.
Risks that processing activities pose towards data subjects | Negative consequences for the rights of data subjects | Origin of consequences | Probability that these consequences materialize | Impact of these consequences for the data subjects |
A pedestrian or a car can be identified and localized at the intersection by a third party (offender) | Privacy protection rights will be violated | Technical risk 1. | Very low (probability of technical risk 1) | Low. The information that a specific person or car was at this intersection at this specific time may be published on social media. |
Knowledge of the daily movement trajectory of a pedestrian or a car through this intersection. | Privacy protection rights will be violated | Technical risks 2-7 (if the mitigation measures specified below are not taken; otherwise the risk does not occur). | Medium (highest probability of risks 2-7) | High: the data can be used by an offender to prepare an attack on a pedestrian or a car. Medium: the data can be used to disclose the daily activity of a person to a third party for non-criminal purposes; the information that a specific person or car was at this intersection at this specific time may be published on social media. |
D. Description of mitigating measures
17. Measures
General measures:
- The data captured by the camera system is streamed to a local storage device. The local storage device encrypts the data to prevent data access by unauthorized parties.
- All data will be stored in a dedicated project folder on ResearchDrive, and this folder is only accessible to researchers involved in the project.
- Data is shared through ResearchDrive and not by any other means.
- (Sensitive) personal data are stored, for the duration of the processing and analysis, on an encrypted computer (e.g. using BitLocker) of the involved researchers. Upon completion of the analysis, the data will be uploaded to ResearchDrive and deleted from the local computer.
- The institutional policy on data security of the TU/e applies, also see: https://intranet.tue.nl/en/university/services/01-01-1970-information-management-services/for-a-secure-data-driven-tue/ict-security-policies/.
- As part of the project, a document on privacy and ethics has been created. This document contains information on how each of the measures safeguards privacy.
Awareness:
- Researchers and students in the project will receive basic training in handling personal data if this has not already taken place. This mitigates the risk that researchers, and especially students, are not aware of the implications of the GDPR and of the risks and responsibilities when handling personal data.
Legal measures:
- All TU/e students involved in the project have signed or will sign an NDA.
- All researchers (professor, supervisor, PhD) involved in the project have a contract with TU/e, which includes the obligation of confidentiality.
Specific measures
Risk | Measures | Residual risk and risk estimation |
Technical risk 1: intrusion into the metal box at the light pole and unauthorized connection to the live video stream. | The metal box is closed with an ANSI security Grade 3 tubular lock (https://www.thisoldhouse.com/home-safety/21015270/how-to-pick-a-lock). The two keys for the lock are only in the possession of the two abovementioned PhD researchers. | The residual risk is low, since breaking a metal lock in an open public space is easily observable by the community and may lead to detention or a fine. |
Technical risk 2: intrusion into the metal box at the light pole and theft of the local harddrive with data. | All harddrive data is automatically encrypted with a protocol that does not allow brute-force decryption. | The residual risk is absent, since decryption of the data is impossible. |
Technical risk 3: attack on the PhD researcher on the way from Helmond to TU/e, resulting in robbery of the local harddrive with data. | All harddrive data is encrypted with a protocol that does not allow brute-force decryption. Decryption is performed only when the harddisk is brought to the TU/e server room. | The residual risk is absent, since decryption of the data is impossible. |
Technical risk 4: breaking into the TU/e server room and stealing the local harddrive or a server disk. | 1. The server room is locked with an electronic lock. Access rights are given only to the PhD researchers and the supervisory TU/e employees. 2. The data stored on the TU/e server is irreversibly anonymized at the beginning of the storage process, which means the files do not contain any personal data. | The residual risk is absent, since the data is either encrypted or, if decrypted, already anonymized. |
Technical risk 5: hacking the TU/e server or ResearchDrive. | 1. The server deploys a local firewall that prevents access to the data even from the local TU/e intranet. 2. The data stored on the TU/e server is irreversibly anonymized at the beginning of the storage process, which means the files do not contain any personal data. | The residual risk is absent, since the data is fully anonymized. |
Technical risk 6: hacking the computers of the PhD researchers and obtaining the data and AI models. | 1. The data stored on the computers of the PhD researchers is irreversibly anonymized at all phases of processing. 2. The AI models do not contain any personal data. | The residual risk is absent. |
Technical risk 7: unauthorized sharing of datasets with third parties by PhD researchers. | All TU/e researchers have signed an NDA as part of their employment contract. This NDA states that they will exercise utmost diligence and that they are not allowed to share datasets other than those contractually agreed upon. | The residual risk is low, since there is little to no advantage for researchers in sharing datasets and endangering their reputation in the academic world. |
E. Privacy by design
This section contains additional questions, to ensure that all basic principles of the GDPR have been considered and met.
1. Lawfulness, fairness and transparency
Is there a manual so that the relevant people know how to handle the personal data that are processed in the application?
The PhD researchers receive a technical manual on the data processing requirements and written instructions on the GDPR and related obligations. Besides this, they are informed about TU/e Privacy and Ethics: https://www.tue.nl/universiteit/library/library-for-researchers-and-phds/research-data-management/rdm-themes/privacy-and-ethics
and about the TU/e FAQ on GDPR: https://intranet.tue.nl/en/university/services/01-01-1970-information-management-services/for-a-secure-data-driven-tue/faq/
2. Accuracy and up-to-date processing
In what way is it guaranteed within the application or process that personal data are kept accurate and up to date?
The personal data is fully removed from the local hard drives and also removed (by data anonymization) when stored on the TU/e server or ResearchDrive. Therefore we do not need to keep the data up to date.
3. Confidentiality and integrity
Which persons or which departments have access to the personal data?
The access to the personal data is given to the following PhD researchers of the Electrical Engineering department:
ir. PHD1, VCA, EE, TU/e
ir. PHD2, VCA, EE, TU/e
Is there an authorization matrix?
The authorization matrix presented below is static and does not change over time.
Involved party | Original videos with personal data | Encrypted videos with personal data | Anonymized videos with no personal data. |
Helmond Municipality employees | No | No | No |
ir. PHD1, VCA, EE, TU/e | Yes | Yes | Yes |
ir. PHD2, VCA, EE, TU/e | Yes | Yes | Yes |
The access to the data is required for the PhD researchers to carry out the research defined above.
In what way are the personal data accessible?
As explained in Section A, the videos with personal data are captured and encrypted on the local drive. Every week, the local drive is brought to the TU/e server room, where the data is decrypted and irreversibly anonymized. The original videos with personal data are fully deleted and the local drives are formatted. After this step, no personal data is available on any storage device.
Are the relevant people familiar with the data breach procedure?
It is impossible to inform random pedestrians or drivers at the intersection about the data breach procedure. The participating PhD researchers are informed about TU/e Privacy and Ethics: https://www.tue.nl/universiteit/library/library-for-researchers-and-phds/research-data-management/rdm-themes/privacy-and-ethics
and about the TU/e FAQ on GDPR: https://intranet.tue.nl/en/university/services/01-01-1970-information-management-services/for-a-secure-data-driven-tue/faq/
4. Obligation of Accountability
Is the data controller (data domain owner) aware of and in agreement with all that is stated above?
Yes, the data controller is aware of everything that is stated above.
Conclusions and advice
Data Protection Officer
Bart Schellekens
Date: 25.04.2023
CISO
Martin de Vries
Date: 25.04.2023