NicFab Newsletter
Issue 6 | February 3, 2026
Privacy, Data Protection, AI, and Cybersecurity
Welcome to issue 6 of the weekly newsletter dedicated to privacy, data protection, artificial intelligence, cybersecurity, and ethics. Every Tuesday, you will find a curated selection of the most relevant news from the previous week, with a focus on European regulatory developments, case law, enforcement, and technological innovation.
In this issue
- ITALIAN DATA PROTECTION AUTHORITY
- EDPB - EUROPEAN DATA PROTECTION BOARD
- EDPS - EUROPEAN DATA PROTECTION SUPERVISOR
- EUROPEAN COMMISSION
- EUROPEAN PARLIAMENT
- COUNCIL OF THE EUROPEAN UNION
- DIGITAL MARKETS & PLATFORM REGULATION
- AI STANDARDS AND CERTIFICATIONS
- INTERNATIONAL DEVELOPMENTS
- CYBERSECURITY
- TECH & INNOVATION
- SCIENTIFIC RESEARCH
- AI Act in a Nutshell
- From the NicFab Blog
- Events and meetings
- Conclusion
ITALIAN DATA PROTECTION AUTHORITY
Garlasco case: Data Protection Authority warning on information limits
The Data Protection Authority has censured the conduct of media outlets and websites covering the Chiara Poggi case, denouncing a progressive increase in the level of detail in the reconstruction of facts, personal contexts, and individual profiles that goes beyond legitimate informational purposes.
The Authority emphasizes that the repeated publication of such elements transforms the news into “morbid sensationalism,” violating the principle of essentiality of information and the ethical rules of journalism. Respect for dignity must be guaranteed not only to the victim but also to family members, suspects, and all those involved in the media narrative.
For DPOs of publishing companies, this reminder highlights the need to implement preventive content assessment processes that balance the right to report with the protection of personal data.
Newsletter: sanctions and regulatory changes for the labor sector
The Garante’s newsletter highlights several significant decisions regarding data protection in employment relationships. A company in the agricultural sector was fined €120,000 for installing devices on its vehicles that monitored employees’ driving styles, collected data for 13 months, and assigned monthly behavioral scores without the guarantees provided for in the Workers’ Statute.
At the same time, the Authority clarified that accessing a dismissed employee’s email account violates privacy, setting clear limits for companies. On the innovation front, the system for issuing electric scooter license plates was approved, and a new tool to combat aggressive telemarketing was activated.
For corporate DPOs, these decisions underscore the importance of balancing legitimate controls and respect for employee privacy by implementing procedures that comply with the law.
Scooters: green light for the license plate issuance platform
The Italian Data Protection Authority has approved the draft decree of the Ministry of Infrastructure and Transport regulating the online platform for requesting and issuing scooter license plates. The Authority has set certain conditions to ensure greater data protection: eliminating automatic verification via ANPR (identification already takes place via SPID or CIE), specifying the databases for checks relating to companies and powers of representation, and assigning the MIT exclusive responsibility for providing information and managing the rights of data subjects.
SegnalaODM: new tool against telemarketing now active
SegnalaODM, the platform of the Monitoring Body for the Code of Conduct on telemarketing and teleselling, has been operational since January 2026. Users can report violations by participating operators. Before submitting a report, users must first contact the operator concerned; if the response is missing or inadequate and the conduct amounts to a violation of the Code of Conduct, the report can be filed via the platform.
- Registration: https://segnalaodmtelemarketing.it/homepage
- Information: https://www.odmtelemarketing.it/
EU Regulation 2025/2518: new procedural rules for cross-border GDPR cases
On January 1, 2026, EU Regulation 2025/2518 came into force, establishing additional procedural rules for the application of the GDPR in cross-border cases. The aim is to harmonize the handling of complaints and the conduct of investigations. Among the new features are formal criteria for the admissibility of cross-border complaints, an early resolution procedure, strict deadlines for the one-stop shop, the introduction of administrative and cooperation files, and the right to be heard before decisions are taken. The Regulation will apply from April 2, 2027.
European Data Protection Day: focus on digital education for young people
On the occasion of the 20th European Data Protection Day, the Authority emphasized the need for digital education for minors, a particularly vulnerable category in the digital context. Young people struggle to recognize online dangers, tend to trust strangers, and share personal data without fully understanding the long-term consequences.
The Authority has updated two key tools: the guide “Social privacy. How to protect yourself in the age of social media” for more responsible and informed use of digital platforms, and the handbook “Privacy-proof schools” which addresses current issues such as the use of smartphones in the classroom and the introduction of artificial intelligence in teaching.
For DPOs in the education sector and companies that process children’s data, these updates are essential references for implementing compliant policies and effective awareness programs.
Report case: Data Protection Authority considers appeal to the Court of Cassation
The Italian Data Protection Authority has taken note of the Court of Rome’s ruling that annulled the sanction imposed on Rai for broadcasting the audio of a conversation between former minister Sangiuliano and his wife during the Report program. While respecting the court’s ruling, the Authority has reserved the right to appeal the decision to the Court of Cassation.
This case highlights the complexity of balancing the right to report news with the protection of personal data, particularly when private conversations involving public figures are involved. The decision to consider the appeal demonstrates the Authority’s determination to enforce the principles of privacy legislation, even in cases that are relevant to the media.
For DPOs in publishing companies, the case underscores the importance of clear procedures for managing sensitive content and the need for case-by-case assessments.
EDPB - EUROPEAN DATA PROTECTION BOARD
Stakeholder event on political advertising: express your interest
The EDPB is organizing a remote event on March 27, 2026, to gather feedback on the Guidelines on the processing of personal data for targeted political advertising, in accordance with the Regulation on transparency and targeting of political advertising.
The initiative represents a significant opportunity for DPOs to contribute to the development of regulatory standards in an increasingly relevant sector. Participation is open to individuals and organizations with specific expertise in the field. For privacy professionals, this event offers an opportunity to directly influence future regulatory interpretations and gain valuable insights into a rapidly evolving field.
The deadline for expressing interest is February 9. DPOs should consider participation as a strategic investment in understanding emerging regulatory dynamics.
Data Protection Day 2026: protecting children’s data online
Data Protection Day 2026 focuses on protecting children’s data online, a strategic issue for the EDPB. Children are particularly vulnerable in the digital world due to their limited ability to recognize risks and their tendency to share personal information unknowingly.
The EDPB has stepped up its efforts in this area. In February 2025, it adopted a Declaration on age assurance, setting out ten principles for age verification in line with the GDPR. In addition, “Privacy for Kids” is being developed, a multilingual educational hub with resources for parents, teachers, and educators.
For DPOs, these developments underscore the importance of implementing specific measures to protect minors, balancing effectiveness and proportionality in age verification, and adopting privacy-by-design approaches in services aimed at children.
EDPS - EUROPEAN DATA PROTECTION SUPERVISOR
🎙️ TechSonar Podcast: Meet your AI companion
January 26, 2026
What if a machine listens, remembers, and cares like a friend or even a loved one? Who benefits most from digital intimacy and who is most at risk? Does ‘data extraction through intimacy’ reshape our behavior and self-determination? Can we draw ethical and legal lines? We will discuss these and other questions raised around privacy, consent, and influence with Vítor Bernardo.
Happy Data Protection Day 2026!
January 28 marks Data Protection Day, an anniversary commemorating the Council of Europe’s Convention 108, the first binding international legal instrument for the protection of personal data. This day is an opportunity to reflect on the evolution of digital privacy and the fundamental rights of citizens.
For DPOs, Data Protection Day is a strategic opportunity to raise awareness among organizations about the importance of regulatory compliance and to promote a culture of privacy. It is the ideal time to organize training sessions, update company policies, and strengthen dialogue with management on the risks associated with data processing.
The celebration of this anniversary also highlights the DPO’s role as guarantor of data subjects’ rights and promoter of best practices in the management of personal data.
EUROPEAN COMMISSION
New formal investigation against X under the Digital Services Act
The European Commission has launched a new formal investigation under the Digital Services Act against X (formerly Twitter), focusing specifically on Grok and the platform’s recommendation systems. This initiative represents a further tightening of European supervision of large digital operators.
The investigation is part of the broader enforcement of the DSA, which requires very large online platforms to implement appropriate measures to mitigate systemic risks. For DPOs, this action underscores the importance of ensuring that algorithmic systems comply not only with GDPR principles but also with the DSA’s specific obligations regarding transparency and content management.
Adequacy decision for Brazil: new bridge for international transfers
The Commission has adopted an adequacy decision recognizing an adequate level of personal data protection in Brazil, facilitating data transfers between the EU and the South American country without the need for additional safeguards. Brazil thus becomes the seventeenth country to benefit from an EU adequacy decision.
This decision represents a significant development for organizations operating in both jurisdictions, markedly reducing bureaucratic requirements. DPOs will need to update their internal procedures and review existing agreements involving Brazil, as they can now rely on this adequacy decision rather than resorting to standard contractual clauses or other safeguards.
DMA specification proceedings for Google
The European Commission has launched two specification proceedings to support Google in fulfilling its obligations under the Digital Markets Act. These procedures concern interoperability and the sharing of online search data, which are crucial to ensuring a fairer, more open digital market.
For DPOs, this development underscores the importance of closely monitoring European regulatory developments, particularly when their organizations operate in interconnected digital ecosystems. The definition of technical specifications for interoperability could affect data flows and require adjustments to data protection impact assessments.
7th EU-Japan Cyber Dialogue
On January 28, 2026, the seventh cyber dialogue between the European Union and Japan was held in Brussels, furthering strategic cooperation on cybersecurity and digital resilience between the two jurisdictions.
Slovenia: infringement procedure for copyright
The Commission has sent Slovenia a letter of formal notice (INFR(2025)4023) for failing to correctly apply the InfoSoc Directive and the Directive on collective management of copyright.
FAMES: pilot line for semiconductors inaugurated
The FAMES pilot line for ultra-low-power semiconductors was inaugurated at CEA-Leti in Grenoble (France) as part of the European strategy to strengthen technological sovereignty in the chip sector.
EUROPEAN PARLIAMENT
Briefing - Understanding EU data protection policy
The European Parliament has published an update to its briefing on European data protection policy, highlighting the evolution of the regulatory framework from the GDPR to the present day. The document emphasizes how the “datafication” of everyday life and recent scandals have made data protection a growing political priority in the EU.
Particular attention is paid to the challenges that have emerged in implementing the GDPR, including difficulties in cross-border enforcement and tensions between compliance and competitiveness. For DPOs, the briefing highlights concrete concerns, such as the potential overload of SMEs with excessive compliance requirements and the complexities of balancing data access for security purposes with privacy protection.
The document also anticipates future legislative initiatives, including a possible permanent framework to replace temporary anti-online abuse rules and a digital omnibus regulation to ease specific requirements in support of competitiveness and AI development.
Ask the European Parliament 2025: the annual report
The EPRS has published its annual report, “Ask the European Parliament 2025,” which documents the activities of the citizens’ response service. In 2025, 10,184 individual requests and 8,809 campaign messages were handled. The main topics concerned EU democracy and parliamentary law (2,758 requests), freedom/security/justice (1,076), and foreign affairs (1,044). The most significant campaigns concerned GMO/NGT labeling, Gaza and human rights, and prisoners abroad.
Acceleration of energy infrastructure authorization procedures
Parliament is examining legislative proposal 2025/0400(COD) - COM(2025) 1007 of December 10, 2025, amending Directives (EU) 2018/2001, 2019/944, and 2024/1788. Part of the “European grids package,” the initiative aims to create a regulatory framework for transmission and distribution networks, storage, charging stations, and renewable projects, addressing critical issues such as inconsistent administrative systems, limited resources for national authorities, complex environmental impact assessments, and limited digitization.
COUNCIL OF THE EUROPEAN UNION
EU priorities for UN human rights forums 2026
On January 30, 2026, the Council approved conclusions on the EU’s priorities in UN human rights forums for 2026. The document confirms the universal commitment to the respect, protection, and fulfillment of human rights.
Geographical priorities include: Russia (in relation to the war of aggression against Ukraine and systemic violations in Russia and Belarus), the Occupied Palestinian Territories, Iran, Venezuela (for a peaceful and inclusive democratic transition), and Afghanistan (with a focus on the operational accountability mechanism).
The Council also emphasized the importance of the 20th anniversary of the UN Human Rights Council, the focus on ending impunity and accountability (with support for the International Criminal Court), and the protection of civic space online and offline, condemning transnational repression.
DIGITAL MARKETS & PLATFORM REGULATION
WhatsApp is designated as a Very Large Online Platform under the DSA
WhatsApp has become the first messaging service to be classified as a Very Large Online Platform (VLOP) under the European Digital Services Act, with 46.8 million users in the EU. The designation applies specifically to WhatsApp’s public channels, not private messaging.
As a VLOP, WhatsApp will now have to implement measures to mitigate systemic risks, including content that could threaten the well-being of minors or democratic processes. The first compliance report must be submitted within four months. For DPOs, this development highlights how regulatory classification can expand rapidly, requiring continuous monitoring of user thresholds and preparation for additional governance obligations.
New DSA investigation against X for Grok deepfakes
The European Commission has opened a new formal investigation into X under the Digital Services Act, focusing on the risks posed by the AI chatbot Grok. The investigation concerns the generation of sexually explicit deepfake images of real people, including minors, that have flooded the platform.
The Commission will assess whether X adequately assessed and mitigated the risks before integrating Grok into the platform. According to estimates, the chatbot generated up to 3 million non-consensual sexual images in just 11 days. For DPOs, this case highlights the importance of conducting thorough impact assessments before implementing AI systems, considering all potential risks to data subjects’ rights.
EU steps up scrutiny of X after Grok scandal
The European investigation into X extends beyond deepfakes to include the impact of the platform’s decision to switch to a Grok-based algorithm. The Commission does not rule out interim measures, such as modifying the algorithms or suspending the chatbot, although the threshold for such interventions remains high.
EU authorities argue that without their pressure, X would likely not have implemented subsequent restrictions on Grok. This proactive approach by the Commission demonstrates growing vigilance regarding the risks posed by AI integrated into social platforms. For DPOs, it underscores the importance of establishing continuous monitoring and rapid-response mechanisms for AI systems that could generate problematic content.
WhatsApp Channels under EU scrutiny for systemic risks
The designation of WhatsApp as a VLOP focuses specifically on Channels, where administrators can broadcast updates to large audiences of followers. With over 51.7 million EU users, the service has exceeded the critical threshold of 45 million, automatically triggering the more stringent DSA obligations.
Meta now has four months to assess and mitigate systemic risks on the platform, including illegal content, threats to civic discourse, elections, and fundamental rights. The distinction between private messaging (excluded) and public channels (included) highlights the complexity of applying digital regulations. For DPOs, this case illustrates how seemingly secondary features can trigger significant regulatory obligations when they reach sufficient scale.
AI STANDARDS AND CERTIFICATIONS
CEN-CENELEC: MoU with the Fundamental Rights Agency for AI
CEN and CENELEC have signed a Memorandum of Understanding with the EU’s Fundamental Rights Agency (FRA) to collaborate on European standardization in the field of artificial intelligence. The FRA will provide specific advice on fundamental rights, ensuring that technical standards adequately account for their impact on people’s rights.
10th Cybersecurity Standardization Conference
Registration is now open for the 10th Cybersecurity Standardization Conference, scheduled for March 12, 2026, in Brussels. The event, jointly organized by CEN, CENELEC, ETSI, and ENISA, will address the challenges of standardization for the Cyber Resilience Act and the future of European cybersecurity standardization.
Source: CEN-CENELEC
Source: ENISA
SBS: SME compatibility test for standards updated
On January 8, 2026, Small Business Standards (SBS) published a new updated version of its SME Compatibility Test, based on CEN/CENELEC and ISO Guide 17. The tool allows users to check whether standards under development are actually accessible and implementable by small and medium-sized enterprises.
Euralarm: guide to critical infrastructure protection
Euralarm has published a guide on precautionary measures for critical infrastructure protection, focusing on the physical security and resilience of vital installations in line with the CER Directive.
ForHumanity: certifications for AI auditors on the rise
ForHumanity, a nonprofit organization dedicated to the independent auditing of AI systems, continues to expand its certification programs, with over 40 schemes and more than 7,000 audit criteria for compliance with the GDPR, the EU AI Act, and the Digital Services Act. The organization is collaborating with CEN-CENELEC JTC 21 to develop audit criteria aligned with European standards.
ForHumanity Italy: “Conversations on AI, Ethics, and Standards”
The Italian chapter of ForHumanity has launched the project “Conversations on AI, Ethics, and Standards,” a series of meetings with industry experts dedicated to spreading the culture of independent auditing of AI systems in Italy.
ForHumanity is an international nonprofit organization with a mission to make artificial intelligence safe for everyone. Through an open, collaborative process involving over 1,500 contributors from around the world, it develops independent audit criteria, certification schemes, and training programs focused on ethics, bias, privacy, trust, and cybersecurity, and operationalizes fundamental regulations such as the GDPR and the EU AI Act.
The web meetings and podcasts promoted by ForHumanity Italy address AI governance, data protection, algorithmic ethics, and international standards. The initiative stems from the collaboration between the Fellows and Italian members of ForHumanity, volunteers committed to promoting a responsible and human-centric approach to technological innovation in Italy.
UNINFO: ISO/IEC 5259 standards on AI data quality transposed into Italian
UNINFO has made the four parts of the UNI CEI EN ISO/IEC 5259 standard “Artificial intelligence - Data quality for analysis and machine learning (ML)” available in Italian. The standard provides tools and guidelines to ensure reliable, transparent, and appropriate data in AI systems. The UNI/CT 533 Commission acts as the national mirror of ISO/IEC JTC 1/SC 42 activities and participates in the work of CEN-CENELEC JTC 21.
INTERNATIONAL DEVELOPMENTS
2026: a crucial year for global data protection
2026 is shaping up to be a turning point for international privacy, with three converging forces redefining the regulatory landscape. The unexpected reopening of the GDPR represents the most significant change, introducing two important new developments: abandoning technological neutrality with specific references to AI, and new legitimate grounds for the processing of sensitive data in the training of intelligent systems.
These European developments will inevitably have global repercussions, influencing national legislation inspired by the GDPR model. For DPOs, this means preparing for a more complex regulatory framework in which AI and data protection will need to find a new balance. Digital geopolitics and the safety of minors online will emerge as strategic priorities in the coming months.
Netherlands: digital sovereignty alert from privacy authority
The Dutch Autoriteit Persoonsgegevens has issued an urgent warning about the country’s technological dependence, triggered by the acquisition of DigiD (the national digital identity service) by a US company. The Authority denounces the lack of exit strategies when critical suppliers are acquired by non-European companies, warning of the risk of “serious social upheaval.”
Recommendations include applying the European Commission’s cloud sovereignty criteria in public procurement, with mandatory minimum scores. Contractual clauses allowing for immediate termination in the event of non-European acquisitions are also proposed. For DPOs, this case demonstrates the growing importance of digital sovereignty in impact assessments and supplier selection.
France: CNIL fines France Travail €5 million for data breach
The CNIL fined France Travail (formerly Pôle Emploi) €5 million for a breach involving the data of all users registered over the last 20 years. The attack, using social engineering techniques, exploited compromised CAP EMPLOI consultant accounts to access Social Security numbers, email addresses, and contact details.
The French Authority deemed the security measures “inadequate,” noting that they were not absent but poorly implemented. The fine includes an obligation to demonstrate the corrective measures adopted, with a penalty of €5,000 per day of delay. The case highlights that even public bodies are subject to severe fines and that DPOs must ensure security measures are proportionate to the risk, especially for large-scale processing.
CYBERSECURITY
Attack on the Polish power grid: critical vulnerabilities in energy infrastructure
The Polish government has confirmed that Russian hackers compromised parts of the national power grid in late December 2025, exploiting serious security gaps. The attackers targeted 30 sites, including wind farms, solar farms, and cogeneration plants, using default credentials and systems without multi-factor authentication.
Researchers at ESET and Dragos attribute the attack to the Sandworm/ELECTRUM group, known for causing blackouts in Ukraine in 2015, 2016, and 2022. The hackers used the “DynoWiper” malware to destroy industrial control systems, rendering several pieces of equipment inoperable and leaving them unrecoverable.
For DPOs in the energy sector, this incident underscores the urgency of implementing basic security controls and robust business continuity plans, given that critical infrastructure is a primary target of state-sponsored attacks.
Technical analysis of the ELECTRUM attack: new Russian operating model
Dragos has published a detailed analysis of the Polish attack, identifying a new Russian operating strategy that separates responsibilities between KAMACITE (initial access) and ELECTRUM (final execution). This division of labor enables greater flexibility and resilience in OT (Operational Technology) environments.
The attack represents the first documented large-scale cyberattack against distributed energy resources (DER), targeting communication and control systems critical to network operations. Although no blackouts occurred, the attackers demonstrated their ability to access strategic OT systems.
The methodology highlights the evolution of cyber threats targeting critical infrastructure, requiring DPOs to adopt layered security measures and continuously monitor industrial systems.
Sandworm strikes again: ten years after Ukraine
ESET attributed the attack on Poland with “medium confidence” to the Sandworm group, a unit of the Russian GRU specializing in cyber sabotage. The attack was timed to coincide with the tenth anniversary of the first malware-induced blackout in Ukraine in 2015.
The DynoWiper malware used confirms the operational signature of Sandworm, which has a long history of using destructive tools such as CaddyWiper and WhisperGate against critical infrastructure. The attack aimed to disrupt communications between renewable systems and energy distribution operators.
The incident highlights how geopolitical tensions translate into concrete cyber threats to civilian infrastructure, underscoring the need for DPOs to consider geopolitical context when assessing risks and implementing protective measures.
ICS devices irreversibly compromised
SecurityWeek reports that the Russian attack caused permanent damage to industrial control systems (ICS) at 30 Polish energy sites. Communication and control systems were “bricked”—rendered completely unusable—through targeted use of destructive malware.
The incident represents a significant escalation in cyberwarfare tactics, with the goal no longer limited to temporary service disruption but to the physical destruction of critical hardware. This approach exponentially increases restoration costs and recovery times.
For DPOs, the event highlights the need to implement hardware backup strategies and disaster recovery procedures that account for the destruction of primary systems, and to evaluate specific cyber insurance coverage for physical damage.
Strategic implications for critical infrastructure security
SecurityWeek’s second report confirms the attack’s attribution to the Sandworm group, highlighting that, ten years after its first success in Ukraine, the group has refined its destructive capabilities. The use of data-wiping malware represents a significant tactical evolution.
The attack demonstrates that state-sponsored APT groups are intensifying operations against NATO allies, using energy infrastructure as a vehicle for geopolitical pressure. Poland, a strong supporter of Ukraine, thus becomes a target for cyber retaliation.
This scenario requires DPOs to conduct a comprehensive review of threats, considering not only technical aspects but also geopolitical implications in cyber risk management, and to implement specific controls for state-sponsored threats.
TECH & INNOVATION
Microsoft illegally installed cookies on students’ devices
The Austrian Data Protection Authority (DSB) has ruled that Microsoft violated the GDPR by installing tracking cookies, without consent, on the device of a minor student using Microsoft 365 Education. The cookies in question analyze user behavior, collect browser data, and are used for advertising purposes.
The decision, obtained by the campaign group None of Your Business (noyb), highlights a critical issue: neither the school nor the Austrian Ministry of Education was aware of the tracking. Microsoft now has four weeks to stop using these cookies on minors’ devices.
For DPOs, this case underscores the importance of conducting thorough assessments of education service providers and not completely delegating GDPR compliance responsibility to technology platforms, especially when it comes to children’s data.
SCIENTIFIC RESEARCH
Selection of the most relevant papers of the week from arXiv on AI, Machine Learning, and Privacy
Machine Unlearning and Privacy
Per-parameter Task Arithmetic for Unlearning in Large Language Models - New approach to removing private information from language models through selective parameter subtraction. The technique reduces “over-forgetting” that can compromise other model features. Crucial for compliance with the right to erasure under the GDPR (a generic sketch of the parameter-subtraction idea follows this group of papers). arXiv
Representation Unlearning: Forgetting through Information Compression - Innovative framework that operates directly in the model’s representation space instead of modifying parameters. It promises greater stability and computational efficiency for the removal of sensitive data, a key element for privacy compliance. arXiv
From Logits to Latents: Contrastive Representation Shaping for LLM Unlearning - Introduces CLReg, a contrastive regularizer that identifies and separates concepts to be forgotten in internal representations. Addresses the problem of sensitive information persisting in latent representations, which is critical for compliance audits. arXiv
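For readers who want an intuition for how parameter-subtraction unlearning works, below is a deliberately simplified Python sketch of the general task-arithmetic idea (negating the weight delta learned on the data to be forgotten). It is a generic illustration with assumed names and shapes, not the method of any of the papers cited above.

```python
# Generic illustration of task-arithmetic unlearning: subtract the weight delta
# learned while fine-tuning on the data to be forgotten. Simplified sketch of the
# general idea only, not the exact technique of the papers above.
import numpy as np

def unlearn_by_task_arithmetic(base_weights, forget_finetuned_weights, alpha=1.0):
    """Return weights with the 'forget task' direction removed.

    base_weights / forget_finetuned_weights: dicts of parameter name -> np.ndarray,
    where the latter was obtained by fine-tuning the base model on the data to forget.
    alpha: scaling factor controlling how aggressively the direction is removed.
    """
    unlearned = {}
    for name, w_base in base_weights.items():
        delta = forget_finetuned_weights[name] - w_base  # "task vector" for the forget set
        unlearned[name] = w_base - alpha * delta         # negate the task vector
    return unlearned

# Toy example with two tiny parameter tensors.
base = {"layer1": np.ones((2, 2)), "layer2": np.zeros(3)}
forget_ft = {"layer1": np.ones((2, 2)) * 1.5, "layer2": np.array([0.1, -0.2, 0.0])}
print(unlearn_by_task_arithmetic(base, forget_ft, alpha=0.8))
```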
AI Security and Vulnerability
RedSage: A Cybersecurity Generalist LLM - Specialized language model for cybersecurity operations, trained on 11.8B domain-specific data tokens. Offers privacy-preserving alternatives to proprietary APIs for organizations that handle sensitive data and must comply with localization requirements. arXiv
The Trojan in the Vocabulary: Stealthy Sabotage of LLM Composition - Demonstrates critical vulnerabilities in the “tokenizer transplant” process used to combine models from different sources. Highlights risks to the AI supply chain and the need for thorough security checks before enterprise deployment. arXiv
Hardware-Triggered Backdoors - Reveals how numerical variations in hardware can be exploited to create backdoors in ML models. Raises critical questions about the integrity of model verification and the need for hardware checks in algorithmic impact assessments. arXiv
Privacy and Data Generation
SmartMeterFM: Unifying Smart Meter Data Generative Tasks Using Flow Matching Models - Framework for generating synthetic data from smart meters, addressing limitations due to privacy regulations. Relevant for utilities that must balance operational utility and personal data protection under industry regulations. arXiv
Membership Inference Attacks Against Fine-tuned Diffusion Language Models - First systematic analysis of inference attacks on language diffusion models, identifying new privacy vulnerabilities. Essential for DPOs assessing re-identification risks in systems based on these emerging models (a simplified illustration of the attack idea follows). arXiv
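To illustrate why membership inference matters when assessing re-identification risk, here is a minimal, generic sketch of the classic loss-threshold attack idea: a suspiciously low loss on a record suggests it was part of the training set. It is a simplified illustration only; the cited paper analyzes far more sophisticated variants, and the names and threshold below are assumptions.

```python
# Generic loss-threshold membership inference sketch: if the model's loss on a
# record is below a calibrated threshold, guess that it was part of the training set.
# Illustrates the attack class only; not the cited paper's method.
import math

def cross_entropy(predicted_prob_of_true_label: float) -> float:
    """Loss of the model on a record, given its probability for the true label."""
    return -math.log(max(predicted_prob_of_true_label, 1e-12))

def infer_membership(prob_true_label: float, threshold: float = 0.5) -> bool:
    """Guess 'member' when the loss is below the threshold (model is suspiciously confident)."""
    return cross_entropy(prob_true_label) < threshold

# Toy example: members tend to receive high probability on their true label.
samples = [("likely member", 0.97), ("likely non-member", 0.35)]
for label, p in samples:
    print(label, "-> member?", infer_membership(p))
```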
AI ACT IN A NUTSHELL - Part 5
Article 10 - Data and data governance
After examining Article 9 on risk management systems in Part 4, we continue our journey through the AI Act by analyzing Article 10, which regulates one of the most critical aspects of developing high-risk AI systems: data and its governance.
The heart of Article 10
Article 10 establishes stringent requirements for the datasets used for training, validation, and testing high-risk AI systems. The Regulation recognizes that data quality is critical to the behavior and performance of an artificial intelligence system: poor-quality, unrepresentative, or biased data can result in discriminatory or erroneous decisions with significant impacts on people’s fundamental rights.
Data governance and management practices
Paragraph 2 of Article 10 lists the elements that must be subject to appropriate governance practices:
- Relevant design choices: architectural decisions that influence the collection and use of data
- Data collection processes and origin: including, for personal data, the original purpose of collection
- Preparation operations: annotation, labeling, cleaning, updating, enrichment, and aggregation
- Formulation of assumptions: in particular, regarding what the data should measure and represent
- Assessment of availability and adequacy: quantity and suitability of the necessary data sets
- Examination of possible biases: in particular, those that may affect health, safety, fundamental rights, or lead to discrimination prohibited by Union law
- Appropriate measures to detect, prevent, and mitigate identified biases
- Identification of gaps or deficiencies that prevent compliance with the Regulation
Data quality requirements
Paragraph 3 stipulates that data sets must be (a minimal check along these lines is sketched after the list):
- Relevant to the intended purpose
- Sufficiently representative of the population or groups on which the system will be used
- Free from errors to the extent possible
- Complete in relation to the intended purpose
- Endowed with appropriate statistical properties, including those relating to the persons or groups concerned
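To make these requirements more tangible, below is a minimal, illustrative Python sketch of a representativeness check that compares group shares in a training set against a reference population and flags large gaps. Column names, reference shares, and the tolerance are hypothetical choices, not regulatory thresholds.

```python
# Illustrative sketch only: a naive representativeness check for a training set,
# inspired by the Article 10(3) requirements. Column names, reference shares,
# and the 20% tolerance are hypothetical, not regulatory thresholds.
from collections import Counter

def representativeness_report(records, attribute, reference_shares, tolerance=0.2):
    """Compare the share of each group in `records` with a reference share.

    records: list of dicts, e.g. [{"age_band": "18-25", ...}, ...]
    reference_shares: expected population shares, e.g. {"18-25": 0.12, ...}
    tolerance: maximum relative deviation before a group is flagged.
    """
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    report = {}
    for group, expected in reference_shares.items():
        observed = counts.get(group, 0) / total if total else 0.0
        deviation = abs(observed - expected) / expected if expected else float("inf")
        report[group] = {
            "observed_share": round(observed, 3),
            "expected_share": expected,
            "flagged": deviation > tolerance,  # candidate gap to document and mitigate
        }
    return report

if __name__ == "__main__":
    sample = [{"age_band": "18-25"}] * 5 + [{"age_band": "26-64"}] * 90 + [{"age_band": "65+"}] * 5
    print(representativeness_report(sample, "age_band",
                                    {"18-25": 0.12, "26-64": 0.63, "65+": 0.25}))
```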
The exception for sensitive data (paragraph 5)
One of the most innovative and controversial provisions is paragraph 5, which exceptionally allows providers to process special categories of personal data (sensitive data under the GDPR) when strictly necessary to detect and correct biases. This provision recognizes a practical reality: to verify whether a system discriminates, for example, based on ethnic origin or religion, it may be necessary to have such information.
However, this processing is subject to strict conditions (an illustrative bias-check sketch respecting some of these safeguards follows the list):
- Bias cannot be effectively detected using synthetic or anonymized data
- Technical restrictions must be applied to the reuse of data
- State-of-the-art security and privacy measures must be implemented, including pseudonymization
- Data must be protected with strict access controls
- Sensitive data cannot be transmitted or transferred to other parties
- Data must be deleted once the bias has been corrected or at the end of the retention period
- Records of processing activities must document the reasons why the processing was strictly necessary
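As a purely illustrative sketch of how some of these safeguards could be reflected in code, the snippet below pseudonymizes a special-category attribute with a keyed hash, uses it only to compute per-group outcome rates (a simple demographic-parity style check), and deletes the sensitive values once the analysis is done. Field names, the salt handling, and the metric are assumptions; a real implementation would also require access controls, logging, and legal review.

```python
# Hedged illustration of an Article 10(5)-style bias check: pseudonymize the
# special-category attribute, measure outcome gaps per group, then delete the data.
# Names, salt handling, and the chosen metric are assumptions, not prescribed by the AI Act.
import hashlib
import hmac

SECRET_SALT = b"replace-with-a-secret-kept-under-strict-access-control"

def pseudonymize(value: str) -> str:
    """Keyed hash so groups remain comparable without keeping the raw attribute in the results."""
    return hmac.new(SECRET_SALT, value.encode(), hashlib.sha256).hexdigest()[:12]

def outcome_rates_by_group(records, group_field, outcome_field):
    """Positive-outcome rate per (pseudonymized) group, e.g. loan approvals."""
    totals, positives = {}, {}
    for r in records:
        g = pseudonymize(r[group_field])
        totals[g] = totals.get(g, 0) + 1
        positives[g] = positives.get(g, 0) + (1 if r[outcome_field] else 0)
    return {g: positives[g] / totals[g] for g in totals}

records = [
    {"ethnicity": "group_a", "approved": True},
    {"ethnicity": "group_a", "approved": True},
    {"ethnicity": "group_b", "approved": False},
    {"ethnicity": "group_b", "approved": True},
]

rates = outcome_rates_by_group(records, "ethnicity", "approved")
print("per-group approval rates:", rates)

# Once the bias analysis is complete, the sensitive attribute is removed
# (Article 10(5) requires deletion after correction or at the end of retention).
for r in records:
    del r["ethnicity"]
```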
Practical implications for organizations
For DPOs and AI compliance officers, Article 10 requires:
- Rigorous documentation: every stage of the data lifecycle must be tracked and documented (see the sketch after this list)
- Integrated impact assessments: DPIAs will need to be coordinated with AI Act compliance assessments
- Multidisciplinary skills: data governance requires collaboration between data scientists, lawyers, and ethics experts
- Periodic audits: regular checks on data quality and the presence of bias
- Procedures for processing sensitive data: specific policies when using special categories of data for bias correction
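As an illustration of the documentation point above, here is a minimal, hypothetical sketch of a machine-readable record covering the Article 10(2) governance elements for a single dataset. The fields and example values are assumptions for illustration, not a prescribed template.

```python
# Hypothetical dataset governance record covering the Article 10(2) elements.
# Field names and values are illustrative, not an official documentation format.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class DatasetGovernanceRecord:
    name: str
    design_choices: str                 # relevant design choices
    collection_process: str             # collection processes and origin of the data
    original_purpose: str               # for personal data: original purpose of collection
    preparation_steps: list = field(default_factory=list)  # labelling, cleaning, enrichment...
    assumptions: str = ""               # what the data is meant to measure and represent
    bias_examinations: list = field(default_factory=list)  # checks performed and outcomes
    mitigation_measures: list = field(default_factory=list)
    identified_gaps: list = field(default_factory=list)

record = DatasetGovernanceRecord(
    name="loan-scoring-train-v3",
    design_choices="tabular features only; no free-text fields",
    collection_process="exported from CRM, 2023-2025",
    original_purpose="credit risk assessment (contract performance)",
    preparation_steps=["deduplication", "label review by two annotators"],
    assumptions="approved/rejected label reflects the final credit decision",
    bias_examinations=["representativeness check by age band", "approval-rate gap by gender"],
    mitigation_measures=["re-weighting of under-represented age bands"],
    identified_gaps=["no data on applicants rejected before formal application"],
)

print(json.dumps(asdict(record), indent=2))
```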
Towards Article 11
Article 10 is the foundation for the quality and reliability of high-risk AI systems. In the next installment, we will analyze Article 11, which governs the technical documentation providers must prepare to demonstrate their systems’ compliance with the Regulation’s requirements.
FROM THE NICFAB BLOG
The hidden cost of digital
January 31, 2026
A personal reflection on the environmental impact of digital technology and AI. From paper to data centers: we have replaced a visible problem with an invisible one—awareness without illusions.
Data Protection Day 2026: 45 years of Convention 108 and the challenge of staying vigilant
January 28, 2026
On January 28, 2026, we celebrate Data Protection Day, the anniversary of Convention 108. Between proposals to revise the GDPR, simplifications, and new technologies, the protection of personal data requires constant vigilance.
EVENTS AND MEETINGS
Implementation of the Data Act: Workshop on guidelines on selected definitions (published on January 26, 2026)
European Commission | Info
Meeting with NOYB (published on January 27, 2026)
EDPB | Info
Apply AI sectoral deep dive - robotics & manufacturing (published on January 27, 2026)
European Commission | Info
Data Protection Day 2026: keeping children’s personal data safe online (published on January 28, 2026)
EDPB | Info
Stakeholder event on political advertising: express your interest (published on January 29, 2026)
EDPB | Info
Info Day: Building an ecosystem for GenAI in Public Administrations (published on February 9, 2026)
European Commission | Info
Data takes flight: Navigating privacy at the airport (published on February 12, 2026)
EDPS | Info
The 15th annual Data Protection and Privacy Conference - Forum Europe (published on March 17, 2026)
EDPB | Info
Happy Data Protection Day 2026!
EDPS | Info
Conclusion
Data Protection Day 2026 marked a strategic turning point in European data protection policies, with supervisory authorities consolidating an increasingly sophisticated dual-track approach: on the one hand, they are stepping up enforcement in the world of work and public communication, and on the other, they are investing heavily in preventive education, especially for minors.
The European Commission’s decision to recognize Brazil’s adequacy represents much more than a simple administrative act. It is the first major geographical expansion of the post-Brexit adequacy regime, opening up new scenarios for international transfers to Latin America. This strategic move, combined with the update to the EU-US Data Privacy Framework FAQs, redraws the global map of data flows and offers multinationals new options for localizing cloud services and digital operations.
In the domestic context, the Italian Data Protection Authority is increasingly assertive in interpreting the GDPR in social and work-related contexts. The ban on monitoring employees’ driving style and penalties for unlawful access to company emails confirm a trend already observed in previous weeks: workplace privacy is no longer a matter of technical compromises but of fundamental principles. Organizations that still view employee monitoring as a purely technological issue risk finding themselves unprepared in the face of increasingly rigorous enforcement.
Particularly significant is the converging focus of the EDPB, EDPS, and national authorities on the protection of minors online. The stakeholder event on political advertising, the focus on AI companions and children, and the update of the school handbooks reveal a coordinated strategy that goes beyond simple GDPR compliance. The authorities seem to have recognized that the next frontier of data protection lies in fostering a “culture of privacy” among younger generations, an approach that could prove more effective than ex post sanctions alone.
The Commission’s investigation into X and Grok under the Digital Services Act introduces a particular level of complexity. The overlap between GDPR violations, DSA violations, and potential violations of the Digital Markets Act creates a layered regulatory landscape that requires cross-functional expertise. For specialized law firms, this scenario opens up opportunities but also requires developing integrated expertise spanning data protection to digital competition law.
The Russian cyberattack on the Polish power grid, resulting in damage to industrial devices, raises urgent questions about the convergence of cybersecurity and data protection. Cyber incidents affecting critical infrastructure not only pose operational risks but also result in massive exposures of personal data, creating particularly complex breach notification scenarios when they involve national security.
The Microsoft case on cookies in Austrian schools is a wake-up call for the EdTech sector. The Dutch authority’s warning on technological dependence, meanwhile, is part of a broader debate on the reliance of public institutions on non-EU suppliers. This development could herald stricter restrictions on public procurement in the digital sector.
The €5 million fine imposed on France Travail shows that even public bodies are not immune to GDPR violations. This French precedent could encourage other European authorities to raise enforcement levels against the public sector, traditionally considered less exposed to significant sanctions.
The new EU Regulation on cross-border cases promises to streamline procedures that have often become bureaucratic mazes. However, it remains to be seen whether it will actually speed up the resolution of complaints or introduce new procedural complexities.
The week leaves fundamental questions unanswered: will the educational approach actually reduce the future burden of sanctions? How will the balance between European digital sovereignty and global competitiveness evolve? And above all, are organizations ready to face increasingly sophisticated enforcement that integrates multiple digital regulations?
📧 Edited by Nicola Fabiano
Lawyer - Fabiano Law Firm
🌐 Studio Legale Fabiano: https://www.fabiano.law
🌐 Blog: https://www.nicfab.eu
🌐 DAPPREMO: www.dappremo.eu
