Rethinking consent for health apps, wearables and AI
The novel concept of Standard Health Consent
12/18/2025 · 6 min read
Every day, millions of us generate person-generated health data (PGHD) from our watches, phones, sleep trackers and mental health apps. We use these tools to move more, sleep better, manage chronic conditions or simply feel a bit more in control of our lives.
Yet the data these tools generate almost never make it back into the systems that actually deliver our care – electronic medical records, national health information exchanges or AI models designed to improve population health.
A recent npj Digital Medicine article by Brückner and colleagues proposes a way out of this gap by introducing a Standard Health Consent (SHC) platform – a shared, user-centred consent layer for health apps and wearables.
It’s an idea that has significant potential not just for Europe’s emerging European Health Data Space (EHDS) but also for countries like the UAE that are investing heavily in digital health, AI and unified medical health records.
The problem: powerful data, weak consent
PGHD could be a goldmine for:
Personalised care (e.g. linking CGM data, sleep and activity to diabetes management)
Research and public health (e.g. early signals of outbreaks, real world evidence)
However, today we face three structural problems:
Fragmentation: Data are scattered across dozens of apps and device vendors, most of which sit outside the traditional healthcare IT infrastructure.
Opaque, one-off consents: Each app has its own privacy policy and consent flow. Users tick “I agree” to get to the feature they want, often with little real understanding of where their data may go next.
Misalignment with emerging regulation: Europe is moving towards the EHDS – a framework that will allow secondary use of health data (e.g. for research) in secure environments, sometimes without individual consent but with strong governance and oversight. For wellness apps and wearables, the European Data Protection Board has signalled that explicit consent should still play a central role. However, there is no common, practical mechanism to do that at scale.
The result is a paradox: huge volumes of highly personal health data but very little trustworthy, scalable and auditable consent.
The idea for Standard Health Consent (SHC)
Brückner et al. propose treating consent not as something each app reinvents but as a shared public utility – a bit like an identity provider or payment gateway, but for data rights.
At a high level, an SHC platform would:
Give people a single place to see and control which apps can share what health data, with whom and for which purposes (care vs research vs product development).
Give developers and data users a standard, regulator-aligned way to obtain and document that consent.
Integrate with health systems using interoperability standards such as HL7 FHIR and the FHIR Consent resource, so that consent choices are machine-readable and enforceable.
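To make the idea of machine-readable consent concrete, here is a minimal Python sketch of how an SHC-style choice could be expressed as an HL7 FHIR R4 Consent resource. This is an illustration under assumptions, not the authors' implementation: the patient identifier and the specific purpose codes are invented examples.

```python
# Sketch only: a consent choice rendered as a FHIR R4 Consent resource.
# The patient ID and the permitted purposes are hypothetical examples.

def build_consent(patient_id: str, permitted_purposes: list[str]) -> dict:
    """Build a FHIR R4 Consent resource permitting specific purposes of use."""
    return {
        "resourceType": "Consent",
        "status": "active",
        "scope": {
            "coding": [{
                "system": "http://terminology.hl7.org/CodeSystem/consentscope",
                "code": "patient-privacy",
            }]
        },
        "patient": {"reference": f"Patient/{patient_id}"},
        "provision": {
            "type": "permit",
            # v3-ActReason codes: TREAT = treatment, HRESCH = healthcare research
            "purpose": [
                {"system": "http://terminology.hl7.org/CodeSystem/v3-ActReason",
                 "code": code}
                for code in permitted_purposes
            ],
        },
    }

# Example: permit use for treatment and research, but nothing else.
consent = build_consent("example-123", ["TREAT", "HRESCH"])
```

Because the choice is a structured resource rather than a paragraph of legal text, downstream systems can enforce it automatically rather than rely on manual review.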
The proposed architecture has three main components:
SHC Connect – a consent module embedded into individual health apps (via API or iFrame).
SHC App / Portal – a central interface where users can review and update all their consents across apps.
SHC Service (backend) – a secure server that stores consent metadata and links it to pseudonymous identities, while the actual health data stay with the original controllers (apps, providers, HIEs).
The important bit is the user experience:
Data are grouped into intuitive categories (e.g. body measures, behaviour, environment and derived insights).
People can opt in or out of specific types of secondary use (research, product development, etc.), rather than a blunt “share everything with everyone” or “share nothing”.
They can come back to this later if they change their mind.
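The granular, revisable model described above can be sketched in a few lines of Python. The category and purpose names here are hypothetical examples, not an SHC specification.

```python
# Illustrative sketch of per-category, per-purpose consent choices
# that a user can grant, revoke and query over time.
from dataclasses import dataclass, field

@dataclass
class ConsentChoices:
    """Which data categories may be used for which purposes."""
    permissions: dict[str, set[str]] = field(default_factory=dict)

    def grant(self, category: str, purpose: str) -> None:
        self.permissions.setdefault(category, set()).add(purpose)

    def revoke(self, category: str, purpose: str) -> None:
        # Users can change their mind later: revoking narrows future use.
        self.permissions.get(category, set()).discard(purpose)

    def is_permitted(self, category: str, purpose: str) -> bool:
        return purpose in self.permissions.get(category, set())

choices = ConsentChoices()
choices.grant("body_measures", "care")
choices.grant("body_measures", "research")
choices.revoke("body_measures", "research")  # the user changes their mind
```

The point of the sketch is the shape of the decision: “yes to this category for this purpose”, rather than one blunt yes/no for everything.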
This is not just a nicer consent pop-up. It’s an attempt to build an infrastructure of trust around PGHD and AI.
Why this matters for AI in healthcare
Under the EU AI Act, developers of high-risk AI systems must document how training data were collected, on what legal basis and how representative they are.
A platform like SHC can:
Provide auditable, individual level consent trails for PGHD used in AI development.
Help manage fairness and bias by making participation and opt-out patterns visible (so we know whose data is in the training set).
Align real-world PGHD use with both data protection and AI safety requirements, instead of treating them as separate.
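The fairness point above can be made concrete: if consent choices are auditable, monitoring opt-out rates by group becomes a simple aggregation. The group labels and records below are invented for illustration only.

```python
# Hedged sketch: compute opt-out rates per group from consent records,
# so teams can see who is missing from a training set.
from collections import Counter

def opt_out_rates(records: list[tuple[str, bool]]) -> dict[str, float]:
    """records: (group_label, opted_in) pairs; returns opt-out rate per group."""
    totals, opt_outs = Counter(), Counter()
    for group, opted_in in records:
        totals[group] += 1
        if not opted_in:
            opt_outs[group] += 1
    return {group: opt_outs[group] / totals[group] for group in totals}

# Hypothetical age-band records: two opt-ins for 18-34, one of two for 65+.
records = [("18-34", True), ("18-34", True), ("65+", False), ("65+", True)]
rates = opt_out_rates(records)
```

A persistent gap in rates like these would be a signal that a model trained on the consented data may under-represent certain groups.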
What would a Standard Health Consent look like in the UAE?
The UAE is already ahead of many countries in building the foundational blocks for digital health:
Federal Law No. 2 of 2019 on the Use of ICT in Health Fields (the Health Data Law) regulates the handling of electronic health data, including restrictions on transfer outside the UAE and an emphasis on confidentiality, purpose limitation and consent.
Federal Decree-Law No. 45 of 2021 on Personal Data Protection (PDPL) sets cross-sector rules on lawful bases, data subject rights, transparency and security, largely aligned with global best practice.
At the Emirate level, Abu Dhabi has Malaffi and Dubai has NABIDH, while the national Riayati platform now ties these into a unified health information backbone across the country.
The DoH in Abu Dhabi has issued detailed healthcare data privacy and cybersecurity standards (ADHICS), and most recently a Responsible AI in Healthcare framework that mandates ethical data-use checks, KPI monitoring and audits for AI systems.
So where could an SHC-type platform fit in?
1. Connecting PGHD to Malaffi / Riayati
Right now, national and Emirate-level platforms primarily aggregate clinical and administrative data (lab results, diagnoses, medications, encounters). Wearables and consumer apps live largely outside that perimeter (although KSA, for example, is already looking at integrating WHOOP's physiological data analytics into the national health platforms).
An SHC-style layer could:
Let individuals express once, in a structured way, whether and how their app data (e.g. activity, heart rate, sleep, mood logs) should be linked to their unified medical record.
Allow clinicians to see PGHD where it is clinically relevant (e.g. pre-op fitness, long term adherence), without each app needing bespoke integration.
Provide researchers and AI teams working within UAE-based secure environments with clearly consented, provenance-rich datasets for model development.
This could be implemented as a national or Emirate-level consent service that app developers can plug into, similar in spirit to SHC’s “Connect” module but anchored in UAE legal and technical standards.
2. Aligning with UAE Health Data Law and PDPL
Any UAE implementation would need to respect:
Data localisation and cross-border transfer rules under the Health Data Law (limits on sending health data abroad, subject to specific approvals).
Data subject rights under the PDPL (access, correction, deletion, restriction, objection to processing).
At the same time, a consent platform of this type could make these rights clearly exercisable:
A single place where individuals can see which apps and services are processing their health-related data, for what purposes and under which legal basis.
Easy mechanisms to withdraw consent, request deletion, or restrict processing – coupled with obligations on connected apps to honour those requests and log compliance.
Critically, the platform itself should be governed by neutral, trusted institutions (e.g. health authorities or a regulated data intermediary), not a single commercial actor.
3. Supporting Responsible AI in Abu Dhabi and beyond
DoH Abu Dhabi’s Responsible AI framework already requires ethical data use checks, KPIs and audits for AI systems in the Emirate’s healthcare ecosystem.
A UAE-adapted SHC could:
Provide the consent and provenance backbone for AI models that incorporate PGHD, making it easier to demonstrate compliance during approvals and audits.
Help monitor participation and exclusion patterns: which patient groups are consistently opting out of PGHD sharing, and what does that mean for model fairness?
Serve as a practical bridge between legal frameworks (Health Data Law, PDPL, ADHICS) and the day-to-day realities of AI developers and clinical adopters.
Design choices that will make or break it
Whether in Europe or the UAE, a Standard Health Consent platform is not just a technical build. It’s also a governance and behavioural design challenge.
A few design principles stand out:
Consent flows should be clear, in plain language, with examples of real use cases
People should be able to say “yes to research, no to targeted advertising” or “yes to this hospital’s AI team, no to third party brokers”.
If data can be used in partly commercial projects (e.g. co-developed AI tools), that should be disclosed just as clearly
The UK’s GPDPR experience in 2021 shows that you can have a legally sound initiative and still lose public trust if people don’t feel informed or in control. Any UAE-level consent platform should be co-designed with patients, communities and clinicians, not only vendors and regulators.
Looking ahead: from compliance to confidence
We’re entering a world where it will be technically possible to combine:
Clinical records
High-frequency PGHD from wearables and apps
Powerful AI models that learn from both
The real constraint will not be technical, but the social and ethical acceptability of what is being proposed.
Standardised, user-centric consent infrastructure, such as the SHC concept in Europe, points towards a model where:
Individuals can actually see and shape how their data flows.
Health systems can tap into PGHD and AI without losing public trust.
Regulators can enforce rights and responsibilities with real-time, auditable evidence rather than paper promises.
For the UAE, which is positioning itself as a global hub for digital health and responsible AI, the question is not whether to tackle this problem but how fast we can move from app-by-app tick-boxes to a shared consent layer that matches the ambition of our digital health infrastructure.
Waypoint Legal Consultancy
Legal advice tailored to healthcare innovation.
© 2025. All rights reserved.
