Kowatsch et al. (2019) A Design and Evaluation Framework for Digital Health Interventions Study Guide


A Design and Evaluation Framework for Digital Health Interventions

Tobias Kowatsch, Lena Otto, Samira Harperink, Amanda Cotti & Hannes Schlieter · it – Information Technology, 61(5–6):253–263 · 2019

Framework Paper · Systematic Literature Review · Qualitative Content Analysis · DOI: 10.1515/itit-2019-0019

4 Life Cycle Phases · 331 Evaluation Criteria · 13 EC Categories · 98 Implementation Barriers · 36 Source Records
Central Argument
Despite a mature tradition of evidence-based medicine, systematic guidelines for designing and evaluating digital health interventions remain scarce. The DEDHI framework closes this gap by mapping 331 evaluation criteria and 98 implementation barriers across four iterative life cycle phases — from early research prototype to large-scale market deployment.

Research Question

How should digital health interventions (DHIs) be systematically designed, evaluated, and implemented across their full life cycle, and what evaluation criteria and implementation barriers are relevant at each stage?

The Problem: A Guidance Gap

Existing design and evaluation frameworks for health interventions — including the MRC Framework, MOST, and IDEAS — share a common shortcoming: none provide phase-specific guidance on which evaluation criteria to apply or which implementation barriers to anticipate as a DHI matures from prototype to product. Technology-related aspects such as maturity, scalability, and security are particularly underserved.

The MOST Foundation

The framework is built on an extended version of the Multiphase Optimization Strategy (MOST) developed by Collins et al. MOST was selected because it describes DHI development rigorously and iteratively, explicitly addresses just-in-time adaptive interventions (JITAIs) and micro-randomized trials, and focuses on behavioral health at the individual level. A fourth implementation phase — absent from the original MOST — was added by drawing on DHI life cycle models and the MRC Framework.

Scope and Contribution

DEDHI integrates research from three disciplines: behavioral medicine (behavior change components), medical informatics (clinical applications), and information systems (technology acceptance and barriers). The resulting framework is intended for both researchers and practitioners, and applies to all DHI types: mobile apps, web-based platforms, and hybrid interventions that combine digital coaching with human health professionals.

The DEDHI Framework at a Glance

Phase 1 — Preparation
  Goal: Define conceptual and technological foundation; run feasibility and acceptability study; select optimization criterion
  Technical maturity: Research prototype with basic functionality
  Key evaluation criteria: Ease of use, adherence, personalization, safety, privacy & security
  Key barriers: Individual characteristics, usability, expectations, planning, funding, regulatory issues

Phase 2 — Optimization
  Goal: Run optimization trials (e.g., a micro-randomized trial); identify the best DHI configuration
  Technical maturity: Elaborated prototype with full functionality for component-level testing
  Key evaluation criteria: Effectiveness (components), perceived benefit, content quality, aesthetics, adherence, service quality
  Key barriers: Social support, outcome expectations, usability, funding for equipment, cost, integration

Phase 3 — Evaluation
  Goal: Confirm effectiveness in a randomized controlled trial vs. a control condition
  Technical maturity: Elaborated prototype with full functionality for the RCT
  Key evaluation criteria: Effectiveness, perceived benefit, adherence, personalization, safety, privacy & security, accountability
  Key barriers: Funding, guidelines, methodology (missing proof of cost-effectiveness)

Phase 4 — Implementation
  Goal: Develop market-grade product; monitor reach, impact, and side effects; update content and technology
  Technical maturity: Product-grade DHI with long-term operational readiness
  Key evaluation criteria: Adherence, personalization, content quality, ethics, service quality, accountability
  Key barriers: Individual resources, interoperability, human technical support, regional infrastructure, reimbursement, culture
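The phase mapping above can be read as a simple lookup structure. The sketch below is an illustrative Python encoding; the dictionary layout and the `phases_for` helper are my own, while the phase names, goals, and criteria lists are transcribed from the table (note these are the paper's "key" criteria per phase, not the full set):

```python
# Illustrative encoding of the DEDHI phase table (structure is mine, contents from the paper).
DEDHI_PHASES = {
    1: {"name": "Preparation",
        "goal": "Define conceptual and technological foundation; feasibility study",
        "criteria": ["ease of use", "adherence", "personalization", "safety", "privacy & security"]},
    2: {"name": "Optimization",
        "goal": "Run optimization trials; identify the best DHI configuration",
        "criteria": ["effectiveness (components)", "perceived benefit", "content quality",
                     "aesthetics", "adherence", "service quality"]},
    3: {"name": "Evaluation",
        "goal": "Confirm effectiveness in an RCT vs. a control condition",
        "criteria": ["effectiveness", "perceived benefit", "adherence", "personalization",
                     "safety", "privacy & security", "accountability"]},
    4: {"name": "Implementation",
        "goal": "Develop market-grade product; monitor reach, impact, side effects",
        "criteria": ["adherence", "personalization", "content quality", "ethics",
                     "service quality", "accountability"]},
}

def phases_for(criterion: str) -> list[str]:
    """Return the names of all phases in which a key criterion applies."""
    return [p["name"] for p in DEDHI_PHASES.values() if criterion in p["criteria"]]
```

For example, `phases_for("adherence")` returns all four phase names, which mirrors the paper's observation that adherence is relevant across the entire life cycle.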

Top Evaluation Criteria Categories (from 331 total)

Category             Count   % of Total   Example
Ease of Use            87      26.3%      Using common interaction paradigms to minimize effort
Content Quality        41      12.4%      Real-time, location-based, accurate health information
Privacy & Security     41      12.4%      GDPR compliance; encrypted data transmission
Accountability         39      11.8%      Author details accessible within the DHI
Adherence              27       8.2%      Ratio of actual to intended usage per week
Aesthetics             19       5.7%      Consistent use of colors, figures, and fonts
Effectiveness          17       5.1%      Significant reduction in a clinical outcome measure
Ethics                  5       1.5%      Design for diverse cultural backgrounds and disabilities
Safety                  3       0.9%      Interaction limits to prevent addictive behavior


Framework Concepts

DHI
Digital Health Intervention. The act of intervening using tools and services that employ ICTs to improve prevention, diagnosis, treatment, monitoring, and management of health and lifestyle. Includes mHealth, telemedicine, telecare, and health IT.
DEDHI Framework
Design and Evaluation of DHIs. The proposed framework by Kowatsch et al. (2019) that maps evaluation criteria and implementation barriers to four iterative life cycle phases: Preparation, Optimization, Evaluation, and Implementation.
MOST
Multiphase Optimization Strategy (Collins et al., 2007). A behavioral intervention framework using sequential design, optimization, and confirmation phases. It forms the backbone of DEDHI’s first three phases, extended here with a fourth implementation phase.
Optimization Criterion
A pre-defined benchmark that provides the best expected outcome within technical and health-economic constraints. Identified in Phase 1 and used in Phase 2 to select among competing DHI component configurations.
DHI Life Cycle
The phases a digital health system passes through from prototypical development to an operational, market-deployed product. DEDHI recognizes four phases, informed also by Broens et al.’s four-layer model and the Technology Readiness Level framework.
Technology Readiness Level (TRL)
A systematic metric (Mankins, 1995) for assessing the maturity of a technology. DEDHI incorporates TRL concepts to describe the expected technical maturity at each phase, from basic research prototype to full operational product.

Research Methods

Systematic Literature Review (SLR)
A structured search across multiple databases (Embase, Medline, Scopus, ACM DL, IEEE Xplore) using predefined search strings. Used here to identify evaluation criteria for DHIs from 2,616 initial hits, narrowed to 36 included records.
Qualitative Content Analysis (QCA)
A method (Mayring, 2000) for systematically categorizing textual data. Used in DEDHI to consolidate 331 evaluation criteria into 13 inductive categories, and to map both criteria and barriers deductively to the four framework phases.
Micro-Randomized Trial (MRT)
An experimental design (Klasnja et al., 2015) for developing JITAIs in which participants are randomized at each decision point. Recommended in DEDHI’s Optimization Phase to test and select effective DHI components.
Randomized Controlled Trial (RCT)
The gold standard for confirming DHI effectiveness, used in DEDHI’s Evaluation Phase (Phase 3). The optimized DHI is compared against a control condition (e.g., treatment as usual). If it fails to outperform the control, iteration back to Phase 1 is recommended.
Just-in-Time Adaptive Intervention (JITAI)
Interventions (Nahum-Shani et al., 2018) that deliver tailored support at moments of need based on real-time sensing. MOST’s explicit treatment of JITAIs was a key reason it was selected as DEDHI’s foundation.
Backward Search
A literature search strategy that traces cited references within identified key papers (here, Nouri et al. 2018) back to their sources. One of three complementary search approaches used in the DEDHI systematic review, covering articles from 2000 to 2016.

Barriers & Risks

Implementation Barrier
Any factor that hinders the successful adoption or scaling-up of a DHI. The 98 barriers identified from a prior systematic review of reviews (Otto & Harst, 2019) were consolidated into 26 categories and mapped to DEDHI’s four phases.
Regulatory Issues
Barriers arising from governmental and non-governmental requirements, such as liability, jurisdiction, and medical device classification. Relevant from Phase 1 onward, as commercial DHI companies face hard regulatory requirements that research teams may not.
Framing Conditions
Implementation barriers that cannot be overcome during the DHI life cycle itself, such as missing cooperation incentives, unclear responsibilities, or disease-specific constraints. These sit outside the DEDHI framework phases and represent structural challenges.
Funding Barrier
One of two barriers (with cost) that apply across all four DEDHI phases. Encompasses lack of start-up funding, insufficient equipment funding, and absence of long-term maintenance funding — a persistent challenge across the entire DHI development pipeline.
Individual Characteristics (Barrier)
The largest barrier category (15 of 106 assignments), covering patient demographics, resistance to change, healthcare provider reluctance to cooperate, and lack of trust in colleagues or health policy — all of which must be addressed through user-centered design from Phase 1.

Evaluation Criteria

Ease of Use
The degree to which effort is required to take advantage of a DHI (e.g., using common interaction paradigms). The dominant criterion category with 87 of 331 criteria (26.3%), relevant from Phase 1 onward.
Accountability
The degree to which information about the DHI is made explicit for usage decisions — for example, accessible details about the intervention author. Becomes especially critical in Phases 3 and 4 as DHIs reach broader populations.
Adherence
The ratio of actual usage to intended usage of a DHI (e.g., completing 4 out of 5 prescribed exercises per week). One of only two criteria present across all four DEDHI phases, alongside personalization.
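The ratio in this definition is straightforward to compute. A hypothetical helper in Python; the function name and the cap at 1.0 are my assumptions, not the paper's:

```python
def adherence(actual_uses: int, intended_uses: int) -> float:
    """Adherence = actual usage / intended usage.

    Capped at 1.0 so over-use does not inflate the score (my assumption,
    not specified in the paper's definition)."""
    if intended_uses <= 0:
        raise ValueError("intended_uses must be positive")
    return min(actual_uses / intended_uses, 1.0)

# Example from the definition: 4 of 5 prescribed exercises completed this week.
weekly_adherence = adherence(4, 5)  # 0.8
```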
Effectiveness
The degree to which a DHI contributes to enhancing an individual’s health behavior or condition. Notably, despite being the primary goal of evidence-based medicine, it ranks only 8th among EC categories — reflecting a focus on usability at the expense of clinical outcomes in the literature.
Personalization
The degree to which a DHI adapts to the individual’s needs — for example, adjusting a daily step goal based on the user’s current capabilities. Present across all four DEDHI phases, reflecting the centrality of adaptive design to effective DHIs.
Ethics (EC Category)
The degree to which a DHI addresses ethical aspects, such as inclusive design for diverse cultural backgrounds and disabilities. One of the least-represented categories with only 5 of 331 criteria, alongside safety (3 criteria) — flagged as a significant gap in the field.

Study Design at a Glance

DEDHI is a conceptual framework paper combining two inputs: a new systematic literature review on DHI evaluation criteria, and a pre-existing review of implementation barriers (Otto & Harst, 2019). Both sets of findings were then mapped deductively to an extended version of MOST using qualitative content analysis. The paper is theoretical — the framework was not empirically validated in the field.

Literature Search: Evaluation Criteria

  • Starting point: Nouri et al. (2018) systematic review of mHealth quality criteria, updated and broadened to cover all DHI types (mobile, web-based, hybrid)
  • Three parallel searches: (1) backward search in Nouri et al. sources from 2000–2016; (2) updated Nouri search extended to DHIs from Dec 2016–May 2019; (3) extended search of socio-technical databases (ACM DL, IEEE Xplore) and A/B-ranked digital health journals
  • Search term structure: (evaluation OR criteria OR scoring) AND (intervention OR app OR therapy) AND (health OR clinical) AND (digital OR mHealth OR smartphone) — applied to title and abstract
  • Inclusion criteria: original, peer-reviewed, English-language works describing a tool with evaluation criteria for DHIs; systematic reviews excluded but their source references screened
  • Result: 2,616 initial results → 36 included records → 331 evaluation criteria extracted
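The four AND-connected OR-groups of the search string can be expressed as a small screening filter. A sketch under the assumption that matching is a crude case-insensitive substring test on title plus abstract (the actual databases apply their own query syntax and field logic):

```python
# The four AND-connected OR-groups from the DEDHI search string.
TERM_GROUPS = [
    ("evaluation", "criteria", "scoring"),
    ("intervention", "app", "therapy"),
    ("health", "clinical"),
    ("digital", "mhealth", "smartphone"),
]

def matches_search(title: str, abstract: str) -> bool:
    """True if every group contributes at least one term.

    Substring matching is deliberately crude ("app" also hits "apply");
    real database engines tokenize and stem instead."""
    text = f"{title} {abstract}".lower()
    return all(any(term in text for term in group) for group in TERM_GROUPS)
```

For instance, a record titled "Evaluation criteria for a digital health app" satisfies all four groups, while one missing any evaluation/criteria/scoring term is screened out.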

Consolidation into Categories

  • Two authors independently reviewed all 331 extracted criteria and consolidated them into inductive categories using qualitative content analysis (Mayring, 2000)
  • Disagreements resolved through discussion; a third author consulted when consensus could not be reached bilaterally
  • Result: 13 categories ranging from Ease of Use (87 criteria, 26.3%) to Safety (3 criteria, 0.9%)
  • Implementation barriers: a pre-existing set of 98 barriers from Otto & Harst (2019) was similarly consolidated into 26 inductive categories with 106 assignments (some barriers span multiple categories)
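The category shares reported throughout this guide follow directly from the counts. A quick check in Python, with the counts transcribed from the paper's table (only the nine categories listed above; the remaining criteria fall into the four categories not shown here):

```python
# Criterion counts per category, as reported for the 331 extracted criteria.
counts = {
    "Ease of Use": 87, "Content Quality": 41, "Privacy & Security": 41,
    "Accountability": 39, "Adherence": 27, "Aesthetics": 19,
    "Effectiveness": 17, "Ethics": 5, "Safety": 3,
}
TOTAL = 331  # all 13 categories together

# Percentage share of each category, rounded to one decimal place.
shares = {cat: round(100 * n / TOTAL, 1) for cat, n in counts.items()}
# e.g. shares["Ease of Use"] is 26.3 and shares["Safety"] is 0.9,
# matching the figures quoted in the text.
```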

Mapping to DEDHI Phases

  • Evaluation criteria and barrier categories were each mapped to one or more of the four DEDHI life cycle phases using deductive qualitative content analysis
  • Mapping conducted independently by at least two scientists; inconsistencies resolved through discussion until consensus
  • Most criteria and barriers map to a single phase; exceptions include funding, cost, and several individual characteristic barriers, which span all four phases
  • Some barriers (missing cooperation incentives, unclear responsibilities, disease-specific constraints) could not be aligned to any phase — they represent structural framing conditions outside the DHI development process

Limitations

  • Not empirically validated: DEDHI was developed inductively from literature and has not been applied or revised through real-world DHI development cycles
  • Scientific literature only: country-specific regulatory frameworks are incorporated only to the extent they appear in peer-reviewed outlets — practical legal requirements may differ substantially
  • Stakeholder-blind: the framework does not distinguish between research teams (focused on preparation/optimization) and commercial companies (focused on implementation), whose documentation and testing requirements diverge significantly
  • Subjective methodology: both literature search and content analysis involve researcher judgment; mitigated by multi-author independent coding and consensus procedures

Collins et al. — The Multiphase Optimization Strategy (MOST)

Collins, Murphy & Strecher, 2007 · Annals of Behavioral Medicine, 2011 · Springer, 2018
★ Backbone Source · Behavioral Medicine · Life Cycle Foundation
Why it matters to DEDHI

MOST is the direct precursor to DEDHI’s structure. Collins et al. proposed a rigorous iterative approach to behavioral intervention development with explicit preparation, optimization, and evaluation phases. DEDHI adopts these three phases wholesale, adding only a fourth implementation phase absent from MOST.

MOST’s distinguishing feature is its explicit treatment of optimization: rather than testing a whole intervention against a control, it isolates and evaluates individual intervention components. This component-level logic justifies the micro-randomized trial as the recommended design for DEDHI Phase 2. The MOST framework also embraces JITAIs, which require technology-intensive adaptive delivery — a core reason it was selected over more clinically traditional frameworks like the MRC.


Nouri et al. — Criteria for assessing quality of mHealth apps

Journal of the American Medical Informatics Association, 25(8):1089–1098 · 2018
★ Search Foundation · JAMIA · Evaluation Criteria
Why it matters to DEDHI

Nouri et al. provided the primary starting point for the DEDHI evaluation criteria literature search. Their systematic review identified quality criteria specifically for mobile health applications. DEDHI extended this work in three directions: broadening scope from mHealth only to all DHI types (including web-based and hybrid); updating the search to May 2019; and adding socio-technical databases (ACM DL, IEEE Xplore) not covered by Nouri.

The backward search from Nouri’s reference list (Search Strategy 1 in DEDHI) was applied to sources back to 2000, establishing the historical baseline for the criteria analysis.


Nahum-Shani et al. — JITAIs in Mobile Health

Health Psychology, 2015; Annals of Behavioral Medicine, 52(6):446–462 · 2018
★ Key Concept · ABM / Health Psychology · Adaptive Interventions
Why it matters to DEDHI

Nahum-Shani et al.’s work on JITAIs is cited as one of the primary reasons MOST was chosen as DEDHI’s lifecycle model. JITAIs represent a clinically important and technology-dependent class of DHI: they deliver tailored support at precisely the right moment based on real-time sensing data and predictive models. This requires technological infrastructure and evaluation designs — such as the micro-randomized trial — that conventional frameworks do not address.

The 2015 paper established the conceptual framework for JITAIs; the 2018 paper elaborated their key design components. Both inform DEDHI's Phase 2 (Optimization) methodology.


Campbell et al. — MRC Framework for Complex Interventions

BMJ, 321:694–696 · 2000; BMJ, 334:455–459 · 2007; BMJ, 337:a1655 · 2008
BMJ · Phase 4 Source
Why it matters to DEDHI

The Medical Research Council Framework for complex interventions is one of the most widely cited frameworks in health intervention design. It provided the conceptual basis for DEDHI’s fourth (implementation) phase, which MOST does not address. The MRC framework’s emphasis on monitoring reach, impact, and long-term side effects in real-world settings directly informed the goals and evaluation criteria of Phase 4.

Alongside Broens et al.’s telemedicine life cycle model, the MRC framework is credited for DEDHI’s explicit inclusion of a post-RCT implementation phase — the phase most relevant to practitioners and commercial DHI companies.


Broens et al. — Determinants of Successful Telemedicine Implementations

Journal of Telemedicine and Telecare, 13(6):303–309 · 2007
Journal of Telemedicine and Telecare · Life Cycle Model
Why it matters to DEDHI

Broens et al. proposed a four-layered DHI life cycle model distinguishing prototypes, small-scale pilots, large-scale pilots, and operational products — linking specific success determinants to each stage. This model provided DEDHI with its phase-specific view of technical maturity and the idea that implementation success factors differ meaningfully across stages.

The Broens model is particularly important for DEDHI’s Phase 4, informing how a commercially viable, maintained DHI product should be monitored and updated over time.


Klasnja et al. — Microrandomized Trials

Health Psychology, 34(S):1220–1228 · 2015
Health Psychology · Optimization Method
Why it matters to DEDHI

The micro-randomized trial (MRT) is the recommended evaluation design for DEDHI’s Optimization Phase. Unlike conventional RCTs that compare whole interventions, MRTs randomize at each individual decision point — enabling DHI developers to isolate the causal effect of specific components delivered in real-world context.

Klasnja et al.’s paper introduced MRTs as a tool specifically designed for JITAIs, making it a natural methodological complement to the Nahum-Shani JITAI design framework and to MOST’s component-selection logic.
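The per-decision-point logic that distinguishes an MRT from an RCT can be sketched in a few lines. The toy simulation below is mine, not Klasnja et al.'s estimator: the treatment probability, noise model, and effect size are invented for illustration, and the effect estimate is a naive mean difference rather than the weighted methods used in real MRT analysis:

```python
import random

def run_mrt(n_participants: int, n_decision_points: int,
            p_treat: float = 0.5, seed: int = 1) -> float:
    """Toy micro-randomized trial.

    Each participant is re-randomized at EVERY decision point (unlike an
    RCT, which randomizes once at enrollment). Returns a naive estimate
    of the proximal effect of the prompted component."""
    rng = random.Random(seed)
    treated, control = [], []
    for _ in range(n_participants):
        for _ in range(n_decision_points):
            prompt = rng.random() < p_treat      # fresh randomization each point
            outcome = rng.gauss(0.0, 1.0)        # proximal outcome noise
            if prompt:
                outcome += 0.3                   # invented component effect
                treated.append(outcome)
            else:
                control.append(outcome)
    # Naive proximal-effect estimate: mean difference across decision points.
    return sum(treated) / len(treated) - sum(control) / len(control)
```

With enough participants and decision points, the estimate recovers the simulated component effect, which is exactly the component-level signal DEDHI's Optimization Phase is after.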

Reference Network Note

The DEDHI framework integrates two parallel literature streams. The first, from behavioral medicine and clinical trial methodology (Collins MOST, Nahum-Shani JITAIs, Klasnja MRTs, Campbell MRC), shapes the life cycle phases and their evaluation designs. The second, from information systems and technology management (Broens life cycle, Nouri mHealth criteria, Otto & Harst barriers), contributes the technology maturity lens and the pragmatic barrier taxonomy. The framework’s core innovation is the deliberate synthesis of these two streams — rarely combined in prior work.

Question 1 of 5
Which life cycle model forms the primary structural foundation of the DEDHI framework, and what was added to it?
✓ Correct. MOST (Collins et al.) was selected because it describes DHI development rigorously, addresses JITAIs, and focuses on behavioral health. However, MOST ends after the RCT. DEDHI extends it by adding a Phase 4 (Implementation) drawn from the MRC Framework and DHI life cycle models like Broens et al.
Not quite. The MRC Framework and TRL model both inform DEDHI but neither serves as its primary structure. MOST provides the three research phases; the authors extended it with a fourth implementation phase missing from the original.
Question 2 of 5
The systematic literature review identified 331 evaluation criteria. What was the single largest category, and what notable ranking did “Effectiveness” receive?
✓ Correct. Ease of Use dominated with 87 of 331 criteria (26.3%). The authors note with concern that Effectiveness — the primary goal of evidence-based medicine and the raison d’être of health interventions — ranked only 8th, with 17 criteria (5.1%). Ethics and Safety were even more neglected.
Ease of Use was by far the largest category with 87 criteria. The finding that Effectiveness ranks only 8th — despite being the primary objective of health interventions — is one of the paper’s most striking observations and a direct critique of the existing evaluation literature.
Question 3 of 5
What type of trial does DEDHI recommend for the Optimization Phase, and why is it particularly suited to DHI development?
✓ Correct. The MRT (Klasnja et al., 2015) randomizes participants at each decision point rather than at enrollment. This is ideal for DHIs because it allows developers to isolate the causal effect of individual intervention components — essential when building adaptive, JITAI-based systems where component interactions matter.
The full RCT is reserved for Phase 3 (Evaluation), where the already-optimized DHI is tested against a control. Phase 2 uses the micro-randomized trial to select the best component configuration before committing to a confirmatory trial — avoiding the expensive error of running an RCT on a suboptimal design.
Question 4 of 5
Which two evaluation criteria appear across all four DEDHI phases, and why does this make sense?
✓ Correct. Adherence (whether users engage as intended) and Personalization (whether the DHI adapts to individuals) are present across all four phases because both are fundamental to any DHI regardless of maturity stage. A non-engaging or one-size-fits-all intervention fails at every stage of development.
Privacy & Security also appears frequently but is not listed across all four phases in the paper’s mapping. The two criteria that span all phases are Adherence and Personalization — both reflect the irreducible need for engagement and individual-level adaptation throughout a DHI’s entire life cycle.
Question 5 of 5
Some implementation barriers could not be mapped to any DEDHI phase. What does the paper call these, and what examples are given?
✓ Correct. The authors call these “framing conditions” — barriers that cannot be overcome during the DHI development life cycle because they are rooted in structural or systemic factors: missing financial or professional incentives for cooperation, unclear legal responsibilities between stakeholders, and inherent disease characteristics that hinder any digital approach.
The paper specifically terms these “framing conditions” — a conceptually important distinction. These are barriers that DHI developers cannot address through better design or phased development because they exist outside the development process itself, in the broader institutional, legal, or clinical landscape.
Core Thesis
Effective and scalable digital health interventions require more than good ideas — they require a structured, phase-sensitive approach that aligns evaluation criteria and implementation barriers to where the DHI actually is in its development journey. The DEDHI framework provides this scaffolding, from first prototype to market-deployed product.
  • 📐 Phase Matters: One Framework Does Not Fit All

    Applying a full clinical evaluation battery to a Phase 1 research prototype wastes resources; deploying a barely-tested prototype at scale risks patient harm. DEDHI’s key contribution is that the relevant evaluation criteria and addressable barriers differ meaningfully at each of the four phases — and knowing which ones apply when is actionable guidance that prior frameworks lacked.

  • 🔬 Optimize Before You Confirm

    The Optimization Phase (Phase 2) is the framework’s most operationally novel contribution. Using micro-randomized trials to select effective components before an RCT avoids the costly mistake of testing an under-optimized intervention. This logic — borrowed from Collins’ MOST — is especially important for adaptive, JITAI-based DHIs where component interactions are complex.

  • ⚠️ Effectiveness is Underrepresented in the Literature

    Despite being the primary objective of health interventions, effectiveness ranks only 8th among the 13 evaluation criteria categories (5.1% of all criteria). Ethics and safety are even more neglected. This signals a systemic bias in the mHealth evaluation literature toward usability and interface aesthetics at the expense of clinical outcomes — a gap DEDHI explicitly calls out.

  • 🏗️ Implementation is a Phase, Not an Afterthought

    The addition of Phase 4 is not a minor extension — it reflects a substantive argument that long-term DHI effectiveness requires ongoing monitoring of reach, impact, and side effects; regular content and technology updates; and navigation of barriers (interoperability, reimbursement, regional infrastructure) that only surface at scale. Researchers who stop at the RCT miss everything that determines whether a DHI actually helps patients in practice.

  • 🤝 Researcher and Practitioner Goals Diverge

    A research team funded by a national foundation cares most about Phases 1–2 and publishable results. A commercial DHI company needs to reach Phase 4 rapidly and meet hard regulatory requirements from day one. DEDHI does not yet distinguish these stakeholder perspectives — a limitation the authors acknowledge and flag for future development of the framework.

  • 🌍 Criteria and Barriers Show Universal Reach

    The evaluation criteria and implementation barriers in DEDHI originate from studies conducted across the United States, Europe, Australia, and Africa. The authors interpret this geographic breadth as evidence of the framework’s universality — the same core challenges (usability, funding, regulatory issues, adherence) appear regardless of country-specific context, lending DEDHI cross-national applicability.
