Guidelines for Reporting of Health Interventions Using Mobile Phones: Mobile Health (mHealth) Evidence Reporting and Assessment (mERA) Checklist

Affiliation

Johns Hopkins Bloomberg School of Public Health (Agarwal, LeFevre, Lee, Labrique); Johns Hopkins University (Agarwal, LeFevre, Lee, Labrique); Gillings School of Global Public Health, University of North Carolina (Agarwal); Family Health International 360 (L'Engle); School of Nursing and Health Professions, University of San Francisco (L'Engle); World Health Organization, Department of Reproductive Health and Research (Mehl); International Development Research Centre (Sinha)

Summary

"Adoption of the mERA checklist by journal editors and authors in a standardised manner is anticipated to improve the transparency and rigour in reporting, while highlighting issues of bias and generalisability, and ultimately temper criticisms of overenthusiastic reporting in mHealth."

Published in the BMJ, this article shares research carried out by the World Health Organization (WHO) and other organisations with the aim of expanding the evidence base through standardised reporting of mobile health (mHealth) programmes. The WHO mHealth Technical Evidence Review Group developed a checklist for mHealth evidence reporting and assessment (mERA) that aims to identify the minimum set of information needed to define what the mHealth intervention is (content), where it is being implemented (context), and how it was implemented (technical features), in order to support replication of the intervention. The paper presents the resulting 16-item checklist and a detailed explanation and elaboration of each item, with illustrative reporting examples. The mERA checklist supplements existing reporting standards with a concrete set of criteria specific to reporting on digital innovations; it also elaborates on existing criteria to support high-quality methodological reporting of evidence. The checklist aims to support better comparisons between research findings and the synthesis of experiences across different settings, helping to advocate for innovations that can improve patient experiences around the world.

As noted here, over the past decade, global enthusiasm and the interest of development agencies, researchers, and policymakers have led to the rapid proliferation of mHealth solutions in both developed and developing countries. However, there remains a lack of rigorous, high-quality evidence on the efficacy and effectiveness of such interventions. Current mHealth evidence is disseminated in multiple forms, including peer-reviewed literature, white papers, reports, presentations, and blogs. The evidence base is heterogeneous, making comparisons across intervention strategies difficult. The quality of reporting on mHealth interventions has also varied. This is likely attributable to 2 factors: the multidisciplinary nature of mHealth, which combines approaches from the fields of healthcare and technology, and the rapid pace of technology development, which often outpaces the ability to generate and disseminate quality evidence.

To address this gap, WHO convened a group of global experts working at the intersection of mHealth research and programme implementation, called the mHealth Technical Evidence Review Group (mTERG). The group’s pragmatic and iterative process in developing this checklist attempted to capture scientific consensus around appropriate mHealth reporting. The development process comprised 3 main steps: convening an expert working group (WHO commissioned the Johns Hopkins Global mHealth Initiative (JHU-GmI) to develop an approach for the mERA guideline), convening a global expert review panel for checklist development, and pilot testing the checklist. To continue testing and refining these guidelines, in 2014 the WHO Department of Reproductive Health and Research and mTERG supported the application of the mERA tool by independent research groups to 3 topic areas: evidence reviews on the use of mobile strategies for management of stockouts of essential maternal and child health drugs; for promotion of adherence to treatment regimens by healthcare providers; and for promotion of adolescent sexual and reproductive health. The application of mERA to each area resulted in some refinements and adaptations of the criteria.

mERA was developed as a checklist of items which could be applied by authors developing manuscripts that aim to report on the effectiveness of mHealth interventions and by peer reviewers and journal editors reviewing such evidence. mERA aims to provide guidance for complete and transparent reporting on studies evaluating and reporting on the feasibility and effectiveness of mHealth interventions. The checklist does not aim to support the design or implementation of such studies or to evaluate the quality of the research methods used. Rather, it is intended to improve transparency in reporting, promote a critical assessment of mHealth research evidence, and help improve the rigour of future reporting of research findings.

mERA consists of 16 items focused on reporting on mHealth interventions (see table 1, where the rationale for inclusion and an explanation of each item are given, along with examples of good reporting). In addition to these criteria, web appendix 1 presents 29 items for reporting on study design and methods. The authors urge that, as far as possible, the 16 core mERA items be used in conjunction with appropriate checklists for study design, such as CONSORT for randomised trials and STROBE for observational studies. In brief, mERA includes:

  1. Infrastructure: describe, in detail, the necessary infrastructure which was required to enable the operation of the mHealth programme. (This refers to physical infrastructure including electricity, access to power, and connectivity in the local context.)
  2. Technology platform: describe, in sufficient detail to allow replication of the work, the software and hardware combinations used in the programme implementation. (If the software used is a publicly available system (eg, Open Data Kit, CommCare), it should be explicitly mentioned, together with the modifications or configuration.)
  3. Interoperability: describe how, if at all, the mHealth strategy connects to and interacts with national or regional Health Information Systems (HIS)/programme context. (For instance, simple descriptions of specific data standards being used can provide some basis to gauge this.)
  4. Intervention delivery: elaborate the mode, frequency, and intensity of the mHealth intervention. (For example, with a text message intervention to stimulate behaviour change, how was the message curriculum structured, timed, and delivered?)
  5. Intervention content: describe how the content was developed/identified and customised. (If information content is drawn from a publicly available resource, or newly developed content is being made publicly available, external links to this database should be provided.)
  6. Usability testing: describe how the end-users of the system engaged in the development of the intervention. (Given the limitations in space in most peer reviewed journals, this important element of a carefully developed mHealth innovation is given short shrift.)
  7. User feedback: describe user feedback about the intervention or user satisfaction with the intervention. (User feedback could include user opinions about the content or user interface; or perceptions about usability, access, connectivity, or other elements of the mHealth programme.)
  8. Access of individual participants: mention barriers or facilitators to the adoption of the intervention among study participants. (Limitations of access - e.g., related to education and literacy, gender norms, etc. - among certain subgroups are likely and should therefore be candidly considered in the peer-reviewed report.)
  9. Cost assessment: present basic costs of the mHealth intervention. (If an economic evaluation has been conducted, it should be reported according to the 24-item CHEERS statement.)
  10. Adoption inputs/programme entry: describe how people are informed about the programme or steps taken to support adoption. (Appropriate training, instructional materials, and competency assessment may be warranted because mHealth interventions typically require the health provider or client end-users to have a level of understanding.)
  11. Limitations for delivery at scale: present expected challenges for scaling up the intervention. (This information is critical for understanding the generalisability of the implementation and making inferences on its viability beyond a closely controlled and defined setting.)
  12. Contextual adaptability: describe appropriateness of the intervention to the context, and any possible adaptations. (For example, perhaps a mobile-phone-based survey apparatus is particularly suited for conducting survey research in rural areas; this should be stated.)
  13. Replicability: present adequate technical and content detail to support replicability. (Processes for replication may include the software source code, workflow or dashboards screenshots, flowcharts of algorithms, or examples of content that is developed for the end-users.)
  14. Data security: describe security and confidentiality protocols. (Even in settings where laws, standards, or practices governing data security might be absent, researchers and programme implementers are responsible for taking reasonable measures to protect the privacy and confidentiality of participant identity and health information.)
  15. Compliance with national guidelines or regulatory statutes. (For example, if the system is providing SMS-based advice to pregnant women, does the information follow evidence-informed practices and align with recommendations of existing national or regulatory bodies?)
  16. Fidelity of the intervention. (To what extent has the mHealth programme’s adherence to the intended, original deployment plan been assessed?)
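For teams applying mERA in evidence reviews, as the independent research groups above did, the 16 core items can be tallied into a simple completeness score per manuscript. The sketch below is purely illustrative and not part of the published checklist: the item names are abbreviated from the list above, and the scoring function is a hypothetical reviewer aid, not a published mERA metric.

```python
# The 16 core mERA reporting items, abbreviated from the checklist above.
MERA_ITEMS = [
    "infrastructure", "technology platform", "interoperability",
    "intervention delivery", "intervention content", "usability testing",
    "user feedback", "access of individual participants", "cost assessment",
    "adoption inputs/programme entry", "limitations for delivery at scale",
    "contextual adaptability", "replicability", "data security",
    "compliance with national guidelines", "fidelity of the intervention",
]

def mera_completeness(reported: set) -> float:
    """Fraction of the 16 core mERA items a manuscript reports.

    `reported` holds the item names a reviewer judged adequately
    reported; names outside the checklist are ignored, not counted.
    """
    covered = reported & set(MERA_ITEMS)
    return len(covered) / len(MERA_ITEMS)

# Example: a manuscript covering 8 of the 16 items scores 0.5.
example_report = set(MERA_ITEMS[:8])
print(f"{mera_completeness(example_report):.2f}")  # 0.50
```

A reviewer could extend this to record, per item, where in the manuscript the information appears, which is closer to how the checklist is applied in practice.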

"We expect that the mERA checklist represents a set of evolving criteria that will be appraised and if necessary, updated. The checklist will be disseminated through conducting workshops and presentations at the mHealth Summit, mHealth Working Group, and other global informatics forums. Additionally, the checklist will be hosted on the WHO mTERG website, and the Equator website. The mERA checklist will be continuously revised and versions will be periodically released on the basis of feedback, comments, and experiences from its use. We invite readers to share their comments and experiences."

Source

BMJ 2016;352:i1174 - sourced from Slimline C4D Network Twitter Trawl: 5 - 11 September 2016 and WHO website, March 22 2016 - both accessed on September 13 2016. Image credit: Mark Leong