01.06.15
The importance of evidence-based mHealth development
Source: NHE May/June 15
Professor Jeremy Wyatt, leadership chair in eHealth research and clinical advisor on new technology at the Royal College of Physicians, and Dr Maximilian Johnston, clinical research fellow at the Centre for Patient Safety & Service Quality at Imperial College London, discuss the importance of clinical and evidence-based development in mobile health apps.
The use of smartphones, tablet computers and websites by clinicians to support patient care and by patients to inform and help themselves has increased dramatically in recent years.
However, there are fears that many apps in the growing mobile health (mHealth) market have been developed without robust evidence-based research or clinical involvement.
At the end of April, the Royal College of Physicians (RCP) issued new guidance to doctors on using medical apps. There were two key pieces of advice:
- Do not use medical apps, including web apps, that do not have a CE mark; and
- Always exercise professional judgement before relying on information from an app.
Produced with the Medicines and Healthcare products Regulatory Agency (MHRA) and the General Medical Council (GMC), the guidance states that any medical app approved for use in Europe must carry a CE mark – an assurance that the app meets essential criteria, works as intended and should be clinically safe.
It added that if there is no CE mark, doctors “must urgently ask the app’s developers to obtain one; meanwhile, you should stop using the app”.
Enthusiasm for apps
Although the RCP has no plans to endorse particular medical apps, Professor Jeremy Wyatt, leadership chair in eHealth research and clinical advisor on new technology at the RCP, told NHE: “We are enthusiastic and interested in apps. We believe that apps have an important value in healthcare, particularly to physicians.
“But there is no doubt that there is variability in the quality of apps; not only in the accuracy – where we’ve done some tests – but also on privacy and how they share data with others.”
Academic research suggests there are close to 100,000 healthcare and medical apps available on the market, but the proportion of these with CE marks is unclear.
The RCP defines medical apps as those that “diagnose, support diagnosis or clinical decisions, make calculations to determine diagnosis or treatment, or are used for any medical purpose”. For example, users of the Mersey Burns app – which has the CE mark – can input the parts of a patient’s body that have been burned and it calculates the percentage of skin damage and their fluid balance requirements.
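The article does not specify how Mersey Burns performs its calculation internally, but a minimal sketch of this kind of fluid-balance calculation, assuming the widely taught Parkland formula (4 mL × body weight in kg × %TBSA burned), might look like this:

```python
def parkland_fluid_ml(weight_kg: float, tbsa_percent: float) -> float:
    """Estimated total crystalloid fluid (mL) for the first 24 hours
    after a burn, per the Parkland formula:
    4 mL x weight (kg) x % total body surface area (TBSA) burned.
    Illustrative only -- not the app's actual implementation."""
    if not 0 <= tbsa_percent <= 100:
        raise ValueError("TBSA must be a percentage between 0 and 100")
    return 4.0 * weight_kg * tbsa_percent

# Example: 70 kg adult with 20% TBSA burned.
total = parkland_fluid_ml(70, 20)
# By convention, half is given in the first 8 hours, the rest over 16.
first_8h = total / 2
print(total, first_8h)  # 5600.0 2800.0
```

Even a calculation this simple illustrates why CE marking matters: an off-by-one error in the constant or a mishandled percentage would directly change a patient's fluid prescription.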
Prof Wyatt knows of 10 apps with CE marks. “But there are thousands of medical apps and tens of thousands (if not more) of health and lifestyle apps. It is an enormous area.”
In the US, the MHRA’s equivalent – the FDA – publishes a list of more than 150 ‘certified’ apps. “That is not quite the same as CE marked, but it is likely that there are a lot more than 10 apps that have been CE marked – but we don’t have a central list of them.”
App development
Prof Wyatt added that, indirectly, the RCP is hoping its latest guidance will have a beneficial impact on the industry and it will start to see apps being developed more seriously – particularly those intended for clinical use.
NHE also talked to Dr Maximilian Johnston, clinical research fellow at the Centre for Patient Safety & Service Quality at Imperial College London, who produced a recent paper entitled the ‘Imperial Clarify, Design and Evaluate (CDE) approach to mHealth app development’.
He welcomed the latest RCP guidance, and said the framework developed at Imperial aims to support the “structured translation” of an initial idea through to an effective app, the success of which has been rigorously evaluated.
“There is no good way at the moment of working out which apps are based on evidence and which have been designed with medical professional involvement,” said Dr Johnston.
His paper outlines the development of two apps – Hark (a clinical task management and collaboration platform) and Usher (a platform to support patients going through complex pathways) – using the CDE approach so other innovators are aware of the requirement for well-conducted research, design and evaluation methods when implementing apps in the future.
“Hark is produced, ready for implementation,” he said. “By way of data, we’ve done a simulated study comparing Hark to the pager device using a controlled trial design. That has shown that Hark matches a pager’s response time while also providing a good audit trail. Secondly, using one of the validated metrics we’ve produced at Imperial, we’ve shown that its information transfer is much more complete.
“For instance, if a nurse is picking up a phone and they have a sick patient it is very easy to have the blinkers on and forget to transfer vital information. Hark, however, prompts you to include new vital signs or previous blood results etc. It is a lot more comprehensive. We are now looking for a test site.”
Once the app has been tested and finalised, the team will look into applying for a CE mark for Hark. NHE was told that having a framework, such as CDE, is vitally important in the development of apps. This is because it allows developers to clarify a niche for their apps while being “grounded in evidence and end-user engagement”.
The British Standards Institution (BSI) has also produced a new standard – PAS 277:2015 – to develop a set of principles for health and wellness app developers to follow throughout an app project life cycle, so that healthcare professionals, patients and the public trust their products and services.
“It is a very useful set of principles about how to develop good quality clinical software,” said Prof Wyatt. “It really is basic level stuff: think about the requirements, the users, how can you make this reliable, how can you test it for safety, does it need CE marking or not, risk assessment.”
He added that PAS 277 is very helpful for laying the foundations of good app development. However, it doesn’t mention testing the accuracy and impact of the app.
“We believe that you need to go beyond PAS 277 or CE marking to look at whether an app is accurate when used by a ‘typical’ user, and whether its advice or output leads to appropriate changes in users’ decisions and behaviours,” said Prof Wyatt.
“We have developed 24 quality criteria, which are in the process of getting published, that offer a more comprehensive evaluation framework for clinical apps and health and lifestyle apps.”
The future market of apps
Going forward, both Prof Wyatt and Dr Johnston believe that mHealth apps, provided they are developed and tested rigorously, will be of major benefit to physicians.
“Where we see that it could be useful, in essence, is for reference purposes, such as NICE guidelines,” said Prof Wyatt. “If you have an app that can make them much more accessible it makes sense. That would be a good example of knowledge support.
“There are also decision support apps for calculations such as risk scores and drug dosages. We know that doctors aren’t always 100% accurate with calculating drug dosages, so that is an obvious example where a dedicated app, if it has been correctly constructed, could be really helpful.”
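A weight-based dosage calculation of the sort Prof Wyatt describes can be sketched as follows. The function name and all constants here are hypothetical illustrations, not clinical guidance – in a real app they would come from a validated formulary and be checked as part of CE marking:

```python
def dose_mg(weight_kg: float, mg_per_kg: float, max_single_dose_mg: float) -> float:
    """Weight-based single dose, capped at a maximum.
    Illustrative only -- all constants are assumptions, not a formulary."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    return min(weight_kg * mg_per_kg, max_single_dose_mg)

# Hypothetical figures: 15 mg/kg per dose, capped at 1000 mg.
print(dose_mg(18, 15, 1000))  # 270.0 -- an 18 kg child
print(dose_mg(80, 15, 1000))  # 1000.0 -- cap applied for an 80 kg adult
```

The cap is the kind of safety logic that a “correctly constructed” app encodes once and applies consistently, whereas a busy clinician doing the arithmetic by hand might forget it.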
He added that mHealth apps are one of several ways of delivering better information in the right format to clinicians at the right time and in the right place.
Dr Johnston told us that he believes mHealth apps will change the way people are working in healthcare and that one day pagers may no longer be necessary as people start using tablets rather than desktops “and when that happens everyone will be using apps rather than operating systems”.
However, he reiterated that there are lots of untried and untested apps currently being used “and people need to be very careful about that”. That is why he believes the CDE evidence-based framework could lead to scalable, high-value app solutions rather than the current ad-hoc approach.