Archive | March 2015

Resilient GP: bad research, and an inappropriate demand on GPs’ time

Resilient GP’s survey report Inappropriate demands to GPs is a poor quality piece of research – bad enough that it can’t tell readers anything much. I’ve already blogged about the report’s ethical problems. In this post, I’ll raise concerns about the sampling used in this project, the survey itself, the reporting of this work, and the report’s analysis (or lack thereof). I’ll argue that getting GPs – busy as they are – to participate in low quality research like this is an inappropriate use of their time.

Sampling

Resilient GP say that they

conducted a survey on a large, private online discussion group composed entirely of GPs…We received over 200 unique responses.

Unfortunately, the survey report says little about the characteristics of the group or of those who responded – for example, if all the respondents were in England, their answers may not reflect the situation in Scotland. The report also doesn’t specify the response rate: if these 200 responses came from a group of 200 GPs, then they clearly reflect the group well; if they came from a group of 50,000, then the minority who chose to respond may be very different from the group as a whole (for example, they may be responding because they’re annoyed by particular demands). Without this information, it’s impossible to know to what extent issues with the sample may bias the report’s findings or limit how much one can generalise from it.

The survey

Resilient GP’s report says almost nothing about the survey they used. They state that

We asked for examples that were considered by that GP to be an inappropriate use of their time and skills.

However, we don’t know, for example, whether participants received a large number of questions to elicit these responses or how the questions were phrased. Without this information, readers of this report can’t really interpret the survey results.

Reporting

The way the survey report presents the survey results makes it near-enough impossible to draw any robust conclusions from this work. The report presents a list of what were viewed as inappropriate demands. It states that

We excluded very similar responses or those we considered might have conceivably have been a presentation of underlying illness.

However, there is nothing beyond this (for example, no way of knowing how Resilient GP verified judgements as to what “might have conceivably have been a presentation of underlying illness” or how they decided what was/was not similar enough to exclude).

There is no way of knowing how recent or regular these demands are*: for example, if a GP sees one patient who wants a fake sick note over a 40 year career, this would be unfortunate but not exactly shocking; if this happens every day, that would be a very noteworthy finding. Similarly, regular demands for pet medicines 20 years ago but never today would mean something very different from regular demands for pet medicines today.

It is also not clear whether Resilient GP have reviewed the available evidence on demands on healthcare professionals and healthcare systems. If they have, this isn’t apparent in this survey report.

Analysis

Bluntly, there isn’t much. The survey report reads like a long list of inappropriate demands reported by GPs, loosely divided into five categories. It is not clear how or why Resilient GP chose these five categories, although the survey report states that this was done

For ease of reference, and to help stimulate ideas for alternative solutions

There might be interesting information that could be drawn from the survey data Resilient GP collected. If there is, though, the report’s analysis fails to draw it out.

Conclusions

One can conclude very little from Resilient GP’s survey report. Some GPs (about whom we know very little) were asked something (we don’t know what). They responded with a number of reports (we don’t know how many) about what they remember as inappropriate patient demands. We don’t know when these demands were made or how frequently they arise. This survey report therefore doesn’t tell us much at all.

Finally, while I’ve already blogged about ethical problems with this survey report, I’d like to note one more ethical issue. Doing low quality research is often seen as unethical – among other problems, it wastes the time of participants and may make things harder for future researchers. Asking GPs – busy as they are – to participate in low quality research like this is an inappropriate use of their time.

 

* To be fair, the survey report does state that one demand – a “letter stating patient is unable to attend their tribunal or ATOS assessment” – is “a very common request”.

Resilient GP: an ethically inappropriate survey report

Resilient GP posted a survey report yesterday about inappropriate patient demand. The ethics and methodology of what they did were questioned, and they reposted a (revised) version of the post today – justified for the purpose of “debate” and “educating patients not to use up appointments” inappropriately. I’m going to look at ethical problems with this survey report here, and at problems with the methodology/write-up in a subsequent post. I’ll argue that it’s not ethical to have posted this, and that Resilient GP’s arguments for posting it don’t stand up.

Patient Confidentiality

It’s important for doctors to maintain patient confidentiality, except where there are very good reasons not to (for example, the patient might die if they don’t). However, the original version of the Resilient GP survey report contained some very specific information that could identify patients. Some of this has now been removed – and it wouldn’t be ethical for me to repost it – but there was no good justification for posting this information in the first place. Resilient GP has not publicly reflected on the removed information – so I don’t even know whether this was done for ethical reasons, let alone whether any lessons have been learnt. Some of the survey report (e.g. point 5.22) is still specific enough to identify patients.

Resilient GP argued today that “[c]areful reading of the report” will show that patient confidentiality hasn’t been breached. However, it’s not classy to make this argument after removing (but not noting the removal of) potentially identifying information – readers are now unlikely to know that information has been removed and may therefore reach an overly positive conclusion about how well patients have been anonymised.

Doctor-patient relationship

The doctor-patient relationship is important: mellojonny suggests that, for some, “a long-term relationship with a stable adult who knows them is literally life-saving”. One reason for maintaining patient confidentiality is so patients feel confident sharing very personal things with their doctors. Will patients now worry that what they say to their GP will be posted online? For doctors to post quite specific anecdotes about patients in what seems a derogatory way is hardly likely to help build a good relationship.

Consent

Generally, we ask research participants to give informed consent to taking part in research projects. I’d be very surprised if the patients featured in this survey report consented to this (I asked Resilient GP to confirm, but they haven’t). There are some cases where one might proceed without informed consent, but I can’t see how this is justified here.

Resilient GP’s defence of the ethics of their survey report

Resilient GP offers a defence of the ethics of posting this material. However, this is very weak. They argue that

ethics of utilitarianism are equally important here and it is a duty of doctors to challenge inappropriate use of resources

A utilitarian justification of posting this, though, would depend on having a reasonable expectation that it will have positive consequences. Resilient GP have not presented any good evidence that it will. As Betabetic has shown, it’s not at all clear whether (well-resourced, large-scale) campaigns against over-consulting will have the desired effect. It seems rather unlikely that the best way to address inappropriate demand is to post anecdotes about inappropriate demand on a website that seems largely aimed at GPs. There is significant potential for harm, though – for example, in breaching patient confidentiality or damaging the doctor-patient relationship.

Resilient GP also states that the ambulance service has “raise[d] real life examples of inappropriate appointment use” and appears to feel that this shows that it’s not unethical to do so. This is a really lousy argument, though – it might, for example, just be that others have acted in ethically problematic ways or that the GP-patient relationship is different to the ambulance-patient relationship.

Conclusions

Resilient GP should pull this post ASAP, at least till it can be revised to properly address issues related to patient confidentiality. They should also reflect on why a post with these issues was put up in the first place.

More broadly, if Resilient GP feel utilitarian ethics are the best lens to view this through, they should give more serious consideration to the consequences of their survey report: I haven’t seen good reason to expect the positive to outweigh the negative. Also, as I’ll argue in my next post on this, I don’t think the research they’ve posted is all that good – and I’m not sure that starting a ‘debate’ centred around low quality research is especially useful.

Note: I have contacted Resilient GP to suggest that they pull the post ASAP, at least till they can resolve issues with (non)anonymisation. They haven’t yet responded.