Category Archives: Research

Just put together a video for the Clinical Biomechanics Boot Camp on ‘Clinical Practice Can be Deceptive’, wrestling with an issue I can’t quite get my head around. Sometimes writing about it helps me think more clearly, so here goes:
Homeopathy data dredging
Homeopathy does not work and cannot work. The evidence is clear, and there is plenty of it: homeopathy is no better than a placebo, and any ‘clinical’ effect it has is due to that placebo effect. I won’t get into all the details here, but if you want more, check this out: How Does Homeopathy work?.
That does not stop those who try to defraud the consumer with homeopathy from grasping at straws: they come up with implausible and improbable mechanisms as to how it might work (it doesn’t), cling to badly done, flawed studies published in low- or no-impact-factor journals, and ignore all the well-done, properly blinded and controlled studies published in high-impact-factor journals. And when that argument does not work, they resort to a sob story or special pleading that this is not the appropriate way to clinically test homeopathy (it is).
Publication Rate of Conference Abstracts
Conference presentations and conference abstract books often contain gems and plenty of pearls of useful information. I often blog live from conferences (eg here and here) or peruse abstract books looking for gems (eg here and here). The problem with conference abstracts is the lack of detail on the study needed to judge it, and that they are not subject to the same scrutiny of peer review that a full journal publication is; so how much weight in the grand scheme of things should a conference abstract be given? They have to be interpreted in that context of the lack of detail and the lack of peer review. I have seen examples where the preponderance of evidence on a topic could be tipped in a different direction depending on whether unpublished conference abstracts were included in the body of evidence under consideration. That is a worry. A large number of conference abstracts never make it to full publication, despite some of them being ‘gems’ that would be a valuable addition to the body of peer-reviewed literature on that topic.
Way back in 1999, I published this that looked at the publication rates of abstracts presented at the main diabetes conferences in Australia, Europe and the USA. The rates were 26%, 49% and 53%. At that time, those figures were pretty consistent with other disciplines. My attention was brought back to this by this recent publication in Foot & Ankle International, which looked at the publication rates from the American Orthopaedic Foot & Ankle Society meetings. They found it was 73.7% for podium presentations and 55.8% for posters. That is a bit better than the ~50% that I found and that is often reported in the literature as a pretty typical publication rate.
Do you have time to read 3000 research papers a year?
Almost 3000 papers of relevance to podiatry and related topics were published in 2016. Did you read them all? On top of that, there are social media and blog posts about relevant topics. That is a lot if you want to stay up-to-date.
How did I work out it was almost 3000? Newsbot is the username over at Podiatry Arena that is used to post all the research, news, press releases, etc that are relevant to Podiatry. In 2016, Newsbot posted an average of 7.6 times a day. That is 7.6 bits of new research, news, reviews and commentary of relevance to Podiatry every single day (ie 2774 posts in 2016). Admittedly, some of that is bad research and commentary, and a lot of it is certainly not directly relevant to all areas of clinical practice for everyone, but it does give you an idea of the magnitude of what is published.
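As a quick sanity check, the two figures quoted above (2774 posts and an average of 7.6 per day) are consistent with each other; a minimal sketch of the arithmetic:

```python
# Sanity check on the Newsbot figures quoted in the post:
# 2774 posts in 2016 works out to ~7.6 posts per day.
posts_2016 = 2774
days = 365  # 2016 was actually a leap year (366 days), but either divisor rounds to 7.6

posts_per_day = round(posts_2016 / days, 1)
print(posts_per_day)  # → 7.6
```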
What ‘foot’ related topics are most often Google’d?
Every December for the last few years on my other blog, I tabulate how often each of the running shoe brands is searched for in Google, for a bit of fun.
If you are going to comment on research studies in social media…
…please read and understand the study first. Don’t embarrass yourself by commenting based only on what you think the title of the study means.
For example, this recent study was published. The study investigated outcomes of clubfoot treatment to see if immediate or delayed treatment affected outcomes. Of the 176 cases they reviewed, the age at presentation did not affect the outcome (except for the issue of cast slippage).