
June 2010

Not all in the genes...

Ami Banerjee
Last edited 18th September 2012

This week marked the 10-year anniversary of the initial decoding of the 3 billion base pairs of the human genome. The research effort came from the $3 billion publicly funded Human Genome Project and its privately funded competitor, Celera Genomics. Even before the genome was fully mapped, forecasters spoke of a new age of "personalised medicine" in which genetic tests would tell us about our own specific disease profile, and President Bill Clinton, jointly with Tony Blair, declared that the raw human genome data should be made freely available to all researchers. This hype and excitement led to unprecedented investment in genome-wide scanning across almost every disease category in the human body over the past decade.

However, even in the early stages of the project, some commentators raised concerns about the level of spending and the concentration of research on gene mapping, as well as about the massive ethical implications of knowing an individual's genetic make-up and how the data might be used. Unfortunately, it is these commentators' predictions that have proved right a decade later.

There is no doubt that genome mapping has added to our scientific understanding of new disease pathways and the causation of disease, but the practical application and translation of this new knowledge into new drugs or treatments has been lacking, as this week's Lancet notes. Outside a few cancers and some very rare inherited disorders, common diseases such as coronary heart disease, bowel cancer and stroke have turned out to be impossible to explain by genes alone.

Individual risks associated with individual gene variations are generally very small and these variations are common, which means that we are unlikely to be able to predict risk of disease on the basis of this information alone in individual patients. The exceptions are diseases due to single genes (e.g. cystic fibrosis) or diseases where the genetic risks explain a large proportion of the disease, such as age-related macular degeneration. The focus of research is therefore shifting to gene-environment interactions and trying to explain the variation in susceptibility to disease between individuals.

The concept that diseases are the outcome of both genetic and environmental interactions is far from new. In the 9th century AD, Al-Jahiz considered the effects of the environment on an animal's likelihood of survival, and first described the "struggle for existence". Therefore, even in the age of genome-wide scanning, old techniques such as family history studies remain the most accessible way of measuring the inherited component of a disease, representing the overall interaction between environmental and genetic factors. Family history studies are still needed to guide or focus genome-wide scans, helping to decide which analyses may be useful (e.g. by sex, age, or risk factors).

All scientific discovery has historically taken time to translate into clinical practice, so we must wait to see what the future holds for genomic mapping and its application in patient care. However, in an age of tighter budgets and increased accountability, should research spending be governed more by evidence of the scientific fruit it bears? Or is it enough for medical research to push back the boundaries of knowledge, regardless of its usefulness to patients?

In the BMJ this week is a case-control study on mobile phone base stations and early childhood cancers. A case-control study is an epidemiological design in which people with and without a disease (in this case cancer) are compared to identify factors (here, proximity to mobile phone masts) associated with the disease. The gold standard would be a prospective study (not a trial, as that would be unethical); however, when the disease is rare this is difficult and costly, as it would mean following millions of children to detect the cases.
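
As a rough illustration of how an association measure comes out of this kind of design, here is a minimal sketch with made-up numbers (not the actual analysis in the BMJ paper, which modelled distance and power output). The exposure here is simply "lived near a mast at birth", and the standard case-control measure is the odds ratio.

```python
from math import exp, log, sqrt

# Hypothetical 2x2 table for a case-control study (made-up counts,
# NOT the figures from the BMJ paper):
#                        exposed (near mast)   unexposed
# cases (cancer)              a = 300           b = 1097
# controls                    c = 1250          d = 4338
a, b = 300, 1097   # cases: exposed, unexposed
c, d = 1250, 4338  # controls: exposed, unexposed

# Odds ratio: odds of exposure among cases / odds of exposure among controls
odds_ratio = (a / b) / (c / d)

# Approximate 95% confidence interval on the log scale (Woolf's method)
se_log_or = sqrt(1/a + 1/b + 1/c + 1/d)
lower = exp(log(odds_ratio) - 1.96 * se_log_or)
upper = exp(log(odds_ratio) + 1.96 * se_log_or)

print(f"Odds ratio {odds_ratio:.2f} (95% CI {lower:.2f} to {upper:.2f})")
# An interval that includes 1 is consistent with no association.
```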

Concerns have been raised because a few clusters of cancers have been reported among people living near mobile phone masts. In one survey, participants were concerned about, or attributed adverse health effects to, mobile phone base stations, and those living within 500 m reported slightly more health complaints than others.

In the present study, researchers obtained data on all registered cases of cancer diagnosed between 1999 and 2001 in children aged 0 to 4 in Great Britain. Of 1,926 cases, 1,397 (73%) were included. Four controls per case were obtained, matched by sex and date of birth.
In addition, mobile phone operators provided data on antennas to an accuracy of about 10 m, and the researchers estimated exposure in terms of distance from, and total power output of, base stations within 700 m (exposure typically peaks not nearest the mast but 200 to 500 m from it). They also used a model to compute power density (in dBm), which was validated against data from two further surveys.

The results showed that the mean age at diagnosis of cancer was two years. The mean distance from a base station at birth did not differ between cases (1,107 m) and controls (1,073 m; P=0.31); nor did the mean total power output of base stations within 700 m (P=0.54) or the mean modelled power density (P=0.41).

The lack of effect reported in this paper is supported by the fact that the dramatic increase in mobile telephone use has not been followed by an increase in the incidence of brain tumours. The one major limitation of the study is that the researchers were unable to account for the mother's movements during pregnancy, which could have reduced the study's ability to detect any true excess risk.

Overall this is a well conducted study, and it strengthens the evidence that there is no association between cancer risk in young children and exposure to mobile phone base stations. It seems we can all relax a little more about mobile phone masts: the radiofrequency exposures are extremely low, and the findings back up the World Health Organization's view that cancer is unlikely to be caused by mobile phones or their base stations.

If you want to know a bit more, then follow the excellent story tracker.

Oxygen and heart attack - what next?

Carl Heneghan
Last edited 21st September 2012

Most medical students will recognize the quote:

‘Half of what you'll learn in medical school will be shown to be either dead wrong or out of date within five years of your graduation; the trouble is that nobody can tell you which half—so the most important thing to learn is how to learn on your own.’

Dave Sackett: "Old fart from the frozen north" and "Father of EBM"

The rapid assessment and treatment of a patient with a heart attack is drummed into most medical students very early in their training. ABC: airway, breathing, circulation. Part of that resuscitation is the delivery of oxygen to patients with a heart attack, mainly because the flow of oxygenated blood to part of the heart is interrupted for a period of time.

The rationale for giving oxygen in a heart attack is that it may increase the oxygen supply to the heart muscle cells that are dying, mainly from lack of oxygen, ultimately reducing pain and the size of the dead heart muscle. To most, this makes sense as pathophysiological reasoning.

Today a Cochrane review by Cabello and Burls, Oxygen therapy for acute myocardial infarction, looks at the evidence from randomised controlled trials to establish whether routine use of inhaled oxygen in acute heart attack improves patient-centred outcomes, in particular pain and death.

Now, here is the half of what is learnt that may eventually turn out to be out of date:

Three trials involving 387 patients were included and 14 deaths occurred. The pooled relative risk of death was 2.88 (95% confidence interval 0.88 to 9.39) in an intention-to-treat analysis and 3.03 (95% confidence interval 0.93 to 9.83) in patients with confirmed heart attack.

While suggestive of harm, the small number of deaths recorded meant that this could be a chance occurrence. Basically, there is no conclusive evidence from randomised controlled trials to support the routine use of inhaled oxygen in patients with acute heart attack.
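
To see why 14 deaths among 387 patients leaves so much uncertainty, here is a minimal sketch of how a relative risk and its 95% confidence interval are derived from a two-arm trial. The counts below are made up for illustration, not taken from the Cochrane review; the point is simply that with so few events the interval is wide and crosses 1.

```python
from math import exp, log, sqrt

# Hypothetical counts (NOT the review's data): deaths / patients per arm
deaths_oxygen, n_oxygen = 10, 195
deaths_air, n_air = 4, 192

risk_oxygen = deaths_oxygen / n_oxygen
risk_air = deaths_air / n_air

# Relative risk: risk of death with oxygen relative to risk with air
rr = risk_oxygen / risk_air

# Approximate 95% confidence interval on the log scale
se_log_rr = sqrt(1/deaths_oxygen - 1/n_oxygen + 1/deaths_air - 1/n_air)
lower = exp(log(rr) - 1.96 * se_log_rr)
upper = exp(log(rr) + 1.96 * se_log_rr)

print(f"Relative risk {rr:.2f} (95% CI {lower:.2f} to {upper:.2f})")
# With so few deaths the interval is wide and includes 1,
# so the apparent excess with oxygen could be chance.
```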

The neat thing about EBM is you are never really sure of which half is out of date; this review adds to that half. As the reviewers rightly state, we need an urgent large scale trial to unpick the uncertainty.

Evidence 2010

Carl Heneghan
Last edited 15th June 2010

Right now, you would have to have been asleep not to realize that implementing cost-effective change based on evidence is the key challenge for health systems around the world. Because this is the most pressing problem, we decided to bring together a conference of evidence creators and evidence users to define the processes for implementing best clinical practice and forging efficient, cost-effective solutions for healthcare.

We would like you to join us at Evidence 2010, the leading evidence-based healthcare event at the forefront of EBM debate and innovation.

The conference is a collaboration between the BMJ and the Centre for Evidence-Based Medicine (CEBM).

The aims of the conference are to:

* Improve evidence-based decision making and provide practical, evidence-based ideas that can be implemented in practice
* Foster effective innovation
* Guide efficient commissioning
* Provide education and training to improve evidence-based healthcare.

We've got some great speakers lined up, including: Jim Easton, Sir Iain Chalmers, Sir Muir Gray, Victor Montori, Paul Glasziou, Mike Clarke, Sharon Straus, Giordano Perez Gaxiola, Steven Woloshin, Fiona Godlee, Bill Summerskill, Helen Lester, Rubin Minhas, Amanda Burls, Dan Lasserson, Dyfrig Hughes, Tony Rudd, Tim Ringrose, Tom Jefferson, Ann McPherson, and Fiona Fox.
Oh, and not to mention Ben Goldacre of badscience.net.

Look forward to seeing you there, Carl

EBM - the best way to cut the cost?

Ami Banerjee
Last edited 11th June 2010

Across government, but particularly across the NHS, a fear of impending cost-cutting is dominating both the news and the journals. The NHS Confederation, which independently represents all organisations within the NHS, reckons that the health service will be facing a "real-terms reduction of between £8bn and £10bn over the three years after 2011".

The budget for the NHS in England in 2010-11 is forecast to be just under £110bn, so the predicted shortfall between rising costs (due to an ageing population and the increasing cost of treatments) and the budget is substantial. The Office of Health Economics puts £8-10 billion in context for a population of 25 million: it could (1) provide family health or mental health services for 1 year, (2) provide cancer treatment for 2 years, (3) provide care for normal births for 27 years, or (4) provide prescriptions for 1.6 years.

As the new Chancellor, George Osborne, calls for a public consultation on where cuts should be made in public services, there are already several theories about where we should save in healthcare. At a conference of the British Medical Association this week, the GP leader advised reducing the "bureaucracy tied to the NHS market, management consultants, patient surveys and management tiers". He also called the role of NHS Direct into question. I blogged about the cost of healthcare consulting a few weeks ago, quoting the £600 million that is spent annually on consulting rather than on treating patients. Using the Office of Health Economics figures above, that money could have paid for family health or mental health services for 1 month, or for normal baby deliveries for 2 years. As a practising doctor, I can say that this kind of comparative cost data about treatments is hard to come by, so it must be even harder for patients and the general public to find out where their money is being spent. The situation is the same across other government departments. If George Osborne wants proper engagement with the public, this kind of data is needed.

This week's BMJ editorials include two salutary examples of where money is being wasted in the current NHS. Firstly, the case of swine flu and the massive stockpiles of Tamiflu and vaccines, which have made drug companies US$7-10 billion, could not show more clearly what happens when evidence is not part of health policy. Add to that the issue of conflict of interest, and the amount of money wasted (or swindled) goes up exponentially. Secondly, there is the "risk sharing" scheme which provided interferon to multiple sclerosis patients, despite NICE recommendations that it was an ineffective treatment. Risk sharing meant that drug companies and government worked together to provide disease modifying treatments within the NHS under the conditions of a large study. "If the drugs were more effective than the NICE predictions, and so achieved cost effectiveness, then all would be well. If not, there would need to be a financial reckoning—payback from the drug industry to the Department of Health or reduced drug costs—to achieve "affordability" post hoc." The results now show that interferon does no good, but are the drug companies keeping their side of the bargain? Of course not.

The good news is that a methodology already exists to evaluate treatments and healthcare in general. That methodology is EBM, and it needs to be tied more closely to health policy if we are to have any chance of reducing waste in the NHS.

A confidentiality agreement, also known as a non-disclosure agreement, is a legally binding contract between two parties who wish to share information with one another but to restrict access by anyone else. Basically: keep it a secret. They are used by companies when they are exploring a potential business relationship. Some of these agreements are signed with employees or paid consultants, restricting the use of materials and confidential information. In some circumstances the very existence of such an agreement cannot be disclosed at all.

Some of you may never have heard of such agreements or considered them important. However, in the last few years I've had numerous organizations approach me, mainly with regard to monitoring and diagnostic technologies. And even before talking, they want you to sign one of these confidentiality agreements, or NDAs (non-disclosure agreements) as they are known in the trade.

To this, I have a perfectly simple answer. ‘If you want perfect confidentiality, then let’s not meet. Otherwise, if you want to keep it confidential then don’t mention it, but I won’t be signing any agreement now or in the future.’

So it is worrying when, this week, we find out from Fiona Godlee at the Council of Europe that:
‘Also worrying is the existence of a secret "emergency committee" which took key decisions relating to the pandemic: first the decision to downgrade the definition of pandemic in May 2009, and then to announce the pandemic one month later, triggering pre-established vaccine contracts around the world.’

And from Deborah Cohen's investigative work on the WHO and the pandemic flu "conspiracies":
‘And why does the composition of the emergency committee from which Chan sought guidance remain a secret known only to those within WHO? We are left wondering whether major public health organisations are able to effectively manage the conflicts of interest that are inherent in medical science.’

Recently I gave a talk and realized I was the only person in the room not being paid, and, as it turned out, the only person who hadn't signed up to a five-year NDA.

Therefore, it is highly likely that anyone who has a conflict of interest has signed an NDA at some point, and carries a major bias. This should now be added to journal submissions as part of the conflict of interest statement, and be disclosed by the public committees that set our health policies. In fact, if you've signed one of these, you shouldn't be allowed on the committees in the first place.

Atrial fibrillation - potential of new treatments

Ami Banerjee
Last edited 21st April 2011

Everybody seems to be interested in atrial fibrillation (AF) at the European Stroke Conference this week in Barcelona. So they should be: AF is the most common heart rhythm problem, affecting up to 1% of the population and becoming more common with age. Usually an electrical impulse passes from the upper chambers of the heart (the atria) to the lower chambers (the ventricles), which pump blood around the body. If the electrical impulse is not conducted properly, the atria "fibrillate" or "flutter" instead of beating in the usual coordinated way. When this happens, clots tend to form in the heart. AF increases the risk of stroke 5-fold because these clots can be thrown off into the brain. Strokes due to AF tend to be more severe and disabling than other types of stroke, with a 25% mortality rate. But we have known about AF and the need to prevent strokes by thinning the blood with warfarin for many years, so why all the sudden interest?

From existing trial data, we know that warfarin is better than aspirin plus clopidogrel, and better than aspirin alone, at preventing strokes in people with AF. Warfarin is therefore recommended in people at moderate or high risk of stroke. The problem is that people on warfarin need their blood monitored, and there is a risk of bleeding, as well as many interactions with other drugs and with food. In people who cannot take warfarin, aspirin is better than nothing. I have previously mentioned the evidence that self-monitoring of INR levels in patients taking warfarin leads to better results. However, there has been a search for new drugs which avoid blood monitoring and are easier to take.

Dabigatran is such a drug. The RE-LY trial compared dabigatran to warfarin for stroke prevention in patients with AF and other risk factors for stroke, such as age or heart failure. The relative risk of stroke with dabigatran was two-thirds that of warfarin (a quick reminder of what relative risk means is sketched below). Importantly, the new drug caused much less bleeding, and patients do not have to worry about which other drugs they are taking or what they are eating. There are lots of other drugs being trialled at the same time, but dabigatran is probably the most promising and has the best trial results so far.
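
As a reminder, relative risk on its own says nothing about how common the outcome is; what matters to an individual patient is the absolute risk reduction. Here is a minimal sketch using a purely hypothetical baseline stroke rate (not the RE-LY figure) to show what "two-thirds the relative risk" would translate into.

```python
# Hypothetical baseline: yearly stroke risk on warfarin (NOT the RE-LY figure)
risk_warfarin = 0.017          # 1.7% per year, assumed for illustration

relative_risk = 2 / 3          # "two-thirds that of warfarin"
risk_dabigatran = risk_warfarin * relative_risk

# Absolute risk reduction and number needed to treat for one year
absolute_risk_reduction = risk_warfarin - risk_dabigatran
number_needed_to_treat = 1 / absolute_risk_reduction

print(f"Risk on dabigatran: {risk_dabigatran:.2%} per year")
print(f"Absolute risk reduction: {absolute_risk_reduction:.2%} per year")
print(f"Number needed to treat for one year to prevent one stroke: {number_needed_to_treat:.0f}")
```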

Another approach is to treat the AF directly and try to return the heart rhythm to normal. This has traditionally been done with drugs (anti-arrhythmics such as amiodarone), but surgical therapies have become available over the last 20 years. Over the last 10-15 years, cardiologists have increasingly used techniques to burn away the faulty electrical pathways in the atria. The procedure is currently recommended in patients who do not respond to drugs and still have symptoms from their AF, such as palpitations. Trials of this lengthy procedure are ongoing, but one of the presenters (a French cardiologist and proponent of the technique) raised the issue that it is difficult to envisage a time when the procedure will be available to a large enough section of the population with AF. In other words, it requires so much time and expertise that providing enough training, staff and hospital resources may be challenging. But would we want to offer everybody this procedure anyway? There must be cheaper, more effective alternatives.

Stroke and AF are thoroughly intertwined, and both are growing problems in ageing populations across the world. That is why researchers and drug companies are so interested: the potential rewards are huge.
