Tackling health inequalities has become a cornerstone of British government policy. From the corridors of Whitehall to the local board meetings of primary care groups and trusts, the imperative to reduce the health gap between rich and poor is ubiquitous. Yet when action is proposed to turn the rhetoric into reality, particularly when significant investments are required, the cry of “not enough evidence” is invariably heard in opposition.
Over the past decade, evidence-based practice has, quite properly, become a mantra of modern medicine. With the National Institute for Clinical Excellence and the Commission for Health Improvement now in place, medical practice in the NHS cannot get away with being anything other than evidence-based. The problem, though, is that the evidence most readily accepted within the health system comes from scientific methodologies such as randomised controlled trials, which may not be the most appropriate forms of research into health improvement activities like community development, smoking cessation and rehabilitation. The growing understanding that health should be seen holistically—as being more about general wellbeing than the mere absence of disease—has not yet been matched by evidence that policymakers and practitioners can respect.
Throughout the history of the NHS, there has been a tension between the imperative to treat disease in the individual and the need to promote general good health in the population. The two activities have invariably clashed over access to resources and demand on health workers' time. On some occasions, they have come into a more direct conflict, for example during the current arguments about the safety of the MMR vaccine and …