Our special report this week argues that change may finally be under way. It is prompted, in part, by a host of information and communication technologies that should make health care much more portable, precise and personal. The spread of electronic medical records and the emergence of a “smart grid” for medicine (so doctors, and in some cases patients, can see what their peers are doing) should bring more transparency. “Intelligent pills” that come tailored to people’s needs should be cheaper than the one-drug-fits-all variety, especially if the doses do not have to rely on humans remembering when to take them. Personal medical monitors and other devices should make it easier to treat expensive chronic conditions that last for years, such as diabetes and heart disease, on a preventive basis. A patient’s ticker can be monitored remotely at home, rather than the patient having to come in for check-ups, and problems can be spotted in advance, thus avoiding costly hospitalisations.
Change is also being prompted by the willingness of doctors and politicians, especially ones in poorer countries, to apply at least some economic tests to medical spending. One example is India (see article), where poor patients mostly have to pay for their own health care: its techniques and business models may yet be copied in the rich world. Another leader is Britain’s National Institute for Health and Clinical Excellence (NICE), which has championed the use of basic economic appraisals, albeit in an over-centralised way. Mr Obama wants to expand comparative effectiveness studies and health technology assessments. These sound boring but could save billions, which is one reason so many health-care firms moan about them.