The Patient Protection and Affordable Care Act, popularly known as Obamacare, will go into effect next year amidst considerable controversy and without a clear understanding of what it will do, how much it will cost and how it will change the face of U.S. health care.
In fact, it has pleased no one. It is unlikely that, as matters now stand, it will completely eradicate the problem of the uninsured, who delay being diagnosed and treated and end up relying on costly alternatives, such as hospital emergency rooms.
To get passed, Obamacare also avoided taking on powerful players in the industry: doctors, insurers, for-profit medical and care facilities and drug companies. While failing to boost competition or institute meaningful cost controls, Obamacare will increase budget deficits and create another unwieldy federal bureaucracy.
It is also clear that Obamacare will not be the comprehensive reform package for the country’s health care industry that both Democrats and Republicans claim America desperately needs. Indeed, health care spending has been growing 1.5 times as fast as GDP. Even though the growth rate slowed in 2009-11, health care spending now stands at $2.7 trillion and continues to rise in relative terms. Its share of GDP will likely surpass 20% by 2021, and projecting recent growth trends forward puts it at 50–70% of GDP by 2080.
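That 2080 figure can be reproduced with a simple compound-growth calculation. The sketch below is illustrative only: the 4% nominal GDP growth rate is an assumption, not an official forecast, and the starting point is taken from the figures above ($2.7 trillion in health spending against a roughly $15 trillion economy).

```python
# Back-of-the-envelope projection of health care's share of GDP, assuming
# (hypothetically) nominal GDP growth of 4%/yr and health spending growing
# 1.5x as fast, i.e. 6%/yr. Starting share: $2.7T / ~$15T GDP, about 18%.

def projected_share(share0, gdp_growth, excess_factor, years):
    """Share of GDP after `years`, if spending grows excess_factor x GDP's rate."""
    spend_growth = gdp_growth * excess_factor
    return share0 * ((1 + spend_growth) / (1 + gdp_growth)) ** years

share_2013 = 2.7 / 15.0  # ~18% of GDP
for year in (2021, 2050, 2080):
    share = projected_share(share_2013, 0.04, 1.5, year - 2013)
    print(f"{year}: {share:.0%}")
```

Under these assumptions the share comes out near 21% in 2021 and around 64% by 2080, consistent with the 20% and 50–70% figures cited above.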
Moreover, these calculations do not take into account “imponderables.” Long-term projections made in 1960 could not, obviously, foresee the invention of new technologies and drugs, envision successes in treating and curing a number of deadly diseases or anticipate today’s healthier lifestyles—all of which have led to a steady increase in life expectancy for all Americans and boosted health care spending and costs.
Bad for Washington
In the final third of the 20th century, the federal government effectively became the underwriter of the American health care system. It provided access to care for many more Americans, especially the elderly, but it was also a crucial factor in allowing medical costs to escalate.
In 1960, when the country spent just $187 billion on health care according to the Census Bureau, patients themselves paid for 47% of it. Currently, out-of-pocket spending stands at just over 10%. Naturally, 50 years ago patients had a stronger incentive to shop around for cheaper services and to look over their health care provider’s shoulder to make sure they were not getting unnecessary procedures.
The government’s contribution, on the other hand, has nearly doubled. In the 1960s, public funds from all sources footed a quarter of the national health care bill; today the government pays for nearly one-half. Medicare and Medicaid account for 20% and 15% of total U.S. health care spending, respectively.
The federal government’s spending on health care currently makes up around 5% of GDP (roughly equaling this year’s projected federal budget deficit). That share will jump next year, however, when the ACA goes into effect and overall health care expenditures increase by 7.4%. Current trends suggest that federal health care spending will surpass 10% of GDP by midcentury, leaving less and less room for defense, Social Security and other priorities.
Even if we accept the Obama administration’s optimistic projections and ignore its critics, the conclusion is still that sooner or later medical spending will bankrupt the U.S.
However, this may be the wrong way of looking at the issue, or at least an incomplete one. True, further increases in health care costs will be bad, possibly disastrous, for government finances. However, they are likely to be good for the U.S. economy and, equally important, for U.S. exports. And since government finances depend on the overall health of the economy, the future may not look so dim after all.
Take the prescription drug benefit for seniors known as Medicare Part D, enacted under the Bush administration in 2003. It was criticized at the time as a wasteful boondoggle, and many fiscal conservatives voted against it. Even though its actual costs have come in at almost half the $111.2 billion projected at the time, it still adds $60 billion annually to the federal budget.
But consider the benefits. Prescription drug sales in the United States experienced a one-off jump of $140 billion in 2006, the year the program took effect, reaching $3.4 trillion. Higher spending on prescription drugs since then has provided a shot in the arm for Big Pharma at a time when many leading drug companies were sitting on expiring patents and had few blockbuster drugs in the pipeline.
Health care, pharmaceutical, biotech and life science stocks have been the best performers of the 2013 stock market rally. While biotech startups as a group may be vulnerable to an eventual rise in U.S. interest rates, pharma-centered ETFs have been climbing strongly. Small wonder: drug companies have been coming out with a new generation of drugs to treat serious diseases. Shares of Bristol-Myers Squibb (BMY), for example, popped this year after several years in the doldrums, thanks to its new diabetes drug.
The health care sector has been immune to the jobs crisis that has gripped the U.S. economy over the past five years. While the overall U.S. labor market has yet to return to its pre-2008 levels, employment growth in health care and social assistance has continued to average 2% annually. In fact, since 2001 the number of jobs in the field has increased by 28%, to nearly 19 million, while overall job growth has been an anemic 5% over the same period.
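Those two figures hang together arithmetically. As a quick illustration (assuming the 12-year span from 2001 to 2013 implied by the article’s dates), 2% annual growth compounds to roughly the cumulative 28% cited:

```python
# Compound 2% annual employment growth over the 12 years from 2001 to 2013.
# The cumulative result, about 27%, is consistent with the ~28% figure cited.
years = 2013 - 2001
cumulative = 1.02 ** years - 1
print(f"{cumulative:.0%}")  # prints "27%"
```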
While health care and biotech have long been mainstays of cities such as San Diego and Boston, the industry is also helping depressed industrial regions reinvent themselves. Pittsburgh has done so successfully over the years, and Louisville has benefited from the presence of health insurance giant Humana. Cleveland, a perennial rustbelt laggard, has a strategy to attract life science companies and biotech startups centered on its world-renowned Cleveland Clinic.
World renown, in fact, is key to the future of the U.S. health care sector, broadly defined, because American medicine remains the best in the world and leads globally in scientific innovation.
The phenomenon of medical tourism has traditionally centered on Americans traveling to less developed countries, such as Mexico, Argentina or India, for costly treatment. But now that emerging economies have begun to grow a middle and upper middle class, a reverse trend is in evidence. Those who can afford it flock to the U.S. to get state-of-the-art medical care.
Moreover, the development of communication technologies now allows patients to be monitored by their doctors remotely. This will become widespread as medical advances increasingly transform acute illnesses, such as cancer, into chronic conditions.
Drugs, medical equipment and medical devices have always had vast export potential. The problem was that the rest of the world was too poor to provide demand. This is likely to change, as well, once incomes increase worldwide. Moreover, by the second half of the 21st century demographers expect life expectancies in developing countries to increase and their populations to age. Advances in geriatric medicine that U.S. health care and life sciences companies are making today will have substantial export markets three decades from now.
The health care sector will contribute to the overall health of the U.S. economy, adding jobs and paying taxes. Moreover, as was the case in computing, technological and scientific progress has the potential to reduce costs over the long run.
So far, progress in health care has mainly boosted costs, while cost reductions, such as shorter hospital stays and patient information sharing, have been modest. But eventually the cost cuts may prove significant, especially if science not only extends lifespans but also slows aging and senescence. Such advances require investment, which the federal government, profligate as it is, seems willing to provide.