The only supposedly redeeming feature of the idea is that it would reduce Medicare spending by $7.6 billion a year, according to a recent Kaiser Family Foundation study. But that spending would only be shifted, not controlled.
Indeed, total health-care spending would be increased as both employers and seniors paid more. The Kaiser study estimates that 65- and 66-year-olds would spend another $5.6 billion a year, while employers would spend another $4.5 billion. That's spending $10.1 billion to save $7.6 billion.
If it's not prudently managed, Medicare does present a threat to federal finances. Though it has historically spent less per person than private insurers for comparable coverage, its spending has grown faster than American incomes. The problem is rising health-care expenditures, whether public or private.
But that has nothing to do with the number of Medicare beneficiaries. When most Americans turn 65 and retire, they shift from private health insurance to Medicare. More doing so means more Medicare expenditures. But this is a change in where expenditures show up - in public or private budgets - not in total health-care spending.
So there are two questions: How should public expenditures be paid for? And how can health-care spending be better controlled? The Lieberman-Coburn proposals conflate these issues, treating budgetary shifts as real increases or decreases in total health-care spending.
How to finance public insurance is a matter for debate. The payroll contributions that fund Medicare's hospital insurance are one option, and because they fall on all workers, even small rate increases yield substantial revenue. Likewise, a small increase in premiums for Medicare's outpatient coverage would spread the impact widely, with the added benefit that it wouldn't single out those seeking care, as deductibles do.
However, the proposed increase in Medicare outpatient deductibles - from the current average of less than $200 a year to more than $500 - has almost nothing to recommend it besides ideological conviction. There is ample evidence that deductibles dissuade useful as well as useless care, increase administrative complexity and cost, fail to address expensive care for the very ill or injured, and act as a tax on sickness.
Over four decades of economic analysis of American medical care, there has been a persistent emphasis on patient cost-sharing as a solution to growing public health budgets. The case for it would be straightforward if medical care were an ordinary market good.
We allocate bread and circus tickets to those willing to spend what bakers and circus managers charge. Those willing and able to pay get the services and goods, and no central authority need interfere. Hence, if we make patients pay more, those who cannot pay will do without, and total costs will go down. This is doubtless true, but most Americans do not believe medical care should be allocated that way.
Or there's the argument that cost-sharing will get patients to forgo unnecessary care. But there is no solid evidence for this claim, and much contrary evidence. If free medical care led to more reckless overuse, countries like Canada and Germany, where patient costs are either zero or minimal, would suffer disproportionate inflation in expenditures or severe access pressures. They don't.
We have a real problem with medical spending in the United States, but it is not restricted to Medicare. We have little understanding of why other systems have been able to expand public insurance without anything like our growth in costs. And we have reformers who have faith in patient cost-sharing without justification. This combination has left us vulnerable to genuinely daft ideas.
Theodore R. Marmor is a professor emeritus at the Yale School of Management. Jerry Mashaw is a professor at Yale Law School.