
Monday, April 21, 2014


Who Shall Decide?

The press has recently included a number of articles about a movement under way to have doctors take cost into account when making clinical decisions.  The idea is that while the ethos of medicine has held that the doctor should consider only what is clinically best for the patient, costs have reached such a level that perhaps they should be considered as well.

There is some disagreement about that, of course, and in the New York Times article of April 18 on the subject, Dr. Martin Samuels, chief of neurology at Brigham and Women’s Hospital, was quoted as saying “There should be forces in society who should be concerned about the budget, about how many M.R.I.s we do, but they shouldn’t be functioning simultaneously as doctors.”

I see merit in that point of view, but it leaves open the question of who those “forces” should be. 

The “who” most active in this area at present are insurance companies, not my first choice as I think they will always be suspected of being motivated more by financial considerations than by the interests of patients. 

I’m not thrilled about government doing it, either.  I wouldn’t feel comfortable having decisions about how much my health care can cost in the hands of politicians mainly concerned about surviving the next election.

My vote would be for my local, non-profit community hospital, controlled by trustees who are my friends and neighbors, united with its doctors, and operating in a market designed to reward providers that deliver the best value for money spent – best value being defined as my best interests.

 

Monday, April 14, 2014


Professional Independence

Professional independence has long been a strong element of the culture of health care generally and of the medical profession in particular.  “Following doctor’s orders” was something good patients did and nobody wanted to be accused of “interfering in the practice of medicine.”

That now seems to be in a state of demise. 

The April 8 issue of The Boston Globe carried an article about cutbacks in painkiller prescriptions by Massachusetts Blue Cross Blue Shield.  The key sentence reads “Faced with concerns about a rise in opiate abuse, Blue Cross implemented changes in July 2013 that have reduced prescriptions by 20 percent for common opioids such as Percocet and 50 percent for longer-lasting drugs such as Oxycontin….”  The changes were limits on the number of days’ supply per prescription and the number of refills eligible for payment.  Prior authorization by Blue Cross was also required in some cases.

As I read through the article my thought was that either some bad medicine was being practiced or Blue Cross was interfering in doctors’ business.  If the former, somebody would “view with alarm.”  If the latter, somebody representing the profession would complain.

Neither turned out to be the case.  Instead Dr. Ronald Dunlap, president of the Massachusetts Medical Society, was reported as saying that “Blue Cross is heading in the right direction.”

It looks as if the day is coming when the professional independence of medicine will be a thing of the past.  Doctors who wonder how that happened need only look in the mirror.

Saturday, April 05, 2014


Planned vs. Market Economy

We still can’t decide whether we want the health care economy to be centrally planned or market driven.

The situation is illustrated in Omaha, where the University of Nebraska Hospital system has decided to discontinue an arrangement in which it cooperated with Alegent, an affiliate of the Creighton University Medical School, in providing Level I emergency trauma services.  Previously, each had a Level I program but they rotated the days on which they operated.  Alegent announced it would continue to operate its Level I program, so now Omaha will have two programs, each open 24/7.  Although the article did not say so, Medicare, Medicaid and private insurance companies will presumably continue to pay as before.

The reason given by the University of Nebraska is that while the previous arrangement worked well enough, a divided program was not eligible for American College of Surgeons certification as a Level I facility.  Currently the programs have state certification only.  ACS certification is the gold standard, and not having it has made staff recruitment more difficult and generally detracted from the University’s status as an academic medical center.

There may be something to that, but the impression left by the article is that the University, being the stronger of the two, decided it could make it on its own, get ACS certification, and get a leg up on its competition.

The volume of Level I cases in Omaha, about six per day, is well within the capacity of a single unit.  So if operations continue as in the past, having two full-blown programs will clearly be more expensive.  In other kinds of activity, that disadvantage would be offset by market competition, in which each would strive to grow and prosper by becoming more attractive and efficient than the other.  But if only one were to continue in operation, it would be considered a monopoly and regulated as such.

But it seems we are to have neither one.

And we wonder why the cost of care is so high.

Tuesday, March 25, 2014


A Managerial Blind Spot

The healthcare.gov website fiasco calls attention to a persistent management problem: namely, the frequency with which IT projects fail.

Anyone wishing to pursue the matter can simply Google up “IT project failure.”   One reference it produces quotes a study by the McKinsey consulting company reporting that “On average, large IT projects run 45 percent over budget and 7 percent over time, while delivering 56 percent less value than predicted.”  A survey by KPMG consultants in New Zealand found that “….70% of organizations have suffered at least one project failure in the prior 12 months.”

A variety of reasons are given to explain this pattern of chronic failure.  All undoubtedly have some merit.  My own view is that it is attributable mainly to inadequate management: what I call a managerial blind spot.

Complex projects (and all IT projects are complex) can be greatly affected by detail.  An analogy would be that of adding a foot to the width of a bathroom while a house is under construction.  Doing that might seem like a small thing but because of its potential effects on everything else it could easily cause a major increase in cost and a delay in completion.

In the case of healthcare.gov, it is reported that there was a question as to whether anyone would be allowed to go onto the website and see the coverage options available and the cost of each.  Massachusetts had done that, and the federal programmers were following the Massachusetts example.  Late in the planning process, someone decided that was not a good approach and that people should first have to learn whether they were eligible for subsidies.  Implementing that decision required that a lot of the programs be rewritten.

The point here is that if IT projects are to be successful, the end product must be clearly defined in advance and any issues involved in achieving it must be resolved.  Contemporary managers are not particularly good at that.  They like to think of themselves as “big picture” people who delegate what they see as small stuff to others.  But in complex projects, what seems to be small stuff, like widening a bathroom, is behind many of the failures.  Also, managers are often reluctant to deal with the friction associated with resolving differences, and so issues get kicked down the road.  The result is delay, cost overrun, and a failure to achieve the desired result.

Obama took a lot of heat for the debacle, and appropriately so.  But he doesn’t have to feel particularly lonesome in his misery.

Sunday, March 16, 2014


Major and Unremarked Change

The place of the medical profession in society is undergoing a major, important and largely unremarked change.

The era immediately following the end of World War II might appropriately be characterized as the Golden Age of Medicine.  Nothing happened in health care that was not subject to the profession’s approval.  I remember that sometime around 1960 polio vaccine became available and we took our two young boys over to the local school for their free inoculations.  Nurses were doing the work but two doctors were quite obviously in attendance; reminders that it was happening with the approval of the local county medical society.

At that time, it was common that medical society membership was a prerequisite for appointment to a hospital’s medical staff, thus giving the profession effective control over who could practice medicine in the community.  Hospitals ostensibly were controlled by their trustees, but medical staffs were considered to be “self-governing” and woe betide any administrator or trustee who interfered in professional affairs.

The adoption of Medicare in 1965 marked a big change in all of that.  Up until then, the American Medical Association had been able to block any such thing and so Medicare was the first health care decision taken by the federal government against medical advice. 

Another marker of change was the quality movement that got seriously under way in the late 1980’s.  Preventing errors and improving outcomes require support and action by institutions.  The loosely structured profession with its emphasis on the independence of the individual practitioner was not able to do it by itself.

A current marker of importance is the growth of salaried practice, to a large extent in the employ of hospitals.  A February 14 article on the subject in the New York Times reported that “About 60 percent of family doctors and pediatricians, 50 percent of surgeons and 25 percent of surgical sub-specialists….are employees rather than independent.”   It has also been reported that the number of physicians employed by hospitals is now greater than the number who are dues-paying members of the AMA.

The NYT article focused on the economic and clinical implications of this trend.  It did not address the social consequences, which may well be greater. 

Tuesday, March 11, 2014


Provider Charges

For years now, hospitals have been charging ridiculously high rates for their services.  Medicare beneficiaries are accustomed to getting reports showing how much the hospital charged and how much Medicare paid, with the latter usually being but a fraction of the former.

I have finally learned how we got into this crazy situation.  It is partly due to insurance companies that negotiate and contract with providers to pay a percentage of charges.  Those contracts periodically come up for renewal, and the negotiating process being what it is, a common result is an increase in the charges and a reduction in the percentage paid.
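Using round, made-up numbers for illustration: if the charge for a procedure is $10,000 and the contract pays 50 percent, the insurer pays $5,000.  At the next renewal the hospital raises the charge to $12,500 and the insurer bargains the rate down to 40 percent; the insurer still pays $5,000, but the posted charge, the one an uninsured patient faces, has grown by 25 percent.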

The other and possibly more important factor has to do with what are called “outliers” in Medicare.  For standard cases like joint replacements and heart attacks, Medicare pays hospitals a predetermined lump sum.  Cases that do not fit any of the established categories are known as “outliers,” and payment for them is determined by an algorithm in which the hospital’s prevailing charges play an important role.  The algorithm works in such a way that a hospital can increase its outlier payments by increasing its charges.  Hospitals have fallen into the pattern of counting on generous outlier payments to compensate for the rather stingy Medicare lump sums, and hospitals for which Medicare covers a large percentage of the total patient load have become financially dependent on payments for outliers.
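To see why such an algorithm rewards higher charges, here is a minimal sketch of the general mechanism: estimated cost is taken to be billed charges times a hospital-specific cost-to-charge ratio, and the hospital is paid a share of whatever that estimate exceeds a threshold.  The figures and the function itself are illustrative assumptions, not the actual Medicare formula.

    # A simplified sketch of how an outlier formula can reward higher charges.
    # The real Medicare calculation is more involved; the cost-to-charge ratio
    # (CCR), fixed-loss threshold, and marginal cost factor used here are
    # illustrative assumptions, not actual CMS figures.

    def outlier_payment(charges, ccr=0.40, drg_payment=10_000,
                        fixed_loss_threshold=25_000, marginal_factor=0.80):
        """Outlier payment for a case with the given billed charges."""
        estimated_cost = charges * ccr  # charges stand in for cost via the CCR
        trigger = drg_payment + fixed_loss_threshold
        if estimated_cost <= trigger:
            return 0.0
        return marginal_factor * (estimated_cost - trigger)

    # Because the CCR is recalculated only periodically, raising charges
    # immediately raises the "estimated cost" and hence the payment:
    print(outlier_payment(100_000))  # 4000.0
    print(outlier_payment(120_000))  # 10400.0 -- same care, higher charges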

The result has become an embarrassment bordering on a scandal but so far nobody has come up with a remedy.  If, as many are urging, the fee-for-service system of reimbursement is replaced by some form of capitation (monthly prepayment), charges will become irrelevant and the problem will go away.  But that is not likely to happen for a while.

Wednesday, February 12, 2014


An Economic Conundrum

Is our outsized health care system an economic plus or minus?

The conventional wisdom, which I have tended to accept, is that our unusually costly system places an undue financial burden on government, makes our industry less competitive internationally and unnecessarily consumes resources that would otherwise be devoted to something else.  Thus, the economy would benefit if we spent less on health care.

In contrast to that, columnist Binyamin Appelbaum argues in the February 9 Sunday New York Times that during the last two years the health care system has been a drag on the economy because it has grown more slowly than the economy as a whole (Will Saving on Health Care Hurt the Economy?).

He also points out that “The health care sector has repeatedly helped to pull the economy from recession in recent decades….”   Certainly, it seems likely that unemployment during the recent recession would have been higher if it were not for the large number of people employed in health care, which is largely recession-proof. 

The stubbornness of unemployment, attributed in large part to the globalization of the labor market and to information technology, raises the question of whether in the foreseeable future there will be enough jobs for all our unskilled and semi-skilled people.  Health care employs a lot of them.  Maybe that is a good thing.

Appelbaum’s column ends by quoting Harvard School of Public Health economist Katherine Baicker, who favors a more efficient health care system.  She said “….for a given outcome, if you could get it with fewer resources, that would be better for everyone.  You could get more health.  You could get more stuff.”

Well, maybe.  But then again, maybe not.
