Human error is not solved by looking for culprits, but by looking for solutions


You'll have to forgive the off-topic, but I've just read this article on human error in medicine that struck me as magnificent:

The Wrong Stuff : Risky Business: James Bagian—NASA astronaut turned patient safety expert—on Being Wrong

I've always had the impression that doctors tended to cover for one another, but the reason is that, as the article says, failures aren't analyzed to keep them from happening again (or to make them happen as little as possible); instead, someone is simply declared to be at fault and told "Be more careful and pay more attention," as if that were going to fix anything... here is my (loose) translation of a paragraph that I think sums up the spirit of the article very well:

Take a very simple example: a nurse gives the patient in Bed A the medicine meant for the patient in Bed B. What do you say? "The nurse made a mistake"? That's true, but what's the solution? "Nurse, please be more careful"? Telling people to be careful is not effective. Humans don't work that way; some are better than others, but nobody is perfect. We need a solution that doesn't consist of making people perfect.

So let's ask: "Why did the nurse make the mistake?" Maybe the two medicines look very similar... that's a packaging problem; we can solve it. Or maybe the nurse is expected to hand out pills to ten patients in five minutes. That's an organizational problem; we can solve it. And these solutions can have an enormous impact. Between 7% and 10% of the medicines administered to patients involve some problem: wrong drug, wrong dose, wrong patient... but if you introduce bar coding for medication administration, the error rate drops to 0.1%; that's huge!

What's more, I love this article because it doesn't apply only to medicine; in almost any aspect of life, if, when something fails, we stopped thinking about who screwed up and started thinking about why it failed and what we can do so the error repeats itself as little as possible, the world would be a much better place...

For those of you who have no trouble with English, and at the risk of Sinde deciding to shut down my blog for lifting other people's content wholesale, I'm copying the article in full... I'm sure that, with all the links and attributions properly in place, Kathryn Schulz won't mind my publicizing her work interviewing James Bagian:

In the weeks after I launched this series, several readers e-mailed me to suggest that I interview a man named James Bagian. When I began looking into his background, it became clear to me why: Name a high-stakes industry, and odds are Bagian has been involved in trying to make it safer. He is, among other things, an engineer, an anesthesiologist, a NASA astronaut (he was originally scheduled to be on the fatal Challenger mission), a private pilot, an Air Force-qualified freefall parachutist, and a mountain rescue instructor. And then there's his current job: director of the Veterans Administration's National Center for Patient Safety. In that capacity, Bagian is responsible for overseeing the reduction and prevention of harmful medical mistakes at the VA's 153 hospitals.

Given that most of us are far more likely to find ourselves in a health clinic than a space shuttle, it's sobering to hear Bagian compare the overall attitude toward error in his various fields. "If you look at the percent of budget we spend on safety activity in healthcare versus, say, nuclear power or aviation or the chemical industry, it's not even close," he told me. "Granted, that's just one metric, and I'm not saying money is the be-all end-all. But if people in industry look at what happens in healthcare, they say, ‘Man, this doesn't look like anything we recognize.'" In the below interview, Bagian and I talk about how to make medicine safer, why he doesn't like the word "error," and what it was like to dodge the Challenger bullet.


How does the healthcare industry compare to engineering and aeronautics when it comes to dealing with human error?

Not favorably. Much of my background is in what's called high-reliability industries (the ones that operate under conditions of high hazard yet seldom have a bad event), and people in those fields tend to have a systems perspective. We're not terribly interested in what some individual did. We want to know what led up to a bad event and what changes we need to make to reduce the likelihood of that event ever happening again.

When I got into healthcare, I felt like I'd stepped into an entirely different world. It was all about, "Let's figure out who screwed up and blame them and punish them and explain to them why they're stupid." To me, it's almost like whistling past the grave. When we demonize the person associated with a bad event, it makes us feel better. It's like saying, "We're not stupid so it won't happen to us." Whereas in fact it could happen to us tomorrow.

Why do you think healthcare lags so far behind in this respect?

For one thing, in healthcare there's tons of variation, in both biology and behavior, so physicians are rightly skeptical of the cookie-cutter approach. They think you have to tailor everything to the individual. There's some truth to that, but the tailoring should be based on what helps the patient, not on your own personal preference.

And then, too, medicine is much older than these other fields, eons old, and for most of that time there wasn't PubMed or the AMA or what have you. It was all about the expertise of the individual practitioner. It's a short step from there to assuming that problems in medicine stem from problematic individuals. That's why we have this whole "train and blame" mentality in medical culture; someone makes a mistake, you train them not to do it anymore, and then you punish them if it happens again. I think we've ridden that horse about as far as we can.

That suggests that the biggest obstacle to reducing medical error is medical culture, rather than our understanding of the human body or the quality of the available technologies and treatments.

It's all those things, but first and foremost, yes, it's cultural. But I should say before we go any further that I don't usually use the term "error." For starters, it distracts people from the real goal, which isn't reducing error but reducing harm. And it also feeds into precisely the cultural problem we're discussing. It has a punitive feel, and it suggests that the right answer was available at the time, which isn't always the case.

I appreciate that attitude, but some things really are medical errors, right? Bad outcomes don't only happen because a certain piece of information was unknowable or a certain event was unforeseeable. Sometimes doctors just write the wrong prescriptions or operate on the wrong body parts.

That's true, but if at the end of the day all you can say is, "So-and-so made a mistake," you haven't solved anything. Take a very simple example: A nurse gives the patient in Bed A the medicine for the patient in Bed B. What do you say? "The nurse made a mistake"? That's true, but then what's the solution? "Nurse, please be more careful"? Telling people to be careful is not effective. Humans are not reliable that way. Some are better than others, but nobody's perfect. You need a solution that's not about making people perfect.

So we ask, "Why did the nurse make this mistake?" Maybe there were two drugs that looked almost the same. That's a packaging problem; we can solve that. Maybe the nurse was expected to administer drugs to ten patients in five minutes. That's a scheduling problem; we can solve that. And these solutions can have an enormous impact. Seven to 10 percent of all medicine administrations involve either the wrong drug, the wrong dose, the wrong patient, or the wrong route. Seven to 10 percent. But if you introduce bar coding for medication administration, the error rate drops to one tenth of one percent. That's huge.
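
The improvement Bagian quotes can be sanity-checked in a few lines. The annual volume below is an invented assumption, purely for illustration; only the 7-10% and 0.1% rates come from the interview:

```python
# Rough sanity check of the medication-error figures quoted above.
# Assumption (not from the interview): a hypothetical hospital doing
# 1,000,000 medication administrations per year.
administrations = 1_000_000

rate_before_low, rate_before_high = 0.07, 0.10  # 7-10% involve a problem
rate_after = 0.001                              # 0.1% with bar coding

errors_before_low = administrations * rate_before_low    # ~70,000 per year
errors_before_high = administrations * rate_before_high  # ~100,000 per year
errors_after = administrations * rate_after              # ~1,000 per year

# Relative improvement: roughly a 70x to 100x reduction.
print(errors_before_low / errors_after)   # roughly 70
print(errors_before_high / errors_after)  # roughly 100
```

In other words, "huge" is not an exaggeration: at that volume, bar coding would prevent tens of thousands of problem administrations per year in a single hospital.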

If the biggest obstacles to improving medical safety are cultural, how do you go about changing the culture?

Some of it is about philosophy. We're very, very clear about the fact that patient safety is not someone else's issue. We say, "Everyone here is your patient." Even if they're not directly yours, they're being cared for by this organization, and if you see something that puts someone at greater risk, it is your moral responsibility to intervene with their caregiver to make sure the right thing happens. You don't just say "That's not my business." Baloney. If it was your kid, would you get involved? Then why don't you do it for this patient?

And some of it is about tools. You can't change the culture by saying, ‘Let's change the culture.' It's not like we're telling people, "Oh, think in a systems way." That doesn't mean anything to them. You change the culture by giving people new tools that actually work. The old culture has tools, too, but they're foolish: "Be more careful," "Be more diligent," "Do a double-check," "Read all the medical literature." Those kinds of tools don't really work.

What kinds of tools have you introduced that do work?

One thing we do that's unusual is we look at close calls. In the beginning, nobody did that in healthcare. Even today probably less than 10 percent of hospital facilities require that close calls be reported, and an even smaller percentage do root cause analyses on them. At the VA, 50 percent of all the root cause analyses we do are on close calls. We think that's hugely important. So does aviation. So does engineering. So does nuclear power. But you talk to most people in healthcare, they'll say, "Why bother? Nothing really happened. What's the big deal?"

How do you get people to tell you about their close calls, or for that matter about their actual errors? Getting people to report problems has always been tricky in medicine.

Yeah, reporting is a huge issue, because obviously you can't fix a problem if you don't know about it. Back in 1998, we conducted a huge cultural survey on patient safety, and one of the questions we asked was, "Why don't you report?" And the major reason? Most people think it's going to be fear of malpractice or punishment, but it wasn't those. It was embarrassment, humiliation. So the question became, How do you get people to not be afraid of that? We talked about it a lot, and we devised what we called a blameworthy act, which we defined as possessing one of the following three characteristics: it involves assault, rape, or larceny; the caregiver was drunk or on illicit drugs; or he or she did something that was purposefully unsafe. If you commit a blameworthy act, that's not a safety issue, although it might manifest as one. That's going to get handled administratively, and probably you should be embarrassed. But we made it clear that blameworthiness was a very narrow case.

At the time that we conducted this survey, we were already considered to be a good reporting healthcare organization; our people reported more than in most places. But in the ten months after we implemented this definition, our reporting went up 30 fold. That's 3,000 percent. And it has continued to go up ever since; not as dramatically, but a couple of percentage points every year.

It's pretty sobering that the reporting rate can go up so much. I realize that that's good news, but it also suggests that there was (and to a lesser extent presumably still is) a lot of bad stuff going on out there that we never hear about.

That's true. But the only reason to have reporting is to identify vulnerabilities, not to count the number of incidents. Reports are never good for determining incidence or prevalence, because they're essentially voluntary. Even if you say "You must report," people will only report when they feel like it's in their interest to do so.

Do you punish people for failing to report serious medical issues?

No. In theory, punishment sounds like a good idea, but in practice, it's a terrible one. All it does is create a system where it's not in people's interest to report a problem.

What about public reporting? If the primary purpose of reporting is to identify vulnerabilities, is there any value to making such reports public? There certainly seems to be some movement in that direction within healthcare.

It depends what you're reporting. If you look at our Web site and publications, you'll see that we post risks and advisories, we're open about the problems we have and the steps we need to take. And we don't mince words; it's not like we're afraid to talk about these things.

But the reports that come in raw from the field? I don't think it makes sense to make those public. They're too misleading. People don't understand what they mean, they don't have the knowledge and sophistication and opportunity to get the full facts, and the way something looks at first blush is often not how it looks after an investigation.

What about these scorecards and similar public metrics that some states and institutions are now using? Are you in favor of those?

I don't think they're always bad, but I do think they often kid people about what's going on. When people think they're going to be graded, they're very likely to take action to make themselves look good, to give themselves a business advantage. And that can be dangerous.

Let me give you an analogy. In the United States, airlines are legally prohibited from advertising based on their safety record. The feeling was, if you let airlines compete for customers based on safety, there will be an incentive not to report problems. Suppose I work for an airline whose ad campaign is "We're the safest company, we've never had a flight canceled for maintenance problems." And I'm a mechanic and I see a maintenance issue and I think: "We should hold this flight." But someone above me is saying, "This is going to destroy our advertising campaign, this is an investor-owned airline" and so forth. So I say, "Well, maybe it's not that big of a deal, we'll catch it at the next scheduled maintenance instead of dealing with it now." Do we really want to create that kind of perverse incentive?

On the subject of public awareness of medical safety, I want to ask you about some recent incidents at VA facilities, such as the non-sterile equipment that might have exposed patients to HIV and hepatitis.

Other places have had the same thing happen or worse and done nothing about it. At the Senate hearings, people from other medical systems showed up and said, "Oh, this stuff happens all the time, it's just that the VA told you about it." The Joint Commission said, "The VA's done more than anybody in terms of looking at this and making it better." You have to have a thick enough hide to tolerate some of the unsophisticated responses to the fact that you're publicizing a problem. All those responses do is make some managers who don't have as much courage say "Let's not talk about this in the future." And that means the problems don't get fixed.

What about the VA facility in Philadelphia with the so-called "rogue cancer unit," where almost a hundred patients received inappropriate radiation treatment?

The fact is, there was not a robust quality-control system in place for that kind of treatment, not in Philly and not anywhere. The profession didn't even have standards about how to decide what amount of radioactive seed to give and how to follow up. Penn, which was providing the service, said, "There's no standard, so we didn't violate it." And we said, "Well, there should be a standard, and we should've been enforcing it. We should have stepped into the gap in healthcare, as we've done in many other situations." We hadn't. Now we have. We went and changed it all, and along the way we took our lumps in the press.

But here again, the important fact is that we didn't say, "Let's not talk about it." The easiest thing would have been to fix this one problem and not make a big deal of it and let everybody else fend for themselves. We didn't take that approach. I think that's a good thing. Does it hurt the organization in some ways when people read about it and say, "Oh, look at what's happening in the VA?" A little. But what they don't know is that the same thing or worse is happening in their own hospital.

As a government institution, is the VA legally required to behave differently in terms of reporting and investigating problems than private hospitals and healthcare systems?

No. Nobody told us we had to look at close calls or tell people they can sue us or any of that. It's just what we decided to do. In the VA system, we ask, "What's the right thing for the patient?" That's what guides us. Whereas people in the private sector sometimes say, "We could lose market share if we talk about this publicly, let's not do it."

It's got to be traumatic to be a healthcare provider who is involved in a major medical error. How do you support practitioners in that situation?

We don't deal with that on the safety side. In my opinion, it's a nice thing to do, but it's not the major issue. Quite honestly, I think: "Get over it and grow up." I come from aviation, and we don't have pilot support groups. Would it be helpful to have them? Maybe a little. But I think the far more important thing is: Don't blame people where they shouldn't be blamed. Don't humiliate them publicly. Don't disclose who they are if it wasn't an intentional act. And show them that the problem they reported was fixed, that it was worth taking that risk, because it made things better for other patients.

Let's talk about the patients. What do you do for victims of medical error?

If a patient is harmed by something we've done, we tell them. We explain what happened, we tell them that they're eligible for monetary compensation, and we tell them they can sue us. I don't know any other place that says, "Here's how to bring a tort claim against us." We do. We figure that if we hurt you, whether through malfeasance or not, we should make restitution.

Do you get sued a lot?

There's a ton of information in the malpractice literature about what are called closed claims (the ones that are resolved in court), and what you see is that when the patient feels like they've been dealt with less than candidly, that's when they really go in for the kill. It's like, "Okay, if that's the way you treated me, let's see who pays in the end." It becomes about getting even, which I can understand.

So we make it easy. We just tell them. And we end up getting more torts filed, but the aggregate payment is less, because people aren't trying to get revenge. Most people just want us to pay for something specific, to take care of the problem we created. And a lot of people say, "Thanks for telling me, I'm not glad it happened, but I understand that it wasn't intentional." And that's that.  

Let me shift gears for a bit and ask you a couple of questions about your career as an astronaut. Given that, as you've said, aviation is historically far better about safety issues than medicine, what kinds of things still go wrong up there?

You can never make the probability of failure zero. You can make it really low, but you can never make it vanish. In a high-stakes, high-value system like a space shuttle, we go to great lengths to understand, say, the failure probability for a valve or a fitting or bolt. And then we do what's called a probabilistic risk assessment: we put all of those probabilities together and say, "What level of confidence do we want to have that we won't have a catastrophe?" And management has to decide what that level is. It really comes down to risk, to how much of it we're willing to accept.
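
The kind of "put all of those probabilities together" calculation Bagian describes can be sketched in a few lines. The component names and probabilities below are invented for illustration; a real probabilistic risk assessment also models redundancy, common-cause failures, and event trees:

```python
from math import prod

# Hypothetical per-component failure probabilities (invented numbers).
component_failure_prob = {
    "valve": 1e-4,
    "fitting": 5e-5,
    "bolt": 2e-5,
}

# Simplest case: every component is safety-critical (a series system)
# and failures are independent. The mission is lost unless ALL
# components work, so:
p_loss = 1 - prod(1 - p for p in component_failure_prob.values())
print(p_loss)  # approximately 1.7e-4
```

Management then compares that aggregate number against the level of risk it has decided to accept; the engineering work is in making the per-component estimates honest.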

It seems that NASA's been willing to accept a fair amount of it, given the tragedies that have bedeviled our space program.

NASA has been terrible on this. Not because of how much risk they're willing to accept, and not because they don't do the work to understand it. They always know what the probability of failure is. But, historically at least, they haven't been honest and forthright in talking to the public about it. In the early '80s, before the Challenger accident, they would say (and this is where I think they actually lied; I don't think that's too strong a word), "Flying the shuttle is like flying a 727 to Disney World."

That's absurd. Not only are you more likely to get killed in the shuttle than in an airplane; you're more likely to get killed going up once in the shuttle than if you had flown combat missions for two years in Vietnam. That's a statistical fact, but NASA doesn't make it clear. They might tell the House [of Representatives] that there's a 1.5 percent failure rate, but most Americans don't understand what that means. I mean, 1.5, what is that? Is that a lot? You have to relate it to something that means something to somebody. Otherwise, people have the perception that space flight is safe, and when there's an accident, they're shocked. It's like, "We gotta stop flying." If we want to add additional safeguards because now we're feeling emotional about it, okay, we can do that. But if we're still meeting our design specs for loss, why would we stop flying?

Pretty much everyone who lived in the U.S. at the time can tell you exactly where they were and what they were doing when they found out about the Challenger disaster. You were supposed to be on the Challenger, and then a few months before the fatal mission, your crew was switched out. Where were you instead?

I was there. I helped get everything ready. I babysat the vehicle during tanking. I was at the pad in case there was a pad emergency. When it happened, I was looking at my watch, because every time the shuttle launched, I would think, "I wonder if we're going to lose it during launch this time." That wasn't a fleeting thought, like a low-probability thing that just crossed my mind. It was something I thought seriously about every single time. The riskiest time is between throttle down and throttle up, approximately between 30 seconds and 75 seconds into the mission. So I'm looking at my watch and I'm like, "Okay, we're back to full throttle and it didn't blow. We're over the hump." And then a second later, it went off.

How did you feel?

Was I sad that it happened? Of course. Was I surprised? Not really. I knew it was going to happen sooner or later, and not that much later. At the time, the loss rate was about 4 percent, or one in 25 missions. Challenger was the 25th mission. That's not how statistics works, of course (it's not like you're guaranteed to have 24 good flights and then one bad one; it just happened that way in this case), but still, you think we're going to fly a bunch of missions with a 4 percent failure rate and not have any failures? You gotta be kidding. Anybody who's a realist knows you're going to have losses. Even at 1.8 percent, which is the estimated failure rate these days, how many missions did we go? We went another 70-some missions and had another loss. Well, we were looking at a one in 80 loss rate. That's right on schedule.
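
Bagian's intuition checks out numerically. Treating the missions as independent (an illustrative simplification; the rates are the ones he quotes), the chance of at least one loss is:

```python
# Probability of at least one mission loss, assuming each mission
# fails independently with the same per-mission rate.
def p_at_least_one_loss(per_mission_rate: float, missions: int) -> float:
    return 1 - (1 - per_mission_rate) ** missions

# 4 percent per mission, 25 missions (the Challenger era):
print(round(p_at_least_one_loss(0.04, 25), 2))   # 0.64

# 1.8 percent per mission, another ~70 missions (up to Columbia):
print(round(p_at_least_one_loss(0.018, 70), 2))  # 0.72
```

So in both eras a loss was more likely than not over the program's span; expecting an unbroken run of successes was the unrealistic position.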

That's a really high risk level. Handling it institutionally and politically is one thing, but how do you handle it emotionally?

Everybody's different. People who hadn't been around the high risk stuff themselves, it changed their whole appetite for it. Others looked at it much as I did: It's a shame but it happens, let's go on. I had worked at a test pilot school and some of my best friends were killed while I was there, so it was not an abstract concept to me that people I worked with would be killed doing the job I do.

You were part of the team that investigated the Challenger accident. Were you satisfied with how that investigation was handled?

Overall I didn't have big problems with it. But one thing that was deliberately buried was what happened to the crew. I did that part of the investigation, and there was tremendous political pressure not to tell anyone what happened, not even the other people in the crew office. They didn't learn for months, which was totally inappropriate. They wouldn't even let us put in checklists about what to do in the case of a breakup similar to Challenger. There are ways you could probably survive it, but politically we weren't allowed to discuss that for years, which to me is total hogwash. There are still many people who don't understand that the crew of the Challenger didn't die until they hit the water. They were all strapped into their seats in a basically intact crew module; their hearts were still beating when they hit the water. People think they were blown to smithereens, but that's not what happened. And the problem with that is the same one we were talking about with regard to medicine: if you don't learn what you can from a tragedy, you can't mitigate that risk in the future.

If you could hear someone else interviewed about wrongness, who would it be?

I've been thinking about these issues one way or another for my entire adult life and I've talked to most of the major hitters, so I'm guessing I'm not going to hear much that strikes me as new. But I would be interested to hear what the president [Obama] thinks when outcomes are less than what would be desired.


Kathryn Schulz is the author of Being Wrong: Adventures in the Margin of Error. She can be reached at [email protected].

This interview is part of a series of Q and As in which notable people discuss their relationship to being wrong.  You can read past interviews with hedge fund manager Victor Niederhoffer, mountaineer Ed Viesturs, This American Life host Ira Glass, celebrity chef Anthony Bourdain, Sports Illustrated senior writer Joe Posnanski, education scholar and activist Diane Ravitch, and criminal defense lawyer and pundit Alan Dershowitz.

  1. in reply to Runrun bv
    25/03/11 02:08

    Good debate you've opened here. At my company exactly this happens. I make an effort to identify all the possible causes behind a disaster, while other people limit themselves to pointing out past disastrous events at random and, of course, to looking for culprits. That's just how it is... Fernan2 comes from the IT world; I come from that world too. The problem is that not everyone comes from disciplines where you're taught to reason about the undefined, and you have to live with that.

    I think that if everyone controlled 100% of the processes, we'd go extinct from over-analysis. There are human problems with culprits and with no solution... http://es.wikipedia.org/wiki/Dilema_del_prisionero

  2. in reply to Fernan2
    24/03/11 17:15

    Don't get annoyed with me, Fernan2. I admit I was being tendentious, but look: when you start out in the ER, they train you to avoid exactly those errors. (The first thing you're supposed to say when greeting a patient is her first name and both surnames, to be sure you've got the right person, on the assumption that she will correct you if she is mentally competent. That question gets asked at least ten times over the course of an admission. And that is precisely why.)

    So again, don't get cross with me, but let me point out that the age is already in the chart, and sometimes even a photo, which, frankly, is of little use in many cases (I assure you that you'd mistake my brother for me by his photo and his voice). Simpler would be a system of notes in the rooms, independent of the chart, creating redundancy of information at key points, for example: "the patient is in radiology having an MRI; 10:45 AM." A cheap, simple, fast solution which, by the way, someone suggested and which started being used in that center and another one (you're going to be right: I'm like the politicians).

    Regardless of whether your measures would be useful or not, this is an organizational failure for which the nurse should not be held responsible, at least not entirely; rather her supervisor and management should. Unfortunately, direct action is what counts in people's minds, and it would be very hard to prove and assign moral, professional and, above all, legal responsibility to the right people. In my experience, people go after the weakest link, which produces defensive care that is even worse and erodes the necessary relationship of trust between healthcare staff and patients. That makes things very easy for the people who ought to be getting their act together, instead of enduring, as I had to endure, brutal mortality sessions in which you have to account for why a patient I saw at 1:00 died at 5:00. I've never seen a hospital director withstand that kind of pressure, and on top of that his post is "elective." Unbelievable.

    PS: as for investing and economics, I truly trust your good judgment, as you've shown here and elsewhere on many occasions.

  3. in reply to Runrun bv
    24/03/11 16:57

    Runrun, you're acting like the politicians with me: instead of telling the truth, you tell the part of the truth that suits the argument you'd already decided to make... my mother is 60, and nobody would mistake her for a lady of 85-90, I assure you!! Will there be some case where confusion is possible? Sure, a few, but we'd probably have prevented 9 out of 10 errors just by recording age and a photo. And if there are no means to add a photo, then recording just sex and age will already prevent some errors of that kind (half of them?), at a cost very close to zero. And if for reasons of urgency it can't be done for the guy with the amputated arm, fine, we'll do it only for patients who don't arrive with a life-threatening emergency, which is what, 90% of them? That brings the errors prevented down to 8 out of 10 instead of the 9 out of 10 we were talking about... and that's still a very significant improvement!!

    And mind you, I'm not saying we shouldn't coordinate admissions, orderly and nurse... it's not a matter of choosing one measure or the other; both can perfectly well be done, and things will surely go better. Because you say "coordinate," but then a shift change comes along, or someone is busy with something urgent, and the coordination falls apart just as easily as the card with age and photo does... and if each measure on its own can prevent 8 out of 10 errors, applying them together would prevent 96 out of 100.
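
    The combined-measures arithmetic above can be sketched like this (assuming, as the comment implicitly does, that the safeguards fail independently of one another):

```python
# If each independent safeguard catches a fraction r of errors,
# an error slips through only when EVERY safeguard misses it.
def combined_catch_rate(*individual_rates: float) -> float:
    miss = 1.0
    for r in individual_rates:
        miss *= (1 - r)  # probability this safeguard also misses
    return 1 - miss

# Two measures that each prevent 8 out of 10 errors:
# miss rate 0.2 * 0.2 = 0.04, so 96 out of 100 errors prevented.
print(combined_catch_rate(0.8, 0.8))  # ~0.96
```

    The independence assumption is the weak point in practice (the same rushed shift that defeats one safeguard often defeats the other), but the direction of the argument holds: layered defenses beat choosing a single one.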


  4. in reply to Fernan2
    24/03/11 11:25

    For better or worse we resemble our parents, and more so as we get older; the photo would have been of little use. The same reasoning applies to age. I think both you and I would trust far more in someone's knowledge of themselves, unless they've been admitted for a mental disorder. I'll go further: only today, thanks to technological advances, is it feasible to systematically add a photo to the medical record (I even make videos of examinations). But how do you manage in the clinic of Quintaneja de la Tomellosa, where people from the whole district come to pick up prescriptions for the "blue capsules with the little stripe down the middle"? And data protection? And the price? And the time it takes? And the cost of that time? And emergency situations? "Look, sir, I know your amputated arm hurts, but come on, smile for the birdie."

    It may be that error prevention in medicine is stuck in prehistory. And I say that fortunately it is, because it seems pure logic to me that the mental mechanisms that allowed individuals without technology to survive and prosper up to the present day should be simpler, more effective and more efficient than the ones NASA uses. Any "ignorant" hunter on the Serengeti would have realized that you must not leave a companion alone at certain moments of the hunt, yet a computer scientist whose brilliant intellect I know well hasn't realized that the key in this case was to coordinate admissions, orderly and nurse. My God, poor economics.

  5. in reply to Runrun bv
    24/03/11 01:07

    Of course human interaction is indispensable, but... the method can also be improved. What if the nurse had a card showing the patient's age and a photo of them? The error would be less likely, wouldn't it?

    And indeed, if the thing ends in tragedy, most likely they'll go after the nurse, or after whatever scapegoat they can find, or after the hospital... that's exactly what I complain about in the post: nobody will think of giving cards to the nurses, which is what would keep the problem from happening again!!


  6. in reply to Gonzoneitor
    23/03/11 22:50

    Good afternoon, Gonzoneitor.

    I absolutely agree with you that we should start acquiring economic and financial literacy as early as school, as one more compulsory subject. But my example was about a little old man, whom we obviously can no longer send to school.

    I don't believe I said the State should watch over the interests of a careless investor. In fact, I commented on the role that, in my humble opinion, the Customer Ombudsman or a bank's customer-service office should play.
    To claim that we enjoy freedom, and to exercise it, we need certain prerequisites of our own: 1) knowledge of the facts, 2) the possibility of choice, 3) the will to act, 4) foresight of consequences.

    1) If we don't know, we are not free, because we're unaware of our possibilities and limitations. Knowledge sets us free; the investor's ignorance is the foundation that enables the strategy of the "advisor" who tries, or has the mission, to swindle, because he has targets to meet, whoever it hurts, whatever happens... Later we'll wash our hands of it with the client's signature.

    2) If we cannot choose, we are not free. Trying to convince the client that there is no choice, because right now no other product better suits his historically conservative profile, greatly limits the trusting investor's freedom.

    3) Without will, there is no freedom. There are mechanisms of individual control based on limiting or overriding one's own will: for example, persuasive rhetoric (as a monologue), misleading advertising, the use of specialized jargon designed to diminish the cognitive capacity of an investor flustered by so many concepts beyond him, false or biased information about the product...

    4) A person does not fully exercise his freedom when he cannot work out what consequences his actions will bring. Acting in ignorance of them entails a narrower margin of freedom, because the doubt remains whether the person would have acted at all, or this way or that, had he known what consequences his action would bring.
    And I won't go into specific examples, though I could describe how no one has ever told me about a product's risks when I go to sign up for it. They always tell me the pretty part. And I'm not elderly, nor illiterate, though I openly admit that when it comes to finance I know next to nothing.

    So let's give a little fish to those who, due to advanced age, illness, or lack of education, can no longer handle the rod themselves. Bank advisors and Customer Ombudsmen: less salary and bonuses, and more humanity!!

  7. in reply to PNeoliberales
    23/03/11 22:18

    How lucky you are to know perfection, so you can head straight toward it without straying

  8. #13
    23/03/11 22:15

    A nurse with over 20 years of experience and great professionalism enters the room and finds a woman of about 60 sitting in the armchair:
    "Good afternoon, Menganita. I'm nurse Zutanita. This medication I'm going to give you (and she shows her the syringe for subcutaneous injection of Fraxiparine) is to keep clots from forming in your blood, since you're going to be admitted and on bed rest for several days. Please lift up your blouse."

    The nurse proceeds without any trouble.

    "An orderly will now come to take you down to radiology, where they'll do an MRI," the nurse continues.
    "Oh, no, but the orderly has already been here," the lady replies.
    "What do you mean he's already been here? Where is he, then?"
    To which the good woman answers, "With my mother, I suppose, waiting for them to finish the MRI."

    This true story, which did not end in tragedy, shows something the article barely touches on: most of the useful information in medicine, today as 2000 years ago, still comes from interaction with other human beings whose behavior is, to some degree, unpredictable.

    What is predictable, however, is that if something does end in tragedy, it doesn't matter how well you've worked: they will demand your head, and all the more so in Spain. This happens above all when the families of the "victims" are upper-middle class with university degrees (according to the latest statistics), far more than when it befalls poor devils with neither education nor resources. Funny thing: in a study carried out by the Santander medical association back in the 90s of the last century, the ridiculous lawsuits that make the judiciary laugh came above all from those favored by the goddess fortune.

    By the way, Spain abounds in economics and business graduates more than in other branches of knowledge; so, even though the statistics say nothing about this, I invite you to reflect.

    But no hard feelings, eh? Surely you're right: the economy is doing just great because of the errors of the doctors who care for the smart alecks who advise the bankers; surely that's it, and you have no self-criticism to do.

    PS: Fernan2, you're a bastard (said with all my affection)

    PS: Honestly, I'm fed up with smart alecks in my consultations. Go figure.

  9. in reply to Siscu69
    23/03/11 22:03

    I feel swindled too, but since they make me say "allegedly", and the little word annoys me... that's why in my comment I said: "I believe it was no such error, but rather their intention."

  10. in reply to Karlicones
    23/03/11 21:33

    Better than intention or error is what Fernan2 wrote: he speaks of fraud. I agree with him; I feel defrauded.

  11. in reply to Fernan2
    23/03/11 21:30

    This is one of the best posts I've heard from you, Fernan2.

  12. #9
    21/03/11 18:43

    This reminds me of one of the remarks made by the protagonist of Michael Crichton's novel "Rising Sun". He said that one of the things that set Americans apart from the Japanese was that when someone made a mistake, Americans tended first to look for someone to blame, while the Japanese first tried to fix the mess. In Spain we act like the Americans, not like the Japanese.

  13. in reply to Karlicones
    21/03/11 18:10

    Karlicones, with all due respect, it's not the same example.
    In the nurse's case, there is no will on the part of the injured party. In your bank example, there is.

    At no point am I defending the banking system. In fact, I don't trust them one bit. But going so far as to argue that institutions should be created to protect us from the banks is wading hip-deep into quicksand. Where do such measures end? You're asking that the administrations be enabled to meddle in our sacred personal freedom; put another way, you institutionalize the State watching over us, empowered to take freedoms away from us for the sake of our well-being... as if we didn't know what's best for ourselves. Do the Spanish State, the autonomous communities, or the town councils know better than you and I what suits us? I think not.

    The solution to the banks' swindles necessarily involves giving the population far more economic and financial literacy than we have. It's an aberration for us to try to tear up the deck of cards whenever the outcome goes against our interests. In law, that is legal certainty. Each of us is responsible, because we are free to make our decisions, be they favorable or harmful, and it is our duty to know the consequences of our actions.

    It's insulting to be branded illiterate, and for that idea to drive the creation of rules for our protection which can end up curtailing our freedoms. Ignorance is no excuse for non-compliance. And if someone doesn't know something, let's give them the tools to learn. Let's give a fishing rod to the hungry, but don't tell me the State takes charge of watching over my interests. That is intolerable and I will not accept it under any circumstance.


  14. in reply to Comstar
    21/03/11 16:32

    The thing is, if we regard human error as inevitable, something that happens to everyone, it's impossible to separate the blockheads from those who are genuinely within "the normal range"... if applying these techniques noticeably reduced the "normal" number of medical errors, the "blockheads" (who would no doubt still exist) would stand out like a giraffe in a flock of sheep, and it would be easier to "weed them out"...


  15. in reply to Efectoyunque
    21/03/11 16:14

    The author himself says so: there is nothing new in the system... what's new is applying it to medicine, a field that in this respect is still in the stone age.


  16. in reply to Karlicones
    21/03/11 16:12

    Obviously, some have turned the "error" into a business model; not just banks, the telecoms too, for example... but in that case we would no longer be talking about an "error", but about FRAUD:

    FRAUD: Spanish Penal Code, Article 248.1. Fraud is committed by those who, with intent to profit, use sufficient deception to cause error in another, inducing them to carry out an act of disposal to their own detriment or that of a third party.


  17. #4
    21/03/11 14:53

    There are systemic failures that make human error inevitable. But there is also a lack of will to do things right.

    Having my wife in the hospital, I've run into a few situations. According to one doctor, my wife's problem wasn't that serious, according to... the actions taken in a previous admission (more than a week earlier)... and if it were up to him, he'd discharge her. The thing is, this doctor thinks "if she looks fine, she is fine". Looking fine means being cheerful, walking, and eating. On that premise, optimists could be sent home to die, and anorexics or the disabled would be the only ones left in a hospital. Judging by what happened in a previous admission ignores the very reason for the new admission.

    This doctor has a gift of gab that is convincing at first. He first recites the textbook, to gain credibility, and then says whatever he pleases, usually nonsense.

    It turns out this doctor's boss thought differently: he preferred to have a recent test in hand, so as to make decisions based on evidence. Is it the system's fault that this doctor is so incompetent? I don't think so.

    This problematic doctor has things in his psyche he needs to sort out: a big ego problem, paranoia, and he lives trapped in his family past, in relation to his grandmother.

  18. #3
    21/03/11 13:48

    Specifically in the nurse's case, it's a problem for her superiors. In assembly-line production the Japanese are the best, and they invented "poka-yoke". Which is to say, I'm no genius; this is 30 years old.
    It consists of making the wrong action impossible. Making it impossible, if you switch on a saw, for your hands to fit in anywhere. It is literally a system built for fools.
    I'm no great engineer, but to me it's as simple as this: the pills for the patient on the window side are ALWAYS prepared by a nurse other than the one who hands them out... for example.
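    The poka-yoke idea sketched above — make the wrong action impossible instead of asking people to be careful — can be illustrated with a minimal dispensing check. The two rules enforced here (wristband must match the dose, and the preparing nurse must differ from the administering nurse) are my own illustrative assumptions, not a real hospital protocol:

    ```python
    # Minimal poka-yoke sketch: the dispensing step refuses to proceed unless
    # the dose's patient ID matches the wristband scanned at the bedside, and
    # the dose was prepared by a different nurse than the one administering it.
    class DispenseError(Exception):
        """Raised when a safety rule blocks the dispensing step."""

    def dispense(prepared_by, administered_by, dose_patient_id, wristband_id):
        if dose_patient_id != wristband_id:
            raise DispenseError("wrong patient: dose for %s, wristband says %s"
                                % (dose_patient_id, wristband_id))
        if prepared_by == administered_by:
            raise DispenseError("same nurse prepared and administered the dose")
        return "dispensed"

    # The happy path works; either rule violation stops the process cold.
    print(dispense("Zutanita", "Menganita_RN", "bed-A", "bed-A"))
    ```

    The point of the design is that the error check lives in the process itself, not in anyone's memory or diligence.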

  19. #2
    21/03/11 11:29

    Excellent post. It's a life philosophy I stick to strictly: what matters is not being perfect (perfection is something to aspire to, while knowing it's unattainable); it's knowing that mistakes will be made and being prepared for them.

    Regards

  20. #1
    21/03/11 10:57

    And when a bank salesman makes the "error" of placing a high-risk product with an illiterate little old man, I ask:

    Is the person who signs the contract always the (only) one to blame? I say NO.

    Why does the bank refuse to even hear of a solution? Undoing the wrong and reallocating that money better. I believe it's because it was no such error, but rather their intention.

    Why doesn't MiFID protect us? Because the bank fills it in however it pleases and the old man just signs. The signature, again!!

    Why does the Customer Ombudsman turn a deaf ear to the client's complaint? Why does he look for a culprit only in the client, and never in the advisor/salesman of the very bank he works for?

    If accountability is demanded for medical negligence, why not for banking "negligence"?

    I'm sorry my English is no longer good enough to read what you've written fluently.

    Regards, and thank you.
