Friday, January 29, 2010

POEM: Encounter on the stairs

http://runningahospital.blogspot.com/2007/08/poetry-from-nurses-and-doctors-part-i.html
The link above leads to a set of poems written by doctors and nurses at a Boston hospital (USA). The one below caught me in the heart.

ENCOUNTER ON THE STAIRS
By Warner V. Slack, MD

Next to Children’s Hospital, in a hurry
Down the stairs, two at a time
Slowed down by a family, moving slowly
Blocking the stairway, I’m in a hurry
I stop, annoyed, I’m in a hurry
Seeing me, they move to the side
A woman says softly, “sorry” in Spanish
I look down in passing, there’s a little boy
Unsteady in gait, holding onto an arm
Head shaved, stitches in scalp
Patch over eye, thin and pale
He catches my eye and gives me a smile
My walk is slower for the rest of the day

Thursday, January 28, 2010

Transcript of Obama's Nobel speech

http://www.nytimes.com/2009/12/11/world/europe/11prexy.text.html?_r=2&pagewanted=1

Published: December 10, 2009
Following is the transcript of President Obama's speech at the Nobel Peace Prize ceremony in Oslo on Wednesday, as released by the White House:
Your Majesties, Your Royal Highnesses, distinguished members of the Norwegian Nobel Committee, citizens of America, and citizens of the world:

I receive this honor with deep gratitude and great humility. It is an award that speaks to our highest aspirations -- that for all the cruelty and hardship of our world, we are not mere prisoners of fate. Our actions matter, and can bend history in the direction of justice.

And yet I would be remiss if I did not acknowledge the considerable controversy that your generous decision has generated. (Laughter.) In part, this is because I am at the beginning, and not the end, of my labors on the world stage. Compared to some of the giants of history who've received this prize -- Schweitzer and King; Marshall and Mandela -- my accomplishments are slight. And then there are the men and women around the world who have been jailed and beaten in the pursuit of justice; those who toil in humanitarian organizations to relieve suffering; the unrecognized millions whose quiet acts of courage and compassion inspire even the most hardened cynics. I cannot argue with those who find these men and women -- some known, some obscure to all but those they help -- to be far more deserving of this honor than I.

But perhaps the most profound issue surrounding my receipt of this prize is the fact that I am the Commander-in-Chief of the military of a nation in the midst of two wars. One of these wars is winding down. The other is a conflict that America did not seek; one in which we are joined by 42 other countries -- including Norway -- in an effort to defend ourselves and all nations from further attacks.

Still, we are at war, and I'm responsible for the deployment of thousands of young Americans to battle in a distant land. Some will kill, and some will be killed. And so I come here with an acute sense of the costs of armed conflict -- filled with difficult questions about the relationship between war and peace, and our effort to replace one with the other.

Now these questions are not new. War, in one form or another, appeared with the first man. At the dawn of history, its morality was not questioned; it was simply a fact, like drought or disease -- the manner in which tribes and then civilizations sought power and settled their differences.

And over time, as codes of law sought to control violence within groups, so did philosophers and clerics and statesmen seek to regulate the destructive power of war. The concept of a "just war" emerged, suggesting that war is justified only when certain conditions were met: if it is waged as a last resort or in self-defense; if the force used is proportional; and if, whenever possible, civilians are spared from violence.

Of course, we know that for most of history, this concept of "just war" was rarely observed. The capacity of human beings to think up new ways to kill one another proved inexhaustible, as did our capacity to exempt from mercy those who look different or pray to a different God. Wars between armies gave way to wars between nations -- total wars in which the distinction between combatant and civilian became blurred. In the span of 30 years, such carnage would twice engulf this continent. And while it's hard to conceive of a cause more just than the defeat of the Third Reich and the Axis powers, World War II was a conflict in which the total number of civilians who died exceeded the number of soldiers who perished.

In the wake of such destruction, and with the advent of the nuclear age, it became clear to victor and vanquished alike that the world needed institutions to prevent another world war. And so, a quarter century after the United States Senate rejected the League of Nations -- an idea for which Woodrow Wilson received this prize -- America led the world in constructing an architecture to keep the peace: a Marshall Plan and a United Nations, mechanisms to govern the waging of war, treaties to protect human rights, prevent genocide, restrict the most dangerous weapons.

In many ways, these efforts succeeded. Yes, terrible wars have been fought, and atrocities committed. But there has been no Third World War. The Cold War ended with jubilant crowds dismantling a wall. Commerce has stitched much of the world together. Billions have been lifted from poverty. The ideals of liberty and self-determination, equality and the rule of law have haltingly advanced. We are the heirs of the fortitude and foresight of generations past, and it is a legacy for which my own country is rightfully proud.

And yet, a decade into a new century, this old architecture is buckling under the weight of new threats. The world may no longer shudder at the prospect of war between two nuclear superpowers, but proliferation may increase the risk of catastrophe. Terrorism has long been a tactic, but modern technology allows a few small men with outsized rage to murder innocents on a horrific scale.

Moreover, wars between nations have increasingly given way to wars within nations. The resurgence of ethnic or sectarian conflicts; the growth of secessionist movements, insurgencies, and failed states -- all these things have increasingly trapped civilians in unending chaos. In today's wars, many more civilians are killed than soldiers; the seeds of future conflict are sown, economies are wrecked, civil societies torn asunder, refugees amassed, children scarred.

I do not bring with me today a definitive solution to the problems of war. What I do know is that meeting these challenges will require the same vision, hard work, and persistence of those men and women who acted so boldly decades ago. And it will require us to think in new ways about the notions of just war and the imperatives of a just peace.

We must begin by acknowledging the hard truth: We will not eradicate violent conflict in our lifetimes. There will be times when nations -- acting individually or in concert -- will find the use of force not only necessary but morally justified.

I make this statement mindful of what Martin Luther King Jr. said in this same ceremony years ago: "Violence never brings permanent peace. It solves no social problem: it merely creates new and more complicated ones." As someone who stands here as a direct consequence of Dr. King's life work, I am living testimony to the moral force of non-violence. I know there's nothing weak -- nothing passive -- nothing naïve -- in the creed and lives of Gandhi and King.

But as a head of state sworn to protect and defend my nation, I cannot be guided by their examples alone. I face the world as it is, and cannot stand idle in the face of threats to the American people. For make no mistake: Evil does exist in the world. A non-violent movement could not have halted Hitler's armies. Negotiations cannot convince al Qaeda's leaders to lay down their arms. To say that force may sometimes be necessary is not a call to cynicism -- it is a recognition of history; the imperfections of man and the limits of reason.

I raise this point, I begin with this point because in many countries there is a deep ambivalence about military action today, no matter what the cause. And at times, this is joined by a reflexive suspicion of America, the world's sole military superpower.

But the world must remember that it was not simply international institutions -- not just treaties and declarations -- that brought stability to a post-World War II world. Whatever mistakes we have made, the plain fact is this: The United States of America has helped underwrite global security for more than six decades with the blood of our citizens and the strength of our arms. The service and sacrifice of our men and women in uniform has promoted peace and prosperity from Germany to Korea, and enabled democracy to take hold in places like the Balkans. We have borne this burden not because we seek to impose our will. We have done so out of enlightened self-interest -- because we seek a better future for our children and grandchildren, and we believe that their lives will be better if others' children and grandchildren can live in freedom and prosperity.

So yes, the instruments of war do have a role to play in preserving the peace. And yet this truth must coexist with another -- that no matter how justified, war promises human tragedy. The soldier's courage and sacrifice is full of glory, expressing devotion to country, to cause, to comrades in arms. But war itself is never glorious, and we must never trumpet it as such.

So part of our challenge is reconciling these two seemingly irreconcilable truths -- that war is sometimes necessary, and war at some level is an expression of human folly. Concretely, we must direct our effort to the task that President Kennedy called for long ago. "Let us focus," he said, "on a more practical, more attainable peace, based not on a sudden revolution in human nature but on a gradual evolution in human institutions." A gradual evolution of human institutions.

What might this evolution look like? What might these practical steps be?

To begin with, I believe that all nations -- strong and weak alike -- must adhere to standards that govern the use of force. I -- like any head of state -- reserve the right to act unilaterally if necessary to defend my nation. Nevertheless, I am convinced that adhering to standards, international standards, strengthens those who do, and isolates and weakens those who don't.

The world rallied around America after the 9/11 attacks, and continues to support our efforts in Afghanistan, because of the horror of those senseless attacks and the recognized principle of self-defense. Likewise, the world recognized the need to confront Saddam Hussein when he invaded Kuwait -- a consensus that sent a clear message to all about the cost of aggression.

Furthermore, America -- in fact, no nation -- can insist that others follow the rules of the road if we refuse to follow them ourselves. For when we don't, our actions appear arbitrary and undercut the legitimacy of future interventions, no matter how justified.

And this becomes particularly important when the purpose of military action extends beyond self-defense or the defense of one nation against an aggressor. More and more, we all confront difficult questions about how to prevent the slaughter of civilians by their own government, or to stop a civil war whose violence and suffering can engulf an entire region.

I believe that force can be justified on humanitarian grounds, as it was in the Balkans, or in other places that have been scarred by war. Inaction tears at our conscience and can lead to more costly intervention later. That's why all responsible nations must embrace the role that militaries with a clear mandate can play to keep the peace.

America's commitment to global security will never waver. But in a world in which threats are more diffuse, and missions more complex, America cannot act alone. America alone cannot secure the peace. This is true in Afghanistan. This is true in failed states like Somalia, where terrorism and piracy is joined by famine and human suffering. And sadly, it will continue to be true in unstable regions for years to come.

The leaders and soldiers of NATO countries, and other friends and allies, demonstrate this truth through the capacity and courage they've shown in Afghanistan. But in many countries, there is a disconnect between the efforts of those who serve and the ambivalence of the broader public. I understand why war is not popular, but I also know this: The belief that peace is desirable is rarely enough to achieve it. Peace requires responsibility. Peace entails sacrifice. That's why NATO continues to be indispensable. That's why we must strengthen U.N. and regional peacekeeping, and not leave the task to a few countries. That's why we honor those who return home from peacekeeping and training abroad to Oslo and Rome; to Ottawa and Sydney; to Dhaka and Kigali -- we honor them not as makers of war, but of wagers -- but as wagers of peace.

Let me make one final point about the use of force. Even as we make difficult decisions about going to war, we must also think clearly about how we fight it. The Nobel Committee recognized this truth in awarding its first prize for peace to Henry Dunant -- the founder of the Red Cross, and a driving force behind the Geneva Conventions.

Where force is necessary, we have a moral and strategic interest in binding ourselves to certain rules of conduct. And even as we confront a vicious adversary that abides by no rules, I believe the United States of America must remain a standard bearer in the conduct of war. That is what makes us different from those whom we fight. That is a source of our strength. That is why I prohibited torture. That is why I ordered the prison at Guantanamo Bay closed. And that is why I have reaffirmed America's commitment to abide by the Geneva Conventions. We lose ourselves when we compromise the very ideals that we fight to defend. (Applause.) And we honor -- we honor those ideals by upholding them not when it's easy, but when it is hard.

I have spoken at some length to the question that must weigh on our minds and our hearts as we choose to wage war. But let me now turn to our effort to avoid such tragic choices, and speak of three ways that we can build a just and lasting peace.

First, in dealing with those nations that break rules and laws, I believe that we must develop alternatives to violence that are tough enough to actually change behavior -- for if we want a lasting peace, then the words of the international community must mean something. Those regimes that break the rules must be held accountable. Sanctions must exact a real price. Intransigence must be met with increased pressure -- and such pressure exists only when the world stands together as one.

One urgent example is the effort to prevent the spread of nuclear weapons, and to seek a world without them. In the middle of the last century, nations agreed to be bound by a treaty whose bargain is clear: All will have access to peaceful nuclear power; those without nuclear weapons will forsake them; and those with nuclear weapons will work towards disarmament. I am committed to upholding this treaty. It is a centerpiece of my foreign policy. And I'm working with President Medvedev to reduce America and Russia's nuclear stockpiles.

But it is also incumbent upon all of us to insist that nations like Iran and North Korea do not game the system. Those who claim to respect international law cannot avert their eyes when those laws are flouted. Those who care for their own security cannot ignore the danger of an arms race in the Middle East or East Asia. Those who seek peace cannot stand idly by as nations arm themselves for nuclear war.

The same principle applies to those who violate international laws by brutalizing their own people. When there is genocide in Darfur, systematic rape in Congo, repression in Burma -- there must be consequences. Yes, there will be engagement; yes, there will be diplomacy -- but there must be consequences when those things fail. And the closer we stand together, the less likely we will be faced with the choice between armed intervention and complicity in oppression.

This brings me to a second point -- the nature of the peace that we seek. For peace is not merely the absence of visible conflict. Only a just peace based on the inherent rights and dignity of every individual can truly be lasting.

It was this insight that drove drafters of the Universal Declaration of Human Rights after the Second World War. In the wake of devastation, they recognized that if human rights are not protected, peace is a hollow promise.

And yet too often, these words are ignored. For some countries, the failure to uphold human rights is excused by the false suggestion that these are somehow Western principles, foreign to local cultures or stages of a nation's development. And within America, there has long been a tension between those who describe themselves as realists or idealists -- a tension that suggests a stark choice between the narrow pursuit of interests or an endless campaign to impose our values around the world.

I reject these choices. I believe that peace is unstable where citizens are denied the right to speak freely or worship as they please; choose their own leaders or assemble without fear. Pent-up grievances fester, and the suppression of tribal and religious identity can lead to violence. We also know that the opposite is true. Only when Europe became free did it finally find peace. America has never fought a war against a democracy, and our closest friends are governments that protect the rights of their citizens. No matter how callously defined, neither America's interests -- nor the world's -- are served by the denial of human aspirations.

So even as we respect the unique culture and traditions of different countries, America will always be a voice for those aspirations that are universal. We will bear witness to the quiet dignity of reformers like Aung San Suu Kyi; to the bravery of Zimbabweans who cast their ballots in the face of beatings; to the hundreds of thousands who have marched silently through the streets of Iran. It is telling that the leaders of these governments fear the aspirations of their own people more than the power of any other nation. And it is the responsibility of all free people and free nations to make clear that these movements -- these movements of hope and history -- they have us on their side.

Let me also say this: The promotion of human rights cannot be about exhortation alone. At times, it must be coupled with painstaking diplomacy. I know that engagement with repressive regimes lacks the satisfying purity of indignation. But I also know that sanctions without outreach -- condemnation without discussion -- can carry forward only a crippling status quo. No repressive regime can move down a new path unless it has the choice of an open door.

In light of the Cultural Revolution's horrors, Nixon's meeting with Mao appeared inexcusable -- and yet it surely helped set China on a path where millions of its citizens have been lifted from poverty and connected to open societies. Pope John Paul's engagement with Poland created space not just for the Catholic Church, but for labor leaders like Lech Walesa. Ronald Reagan's efforts on arms control and embrace of perestroika not only improved relations with the Soviet Union, but empowered dissidents throughout Eastern Europe. There's no simple formula here. But we must try as best we can to balance isolation and engagement, pressure and incentives, so that human rights and dignity are advanced over time.

Third, a just peace includes not only civil and political rights -- it must encompass economic security and opportunity. For true peace is not just freedom from fear, but freedom from want.

It is undoubtedly true that development rarely takes root without security; it is also true that security does not exist where human beings do not have access to enough food, or clean water, or the medicine and shelter they need to survive. It does not exist where children can't aspire to a decent education or a job that supports a family. The absence of hope can rot a society from within.

And that's why helping farmers feed their own people -- or nations educate their children and care for the sick -- is not mere charity. It's also why the world must come together to confront climate change. There is little scientific dispute that if we do nothing, we will face more drought, more famine, more mass displacement -- all of which will fuel more conflict for decades. For this reason, it is not merely scientists and environmental activists who call for swift and forceful action -- it's military leaders in my own country and others who understand our common security hangs in the balance.

Agreements among nations. Strong institutions. Support for human rights. Investments in development. All these are vital ingredients in bringing about the evolution that President Kennedy spoke about. And yet, I do not believe that we will have the will, the determination, the staying power, to complete this work without something more -- and that's the continued expansion of our moral imagination; an insistence that there's something irreducible that we all share.

As the world grows smaller, you might think it would be easier for human beings to recognize how similar we are; to understand that we're all basically seeking the same things; that we all hope for the chance to live out our lives with some measure of happiness and fulfillment for ourselves and our families.

And yet somehow, given the dizzying pace of globalization, the cultural leveling of modernity, it perhaps comes as no surprise that people fear the loss of what they cherish in their particular identities -- their race, their tribe, and perhaps most powerfully their religion. In some places, this fear has led to conflict. At times, it even feels like we're moving backwards. We see it in the Middle East, as the conflict between Arabs and Jews seems to harden. We see it in nations that are torn asunder by tribal lines.

And most dangerously, we see it in the way that religion is used to justify the murder of innocents by those who have distorted and defiled the great religion of Islam, and who attacked my country from Afghanistan. These extremists are not the first to kill in the name of God; the cruelties of the Crusades are amply recorded. But they remind us that no Holy War can ever be a just war. For if you truly believe that you are carrying out divine will, then there is no need for restraint -- no need to spare the pregnant mother, or the medic, or the Red Cross worker, or even a person of one's own faith. Such a warped view of religion is not just incompatible with the concept of peace, but I believe it's incompatible with the very purpose of faith -- for the one rule that lies at the heart of every major religion is that we do unto others as we would have them do unto us.

Adhering to this law of love has always been the core struggle of human nature. For we are fallible. We make mistakes, and fall victim to the temptations of pride, and power, and sometimes evil. Even those of us with the best of intentions will at times fail to right the wrongs before us.

But we do not have to think that human nature is perfect for us to still believe that the human condition can be perfected. We do not have to live in an idealized world to still reach for those ideals that will make it a better place. The non-violence practiced by men like Gandhi and King may not have been practical or possible in every circumstance, but the love that they preached -- their fundamental faith in human progress -- that must always be the North Star that guides us on our journey.

For if we lose that faith -- if we dismiss it as silly or naïve; if we divorce it from the decisions that we make on issues of war and peace -- then we lose what's best about humanity. We lose our sense of possibility. We lose our moral compass.

Like generations have before us, we must reject that future. As Dr. King said at this occasion so many years ago, "I refuse to accept despair as the final response to the ambiguities of history. I refuse to accept the idea that the 'isness' of man's present condition makes him morally incapable of reaching up for the eternal 'oughtness' that forever confronts him."

Let us reach for the world that ought to be -- that spark of the divine that still stirs within each of our souls. (Applause.)

Somewhere today, in the here and now, in the world as it is, a soldier sees he's outgunned, but stands firm to keep the peace. Somewhere today, in this world, a young protestor awaits the brutality of her government, but has the courage to march on. Somewhere today, a mother facing punishing poverty still takes the time to teach her child, scrapes together what few coins she has to send that child to school -- because she believes that a cruel world still has a place for that child's dreams.

Let us live by their example. We can acknowledge that oppression will always be with us, and still strive for justice. We can admit the intractability of deprivation, and still strive for dignity. Clear-eyed, we can understand that there will be war, and still strive for peace. We can do that -- for that is the story of human progress; that's the hope of all the world; and at this moment of challenge, that must be our work here on Earth.

Thank you very much. (Applause.)

Tuesday, January 26, 2010

On matters of grief...

From the New Yorker

Good Grief

Is there a better way to be bereaved?
by Meghan O’Rourke

February 1, 2010

One autumn day in 1964, Elisabeth Kübler-Ross, a Swiss-born psychiatrist, was working in her garden and fretting about a lecture she had to give. Earlier that week, a mentor of hers, who taught psychiatry at the University of Colorado School of Medicine, had asked her to speak to a large group of medical students on a topic of her choice. Kübler-Ross was nervous about public speaking, and couldn’t think of a subject that would hold the students’ attention. But, as she raked fallen leaves, her thoughts turned to death: Many of her plants, she reflected, would probably die in the coming frost. Her own father had died in the fall, three years earlier, at home in Switzerland, peaceful and aware of what was taking place. Kübler-Ross had found her topic. She would talk about how American doctors—who, in her experience, were skittish around seriously ill patients—should approach death and dying.

Kübler-Ross prepared a two-part lecture. The first part looked at how various cultures approach death. For the second, she brought a dying patient to class to talk with the students. Asking around at the hospital, she found Linda, a sixteen-year-old girl with incurable leukemia. Linda’s mother had just taken out an ad in a local newspaper asking readers to send Linda get-well and sweet-sixteen cards. Linda was disgusted by the pretense that her health would improve. She agreed to visit the class, where she spoke openly about how she felt. The students, Kübler-Ross observed, were rapt but nervous. They avoided dealing with the source of their discomfort—the shock of seeing an articulate, lovely young woman on the verge of death—by asking an abundance of clinical questions about her symptoms.

Soon afterward, as her biographer, Derek Gill, relates, Kübler-Ross took a job as an assistant professor of psychiatry at the University of Chicago. Four students from the Chicago Theological Seminary learned that she was interested in terminal illness and asked if she might help them study dying people’s needs. Kübler-Ross agreed to try. At Chicago’s Billings Hospital, she began a series of seminars, interviewing patients about what it felt like to die. The interviews took place in front of a one-way mirror, with students observing on the other side. This way, Kübler-Ross gave the patients some privacy while accommodating the growing number of students who wanted to watch.

Many of Kübler-Ross’s peers at the hospital felt that the seminars were exploitative and cruel, ghoulishly forcing patients to contemplate their own deaths. At the time, doctors believed that people didn’t want or need to know how ill they were. They couched the truth in euphemisms, or told the bad news only to the family. Kübler-Ross saw this indirection as a form of cowardice that ran counter to the basic humanity a doctor owed his patients. Too many doctors bridled at even admitting that a patient was “terminal.” Death, she felt, had been exiled from medicine.

Kübler-Ross began to work on a book outlining what she learned in her work with the dying. It came out in 1969, and, shortly afterward, Life published an article about one of her seminars. (“A gasp of shock jumped through the watchers,” the Life reporter wrote. “Eva’s bearing and beauty flew against the truth that the young woman was terribly ill.”) Kübler-Ross received stacks of mail from readers thanking her for starting a conversation about death. Angered by the article and its focus on death, the hospital administrators did not renew her contract. But it didn’t matter. Her book, “On Death and Dying,” became a best-seller. Soon, Kübler-Ross was lecturing at hospitals and universities across the country.

Her argument was that patients often knew that they were dying, and preferred to have others acknowledge their situation: “The patient is in the process of losing everything and everybody he loves. If he is allowed to express his sorrow he will find a final acceptance much easier.” And she posited that the dying underwent five stages: denial, anger, bargaining, depression, and acceptance.

The “stage theory,” as it came to be known, quickly created a paradigm for how Americans die. It eventually created a paradigm, too, for how Americans grieve: Kübler-Ross suggested that families went through the same stages as the patients. Decades later, she produced a follow-up to “On Death and Dying” called “On Grief and Grieving” (2005), explaining in detail how the stages apply to mourning. Today, Kübler-Ross’s theory is taken as the definitive account of how we grieve. It pervades pop culture—the opening episodes of this season’s “Grey’s Anatomy” were structured around the five stages—and it shapes our interactions with the bereaved. After my mother died, on Christmas of 2008, near-strangers urged me to learn about “the stages” I would be moving through.

Perhaps the stage theory of grief caught on so quickly because it made loss sound controllable. The trouble is that it turns out largely to be a fiction, based more on anecdotal observation than empirical evidence. Though Kübler-Ross captured the range of emotions that mourners experience, new research suggests that grief and mourning don’t follow a checklist; they’re complicated and untidy processes, less like a progression of stages and more like an ongoing process—sometimes one that never fully ends. Perhaps the most enduring psychiatric idea about grief, for instance, is the idea that people need to “let go” in order to move on; yet studies have shown that some mourners hold on to a relationship with the deceased with no notable ill effects. (In China, mourners regularly speak to dead ancestors, and one study has shown that the bereaved there suffer less long-term distress than bereaved Americans do.) At the end of her life, Kübler-Ross herself recognized how far astray our understanding of grief had gone. In “On Grief and Grieving,” she insisted that the stages were “never meant to help tuck messy emotions into neat packages.” If her injunction went unheeded, perhaps it is because the messiness of grief is what makes us uncomfortable.

Anyone who has experienced grief can testify that it is more complex than mere despondency. “No one ever told me that grief felt so like fear,” C. S. Lewis wrote in “A Grief Observed,” his slim account of the months after the death of his wife, from cancer. Scientists have found that grief, like fear, is a stress reaction, attended by deep physiological changes. Levels of stress hormones like cortisol increase. Sleep patterns are disrupted. The immune system is weakened. Mourners may experience loss of appetite, palpitations, even hallucinations. They sometimes imagine that the deceased has appeared to them, in the form of a bird, say, or a cat. It is not unusual for a mourner to talk out loud—to cry out—to a lost one, in an elevator, or while walking the dog.

The first systematic survey of grief was conducted by Erich Lindemann, a psychiatrist at Harvard, who studied a hundred and one bereaved patients at the Harvard Medical School, including relatives of soldiers and survivors of the infamous Cocoanut Grove fire of 1942. (Nearly five hundred people died in that incident, trapped in a Boston night club by a revolving front door and side exits welded shut to prevent customers from ducking out without settling their bills.) Lindemann’s sample contained a high percentage of people who had lost someone in a traumatic way, but his main conclusions have been borne out by other researchers. So-called “normal” grief is marked by recurring floods of “somatic distress” lasting twenty minutes to an hour, comprising symptoms of breathlessness, weakness, and “tension or mental pain,” in Lindemann’s words. “There is restlessness, inability to sit still, moving about in an aimless fashion, continually searching for something to do.” Often, bereaved people feel hostile toward friends or doctors and isolate themselves. Typically, they are preoccupied by images of the dead.

Lindemann’s work was exceptional in its detailed analysis of the experience of the grieving. Yet his conception of grief was, if anything, more rigid than Kübler-Ross’s: he believed that most people needed only four to six weeks, and eight to ten sessions with a psychiatrist, to get over a loss. Psychiatrists today, following Lindemann’s lead, distinguish between “normal” grief and “complicated” or “prolonged” grief. But Holly Prigerson, an associate professor of psychiatry at Harvard, and Paul Maciejewski, a lecturer in psychiatry at Brigham and Women’s Hospital, in Boston, have found that even “normal” grief often endures for at least two years rather than weeks, peaking within six months and then dissipating. Additional studies suggest that grief comes in waves, welling up and dominating your emotional life, then subsiding, only to recur. As George A. Bonanno, a clinical psychologist at Columbia University, writes in “The Other Side of Sadness: What the New Science of Bereavement Tells Us About Life After Loss” (Basic; $25.95), “When we look more closely at the emotional experiences of bereaved people over time, the level of fluctuation is nothing short of spectacular.” This oscillation, he theorizes, offers relief from the stress grief creates. “Sorrow . . . turns out to be not a state but a process,” C. S. Lewis wrote in 1961. “It needs not a map but a history.”

To say that grief recurs is not to say that it necessarily cripples. Bonanno argues that we imagine grief to be more debilitating than it usually is. Despite the slew of self-help books that speak of the “overwhelming” nature of loss, we are designed to grieve, and a good number of us are what he calls “resilient” mourners. For such people, he thinks, our touchy-feely therapeutic culture has overestimated the need for “grief work.” Bonanno tells the story of Julia Martinez, a college student whose father died in a bicycling accident. In the days after his death, she withdrew from her mother and had trouble sleeping. But soon she emerged. She went back to school, where, even if sometimes she felt “sad and confused,” she didn’t really want to talk to her friends about the death. Within a few months, she was thriving. Her mother, though, insisted that she was repressing her grief and needed to see a counsellor, which Julia did, hating every minute of it.

Bonanno wants to make sure that we don’t punish this resilient group inadvertently. Sometimes the bereaved feel as much relief as sorrow, he points out, especially when a long illness was involved, and a death opens up new possibilities for the survivor. Perhaps, he suggests, some mourners do not need to grieve as keenly as others, even for those they most love.

Yet Bonanno’s claims about resilience can have an overly insistent tone, and he himself turns out to be a rather imperfect model of it. He thrived after his own father died, but, as he relates in his book’s autobiographical passages, he became preoccupied, many years later, with performing an Eastern mourning ritual for him. The apostle of resilience is still in the grip of loss: it’s hard to avoid a sense of discordance. All of which forces the question that’s at the heart of all thinking about grief: Why do people need to grieve in the first place?

To the humanist, the answer to that question is likely to be something like: Because we miss the one we love, and because a death brings up metaphysical questions about existence for which we have few self-evident answers. But hardheaded clinicians want to know exactly what grieving accomplishes. In “Mourning and Melancholia” (1917), Freud suggested that mourners had to reclaim energy that they had invested in the deceased loved one. Relationships take up energy; letting go of them, psychiatrists theorize, entails mental work. When you lose someone you were close to, you have to reassess your picture of the world and your place in it. The more your identity was wrapped up with the deceased, the more difficult the loss. If you are close to your father but have only a glancing relationship with your mother, your mother’s death may not be terribly disruptive; by the same token, a fraught relationship can lead to an acute grief reaction.

In the nineteen-seventies, Colin Murray Parkes, a British psychiatrist and a pioneer in bereavement research, argued that the dominant element of grief was a restless “searching.” The heightened physical arousal, anger, and sadness of grief resemble the anxiety that children suffer when they’re separated from their mothers. Parkes, drawing on work by John Bowlby, an early theorist of how human beings form attachments, noted that in both cases—acute grief and children’s separation anxiety—we feel alarm because we no longer have a support system we relied on. Parkes speculated that we continue to “search” illogically (and in great distress) for a loved one after a death. After failing again and again to find the lost person, we slowly create a new “assumptive world,” in the therapist’s jargon, the old one having been invalidated by death. Searching, or yearning, crops up in nearly all the contemporary investigations of grief. A 2007 study by Paul Maciejewski found that the feeling that predominated in the bereaved subjects was not depression or disbelief or anger but yearning. Nor does belief in heavenly reunion protect you from grief. As Bonanno says, “We want to know what has become of our loved ones.”

When my mother died, Christmas a year ago, I wondered what I was supposed to do in the days afterward—and many friends, especially those who had not yet suffered an analogous loss, seemed equally confused. Some sent flowers but did not call for weeks. Others sent well-meaning e-mails a week or so later, saying they hoped I was well or asking me to let them know “if there is anything I can do to help.” One friend launched into fifteen minutes of small talk before asking how I was, as if we had to warm up before diving into the churning waters of grief. Without rituals to follow (or to invite my friends to follow), I felt abandoned, adrift. One night I watched an episode of “24” which established the strong character of the female President with the following exchange about the death of her son:

AIDE: You haven’t let your loss interfere with your job. Your husband’s a strong man, but he doesn’t have your resilience.
PRESIDENT (sternly): It’s not a matter of resilience. There’s not a day that goes by . . . when I don’t think about my son. But I’m about to take this nation to war. Grief is a luxury I can’t afford right now.

This model represents an American fantasy of muscling through pain by throwing ourselves into work; it is akin to the dream that if only we show ourselves to be creatures of will (staying in shape, eating organic) we will stave off illness forever. The avoidance of death, Kübler-Ross was right to note, is at the heart of this ethic. We have a knack for gliding over grief even in literary works where it might seem to be central, such as “Hamlet” and “The Catcher in the Rye.” Their protagonists may be in mourning, but we tend to focus instead on their existential ennui, as if the two things were unrelated. Bonanno says that when he was mourning his father he had to remind himself that “just about any topic pertaining to a dead person . . . still made people in the West uncomfortable.”

Uncomfortable and sometimes—the Johns Hopkins psychologist Kay Redfield Jamison, an expert on bipolar disorder, suggests—impatient. In her new memoir, “Nothing Was the Same” (Knopf; $25), about the death of her husband, Jamison describes an exchange, three months after his death, with a colleague who asked her to peer-review an article. Finding it difficult to switch from contemplative sadness to hardheaded rationalism, Jamison snapped, “My husband just died.” To which her colleague responded, “It’s been three months.” There’s a temporal divide between the mourner and everyone else. If you’re in mourning—especially after a relationship that spanned decades—three months may seem like nothing. Three months, to go by Prigerson’s and Maciejewski’s research, might well find you approaching the height of sorrow. If you’re not the bereaved, though, grief that lasts longer than a few weeks may look like self-indulgence.

Even Bonanno, trying to offer a neutral clinical description of grief, betrays how deeply he has bought into the muscle-through-it idea when he describes a patient who let sad feelings “bubble up” only when she could “afford to.” Many mourners experience grief as a kind of isolation—one that is exacerbated by the fact that one’s peers, neighbors, and co-workers may not really want to know how you are. We’ve adopted a sort of “ask, don’t tell” policy. The question “How are you?” is an expression of concern, but mourners quickly figure out that it shouldn’t be mistaken for an actual inquiry. Meanwhile, the American Psychiatric Association is considering adding “complicated grief” to the fifth edition of its DSM (the Diagnostic and Statistical Manual of Mental Disorders). Certainly, some mourners need more than the loving support of friends and family. But making a disease of grief may be another sign of a huge, and potentially pernicious, shift that took place in the West over the past century—what we might call the privatization of grief.

Until the twentieth century, private grief and public mourning were allied in most cultures. In many places, it used to be that if your husband died the village came to your door, bearing fresh-baked rolls or soup. As Darian Leader, a British psychoanalyst, argues in “The New Black: Mourning, Melancholia, and Depression” (Graywolf; $16), mourning “requires other people.” To lose someone was once to be swept into a flurry of rituals. In many nations—among them China and Greece—death was met with wailing and lamentation among family and neighbors. Some kind of viewing followed the cleaning of the body—what was known as a wake in Ireland, an “encoffining” in China. Many cultures have special mourning clothes: in ancient Rome, mourners wore dark togas, and the practice of wearing dark (or sometimes white) clothes was common in Continental Europe in the Middle Ages and the Renaissance. During the Victorian era in England and the United States, family members followed an elaborate mourning ritual, restricting their social lives and adhering to a dress code. They started in “full mourning” (for women, this was stiff black crêpe) and gradually moved to “half mourning” (when gray and lavender were permitted). Among Hindus, friends visit the house of the bereaved for twelve days and chant hymns to urge the soul on to the next world. In the Jewish shivah, a mourner sat on a low chair and chose whether to acknowledge visitors; those mourning their parents may recite the Kaddish for eleven months, supported by a minyan of fellow-worshippers. Even at the turn of the twentieth century, “the death of a man still solemnly altered the space and time of a social group that could be extended to include the entire community,” notes Philippe Ariès, the author of the magisterial “The Hour of Our Death” (1977), a history of Western attitudes toward dying.

Then mourning rituals in the West began to disappear, for reasons that are not entirely evident. The British anthropologist Geoffrey Gorer, the author of “Death, Grief, and Mourning” (1965), conjectures that the First World War was one cause in Britain: communities were so overwhelmed by the sheer numbers of dead that they dropped the practice of mourning for the individual. Certainly, there does seem to be an intuitive economy of grief: during war, plague, and disaster, elaborate mourning is often simplified or dispensed with, as we now see in Haiti. But many more Americans died during the Civil War than during the First World War; it seems, then, that broader changes in the culture hastened the shift.

Even before the war, according to Emily Post, mourning clothes were already becoming optional for any but the closest of kin. More people, including women, began working outside the home; in the absence of caretakers, death increasingly took place in the protective, and isolating, swaddling of the hospital. With the rise of psychoanalysis came a shift in attention from the communal to the individual experience. Only two years after Émile Durkheim wrote about mourning as an essential social process, Freud’s “Mourning and Melancholia” defined it as something fundamentally private and individual. In a stroke, the work of mourning had become internalized. As Ariès says, within a few generations grief had undergone a fundamental change: death and mourning had been largely removed from the public realm. In 1973, Ernest Becker argued, in “The Denial of Death,” that avoidance of death is built into the human mind; instead of confronting our own mortality, we create symbolic “hero-systems,” conceptualizing an immortal self that, through imagination, allows us to transcend our physical transience. (“In the early morning on the lake sitting in the stern of the boat with his father rowing, he felt quite sure that he would never die,” the young Nick Adams thinks in the last line of Ernest Hemingway’s “Indian Camp.”) Gorer himself had diagnosed an over-all silencing of the mourner: “Today it would seem to be believed, quite sincerely, that sensible, rational men and women can keep their mourning under complete control by strength of will and character, so that it need be given no public expression, and indulged, if at all, in private, as furtively as . . . masturbation.” Ariès added that this silence was “not due to the frivolity of survivors, but to a merciless coercion applied by society.”

In the wake of the AIDS crisis and then 9/11, the conversation about death in the United States has grown more open. Yet we still think of mourning as something to be done privately. There might not be a “right” way to grieve, but some of the work Bonanno describes raises the question of whether certain norms are healthier than others. In Western countries with fewer mourning rituals, the bereaved report a higher level of somatic ailments in the year following a death.

Today, Leader points out, our only public mourning takes the form of grief at the death of celebrities and statesmen. Some commentators in Britain sneered at the “crocodile tears” of the masses over the death of Diana. On the contrary, Leader says, this grief is the same as the old public grief in which groups got together to experience in unity their individual losses. As a saying from China’s lower Yangtze Valley (where professional mourning was once common) put it, “We use the occasions of other people’s funerals to release personal sorrows.” When we watch the televised funerals of Michael Jackson or Ted Kennedy, Leader suggests, we are engaging in a practice that goes back to soldiers in the Iliad mourning with Achilles for the fallen Patroclus. Our version is more mediated. Still, in the Internet age, some mourners have returned grief to a social space, creating online grieving communities, establishing virtual cemeteries, commemorative pages, and chat rooms where loss can be described and shared.

In “On Death and Dying,” Elisabeth Kübler-Ross, too, emphasized community by insisting on the importance of talking to the dying. Against the shibboleth that we die alone, Kübler-Ross thought that we should die with company. “On Death and Dying” shaped our grieving styles by helping establish the hospice movement and by advancing an updated notion of the “good death,” in which the dying person is not only medically treated but emotionally supported.

Yet the end of Kübler-Ross’s own life was a lonely one. Like many pioneers, she was driven by messianic convictions that sometimes distanced her from her friends and family. Named “Woman of the Decade” by Ladies’ Home Journal in the nineteen-seventies, she separated from her husband and left him with the children, bought a house in Escondido, California, called it Shanti Nilaya (Final Home of Peace), and, in 1977, established it as a “growth and healing center” for the dying. She became a devoted exponent of reincarnation, arguing that death was a transition to a better stage, akin to breaking out of a cocoon. (As a volunteer in Europe after the war, she had been moved by the sight of butterflies carved into the walls of the children’s barracks at Majdanek, a concentration camp.)

Then, in 1995, Kübler-Ross suffered a stroke that left her paralyzed on one side. By 1997, living a severely circumscribed life in Arizona, she had grown depressed. “For 15 hours a day, I sit in this same chair, totally dependent on someone else coming in here to make me a cup of tea,” she told a reporter from the San Francisco Chronicle. She became known as “the death-and-dying lady who can’t seem to manage her own death.” Her isolation was chronicled in the documentary “Facing Death” (2003). It showed a solitary Kübler-Ross in her cluttered home. “I always leave the television on,” she says. “That way something is always moving.” An English muffin hardens next to her on a plate. She says that she got in the habit of saving food in case she is hungry later in the day. Her son Kenneth lives nearby and stops in “from time to time.” Yet she seems as hauntingly alone as the patients she interviewed some thirty years earlier.

It has become a truism of the hospice movement that people resist death if they have something left they need to say. After the documentary, Kübler-Ross emerged from her anomie to revisit what she had written about grief. Realizing that the stage theory had grown into a restrictive prescription for grief, she collaborated with David Kessler, a hospice expert, to write “On Grief and Grieving.” Near the end of a chapter about her own grief—which arrived late in life, following the death of her ex-husband—she noted, “I now know that the purpose of my life is more than these stages. I have been married, had kids, then grandkids, written books, and traveled. I have loved and lost, and I am so much more than five stages. And so are you.”

“On Grief and Grieving” was a personal triumph of sorts for the ailing Kübler-Ross. Yet her crusade to open up a conversation about death and grief was ultimately distorted by her own evasions: the woman who wanted us to confront death unflinchingly came to insist that it was really an opportunity for personal growth among the survivors, as if it were a Learning Annex class. As she put it in an essay for an anthology, “Death: The Final Stage of Growth” (1997), “Confrontation with death and dying can enrich one’s life and help one to become a more human and humane person.” This approach—suffused with an American “we can do it better” spirit—made grief the province of self-help rather than of the community. In the end, Kübler-Ross could perhaps have done more to help her own family grieve after her death. Like many Americans, she planned her funeral, and insisted it be a “celebration” rather than an occasion for mourning. Dozens of “E.T.” balloons were released into the air, symbolizing “unconditional love.” Perhaps we were to picture her bicycling through the sky toward home.

Behind the balloons the painful fact of mourning remains: even a good death is seldom good for the survivors. The matter-of-fact mordancy of Emily Dickinson, the supreme poet of grief, may provide more balm to the mourner than the glad tidings of those who talk about how death can enrich us. In her poem “I Measure Every Grief I Meet,” the speaker’s curiosity about other people’s grief is a way of conveying how heavy her own is:

I wonder if It weighs like Mine—
Or has an Easier size.

I wonder if They bore it long—
Or did it just begin—
I could not tell the Date of Mine—
It feels so old a pain—

I wonder if it hurts to live—
And if They have to try—
And whether—could They choose between—
It would not be—to die. ♦

When you are not the right one to break the news...

When you are not the one to break the news... a reflection from another.

Children's TV presenters stopped by police for running around with sparkly hair dryers

"We were stopped, not arrested, but they had to say 'we are holding you under the Anti-Terrorism Act because you're running around in flak jackets and a utility belt', and I said 'and please put spangly blue hairdryer' and he was, like, 'all right'."

The full article from Yahoo:

Presenters quizzed over hairdryers

Kids TV hosts Anna Williamson and Jamie Rickers said they were questioned by police under anti-terrorism powers - for carrying glittery hairdryers.

The pair, who front ITV1's Toonattik, were filming a skit for the programme on London's South Bank wearing combat gear and armed with children's walkie-talkies and hairdryers.

Their fake fatigues aroused the suspicions of patrolling police, who stopped them and took down their particulars.

Anna, 28, said: "We were filming a strand called Dork Hunters, which is to do with one of the animations we have on the show. We were out and about doing 'dork hunting' ourselves on the streets of London.

"Jamie and I were kitted out in fake utility belts, we had the whole bulletproof flakjacket thing, we've got hairdryers in our belt, a kids' £1.99 walkie-talkie, hairbrushes and all that kind of stuff, and we were being followed by a camera crew and a boom mike and we get literally pulled over by four policemen and we were issued with a warning 'under the act of terrorism'."

Jamie, 32, added: "We were stopped, not arrested, but they had to say 'we are holding you under the Anti-Terrorism Act because you're running around in flak jackets and a utility belt', and I said 'and please put spangly blue hairdryer' and he was, like, 'all right'."

The duo described the escapade as one of the most memorable moments from their time on the show, which celebrates its fifth anniversary on March 6 and 7.

Saturday, January 23, 2010

Aleppo

A New York Times article

One door closes...

... and many more open

New York Times article about architects finding other ways to make a living:

Architect, or Whatever
By KRISTINA SHEVORY
Published: January 20, 2010
At the Ballard Farmers’ Market in Seattle on a recent weekend, passers-by could be forgiven for thinking John Morefield was running for political office. Smiling, waving and calling out hellos to everyone who walked by his stand, he was the picture of friendliness. All he needed was campaign buttons and fliers.

In fact, Mr. Morefield, 29, is no politician, but an architectural designer looking for work. He was seated at a homemade wooden stand under a sign reading “Architecture 5¢,” with a tin can nearby awaiting spare change. For a nickel, he would answer any architectural question.

In 2008, Mr. Morefield lost his job — twice — and thought he could ride out the recession doing design work for friends and family, but when those jobs dried up, he set up his stand. As someone in his 20s without many contacts or an extensive portfolio, he thought he might have an easier time finding clients on his own.

“I didn’t know what I was going to do,” Mr. Morefield said. “I had no other option. The recession was a real kick in the shorts, and I had to make this work.”

A troubled economy and the implosion of the real estate market have thrown thousands of architects and designers out of work in the last year or so, forcing them to find or create jobs. According to the latest data available from the Department of Labor, employment at American architecture firms, which peaked last July at 224,500, had dropped to 184,600 by November.

“It’s hard to find a place to hide when the economy goes down,” said Kermit Baker, the chief economist at the American Institute of Architects. “There aren’t any strong sectors now.”

And it’s not clear when the industry will recover. Architecture firms are still laying off employees, and Mr. Baker doesn’t expect them to rehire until billings recover, which he thinks won’t be until the second half of this year at the earliest.

In the meantime, many of those who have been laid off are discovering new talents often unrelated to architecture.

When Natasha Case, 26, lost her job as a designer at Walt Disney Imagineering about a year ago, she and her friend Freya Estreller, 27, a real estate developer, started a business selling Ms. Case’s homemade ice cream sandwiches in Los Angeles. Named for architects like Frank Gehry (the strawberry ice cream and sugar cookie Frank Behry) and Mies van der Rohe (the vanilla bean ice cream and chocolate chip cookie Mies Vanilla Rohe), they were an immediate hit.

“I feel this is a good time to try new things,” said Ms. Case, who did a project on the intersection of food and architecture while studying for her master’s in architecture at the University of California, Los Angeles, in 2008. “You do things you always wanted to do, something you’ve always been passionate about.”

Since she and Ms. Estreller rolled out their truck, Coolhaus, at the Coachella Valley Music and Arts Festival near Palm Springs last April, they’ve catered events for Mr. Gehry’s office, Walt Disney Imagineering and the Disney Channel.

Their initial investment was low: they bought a 20-year-old postal van on Craigslist and had it retrofitted and painted silver and bubblegum pink, all for $10,000. With seven full- and part-time employees, they now make enough to support themselves and have plans to expand (a Hamptons truck is in the works and they are trying to get their products into Whole Foods stores).

Leigh Ann Black was working as an architectural designer in Seattle when she lost her job over a year ago. After a long struggle to find work, she finally moved back to her hometown of Water Valley, Miss., in June, to take care of her sick grandmother.

Ms. Black, 30, is now living above her parents’ garage, but she finally has time to indulge her love of pottery. She recently converted an old horse barn on her family’s farm into a studio, plans to apprentice with local potters and has applied to several post-baccalaureate ceramics programs, with the hope of selling her wares at farmers’ markets and someday teaching art.

“This is not where I imagined I’d be when I turned 30, but I feel really inspired being back,” she said. “There’s something about being with family and not feeling upset about meeting rent, car payment and groceries every month. Now I have some breathing room.”

When Debi van Zyl, 33, was laid off by a small residential design firm in Los Angeles in May, she decided to do freelance design work for as long as she could, and she picked up jobs doing exhibition design for the Getty and Huntington museums. In her spare time, to relax, she started knitting what she describes as “kooky” stuffed animals like octopuses and jellyfish. Then, at the urging of the readers of her blog, she began selling them on Etsy. Les Petites Bêtes Sauvages, as she calls them, have helped her pay the rent and other bills for the last few months.
“You think you’re in charge of your profession, and then the recession hits and you realize that your career is market driven,” Ms. van Zyl said. “It’s forced me to push myself and become more individual. My motto is don’t say no to anything.”
Richard Chuk, of Lombard, Ill., said that since he lost his position as a commercial designer a year ago, when two of his firm’s clients — both developers — lost financing for their projects, he has been looking for any job he can find to support his wife and children, ages 6 and 7.
Mr. Chuk, 38, began his job search in a good mood because of the wave of optimism surrounding the presidential election. During the first three months, he sent out nearly 150 résumés, applying for many jobs he was overqualified for. (Sears, Home Depot and Lowe’s all turned him down for jobs as a designer because he was overqualified, he said.) He had only one interview.
After that, he said, he applied for the rare job that popped up but spent most of his time taking care of his children, studying for his architectural licensing exam and renovating his basement.
This month, he began commercial truck driving school.
“You feel this year of your life is gone,” Mr. Chuk said. “It’s lost wages and lost experiences. But you have to keep positive and move forward. I look at this as an education. It opens up more doors and you never know when it’ll help you.”
As for Mr. Morefield, the architect in Seattle, he started his booth (and a Web site, architecture5cents.com) with the hope that it would bring in sufficient income to get by until he could find another job. As it turned out, he received so many commissions — to build a two-story addition, a deck, a master bedroom — that he realized he could make plenty of money working for himself.
Last year, he made more than $50,000 — the highest salary he ever made working for someone else — and he expects to do even better this year.
“It’s developed into what I was supposed to do,” he said. “It’s a lot of work, it’s scary, but I love every minute of it. If someone offered me $80,000 to sit behind a computer, I wouldn’t do it.”

The 'bomb scanner' that never was

As reported in the New York Times
This fraud has led to the diversion of masses of funds from a fledgling Iraqi government, money that could have been put to better causes, and has led to the deaths of thousands of innocent civilians.
The people behind this 'business' operation are not only fraudsters, but thieves and mass murderers. They knew that selling an item aimed at saving lives, one that simply didn't work, would mean death.

How they slept at night, or lived with themselves, I don't know. I just hope the justice system really does deliver justice.

The article pasted below:

British Man Held for Fraud in Iraq Bomb Detectors

By RIYADH MOHAMMED and ROD NORDLAND
Published: January 23, 2010

BAGHDAD — The owner of a British company that supplies questionable bomb detectors to Iraq has been arrested on fraud charges, and the export of the devices has been banned, British government officials confirmed Saturday.
Iraqi officials reacted with fury to the news, noting a series of horrific bombings in the past six months despite the widespread use of the bomb detectors at hundreds of checkpoints in the capital.
“This company not only caused grave and massive losses of funds, but it has caused grave and massive losses of the lives of innocent Iraqi civilians, by the hundreds and thousands, from attacks that we thought we were immune to because we have this device,” said Ammar Tuma, a member of the Iraqi Parliament’s Security and Defense Committee.
But the Ministry of the Interior has not withdrawn the devices from duty, and police officers continue to use them.
Iraqi officials said they would begin an investigation into why their government paid at least $85 million to the British company, ATSC Ltd., for at least 800 of the bomb detectors, called ADE 651s.
The British Embassy offered to cooperate with any Iraqi government investigation.
The New York Times first reported official doubts about the device in November, citing American military officials and technical experts who said the ADE 651 was useless, despite widespread reliance on it in Iraq.
The ADE 651 is a hand-held wand with no batteries or internal electronic components, ostensibly powered by the static electricity of the user, who needs to walk in place to charge it. The only moving part is what looks like a radio antenna on a swivel, which swings to point toward the presence of weapons or explosives.
“We are conducting a criminal investigation and as part of that a 53-year-old man has been arrested on suspicion of fraud by misrepresentation,” a spokesman for the Avon and Somerset Police in England said, without giving the suspect’s name in line with police policy. The suspect was released on bail, the spokesman said.
“The force became aware of the existence of a piece of equipment around which there has been many concerns and in the interests of public safety launched its investigation,” the police spokesman said.
The suspect’s identity was widely reported in the British press as Jim McCormick, managing director of ATSC Ltd., which operates out of a converted dairy in rural Somerset County, England. News reports described Mr. McCormick as a former British police officer from Merseyside.
Contacted by telephone, Mr. McCormick refused to comment on the charges or the case against him, but he insisted that ATSC would remain in business. “Our company is still fully operational,” he said.
A statement issued by the British Department for Business, Innovation and Skills said it was banning export of the ADE 651 and similar devices to Iraq and Afghanistan.
“Tests have shown that the technology used in the ADE651 and similar devices is not suitable for bomb detection,” the department said. “We acted urgently to put in place export restrictions which will come into force next week.” The statement said the department could ban export to those countries because British troops there could be put at risk by the device’s use. ATSC claims to have sold the device to 20 countries, all in the developing world.
The Supreme Board of Audit in Iraq announced it would investigate the procurement of the ADE 651, according to the board’s leader, Abdul Basit Turki. The investigation will focus on officials who previously assured auditors the device was technically sound, he said.
Maj. Gen. Jihad al-Jabiri, who is in charge of procuring the devices for the Ministry of Interior, could not be reached for comment.
In Baghdad on Saturday, the devices were still very much in use. “I didn’t believe in this device in the first place,” said a police officer at a checkpoint in central Baghdad, who spoke on the condition of anonymity because he was not authorized to speak to the media. “I was forced to use it by my superiors and I am still forced to do so.”
Another checkpoint officer said he blamed corrupt officials for bringing the ADE 651 in. “Our government is to be blamed for all the thousands of innocent spirits who were lost since these devices have been used in Iraq,” he said.
An associate of ATSC, who spoke on the condition of anonymity for fear of retaliation, said the devices were manufactured at a cost of $250 each by suppliers in Britain and Romania. “Everyone at ATSC knew there was nothing inside the ADE 651,” he said.
The Iraqi government, according to its auditors, paid $40,000 to $60,000 for each device, although it determined that ATSC was marketing the device for $16,000. The additional money was said to have been for training, spare parts and commissions.
The Times of London quoted Mr. McCormick in November as saying that the device’s technology was similar to that of dowsing or divining rods used to find water. “We have been dealing with doubters for 10 years,” he said. “One of the problems we have is that the machine does look primitive. We are working on a new model that has flashing lights.”
Shortly after the arrest on Friday, the BBC reported that it had arranged a lab test of the device and found that its bomb-detection component was an electronic merchandise tag of the sort used to prevent shoplifting.
ATSC’s brochures claim the ADE 651 can detect minute traces of explosives, drugs or even human remains at distances of up to 6 miles by air, or three-fifths of a mile by land. Scientific trials of similar devices have shown that they are no more accurate than a coin toss.
Riyadh Mohammed reported from Baghdad, and Rod Nordland from Kabul, Afghanistan.

Sunday, January 17, 2010

Whoever said Google cannot reassure!

I typed 'I give up' in Google and got this.
So this be the message...
When you feel all:

[image]

Take heart, and:

[image]

You've got to:

[image]

Remember to:

[image]

and:

[image]

Believe in:

[image]

Always:

[image]

and:

Saturday, January 16, 2010

Working nights...

Working nights is very different to working days. It has been said that it is a completely different animal, and with that I have to agree.

Things I don't like about nights (amongst others):
- Fighting my body clock
- Skeletal service at night (bare minimum of staff, and much, much less support)
- Using my brain when I feel exhausted
- 'Ships in the night' feel. Sheer lack of human contact outside of work - I work when others are sleeping, I sleep when others are awake...
- Long shifts (combined with being alone and tired, with a big workload)
- Trying to shift my body clock back to days
- Working 70+ hour weeks

Things that I like about nights:
- Occasionally having random heart to hearts at 2 o'clock in the morning with other staff, when time allows
- The subdued lighting and mellowed feel
- There is a tighter sense of 'team'
- Take-aways
- The arrival of the day team, heralding the end of the drama
- Having some random days off to recover, and do life stuff!


This is an excellent document from the Royal College of Physicians on working the night shift (preparation, survival and recovery), filled with lots of excellent advice. I really recommend every junior doctor have a read of it. I only found out about it after searching Google in exasperation, following a series of awful night shifts that were horrendously busy and during which I was unable to sleep properly or get decent rest between shifts. Very stressful. It has made a world of difference to how I tackle and handle nights.

Things I do to make night shifts work:
- nap instead of eat during my break
- eat a good meal before starting, and if necessary have a 20-30 min nap an hour or two before the shift
- avoiding stodgy foods (despite cravings for really unhealthy foods), and getting in plenty of fruit and veg
- black out blinds
- ear plugs
- comfortable and cosy bed (e.g. nice sheets, dark coloured, decent pillow, warm duvet, etc...)
- going to the loo and eating before going to sleep, so my sleep is not disrupted by the calls of nature or hunger
- warm, soothing drink prior to sleep to get my mind thinking of sleep (e.g. hot cocoa or camomile tea, etc...)
- wearing sunglasses (big ones) on the way home (so no sunlight gets into my eyes)
- trying to get a bit of exercise in before starting the shift
- playing some cheerful music before starting the shift (instead of angry music)
- trying to have a quick call or text with family and friends, to stay in touch with the wider world.

Thursday, January 07, 2010