Wheat production in sub-Saharan Africa is at only 10 to 25 percent of its potential, and nations could easily grow more to limit hunger, price shocks and political instability, a study showed on Tuesday.
The report, examining environmental conditions of 12 nations from Ethiopia to Zimbabwe, said that farmers south of the Sahara grew only 44 percent of the wheat consumed locally, meaning dependence on international markets prone to price spikes.
“Sub-Saharan Africa has extensive areas of land that are suitable for profitably producing wheat under rain-fed conditions,” according to the study by the non-profit International Maize and Wheat Improvement Center.
It said countries in the region were producing only between 10 and 25 percent of the amounts that the Center’s research suggested were “biologically possible and economically profitable,” with a net return of $200 per hectare (2.5 acres).
The 89-page study, issued at a wheat conference in Ethiopia, said it aimed to identify ways to raise wheat production as “a hedge against food insecurity, political instability and price shocks.” (Read more)
Stateless in Somalia

Drawing on statistical data from the United Nations Development Programme, World Bank, CIA, and World Health Organization, Peter Leeson compares the last five years under the central government (1985–1990) with the most recent five years of anarchy (2000–2005). He finds these welfare changes:
* Life expectancy increased from 46 to 48.5 years. This is low compared with developed countries, but in any measurement of welfare what matters is not where a population stands at a given moment, but the direction of the trend. Is the trend positive, or is it the reverse?
* Number of one-year-olds fully immunized against measles rose from 30 to 40 percent.
* Number of physicians per 100,000 population rose from 3.4 to 4.
* Number of infants with low birth weight fell from 16 per thousand to 0.3 — almost none.
* Infant mortality per 1,000 births fell from 152 to 114.9.
* Maternal mortality per 100,000 births fell from 1,600 to 1,100.
* Percent of population with access to sanitation rose from 18 to 26.
* Percent of population with access to at least one health facility rose from 28 to 54.8.
* Percent of population in extreme poverty (i.e., less than $1 per day) fell from 60 to 43.2.
* Radios per 1,000 population rose from 4 to 98.5.
* Telephones per 1,000 population rose from 1.9 to 14.9.
* TVs per 1,000 population rose from 1.2 to 3.7.
* Fatalities due to measles fell from 8,000 to 5,600.
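The direction of each trend is easy to check mechanically. A minimal sketch (values transcribed from the list above; for mortality and poverty indicators, a fall counts as an improvement):

```python
# Leeson's before/after indicators: (1985-1990 value, 2000-2005 value,
# higher_is_better). Percent change and direction computed for each.
indicators = {
    "Life expectancy (years)":        (46,    48.5,  True),
    "Measles immunization (%)":       (30,    40,    True),
    "Physicians per 100,000":         (3.4,   4,     True),
    "Infant mortality per 1,000":     (152,   114.9, False),
    "Maternal mortality per 100,000": (1600,  1100,  False),
    "Access to sanitation (%)":       (18,    26,    True),
    "Extreme poverty (%)":            (60,    43.2,  False),
}

for name, (before, after, higher_is_better) in indicators.items():
    pct = (after - before) / before * 100
    improved = (after > before) == higher_is_better
    print(f"{name}: {pct:+.1f}% ({'improved' if improved else 'worsened'})")
```

Every indicator in the subset above moved in the welfare-improving direction, which is the point Leeson draws from the comparison.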
Despite this, the US wreaked death and destruction upon the country in an attempt to establish a favorable state:
Blowback in Somalia

“… [I]n many cases they were chopping their head off and taking the head to the Americans or whoever. And telling them, ‘We killed this guy.’”
SEGOU, Mali — On a sweltering afternoon, Islamist police officers dragged Fatima Al Hassan out of her house in the fabled city of Timbuktu. They beat her up, shoved her into a white pickup truck and drove her to their headquarters. She was locked up in a jail as she awaited her sentence: 100 lashes with an electrical cord.
“Why are you doing this?” she recalled asking.
Hassan was being punished for giving water to a male visitor.
… [R]efugees say the Islamists are raping and forcibly marrying women, and recruiting children for armed conflict. Social interaction deemed an affront to their interpretation of Islam is zealously punished through Islamic courts and a police force that has become more systematic and inflexible, human rights activists and local officials say.
Doesn’t this type of thing happen among Islamists in London?
The Ugandan town of Mbale was brought to a standstill on Tuesday afternoon, as a naked man ran through the streets, with more than 50 men in pursuit. He was fleeing a forced circumcision.
Deo, the fleeing man, survived the forced circumcision after guards at the administrator’s office dispersed his assailants, but not before other men had fallen victim to the enforced surgical operation.
More than 40 men of various ages have been subjected to the cut in the last two days, as the town goes through a general circumcision programme, but this has faced widespread protests.
Mbale is mainly inhabited by the Bamasaba tribe, which prescribes circumcision to all males from the age of 15, and those who do not undergo the cut are forcefully circumcised.
However, it has emerged that the 40 men who were forcefully circumcised are not of the Bamasaba tribe; they were forced to undergo the operation because they had Bamasaba wives or girlfriends.
“Since they sleep with our sisters and daughters, we felt they had to be circumcised like the rest of us,” Gerald Wambedde, an advocate of forced circumcisions, said.
The leader of the exercise, Badru Wasike, said it was both a cultural and a health exercise.
“We are helping those who feared getting circumcised through cultural processes. We are aware that circumcised men do not easily get infected with HIV/Aids. Since they love our relatives we want them to be safe,” he explained. (Read more)
A handful of circumcision advocates have recently begun haranguing the global health community to adopt widespread foreskin removal as a way to fight AIDS. Their recommendations follow the publication of three randomized controlled clinical trials (RCCTs) conducted in Africa between 2005 and 2007.
These studies have generated a lot of media attention. In part this is because they supposedly show that circumcision reduces HIV transmission by a whopping 60%, a figure that wins the prize for “most misleading possible statistic,” as we’ll see in a minute. Yet as one editorial concluded: “The proven efficacy of MC [male circumcision] and its high cost-effectiveness in the face of a persistent heterosexual HIV epidemic argues overwhelmingly for its immediate and rapid adoption.”
Well, hold your horses. The “randomized controlled clinical trials” upon which these recommendations are based (I use scare quotes deliberately) represent bad science at its most dangerous: we are talking about poorly conducted experiments with dubious results presented in an outrageously misleading fashion. These data are then harnessed to support public health recommendations on a massive scale whose implementation would almost certainly have the opposite of the claimed effect, with fatal consequences. As Gregory Boyle and George Hill explain in their exhaustive analysis of the RCCTs:
While the “gold standard” for medical trials is the randomised, double-blind, placebo-controlled trial, the African trials suffered [a number of serious problems] including problematic randomisation and selection bias, inadequate blinding, lack of placebo-control (male circumcision could not be concealed), inadequate equipoise, experimenter bias, attrition (673 drop-outs in female-to-male trials), not investigating male circumcision as a vector for HIV transmission, not investigating non-sexual HIV transmission, as well as lead-time bias, supportive bias (circumcised men received additional counselling sessions), participant expectation bias, and time-out discrepancy (restraint from sexual activity only by circumcised men).
That’s a whole laundry list of issues, so let me highlight a few of the more egregious. First, consider the “lack of placebo control.” What does that mean? Normally, when you’re trying to determine whether some medical intervention has a disease-fighting effect specific to its own (hypothesized) mechanisms—and over and above the placebo baseline—you have to have a control group. That group gets a dummy intervention, and nobody is supposed to know which participants were exposed to the actual treatment until after the results are in.
After all, if someone knows (or thinks) that they’re getting a great big helping of medicine, they might act in various ways—whether consciously or unconsciously—that have the effect of generating positive health outcomes but which have nothing to do with the intervention itself. In the case of circumcision, however, there’s no way not to know if you’ve received the “medicine”—you have to go through a whole surgery and then you don’t have a foreskin anymore—so this basic condition of a true clinical trial is violated in the first instance.
But that’s just the tip of the iceberg. As Boyle and Hill point out, the men who were circumcised got additional counseling about safe sex practices compared to the control group, and then they had to refrain from having sex altogether for the simple reason that their lacerated penises had to be wrapped in bandages until their wounds healed – leading to what Boyle and Hill refer to as “time-out discrepancy” in the quote above. By contrast, the non-circumcised men got to keep having sex during the full two-month period in which the treatment group was in recovery mode. [Emphasis added.] Then, mystery of mysteries, the trials were stopped early. Taken together with the other flaws, these issues pose serious problems for the scientific credibility of the studies. Here is why:
Let’s assume for a second that the circumcised men really did end up getting infected with HIV at a lower rate than the control-group men who were left intact—even though, as we will see in a moment, we have very little reason to believe that this is so. Why might that outcome have happened?
If you answered, “Because those men knew they were in the treatment group in the first place, had less sex over the duration of the study (because they had bandaged, wounded penises for much of it), and had safer sex when they had it (because they received free condoms and special counseling from the doctors), thereby reducing their overall exposure to HIV compared to the control group by a wide margin” then you are on the right track.
Step 2. How not to report results
Now why, as I suggested in the paragraph above, should we doubt that the circumcised men actually did have a lower rate of HIV infection in the first place, quite apart from the poor experimental design? After all, the 60% figure that’s being thrown around in media reports is a pretty big number, and it can’t be off by that much, even if the studies had some flaws, right? Not so fast. Do you know what the “60%” statistic is actually referring to? Boyle and Hill explain:
What does the frequently cited “60% relative reduction” in HIV infections actually mean? Across all three female-to-male trials, of the 5,411 men subjected to male circumcision, 64 (1.18%) became HIV-positive. Among the 5,497 controls, 137 (2.49%) became HIV-positive, so the absolute decrease in HIV infection was only 1.31%.
That’s right: 60% is the relative reduction in infection rates, comparing two vanishingly small percentages: a clever bit of arithmetic that generates a big-seeming number, yet one which wildly misrepresents the results of the study. The absolute decrease in HIV infection between the treatment and control groups in these experiments was a mere 1.31%, which can hardly be considered clinically significant, especially given the numerous confounds that the studies failed to rule out.
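The relative-versus-absolute arithmetic is easy to reproduce from Boyle and Hill’s pooled counts. A quick sketch (note: the published 60% figure comes from the trials’ hazard-ratio analyses, so the crude relative reduction computed from raw proportions lands slightly lower):

```python
# Pooled figures Boyle and Hill report for the three female-to-male trials.
circumcised_n, circumcised_hiv = 5411, 64
control_n, control_hiv = 5497, 137

risk_treated = circumcised_hiv / circumcised_n   # about 1.18% became HIV-positive
risk_control = control_hiv / control_n           # about 2.49% became HIV-positive

# Absolute risk reduction: the difference in infection rates.
absolute_reduction = risk_control - risk_treated          # about 1.31 points
# Relative risk reduction: that same small difference, divided by the
# (also small) control-group rate -- which is what inflates the headline.
relative_reduction = absolute_reduction / risk_control    # about 53%

print(f"Absolute risk reduction: {absolute_reduction:.2%}")
print(f"Relative risk reduction: {relative_reduction:.1%}")
```

The same 1.31-point gap can thus be reported as “1.31%” or as “more than 50%,” depending on which denominator one chooses; the media coverage chose the larger number.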
Step 3. How not to make public health recommendations
So far we have been discussing problems with the experiments themselves—what’s called “internal validity” in technical terms. I really want you to read the Boyle and Hill paper here, because they go into painstaking detail about each of a long parade of flaws I can’t hope to cover in one blog post. I mean, there are a lot of flaws. Please read the paper. But let’s switch gears now and talk about the flip-side of things, or what’s called “external validity” – that is, problems with taking what you’ve (supposedly) found in a (relatively) controlled setting like an experiment and applying it to the chaotic mess that is the real world.
Lawrence Green and his colleagues published an important article on just this topic as it relates to “the circumcision solution” in the American Journal of Preventive Medicine. “Effectiveness in real-world settings,” they sensibly point out, “rarely achieves the efficacy levels found in controlled trials, making predictions of subsequent cost-effectiveness and population-health benefits less reliable.”
Some major issues with trying to roll out circumcision in particular include the fact that the RCCT participants—who were not representative of the general population to begin with—had (1) continuous counseling and yearlong medical care, (2) frequent monitoring for infection, and (3) surgeries performed in highly sanitary conditions by trained, Western doctors. None of this would be likely to be replicated at a larger scale in the parts of the world suffering from the worst of the AIDS epidemic. And of course, circumcisions carried out in unsanitary conditions (that is, the precise conditions likelier to hold in those very places) carry a huge risk of transmitting HIV at the interface of open wounds and dirty surgical instruments. So this is a serious point.
What should we conclude? Green et al. get it right: “Before circumcising millions of men in regions with high prevalences of HIV infection, it is important to consider alternatives. A comparison of male circumcision to condom use concluded that supplying free condoms is 95 times more cost effective.”
. . . .
Step 4. This is serious business
The worst part about all of this is not just that the science behind “the circumcision solution” is so shaky, but that the actual implementation of these recommendations—so vociferously pushed for by the circumcision advocates doing this research—would very likely lead to more HIV infections, not fewer. The big idea here is “risk compensation” – the subject of an excellent paper by Robert Van Howe and Michelle Storms.
Risk compensation occurs when people believe they have been provided additional protection (wearing safety belts) [such that] they will engage in higher risk behavior (driving faster). As a consequence of the increase in higher risk behavior, the number of targeted events (traffic fatalities) either remains unchanged or [actually] increases.
Risk compensation will accompany the circumcision solution in Africa. Circumcision has been promoted as a natural condom, and African men have reported having undergone circumcision in order not to have to continually use condoms. Such a message has been adopted by public health researchers. A recent South African study assessing determinants of demand for circumcision listed “It means that men don’t have [to] use a condom” as a circumcision advantage in the materials they presented to the men they surveyed. [Yet] if circumcision results in lower condom use, the number of HIV infections will increase. [Citations can be found in the original paper.]
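The logic of risk compensation can be illustrated with a toy calculation. Every number below is hypothetical, chosen only to show the mechanism (a partial protective effect swamped by a drop in condom use), not drawn from the studies:

```python
# Toy model of risk compensation. All parameters are made up for illustration.
per_act_risk = 0.004     # assumed baseline per-act transmission probability
condom_efficacy = 0.95   # assumed risk reduction when a condom is used
circ_efficacy = 0.50     # assumed per-act risk reduction from circumcision

def expected_infections(acts, condom_use_rate, circumcised):
    """Expected infections over a number of sexual acts."""
    risk = per_act_risk * ((1 - circ_efficacy) if circumcised else 1.0)
    protected = acts * condom_use_rate * risk * (1 - condom_efficacy)
    unprotected = acts * (1 - condom_use_rate) * risk
    return protected + unprotected

# Uncircumcised man using condoms 80% of the time:
before = expected_infections(1000, 0.80, circumcised=False)
# Circumcised man whose condom use drops to 30% ("natural condom" belief):
after = expected_infections(1000, 0.30, circumcised=True)
print(before, after)  # the behavioral change more than erases the surgical benefit
```

Under these (hypothetical) parameters the circumcised-but-complacent scenario yields more expected infections than the uncircumcised-but-careful one, which is exactly the pattern Van Howe and Storms warn about.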
In Uganda, as Boyle and Hill uncovered, the Kampala Monitor reported men as saying, “I have heard that if you get circumcised, you cannot catch HIV/AIDS. I don’t have to use a condom.” Commenting on this problem, a Brazilian Health Ministry official stated: “[T]he WHO [World Health Organization] and UN HIV/AIDS program … gives a message of false protection because men might think that being circumcised means that they can have sex without condoms without any risk, which is untrue.”
. . . .
The studies we’ve looked at, claiming to show a benefit of circumcision in reducing transmission of HIV, are paragons of bad design and poor execution; and any real-world roll-out of their procedures would be very difficult to achieve safely and effectively. The likeliest outcome is that HIV infections would actually increase—both through the circumcision surgeries themselves performed in unsanitary conditions, and through the mechanism of risk compensation and other complicating factors of real life. The “circumcision solution” is no solution at all. It is a waste of resources and a potentially fatal threat to public health. (Read more)
Indha Adde is not simply a warlord, at least not officially, anymore. Nowadays, he is addressed as Gen. Yusuf Mohamed Siad, and he wears a Somali military uniform, complete with red beret and three stars on his shoulder. His weapons and his newfound legitimacy were bestowed upon him by the US-sponsored African Union force, known as AMISOM, that currently occupies large swaths of Mogadishu.
It is quite a turnabout. Five years ago, Indha Adde was one of Al Qaeda and the Shabab’s key paramilitary allies and a commander of one of the most powerful Islamic factions in Somalia fighting against foreign forces and the US-backed Somali government. He openly admits to having sheltered some of the most notorious Al Qaeda figures—including Fazul Abdullah Mohammed, the alleged mastermind of the 1998 bombings of the US Embassies in Kenya and Tanzania—and to deceiving the CIA in order to protect the men. (Fazul was killed in June in Mogadishu.)
“The CIA failed to convince me to work with them,” Indha Adde recalls of his meetings in Somalia, Kenya and Dubai with agency operatives beginning in 2004, when, he says, he met the CIA’s East Africa chief in the Emirates. “They offered me money, they offered funding for the region I was controlling, they offered me influence and power in Somalia through US cooperation, but I refused all those offers.” At the time, Indha Adde—like many Muslims around the globe—viewed the United States as “arrogant” and on a crusade against Islam. “Personally, I thought of even Osama [bin Laden] himself as a good man who only wanted the implementation of Islamic law,” he tells me at one of his homes in Mogadishu.
Yusuf Mohamed Siad was not always known just as Indha Adde. As one of the main warlords who divided and destroyed Somalia during the civil war that raged through the 1990s, he brutally took control of the Lower Shabelle region, which was overwhelmingly populated by a rival clan, earning him the moniker “The Butcher.” There are allegations that he ran drug and weapons trafficking operations from the Merca port. Then, as the religious and political winds began to shift in Somalia after 9/11, he remade himself into an Islamic sheik of sorts in the mid-2000s and vowed to fight foreign invaders, including rival warlords funded and directed by the CIA.
Perhaps more than any other figure, Indha Adde embodies the mind-boggling constellation of allegiances and double-crosses that has marked Somalia since its last stable government fell in 1991. And his current role encapsulates the contradictions of the country’s present: he is a warlord who believes in Sharia law, is friendly with the CIA, and takes money and weapons from AMISOM. There are large parts of Mogadishu that are not accessible without his permission, and he controls one of the largest militias and possesses more technicals (truck-mounted heavy automatic weapons) in the city than any other warlord.
While the United States and other Western powers have spent hundreds of millions of dollars on arms, training and equipment for the Ugandan and Burundian militaries under the auspices of AMISOM, the Somali military remains underfunded and under-armed. Its soldiers are poorly paid, highly undisciplined and, at the end of the day, more loyal to their clans than to the central government. That’s where Indha Adde’s rent-a-militia comes in.
Over the past year, the Somali government and AMISOM have turned to some unsavory characters in a dual effort to build something resembling a national army and, as the United States attempted to do with its Awakening Councils in the Sunni areas of Iraq in 2006, to purchase strategic loyalty from former allies of the current enemy—in this case, the Shabab. Some warlords, like Indha Adde, have been given government ministries or military rank in return for allocating their forces to the fight against the Shabab. Several are former allies of Al Qaeda or the Shabab, and many fought against the US-sponsored Ethiopian invasion in 2006 or against the US-led mission in Somalia in the early 1990s that culminated in the infamous “Black Hawk Down” incident.
Somali President Sheik Sharif Sheik Ahmed claims that Indha Adde and other warlords have sworn allegiance to the government, but it is abundantly clear from traveling extensively through Mogadishu with Indha Adde that his men are loyal to him above all else. President Sharif seemed almost detached from this reality when I met him at his offices in Mogadishu. “As more territory is gained, it will be easier to unite [the various militias] under one umbrella,” he says.
. . . .
Although there was certainly a small Al Qaeda presence in Somalia before the United States launched its operations—and Islamic militants did carry out assassinations, including the killing of four foreign aid workers in the relatively peaceful Somaliland region in late 2003 and early 2004—the actions of Qanyare and his fellow CIA-backed warlords gave the Islamic militants fodder for an effective propaganda and recruitment campaign.
Qanyare and his allied warlords engaged in a targeted kill-and-capture campaign against individuals they suspected of supporting Islamic radicals. “These people were already heinous warlords; they were widely reviled in Mogadishu. And then they start assassinating imams and local prayer leaders who had nothing to do with terror,” says Abdirahman “Aynte” Ali, a Somali analyst who has written extensively on the history of the Shabab and warlord politics. “They were either capturing them and then renditioning them to Djibouti, where there is a major American base, or in many cases they were chopping their head off and taking the head to the Americans or whoever. And telling them, ‘We killed this guy.’”
. . . .
The “US government was not helping the [Somali] government but was helping the warlords that were against the government,” Buubaa, the former foreign minister, tells me. Washington “thought that the warlords were strong enough to chase away the Islamists or get rid of them. But it did completely the opposite. Completely the opposite.”
. . . .
In the summer of 2006 the ICU, along with fighters from the Shabab, ran the CIA’s men out of town. “The warlords were ejected out of Mogadishu for the first time in sixteen years. No one thought this was possible,” recalls Aynte. From June to December 2006, the ICU “brought a modicum of stability that’s unprecedented in Mogadishu,” reopening the airport and the seaport. “You could drive in Mogadishu at midnight, no problem, no guards. You could be a foreigner or Somali. It was at total peace.”
. . . .
The Bush administration considered the ICU irreconcilable.
. . . .
“The US sponsored the Ethiopian invasion, paying for everything including the gas that it had to expend, to undertake this. And you also had US forces on the ground, US Special Operations forces. You had CIA on the ground. US airpower was a part of the story as well. All of which gave massive military superiority to the Ethiopians,” says Daveed Gartenstein-Ross, director of the Center for the Study of Terrorist Radicalization and a frequent adviser to the US military, including Centcom. “If there’s one lesson in terms of military operations of the past ten years, it’s that the US is a very effective insurgent force. In areas where it’s seeking to overthrow a government, it’s good at doing that. What it’s not shown any luck in doing is establishing a viable government structure.”
The US-backed Ethiopian forces swiftly overthrew the Islamic Courts Union and sent its leaders fleeing or to the grave. Many were rendered to Ethiopia, Kenya or Djibouti; others were killed by US Special Operations forces or the CIA. By New Year’s Day 2007, Prime Minister Gedi was installed in Mogadishu, thanks to the Ethiopians.
. . . .
“Every step taken by the US has benefited Al Shabab,” he told me. “What brought about the ICU? It was the US-backed warlords. If Ethiopia did not invade and the US did not carry out airstrikes, Al Shabab would not have survived so long, because they were outnumbered by those who had positive agendas.”
. . . .
Extrajudicial killings by Ethiopian soldiers were widely reported, particularly in the final months of 2007. Reports of Ethiopian soldiers “slaughtering” men, women and children “like goats”—slitting throats—were widespread, according to Amnesty International. Both Somali government and Ethiopian forces were accused of horrific sexual violence.
. . . .
If Somalia was already a playground for Islamic militants, the Ethiopian invasion blew open the gates of Mogadishu for Al Qaeda.
. . . .
The Ethiopian occupation began to wind down following an agreement signed in Djibouti in June 2008 between Sharif’s faction of the ARS (Alliance for the Re-liberation of Somalia) and officials from the Transitional Federal Government (TFG). The “Djibouti Agreement” paved the way for Sharif to assume the presidency in Mogadishu in early 2009. To veteran observers of Somali politics, Sharif’s re-emergence was an incredible story. The United States had overthrown his ICU government only to later back him as the country’s president.
. . . .
When President Obama took office in 2009, the United States increased its covert military involvement in and around Somalia, as the CIA and JSOC intensified air and drone strikes in Somalia and Yemen, and began openly hunting people the United States alleged were Al Qaeda leaders. In September of that year, Obama authorized the assassination of Saleh Ali Nabhan, in his administration’s first known targeted-killing operation in Somalia. A JSOC team helicoptered into Somalia and gunned down Nabhan. JSOC troops then landed and collected the body. Earlier, in April, Obama had authorized JSOC to kill Somali pirates who had hijacked the Maersk Alabama, a ship operated by a major Defense Department contractor. But as the United States began striking in Somalia, the Shabab’s influence was spreading.
. . . .
By 2010 the Shabab was in control of a greater swath of Somalia—by a long shot—than the Transitional Federal Government, even though the TFG was supported by thousands of US-trained, -armed and -funded African Union troops. The Ugandan government essentially picked up where the Ethiopian government had left off, and in Mogadishu AMISOM forces consistently shelled Shabab-held neighborhoods teeming with civilians. While the United States and its allies began bumping off militant figures, the civilian death toll pushed some clan leaders to lend support to the Shabab.
. . . .
Under pressure from its paymasters to show that it had some control in Mogadishu, President Sharif’s government began turning to former ICU warlords for help. In parallel, Washington intensified its dealings with various regional power players and warlords.
In late 2010 the Obama administration unveiled what it referred to as a “dual-track” approach to Somalia, wherein Washington would simultaneously deal with the “central government” in Mogadishu as well as regional and clan players in Somalia. “The dual track policy only provides a new label for the old (and failed) Bush Administration’s approach,” observed Somalia analyst Afyare Abdi Elmi. “It inadvertently strengthens clan divisions, undermines inclusive and democratic trends and most importantly, creates a conducive environment for the return of the organized chaos or warlordism in the country.”
The dual-track policy encouraged self-declared, clan-based regional administrations to seek recognition and support from the United States. “Local administrations are popping up every week,” says Aynte. “Most of them don’t control anywhere, but people are announcing local governments in the hopes that CIA will set up a little outpost in their small village.”
. . . .
[T]he Shabab’s meteoric rise in Somalia, and the legacy of terror it has wrought, is blowback sparked by a decade of disastrous US policy that ultimately strengthened the very threat it was officially intended to crush. In the end, the greatest beneficiaries of US policy are the warlords, including those who once counted the Shabab among their allies and friends. “They are not fighting for a cause,” says Ahmed Nur Mohamed, the Mogadishu mayor. “And the conflict will start tomorrow, when we defeat Shabab.” (Read more)
The US has conducted its first drone strike on Islamist militants in Somalia, marking the expansion of the pilotless war campaign to a sixth country.
The missile strike on a vehicle in the southern town of Kismayo, reported last week as a helicopter assault, wounded two senior militants with al-Shabab and several foreign fighters, according to the Washington Post.
Armed Predator and Reaper drones already operate in Afghanistan, Pakistan, Iraq, Yemen and Libya, where they are controlled by the US military or the CIA.
The CIA-run programmes are controversial. Although they provide the Obama administration with a low-risk weapon against Islamist militants, they stir intense anti-American hostility among the local population.
Opposition is most vociferous in Pakistan, where the government said on Wednesday it was shutting down a big CIA drone base, and had ordered US personnel based there to leave. (Read more from guardian.co.uk)
The real evil – this crusading spirit itself – first swept over America in the late 1820s in the form of what is technically called “post-millennial pietism” (PMP). . . . It very quickly became clear that sin was not going to be stamped out very quickly by purely voluntary means, and so the PMPers rapidly turned to government to do the stamping out and the creating and the uplifting. In short, as one historian perceptively put it, for the PMPers, “government became God’s major instrument of salvation.”
. . .
Slowly but surely over the decades since 1830, this mainstream Yankee Protestantism became secularized into an only vaguely Christian but passionately held Social Gospel. After all, with this sort of mindset, it was easy for God to gradually drop from sight, and for government to assume a quasi-divine role. It was left to the monster Woodrow Wilson, a PMPer to his very bones and a Ph.D. as well, to take this domestic creed and extend it to foreign policy. It was essentially a “today the U.S., tomorrow the world” credo. Once the PMPers took over the U.S. government and imposed a Kingdom of God at home, their religious duty got raised to the planetary level. As the historian James Timberlake put it, once the Kingdom of God was being established in the United States, it became “America’s mission to spread these ideals and institutions abroad so that the Kingdom could be established throughout the world. American Protestants were accordingly not content merely to work for the kingdom of God in America, but felt compelled to assist in the reformation of the rest of the world.” (James Timberlake, Prohibition and the Progressive Movement, 1900-1920, New York, Atheneum, 1970, pp. 37-38)
. . .
Since Woodrow Wilson, every American president has followed faithfully in the footsteps of the Wilsonian creed. The content of the Kingdom of God to be imposed on other nations may have changed slightly (from alcohol prohibition and coerced global “democracy” in Wilson’s day to smoking prohibition, free condoms, and global democracy in our own) but the form and the spirit remain all too much the same.
. . .
Second, the number of refugees was deliberately inflated by the Somali government in order to sucker Americans into sending aid. Barre was claiming two million refugees when there were far fewer (he had originally claimed half a million). Thus, Maren found that one camp, Amalow, which was supposed to have 18,503 refugees, and had food allotted for that many, really had only about 3,500. As a result, far too much food was being shipped into Somalia and into the camps by the bamboozled Americans.
Not only that: just as occurred eleven years later, the American excess of food was inspired by duplicitous journalists, “who took pictures of the sick and the hungry, and the relief agencies arrived on the scene with food. And the food was being stolen.”
Moreover, Maren reveals, despite the massive theft, “no one was starving to death in the refugee camps.” Oh, there was plenty of death all right, but the death was caused by disease: malaria, measles, dysentery, diphtheria, pneumonia, river blindness. But food, though not the problem, kept pouring in and being stolen.
There was more method to this madness than simply providing free American food for Barre’s army and for the Ogaden guerrillas. As Maren perceptively points out, the Somalian government, like the Kenyan government, hates nomads. Even though the nomadic Somali refugees weren’t starving, they were attracted to settling in the refugee camps by the promise of free food. After all, it’s easier to sit in a camp and receive food for free than to have to hunt and work for it. As Maren puts it:
“Somalis are nomads who spend most of their time looking for food. If you put a pile of food in the desert they will come and get it…The famine camps were set up and they came.”
And so the American food unwittingly played into the hands of Barre and later Somali rulers: helping to build a modern socialist state by settling nomads. Maren puts the point trenchantly:
“African leaders like to settle nomads. Nomads make it hard to build a modern state, and even harder to build a socialist state. Nomads can’t be taxed, they can’t be drafted, and they can’t be controlled. They also can’t be used to attract foreign aid, unless you can get them to stay in one place.
“In addition, many African leaders, trying hard to be modern, view nomads as an embarrassment and a nuisance. Anything ‘primitive’ is an embarrassment and a nuisance. From Bamako to Nairobi I’ve listened to Africa’s elite discuss nomads as if they were vermin.”
Maren then concludes about the American relief program of the early 1980s:
“So not only was the refugee relief program feeding Barre’s army, it was settling his population of nomads…And all this was happening with the assistance of energetic young foreigners who were helping to build the infrastructure of those new, refugee-populated towns, setting up clinics, drilling wells, trying to teach the former nomads how to settle down and grow food.”
. . .
Maren and his colleague Doug Grice, who was performing the same task in the Bardera region and near the Kenyan border, sat down and wrote reports to their bosses in the USAID program. The reports concluded that the relief program was killing at least as many people as it was saving, and that the net result was to ship food to Somali soldiers who added to their income by selling food, and to enable the WSLF to use the food as rations to conduct the guerrilla war in the Ogaden. Their boss rejected the report, saying: “You guys know you can’t write this stuff. Stick to the facts,” i.e., to the amount of food missing and stolen. And, too, keep the reports technical and boring, so that no critics of the program might figure out what’s going on.
In his final report to his bosses before quitting the program, Michael Maren pointed out an economic absurdity created by the program: people in the towns wanted to know why they were not entitled to the food and health care handed out free to those refugees who had settled in the camps. A man in the town of Belet Huen – the headquarters town in the Hiran region – working for the very high salary of 800 shillings a month, could not supply his family with the amount of food the refugees in the camp received for free.
Maren concluded his report with a prophetic insight into the future: he noted that the American Private Voluntary Organizations (PVOs) were submitting hundreds of proposals to improve services to the refugees. But Maren warned:
“Expanded services to the refugees will only aggravate the problem by encouraging them to stay, and more refugees to arrive. It will spread more thinly the resource base leaving the door open for a real emergency situation in the future. The future for refugees in the camps holds only years of relief.” Instead, Maren declared, the efforts of the international community should be to get the refugees out of the camps, not to attract more.
A study of the Somali economy at the time discovered that the relief industry constituted no less than two-thirds of the Somalian economy. No way that the Somali government would give that up. And now, twelve years later, the 1981 camps are still there, “the residents of those camps are still dependent on relief food and still have no way to earn a living on their own.”
. . .
Cassidy told Maren recently:
“One of the things that got Barre and his henchmen pissed off was when you wrote reports saying that Somalia was self-sufficient in food. That was because free food is what controls the place. The mentality is, ‘Why should we let people produce their own food and control their own lives when we can keep them under our thumbs and under the gun? We claim famine, flood, and refugees and get the food shipped in here for free. Now we’ll tell you when to eat and when you can’t eat!’”
In short, the food “crisis” has been deliberately created by the Somalian government – by Barre and his successors – in order to exert control over the Somali population, to tell them when and who shall or shall not eat. The humanitarian, said Isabel Paterson, is only happy when a country is filled with breadlines and hospitals. The humanitarian with the guillotine!
. . .
By the fall of 1989, Barre’s massacres could no longer be overlooked, and the U.S. cut off its aid to his regime.
Maren’s analysis of the current situation is that this is simply more of the same ills that have created the problem. The U.S. marines are handing everything over to the PVOs, the relief people, who aggravate the problem still more by pouring in more free food. And what do the PVOs get out of it? Fat government contracts, as well as fat donations by deluded humanitarians who think that these reliefers are doing good and helping to solve the problem. Journalists help the PVOs by getting their information from them and featuring these heads of CARE, Catholic Relief Services, and World Vision on television. The press assumes “that these are humanitarian agencies whose only goal is to help people.” In fact, warns Maren, “they are organizations that stand to reap huge benefits in the form of lucrative contracts to deliver food.”
. . .
These are the do-good relief organizations that have only made all the problems worse: “These are the same organizations that have failed for the past 10 years in Somalia and all over Africa. (Hundreds of billions of dollars of aid in Africa over the last thirty years have left the continent more famine-prone and dependent on outside relief than ever.) They had thousands of refugees in camps in 1981, and they failed to get them out of the camps. They didn’t get them their cattle back. They didn’t teach them to grow food and to be independent. They just delivered food and collected grants for development projects.” These relief agencies, Maren declares, want to fail, for “failure means a chance to try again with new grants, new film footage for fundraising campaigns, and fresh new volunteers who haven’t learned yet that aid kills.”
For the real objective of these agencies, Maren has concluded, is to raise money. . . . “Aid,” Maren declares, “is a business. It is a business in which people make careers, earn a good living, get to see interesting places, and have great stories to tell when they get stateside. It’s a business that has to earn money to pay its executives, pay for retreats and for officials to attend conferences in Rome, buy four-wheel drive vehicles, buy advertising time on television. It’s a business that makes money by attracting clients, i.e., starving, needy people.”
. . .
The crucial point, Maren concludes, is that “reckless use of food aid causes famine. It depresses local market prices and provides disincentive for farmers to grow crops.” . . . The only way to solve the problem, Maren declares, “is a way that may seem cruel”: it is to stop the food – to “wean Somalia from dependence on donated food.”
The long essay makes a strong case against the possibility of the rule of law, and talks about spontaneous justice in communities outside the state justice system.
The same is true of the violence directed against the nonviolent civil rights protestors in the American South during the civil rights movement. Although much of the white population of the southern states held racist beliefs, one cannot account for the overwhelming support given to the violent repression of these protests on the assumption that the vast majority of the white Southerners were sadistic racists devoid of moral sensibilities. The true explanation is that most of these people were able to view themselves not as perpetuating racial oppression and injustice, but as upholding the rule of law against criminals and outside agitators. Similarly, since, despite the ’60s rhetoric, all police officers are not “fascist pigs,” some other explanation is needed for their willingness to participate in the “police riot” at the 1968 Democratic convention, or the campaign of illegal arrests and civil rights violations against those demonstrating in Washington against President Nixon’s policies in Vietnam, or the effort to infiltrate and destroy the sanctuary movement that sheltered refugees from Salvadorian death squads during the Reagan era or, for that matter, the attack on and destruction of the Branch Davidian compound in Waco. It is only when these officers have fully bought into the myth that “we are a government of laws and not people,” when they truly believe that their actions are commanded by some impersonal body of just rules, that they can fail to see that they are the agency used by those in power to oppress others.
The reason why the myth of the rule of law has survived for 100 years despite the knowledge of its falsity is that it is too valuable a tool to relinquish. The myth of impersonal government is simply the most effective means of social control available to the state.
. . . .
It is true that the Crits want to impose “democratic” or socialistic values on everyone through the mechanism of the law. But this does not distinguish them from anyone else. Religious fundamentalists want to impose “Christian” values on all via the law. Liberal Democrats want the law to ensure that everyone acts so as to realize a “compassionate” society, while conservative Republicans want it to ensure the realization of “family values” or “civic virtue.” Even libertarians insist that all should be governed by a law that enshrines respect for individual liberty as its preeminent value.
The Crits may believe that the law should embody a different set of values than liberals, or conservatives, or libertarians, but this is the only thing that differentiates them from these other groups. Because the other groups have accepted the myth of the rule of law, they perceive what they are doing not as a struggle for political control, but as an attempt to depoliticize the law and return it to its proper form as the neutral embodiment of objective principles of justice. But the rule of law is a myth, and perception does not change reality. Although only the Crits may recognize it, all are engaged in a political struggle to impose their version of “the good” on the rest of society. And as long as the law remains the exclusive province of the state, this will always be the case.
What is the significance of these observations? Are we condemned to a continual political struggle for control of the legal system? Well, yes; as long as the law remains a state monopoly, we are. But I would ask you to note that this is a conditional statement while you consider the following parable.
. . . .
Most people have been raised to identify law with the state. They cannot even conceive of the idea of legal services apart from the government. The very notion of a free market in legal services conjures up the image of anarchic gang warfare or rule by organized crime. In our system, an advocate of free market law is treated the same way Socrates was treated in Monosizea, and is confronted with the same types of arguments.
The primary reason for this is that the public has been politically indoctrinated to fail to recognize the distinction between order and law. Order is what people need if they are to live together in peace and security. Law, on the other hand, is a particular method of producing order. As it is presently constituted, law is the production of order by requiring all members of society to live under the same set of state-generated rules; it is order produced by centralized planning. Yet, from childhood, citizens are taught to invariably link the words “law” and “order.” Political discourse conditions them to hear and use the terms as though they were synonymous and to express the desire for a safer, more peaceful society as a desire for “law and order.”
The state nurtures this confusion because it is the public’s inability to distinguish order from law that generates its fundamental support for the state. As long as the public identifies order with law, it will believe that an orderly society is impossible without the law the state provides. And as long as the public believes this, it will continue to support the state almost without regard to how oppressive it may become.
. . . .
So, what would a free market in legal services be like? As Sherlock Holmes would regularly say to the good doctor, “You see, Watson, but you do not observe.” Examples of non-state law are all around us. Consider labor-management collective bargaining agreements. In addition to setting wage rates, such agreements typically determine both the work rules the parties must abide by and the grievance procedures they must follow to resolve disputes. In essence, such contracts create the substantive law of the workplace as well as the workplace judiciary. A similar situation exists with regard to homeowner agreements, which create both the rules and dispute settlement procedures within a condominium or housing development, i.e., the law and judicial procedure of the residential community. Perhaps a better example is supplied by universities. These institutions create their own codes of conduct for both students and faculty that cover everything from academic dishonesty to what constitutes acceptable speech and dating behavior. In addition, they not only devise their own elaborate judicial procedures to deal with violations of these codes, but typically supply their own campus police forces as well. A final example may be supplied by the many commercial enterprises that voluntarily opt out of the state judicial system by writing clauses in their contracts that require disputes to be settled through binding arbitration or mediation rather than through a lawsuit. In this vein, the variegated “legal” procedures that have recently been assigned the sobriquet of Alternative Dispute Resolution (ADR) do a good job of suggesting what a free market in legal services might be like. (35)
Of course, it is not merely that we fail to observe what is presently all around us. We also act as though we have no knowledge of our own cultural or legal history. Consider, for example, the situation of African-American communities in the segregated South or the immigrant communities in New York in the first quarter of the twentieth century. Because of prejudice, poverty and the language barrier, these groups were essentially cut off from the state legal system. And yet, rather than disintegrate into chaotic disorder, they were able to privately supply themselves with the rules of behavior and dispute-settlement procedures necessary to maintain peaceful, stable, and highly structured communities. Furthermore, virtually none of the law that orders our interpersonal relationships was produced by the intentional actions of central governments. Our commercial law arose almost entirely from the Law Merchant, a non-governmental set of rules and procedures developed by merchants to quickly and peacefully resolve disputes and facilitate commercial relations. Property, tort, and criminal law are all the products of common law processes by which rules of behavior evolve out of and are informed by the particular circumstances of actual human controversies. In fact, a careful study of Anglo-American legal history will demonstrate that almost all of the law which facilitates peaceful human interaction arose in this way. On the other hand, the source of the law which produces oppression and social division is almost always the state. Measures that impose religious or racial intolerance, economic exploitation, one group’s idea of “fairness,” or another’s of “community” or “family” values virtually always originate in legislation, the law consciously made by the central government. If the purpose of the law really is to bring order to human existence, then it is fair to say that the law actually made by the state is precisely the law that does not work.
. . . .
One thing it seems safe to assume is that there would not be any universally binding, society-wide set of “legal” rules. In a free market, the law would not come in one-size-fits-all. Although the rules necessary to the maintenance of a minimal level of order, such as prohibitions against murder, assault, and theft, would be common to most systems, different communities of interest would assuredly adopt those rules and dispute-settlement procedures that would best fit their needs. For example, it seems extremely unlikely that there would be anything resembling a uniform body of contract law. Consider, as just one illustration, the differences between commercial and consumer contracts. Commercial contracts are usually between corporate entities with specialized knowledge of industrial practices and a financial interest in minimizing the interruption of business. On the other hand, consumer contracts are those in which one or both parties lack commercial sophistication and large sums do not rest upon a speedy resolution of any dispute that might arise. In a free market for legal services, the rules that govern these types of contracts would necessarily be radically different.
This example can also illustrate the different types of dispute-settlement procedures that would be likely to arise. In disputes over consumer contracts, the parties might well be satisfied with the current system of litigation in which the parties present their cases to an impartial judge or jury who renders a verdict for one side or the other. However, in commercial disputes, the parties might prefer a mediational process with a negotiated settlement in order to preserve an ongoing commercial relationship or a quick and informal arbitration in order to avoid the losses associated with excessive delay. Further, it is virtually certain that they would want mediators, arbitrators, or judges who are highly knowledgeable about commercial practice, rather than the typical generalist judge or a jury of lay people.
The problem with trying to specify the individuated “legal systems” which would develop is that there is no limit to the number of dimensions along which individuals may choose to order their lives, and hence no limit to the number of overlapping sets of rules and dispute resolution procedures to which they may subscribe. An individual might settle his or her disputes with neighbors according to voluntarily adopted homeowner association rules and procedures, with co-workers according to the rules and procedures described in a collective bargaining agreement, with members of his or her religious congregation according to scriptural law and tribunal, with other drivers according to the processes agreed to in his or her automobile insurance contract, and with total strangers by selecting a dispute resolution company from the yellow pages of the phone book. Given the current thinking about racial and sexual identity, it seems likely that many disputes among members of the same minority group or among women would be brought to “niche” dispute resolution companies composed predominantly of members of the relevant group, who would use their specialized knowledge of group “culture” to devise superior rules and procedures for intra-group dispute resolution. (36)
I suspect that in many ways a free market in law would resemble the situation in Medieval Europe before the rise of strong central governments in which disputants could select among several fora. Depending upon the nature of the dispute, its geographical location, the parties’ status, and what was convenient, the parties could bring their case in either village, shire, urban, merchant, manorial, ecclesiastical, or royal courts. Even with the limited mobility and communications of the time, this restricted market for dispute-settlement services was able to generate the order necessary for both the commercial and civil advancement of society. Consider how much more effectively such a market could function given the current level of travel and telecommunication technology.
. . . .
My personal belief is that under free market conditions, most people would adopt compositional, rather than confrontational, dispute settlement procedures, i.e., procedures designed to compose disputes and reconcile the parties rather than render third party judgments. This was, in fact, the essential character of the ancient “legal system” that was replaced by the extension of royal jurisdiction. Before the rise of the European nation-states, what we might anachronistically call judicial procedure was chiefly a set of complex negotiations between the parties mediated by the members of the local community in an effort to reestablish a harmonious relationship. Essentially, public pressure was brought upon the parties to settle their dispute peacefully through negotiation and compromise. The incentives of this ancient system favored cooperation and conciliation rather than defeating one’s opponent. (38)
Although I have no crystal ball, I suspect that a free market in law would resemble the ancient system a great deal more than the modern one. Recent experiments with negotiated dispute-settlement have demonstrated that mediation 1) produces a higher level of participant satisfaction with regard to both process and result, 2) resolves cases more quickly and at significantly lower cost, and 3) results in a higher rate of voluntary compliance with the final decree than was the case with traditional litigation. (39)
. . . .
The fact is that there is no such thing as a government of law and not people. The law is an amalgam of contradictory rules and counter-rules expressed in inherently vague language that can yield a legitimate legal argument for any desired conclusion. For this reason, as long as the law remains a state monopoly, it will always reflect the political ideology of those invested with decisionmaking power. Like it or not, we are faced with only two choices. We can continue the ideological power struggle for control of the law in which the group that gains dominance is empowered to impose its will on the rest of society, or we can end the monopoly.
Our long-standing love affair with the myth of the rule of law has made us blind to the latter possibility.
I’m not sure how to reconcile his “Myth of the Rule of Law” with his 2006 winning of the Bastiat Prize for an essay supporting the rule of law, but I think there he was using the idea of the Rule of Law as a counterweight against activist judges.
Also by John Hasnas, Philosophical Case Against Corporate Criminal Liability
* Corporate criminal liability does not serve any legitimate purpose of punishment.
- retribution (It’s the owners, not the guilty people, who are punished, and by definition, a corporation is the separation of ownership and control.)
- deterrence (while it is a deterrent, it is not deterrence by punishment of the guilty. Corporate criminal liability as deterrence is analogous to punishing parents for the crimes of their teenagers)
- rehabilitative (punishing the owners for the crime of employees is only rehabilitative in the sense of group punishment — the guilty along with the innocent)
@ 16:00 he makes a now-incorrect prediction about Enron — that they won’t be prosecuted.
* No limit on discretion. No difference in evidence or burden of proof to charge individuals vs. corporations. It’s entirely up to the discretion of the prosecutors and rife with abuse. Also, corporations cannot control all their employees. Corporate criminal liability ensures the innocent will be punished along with the guilty.
* No difference in harm caused by an individual vs. a corporation. Individual criminal liability is a stronger deterrent than corporate criminal liability. (Civil liabilities and regulations still exist.)
Why do we have it?
Supreme Court argues it’s a matter of public policy — more effective law enforcement. But this cannot (should not) override our protection of innocent people.
@ 29:00 Corporate criminal liability actually forces corporations to cooperate with Dept. of Justice, and PROTECTS guilty individuals.
Today the crime is not the criminal act but failure to cooperate, and the punishment is a corporate indictment, which can be a death sentence.
Purpose of corporate criminal liability is to enlist corporations into the service of the “justice” system.
What’s so bad about this?
State unrestrained. Cost of law enforcement exported to private sector. No limitation to regulations.
@ 33:15 Example: KPMG investigated for tax sheltering which it claimed was legal. Under threat of indictment, KPMG caved in, admitted guilt, and did everything it could to help the feds prosecute its employees and partners, including refusing to pay attorney fees. Many of the partners thought everything they did was legal b/c KPMG’s legal department kept telling them the tax shelters were legal. KPMG ruined their defense. A pre-existing contract threatened the firing of KPMG employees who disagreed with the company’s indictment of some of its employees.
When most people think of Somalia they think of chaos and deterioration. Some may even think of violence and mayhem. No one, however, thinks of progress when they hear about Somalia, let alone of the possibility that anarchy has been good for its development. Maybe they should.
Indicators of Somali welfare remain low in absolute terms, but compared to their status under government show a marked advance. Under statelessness life expectancy in Somalia has grown, access to health facilities has increased, infant mortality has dropped, civil liberties have expanded, and extreme poverty (less than $1 PPP/day) has plummeted. In many parts of the country even security has improved. In these areas citizens are safer than they’ve been in three decades (UNDP 2001). Somalia is far from prosperous, but it has made considerable strides since its government collapsed 15 years ago.
Despite this progress, there has been much hand-wringing over what to do about the situation of anarchy that has characterized the country since 1991. To be sure, this concern is not without cause. In the year following the state’s collapse, civil war, exacerbated by severe drought, devastated the Sub-Saharan territory killing 300,000 Somalis (Prendergast 1997). For a time it seemed that Somali statelessness would mean endless bloody conflict, starvation, and an eventual descent into total annihilation of the Somali people.
Though largely unrecognized by economists, the widespread violence that ravaged Somalia in its first year without government had diminished considerably by 1994. By the mid-1990s peace prevailed over most of the country (Menkhaus 1998, 2004). Since 1997 most indicators of Somali development show slow but steady progress and today are above their pre-stateless levels. Nevertheless, conventional wisdom sees Somalia as a land of chaos, deterioration and war, and is certain that statelessness has been detrimental to Somali development.
The reason for this belief is two-fold. On the one hand, popular opinion sees government as universally superior to anarchy. Government is considered necessary to prevent violent conflicts like those that erupted when Somalia’s state first crumbled, which disrupt economic activity. Government is also considered critical to supplying public goods such as roads, schools, and law and order, which are important to the process of development. From this perspective it is easy to conclude that Somalia, which has no central government, must have been better off when it did.