A Blog by Jonathan Low

 

Feb 26, 2011

Is Your Advertising Telling A Lie? Chevron and the Environment

One of the biggest mistakes corporations make is to assume they can say one thing in their advertising (happy little children running through fields of flowers, supposedly emblematic of the company's sustainability consciousness) while corporate doctrine focuses ruthlessly on curtailing costs, no matter the consequences. The tricky thing is that both are probably true: the company at some level does support indigenous communities and, at the same time, regards environmental problems as a manageable cost of doing business. From a brand standpoint, however, this is at best an inefficient use of resources, since the consumer of both messages resents either the hypocrisy or, if she is an investor, the wasted advertising and PR expense, because she understands how it undermines the brand. Companies need to be consistent if they wish to build a brand and a corporate reputation.

In the following column, reprinted from Advertising Age magazine, Jonathan Salem Baskin lays out the reasons why such bifurcated messaging is doomed to fail. Jonathan writes a column on branding for Advertising Age; his thoughts have appeared in this space before.

"The story in London's Independent newspaper last week was unequivocal: Chevron has been fined more than $8 billion for causing an environmental disaster called by some "the Amazon's Chernobyl" (it has also been assessed another billion in reparations). Texaco caused the problem and fought tooth and nail for nearly a decade to avoid paying on the lawsuit. It became Chevron's fight when it bought Texaco in 2001, and the company has already vowed to appeal the verdict. Its alleged victims are now preparing for the next round.

This is the same Chevron that is running a glossy "We Agree" branding campaign that claims it's in a conversation with people about saving the planet and, oddly enough, supporting communities.

Every CMO should take note of this dichotomy: Both stories are true when presented separately. It's only when you put them together -- which is exactly what I think consumers are going to begin to do more of -- that they add up to one big lie. So you need to make sure your brand isn't risking this sort of conflict.

Like most companies, Chevron is made up of different departments with different mandates and metrics. Its legal team has been fighting a lawsuit triggered by an operational division, using various delaying tactics reminiscent of the way tobacco companies used to keep terminally ill plaintiffs out of court. Chevron's lawyers have probably received bonuses for their accomplishments, and its fees to outside counsel have seemed well spent. The operational folks they're defending probably did the job of extracting oil to the best of their ability, which included dumping billions of gallons of waste into open pits and whatever other actions met the limits of law. Maybe they got bonuses for doing their jobs, too.

In fact, I choose to believe that most people at Chevron do their jobs the way most people do at most companies: They follow the letter of the law and whatever regulations apply to their functions, whether then or now. I don't care if the lawsuit is fair and I can't judge if any verdict or settlement could be just. What matters to me professionally is how these separate activities add up to a bigger brand statement, and I find it gallingly stupid that the company has dedicated significant money and staff resources for almost two decades to fight communities, and yet it recently decided to claim it cares about them, too.

Who's talking for the brand: operations, legal, or the marketing department? Which actions are more legitimate, or more deserving of our attention: its handling of toxic wastes, its expenditures on legal fees, or its payments for branding creative? What's the truth of the Chevron brand?

Ultimately, customers decide what brands stand for, and Chevron's leadership must be daft if it thinks its gas-station consumers don't occasionally troll the internet for something more than a pickup poker game. Ditto for any state or federal regulator they're trying to impress. We learn things about brands far beyond the information marketers want us to know, and actions -- real-world activities, not clicks or qualitative surveys -- speak louder than words. You just can't take Chevron's branding seriously if you know what it's doing in Ecuador. The campaign all but dares people to check up on them.

That's what The Yes Men did when the campaign first broke last October. These culture-jammers created a faux website and press releases through which Chevron actually admitted its infractions and insults to humanity and the planet. The modified "We Agree" content said things such as that oil companies should "clean up their messes," "fix the problems they create" and "put safety first." Lots of people took them seriously and applauded the doctored stuff, as if the company had finally addressed the dissonance between its base actions and lofty claims.

And therein is the rub: Why not tell the truth and then back it up with consistent action?

I don't know about you, but my gut tells me that oil companies don't give a hoot about alternative energy or wind farms, and I don't think they have to. Big Oil drills, retrieves, processes and distributes thick gunk that hurts the Earth as it helps us power or manufacture pretty much every product we use. No rational person expects this immense impact to be perfect, and we see ample evidence around us that it's not.

These issues are huge and they involve trade-offs that deserve to be acknowledged and discussed. A real conversation would express the sentiments of The Yes Men's culture-jamming and find creative ways to help people talk with one another about what Chevron's brand truly is and does.

Promoting an isolated POV isn't education, it isn't believable, and it doesn't really work anymore. Just ask BP.

There are lessons here for every CMO, and they're relevant irrespective of what industry you're in:

Go find out about the nasty battles under way in legal or the shady links in your supply chain. These activities are as important as anything you can dream up in marketing, so you can't risk being unaware of them (or purposefully ignore them).

Prompt real conversations about real things, not symbolic gestures or finely sliced examples that support your corporate position. These conversations are happening anyway, and you look even worse if you're not proactively participating.

If you can't come up with something to say that's legitimately and consistently true, shut up. Have the guts to tell your fellow C-suiters that a brand-image campaign ain't what it used to be.

Your marketing communications no longer speak for your brand. They're just one touchpoint. Your goal should be to make sure you're not poking your consumers in the eye with them.

Who Actually Contributes to Wisconsin State Workers' Pension Funds?

Much of the mainstream media, particularly the TV commentators, appear either to a) not do their homework or b) take their cues from the conservative think tanks, whose funding matches their ideological intensity. In the case of the Wisconsin political battle, in which the governor is demanding that workers surrender certain long-held benefits and 'contribute more' to their pension funds and health care, it turns out that the funding comes ENTIRELY from the workers themselves. Logically, the only way they could contribute more would be if they were paid more. Since that is not going to happen anytime soon, what is it that the governor wants?

David Cay Johnston, in tax.com, explains how this toxic blend of really bad reporting and political game-playing has distorted the facts:

"When it comes to improving public understanding of tax policy, nothing has been more troubling than the deeply flawed coverage of the Wisconsin state employees' fight over collective bargaining.

Economic nonsense is being reported as fact in most of the news reports on the Wisconsin dispute, the product of a breakdown of skepticism among journalists multiplied by their lack of understanding of basic economic principles.

Gov. Scott Walker says he wants state workers covered by collective bargaining agreements to "contribute more" to their pension and health insurance plans.

Accepting Gov. Walker's assertions as fact, and failing to check, created the impression that somehow the workers are getting something extra, a gift from taxpayers. They are not.

Out of every dollar that funds Wisconsin's pension and health insurance plans for state workers, 100 cents comes from the state workers.

How can that be? Because the "contributions" consist of money that employees chose to take as deferred wages – as pensions when they retire – rather than take immediately in cash. The same is true of the health care plan. If this were not so, a serious crime would be taking place: the gift of public funds rather than payment for services.

Thus, state workers are not being asked to simply "contribute more" to Wisconsin's retirement system (or, as the argument goes, "pay their fair share" of retirement costs as do employees in Wisconsin's private sector who still have pensions and health insurance). They are being asked to accept a cut in their salaries so that the state of Wisconsin can use the money to fill the hole left by tax cuts and reduced audits of corporations in Wisconsin.

The labor agreements show that the pension plan money is part of the total negotiated compensation. The key phrase, in those agreements I read (emphasis added), is: "The Employer shall contribute on behalf of the employee." This shows that this is just divvying up the total compensation package, so much for cash wages, so much for paid vacations, so much for retirement, etc.

The collective bargaining agreements for prosecutors, cops and scientists are all on-line.

Reporters should sit down, get a cup of coffee and read them. And then they could take what they learn, and what the state website says about fringe benefits, to Gov. Walker and challenge his assumptions.

And they should point out the very first words the state has posted at a web page on careers as a state employee (emphasis added):

The fringe benefits offered to State of Wisconsin employees are significant, and are a valuable part of an individual's compensation package.

Coverage of the controversy in Wisconsin over unions' collective bargaining, and in particular pension plan contributions, contains repeated references to the phrase "contribute more."

The key problem is that journalists are assuming that statements by Gov. Scott Walker have basis in fact. Journalists should never accept the premise of a political statement, but often they do, which explains why so much of our public policy is at odds with well-established principles.

The question journalists should be asking is "who contributes" to the state of Wisconsin's pension and health care plans.

The fact is that all of the money going into these plans belongs to the workers because it is part of the compensation of the state workers. The fact is that the state workers negotiate their total compensation, which they then divvy up between cash wages, paid vacations, health insurance and, yes, pensions. Since the Wisconsin government workers collectively bargained for their compensation, all of the compensation they have bargained for is part of their pay and thus only the workers contribute to the pension plan. This is an indisputable fact.

Not every news report gets it wrong, but the narrative of the journalistic herd has now been set and is slowly hardening into a concrete falsehood that will distort public understanding of the issue for years to come unless journalists en masse correct their mistakes. From the Associated Press and The New York Times to Wisconsin's biggest newspaper, and every broadcast report I have heard, reporters again and again and again have written as fact what is nonsense.

Compared to tax, this economic issue that reporters have been mishandling is simple. But if journalists cannot grasp the economics of this issue, then how can we hope to have an intelligent debate about tax policy?

Dedicated tax journalists like my colleagues Lee Sheppard and Martin Sullivan at Tax Analysts have exposed, and explained in layman's terms, the arcane rules underlying the important tax debates and controversies that affect corporate and individual taxpayers. But the mainstream press is not even getting basic labor economics right, a much simpler matter.

Among the reports that failed to scrutinize Gov. Walker's assertions about state workers' contributions, and thus got it wrong, is one by A.G. Sulzberger, the presumed future publisher of The New York Times, who is now a national correspondent. He wrote that the Governor "would raise the amount government workers pay into their pension to 5.8 percent of their pay, from less than 1 percent now."

Wrong. The workers currently pay 100 percent from their compensation package, but a portion of it is deducted from their paychecks and a portion of it goes directly to the pension plan.

One correct way to describe this is that the governor "wants to further reduce the cash wages that state workers currently take home in their paychecks." Most state workers already divert 5 percent of their cash wages to the pension plan, an official state website shows.

Gov. Walker says that he wants them to "contribute more" via deductions from their paychecks. But since the workers already contribute 100 percent of the money going to the pension plan the real issue is changing the accounting for this to reduce cash wages.

Once the state has settled on the compensation package for its workers then how the cash flows is merely accounting for how the costs are divvied up. If the workers got higher cash pay and diverted all of the pension contributions from their pay it would be the same amount compared to having the state pay directly into the pension funds.
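Johnston's point that the cash flows are "merely accounting" can be shown with a few lines of arithmetic; the dollar figures below are hypothetical, chosen only to illustrate the equivalence:

```python
# Hypothetical figures illustrating the accounting point: the split between
# "employer" and "employee" pension payments does not change the negotiated total.

TOTAL = 50_000    # negotiated total compensation, in dollars (hypothetical)
PENSION = 6_000   # portion of that total diverted to the pension plan (hypothetical)

# Route A: the state sends part of the pension money directly to the plan
# and deducts the rest from the nominal paycheck.
employer_direct_a, deducted_a = 3_500, 2_500
take_home_a = TOTAL - (employer_direct_a + deducted_a)

# Route B: the worker is nominally paid the full amount and the entire
# pension contribution is deducted from the paycheck.
employer_direct_b, deducted_b = 0, PENSION
take_home_b = TOTAL - (employer_direct_b + deducted_b)

# Either way, the worker's take-home pay and total compensation are identical;
# only the labeling of the flows differs.
assert take_home_a == take_home_b == 44_000
```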

By falsely describing the situation, the governor has sought to frame the issue as one of the workers receiving a favor. The Club for Growth, in broadcast ads, blatantly lies by saying "state workers haven't had to sacrifice. They pay next to nothing for their pensions."

We expect ideological marketing organizations to shade the truth and even outright lie, as the Club for Growth has done. But journalists are supposed to check the facts, not adopt lies as truths.

Having had the good fortune long ago to train the presumed future publisher of the Los Angeles Times, I focused on making sure he understood why careful checking of facts and questioning of assumptions was a commercial, as well as a journalistic, value – one for which reporters should be properly compensated, because it made the paper reliable and thus more valuable to its owners. (Sadly, my trainee later died and the paper was sold.)

Having worked at The New York Times I can tell you how editors might try to excuse this error. They call it "shorthand." But shorthand that is wrong is, in short, still wrong. So, Mr. Sulzberger, take the initiative and correct your error. Doing so, you would set an example that will become newsroom lore long after you retire.

Here are some other examples of inaccurate reporting of the issue, followed by a critique and a simple solution.

Todd Richmond of the Associated Press reported on Feb. 20 that the governor wants state workers "to contribute more to health care and pension costs." Richmond has repeatedly used variations of that phrase.
On Feb. 18, Michael Cooper and Katherine Q. Seelye of The New York Times reported that the legislation sponsored by Gov. Walker would "require workers to contribute more to their pension and health care plans."
Jane Ford-Stewart of the Milwaukee Journal-Sentinel's on-line community news service reported Feb. 22 on "an effort by Gov. Scott Walker to get state employees to contribute more toward their health insurance and pensions so that the costs are more in line with contributions by workers in the private sector."

Politifact.com has a Wisconsin operation, and it was also among those that got it wrong – 100 percent dead wrong – because it assumed the facts as stated by Gov. Walker and failed to question the underlying premise. Further, contrived assumptions make it easy for the perpetrators of the misrepresentation to point to data that support a false claim, something Politifact missed entirely, on at least two occasions, in proclaiming false statements to be true.

Given how many journalists rely on Politifact to check political assertions instead of doing their own research, this is, by far, the inaccuracy likely to have the greatest, and most damaging, effect on subsequent reporting. (Examples of Politifact's inaccurate assessments can be found here and also here.)

Again, the money the state "contributes" is actually part of the compensation that has been negotiated with state workers in advance so it is their money that they choose to take as pension payments in the future rather than cash wages or other benefits today.

Next, journalists should ask how elected officials are treated by the pension system. The pay of elected leaders is set by the legislature without collective bargaining. Here, too, any money withheld from paychecks to fund the pension plans comes from the employees (the elected leaders), but because this is not the result of a negotiated compensation package, there is a colorable argument that pension benefits received by elected leaders beyond the wages deducted from their paychecks are a gift from taxpayers.

The payroll deduction – again, a mere accounting measure – was 5 percent last year for "general participants," official state documents show, a rate that is 56 percent higher than the 3.2 percent rate for "elected leaders."

The rates were adjusted for 2011 and now the elected leaders pay 3.9 percent, still well below what the "general participants" collectively bargained to divert from their cash wages through this accounting device.

The rest of the money going into the plan is also wages the workers diverted; it just does not show up in paychecks as a line item, the same way that the employer-paid half of Social Security and Medicare taxes does not show up on paychecks but is still part of total compensation for each worker in those plans.

I am being repetitive on purpose – experience supervising others has taught me you usually have to teach something three to seven times before it sinks in. Some management texts also make this point.

That is not to say that the state workers make too much or too little. It is to say that journalists as a class are fundamentally getting the facts wrong by not understanding compensation.

Simplistic coverage has also resulted in numerous reports that Wisconsin state workers make more than workers in Wisconsin's private business sector. This is true only if you compare walnuts to tuna fish.

State governments (indeed almost all governments) tend to hire people with college educations, including advanced degrees. Overall, private employers in all states tend to hire people with less education. More education means more pay because there is more skill required.

America has roughly the same number of food preparers, who can be high school dropouts, as registered nurses, who require a college education. But the nurses make on average $66,500, compared to just $18,100 for the food service workers. The food service workers collectively made less than $50 billion, while the registered nurses made almost $172 billion in 2009, my analysis of the official data shows.

Business and government hire both food service workers and registered nurses, but you are much more likely to work for the government as a registered nurse than as a food preparation worker.

When you control for the education required to be a prosecutor or nurse, government workers get total compensation that is less than those in the corporate sector. This may reflect the fact that fewer and fewer private sector workers are in unions, about 7 percent at last count. As economic theory predicts, as fewer workers can bargain collectively the overall wage level falls. Effectively wiping out public employee unions would only add to downward pressure on wages, standard economic theory shows.

On the other hand, unionized state workers run a much smaller risk of going through bouts of joblessness, an economic benefit. Numerous studies indicate that public workers, including those in Wisconsin, make about 5 percent less than private sector workers when you control for education. But what is the lifetime cost, and risk, of episodic joblessness among comparable private sector workers? Is that cost equal to 5 percent or so of lifetime earnings, which would even out the differential? I have yet to read an analysis of that issue by an academic economist, much less a journalist, so I do not know the truth of that question.

What Gov. Walker has achieved in selling a false assumption as fact occurred because journalists failed to follow what I call the first and second rules of journalism. This problem is pervasive in coverage of tax and budget issues, where so much nonsense gets reported as fact by the Washington press corps that I have stopped filing away all but the most egregious errors – and still I copy a story or three every day to use in lectures on getting it right and not writing nonsense.

And what are these two rules for journalists?

Rule One: Check it out. Be so skeptical that if your mother says she loves you, check it out.

Rule Two: Cross-check again and again until you not only know the facts but can put them in proper context and understand all sides so well that each perspective gets its proper weight – or, as I like to say, until everyone recognizes their oar in the water.

Deadlines may make Rule Two difficult, and often impossible, in writing the first rough draft of history. We are now in the umpteenth draft and the initial mistake keeps getting repeated, as so often happens when a big story brings a herd, until it becomes accepted as unassailable truth.

The reason that falsehoods are transformed into the public's common knowledge via inaccurate reporting is simple. When editors or producers back home get an account that differs from what the news herd says, they raise questions and often delete unique and accurate insights. But if a reporter just repeats what everyone else is saying, it usually sails unchallenged to print or airtime even when it is untrue.

Then there is this: How the compensation packages of state workers get divided up is not a matter of tax burdens. Only how much the state workers get paid is a matter of tax burdens.

There are two other important aspects to this, which go to the heart of tax policy and why our country is in for a long stay in the economic doldrums.

Traditional or defined benefit pension plans, properly administered, increase economic efficiency, while the newer defined contribution plans have high costs whether done one at a time through Individual Retirement Accounts or in group plans like 401(k)s.

Efficiency means that more of the money workers contribute to their pensions – money that could have been taken as cash wages today – ends up in the pockets of retirees, not securities dealers, trustees and others who administer and invest the money. Compared to defined benefit pension plans, 401(k) plans are vastly more expensive in investing, administration and other costs.

Individually managed accounts like 401(k)s violate a basic tenet of economics – specialization increases economic gains. That is why the average investor makes much less than the market return, studies by Morningstar show.

This goes to Adam Smith's famous insight of 1776 about specialization increasing wealth: when pins were made in full by each worker, each could make only a few per day; but when one person draws the wire, another cuts it, another fashions the point, and so on, output rises to tens of thousands of pins and their price falls from dear to cheap.

Expecting individuals to be experts at investing their retirement money in defined contribution plans -- instead of pooling the money so professional investors can manage the money as is done in defined benefit plans -- is not sound economics.

The concept, at its most basic, is buying wholesale instead of retail. Wholesale is cheaper for the buyers. That is, it saves taxpayers money.

The Wisconsin State Investment Board manages about $74.5 billion for an all-in cost of $224 million.

That is a cost of about 30 cents per $100, which is good but not great. However, it is far less than many defined contribution plans, where costs are often $1 or more per $100.
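The fee comparison is simple arithmetic, sketched here with the figures given in the article (the $1-per-$100 defined-contribution cost is the article's rough benchmark, not a precise figure):

```python
# Wisconsin State Investment Board figures cited in the article.
assets = 74_500_000_000    # assets under management, in dollars
all_in_cost = 224_000_000  # annual all-in cost, in dollars

# Cost per $100 under management: about 0.30, i.e. roughly 30 cents per $100.
cost_per_100 = all_in_cost / assets * 100

# A defined-contribution plan at $1 per $100 would cost over three times as much.
dc_cost = assets * (1 / 100)
ratio = dc_cost / all_in_cost

assert 0.29 < cost_per_100 < 0.31
assert ratio > 3
```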

So, I hope that Mr. Sulzberger in particular will take the initiative to correct the inaccurate reporting and show the way to other reporters, for the betterment of both America and his family's investment. And I hope that all reporters will start questioning the assumptions in the governor's position instead of assuming his statements are infallible.

My larger hope is that reporters, editors and producers will apply this thinking when covering taxes and taxation, the system by which we distribute the burdens of living in and sustaining this, the Second American Republic.

And The Award for The Best Dictator Goes To...

The 538 blog analyzes the statistical data behind a variety of socio-political questions. In its most recent post, the blog's author, Nate Silver, and his colleagues take on a pressing question of the day: how does one define what it means to be a dictator, and who might be the best of the species? They have created The Dictator Index, covering 70 individuals who have ruled during the past 40 years with varying degrees of stability and cruelty. Strap on your crown and medals...

"The resignation of President Hosni Mubarak of Egypt has prompted a flurry of writing about other Middle Eastern monarchies and dictatorships, past and present. Indeed, across the region people are discussing, debating and in some cases taking to the press and the streets to protest oppressive forms of government.

On the other hand, the seemingly unlikely fall of Mr. Mubarak has also been cause for reflection on exactly how effective he was as an authoritarian leader. Thirty years in power in a country of nearly 80 million people is no small feat, especially when bordered on one side by a volatile Israel and Palestine, a disintegrating Sudan to the south and an unpredictable Libya to the west.

Of course, authoritarian and undemocratic leadership is not restricted to Egypt or the Middle East. Traditional and hereditary government enforced by threats of violence or exile has long been a facet of human political and social organization. Especially during times when stability and internal coherence were (often rightly) deemed to be the keys to survival of a group, nation or tribe, strong central leadership prevented dissent or personal ambition from endangering the whole group.

In today’s world, however, stability and progress are often invoked as rationales for longstanding and dictatorial regimes regardless of their applicability. In some cases driven by personal ambition, and other times by genuine belief in one’s ability to be the transformational leader needed by his or her people (regardless of the reality), dictators today range from the iron-fisted Teodoro Obiang of Equatorial Guinea, who has ruled since 1979, to Frank Bainimarama, the controversial Fijian naval officer who took power in his country in a 2006 coup, but has since battled with the national courts over its legality.

Even deciding who is a dictator and who is not is a tricky business. Much like the distinction between “freedom fighter” and “terrorist,” the difference between a “dictator” and a “stable and progressive monarchy” is a matter of perspective. Many American analysts are quick to call Fidel Castro and Ali Khamenei dictators of the worst kind, while leaving the monarchies of Jordan and Saudi Arabia alone and ignoring “friendly” oil-producing West African dictators in Angola, Gabon or Equatorial Guinea.

With that said, I would like to introduce you to the Dictator Index, a measure of the relative effectiveness of national leaders with authoritarian tendencies in the last 40 years or so. The initial list of candidates for Best Dictator ran more than 70 people long, making it necessary to set a few cutoffs for inclusion. Only people who were in power since 1970 have been included (some began their rule before that), and only leaders who managed to stay in power for 15 or more years have made it to the final round of evaluation. Apparently, if you want to be counted among the elite of authoritarian rulers, you must put in a good 20 or 30 years – no small feat, indeed.

The 34 potential dictators have been measured in six main categories that define well-executed despotic rule. First, as alluded to by the cliché "possession is nine-tenths of the law," the number of years a leader was in possession of national power is an important factor. Second, allowing elections and/or democratic processes is ranked on a 0 to 5 scale, with 5 being free and fair elections and 0 being absolutely none. Candidates, naturally, receive Dictator Index points for being less democratic. Similarly, tolerance of opposition or the publication of opposition ideas is ranked on a 0 to 5 scale, with 0 being no tolerance of opposition and 5 being a free and influential opposition. The fourth factor is the level of development that the country achieved during the rule of the dictator, as compared with its local region and/or comparable countries. This is measured using the United Nations Development Program's Human Development Index (HDI) and, where those data are not available, life expectancy at birth as measured by the United Nations Department of Social and Economic Affairs. Fifth, the candidates are measured on how much wealth they were able to expropriate through their charge as a national leader. Again, this is based on a 0 to 5 ranking, using lifestyle and broad brackets of proven personal wealth. It is quite challenging to get exact figures on each person; however, we can clearly differentiate between the wealth of a Mobutu Sese Seko of Congo, who embezzled every cent he possibly could, and a Kim Il Sung of North Korea, who, while rather unbalanced, did not actually embezzle that much. Lastly, the Dictator Index takes into account the size of each country, assuming that it is generally harder to keep hold of a huge country like Egypt or Indonesia than a small principality like Brunei or Swaziland.
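A rough sketch of how such a composite score might be computed follows; the weights, the longevity cap, and the sample inputs are my own hypothetical choices, since the post does not publish its exact formula:

```python
def dictator_index(years_in_power, elections, opposition,
                   development, wealth, country_size):
    """Toy composite of the six factors described above.

    elections and opposition run 0 (none) to 5 (free and fair) and are
    inverted, since less democracy earns more points; development, wealth
    and country_size run 0 to 5. All weights here are hypothetical.
    """
    score = 0.0
    score += min(years_in_power, 50)   # longevity in power, capped at 50
    score += (5 - elections) * 4       # fewer/less fair elections -> more points
    score += (5 - opposition) * 4      # less tolerated opposition -> more points
    score += development * 2           # development relative to the region
    score += wealth * 3                # personal wealth expropriated
    score += country_size * 2          # bigger countries are harder to hold
    return score

# Hypothetical inputs for a long-ruling, highly repressive, oil-rich leader:
print(dictator_index(42, 0, 0, 2, 4, 3))  # 42 + 20 + 20 + 4 + 12 + 6 = 104.0
```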

So without further ado, I am pleased to present the award of Best Modern Dictator to Muammar el-Qaddafi of Libya, who topped a very competitive list of authoritarians.

Rank Leader Country Dictator Index
1 Muammar al-Qaddafi Libya 81.0
2 Francisco Franco Spain 72.3
3 Hassanal Bolkiah Brunei 69.4
4 Kim Il Sung North Korea 62.6
5 Fahd bin Saud Saudi Arabia 61.2
6 Hafez al-Assad Syria 61.1
7 Fidel Castro Cuba 60.7
8 Saddam Hussein Iraq 59.3
9 Mobutu Sese Seko DR Congo 56.9
10 Hosni Mubarak Egypt 55.3
11 Omar Bongo Gabon 54.7
12 Kim Jong-il North Korea 54.2
13 Suharto Indonesia 54.2
14 Hussein bin Talal Jordan 54.0
15 Nicolae Ceauşescu Romania 53.0
16 Teodoro Obiang Equatorial Guinea 47.4
17 Todor Zhivkov Bulgaria 46.3
18 Gnassingbé Eyadéma Togo 43.9
19 Paul Biya Cameroon 43.3
20 Mswati III Swaziland 43.1
21 Hastings Banda Malawi 41.9
22 Saparmurat Niyazov Turkmenistan 40.5
23 Ahmed Sékou Touré Guinea 40.0
24 José Eduardo dos Santos Angola 39.9
25 Jean-Claude Duvalier Haiti 39.8
26 Ali Khamenei Iran 38.4
27 Augusto Pinochet Chile 36.7
28 Islom Karimov Uzbekistan 33.2
29 Zine Ben Ali Tunisia 31.8
30 Daniel arap Moi Kenya 29.8
31 Than Shwe Myanmar 22.4
32 Robert Mugabe Zimbabwe 20.5
33 Idriss Déby Chad 19.7
34 Omar al Bashir Sudan 17.1

In second place is the controversial Francisco Franco of Spain, who led a highly oppressive regime for 39 years, though he presided over a post-World War II economic boom known as the Spanish Miracle. Third place is occupied by the Sultan of oil-rich Brunei, Hassanal Bolkiah, who has overseen two decades of economic expansion, though at the price of political and social freedoms in the small kingdom. Kim Il Sung, the father of the current North Korean leader Kim Jong-il, led a highly oppressive regime for 45 years, which made up for his relatively low accumulated wealth and the stagnant economic and health situation that prevailed during his rule. Rounding out the Top 5 is the enormously wealthy King Fahd bin Saud of Saudi Arabia, who, like his brothers before him, allowed virtually no opposition to his rule and certainly no elections.

The rest of the Top 10 are not surprising, with Hafez al-Assad, Fidel Castro – who led the list with 47 years in power – Saddam Hussein, Mobutu Sese Seko and Hosni Mubarak taking positions 6 through 10 on the index.

Surprise underperformers on the index include Robert Mugabe, who, despite having spent more than 30 years at the helm in Zimbabwe, has actually overseen several contested elections, faces a fairly active opposition, and has presided over a collapse of all the country’s development indicators of health and education. Similarly, Augusto Pinochet of Chile and Omar al-Bashir of Sudan, while maintaining reputations as men of iron, did indeed allow elections or referenda that strongly challenged their rule (Pinochet was actually removed from power democratically, and al-Bashir is within weeks of losing the southern portion of Sudan to independence).

In conclusion, while there are many oppressive governments around the world, there is only one Qaddafi, who manages to mix Leader for Life status, an enormous quantity of oil wealth, an impressive wardrobe and just enough craziness to keep his enemies and opponents on their toes.

Feb 25, 2011

Innovation and Anti-Trust: A Surprising Nexus of Future Growth?

Government regulation is often derided, but as anyone who uses the internet realizes, without government protection, services that have become an essential part of modern life could well be denied or priced out of reach. The head of the US Department of Justice's Antitrust Division discusses her evaluation of tech deals and the disputes that could determine levels of service as well as the industry's ability to innovate.

Elizabeth Wasserman reports in Politico:


"A veteran of the tech industry’s antitrust wars, Christine Varney was in the middle of another one last year as cable giant Comcast was seeking Justice Department approval for its acquisition of NBC Universal.

The nation’s top antitrust enforcer, Varney zeroed in immediately on the potential competitive hurdle to the deal at a meeting with public interest groups in her office.

“She very quickly honed in on the online video problem,” Corie Wright, policy counsel for Free Press, recalled of the meeting last summer. “That was a good sign. That showed she was being proactive in looking at the far-reaching implications of the merger.”

Varney’s office impressed public interest lawyers by working with the FCC to wrangle concessions out of Comcast and NBC that recognized the emerging online video market as a potential competitor to cable in the digital age. The companies agreed to requirements to make programming available to online video distributors on the same terms they offer to cable providers.

“We don’t have a particular dog in that fight or a particular vision of how it should turn out,” Varney told POLITICO in a recent interview, “but what we are concerned about is that the market — and the market participants — be able to continue to innovate.”

Varney is now the traffic cop at the increasingly busy intersection of business and technology. It’s a job made even more important as the Obama administration pushes U.S. innovation as an economic priority.

As the DOJ’s assistant attorney general for antitrust, she is in charge of assessing whether proposed mergers violate antitrust laws, in addition to policing business practices to make sure industry powerhouses — in technology and other sectors — aren’t abusing their might illegally to cut out competitors and disadvantage consumers.

In the tech industry, it’s a particularly delicate balance between ensuring market competition while not stifling innovation. It’s a world in which innovators can create important new markets for devices or services seemingly overnight — and rapidly achieve levels of dominance that spark antitrust complaints from competitors.

These days, the roster of technology companies now under scrutiny by Varney’s office includes a who’s who of Silicon Valley.

Google’s proposed acquisition of flight-software maker ITA is under examination for the deal’s potential impact on the online travel industry.

Apple’s launch last week of a subscription service for publications and entertainment through its App Store is getting a preliminary review, sources confirmed to POLITICO. The department started looking into Apple’s iTunes store last year. Other tech titans that have received recent scrutiny include Oracle, Microsoft, Yahoo! and IBM.

Varney declined to talk about any pending cases. She has recused herself from overseeing the Google-ITA merger because ITA hired her former law firm; under the Obama administration’s ethics rules, appointees agree to avoid contacts with previous employers for two years. Nevertheless, she discussed in detail how antitrust enforcement can help keep competition alive in the dynamic technology sector.

One way is by considering nascent markets when reviewing corporate mergers. “Technology is such a rapidly evolving field, but in any merger — not only technology — you’re looking at the future. You’re looking at what future competition looks like and how you preserve the competitive marketplace,” Varney said. “In technology, I think you’re also looking at innovation. You want to make sure no merger is unduly quelling or retarding innovation.”

Those were some of the driving factors in the DOJ’s antitrust review of the Comcast-NBC deal, which was approved last month with a set of conditions. “What we had were the country’s largest cable provider and one of the country’s premier content providers coming together and creating some efficiencies — but also changing the economic incentives of both the cable provider and the content provider,” Varney said.

Those economic incentives could have resulted in a bottleneck that thwarted the emerging market for online video distribution — a growing field with competitors that range from Google TV to Hulu to Netflix. Under conditions set by the DOJ and FCC, Comcast and NBC agreed to sell programming to online video providers and to uphold net neutrality principles and not discriminate against Web traffic carrying this content to customers.

“The remedies that we put into place in that instance, in our view, are the closest approximation to embedding or retaining the existing market incentives prior to the merger,” Varney said.

The Comcast-NBC conditions mark an example of “a philosophical change” from lax antitrust enforcement during the Bush administration and “a new model for these types of mergers,” said Bert Foer, founder and president of the American Antitrust Institute, an independent think tank that tends to take an activist view of antitrust enforcement.

“They made the decision not to try to stop the merger but to compromise by a series of detailed conditions,” Foer said. “It’s a regulatory type of intervention and the ability to evaluate how it works is going to take years.”

Foer said he believed that the DOJ was positioning itself for the same type of remedy for approving the Google-ITA merger. The antitrust group came out against that merger in a report last week.

Others said the Comcast-NBC merger review was a success — not only in terms of ensuring that Comcast shares content with competitors and complies with FCC Open Internet rules, but because the DOJ and FCC worked hand in hand on the review.

Capitol Hill veteran Colin Crowell, who worked on telecommunications and other issues for Rep. Ed Markey (D-Mass.) for 21 years, recalled that the antitrust merger review process during the Bell telephone mergers of the 1990s and early 2000s was handled one agency at a time. The DOJ would take up analyses of the relevant markets and either develop a case or negotiate for consent decrees and only later would the FCC weigh in with broader public interest concerns.

“It really was one shoe dropped and then the other shoe dropped,” said Crowell, who now runs his own firm, Crowell Strategies. “While they may have been in communication, it wasn’t apparent that they were in coordination.”

The difference this time was that DOJ and FCC staff communicated and coordinated how the two agencies were developing evidence, the economic analysis, the benchmarks for the market and so forth. Both agencies had online access to documents from the companies through a secure online “e-room,” which officials believe was a first for the government agencies.

Varney earned her stripes in both the private and public sector.

During her previous stints in government, Varney served as a Federal Trade Commissioner from 1994 to 1997 and was a leading voice on a variety of Internet issues, including online privacy. During the early years of the Clinton administration, she was an assistant to the president and secretary to the Cabinet — in the latter role serving as the main point of contact for the 20-member Cabinet and helping to facilitate coordination between the White House and various agencies.

After leaving the FTC, Varney in 1997 started the Internet law division at Hogan & Hartson, now Hogan Lovells. One of her biggest clients was Netscape Communications, the company that popularized the Web browser until beaten in the “browser war” by Microsoft’s Internet Explorer. Varney advised Netscape during the DOJ’s monopoly abuse case against Microsoft. Later, she became a lawyer and lobbyist for such groups as the Online Privacy Alliance and such companies as AOL, eBay, MySpace and DoubleClick, which was acquired in 2007 by Google.

“What we’ve learned is that the antitrust laws are very suitable in the digital age for protecting innovation,” Varney said.

Book It: Why Barnes & Noble Needs To Publish

The economics of print media keep getting grimmer. Electronic disintermediation has taken its toll on the publishing industry, squeezing the profit out of writing for everyone from author to retailer. The strategic reality of the industry's shrinking margins is not lost on B&N's leaders.

Michael Wolf explains the economic reality for books and those who love them in GigaOm:

"Talk about frustrating: This week Barnes & Noble announced topline growth year over year and its first profit in four quarters, and how was it rewarded for its hard work?

With a pounding by Wall Street.

The drubbing was due in part to the news the company was eliminating its dividend in order to invest more in its digital business, but there’s no doubt the recent Borders bankruptcy filing weighed on the minds of investors. After all, B&N is the Coke to Borders’ Pepsi, and it’s easy to assume what happens to one will eventually befall the other.

But as this excellent answer on Quora by former Borders employee Mark Evans points out, Borders failed for numerous reasons, the most important of which was its outsourcing of online to Amazon. What B&N realized — and Borders didn’t — was you don’t become a true online retailer by outsourcing the business, especially to what may be your number one competitor.

B&N’s strategic decision to build its own site was critical, and the same can be said for its decision to create its own e-reader, the Nook. The company is making good headway, claiming this week it owns 25 percent of the e-book market. That’s impressive (if true) in a market that includes what may be the two most fearsome competitors in the digital media space — Apple and Amazon.

But as I discuss in my weekly analysis at GigaOM Pro (subscription required), they must do more. Let’s face it, the total pie in books is going to shrink, and the long and unwieldy value-chain from writer to customer is going to collapse. Amazon knew this a long time ago, and that’s why they’ve been moving to disintermediate the publisher and the wholesaler in the e-book world by becoming, essentially, the entire value chain themselves.”

B&N should do the same, and do it quickly. Sure, like Amazon, it launched its own self-pub platform in PubIt!, and it has tinkered with a few imprints on the print side for some time. But in the collapsing world of books, it’s every man for himself, and it’s time for B&N to accelerate its push into becoming a digital publisher.

Longtime agent and e-book pioneer Richard Curtis suggested that maybe it’d be a good idea for one of the big publishers to pick up Barnes & Noble. Maybe, but I think things might make more sense the other way around, with B&N either acquiring a publisher or, perhaps more likely (and more wisely), investing more in an organic effort to become a leading publisher of e-books without the legacy cost-burden of New York based publishing.

Will the publishers complain? Of course, but there’s nothing they can do about it. In the end, publishers have to work with B&N and, as Amazon has shown, not being liked has nothing to do with how successful you will eventually be.

Cell Phone Samba: Brazil Sees the Future of Music in Mobile Downloads

Brazil's legendary contributions to music make its trends in dissemination important. The central role that music plays in Brazilian culture means that developments there may be indicative of global trends.

Luis Henrique Viera explains the development in Fox News Latino:

"After the death of popular music download sites Napster and Kazaa, Brazil went in a very different direction when it came to buying songs. According to Universal Music Group, 30% of revenue in the country comes from digital sales. Within this amount, 30% of the sales are online and 70% come from cell phones – almost the opposite of the rest of the world, where online represents 80% of the total. The music industry sees this as a breath of new life for sales in the biggest market in Latin America.

Part of the phenomenon is explained by the recent growth of the middle class in Brazil. Economist Ricardo Amorim, an expert consultant in emerging markets and a former strategist at WestLB bank in New York, explained the issue: “There are three times more users of cell phones in Brazil than [personal computer] users. Also, here there are more mobile phones than people,” said Mr. Amorim.

The biggest Brazilian label, Som Livre, has caught on to the trend. The company is set to launch a site called Escute (Listen). The goal is to goose legal music downloads. Som Livre still has a relatively small percentage of revenue coming from mobile purchases: 5%.

“We are trying to make a platform with easy navigation, a vast catalog, cheap packages and easy payment by cell phone, as well. The advantage is that the customer can buy 24 hours a day, wherever he is,” said Isadora Wegner, marketing analyst of Som Livre. She also said that Escute will allow clients to purchase entire albums; until now, each track has been sold individually.

Brazil already has other popular music-download pay sites. One of them is Sonora. Part of Terra Networks, Sonora charges a monthly fee which allows customers to download any song they want.

Despite strong mobile sales, executives from the sector see the market as still challenging. Piracy remains a big problem. “People really don't respect an artistic piece as they respect a product like a car, for example,” said the president of Som Livre, Leonardo Ganem.

Amorim, the economist, agrees that focusing on mobile phones sales is a good idea for the music industry. “The population's income is growing and the trend cannot be reversed,” he said.

For 17-year-old high school student Fernanda Nagami, there are simple reasons to buy music with her cell phone. “It’s good to have your favorite songs on your phone and it is totally affordable,” she said.

Huawei's Acquisition Denial by US Government Could End Up Hurting US Businesses

Huawei's attempt to acquire a US business (essentially its intellectual property) was rejected by the US government, which 'requested' that the assets be returned to the previous owner. This is one in a long string of such decisions against Chinese companies. While the competitive issues are thorny, the decision's impact is debatable, both as a business matter and as a longer-term strategic question for the US. There has certainly been a lot of tit-for-tat behavior in the evolving US-China relationship. In the long run, the US has to take some risks in order to signal its openness to investment - and so that US businesses can continue to do the same in other countries.

Joff Wild explains why the implications of this decision could be counterproductive in IAM:

"Since our establishment, Huawei has respected and protected the rights of all intellectual property holders while vigorously defending our own intellectual property rights. We have applied for 49,040 patents globally and have been granted 17,765 to date. In addition to our own innovations, we buy access to other patent holders' technologies through cross-licenses. In 2010, Huawei paid western companies US$222 million in licensing fees. Of that total, US$175 million was paid to American firms. For example, over the years we have paid U.S. company Qualcomm more than US$600 million in fees related to their intellectual property. The fact that Cisco withdrew the lawsuit it filed against Huawei in 2003 regarding allegations of intellectual property rights infringement further vindicates Huawei's position in that matter and supports our position that we are only engaged in legitimate business practices. We learned from that experience that while disputes may arise in the course of business, they can be settled properly through bilateral negotiations.

So reads part of an open letter written by Ken Hu, Deputy Chairman of Huawei Technologies and Chairman of Huawei USA, posted on the Chinese company's website yesterday. It comes after the recent furore over a decision by a US government body to ask Huawei to divest itself of IP it acquired from 3Leaf last May over concerns about national security. Hu ends the letter with this paragraph:

We sincerely hope that the United States government will carry out a formal investigation on any concerns it may have about Huawei. The United States is an advocate for democracy, freedom, rule of law, and human rights. The United States government has demonstrated its efficiency in management, fairness and impartiality and we have been impressed by that ever since we made our first investment in this country some 10 years ago. We have faith in the fairness and justness of the United States and we believe the results of any thorough government investigation will prove that Huawei is a normal commercial institution and nothing more.

It seems to me that Huawei has a very strong point here. On the face of it, the company looks like one that plays by the rules. If there is evidence to the contrary, the US authorities should provide it. Unless they do, objecting to the Huawei purchase from 3Leaf will look a lot like a political decision based on where Huawei comes from, rather than anything truly prompted by a national security concern.

Hu states: "Futurewei, Huawei's U.S. subsidiary, purchased certain assets from 3Leaf, an insolvent technology start-up located in Santa Clara, California, in May and July 2010, when 3Leaf was ceasing its operations and no other buyers for its intellectual property were forthcoming." So it's not as if there was any kind of covert deal involved here. And if the IP in question really is a threat to US national security in what is deemed to be the wrong hands, then surely that implies it is incredibly valuable; but no-one else apart from Huawei wanted it. So how does that work? And, in any case, Huawei has had access to the IP, as well as the knowledge needed to work it, for many months now. So if it is an entity hell-bent on undermining the US, it already has all the knowledge it is going to need. A sale now makes no difference.

It is, of course, entirely up to the US to decide who can and cannot purchase American IP that is being sold by American entities. But if, in doing so, the US is not open and up-front about the reasons for its decisions, it may find that things do not work out entirely as planned. There are plenty of US companies interested in investing in China and in IP owned in China. My guess is that they will now find it much harder to do so. Who's the loser then?

Patterns vs Processes: How Speed and Complexity Are Changing the Way We Work

The impact of speed and complexity on the way we work is profound. Sources of value and the drivers of outcomes have changed dramatically as the amount of information as well as its nature (think about the impact of reputation versus financial reports) forces managers in government or business to adapt to new and frequently unanticipated challenges as well as opportunities. Decision-makers must act on incomplete information in record time. One of the ways this affects our thinking is in how we organize our approach to analyzing the inputs we receive.

Thierry de Baillon presents a thought-provoking thesis on one aspect of this - the growing importance of pattern recognition versus process management - in the blog "Sonnez en cas d'absence:"

"As the ever increasing speed and amount of available knowledge reshape, day after day, the world we live in, it looks like a gap is widening between the way most businesses still operate and the capabilities needed to deal with our environment’s growing complexity.

Organizational responses to this overall increase in speed are too often cost reductions, automation and optimization. Efficiency has become the new black of business, and BPR is its credo. But speed isn’t only a factor we have to cope with; it is deeply transforming the nature of our relationship to the world. As Paul Virilio wrote: “The speed of light does not merely transform the world. It becomes the world. Globalization is the speed of light.” By treating speed as an external constraint, companies deliberately keep themselves out of many of today’s fundamental dynamics. Pushing the gas pedal won’t drive anyone faster than the engine was built for, and the current business engine was assembled in the industrial 19th century, then amended more than thirty years ago with the rise of the process-driven enterprise.

The shy face of Enterprise 2.0

On every subject, for every aspect of our lives, the quantity of available information is so overwhelming that we can no longer simply store all the information we need in our memory. Such abundance has transformed our cognitive process: we now mostly remember links and references to information, extending our memory map, our knowledge, to a network of peers and sources. The more information is made available, the stronger and wider this network becomes, and the faster knowledge is able to flow. This networked nature of our representation of the world in turn contributes to increasing the overall speed of the world.

One motto of the major Enterprise 2.0 frameworks is to help companies deal better with this information overabundance, making organizational knowledge expandable and faster to access with the help of social software: connecting people with the right information at the right time. So far so good. Power has shifted from knowledge to knowledge sharing. Cool; but for how long? Even if there is little hope of breaking the 90-9-1 rule in organizations, information is becoming ubiquitous at an exponential rate.

A recent attempt to deal with this growing quantity of knowledge flows is content curation, which allows for a better distribution of information. Unfortunately, it only facilitates knowledge acquisition when the desired outcome is already known, since what is relevant to you isn’t necessarily relevant to someone else, or even in another situation. Context is missing here. What we need is another way to filter information in context, another way to make information usable through non-deterministic tasks. The real power resides in knowledge use, not in knowledge sharing.

Another motto is to start with clear objectives. Business objectives… When the quantity of information and the speed of its transmission are changing our way of thinking and deeply transforming our lives, is it reasonable to believe that aligning corporate practices with private habits will spare us from rethinking the way we work, the way we do business? Can we seriously think that getting from silos to clusters will save us from deeper organizational transformations? Yes, we have to set business objectives for any collaborative initiative, but we also have to consider what new kinds of objectives can be achieved through social business, and what that means for the future of business.

The poor performance of processes

Umair Haque recently stated that “Making Room for Reflection Is a Strategic Imperative”. This is a nice injunction, backed by lucid and thoughtful arguments, but can we just “stop doing” in an environment where speed has become the very stuff of things? I don’t believe so; taking a break is no longer an option, and what we really need instead is to think differently. The accelerated growth of available data requires new ways to acquire knowledge and put it into action. In such a situation, unlearning has become as important as learning.

As most of our knowledge is now stored outside of our memory, the challenge lies not only in matching real-world situations with experiences stored in our memory, but also in pairing those situations with the right external connections, in order to gain access to the relevant knowledge. In anything but routine thinking, we have to deal not only with data but with people, and our cognitive process now encompasses our networks. Information retrieval, and learning, have become inherently hyper-connected.

From internal “social” initiatives (let us consider them as knowledge networks rather than true collaborative environments for the purposes of this argument) to customer relationships, the present process-based approach to business is broken. Business processes expect a deterministic output; they rely on repeatability and explicit workflows, which often proves far removed from the nature of human relationships. The cognitive process, by contrast, is a non-linear mechanism, able to make sense of disjointed information. Cognition calls not for processes but for patterns. Furthermore, processes suit machine-to-machine communication perfectly. Human-to-machine communication needs to take into account user experience, which can hardly be reduced to processes, and human-to-human communication is all about weak signals and pattern recognition.

Knowledge work is all about patterns

Venessa Miemis has written a great post about the importance of pattern recognition in the cognitive process. To quote her: “there are strong and weak signals all around us, patterns, which indicate a change has happened, is happening, or has the potential to happen”. Business processes work as long as nothing changes, or at least changes slowly, which happens less and less in present business environments. Dynamic patterns, by contrast, are emergent phenomena of complex systems. They are highly adaptive and relate not only to existing flows (whether knowledge, work, customer journeys, etc.), but also to how these flows change over time. In other words, they can be harnessed as predictive tools as well as aids in designing operational routines. A simple change in an underlying process might translate into huge and fast modifications of the related pattern. Looking at the way patterns change (sometimes dramatically) in our networks provides critical clues on how to improve broken processes, or on when to seamlessly switch to another one.

Here is a short summary of the characteristics of dynamic patterns versus processes:

Processes | Patterns
Linear | Non-linear
Designed on purpose | Emergent and self-organizing
Inside-out | Mostly outside-in
Hard to change | Highly adaptive
Need stability to perform | Require instability to form
May cause the formation or modification of a single pattern | May emerge from multiple different processes

Patterns are already used in business contexts. Emergent practices leveraged from online communities are patterns. Ethnography, and many design-thinking methods, invoke pattern recognition to decipher customers’ behavior. Social learning implies the use of patterns in knowledge acquisition. Dynamic patterns are much better adapted to knowledge work than business processes are.

Since patterns can be broken down into processes, monitoring the evolution of patterns in networks represents a promising way to handle the exceptions crippling most processes in which human interaction is involved. Integrating pattern recognition into work might require dedicated competencies, but it also requires new approaches. Adaptive Case Management is a promising framework for dealing with knowledge flows rather than processes, given that we should focus not only on information but also on the way information, and our connections to it, change over time. The time has come to understand that information is not only the blood of our networked organizations, but also their bones.
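To make "watching how a pattern changes" a little more concrete, here is a minimal sketch of monitoring a flow metric (say, daily message volume in an internal network) for a sudden shift. The window size, the threshold, and the deviation rule are illustrative assumptions of mine, not anything prescribed in the post:

```python
# Minimal sketch of detecting a pattern shift in a flow metric.
# Window size and threshold are arbitrary illustration values.

from statistics import mean, stdev

def pattern_shift(series, window=5, threshold=3.0):
    """Return indices where a value deviates from the trailing
    window's mean by more than `threshold` standard deviations."""
    shifts = []
    for i in range(window, len(series)):
        recent = series[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            shifts.append(i)
    return shifts

# A stable flow that suddenly jumps: the jump, not any single value,
# is the signal worth acting on.
flow = [10, 11, 10, 12, 11, 10, 11, 40, 41, 40]
print(pattern_shift(flow))  # prints [7]
```

Note that only the first jump is flagged: once the window absorbs the new level, the "pattern" has changed and the detector adapts, which is precisely the emergent, self-adjusting behavior the table above attributes to patterns.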

Net Fiscal Stimulus Results in Lower Country Borrowing Costs

The following is a summary of a new National Bureau of Economic Research paper about the effects of fiscal stimulus during the recent (on-going?) recession. It presents the counterintuitive result that countries investing in more fiscal stimulus enjoyed lower subsequent borrowing costs. The implication is that much of the rhetoric coming from Britain's governing Tories and the US Republicans about the benefits of budget cutting and denying stimulus funds may simply be wrong. The paper is available for purchase on-line per the directions below. Thanks to Erik Brynjolfsson for the tip:

"Net Fiscal Stimulus During the Great Recession"

Joshua Aizenman, Gurnain Kaur Pasricha
NBER Working Paper No. 16779
Issued in February 2011


This paper studies the patterns of fiscal stimuli in the OECD countries propagated by the global crisis. Overall, we find that the USA net fiscal stimulus was modest relative to peers, despite it being the epicenter of the crisis, and having access to relatively cheap funding of its twin deficits. The USA is ranked at the bottom third in terms of the rate of expansion of the consolidated government consumption and investment of the 28 countries in sample. Contrary to historical experience, emerging markets had strongly countercyclical policy during the period immediately preceding the Great Recession and the Great Recession. Many developed OECD countries had procyclical fiscal policy stance in the same periods. Federal unions, emerging markets and countries with very high GDP growth during the pre-recession period saw larger net fiscal stimulus on average than their counterparts. We also find that greater net fiscal stimulus was associated with lower flow costs of general government debt in the same or subsequent period.



You may purchase this paper on-line in .pdf format from SSRN.com ($5) for electronic delivery

Monetizing Social Media: The Data Mining Opportunity

While no one can gainsay the global embrace social media has received from an adoring public, figuring out how to make 'cash money,' as the saying goes, has proved more of a challenge.

In the following Mashable article, Chris Boorman explains how data mining may be the answer for which so many have been searching:



"The thinking about social media in corporate marketing departments is rapidly evolving. Initially, social media was seen as yet another broadcast opportunity for pushing messages out into the world, and for many companies that view persists. A social media consultant recently said that even today, when he approaches potential clients for the first time, they typically refer him to their PR agency, because “they handle Facebook for us.”

There’s nothing wrong with using social media as a tool for disseminating marketing messages or trying to establish deeper relationships with current or potential customers. However, there is another use of social media which may prove to be more powerful over the long term: Listening to the voice of the customer by data mining social networks.

Currently, CRM systems create customer profiles to help with marketing decisions using a combination of demographics and prior behavior, primarily historical buying patterns. These systems essentially enable companies to see their customers in the rear view mirror.

The customer data available via online communities like Facebook is both richer and more forward looking. A financial organization with access to such data would not only know that a customer had a checking account, savings account, two CDs and a mortgage, but also that the same customer was interested in golf or gourmet cooking — information that could be useful in planning future marketing initiatives. Every minute of every day, Facebook, Twitter and other online communities generate enormous amounts of this data. If it could be tapped, it could function like a real-time CRM system, continually revealing new trends and opportunities. Here’s how.

Tapping Social Media Data

The good news is that with today’s technology, this data can be tapped. But the process is not without its challenges. The data stream is a prime example of “Big Data.” Dealing with data sets measured in petabytes is a challenge in itself, and there is a serious problem with the signal-to-noise ratio. At my company, we estimate that at best, only 20% of the social media data stream contains relevant information. But before this problem even arises, companies face the issue of identifying their customers among the millions of participants in any given online community.
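To make the signal-to-noise problem concrete, here is a minimal sketch of a relevance filter; the brand terms and sample posts are invented for illustration, and a real system would use far more sophisticated classifiers than keyword matching:

```python
import re

# Hypothetical terms a retail bank might consider "relevant";
# everything else in the stream is treated as noise and discarded.
BRAND_TERMS = re.compile(r"\b(mortgage|savings|checking|acmebank)\b", re.IGNORECASE)

def filter_relevant(posts):
    """Keep only posts that mention at least one brand-relevant term."""
    return [post for post in posts if BRAND_TERMS.search(post)]
```

As the article suggests, a filter of this kind typically passes only a small fraction of the raw stream on to downstream analysis.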

The Problem of Customer Identity

Most companies approach the problem of finding customers on social sites through the slow, arduous and expensive process of participating themselves. On Facebook, for example, businesses can gain access to the profiles of anyone who clicks the “Like” button on the company’s business site (depending on each customer’s privacy settings). With the right pitch, offer or game, companies can gradually gain an enhanced understanding of a subset of their social customer base.

With new matching technology that’s now available, the process is faster and more comprehensive. For example, matching technology uses artificial intelligence to figure out whether a given “John Smith” in a company’s customer database is the same individual as a particular John Smith on Facebook. The algorithms that accomplish this are extremely sophisticated, and they work. In fact, matching technology has been successfully used by law enforcement agencies to locate criminals.

If a company has one or two key pieces of information about its customers — e-mail address is often the most important — that company can accurately identify them on a social site and extract a substantial amount of data, including both profile data and transactional data that can reveal relationships important for marketing purposes. (Again, the amount of data available for any given customer depends on that customer’s personal privacy settings.)
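As a rough illustration of how such matching might work (a toy sketch, not the actual commercial technology; the field names and the 0.85 threshold are assumptions), one can combine an exact match on a normalized e-mail address with fuzzy name comparison:

```python
from difflib import SequenceMatcher

def normalize_email(email):
    """Lower-case and strip whitespace so equivalent addresses compare equal."""
    return email.strip().lower()

def name_similarity(a, b):
    """Rough 0..1 similarity between two names (difflib ratio)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_customer(crm_record, social_profiles, name_threshold=0.85):
    """Return the social profile most likely to be this CRM customer.

    An exact e-mail match is treated as decisive; otherwise fall back
    to fuzzy name matching above a confidence threshold.
    """
    email = normalize_email(crm_record["email"])
    best, best_score = None, 0.0
    for profile in social_profiles:
        if email and normalize_email(profile.get("email", "")) == email:
            return profile  # e-mail is the strongest identifier
        score = name_similarity(crm_record["name"], profile.get("name", ""))
        if score >= name_threshold and score > best_score:
            best, best_score = profile, score
    return best
```

Real matching engines weigh many more signals (address, phone, transaction history), but the two-tier idea (decisive identifier first, probabilistic fallback second) is the same.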

Putting Data to Work

The second problem with social media is transforming data that is potentially useful into data that is actually useful. Social media data is generated by an entirely different technology stack than the transactional data that typically feeds CRM systems. Accordingly, it is stored in entirely different formats. That data can be transformed into a useful format with Master Data Management (MDM) technology.

MDM is the process of managing business-critical data, also known as master data (about customers, products, employees, suppliers, etc.) on an ongoing basis, creating and maintaining it as the system of record for the enterprise. MDM is implemented in order to ensure that the master data is validated as correct, consistent, and complete.

MDM has been used for more than a decade by companies that want to integrate disparate databases for a 360 degree view of their customers (or product portfolios, for that matter). It is equally effective in integrating social media data into existing CRM systems, and filtering that data for relevance.
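A minimal sketch of the field-level consolidation step MDM performs; the source names, trust ranking, and survivorship rule below are all invented for illustration:

```python
# Hypothetical trust ranking: for each field, prefer the most trusted source.
SOURCE_PRIORITY = {"crm": 2, "social": 1}

def build_master_record(records):
    """Merge per-source records for one customer into a single master record.

    records: list of (source, dict) pairs. For each field, the value from
    the highest-priority source that actually supplies it wins.
    """
    master = {}
    ranked = sorted(records, key=lambda r: SOURCE_PRIORITY[r[0]])
    for source, data in ranked:  # lowest priority first; higher overwrites
        for field, value in data.items():
            if value not in (None, ""):
                master[field] = value
    return master
```

So a customer's e-mail of record survives from the CRM system even when a stale one appears on a social profile, while social-only fields such as interests are merged in alongside it.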

What this all means is that companies can achieve important process improvements with bottom-line significance. For example, they can:

Obtain behavioral data that will allow them to more appropriately target segments for better marketing results.

Obtain data on personal preferences and interests to move closer to a true one-to-one relationship with their customers.

The disciplined use of demographic and historical customer data has enabled large numbers of companies to substantially increase the effectiveness of their marketing campaigns. Social media data will enable marketers to take targeting to the next level. It’s Big Data, but today’s technology can handle it.

Short This Idea: Why Most Start-Up Acquisitions Fail

One of the business world's most well-known 'secrets' is the appalling failure rate of corporate acquisitions, generally put at 65% or higher. Arbitrageurs and investors know that when a merger or acquisition is anticipated or announced, the acquirer's stock price drops because the market understands the drag on performance such deals usually create. The reasons are many and focus on intangibles: understandable employee concern about what the deal means for them, executives clashing over who will prevail, computer system synchronization problems, cultural differences, compensation questions, and customer concerns about how they will be treated. The litany is a long one.

In the following article by Matthew Ingram in GigaOm via Business Week, the even more disturbing record of start-up acquisitions is dissected:


"Yahoo! has a pretty miserable track record when it comes to startup acquisitions, a roll call of the doomed and soon-to-be forgotten that includes Flickr, Delicious, MyBlogLog, and several others that may or may not be trapped in "sunset" mode. It's not just Yahoo, of course: Google has also made a series of startup acquisitions that went nowhere, including the purchase of Dodgeball—which languished until founder Dennis Crowley left to create Foursquare—and the acquisition of Blogger, which also withered on the vine for the most part after the company bought it. The reality is that most of these big startup acquisitions fail, and they likely always will.

The stories 37signals collected in its post give a sense of why Yahoo's acquisitions did so poorly; it's like the movie Groundhog Day, except instead of Bill Murray reliving the same day over and over in Pennsylvania, it's startup founders like Joshua Schachter of Delicious repeatedly running into big-company bureaucracy, combined with a toxic mishmash of technological determinism, ignorance, and outright incompetence.

A Flickr developer's tale of how Yahoo continually tied up development at the photo-sharing service is a perfect example: Most of the unit's time—85 percent, in fact—was spent dealing with the Yahoo bureaucracy, says Kellan Elliott-McCrea, and months were wasted trying to migrate Flickr's API to the mandated Yahoo equivalent. Oh, and Yahoo also starved its new acquisition of resources, which made it impossible to add new features or expand to remain competitive. As a result, Facebook ate the company's lunch in the photo-sharing market. Schachter, meanwhile, has described how he was effectively shunted aside and not allowed to have any input into the design or development decisions around Delicious. He called his time at Yahoo "an incredibly frustrating experience."

Missing the Right DNA
Are some of these kinds of complaints a result of founder egos clashing with big company management processes? Perhaps. And a number of commenters on the 37signals post noted that some startups have no choice but to be acquired, because they have no real business model. But the biggest single reason why startup acquisitions fail to have any impact on the company that acquires them is that large companies such as Yahoo and Google in many cases don't have the institutional know-how or the internal DNA to take advantage of them. And it's not just Google and Yahoo—large companies of all kinds require large infrastructure, and that means layers and layers of management processes, departments, committees, and boards, not to mention alignment with strategic goals, revenue targets, marketing messages, and so on.

Startups grow and succeed in some ways because they don't have any of those things. In most cases, they are poorly funded and inadequately managed collections of misfits powered solely by a passion and determination that borders on mania. The marketing, payroll, and IT departments are frequently a single person, so getting them to agree on something isn't usually a problem. They can move quickly—and make mistakes quickly—and that can make all the difference. The most that a big company can hope for is to get a startup founder who can make the transition (as FriendFeed founder Bret Taylor, now CTO of Facebook, appears to have done).

The flipside of all this, of course, is that founders whose startups get acquired and then smothered can go on to do some incredible things. Crowley started Foursquare, which is what Dodgeball could have been, and Evan Williams started what became Twitter. And did the cash and notoriety they got from being acquired help them do so? Undoubtedly (although Delicious founder Schachter said that if he had to do it over again, he would gladly give up the cash and not have sold to Yahoo).

So large companies like Google and Yahoo will no doubt continue to try to inject some startup DNA into their corporate bloodstreams—and in an overwhelming number of cases, they will fail. And startup founders will continue to cash in, and then cash out.

Metrics At The Movies: Handicapping The 2011 Academy Awards

The 2011 Academy Awards "Oscar" telecast is Sunday, February 27. Nate Silver of the FiveThirtyEight blog tries to take the mystery out of the selection process by looking at the data to determine who will win in various categories this weekend.


"Two years ago, I tried to predict the Oscar winners for New York Magazine by crunching data from the last 30 years of awards history. The project was a mixed success. I got 4 out of 6 categories right, an acceptable score on paper. But several of the choices (like “Slumdog Millionaire” for Best Picture) were giveaways and conversely, the system’s longshot pick — Taraji P. Henson for her role in “The Curious Case of Benjamin Button” — was a total flop.

Okay, so the system isn’t exactly Watson (not that Watson doesn’t make mistakes too). Nor, really, could any system be when trying to predict the behavior of the relatively fickle group of human beings who make up the Academy of Motion Picture Arts and Sciences.

But I’ve dusted off the database (literally: it was on an old desktop that had begun to gather moss) and aimed to simplify it — boiling it down to a few core factors that have been especially reliable predictors in the past. The idea is to focus on those variables that have some logical meaning rather than being statistical artifacts. Here, then, are the rules-of-thumb for predicting the Oscars:

1. By far the best predictors are the winners of other major awards, like the Golden Globes. That isn’t rocket science, I know. But there is some utility in knowing which awards have the best track records in which categories. In the best picture category, for instance, awards given out by ‘outsiders’ like critics tend to be far less reliable predictors than those given out by professionals like directors and producers.

2. A nomination for best picture is a boon in the other categories. If one nominee for best actress appears in an Oscar-nominated film and another does not, the one in the nominated film is more likely to win. Unfortunately, now that 10 rather than 5 movies are nominated for Best Picture, this state of affairs is less likely to happen.

3. The Academy — which can take itself very seriously — is relatively unfriendly toward comedies. If two candidates otherwise seem tied, lean toward the more dramatic film. The exception is in the supporting actor and actress categories, where the Academy likes to have a bit more fun and playing comedic or otherwise quirky and offbeat roles may actually be an advantage.

4. Hollywood has some tendency to “spread the wealth” — generally, it hurts a nominee’s chances if she’s won in her category before. The converse is also somewhat true — if someone has been nominated a lot but has not won, they may build up some sympathy points. This is not absolute, however — otherwise, Meryl Streep would not have been shut out in her last 11 best actress nominations.

That is pretty much what we have to work with. By contrast, other variables like release dates, Rotten Tomatoes scores and box office grosses (otherwise, how could “Avatar” have been upset last year?) don’t seem to matter, at least not once you’ve accounted for these other factors.
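Silver doesn't publish his model's internals, but the rules of thumb above amount to a weighted scoring system. A toy sketch, with the feature names and every weight invented purely for illustration:

```python
# Toy scoring model in the spirit of the rules above; these are not
# Silver's actual features or weights.
WEIGHTS = {
    "guild_awards": 3.0,           # rule 1: professionals' awards predict best
    "critic_awards": 1.0,          # rule 1: critics' awards matter less
    "best_picture_nominee": 1.5,   # rule 2: appearing in a nominated film helps
    "is_comedy": -1.0,             # rule 3: the Academy leans dramatic
    "prior_wins": -0.5,            # rule 4: spreading the wealth
    "prior_losses": 0.3,           # rule 4: sympathy points
}

def score(features):
    """Weighted sum of a nominee's features (missing features count as 0)."""
    return sum(WEIGHTS[name] * features.get(name, 0) for name in WEIGHTS)

def predict_winner(nominees):
    """nominees maps each name to its feature dict; return the top scorer."""
    return max(nominees, key=lambda name: score(nominees[name]))
```

A nominee who swept the guild awards in a best-picture-nominated drama would outscore a critics' favorite in a comedy, which is essentially the argument the column goes on to make category by category.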

Of course, there might also be all sorts of intangible dimensions to voter psychology that are fun to speculate about, but are hard to quantify. And the Academy can go through different moods — recall for instance, its tendency to favor glossy but somewhat vapid films like “Terms of Endearment” during parts of the 1980s. So while our database goes back to 1979, we put more weight on more recent winners.

Here is how the system handicaps the odds in the six major categories:

BEST PICTURE As I’ve noted, although “The King’s Speech” and “The Social Network” have won a roughly equal number of awards, “The King’s Speech” has won those that matter most, like the awards from the directors’ and producers’ guilds. The statistical case for “The Social Network” rests on its victory at the Golden Globes, which does have some predictive power; the psychological one probably depends on it now having become the underdog since “The King’s Speech” has been on such a winning streak. Nevertheless — although we’re retiring the pretense of decimal-point precision this year in favor of a softer, gentler approach — “The King’s Speech” is overwhelmingly more likely to win.

BEST DIRECTOR Often the most boring award since it so closely tracks to best picture, but this year is a potential exception. Betting markets, even though they have “The King’s Speech” heavily favored for best picture, have “The Social Network’s” David Fincher slightly favored for best director. And our system likes Mr. Fincher, too.

The most formidable piece of evidence is that the awards were split in exactly this fashion at the Baftas (the British equivalent of the Oscars), with “The King’s Speech” winning best film but Mr. Fincher best director. Also, when there has been a split between the two categories, it is sometimes the more adventurous film (think “Brokeback Mountain” to “Crash”) that wins best director while the other wins the big prize; this can be observed, for instance, in the fact that awards given out by critics (almost all of which were won by “The Social Network”) do have some predictive power for Best Director, which they do not for best picture.

Another factor is that if it does not win best director, “The Social Network” may be entirely shut out of the major awards; just one of its actors (Jesse Eisenberg) was nominated and he is unlikely to win. Finally, going by the sympathy points theory, Mr. Fincher has been nominated before (for “Benjamin Button”, which was also shut out) while the director of “The King’s Speech,” Tom Hooper, has not.

Are you persuaded? It’s a tentative case — and notably, Mr. Hooper won the Directors Guild of America award, which is the single best predictor of the lot. But you have to take a few risks to win an Oscar pool, and predicting the split here is a pretty decent one.

BEST ACTOR No need to get fancy here: Colin Firth has swept every major award and is the overwhelming favorite.

BEST ACTRESS. Annette Bening won the Golden Globe for her role as Nic in “The Kids Are All Right”, but Natalie Portman has won the majority of awards and — recalling our rule-of-thumb from above — the Academy tends to prefer serious roles to comedic ones when the choice is otherwise close. Plus, everyone seems either to have loved “The Black Swan” or thought it so terrible that Ms. Portman deserves some empathy for having competently played such a ridiculous character (guess which group I’m in?). A small factor helping Ms. Bening is that she has twice been nominated before without winning (for “American Beauty” and “Being Julia”), but this is Ms. Portman’s award to lose.

BEST SUPPORTING ACTOR. Geoffrey Rush made this mildly more interesting by winning the Bafta. But despite one recent coup — the Baftas rightly picked Alan Arkin for “Little Miss Sunshine” when most other awards did not — the Brits have a fairly poor track record in this category and the weight of the evidence points toward Christian Bale for his performance as a crack-addicted former boxer in “The Fighter.” If you wanted to pick a long-shot, in fact, you might do just as well to go with Mark Ruffalo from “The Kids Are All Right,” since his was the only comedic performance nominated and since that’s actually an advantage in this category.

BEST SUPPORTING ACTRESS Here too, the Baftas split from the other awards by picking Helena Bonham Carter rather than Melissa Leo. But the victory was a bit tainted since neither Ms. Leo nor the lovely Hailee Steinfeld (whom the Baftas quite rightly considered a leading actress for her role in “True Grit”) was nominated.

So could Ms. Steinfeld win instead? She could; this is among the hardest categories to predict, and we did adjust the system some for the fact that there is some confusion over her role. Nevertheless, Ms. Leo won both the Golden Globes and the Screen Actors’ Guild Awards in direct competition with her, and so the case for Ms. Steinfeld is more sentimental than statistical.

Asymmetric Threats and Coevolution: Israeli Army Faces Diplomatic, Legal As Well As Military Challenges

The advent of the internet age forced organizations to accept that the greatest threats to their survival could come from directions they had not previously anticipated. Books like Harvard Business School Professor Clayton Christensen's "The Innovator's Dilemma" highlighted the need to look beyond the boundaries of one's company, industry or profession to understand the nature of competition. Offshoring US legal work to India, insurance companies offering banking services, and trading access to markets for intellectual property are all examples of this sort of asymmetric threat, i.e., a threat that comes from a source, direction and time that is not consistent with traditional perceptions of organizational strategy.

In a thoughtful article from the Financial Times, Tobias Buck analyzes how the Israel Defense Forces' success on the battlefield has forced its enemies to innovate in coevolutionary ways that may not require military force, but may be much more dangerous to the country's long-term survival.

"The superiority of the Israel Defense Forces has been evident for so long that no regular army has dared to challenge them on the battlefield in almost 40 years.

For the time being, Israel’s most immediate foes – Hizbollah in neighbouring Lebanon and Hamas in the Gaza Strip – appear to want calm. Israel fought deeply controversial military campaigns against Hizbollah in 2006 and against Hamas in 2008-09. Though neither conflict produced a conclusive victory, the IDF inflicted sufficient damage to create a deterrent, and forced a sharp drop in rocket attacks from Gaza.

What is more, Israel’s military edge will increase in the coming years. The air force has ordered 20 American F-35 fighter jets, the world’s most advanced attack aircraft. The navy will receive two new submarines. Israel is pouring money into missile defence systems. In recent years, military leaders have also worked hard to spruce up the capabilities of the country’s conventional land forces.

But despite the IDF’s towering position, some analysts have started asking difficult questions – and are drawing increasingly harsh conclusions. They fear the military is facing serious challenges, on and off the battlefield, that will ultimately blunt its abilities and erode the country’s strategic advantages. At a time of unprecedented upheaval in the region, with the peace process lying dormant once again, their arguments should worry Israeli policymakers.

POLITICS AND THE MILITARY

One of the most striking features of Israeli politics is the sheer number of former military leaders that crowd around the cabinet table and populate the Israeli parliament.

Ehud Barak, the former prime minister and current defence minister, is a former Israel Defense Forces chief of staff; Moshe Yaalon, the strategic affairs minister, is another. A third former IDF chief, Shaul Mofaz, currently serves as deputy leader of the opposition Kadima party.

Benjamin Netanyahu, the prime minister, was never a career soldier. However, his curriculum vitae does boast a stint in the IDF’s elite Sayeret Matkal special forces unit (where his commanding officer was Mr Barak). Like many politicians, Mr Netanyahu has found that a distinguished record of service in the forces is a valuable asset in the eyes of Israeli voters.

The reasons for this are easy to see: the IDF has long been the most admired institution in the country, seen as both meritocratic and crucial for Israel’s survival. Whoever makes it to the top of the military is therefore seen almost by definition as worthy of the highest political office. As Israel is also a country that often engages in armed conflict, Israelis want their political leaders to have a thorough grounding in military and security matters.

Not everyone is convinced, however, that this concentration of military expertise in the political sphere is good for decision-making. One danger is that ex-generals serving as ministers involve themselves in operational details and lose sight of their real function in setting overarching policy goals. As a result, the dividing line between civilian and military responsibilities becomes blurred, undermining the overall war effort.

Indeed, both the 2006 Lebanon war and the 2008-09 war in the Gaza Strip were notable for the absence of clearly defined war aims. In the case of Gaza, the government gave the lofty ambition of improving the “security situation” in southern Israel. In the case of Lebanon, it aimed for the return of two abducted Israeli soldiers – a goal that a subsequent committee of inquiry derided as “over-ambitious and not feasible”.

In an attempt to tackle the blurring divide between generals and politicians, parliament passed a law in 2007 forcing senior officers to wait at least three years after their retirement before entering politics. Judging by the clamour among parties keen to recruit Gabi Ashkenazi, who stepped down as IDF chief this month, the law has yet to make a big impact.

Analysts point to the sheer number of threats with which the military must deal, from Iran’s nuclear programme to the growing arsenal of rockets and missiles at the disposal of Hizbollah and Hamas. Adding to these concerns is the recent overthrow of the Hosni Mubarak regime in Egypt, and consequent fears that a cornerstone of Israel’s security – the 1979 peace treaty with Cairo – could ultimately unravel.

Meanwhile, the arrival of Iranian warships in the Mediterranean this week, the first such voyage since 1979, was a fresh reminder of Tehran’s determination to expand its influence. In response, prime minister Benjamin Netanyahu warned: “Israel’s security needs will grow, and the defence budget must grow accordingly.”

At the same time, there is concern that the Israeli military, for all its prowess, is increasingly shackled by the country’s growing isolation and a shift in strategy by its enemies. Some Israelis also fear that the 44-year occupation of the Palestinian territories and two decades of diplomatic deadlock have weakened the IDF, not least by removing international support for Israeli military action.

Indeed, Lt Gen Benny Gantz, who took over as IDF chief of staff this month, faces an entirely new set of legal and political constraints. He leads a military whose reputation – at least outside the country – has been tarnished by allegations of war crimes, most recently after last year’s attack on the Gaza aid flotilla. The wave of recent criticism has in turn triggered a concerted international effort to bring senior Israeli officers to court to face criminal charges.

Even among Israelis, the army has lost some of its lustre after a string of scandals involving its leaders. Lt Gen Gantz was picked at the last minute after the frontrunner faced accusations that he illegally took over neighbouring land to expand his home. The revelations followed a series of widely publicised scandals and incidents, including allegations of a dirty tricks campaign linked to the succession.

The real problem for the IDF, however, lies not so much in the human fallibility of senior officers but in their inability to formulate a coherent response to a changing security environment. That, at least, is the thesis advanced by Ron Tira, an Israeli military analyst and a former air force pilot. “We are now facing a new warfare paradigm by the enemy. The old approaches are not very useful, we need to come up with something new – and we are not there yet,” he says.

The threat today is not invasion or battlefield defeat. Instead, argues Mr Tira, Israel’s enemies in Iran, Syria, southern Lebanon and the Gaza Strip have launched a war of attrition aimed at the “long-term erosion of the Israeli will and the long-term erosion of Israeli legitimacy”. The approach cleverly combines political and military elements, conventional and non-conventional warfare, and draws on the international community’s increasing frustration with Israel.

For the proponents of this approach, military defeat can be transformed into political success. Mr Tira notes that Israel may have deterred Hizbollah and Hamas – but at the cost of its own diplomatic standing. “We cannot do these wars every two years,” he argues. At the very least, future campaigns will have to achieve their goals as quickly as possible – both because of diplomatic pressure and because Israel’s biggest cities will be facing attack from potentially thousands of rockets and missiles. “This is the first kind of threat that we don’t know how to remove,” Mr Tira says.

His assessment is far from universally shared, but even serving IDF officers acknowledge the challenges are growing. One senior officer, speaking on condition of anonymity, says the IDF face four distinct threats: non-conventional weapons, especially the Iranian nuclear programme; conventional armies, whose potential is increasingly married to “asymmetric” capabilities from militias and non-conventional military operators such as Hamas and Hizbollah; the fast-expanding missile and rocket arsenal of those groups; and terrorism. “Israel’s deterrence power is still overwhelming. But in all these areas the trends are negative,” the officer says.

Among the biggest worries for military planners is the prospect of a massive missile and rocket attack. Hizbollah, in particular, has built up a vast arsenal in recent years, estimated by Israel to include 40,000 to 50,000 rockets and missiles. These include an unspecified number supplied by Iran and Syria able to reach targets deep inside Israel, including Tel Aviv and surrounding population centres. Hamas, too, has beefed up its arsenal since the Gaza war ended.

Despite the perception that Middle East wars have always been “terrible”, in past conflicts adversaries made an effort to keep civilian populations out of the fighting, says Shlomo Brom, a former director of the IDF’s strategic planning division, now a senior research fellow at the Institute for National Security Studies. “The big change is that the civilian population is now being drawn into the war.” Mr Brom believes this shift will force “Israel to fight very short wars. And it forces Israel to define realistic goals for these wars.”

Lurking behind the debate over the IDF’s challenges and capabilities is an altogether more controversial issue: the forces’ role in maintaining the occupation of the West Bank and, until 2005, Gaza. It is a role that has come increasingly to define the Israeli military in the eyes of the world, gradually eclipsing earlier images of heroism and daring achievement.

Analysts such as Martin van Creveld, Israel’s best-known military historian, argue that the long years and countless soldiers that the IDF have devoted to policing a civilian population, manning checkpoints and scuffling with protesters has sapped the military’s strength. “If you fight the weak, you become weak. And we have been fighting the weak for far too long,” he says.

The occupation puts the IDF in a position where they have everything to lose and nothing to gain, he adds. “When you fight the weak and you kill the weak, then you are a criminal. And when the weak kill you, then you are an idiot. That is the dilemma.”

This glum view of the IDF’s predicament is not shared by Israeli leaders, who rarely miss an opportunity to praise the armed forces. In some ways, their confidence is justified: the forces are, and will almost certainly remain, technically capable of dealing with all security threats that arise. That assessment, analysts say, extends even to the scenario, still highly improbable, of a renewed confrontation with Egypt.

Yet few can seriously doubt that – in political terms at least – the IDF’s room for manoeuvre is shrinking. What use, some may ask, is overwhelming firepower when the international community prevents its use? And how effective are new American fighter jets and German submarines when the enemy is targeting Israel’s legitimacy as well as its cities?

International pressure alone may not be enough to prevent Israel ordering an attack, especially when and if the country concludes it faces an existential threat. But the political price of military action is rising steadily, making a repeat of the recent campaigns fought in Lebanon and Gaza increasingly difficult. For the current crop of generals, this new military and political environment poses a challenge very different from that faced by their predecessors.

“In the 1970s, the Israeli chief of staff faced a lot of problems, but he did not have an intellectual challenge,” says Mr Tira. “Today, the question of how to apply power is a major problem – the borders between the military, the political and the legal are becoming more and more blurred.”

Feb 24, 2011

Lonely At the Top: Apparently TOO Lonely - Harvard and Princeton Reinstate Early Decision

Even elite institutions have to pay attention to the competition. Harvard, Princeton and the University of Virginia, three of the US's most elite undergraduate institutions, announced three years ago that they were eliminating early decision because research showed it favored higher-income students from schools in better neighborhoods (and therefore discriminated against poorer students who might be just as qualified academically).

They did so to the applause of many who felt it was a noble thing to do and, besides, what high school senior in his or her right mind would turn down even the possibility of Harvard or Princeton?

Well, it seems a LOT of high school seniors, stressed after four years of college admissions pressure, were only too happy to get it over with and accept admission from marginally less competitive colleges (Brown, Stanford?) in order to be able to relax a bit for the balance of their senior year. Does this mean sanity is returning to college admissions? No. It means competition lives, and even the most competitive institutions - which accept approximately 1 out of every 30 applicants - are not willing to trade "We're #1" status for fairness.

From Cathy Rampell at the New York Times Economix blog:

"A real-life allegory on the perils of unilateral action: First Princeton tried to be the leader on grade deflation, but no one followed. Then Harvard and Princeton decided to end their early admission programs, on the grounds that they were unfair to poor students. Again, apparently few schools followed suit.

While Princeton has still held firm on its stricter grading policies, both Princeton and Harvard on Thursday reinstated their early admission programs. (The University of Virginia, which had also ended its early admission program with great fanfare, gave in last year.)

From the Daily Princetonian article:

“We have carefully reviewed our single admission program every year, and we have been very pleased with how it has worked,” Princeton President Shirley M. Tilghman said in a University press release. “But in eliminating our early program four years ago, we hoped other colleges and universities would do the same and they haven’t.”

Tilghman explained that one consideration that played into the University’s decision was that high school students would apply to other schools early even if they thought of the University as their first choice.

“By reinstating an early program, we hope we can achieve two goals: provide opportunities for early application for students who know that Princeton is their first choice, while at the same time sustaining and even enhancing the progress we have made in recent years in diversifying our applicant pool and admitting the strongest possible class,” Tilghman said.

And from the 2006 press release in which Princeton announced it was ending the early admissions program:

“We agree that early admission ‘advantages the advantaged,’” Tilghman said. “Although we have worked hard in recent years to increase the diversity of our early decision applicants, we have concluded that adopting a single admission process is necessary to ensure equity for all applicants.”