A Blog by Jonathan Low

 

Aug 16, 2014

Internet Trolls Turn Out To Behave That Way Offline Too

The recent deaths of celebrities like Robin Williams and the arrest of two journalists recharging their phones in a McDonald's while covering the Ferguson, Missouri racial clashes have highlighted the persistent problem of internet trolling and bullying.

The issue has taken on added salience as the gap between our online and 'real' lives narrows to the point of nonexistence.

One of the questions raised was whether internet trolling is a unique phenomenon fueled by the relative anonymity and distance the 'net provides or whether it is symptomatic of a broader pathology.

Research into the causes of internet trolling has established that it is directly related to a broader pattern of anti-social behavioral traits. The inescapable conclusion is that when you're a digital, electronic internet jerk, you're a complete, all-around human jerk. JL

Jordan Lewis reports in The Guardian:

Trolls are, by far, more likely to have narcissistic, Machiavellian, psychopathic, and sadistic personality traits

Ferrari-nomics: The Benefits of Scarcity

Innovation, exclusivity or ego?

Ferraris are known for the high prices they fetch. Not very many of them are made in any given year, their designs are iconic and for those who can afford to treat them in the manner which they require, they deliver a memorable driving experience.

But one of the truly extraordinary things about them is that their value frequently increases with age. As the following article explains, dealers are rumored to claim that they can make more money selling old models than new ones (though few, if any, would ever be quoted as saying so).

The fact is that careful nurturing of the company's intangibles - brand, reputation, quality, customer experience - adds significant tangible value to the product's economic worth. It is not just that the product holds its value; the company invests in the factors that contribute to the perception of that value, and that, in turn, produces its extraordinary financial as well as operational performance. JL

Kyle Stock reports in Business Week:

“I don’t think, in general, people that spend a lot of money on things are idiots."

Why Tech Now Wants to Delight Rather Than Deliver: The Meaning Behind The Industry's Favorite New Word

There was a time when tech companies couldn't wait to impress consumers with the power and functionality of their devices. They were thrilled to be able to share the most obscure features of the most arcane components, this despite plenty of research showing that most customers had no idea what they were talking about and couldn't have cared less even if they knew.

The tech companies just couldn't help themselves because they were smitten with their own genius and wanted to share the joy.

It was cute, in its own rather boring way, especially when it became apparent through research that the average customer used about 10 percent of a smartphone's features, for example, and never opened the manual, whether on paper or online.

The reality was that people expected functionality, convenience and ease of use. In economic terms, the market announced that it was not going to pay producers a premium for all that workaday stuff because it was considered the ante to stay in the game. Keep up or die. So the manufacturers and marketers had to find something else to differentiate themselves from the competition. Whereupon 'delighting' customers became the new marketing corollary to Moore's Law.

There were, of course, a couple of other reasons to do so: Apple had established design as a feature for which people would pay extra, so just as Hollywood produces copycat sequels, everyone in tech discovered clean lines and cool materials.

But another rather more defensive incentive was lurking in the background: as tech increasingly becomes associated with job destruction and economic inequality, the industry is attempting to disassociate itself from that dystopian reality and recapture the magic that won the world's hearts, minds and wallets in the first place. JL

Kevin Roose reports in New York Magazine:

"The world is pretty bored with being able to accomplish tasks efficiently.” When did the titans of tech start talking like kinder­garten teachers?

Aug 15, 2014

The End of the Bromance, Bro?

Have we had it? Are we done? Does this mean we are finally, irretrievably getting bro'ed out?

Bud, buddy, my man and dude all have their adherents, but none have had quite the universal run that bro has enjoyed as a term of male - and sometimes female - endearment.

Part of the term's success was its universality. Black, white, Hispanic, Asian, geek, jock, hipster, flamer, whatever: anyone could be a bro. It even had international offshoots, like bra in former English-speaking colonies.

Dude had and still has a sometimes slacker-ish quality. Bud or buddy tends to be a bit country. But bro is all-encompassing: from brogrammers to brothas, we're all in. But it is that very ubiquity that may signal its demise as a catch-all code for the familiar and favorable. As the following article suggests, overexposure can lead quickly from parody to contempt. All of which signals that the bromance is over. JL

Matthew Malady reports in Slate:

Every genre ends in parody of itself in literary form, and there are ways in which slang also moves toward that parody status. So that when something is about ready to explode or die out, is when people have gotten it to the stage of parody.

What Neuroscience Is Teaching Us About Brain to Brain Selling

One person's manipulation is another's neuroplasticity. Or at least that is how the debate appears to be shaping up over the uses of neuroscience to effect economic outcomes.

This is the knowledge era, where all that Big Data is just aching to be applied in order to achieve some sort of result.

There are those who are queasy about this, fearing - perhaps justifiably - that the wisdom gleaned will be misused, which is to say focused in such a way that information asymmetries are created. That means that the knowledge is not evenly or uniformly distributed - or fully explained - putting one party at a distinct disadvantage.

The optimists assert that if all parties are given access to the same data - by no means assured - they are then responsible for making whatever judgments they believe to be in their best interests. In other words, let the market decide and the buyer beware. There are even those who claim that this is just a means of sharpening what is already a substantial body of knowledge about what is possible.

The truth is that we do not yet know enough to say with certainty how this will be used or whether its impact will change either behavior or outcomes. But we do know enough to sense that caution is probably advisable. JL

Lou Carlozo reports in The Exchange:

"Buying and selling are brain-to-brain processes. Our brains are pattern-making organs that have been programmed to look for cues, visual and emotional, that will determine the resulting behavior. Most of these cues are picked up by the brain before we are consciously aware."

The Coming Robot Apocalypse Looks More Like A Spreadsheet Than a Killer Drone

They're here. You probably didn't notice because the fact is you were looking for the wrong signal.

You've been conditioned by Hollywood action films and the most lurid press opinion pieces to expect mechanical avatars looking vaguely like Star Wars characters but with more human features: RoboCop with glasses and a briefcase.

But the reality is that they are as intangible as the power they wield. Because they are algorithms, software and statistical packages that manage our economic world. Not so much to eliminate it as to squeeze every imaginable efficiency out of it. They will not take your job by sitting mechanically in your chair, because that would be a waste of space and time. They are ephemeral, so they don't need a chair or space - and need very little time. They will identify opportunities to reduce or expand hours as projections indicate effort is needed, but with an eye towards keeping expenses, benefits and tax consequences to a minimum.

It's nothing personal, it's just the reductio ad absurdum of a process designed to eliminate the unnecessary. In a world focused on increasing returns, the best way to do that is by reducing the denominator. And in most organizational systems, the denominator is you.
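To make that concrete, here is a minimal, hypothetical sketch - all numbers invented, no actual scheduling product implied - of the arithmetic such software performs: staff each day to the demand forecast and no more, while keeping every worker below a benefits-eligibility threshold.

```python
# Hypothetical illustration of algorithmic shift scheduling: meet the
# forecast exactly, favor the least-used workers, and never let anyone
# cross the weekly-hours cap that would trigger benefits. Invented numbers.

FORECAST = {"Mon": 62, "Tue": 48, "Wed": 55, "Thu": 70, "Fri": 95}  # labor-hours needed
BENEFITS_CAP = 29   # weekly hours above which benefits kick in (assumed)
POOL = 12           # part-time workers available (assumed)

def assign_hours(forecast, cap, pool):
    """Spread forecast hours across the pool without anyone crossing the cap."""
    hours = {w: 0.0 for w in range(pool)}
    for day, need in forecast.items():
        remaining = need
        for w in sorted(hours, key=hours.get):  # fill least-used workers first
            if remaining <= 0:
                break
            grant = min(remaining, cap - hours[w], 8)  # at most one 8-hour shift
            if grant > 0:
                hours[w] += grant
                remaining -= grant
    return hours

schedule = assign_hours(FORECAST, BENEFITS_CAP, POOL)
print(max(schedule.values()))  # stays at or under 29, so no benefits are owed
```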

The sharing economy is a stopgap measure grasped by the desperate as a means of filling the gap between the imperative and the possible. The imperative is stuff like rent and food. The possible is what you are paid or what you have saved. It's just a matter of time before the sharing function, too, is automated.

The only rational possibility for addressing this growing disparity will come when professionals like coders and investment bankers find that too many of their jobs are being eliminated. This is already happening in medicine, accounting, law and even finance, where trading is algorithmic and analysis can be computerized. Critical mass has yet to be achieved. But there's no doubt a spreadsheet already programmed to predict its arrival. JL

Peter Kane comments in Vice:

Almost nobody becomes an Uber driver unless they have to—and that goes double for TaskRabbits. These are shitty jobs that college-educated people are forced into because their professions have been gutted

Aug 14, 2014

Shhh! The Contrarian View on Data Breaches

The presumption these days is that a data breach must be reported. The sooner and more comprehensively the better. Forty-seven US states and the Securities and Exchange Commission (SEC) demand various levels of disclosure by law.

It is not clear that this has made the public, the institutions to which it entrusts its data or the data itself any safer, but it has certainly raised awareness about the problem.

While it cannot accurately be said that we live in an age of transparency, it is the case that transparency is the expected default posture. Given the plethora of information available and the relative respect for reputation, it is certainly more advantageous to be as honest as is prudently possible, rather than being caught in an endless downward spiral of obfuscation and dissimulation.

But what if too much disclosure is actually more harmful than too little? As the following article suggests, questions are being raised about the relative merits of exercising discretion rather than sharing all.

It seems apparent that society has generally benefited from an emphasis on honest, open discourse. However, given the degree of personal data now online and the vulnerability that creates, it may be useful to consider the implications of cautious discretion versus reflexive disclosure. JL

Danny Yadron reports in the Wall Street Journal:

Disclosing hacks wasn't always routine. Talking openly about cyberthreats is controversial because some executives fear it can make the company a target for hackers and such statements could be used against them.

Why Corporate America Has Not Been Disrupted. And Probably Won't Be


You could be forgiven for thinking that the 'gales of creative destruction' are having their way with the global economy.

Every enterprise with one or more employees is being exhorted to embrace change. Every new product is described as disruptive. The only thing that isn't changing, we are advised, is the inevitability of change itself.

And yet. As the following article explains, it has probably never been safer to be an incumbent in whatever market you find yourself - or which you choose. The reality is that fewer new businesses are being started, and of those who do take the plunge, more are failing. If managing career risk is your goal, enterprises that have been in existence 15 years or longer are the ones most likely to survive.

This is consistent with research my colleagues and I conducted on the success rate of IPOs during the first phase of the technology era. The data showed that those organizations most likely to exceed their initial public offering stock price were those which had already been in existence at least five years and had 100 employees or more. These outcomes suggested that the advantages of strength, stability, experience and market reputation were becoming codified. But since then, the trend has solidified rather than diminished, even in the face of faster and more comprehensive technological advances.

There appear to be two logical explanations for this trend. The first is that technology in concert with the growing professionalization of business management over the past two decades has made it easier for established businesses to defend themselves. Such institutions have the wherewithal to identify, purchase and implement technological fixes that enhance their already formidable advantages. They can afford to adapt - and they have the impetus to do so.

The other primary reason for the relatively muted impact of disruption is that the financialization of the economy has further bolstered the benefits of scale. Investment flows to the sources of greatest - and most likely - return. It is a self-reinforcing principle of considerable power. And while western societies may mourn the passing of the yeoman farmer and the family-owned business, the tax, regulatory and operational machinery of those societies is designed to reinforce the dominance of size and strength.

These trends will probably continue for the foreseeable future. Technology and finance have been co-evolutionary partners. Their success has been significant and relentless. In fact, the only likelihood that this may change will come when they each perceive that their future interests are diverging. JL

Ben Casselman reports in FiveThirtyEight:

Recent research suggests that established businesses have less and less to fear from would-be disruptors.

iPhone: The Affordably Luxurious Global Accessory

Apple is taking to heart Mahatma Gandhi's injunction that you should be the change you want to see in the world. Though it should be noted the company has also been mindful of the observation that the only quality differentiating humans from the other primates is our ability to accessorize.

Apple has entered the Indian market, as it has most others, in its own unique way. Rather than introduce a less expensive phone to reach a broader market, it is selling older models of its phones, which are still considered an upgrade over the options previously available from its competitors.

What Apple has observed is that the iPhone is the global luxury accessory of choice. More ubiquitous and of greater perceived value than handbags, shoes or cars. In China and much of the rest of Asia it was not necessary - or possible - to do this, because the population was so much greater and relatively more affluent, but also because that is where most advanced electronics are made and the market was too sophisticated - and proud - to be saddled with last year's model.

But as the company continues its relentless expansion, it is finding that the old 'different strokes for different folks' strategy is still applicable. The point, as in India, is that owning the brand matters more than possessing the latest model. If that sometimes results in ironic touches - like cell phone covers for a luxury item featuring a man who has come to symbolize self-abnegation and poverty - well, he was a leader who understood the imperatives of mass appeal. JL

Ben Thompson comments in Stratechery:

It’s not that the iPhone has fully penetrated developed countries – we’re not talking about Pampers or Pepsi here. Rather, the iPhone is an affordable luxury item; the percentage of the population to which it is affordable just happens to differ market-by-market.

Aug 13, 2014

Almost One in Six Doctor Visits Will Be Virtual...This Year

Given the degree to which we appear to live our lives online - and the rate at which the difference between what is 'virtual' and what is 'real' is becoming harder to discern - it is not surprising that technology and the medical profession have conspired to deliver care via electronic means.

What may be somewhat more astounding is the rate at which this transformation is occurring. As the following article explains, there will be 100 million eVisits worldwide this year. That represents a 400 percent increase in two years. That rate may moderate over time, but the relentless growth it implies will continue unabated.
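A back-of-the-envelope check on those numbers (assuming "a 400 percent increase" means five times the level of two years earlier):

```python
# Rough arithmetic on the growth rate quoted above; the only assumption
# is that a "400 percent increase" means 5x the level of two years ago.
evisits_now = 100_000_000
evisits_then = evisits_now / 5                 # implied base: 20 million
annual_growth = (evisits_now / evisits_then) ** 0.5 - 1
print(f"{annual_growth:.0%}")                  # ~124% compounded per year
```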

The reasons for this are operational and financial. It is more efficient to deliver digital primary care in cases where a personal visit is unnecessary, given the growth in demand and the relative paucity of medical professionals to provide that coverage. It is also considerably less expensive to address minor distresses in this way, which will help alleviate some of the pressure on an over-burdened system.

There will naturally be concerns about the potential dangers of offloading what used to be considered essential personal contact to machines. Advances in data analysis address some of these concerns, but doubts will linger until the quality of care can be proven to be equal to or better than the current offering. It is hoped that this will assist in providing more basic care to more people faster. And our experience with technology adoption suggests that improvements in quality will track the experience curve. JL

Lucas Mearian reports in ComputerWorld:

This year in the U.S. and Canada, 75 million of 600 million appointments with general practitioners will involve electronic visits, or eVisits. Globally, the number of eVisits will climb to 100 million this year.

23 Million Active Twitter Users Are Bots. Or Maybe Not. Or Only Sometimes.

Sometimes it is really, truly better to take a deep breath and think a while before responding to anything. Especially if you are feeling aggrieved. 

The initial allegation in this cautionary tale was that, in a routinely required US Securities and Exchange Commission filing, Twitter copped to the fact that 23 million of its ostensible users were actually bots.

This garnered some attention, though it must be said that given the quality of many submissions in the average Twitter feed, it was hardly a surprise - and could actually be a pretty interesting take on advances in our ability to use technology to express current opinions based on past exclamations. By this standard, the human antecedents of any number of politicians around the world could legitimately be questioned. But we digress.

Twitter, however, took these reports - 'misinterpretations,' by its lights - as a threat to its quality, veracity and stock price. So it issued a refutation whose rhetorical arabesques and narrative flourishes could only be described as obfuscatory. Not that anyone thinks Twitter is trying to cloud the truth; rather, this is the natural end product of a discussion in which lawyers, marketers and technicians attempt to come up with a logical explanation - for anything.

At some point someone will figure out how to effectively express what this means and will also, presumably, be paid a lot of money to make it go away. In the meantime, as the following article all too clearly articulates, less is more. JL

Jack Linshi reports in Time:

While it’s difficult to know accurately if apps can also auto-post content, the statistic’s focus was on only apps that aggregate Twitter content automatically “with no user action involved."

The Button Effect: Why Big Data Doesn't Have All the Answers

'Wouldn't it be nice,' as the Beach Boys once sang, if we could just press a button, machines would hum, click, whir and then - Presto! - The Right Answer would pop out...every time.

That is the seductive siren song Big Data is crooning in our collective institutional ears. No thought, no guesswork, no effort, no risk: all I have to do is dump my data into the device the way I dump my laundry into the washer and a few minutes later, mystery solved.

If this sounds too good to be true, well, that's because it is. And the danger we run is in assuming - hoping, more like it - that this shimmering vision will absolve us of the burden and responsibility inherent in decision-making positions.

The reality, of course, is that these data are only going to be as useful, accurate and applicable as are the abilities of those who interpret them. Without the knowledge, experience and interpretive skills of those who understand the implications, the conclusions, such as they are, will be like the Delphic Oracle: sufficiently oblique that anyone can draw the meaning that best supports their predisposition.

In the end, these machines and the data they can endlessly spew are supplements, not substitutes, for the judgment, courage and belief that those who employ them bring to applying their output. JL

Douglas Merrill comments in Harvard Business Review:

The “button effect” gives the correct answer, every time, and I don’t need to think about it. In the messy real world, though, there are some bits of knowledge you should have so that you can interpret the button’s offering.

Aug 12, 2014

The Economics of Trivia Night

Building sales, loyalty and long term value is the trifecta of business strategy.

There are as many ways to accomplish this as there are products to sell or services to provide.

At the anecdotal intersection of promotion and predictability lie efforts like trivia night: attempts to provide additional opportunities for customers to indulge their inclinations and spend a few bucks.

There is no need to do this sort of thing on Saturday, because that's when customers flock in anyway - though retailers threw out that same logic on Black Friday in November, when they realized that promotional sales increased profits across the base of embedded costs.
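The underlying logic is plain contribution-margin arithmetic; a sketch with invented figures (none of them from the article):

```python
# Why a slow weeknight promotion pays: rent, insurance and minimum staffing
# are already sunk costs, so almost any incremental revenue clears the bar.
# Every figure below is invented for illustration.
quiz_host_fee = 150        # one-off cost of running trivia night (assumed)
extra_patrons = 40         # customers who would not otherwise show up (assumed)
spend_per_patron = 18      # average tab (assumed)
cost_of_goods_pct = 0.30   # drinks-and-food cost share (assumed)

incremental_revenue = extra_patrons * spend_per_patron                       # $720
incremental_profit = incremental_revenue * (1 - cost_of_goods_pct) - quiz_host_fee
print(incremental_profit)  # $354 the bar would not otherwise have earned
```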

Most trivia night-like events are on Tuesday, ostensibly the slowest night of the week, but almost any night will do. The fact is that we are always being manipulated. The only question is the degree to which we enjoy it sufficiently to make it worth the merchant's while. JL

Jim Pagels reports in Priceonomics:

The investment in pub quiz seemed to pay off for everyone we interviewed. Additionally, many of the bar managers we spoke with mentioned trivia nights attracting repeat customers who also stop by other nights of the week—a highly valuable asset for any business

Where Did All the Entry Level Jobs Go?

On the job training? Wasn't that something companies offered back when, you know, they actually provided perks like health care and pensions and - remember this one? - job security?

Yeah, dream on. In fact, it's not just training that has disappeared, but, as the following article explains, the very notion of an entry level position. Because to offer such a wasteful deployment of assets would be to suggest that it is to the enterprise's benefit to invest time and effort in someone. Far more efficient to presume that they have acquired all the requisite skills and experience on their own before they apply. Isn't that why your parents bought you that summer internship? And if they couldn't afford to do so, well, social ills are not my problem.

The job market has evolved to reflect the financial pressures imposed on the institutions doing the hiring. The unintended consequences include openings that go unfilled for lack of 'qualified' applicants - a standard built on the circular logic that denies the value of entry level positions - as well as the notion that people sourced from national or global pools with declining household incomes somehow have the resources to pay for the training and experience they are told they need to be hired.

Ultimately this is a competitiveness issue. But in a global economy, it is not always apparent who is competing with whom - and for what. JL

Lauren Weber and Melissa Korn report in the Wall Street Journal:

Training programs for jobs at major corporations regularly lasted two years in the 2000s, teaching the ins and outs of the products they were selling and explaining market trends for distributors and end users. Now, new hires would be lucky to get six months of ramp-up time

The Sources and Uses of Arrogance: Silicon Valley Confronts a Less Adoring World

Being right is annoying. Being right and rich is even worse. Being right, rich and cocky about it is insufferable.

But that doesn't make not being adored any easier.

Silicon Valley is confronting a new reality. And there's no app for it. The thirty years of unalloyed hero worship are ending with recriminations, distrust and resentment.

Working for an industry in which the entire world knows its leaders by their first names or nicknames - Bill, Steve, Sergey, Zuck - no longer counts for much when the sum total of their efforts means loss of job or home for you and more money for them. And the fact that they are unapologetic about it, or cheerily proclaim that they are changing the world for its own good, no longer resonates with quite the same universally admired clarity.

The reality is that tech, as an industry, is growing up in a world which no longer views its efforts as magical so much as useful. The thrill is gone, baby. Sure, the faithful will still line up like lemmings to buy the latest iToy, but other companies' ads can mock them for it - and everyone gets the joke.

The problem is that some of the industry's putative leaders still behave like geeky adolescents: a Princeton-educated VC proposing that Silicon Valley secede from California, a newly wealthy Indian immigrant suggesting The Valley is the only true source of global value creation, others comparing industry critics to Nazis and communists, or claiming that women will never make it because they haven't been hacking since grade school.

The reality is that all those whiners and losers are also customers, and at some point convergence and commoditization mean you'd better have something to commend you, because otherwise it's all just about price. JL

Joel Stein comments in Business Week:

“In the future there’s potentially two types of jobs: where you tell a machine what to do, programming a computer, or a machine is going to tell you what to do. You’re either the one that creates the automation or you’re getting automated.”

Aug 11, 2014

The McDonald's Franchise: The Implications of Selling and Managing the Brand, Not the Product

'In business for yourself, but not by yourself.' That is the McDonald's franchise mantra. It implies a supportive partnership.

But the relationship is far from equal and the restrictive nature of the economics means that franchisees have few options when it comes to managing the profitability of their enterprise.

Ironically, however, the very dimensions of that interaction may now threaten the corporate legal shield behind which McDonald's has attempted to manage its risk exposure, its financial results and the nature of the relationship with its business partners and 'associates.'

The courts have pierced the corporate veil, exposing what many competitors have long observed about the way in which McDonald's imposes its demands while maintaining its deniability as a potential litigant.

The economics that drove the company's performance model for so long may now be undermining it. By forcing franchisees to apply relentless downward pressure on the cost of employment and, in turn, by contributing to a broader societal decline in household income, McDonald's finds itself facing chronically disappointing financial results and, at the same time, increasing pressure to improve compensation. This, in turn, may well have damaged a once unassailable brand by raising the kind of questions about operating philosophy that have also dogged that other low-end employer and merchant, Walmart. The courts become involved when the situation becomes sufficiently insupportable that questions of viability have to be adjudicated, since no other alternative appears effective.

The reality may be that the company was too busy managing its brand and the process that feeds it to notice how its own actions were undermining the value it thought it was creating in perpetuity. JL

Cathy O'Neil reports in Mathbabe:

By the franchise contract, the money available to a franchise owner is left over after they pay McDonalds for advertising, buy all the equipment and food that McDonalds tells them to from the sources that they tell them to, and after they pay for rent on the property (which McDonalds typically owns)

Changing Consumer Habits Are Transforming Marketing: Retailers and Brands Must Adapt

It used to be said that old habits die hard. Now, it must be added that they die fast.

Marketing strategies based on observed behaviors may be only as relevant - and useful - as the last time they were employed. Although it seems as if the internet and the smartphone have been with us forever, their ubiquity and consumers' familiarity with them are still evolving. This is forcing merchants to rethink and adapt rather than strategize, then plan.

Traditional delineations of responsibility are being supplanted by new relationships based on flexibility and adaptability. The sum total of what you don't know is likely to dwarf what you do know. The implication is that brands and those who sell them at whatever point in the value chain must be quick to identify their own points of reference rather than relying too much on aggregations.

The biggest challenge, as the following article explains, will be assessing how benchmarks and other metrics will change over time. Becoming wedded to one set of measures is as misleading as becoming dependent on one way of selling or one set of customers. The good news is that when nothing is certain, everything is possible. JL

Allen Mason comments in Advertising Age:

With the rapid shift in digital consumer habits and e-commerce, the role of the retailer is much more strategic. Amazon is both a retail partner to brands and a media property for brand marketing investment.

Getting Paid: How Disparate Rulings About Silicon Valley Engineers and Amateur Athletes Define Intellectual and Human Capital Ownership

Just another sweltering Saturday morning in August. Not much to do but think about how to get under some cool water somewhere.

Unless, that is, you happened to glance at the front page of the New York Times and noticed two articles about seemingly different realms of human endeavor, but placed curiously close to each other below the fold.

One of them reported that the judge in the case charging that Silicon Valley firms had conspired not to poach one another's engineers, in effect depressing their compensation potential, had rejected a proposed settlement as inadequate given the evidence - and the financial damage to the employees. The other article reported that judges had similarly rejected National Collegiate Athletic Association (NCAA) claims that amateur student athletes had no right to compensation for the use of their images, names and likenesses.

What the two have in common is a growing consensus - now effectively codified (despite whatever appeals may be filed) - that the era of free information is over. Gone. Dead. History.

The Big Data era is predicated on the assumption that there is a substantial return available on all that personal information just waiting to be analyzed, interpreted and deployed to reduce uncertainty and increase the likelihood that whoever it targets will pony up for whatever's being sold. The size of that return, however, fluctuates with the cost of acquiring the underlying data. And the reality is that, by and large, the return is so large because for much of that info, there is virtually no cost.

The cost is minimized because its sourcing is based on the assumption that consumers will often provide useful information without demanding compensation. They behave this way because they believe they are obligated to do so or because they perceive some advantage in providing it, however imprecise the terms of the exchange. By extension, the enterprises interested in using this data have operated on a model that suggests - or has suggested until now - that their rights supersede those of their customers, employees and suppliers.


What the rulings described in the articles below imply is that that model may no longer be valid or, at the very least, no longer as profitable. Because the notion of ownership and the rights that accrue to such ownership are catching up with the economic value inherent in the uses of that data. So, however seemingly subordinate a person's power may be in relation to the forces guiding his or her life, the right to basic protections governing compensation for activities, freedom of movement and ownership of ideas, likenesses, names and images - assets once considered so 'intangible' as to be incapable, even unworthy, of any accounting treatment - now has value that can be defined, analyzed, protected - and monetized. JL

David Streitfeld, Ben Strauss and Marc Tracy report in two New York Times articles:

1) The suit charged that Silicon Valley was engaged in an "overarching conspiracy" against its own employees, accusing leading tech companies of agreeing not to poach one another's engineers.
2) "the NCAA does not provide credible evidence that demand for the product would decrease if student-athletes were permitted to receive a share of the revenue generated from the use of their own names, images, and likenesses."

Aug 10, 2014

Why Learning Institutions Are Replacing Their iPads with PCs

The trend seemed unmistakable. Until it wasn't.

Questions raised a few years ago about the impact of technology on learning appear to have given way to questions about which situations, if any, are inappropriate for it and what sort of devices should be employed.

The iPad appeared to own the education market. It was easy and fun, which was the point at first. But then it became part of the problem, not the solution.

The challenge is one that has dogged the tablet market since its inception. The initial concerns were blown away by the scale of market acceptance, but then returned to haunt the manufacturers across the breadth of the consumer environment. And nowhere has there been more revisionist thinking than in education. Power, utility and purpose conflict with convenience. And in an era of budget cuts and austerity, cost has become a significant factor. Tablets could be less expensive at the outset, but like all assets, their maintenance, repair and replacement costs were soon found to favor the much-maligned PC.

Just as reports of the PC's demise have been exaggerated, so too will be those of the tablet. The reality is that our experience spans such a short period of human experimentation that almost no initial opinions are worth much, especially when features are changing so rapidly. As the following article explains, tablets have their benefits, but when coding is fast becoming a mandatory 'language' study, the PC has advantages. We are only beginning to understand the interplay of hardware and software in our ability to meld technology with the rest of our lives. There will be further revisions of position and purpose, while notions of what is 'best' remain subject to changing needs. JL

Meghan Murphy reports in The Atlantic:

Students saw the iPad as a “fun” gaming environment, while the Chromebook was perceived as a place to “get to work.” It was far easier to manage Chromebooks than iPads. Since all the Chromebook files live in an online “cloud,” students could be up and running in seconds on a new device. And apps could be pushed to all of the devices with just a few mouse clicks.

Credit Card Debt Growth Exceeds US Wage Growth

'Can I pay you back on Tuesday for a hamburger today?' That was the question the character Wimpy was always asking in the Popeye the Sailor cartoon series.

It reflects the implications of the news that credit card debt growth has now exceeded US wage growth and may continue to do so for some time.

Usually, people load up on credit card debt when they don't have enough current income to cover their expenses. But doing so suggests that they believe they will be able to pay it back in the not-too-distant future. The problem, of course, is trying to figure out who has a reasonable chance of doing so and who is hoping, wishing and otherwise deluding themselves.

At some point, of course, if the debt obligations keep rising and wage growth does not, there will be a huge gap between the obligation to pay and the capacity to do so. That is how financial crises often happen. We know that, of course, but we seem destined to relearn it every few years. Because the pressure to permit credit expansion is greater than the demand for fiscal probity as well as, frankly, the support for actually increasing wages. Credit card debt boosts sales, higher wages cost money and fiscal probity is just an election-year slogan, so the odds are already stacked.
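A toy projection makes the point; every number below is an assumption chosen for illustration, not data from the article:

```python
# Debt compounding faster than wages: count the years until required card
# payments outgrow a tolerable share of income. All inputs are assumptions.
wages, debt = 50_000, 5_000          # annual household wages vs. card balance
wage_growth, debt_growth = 0.02, 0.08
payment_share = 0.20                 # rough share of the balance due each year

year = 0
while debt * payment_share < wages * 0.10:  # "crisis" once payments pass 10% of wages
    wages *= 1 + wage_growth
    debt *= 1 + debt_growth
    year += 1
print(year)  # with these assumptions, the lines cross in roughly 29 years
```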

The lines will cross and the bubble will burst, but if history is any guide, the likelihood that any steps will be taken to do something about it before that happens seems increasingly remote. JL

Walter Kurtz reports in SoberLook:

In the long run this is not going to be sustainable...

The Human Brain Subliminally Judges the Trustworthiness of Faces: Digital Implications?

Who you gonna believe: your instincts or your lyin' eyes?

Research demonstrating that the brain makes subliminal judgments about trustworthiness based on responses to certain visual cues is perhaps not all that surprising. After all, for the several tens of thousands of years prior to Big Data, mankind did not have a lot to go on when it came to telling friend from foe, so something had to help support the propagation of the species.

If a furrowed brow was one of those 'watch out!' tells, well, thanks for the tip. But it is the implications of extrapolating such knowledge into useful intelligence for the digital era that bear further examination. There are already reports of billboards that can 'read' us as we pass by, of programs that relentlessly sift our personal data for clues in order to increase the chance that we'll respond positively to a sales pitch or election campaign. The interactivity of our electronic devices is making such analyses both more sophisticated and more ubiquitous.

This takes us into the realm of fairness and rights and liberties and privacy, of the personal and the commercial. Where does one end and another begin? Should there even be such boundaries and, if so, who should set them? Society is currently torn between its desire to monetize anything it can and the need to preserve some semblance of personal freedom. Where that balance will be struck is by no means certain. JL

Ian Sample reports in The Guardian:

Even though people might not have conscious awareness, they might move back very subtly when perceiving an untrustworthy face