Algorithmic Economics

Does the productivity gap actually exist?

  • July 4, 2017
  • by Rickard Nyman


Whoever wins the election tomorrow will have to grapple with what appears to be a fundamental economic problem. Estimated productivity growth in the UK is virtually at a standstill.

The standard definition of productivity is the average output per employee across the economy as a whole, after adjusting output for inflation – or “real” output, in the jargon of economics.

Productivity in 2016 was the same as it had been almost a decade earlier, in 2007, immediately prior to the financial crisis.

Productivity is not just some abstract concept from economic theory. It has huge practical implications. Ultimately, it determines living standards.

Productivity is real output divided by employment. The Office for National Statistics (ONS) has a pretty accurate idea of how many people are employed in the economy: it gets the data from company tax returns to HMRC.

What about output? The ONS uses a wide range of sources to compile its estimates. But these essentially provide it with information about the total value of what the UK is producing.

The ONS has the key task of breaking this number down into increases in value which are simply due to inflation, and those which represent a rise in real output.

This problem, easy to state, is fiendishly difficult to solve in practice. To take a simple illustrative example, imagine a car firm makes exactly 10,000 vehicles of a particular kind in each of two successive years, and sells them at an identical price. It seems that real output is the same in both years.

But suppose that in the second year, the car is equipped with heated seats. The sale price has not changed. But buyers are getting a better quality model, and some would pay a bit extra for the seats. So the effective price, taking into account all the features, has fallen slightly.
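To see how this feeds through to the measured numbers, here is a minimal sketch in Python of the arithmetic. The £20,000 price and the 1 per cent quality adjustment are hypothetical figures for illustration only, not ONS estimates.

    # Hypothetical figures: the same number of cars at the same sticker price in
    # both years, but the year-two model has heated seats, which the statisticians
    # judge to be worth 1 per cent of the price.
    cars_sold = 10_000
    sticker_price = 20_000

    nominal_output_year1 = cars_sold * sticker_price
    nominal_output_year2 = cars_sold * sticker_price        # identical nominal value

    quality_adjustment = 0.01
    effective_price_year2 = sticker_price * (1 - quality_adjustment)

    real_output_year1 = nominal_output_year1 / sticker_price          # 10,000 "car units"
    real_output_year2 = nominal_output_year2 / effective_price_year2  # about 10,101 "car units"

    growth = real_output_year2 / real_output_year1 - 1
    print(f"Measured real output growth: {growth:.2%}")               # about 1.01%

Treat the quality improvement as a price fall, in other words, and the same nominal sales translate into roughly 1 per cent more real output.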

Assessing the impact of quality changes is the bane of national accounts statisticians’ lives. The car example above is very simple. But how do you assess the quality change when, for example, smartphones were introduced?

The ONS and its equivalents elsewhere, such as the Bureau of Economic Analysis in America, are very much aware of this problem. But even by the early 2000s, leading econometricians such as MIT’s Jerry Hausman were arguing that the internet alone was leading inflation to be overestimated by about 1 per cent a year, and real output growth correspondingly underestimated.

Martin Feldstein is the latest top economist to add his name to this view. Feldstein is a former chairman of the President’s Council of Economic Advisers, so he is no ivory tower boffin.

In the latest Journal of Economic Perspectives, Feldstein writes:

“I have concluded that the official data understate the changes of real output and productivity. The measurement problem has become increasingly difficult with the rising share of services that has grown from about 50 per cent of private sector GDP in 1950 to about 70 per cent of private GDP now”.

The Bean report into national accounts statistics last year acknowledged these problems. It could well be that there is no productivity gap at all.

Paul Ormerod 

As published in City AM Wednesday 7th June 2017

Image: Smartphone by JÉSHOOTS  is licensed under CC by 2.0


Sorry Corbyn, consumers aren’t as sold on nationalisation as you’d like to think

  • July 4, 2017
  • by Rickard Nyman


One of the most remarkable features of the Conservative election campaign was the dog which did not bark.

There was no systematic attempt to undermine Jeremy Corbyn’s wholly implausible economic narrative. Magic Money Tree comments aside, Labour’s economic incompetence was allowed to pass almost unchallenged.

One part of Labour’s economic offer which really did strike a chord with the electorate was the promise to nationalise industries such as rail and water. To anyone with direct experience of the old British Rail or the Post Office (which made you wait six months to get a phone installed), this almost defies belief. But only those over 55 can remember.

The fact is that for a number of years there has been strong and consistent support in surveys for taking industries such as rail into public ownership.

In 2013, for example, the moderate Labour website Labour List commissioned an analysis by the poll company Survation. In terms of rail nationalisation, 42 per cent thought fares would be cheaper, compared to only 12 per cent who thought they would go up. Those believing the quality of the services would improve easily outnumbered those who thought it would get worse, by 38 to 14 per cent. There are many similar examples.

Economists are pretty dismissive of the results of surveys about hypothetical situations or choices. A key foundation of economic theory is the concept of revealed preference, to use the jargon phrase. Individuals are assumed to have reasonably stable tastes and preferences. These preferences are revealed not through answers to hypothetical questions, but through how they actually respond to changes in the set of incentives which they face.

In the National Passenger Survey, for example, 80 per cent of respondents routinely express satisfaction with their journey, compared to fewer than 10 per cent who are dissatisfied. But how does this translate into actual decisions?

Prior to rail privatisation, which took place just after the 1992 election, the peak number of passenger journeys made in any year was some 1.1bn, reached in the mid-1950s. Faced with rapidly rising road competition, the rail industry saw journeys fall steadily, to a trough of around 750m in the mid-1990s.

After privatisation, massive investment programmes have been carried out and, in the form of the train operating companies, there is now a distinct part of the industry whose priority is the consumer. Journey numbers rose, passing the 1bn mark in 2003, to the current level of 1.7bn, a figure not seen since the early 1920s, when road competition was weak.

So the revealed preference of consumers seems to be that they rather like the current structure. They actively choose to use rail in massive numbers.

Rather like a good Party member in George Orwell’s book 1984, the electorate seems capable of believing two contradictory things at the same time. This reinforces the importance of narratives in politics. Trying to treat voters as rational agents often ends in tears, as both Cameron and May have discovered.

Paul Ormerod 

As published in City AM Wednesday 14th June 2017

Image: Jeremy Corbyn by Garry Knight is licensed under CC by 2.0


Less austerity will always mean more tax

  • July 4, 2017
  • by Rickard Nyman


There is a great deal of discussion, following the election, of relaxing or even abandoning austerity.

There is an equal amount of confusion about this, because the same word is being used to describe two quite separate concepts.

The consequences of the government changing its policy on austerity are dramatically different, depending on which one it is.

One meaning of the word is what we might call “social austerity”. Its supporters believe that, from any given pot of money available to a government, tax cuts should in general be favoured over increases in public spending. Opponents argue that public spending has, as a result, become underfunded: local councils, education, and the NHS all need more money.

Social austerity can be relieved, as even the DUP and some Conservatives argue, by increasing spending appropriately, and funding it by increases in taxation. This was an important aspect of Labour’s manifesto, and the tragedy at Grenfell Tower has intensified the discussion around it.

The main risk is purely political. Are voters really and truly willing to pay more tax, rather than just wanting someone else to pay it?

There are some potential adverse economic consequences if the policy of higher taxation is pushed too far. Former French President Francois Hollande’s 75 per cent tax rate led to several hundred thousand skilled young people leaving France, mainly for the UK. If companies are taxed too heavily, they may choose to relocate to another country. Both skilled labour and capital are geographically mobile.

But, within reason, social austerity could be relaxed without perhaps too many fears in this direction.

“Economic austerity” is quite a different matter. Opponents of this want to increase the gap between government spending and tax receipts – the so-called fiscal deficit. This is funded by issuing government bonds. So the deficit in any given year goes up, and the outstanding stock of government debt also rises.

Any relaxation of social austerity is paid for by higher taxes now. Any relaxation of economic austerity is paid for by borrowing more now.

But the debt has to be repaid at some point, and the interest payments on it must be met. So taxes in the future will be higher. Either way, less austerity means more tax.
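A back-of-the-envelope sketch in Python, using purely hypothetical figures, of why borrowing now means more tax later:

    # Hypothetical figures: £10bn of extra spending funded by issuing gilts at 2% a year.
    extra_borrowing = 10e9
    gilt_yield = 0.02
    years_until_repayment = 10

    future_tax_needed = extra_borrowing * (1 + gilt_yield) ** years_until_repayment
    print(f"Tax needed in {years_until_repayment} years: £{future_tax_needed / 1e9:.1f}bn")
    # about £12.2bn: the original £10bn plus the accumulated interest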

John Maynard Keynes himself made it very clear that increasing public spending at a time of full employment would simply lead to more inflation. There are areas of the country where there probably are people registered as unemployed who genuinely do want to work – the Welsh Valleys, for example. But the rest of the UK is at full employment.

The number of people in employment is at an all-time high, at 32m. This has risen by 2.8m since 2010. Meanwhile the unemployment rate has fallen from 7.9 per cent in 2010 to just 4.6 per cent today.

Any major fiscal stimulus to the economy now would simply bid up wages, leading to higher costs and higher inflation.

The public mood on social austerity may have shifted. But the case for economic austerity is stronger than it has ever been.

Paul Ormerod 

As published in City AM Wednesday 21st June 2017

Image: People’s assembly by Peter Damian is licensed under CC by 2.0


How to stop tech hubs in urban hotspots from intensifying geographic inequalities

  • July 4, 2017
  • by Rickard Nyman


Perhaps George Osborne’s most abiding legacy from his time as chancellor will be the creation of the concept of the Northern Powerhouse. Certainly Manchester, its principal focus, is booming.

The landscape of the centre is being altered dramatically by skyscrapers. Peel Holdings, the huge investment and property outfit, is planning to double the size of the development around Media City in the old docks, where the BBC was relocated. The airport, already the third busiest in the UK, is expanding.

All in all, it seems a triumph for modern capitalism. After decades of relative decline, a city is being transformed by private enterprise. But what is really going on?

In a piece this month in the MIT publication Technology Review, urban guru Richard Florida has picked up on a startling new trend in the location of new technology companies in the US.

In the 1980s, there were essentially no high tech companies in city locations. Instead, we had Intel and Apple in Silicon Valley, Microsoft in the Seattle suburbs, the Route 128 beltway outside Boston, and the corporate campuses of North Carolina’s Research Triangle.

Now, urban centres are rapidly becoming the places which attract technology companies. In 2016, the San Francisco metro area was top of the list for venture capital investment, attracting more than three times the amount of the iconic location of Silicon Valley. Google has taken over the old Port Authority building in Manhattan. Amazon’s headquarters are in downtown Seattle.

The impact of this new, high concentration of tech firms is to intensify geographic inequalities. As Florida puts it: “tech startups helped turn a handful of metro areas into megastars. Now, they’re tearing those cities apart.”

A relatively small number of urban areas in America, and within them a small number of neighbourhoods, are capturing all the benefits.

The same sort of thing seems to be going on in Greater Manchester. A few areas are soaring away and attracting wealth and talent. In 1981, fewer than 600 people lived in what the Council describes as “the heart of Manchester”. Now, over 50,000 do, almost all of them young graduates.

But the more traditional outlying boroughs of the city region, especially to the north and east, are struggling to capture any trickle down from this massive transformation. Indeed, they are at risk of losing out, as their young bright sparks are attracted by the life of the inner metropolis.

Richard Florida does not just identify the problem; he suggests some possible solutions. One of them is a programme of building lots of good housing in the outlying areas, supplemented by a top-class public transport service. This would keep house prices down, and attract some of the people stuck in rabbit warrens in the urban centres.

Manchester already has a modern tram service. But the new Labour mayor, Andy Burnham, is resolutely opposed to building on the green belt just to the north and east of the city. Yet another example of the sanctimonious intentions of the Left serving to intensify, not reduce, inequality.

Paul Ormerod 

As published in City AM Thursday 29th June 2017

Image: Media City UK by Magnus D is licensed under CC by 2.0


Understanding humans as well as maths

  • July 4, 2017
  • by Rickard Nyman


The largest global cyber-attack in history, using malware known as WannaCry or Wanna Decryptor, infected Windows computers across entire networks. It exploited a security flaw in the Windows operating system that allowed it to jump from computer to computer on an internal network.

It appears to have been halted, largely by chance, by a 22-year-old British cyber expert when he discovered a failsafe that, when triggered, made the malware essentially ‘self-destruct’. By registering a certain web address, he caused the virus, which had held organisations and individuals to ransom in over 150 countries and halted large parts of the NHS in Britain, to terminate itself. He admits the discovery was accidental, as he tweeted:

“I will confess that I was unaware registering the domain would stop the malware until after I registered it, so initially it was accidental”
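A minimal sketch in Python of the kind of kill-switch logic described above; the domain name and the details are illustrative assumptions, not the actual WannaCry code.

    import urllib.request

    # The real kill-switch domain was a long string of gibberish; this one is illustrative.
    KILL_SWITCH_URL = "http://example-kill-switch-domain.invalid"

    def kill_switch_triggered() -> bool:
        try:
            urllib.request.urlopen(KILL_SWITCH_URL, timeout=5)
            return True       # the domain resolves and responds: stand down
        except Exception:
            return False      # the domain is unreachable: in the real attack, carry on

    if kill_switch_triggered():
        raise SystemExit("Kill-switch domain is live; exiting without doing anything.")

Once the researcher registered the domain, the check started succeeding and the malware stood itself down.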

The attack was, in terms of computing technology, simple and even naive. For example, no sophisticated password-breaking software was used.

But the developers of the malware had a deep understanding of the behavioural biases which are frequently encountered in organisations.

The attackers simply hoped that at least one user on a network would click on a malicious link in an email or download a malicious attachment, for example a PDF file seemingly sent by their boss with the software hidden inside. The virus could then get to work with minimal sophistication. One of the reasons the Bletchley Park team cracked the German Enigma code was that some of the lazier enemy operators began their first messages of the day in similar fashion, day after day. So the WannaCry developers simply relied upon human inertia.

But even so, how could this happen? Microsoft had released a patch for its operating system long before the virus became operational on a large scale. The NHS in Britain was notified by Microsoft two months ago about a security patch that could have prevented the spread of the virus within the organisation. But nearly all NHS trusts use a version of the Windows operating system for which Microsoft stopped providing security updates as long as three years ago.

A long-standing reluctance by large organisations to roll out system updates before substantial internal testing clearly contributed to the crisis. In effect, paranoia about system security was a cause of this system vulnerability.

This problem is not new. The illusion that elaborate rule-based systems can eliminate systemic risk was prevalent amongst regulators in the run-up to the financial crisis, and it persists even to this day. The apparent security provided by having lots of boxes ticked, and the paperwork passed through endless committees before an update could be approved, proved to be completely false.

This state of affairs calls into question the complicated security procedures in place within organisations. For example, what is the point of spamming staff with emails forcing them to update their passwords as often as once every two or three months, when this large-scale cyber-attack took place without breaking a single password?

A study published in the Proceedings of the 17th ACM Conference on Computer and Communications Security, aptly titled “The security of modern password expiration: an algorithmic framework and empirical analysis”, attempted to assess the security advantages of password expiration policies. It has long been thought that frequent password updates make it much harder for hackers to guess correct passwords. However, this assumption is now rightly being questioned.

The study points out that users asked to update their passwords will often apply mental shortcuts and heuristics (due to what we would term cognitive overload) on all but the first password: for example, changing a character to a symbol or number, such as ‘s’ to ‘$’ or ‘A’ to ‘4’, removing or adding a special character (changing ‘!!!’ to ‘!!’, say), or simply incrementing a number.

Such mental shortcuts introduce patterns and biases into your password that make it much easier to guess your updated password for anyone with at least partial access to your password history. Hence, once a hacker knows at least one of your past passwords, state-of-the-art password-breaking software can often recover your current password in a matter of seconds (more specifically, the researchers in this study could break the current passwords of approximately 41 per cent of accounts in less than three seconds, given knowledge of a previous password).
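As a rough illustration of how these shortcuts can be exploited, here is a minimal sketch in Python of transform-style guessing; the particular substitutions and the example password are illustrative assumptions, not the algorithm used in the study.

    import re

    def candidate_guesses(old_password: str) -> set:
        """Generate new-password guesses from one old password using common shortcuts."""
        guesses = set()

        # Character substitutions such as 's' -> '$' and 'a' -> '4', and their reverses
        for a, b in [("s", "$"), ("a", "4"), ("o", "0"), ("i", "1"), ("e", "3")]:
            guesses.add(old_password.replace(a, b))
            guesses.add(old_password.replace(b, a))

        # Add a trailing special character, or strip one if present
        guesses.add(old_password + "!")
        if old_password and not old_password[-1].isalnum():
            guesses.add(old_password[:-1])

        # Increment a trailing number: "summer16" -> "summer17"
        match = re.search(r"(\d+)$", old_password)
        if match:
            guesses.add(old_password[:match.start()] + str(int(match.group(1)) + 1))

        guesses.discard(old_password)   # the unchanged password is not a useful guess
        return guesses

    print(candidate_guesses("summer16"))
    # e.g. {'summer17', 'summer16!', '$ummer16', 'summ3r16', ...}

A real attacker would feed each candidate, along with far more elaborate transforms, into exactly the kind of password-breaking software mentioned above.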

The situation might be even more disconcerting, as reported in a technical report from Carnegie Mellon University[1]. The study correlated several personal characteristics and traits with password strength among CMU students, faculty, and staff. It found a strong correlation indicating that individuals who reported annoyance with the university’s password policies also tended to choose weaker passwords. It is not difficult to imagine the causal argument.

Like the reluctance to roll out system updates without substantial testing by an organisation’s IT department, the obsession with frequent password updates may in fact reduce overall system security rather than enhance it.

Perhaps do us all a favour and stop filling our inboxes with password-update requests.

[1] https://www.cylab.cmu.edu/research/techreports/2013/tr_cylab13013.html

Rickard Nyman

Image: Data Security Breach by Blogtrepreneur is licensed under CC by 2.0
