Pearson - Always learning

All your resources for Economics


Posts Tagged ‘average costs’

The new approach to improving people’s access to ultrafast broadband – will it work?

Concerns have been expressed about the UK’s relatively poor record of upgrading broadband services so that households can receive ultrafast connectivity. Some commentators have argued that future economic growth prospects will be harmed if the UK continues to lag behind its leading rivals.

Much of the fixed-line system that allows people to connect to broadband was originally installed many years ago for the landline telephone network. The so-called ‘final mile’ consists of copper-based wiring that runs from street cabinets to the premises of the end-user. This wiring is carried via a huge network of telegraph poles and cable ducts (small underground tunnels).

In order for people to gain connectivity to ultrafast broadband, this copper-based wiring needs to be replaced by fibre optic cables. This is commonly referred to as Fibre to the Premises (FTTP). Unfortunately, the UK has a relatively poor record of installing FTTP. Japan and Korea were forecast to have 70% and 63% coverage by the end of 2015, as opposed to just 2% in the UK.

Why is the UK’s record so poor? Many observers blame it on the structure of the industry. In other network industries, such as gas pipelines and electricity grids, the business responsible for managing the infrastructure, National Grid, is a regulated monopoly. This company does not directly supply services to consumers using the network it is responsible for maintaining. Instead, customers are supplied by the retail sector of the industry, where firms compete for their business. This sector includes the so-called ‘big six’ (British Gas; npower; SSE; Scottish Power; EDF; E.On) and a number of smaller suppliers such as Ovo Energy and Ebico.

The structure of the fixed line telecommunications sector is very different. The company that manages the ‘final mile’, Openreach, is a subsidiary of BT. BT also competes with other Internet Service Providers (ISPs), such as TalkTalk and Sky, to supply broadband to customers using this network. Its market share of 32 per cent makes it the largest player in the broadband market. Sky and TalkTalk have market shares of 22 per cent and 14 per cent respectively. Virgin Media also supplies 20 per cent of this market using its own network of ducts and cables.

Given that in most cases ISPs such as Sky and TalkTalk are stuck with the network Openreach provides, BT may have limited incentives to invest. It can still earn a good return from its infrastructure of copper-based wiring and avoid installing expensive FTTP. Dido Harding, the chief executive of TalkTalk, argued that:

“We need to separate Openreach from the rest of BT to create a more competitive, pro-investment market”

Ofcom, in its recent review of the market, has taken a different approach. Rather than creating an entirely separate monopoly business to manage the network (i.e. splitting Openreach from BT), the regulator instead opted for a policy of encouraging competition between different suppliers that deploy fibre optic cables. It states in the report that:

“We believe competition between different networks is the best way to drive investment in high-quality, innovative services for customers.”

This competition could come from ISPs such as TalkTalk and Sky or other smaller network providers such as CityFibre and Gigaclear.

One major problem with this approach is that potential new entrants might be deterred from entering the market because of the very high initial costs involved in building a new network in order to deploy FTTP. In particular, the costs of digging up the roads and laying the ducts are considerable. Matt Yardley, author of a study on the industry, said:

“It is widely accepted that civil works such as digging trenches account for up to 80% of broadband deployment costs.”

One way of reducing these costs and encouraging more competition is to allow rival firms access to the existing ducts and poles that are currently managed by Openreach. Once access has been obtained, these firms could effectively rent space inside the ducts and lay fibre optic cables alongside the existing copper-based wiring. Vodafone reported that a similar policy in Spain had reduced its capital expenditure of building FTTP by 40 per cent compared with constructing its own network of ducts and poles.

Ofcom first introduced this type of policy in 2010 when it launched its Physical Infrastructure Access (PIA) initiative. Unfortunately, it has proved relatively unsuccessful, with very little demand for PIA from rival firms. The success of this type of policy will depend on a number of factors, including (1) the prices charged by Openreach to access and rent space inside the ducts; (2) the simplicity of any relevant administration; and (3) the availability and reliability of information about the ducts. On this last point, key issues include:

• Where they are located.
• How much space is available: i.e. is there enough space for firms to lay fibre optic cables alongside the existing wiring?
• What condition they are in: i.e. are they flooded or clogged up with sand and mud, which will involve expensive work to make them usable again?

Firms did complain about the pricing structures and bureaucratic nature of the administration process under the PIA scheme. However, their most significant concerns were about the uncertainty created by the lack of information about the ducts and poles. For example, analysts from the consultancy firm Reburn argued that a firm contacting Openreach to try to obtain access to the network would be informed that:

“We don’t know what condition the ducts and poles are in. Please pay £10 000 for a survey. Also unfortunately we are rather busy and we can only start in six weeks.”

Matthew Hare, the chief executive of Gigaclear, argued that it was like going to a shop where the assistant says:

“Give me some money, and I’ll tell you whether you can have it or not.”

In response to these criticisms Ofcom has introduced a number of changes to PIA, which has been re-named Duct and Pole Access (DPA). In particular, it has imposed a new requirement on Openreach to create a database that provides information on the location, condition and capacity of its ducts and poles. The database must be made available to rival ISPs and network providers. DPA must also be provided on the same timescales, terms and conditions to all businesses including other parts of BT – this is referred to as ‘equivalence of inputs’.

The first big test of this policy is in Southend, where CityFibre is hoping to deploy 50km of fibre optic cables using DPA. However, reports in the media have suggested that the initial surveys have found very limited capacity in some of the ducts, which would make DPA impossible.

It will be interesting to see how the trial in Southend progresses. If it is successful, then DPA may be viable for about 40 per cent of premises in the UK. If it fails, then Ofcom might ultimately have to force Openreach to be completely separated from BT.

Articles
How the gothic city of York became a broadband battleground The Telegraph, Kate Palmer (22/5/16)
City Fibre first to mount BT challenge after Openreach is told to share network The Telegraph, Kate Palmer (1/3/16)
Challenges as CityFibre Moot Using BT Cable Ducts in Southend-on-Sea ISPreview, Mark Jackson (2/5/16)
CityFibre to build pure fibre infrastructure for Southend Networking (5/4/16)
Ofcom tells BT to open up infrastructure to rivals The Guardian, Rob Davies (26/2/16)

Questions

  1. Draw an average total cost curve to illustrate the economics of building a network of ducts and poles. Label the minimum efficient scale.
  2. To what extent does DPA create a contestable market?
  3. For DPA to deliver productive efficiency, what must be true about the economies of scale of laying fibre optic cables?
  4. In the run-up to Ofcom’s review of the telecoms industry, many commentators described Openreach as being a natural monopoly. To what extent do you agree with this argument?
  5. What are the advantages of marginal cost pricing? What issues might a regulator face if it tried to impose marginal cost pricing on a natural monopoly?
  6. Using a diagram, explain how the network of ducts and poles might be a natural monopoly in rural areas but not in densely populated urban areas.
  7. Discuss how Ofcom has tried to increase the level of separation between Openreach and BT.

The oil industry and low oil prices

Oil prices will remain below $60 per barrel for the foreseeable future. At least, this is what is being assumed by most oil-producing companies. In the more distant future, prices may rise as investment in fracking, tar sands and new wells dries up. In the meantime, however, marginal costs are sufficiently low as to make it economically viable to continue extracting oil from most sources at current prices.
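The short-run logic here, which question 3 below asks you to draw, can be sketched numerically. The figures are hypothetical; the point is simply that once a well is drilled, the investment cost is sunk and only operating costs matter for the decision to keep pumping.

```python
# Short-run production decision for an oil well (hypothetical figures).
# Sunk investment costs are irrelevant in the short run: a producer
# keeps pumping as long as the price covers the average variable
# (operating) cost, even if the price is below average total cost.

def keep_producing(price, avg_variable_cost):
    """Short-run shutdown rule: produce if price covers operating costs."""
    return price >= avg_variable_cost

price = 50.0           # $/barrel, below the assumed $60 ceiling
operating_cost = 20.0  # $/barrel to keep an existing well running
full_cost = 70.0       # $/barrel including recovering the investment

print(keep_producing(price, operating_cost))  # existing well: keep pumping
print(price >= full_cost)                     # new well: not worth drilling
```

At $50 per barrel the existing well stays in production, but a new well whose full cost (including the up-front investment) exceeds the price would not be drilled, which is why investment in new sources dries up even as current output continues.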

The low prices are partly the result of increases in supply from large-scale investment in new sources of oil over the past few years and increased output by OPEC. They are also partly the result of falling demand from China.

But are low prices all bad news for the oil industry? It depends on the sector of the industry. Extraction and exploration may be having a hard time; but downstream, the refining, petrochemicals, distribution and retail sectors are benefiting from the lower costs of crude oil. For the big integrated oil companies, such as BP, the overall effect may not be as detrimental as the profits from oil production suggest.

Articles
BP – low oil price isn’t all bad news BBC News, Kamal Ahmed (27/10/15)
Want to See Who’s Happy About Low Oil Prices? Look at Refiners Bloomberg, Dan Murtaugh (31/10/15)
Low prices are crushing Canada’s oil sands industry. Shell’s the latest casualty. Vox, Brad Plumer (28/10/15)

Data
Brent spot crude oil prices US Energy Information Administration
BP Quarterly results and webcast BP

Questions

  1. Why have oil prices fallen?
  2. What is likely to happen to the supply of oil (a) over the next three years; (b) in the longer term?
  3. Draw a diagram with average and marginal costs and revenue to show why it may be profitable to continue producing oil in the short run at $50 per barrel. Why may it not be profitable to invest in new sources of supply if the price remains at current levels?
  4. Find out in what downstream sectors BP is involved and what has happened to its profits in these sectors.
  5. Draw a diagram with average and marginal costs and revenue to show why profits may be increasing from the wholesaling of petrol and diesel to filling stations.
  6. How is price elasticity of demand relevant to the profitability of downstream sectors in the context of falling costs?

Operating in a cloud

“There are ‘incredible economies of scale in cloud computing’ that make it a compelling alternative to traditional enterprise data centers.” According to the first article below, cloud computing represents a step change in the way businesses are likely to handle data or use software. Rather than having their own servers with their own programs, they use a centralised service or ‘public cloud’, provided by a company such as Microsoft, Google or Amazon Web Services. The cloud is accessed via the Internet or a dedicated network. It can thus be accessed not only from company premises but by mobile workers using tablets or other devices and thus makes telecommuting more cost effective.

There are considerable economies of scale in providing these computing services, with the minimum efficient scale considerably above the output of individual users. By accessing the cloud, individual users can benefit from the low average costs achieved by the cloud provider without having to invest in, and frequently update, the hardware and software themselves.
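The driver of these economies of scale is the familiar one of spreading a large fixed cost over more output. A minimal sketch, with entirely hypothetical figures, shows why a large cloud provider can achieve an average cost well below what a small individual user could manage:

```python
# Illustrative average-cost sketch for providing computing capacity
# (all figures hypothetical). A large fixed cost (data centres,
# software, engineering staff) spread over more units of output drives
# average cost down towards the variable cost per unit, so a provider
# operating near the minimum efficient scale undercuts what a single
# firm could achieve running its own servers.

FIXED_COST = 50_000_000  # annual fixed cost of running the platform
VARIABLE_COST = 10       # cost per unit of compute delivered

def average_cost(quantity):
    return FIXED_COST / quantity + VARIABLE_COST

for q in (10_000, 1_000_000, 100_000_000):
    print(f"{q:>11,} units: average cost {average_cost(q):,.2f}")
```

With these assumed numbers, average cost falls from 5,010 per unit for a small in-house operation to 10.50 for a provider a ten-thousand times larger: the gap between those two figures is the cloud provider’s scale advantage.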

In the case of large companies, rather than using a public cloud, they can use a ‘private cloud’. This is hosted by the IT department in the company and achieves economies of scale at this level by removing the need for individual departments to purchase their own software and servers. Of course, the costs of providing the cloud are borne by the company itself and thus the benefits of lower up-front IT capital costs are reduced. This is clearly a less radical development and is really only an extension of the policy of many companies over the years of having centralised servers holding data and various software packages.

In autumn 2010, EMC Computer Systems commissioned economists at the Centre for Economics and Business Research (Cebr) to quantify the full impact that cloud computing will have over the years ahead. According to the report, The Cloud Dividend:

The Cebr’s research calculates that €177.3 billion per year will be generated by 2015, if companies across Europe’s five largest economies continue to adopt cloud technology as expected.

The Cebr found that the annual economic benefit of cloud computing, by 2015, will be:
• France – €37.4 billion
• Germany – €49.6 billion
• Italy – €35.1 billion
• Spain – €25.2 billion
• UK – €30.0 billion

Will the ability of cloud computing to drive down the costs of IT mean that a new revolution is underway? Just how significant are the economies of scale and are they likely to grow as cloud providers themselves grow in size and experience? The following articles look at some of the issues.

Articles
Microsoft: ‘Incredible Economies Of Scale’ Await Cloud Users InformationWeek, Charles Babcock (11/5/11)
Cloud in 2011: A bright new dawn…or a shadow hanging over IT pros? The Register, Lucy Sherriff (28/5/11)
Bubbles and Golden Ages So Entrepreneurial, Nick Hughes (24/5/11)

Reports and information
The Cloud Dividend EMC2
Cloud Computing Wikipedia

Questions

  1. What specific economies of scale are achieved through cloud computing?
  2. Why might the minimum efficient scale of cloud computing services be above the level of output of many companies?
  3. What are the downsides to cloud computing?
  4. How would you set about assessing the statement that we are on the brink of a fundamental revolution in business computing?
  5. Why are customer-heavy sectors, such as financial services, utilities, governments, leisure and retail, expected to buy into the concept fastest?
  6. How can product life cycle analysis help to understand the stages in the adoption of cloud computing?