Towards “Good Spectrum Citizenship”: Some Possibly Useful Heresies
by Robert Horvitz email@example.com http://www.openspectrum.info
First presented to the Workshop on “Legal, Economic & Technical Aspects of ‘Collective Use’ of Spectrum in the European Community”, Brussels, Belgium, 27 April 2006

We are big fans of the EU’s framework for electronic communications. This framework is founded on bedrock principles of liberal market economics and distils “best practices” that developed piecemeal over decades, forming them into a coherent package. We especially applaud the policy of “regulating only where necessary” and in particular the policy of “licensing only where necessary”, as expressed in the Authorisation Directive.
So it should not be a surprise to see how positively the European Commission’s Terms of Reference for this study on the “collective use of spectrum” treats its subject – asking how the “commons” paradigm can be extended, “what technological development will enable a wider use of shared spectrum … concrete measures to exploit the benefits of the ‘collective use’ approach,” etc. Bravo!
Many of you undoubtedly came to this meeting by train. When we take a long journey by train, we often have a choice of a private cabin, a shared cabin, or open seating without any cabins at all. Railroads know that most people will choose open seating to save money, so most of the seating they provide is in wagons without cabins. But the option of getting an exclusive space for the journey is available to those who want it and who are willing to pay more for it.
12 Volume 34 Number 3 July / August 2006
The analogy may be simplistic, but it might be useful to consider this approach in radio spectrum management. Traditionally, individual spectrum users are not free to choose whether they operate under a licensed or license exempt, exclusive or shared, primary or secondary regime because the regulator and the ITU have already made that choice for them. But when the Commission launched its WAPECS consultation in 2005 a survey of EU members found that “a wide range of frequency bands was identified for WAPECS, the majority being for licence-exempt operation”.
A possibility of choice
Since both licensed and unlicensed bands are being considered for WAPECS – and for WiMAX, and for radio LANs in the 5 GHz band – and perhaps even for Ultra-Wideband – these are situations in which service providers have some possibility to choose among licensing regimes. These are opportunities where the Commission can see what happens when providers are offered a choice, and so better understand which regime is appropriate for various services under various conditions, from the users’ perspective, and learn more about how heterogeneous mixes of spectrum rights within a service interact.
It has proven difficult for the Radio Spectrum Policy Group (RSPG) to determine when a given service should be licensed or license exempt. We think this is something that ought to be decided by the marketplace, and even by individual users, as it is on trains – although it might reduce spectrum utilisation efficiency somewhat and require some new techniques to enable users with different rights and restrictions to coexist. Of course, shared bands with primary and secondary users have dealt with these issues for decades – and found them workable.
We must also comment on the current study’s obligation to assess “the amount of spectrum to be made available for ‘collective use’ in the future.”
Is licensing the norm?
Many people still have the habit of thinking of radio licensing as the norm, and exemption from licensing as a special case that must be justified by need. This is contrary to EU policy. At the risk of stating what is already well-known to this group, the Authorisation Directive says:
“The granting of specific rights may continue to be necessary for the use of radio frequencies … [but] rights of use should not be restricted except where this is unavoidable in view of the scarcity of radio frequencies and the need to ensure the efficient use thereof.”
In other words, licensing is the exception that must be justified by need, not license exemption. The question that should be asked is:
how much spectrum to allocate for licensed services?
Is there a way to estimate the spectrum needs of all the services requiring the protection of a license? When one rephrases the question this way, one can see how unreasonable it is. Especially now that hybrid systems – combining licensed base stations with unlicensed handsets, as in GSM networks – are so widely deployed. Would you count the GSM bands as licensed or unlicensed? The licensing treatment of frequencies above 25 GHz is also still being debated, and the outcome of that debate can shift the size of the allocation estimate by an enormous amount.
The auction incentive
Unfortunately, since licensed spectrum is increasingly awarded by auction, governments have a strong incentive to answer the question of how much licensed spectrum to make available with a simple “as much as possible.” Even without auctions, administrative fees levied for licenses cover a large part – or even the entire – cost of operating a regulatory agency. So to extend the “commons” paradigm, the current study might recommend incentives for the allocation of spectrum for collective and license exempt use, to counterbalance the incentives which now reward licensed allocations.
The ECC recently released a draft report on license exemption’s impact on the funding of spectrum management. Summarising and analysing a recent survey of regulators in the EU member states, this report complements the study we’re discussing. With regard to converting services from licensed to license exempt, the ECC notes that:
“only one risk was mentioned by respondents, while a long list of benefits could be drawn out of all the comments made. This stands in contrast with the current extent of licence exemption or with the relatively low number of administrations that intend to exempt further applications in the future.”
Part of the problem may be that the “long list of benefits” applies mainly to the market or to society at large, not to the regulator. Although license exemption was said to yield savings in staff time and a lightening of the workload, the loss of revenue from license fees apparently counteracts these benefits.
The ECC report also makes the important point that while regulators generally have positive experiences with license exemption, they are not moving swiftly to de-license more bands, even though that seems to be called for by the EU’s electronic communications framework. We think more Commission leadership is needed to drive this process forward. EC leadership would undoubtedly promote harmonisation, and the regulators say harmonisation helps them de-license.
So let me offer a specific suggestion:
set a deadline for de-licensing the maritime mobile service throughout the EU.
Countries with laws foreclosing that option should move from individual licenses to class licenses or general authorisations by the deadline date.
You may know that last year Ofcom proposed de-licensing ship radios in the UK, but found many users opposed to this, in part because they would still need a radio license when they left UK waters. However, ECC Report 83 points out that Denmark has been gradually de-licensing maritime radio since 1997, while Sweden plans to exempt the use of VHF by private boats this year, and the Netherlands plans to exempt some maritime mobile equipment in the near future. Australia, New Zealand and the US have already taken similar steps.
So here is a service where there is a limited risk of interference, there is already some uncoordinated movement toward de-licensing, and where de-licensing would clearly be more acceptable with a harmonised approach.
The issue of harmonisation
The ECC report also discusses the broader significance of harmonisation, revealing a strange schizophrenia: “Administrations are more likely to embark on exemption when a harmonised CEPT approach is taken,” the report asserts. And yet numerous examples are cited of services that are being de-licensed without harmonisation.
My own feeling is that while harmonisation is desirable, it should not be a prerequisite for delicensing. Countries have enough autonomy in spectrum use that they can exempt devices from licensing when they do not cause transborder interference, even when they do not conform to the international table of frequency allocations, and even without international coordination.
The movement toward de-licensing is fragile enough that it should not be burdened with a requirement for EC-wide harmonisation. If the US had waited for regional harmonisation, the introduction of WiFi, Ultra Wideband, etc., might have been delayed indefinitely. Mexico did not officially de-license WiFi until last month.
George Soros has often pointed out that open free societies are more complicated than closed totalitarian ones. Therefore, time, effort, learning, strife, debate, money, creativity and work are needed to evolve a closed society into an open one. One cannot simply say, “go now and be free” and expect a sustainable political lifestyle to appear automatically.
The same may be true in the radio spectrum. Simply de-licensing a band will not instantaneously produce a peaceful orderly commons filled with polite users causing no interference. Users need to learn how to self-manage their space, although it is a tremendous advantage that radios are getting smart enough to handle some aspects of their operation on their own, from channel selection to power level settings.
Device-based and user-generated conventions, strategies and institutions must be cultivated in newly de-licensed bands, as they were in the licensed bands of “collective use” like amateur radio, aeronautical mobile, and so on. Associations of community networks in the WiFi band, and the many user-created websites with educational materials, directories, recommendations and discussions about license exempt equipment are early buds on this tree which facilitate the formation of sustainable commons.
Tiers of spectrum access rights
I would close by drawing your attention to some recent ideas from Kalle Kontson and Michael O’Hehir. In their presentation at this year’s ISART symposium in Colorado, they outline a “regulatory model that rewards the implementation and deployment of spectrum-efficient technologies by offering incentives in the form of progressively expanded tiers of spectrum access rights in proportion to device performance.”
The sharp distinction between licensed and license exempt (which is inflated by the “spectrum as property” model) would be replaced by a continuum of spectrum rights. Actually that has already occurred, but the link to device performance is not yet as direct as it is in Kontson and O’Hehir’s proposal. Nevertheless spectrum access rights already come in many shades and degrees of freedom:
License exempt bands open to a wide variety of applications (e.g. ISM)
License exempt bands open only to one application (e.g. cordless phones)
License exempt bands open only to devices with mitigation technology (e.g. 5 GHz)
License exempt bands where registration is required (e.g. high-powered RLANs in Japan)
Class-licensed bands (many countries)
“Light licensed” bands (e.g. in the UK)
Licensed bands accessible to large numbers of users by administrative fee
Licensed bands accessible to a few users through competitive bidding
Licensed bands open only to tested/certified operators.
In addition to helping us see what is already here, Kontson and O’Hehir provide a missing piece of the puzzle: where a transmitting device fits in the continuum of spectrum access rights could be based on a “scorecard” assessing the device’s “spectrum citizenship.” This scorecard would be a refinement of the existing processes of equipment testing and type approval. Instead of a simple binary judgment – “approved” or “not approved” – equipment could be graded by a “standard set of metrics and tools to assess the worthiness of individual devices to reap rewards for good spectrum behavior, and restrict bad behavior.”
How the scorecard might work
Points might be awarded for spectral efficiency, resistance to interference, high data throughput, etc., while points might be subtracted for spurious emissions, lack of automatic transmit power control, the absence of interference mitigation techniques, etc. The net effect would be that higher scoring devices would enjoy more spectrum access rights. Devices falling below some minimum score would have no access rights at all.
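The scoring scheme just described can be sketched in a few lines of code. To be clear, the metric names, point values and tier thresholds below are invented for illustration – Kontson and O’Hehir propose the approach, not these particular numbers:

```python
# Hypothetical "spectrum citizenship" scorecard in the spirit of Kontson and
# O'Hehir's proposal. All weights and thresholds here are illustrative only.

# Points awarded for desirable device characteristics...
AWARDS = {
    "spectral_efficiency": 20,
    "interference_resistance": 15,
    "high_throughput": 10,
}
# ...and subtracted for undesirable ones.
PENALTIES = {
    "spurious_emissions": -15,
    "no_transmit_power_control": -10,
    "no_mitigation_techniques": -20,
}

# Higher scores unlock progressively wider tiers of spectrum access rights;
# a device scoring below the minimum gets no access rights at all.
MINIMUM_SCORE = 0
TIERS = [
    (60, "wide access: multiple license exempt bands"),
    (30, "limited access: single-application band only"),
    (0,  "minimal access: registration required"),
]

def citizenship_score(features):
    """Sum awards and penalties for the traits this device exhibits.

    `features` maps trait names (from AWARDS/PENALTIES) to booleans.
    """
    score = sum(pts for name, pts in AWARDS.items() if features.get(name))
    score += sum(pts for name, pts in PENALTIES.items() if features.get(name))
    return score

def access_tier(score):
    """Map a citizenship score to a tier of spectrum access rights."""
    if score < MINIMUM_SCORE:
        return "no access rights"
    for threshold, rights in TIERS:
        if score >= threshold:
            return rights
    return "no access rights"
```

An efficient but noisy device would thus land in a low tier: a radio with good spectral efficiency and interference resistance but spurious emissions scores 20 + 15 − 15 = 20 points, earning only minimal, registration-required access under these made-up thresholds.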
A system that dynamically links device performance to spectrum access rights would be challenging to put into practice, particularly as it conflicts with the idea of awarding spectrum rights by auction, trade or payment. But it has its own market-like logic and seems consistent with the trend suggested by WAPECS, expanding existing procedures into a more general and flexible framework. There would be no clash of award systems if this was implemented in the license exempt bands, which need incentives for “good spectrum citizenship” anyway.
‘OPEN SOURCE’ KNOWLEDGE ON THE NET
Recent issues of Intermedia discussed ‘the Wikipedia phenomenon’ – the now giant free encyclopedia on the Internet. What is the status of the knowledge created by its users? Here is a view.
Is a million articles proof of authentic information?
By David Shariatmadari

When Marcus J. Gilroy-Ware examined the rise of Wikipedia in September last year, the English version of this groundbreaking free encyclopaedia had about 730,000 articles. Evidently, a lot can happen in a few short months. At the beginning of March, the number of English articles hit the 1 million mark and continues to rise. In contrast, the print version of Encyclopaedia Britannica manages a mere 65,000.
But is the Wikipedia phenomenon really so remarkable? We already have Google, a tool that’s made the internet come good on its promise to bring the world’s knowledge to our fingertips. What’s the advantage of an open source project that reflects the whim and the weaknesses of its users? The answer’s simple. Wikipedia, specifically intended as a work of reference, explains things from first principles.
Type “H5N1” into Google and (apart from its Wikipedia entry) you get a press release by the World Health Organisation, a report about a lawsuit, and news stories from around the world, all pitched to different audiences. Use Wikipedia, and you’ll be able to place the information immediately, to see exactly where it fits in to the scheme of things: “Influenza A virus subtype H5N1, also known as A(H5N1) or H5N1 …”. All the technical terms are linked to their own Wikipedia entries, so the knowledge is accessible to non-specialists without being dumbed down.
As for claims that you can’t trust a word it says, well, you’d certainly be foolish to trust it blindly. The same goes for Google. It’s a question of using your nous: making your judgement based not merely on what it tells you, but on what you know from experience about the ways in which it might be misleading.
So, cards on the table: I’m a fan. But I do have one concern. With no editorial policy as such, is it possible that Wikipedia’s millions of pages don’t manage to cover quite as much ground as all those zeros would have us believe? What if half of them are like the article on Millersburg, Indiana – population 868 – or Quail, Texas, which boasts just 33 residents?
Then there’s the piece on Jordanhill railway station, Scotland, which has the somewhat less than newsworthy distinction of being the 1,029th busiest in the UK (the trainspotters seem to have infiltrated Wikipedia quite thoroughly).
As it happens, a bunch of users have banded together to help make sure that Wikipedia reflects more than just the concerns of your average white, male, “technically inclined”, developed-world 20-40 year old. They are the brave members of Wikiproject: Countering Systemic Bias. What they’re doing isn’t all that dramatic – they initiate and nurture articles on poorly represented topics and provide a forum for like-minded members.
But it’s essential to Wikipedia’s intellectual credibility. “I started this after using Wikipedia for a while and realising it was incredibly biased” says Xed, the project’s founder. It seems that the predilection of its users for sci-fi and fantasy was being indulged at the expense of more serious topics: “I made the point at the time that there was more information on Middle Earth than Central Africa.”
Other examples of yawning gaps aren’t difficult to find. The entry on British glamour model Jordan is more than twice as long as the article on Selim, ruler of the Ottoman empire from 1512 to 1520 and conqueror of Palestine and Egypt. Zyxoas, a Wikipedian from South Africa, cites another typical case:
“Sesotho is a language spoken by some four million people in two countries. When I found the article a couple of years ago it was a two paragraph stub. I thought ‘Hey! That sucks!’ and I dumped a whole bunch of linguistic info in there.”
Sesotho isn’t the only language to have been given short shrift. Compare the three line entry for Yi, the mother tongue of 6 million people in China, to the 260 line whopper for Scots, which has 1.5 million speakers in the UK. If you’re about to counter that the Chinese version will certainly have a more detailed entry, you’d be right, but at around 30 lines it leaves a lot to be desired, especially when you consider that Klingon, which is nobody’s mother tongue, has an entry of over 200 lines.
Do these kinds of imbalances really matter? After all, Wikipedia is a creature of the demographic that views and contributes to it: it mirrors their interests exactly. So if we have very little information on Yi, that’s because there’s no demand for it. Which sounds uncannily like free market economics applied to knowledge.
But just as very few are in favour of genuinely laissez faire capitalism, a lot of people are turned off by the results of a total lack of management when it comes to information.
If, as the encyclopaedia’s founder Jimmy Wales says, we’re doing this – at least in part – “for the child in Africa who is going to use … reference works produced by our community and find a solution to the crushing poverty that surrounds him”, then Wikipedia can’t just be a free-for-all. A little nudging might well be needed to keep things on the right track. The nudging needn’t come from on high – in fact, nothing could be more collective, more grassroots than Countering Systemic Bias.
It’s difficult to fault the concept. Let the techies from Texas have the articles about their home towns and favourite characters from Stargate Atlantis, but let’s make sure we have stuff on Egyptian political parties and Brazilian writers too. Only that way will Wikipedia achieve its full potential, and as an added bonus its detractors won’t be able to complain that it’s got more on C-list celebs than it has on sultans.
There’s evidently a lot of work to be done and despite the support he’s received from ordinary users, Xed strikes a pessimistic tone:
“There is a lot of utopian feeling around about the internet, especially about projects like Wikipedia. There’s a belief that it will bring the world closer together and herald a new era of people power. I don’t see any evidence for that. The same patterns seen in the mass media are replicating themselves on the internet.”
It would be a shame if that were true. So, if you’re reading this, and you’re an authority on some under-represented area of human experience, you know what to do: get your ideas together, get registered, and contribute.
This article is published by David Shariatmadari and openDemocracy.net under a Creative Commons licence. For further discussion, see the openDemocracy website.
In the Front Row of History?
Journalists Under Fire
By Howard Tumber and Frank Webster
Sage Publications: 2006

This study of how print and TV journalists actually behave when covering warfare revolves around the notion of ‘Information War’. ‘Information War’ has, however, two separable meanings in the authors’ usage. One meaning relates to the actual conduct of military operations: the increasing reliance on high-tech information technology to guide and control both the weaponry and the intelligence data upon which its control is based.
In this sense, the military are no longer to be identified with a nation’s population at large, who no longer provide the mass citizen armies that were the norm from Napoleon’s time to WW2. Most citizens no longer have experience of warfare or its preparation: soldiering is for a highly trained specialist minority at the margin of society whose reliance (at least in industrialised nations) is almost entirely on information technology to be effective. Hence their academic nicknames as ‘knowledge warriors’ and ‘digital soldiers’.
The second sense lies in the battle for ‘hearts and minds’, but this time it is the hearts and minds of the army’s home state, the mobilisation of public opinion and public support for the current conflict. Without that, in a democracy, and in a world of mass media and instant news, the military strategy will ultimately fail. The ‘war’ lies in the competing evaluations of the current conflict that may be out there, either at home or as between the combatants. This is ‘psych ops’ rather than ‘combat ops’.
The question posed by this interesting and well-researched book is, what role do journalists play in this type of ‘Information War’? In particular, what role is played by those who used to be called ‘war correspondents’, but who no longer like that once-proud title? From this book, it appears to be the current fashion to disclaim any perception of being a ‘war junkie’ who gets his or her kicks from the thrill of danger and gunfire. Rather, these journalists prefer to be known just as journalists who happen to be covering a war, and who tomorrow might be covering the World Cup.
That the profession (if it is one) is still a dangerous one, is beyond dispute. Between March 2003 and July 2005, 52 journalists were killed on duty in Iraq, with another 21 ‘support staff’ also killed. In the decade 1995-2004, a total of 341 journalists were killed while at their work. But it is notable that of these totals, only 68 (or 20 per cent) were killed in crossfire (bad though that is), while the rest were murdered, often in reprisal for their reporting. It is local journalists covering contentious local stories that run the greatest risk. But cameramen, who have to get very close to the action, also pay a heavy price in terms of lives lost.
Front row of history
But despite or because they are (as one says) ‘in the front row of history’, most journalists questioned in the preparation of this book stick to the now traditional view that their job is to ‘tell it like it is’. This was not always the case. In WW2 war correspondents donned uniform and saw it as their patriotic duty to support the nation’s war effort: they were not outside the conflict, but responsible members of it.
But does ‘telling it like it is’ guarantee that the general public gets a reasonably objective and reasonably comprehensive picture of what is going on? The most obvious limitations are the emphasis on instant news created by rolling all-news TV and radio channels; the increasing control that contemporary communications technology gives to editors and news desks back home over what a journalist actually files, and when; and the simple fact that, with all the reports coming in to any big news desk for instant recycling to the public, ‘my mum knows more than I do about what’s going on’ (to quote one journalist in the field). It is an oddity that ‘mum’ often knows more, much more, than the actual soldiers in the field.
But from the public’s point of view, there is a bigger question mark over the frequent American practice of so-called ‘script approval’. This is the practice by which a proposed script for a piece for TV is emailed back to base for prior approval before actual recording, and is often changed – in effect, say other journalists, ‘censored’. Some argue that it is about maintaining house style.
Above: still from Christian Frei’s documentary about James Nachtwey, war photographer
Others call it ‘disgraceful’. On the other hand, information technology in the form of the Internet is increasingly circumventing the traditional channels. Individuals – the so-called ‘citizen journalists’ – are posting their own evidence – text, pictures, voice, eye-witness accounts – on the Internet, and creating their own discussion and opinion groups.
One of the more gruesome sections of this book deals with the fact that the most barbaric acts recorded and distributed by someone on video (e.g. the beheading of hostages) are not shown uncut on mainstream television (nor, for that matter, are the more disturbing pictures of the dead bodies of those killed) but can be found uncut on, of all places, major pornographic websites based in the USA. One such is ogrish.com, which (say the authors) ‘specialises in hard-core pornography, bestiality and ghoulish photographs of accident victims’. This is not a pleasant aspect of western media.
But are the war-zone journalists simply seekers after the truth, or are they just ‘cogs in the machine’ of what is routinely called ‘perception management’? The authors pose this question, but shy away from answering it head-on. If there is an Information War, say, in the case of Iraq or Iran, then a war has two sides (at least two).
In a way the statements by individual correspondents, honest as they certainly are, tell us little about the wider issues of how ‘perception management’ actually works, and to what effect. It may be unfair to call such journalists ‘hotel room heroes’ sending out ‘herograms’ – although that used to be the tradition. But the question is, how does it add up?
There are obvious items in the picture, like the system of ‘embeds’ whereby individual journalists travel and operate with and among a particular fighting group. On the one hand, this is (or appears to be) a way to be uniquely present at the frontline, with the attendant risks. On the other hand, as journalists freely admit, it is impossible to be ‘objective’ about the men and women with whom you share danger, food, and the showers. It is a different journalism.
But the main Information War takes place, in part in the ‘fog of war’, but in larger part in what has been called the ‘transnational public sphere’, a sphere where the weapons are not as unequal as they are in the actual military conflict, where ‘virtuoso weapons of war’ can ensure massive disparity between combatants. ‘It is a domain where the digital camera and the Internet web site can play a key role in defining reality’, write the authors.
But who exactly is ‘managing perception’, and how successfully, is a matter that the authors somewhat shy away from. Nor do they take a view about whether ‘perception management’ is a short-term strategy doomed to eventual failure – or whether our common experience of war at second hand is sufficiently sanitised, by a collective social decision about the norms of reporting and of news presentation, that war becomes little different to a video game.
But such studies as this, clearly written and presented, provide a valuable insight into the inner structures of news reporting, and into the motivations of those who run such risks to keep us informed – within certain limits. Academics also are ‘knowledge warriors’, and such books as this are ‘virtuoso’ items in their weaponry.
graphic courtesy of Editor and Publisher, as provided by Google image search
THE DIGITAL SWITCH-OVER
Going digital – the problem of ‘resisters’, ‘refuseniks’ and the last 10% on ASO/DSO day
That’s Analogue Switch-Off and Digital Switch-Over day to the unlettered
(With thanks to City University, London, for organising a conference on this important subject)

The pace of the switch-over from analogue to digital varies widely across Europe, as the chart on the opposite page (kindly provided by Digital UK, the UK body responsible for publicising the switch-over) shows very clearly. Germany is in the van, but the overall time-scale across Europe is far from co-ordinated – even if the stated plans of individual countries are stuck to, which many believe they will not be.
The resulting technical problems are principally about dealing with interference in the border areas of one country from the signals emanating from another country, during the Europe-wide switch-over period lasting perhaps decades. These technical problems have been the subject of the recent European Regional Radio Conference, and their solution – probably a very adequate technical solution – revolves around the notion of ‘interference envelopes’ within which signals on any given frequency at any given site will be restricted or governed by mutual agreements (see Intermedia April 2006 issue) while leaving freedom on how to actually use that frequency.
But there is growing recognition across Europe that there is or may be a social problem as the switch-over progresses – that of ‘the last 10 per cent’. And it is a policy issue, and therefore a political issue, as to how each country deals with this problem. For the switch-over creates the problem, or at least the potential problem, of exclusion – exclusion of those who cannot or will not adapt to digital.
Some refer to this as ‘late-adopter anxiety’. But there are also cost implications – the cost of buying new kit so as to receive digital TV signals is not high for most wage and salary earners, but may be high for poorer and older age groups. There is a debate about whether governments, whatever they may say now about not subsidising users’ switch-over, may be forced to change their minds when the political reality, or political backlash, starts to hit on or near an election time. They may, for example, find it politically expedient either to buy up old equipment or to subsidise the new.
Early trials suggest that about half of people over the age of 75 may need help with managing the switch-over in their homes. There are in today’s societies substantial numbers of people over 85, and indeed over 90, and many if not most of these have rarely used a computer or learned how to navigate by on-screen menus. On the other hand, the over-65s spend much more time watching TV than the average, so their need for help may be more pressing – yet at the same time the benefit to them of the switch-over, once accomplished, may be all the greater.
Many thousands of older people never leave their homes, and they may especially benefit from the ‘social’ services that digital could offer – especially if the needs of older people are properly analysed, not least by asking them.
On the other hand, to dismiss these older age groups simply as ‘late adopters’ or ‘vulnerable’ can sound insulting, and create antagonism. Nor is it the whole problem. Many people are still buying TV sets that will, come D-Day (Digital Day), become obsolete, and they may not take kindly to this enforced obsolescence. A large proportion of existing TV sets are still analogue ones – maybe over half.
There are of course groups who are vulnerable in a very basic sense – for example, the disabled, the blind, those on income support, and those who rely on communal TV systems e.g. in social housing groups. The blind or partially sighted have particular problems and needs. Sight problems are of course notably prevalent among older people, those over 75 in particular, but are far from confined to them. Many blind people are on income support in some form or another, so the categories of the ‘vulnerable’ overlap considerably.
Perhaps surprisingly, a large majority of blind people use TV as their key source of news and entertainment. One way to help blind or sight-impaired people is to add an audio description to the signal, making the TV more accessible by, in effect, additional narration. The digital switch-over is a chance to increase the proportion of programmes with this type of audio add-on, but there are questions about whether it will happen.
Once again, navigation on the screen relies on reading text menus, and the blind need voice output to help them deal with this navigation.
[Chart: the pace of digital switch-over across European countries, 2005–2013, courtesy of Digital UK]
Last but not least, there may be an irreducible percentage (5 per cent?) who will never convert even when the switch-over happens – presumably these (mostly retired) people, even if they use TV now, will simply stop watching.
Views seem divided between those who say that, when the day comes, people will just take the digital switch-over in their stride, provided only that there is good publicity between now and then and a good scheme to help the vulnerable 10 per cent is in place; and those who take an alarmist view and say that there will be a forceful backlash when the reality and costs of the switch-over hit ordinary consumers, who may feel press-ganged into doing something they don’t see the point of – i.e. a protest against the apparent compulsion involved.
Then there is HDTV … All that is additional to the question of whether all the technical preparations go well, not least the conversion of many hundreds of transmitters – which some doubt.
There is also the latent problem of HDTV – should the switch-over be not only to digital, but simultaneously to HDTV? Of course, this would alter all the calculations about spectrum use and the lucrative freeing of spectrum space that is to go to government auction. But it is just possible that consumer trends in the retail marketplace may, between now and Digital Day (which in some countries is still far off), force the issue of HDTV, in HDTV’s favour. That would change all the parameters.
… and broadband
There is no doubt that the switch to digital will and should take place, on any rational calculation, in the interest of the best possible use of a scarce public commodity, the spectrum.
But whether that switch should be entirely driven by TV broadcasting interests, whether it is the right broadcasting standard for the next half-century, and whether there might be better social uses for that freed spectrum such as wireless broadband, broadband being the ‘new inclusion’ – all these are unfinished debates.
Chart opposite acknowledged to Digital UK
The blind and partially sighted actually spend a lot of time with their TV sets – but need better help to do so. Photos: RNIB
BROADBAND REGULATION This important article first appeared in IDATE’s publication ‘Communications and Strategies’ No 60
Is the USA Dancing to a Different Drummer?
Is the United States in full retreat from internationally recognized regulatory best practice? Or is it instead headed toward some different destination – “dancing to the beat of a different drummer”? Where is this likely to lead?
By J. Scott MARCUS WIK-Consult GmbH, Bad Honnef, Germany
“If a man does not keep pace with his companions, perhaps it is because he hears a different drummer. Let him step to the music which he hears, however measured or far away.” Henry David Thoreau, Walden, 1854
In a widely read white paper that I wrote while at the FCC in 2002 (FCC, 2002a), I argued that the then-nascent European regulatory framework for electronic communications should generally reach regulatory conclusions similar to those of the United States. The U.S. and the EU had similar pro-competitive objectives, and U.S. regulators over the prior forty years had been consistently reaching conclusions that would have been logical outcomes under the new European system.
In revisiting these themes a scant three years later, I find that subsequent experience no longer supports them. On the one hand, the European system is in full swing, and the system that seemed novel and radical three years ago is generally functioning as was expected and hoped (European Commission, 2004; 4th ZEW Conference, 2004). What has radically changed is telecoms regulatory practice in the United States. The U.S., in a long series of regulatory decisions, has largely abandoned its long-standing regulatory principles and moved in an entirely new direction.
The European sector-specific regulatory system for electronic communications rests primarily on formal mechanisms of market definition, determination of Significant Market Power (SMP), and imposition of proportionate (minimally adequate) remedies if, and only if, SMP is present. These core elements are implemented in a technologically neutral, all-embracing framework that is harmonized with European competition law.
Telecommunications law and regulation in the United States lacks this elegant formal structure; it emphatically lacks technological neutrality; and it largely pre-empts the operation of competition law (antitrust). Nonetheless, U.S. law implicitly recognizes market power by identifying categories of telecommunications carriers who could reasonably be presumed to possess it.
Moreover, U.S. law and regulation until roughly 2002 generally assigned an array of obligations (obligations similar to European SMP remedies) to carriers who were presumed to possess market power, including: interconnection, non-discrimination, transparency, unbundling of loops (and other elements), accounting separation and controls on pricing.
Consequently, it seemed to me that the U.S. and the EU should generally reach similar regulatory conclusions, despite major differences in their regulatory processes.
That conclusion turns out to have been incorrect. The U.S. subsequently reached one regulatory conclusion after another that would have been implausible or impossible under the European regulatory framework.
In a series of rulings over the past few years, the FCC has systematically deregulated wired facilities, especially those used in support of broadband Internet services. Deregulation is generally viewed globally, and in Europe specifically, as the appropriate response to the emergence of competition. As SMP fades, remedies are no longer necessary.
The concern that must be raised with this series of FCC rulings is that none of them contains any economic analysis worthy of the name. Indeed, in reading the rulings, it is difficult to escape the conclusion that they refrain from rigorous analysis because their authors know that it would not support the desired conclusions. Instead, they deregulate in response to non-binding statements of intent on the part of wired incumbents, and in the hope that new technologies might generate sufficient competition at some unspecified time in the distant future to warrant the deregulation granted in the present.
The regulatory system in the U.S. has thus been characterized in recent years by deregulation, despite the likely presence (at least in some relevant geographic markets) of SMP.
Several other factors are reinforcing an apparent abandonment of pro-competitive principles and a tilt toward the wired incumbents, including:
– a series of large-scale mergers, permitted with only minimal conditions imposed on the parties;
– an apparent willingness to impose new regulations – in at least one case, harsh and lopsided regulations (MARCUS, forthcoming) – only when they disadvantage new entrants to a greater degree than wired incumbents;
– as regulatory changes cause financial losses at formerly competitive firms, forcing market exit or acquisition, a corresponding decrease in the funds available for pro-competitive lobbying.
In the context of the U.S. regulatory and political system, this creates a feedback loop, reinforcing the regulatory tilt.
So regulators are abdicating at the very moment that the industry is consolidating. Where is all of this likely to lead?
It is important to bear in mind that there are fundamental differences between the U.S. market and those of most European countries. Cable TV is far more prevalent in the U.S. than in most European countries. In the U.S., the suppression of competitive market entrants leads, not to monopoly, but rather (for the foreseeable future) to duopoly. More precisely, it leads to a series of non-overlapping geographically specific duopolies for wired broadband services at the retail level in most parts of the United States, and to continued decline in an already patently ineffective wholesale market for wired broadband access.
This is not to suggest that the FCC’s commissioners all woke up one morning, miraculously and simultaneously inspired with the notion that duopoly was the very thing that America needed. To the contrary, these effects are more likely to have been inadvertent than intentional. Intent aside, a long series of U.S. government actions (discussed in the following sections of this paper) have tended to strengthen wired incumbents at the expense of new market entrants. These decisions are mutually consistent and synergistic; moreover, they appear to have had considerable collective effect.
This is not a cheerful result. In economic terms, duopoly is not a good thing. It is not something to be sought out (except perhaps by the duopolists). The best that one can say is that it is a lesser evil than monopoly.
For the moment, one must say in fairness that the economic results to date are mixed. In terms of consumer welfare, they may not be as negative as one might otherwise anticipate. The future implications depend heavily on the success or failure of local telephony incumbents with video services transmitted over Fiber to the Home (FTTH) – if FTTH is widely deployed and widely adopted, and to the extent that it leads to effective competition for triple-play services, the U.S. might conceivably wind up with an enviable electronic communications environment.
Nonetheless, this is a radically different long-term vision from that of Europe. Moreover, it is a duopolistic world in which neither market forces nor regulation can be presumed to adequately protect consumer welfare.
The next section of this paper describes the FCC’s deregulatory rulings, which fail to properly analyse possible SMP. We then briefly consider two recent orders that appear to impact new entrants more than incumbent fixed providers. The subsequent section contains a few brief remarks on the trend towards industry consolidation. We then consider the economic implications of the system that appears to be on the horizon.
Deregulation failing to account for SMP
In the interest of brevity, we confine ourselves to only the most noteworthy proceedings, and to those directly relevant to wired broadband Internet access:
Shared access to DSL
The FCC eliminated the obligation for incumbents to provide shared DSL access to competitors (FCC, 2003). Prior to its elimination, this program had effectively spurred deployment and competition (KAHN, 2001).
Unbundling obligations for last mile fiber
The FCC decided not to require loop unbundling for fiber-to-the-premises, ostensibly in order to spur deployment.
Internet access via cable modem
Access to the Internet sold bundled with cable modem access was declared to be an information service, making it by default exempt from common carrier regulation. Possible SMP associated with last mile facilities was not addressed (FCC, 2002b).
Internet access via DSL
Access to the Internet sold bundled with DSL access was declared to be an information service, making it by default exempt from common carrier regulation (FCC, 2005a).
Non-discrimination obligations and obligations to offer DSL at wholesale
These obligations were eliminated for all wired broadband connections offered by telecommunications carriers (FCC, 2005). The FCC asserts that the wholesale market for DSL and cable modem Internet access services is effective, and will remain so in the absence of regulation.
To put this in a European context, these FCC proceedings have had the collective effect of breaking most of the rungs on the “ladder of investment” whereby new entrants would seek to progressively grow their businesses. European broadband adoption and deployment took off in the 2003-2005 period through the combined effect of (1) local loop unbundling (LLU), (2) shared access, (3) bitstream access, and (4) resale (European Commission, 2004; IEEE Communications Magazine, 2005). In the United States, the only rung that solidly remains is the unbundling of copper loops. European experience strongly suggests that LLU alone is not sufficient to ensure a robustly competitive market.
In general, these proceedings were justified on the basis of encouraging broadband deployment. None contains an SMP analysis that an economist would credit. If a European member state had notified the Commission of such an analysis, the Commission would have sent the country packing with polite – or perhaps not so polite – instructions to come back when it had done the job properly.
This reflects an important difference between telecoms regulation in the EU and the U.S. In Europe, the regulator is required to make decisions that are transparent and objective; furthermore, meta-regulation at the European level provides standards by which those decisions are to be made (Directive 2002/20/EC). In the U.S., by contrast, nothing prevents the supposedly expert agency from making subjective decisions (TABELLINI, 2002).
Courts may intervene if these decisions are contrary to law, but the courts lack a consistent basis for intervening where decisions are flawed in a public policy sense. They could in principle reject a decision based on flawed analysis as “arbitrary and capricious”, but only if the judges understand the subject matter well enough to recognise the flaws. In practice, there is a strong tendency to defer to the nominally expert agency.
The combined effect of these proceedings has been to substantially eliminate all regulatory obligations associated with last mile Internet access facilities, at both wholesale and retail levels, without consideration of whether SMP might be present in the underlying transmission facilities or not. Moreover, deregulation was carried out in a manner that makes it particularly difficult to subsequently reimpose remedies should they prove to be necessary.
The recent FCC decision eliminating wholesale obligations specifically argues that wholesale competition is not essential to effective retail competition (and then fails to rigorously analyse either). In Europe, we strive to ensure effective markets at the wholesale level, in order to avoid the need to regulate at the retail level.
These effects were reinforced by other FCC actions that served to benefit wired incumbents and to impede new entrants who offered traditional voice services. A particularly noteworthy ruling eliminated the most popular and cost-effective form of local loop unbundling, UNE-P. Other FCC proceedings effectively deregulated prices on private lines within a metropolitan area. The combined effect of these two rulings has been to increase the cost of a number of critical inputs that Internet service providers (ISPs) need, and to cause many actual or potential competitive suppliers to the ISPs (including MCI and AT&T) to exit the market or be acquired.
It is not my intent to argue the absolute rightness or wrongness of any particular one of these decisions. Rather, I observe that they collectively set a course that is very much at odds with pre-existing U.S. policy, and also with European notions of regulatory best practice.
Had such a regulatory course been followed in Europe, competition law might have provided a partial correction. That is not possible in the United States. Pursuant to a number of court cases, competition law is largely pre-empted where sectorspecific regulation is possible. More specifically, the provisions of the Communications Act do not constitute a separate cause of antitrust action. It is also worth noting that competition law in the United States differs in many ways from that of Europe – for example, if a firm has achieved market power through legal means, it is not illegal to charge a monopoly price.
Hobbling new entrants
The record on this point is not clear-cut, but there are reasons to suspect that this may be an emerging problem. Some of what follows is speculative. In recent years, it has been rare for new obligations to be imposed on wired telephone providers or on cable TV providers. Both industries maintain effective lobbying organizations.
Nonetheless, two significant orders impose new obligations on Voice over IP (VoIP) providers that interconnect with the Public Switched Telephone Network (PSTN), and one of them also imposes obligations on broadband Internet access providers. The first requires interconnected VoIP providers and broadband Internet access providers to instrument their networks in advance in order to facilitate any requests that they may get for lawful intercept (such as wiretaps). The second requires interconnected VoIP providers to fully support the most enhanced form of access to emergency services (E-911) by means of access to the E-911 services provided by wired incumbents.
That both orders were adopted at all suggests that wired incumbents did not lobby aggressively against them. A possible reason suggests itself immediately: the wired incumbents had already internalized most of the cost associated with these regulations, and thus had no reason to oppose them. On the one hand, they had expected to eventually be subject to these mandates; on the other, their implementations drew heavily on capabilities already in place for their conventional PSTN operations.
The costs to new entrants, however, are significant. Thus, incumbents could reasonably conclude that these orders provided them, all things considered, with a significant competitive advantage.
Some regulatory obligations for lawful intercept and for access to emergency services were entirely appropriate. What is striking in the case of the emergency services order, however, is the degree to which it imposes harsh, lopsided, even Draconian regulation on new market entrants (MARCUS, forthcoming). Given the VoIP industry’s active engagement with the emergency services community, and their significant investment in customer education on this point, it is difficult to understand the rationale.
The order provides a time frame of only 120 days to fully implement the system. (By contrast, mobile operators have been working on E-911 systems for many years.) It recognizes that the order, by effectively forcing new entrants to use the E-911 access of the wired incumbents, creates incumbent market power, but it fails to adequately address the issue. It required VoIP providers to obtain confirmation from 100% of their customer base, within just 30 days of publication of the order, that the customer had read and understood any limitations in the provider’s emergency service capabilities – a requirement so extreme as to be unenforceable. (The date has already been extended twice.)
The order recognizes the technical infeasibility of reliably determining the physical location of nomadic VoIP users, but goes on to impose the same aggressive 120-day implementation schedule for an error-prone and incompletely specified system of self-registration.
It is not yet clear whether this apparent willingness to impose regulation that has the effect of hobbling new entrants should be viewed as a trend, or merely as an anomaly. “A single swallow doth not a summer make”. Two swallows? Perhaps still not. We should watch the skies to see if more swallows appear.
The elimination of regulatory support for competitive providers appears to have had a significant financial impact on them. The results vary from firm to firm, with some becoming less profitable or unprofitable, some being forced into bankruptcy, some exiting the market, and some choosing to be acquired. Most notable among recent acquisitions are:
– Cingular’s acquisition of AT&T Wireless,
– SBC’s acquisition of AT&T,
– Verizon’s acquisition of MCI.
Two years ago, the U.S. mobile telephone industry included six nationwide players, and was characterised by intense competition. Today, with the Cingular/AT&T Wireless and the Sprint/Nextel mergers, there are four nationwide players – still enough to provide a reasonable level of competition. But the two largest firms are both controlled by fixed incumbents (Verizon Wireless by Verizon, Cingular by SBC and BellSouth). Competition is probably still quite adequate; however, there are grounds for wondering whether this complex cross-ownership landscape will serve to limit the degree to which the competitive mobile industry serves as a check on the ability of wired incumbents to exploit market power.
A few years ago, the United States had a vigorous competitive market for long distance services comprised primarily of AT&T, MCI, Sprint, and WorldCom (prior to the WorldCom/MCI merger). This market has eroded through normal evolutionary processes, not as a regulatory failure. Instead of constituting a distinct service, long distance became merely a feature of mobile and fixed telephony. AT&T and MCI were motivated to be acquired partly by the decline in their respective core long distance markets, and partly by the decline in their competitive local provider (CLEC) business precipitated by the elimination of UNE-P noted earlier.
The absorption of AT&T and MCI by local wired incumbents will probably prove to be problematic for another reason. AT&T and MCI were the only firms other than the incumbents that operated significant metropolitan fiber access rings in most major cities in the United States. The mergers will result in substantial increases in the cost of private lines within metropolitan areas, which will increase costs for new entrants, thus impeding market entry. The Department of Justice mandated divestitures, but they are grossly inadequate.
Firstly, they relate to just 789 buildings across the U.S. Secondly, the divestitures totally ignore the disincentives for Verizon to compete aggressively with SBC, and vice versa. Thirdly, the undertakings permit separate purchasers in each metropolitan area,
thus enabling the divesting parties to ensure that no purchaser acquires a sufficient footprint to compete effectively. The FCC obtained some additional merger undertakings, but even should those commitments prove to be effective they are scheduled to lapse in 24 to 30 months (FCC, 2005b). Higher costs for new entrants appear to be in the cards.
The net impact of industry consolidation, in a political economy sense, is quite marked. There are very few companies of any size remaining that have an interest in furthering pro-competitive regulation. Many of those that remain are ill-equipped to make substantial lobbying expenditures. The commercial interests of cable TV providers and of wired telephony incumbents will tend to be primarily to maintain, enhance or create market power.
In the context of the United States, this has to be viewed as a serious concern. The system is sensitive to lobbying dollars. Furthermore, there is little prospect of consumer advocacy groups correcting the imbalance – they lack not only money, but also privileged access to decision makers. The risk of a self-reinforcing cycle of regulatory capture is worrying.
To a European, it is natural to assume that the potential evolution of the U.S. broadband marketplace – and thus eventually of the totality of the U.S. communications market – into local duopolies will necessarily result in a massive loss of consumer welfare.
It is important to remember (again) that there are appreciable differences between the U.S. communications marketplace and those of most European countries. In most European member states, cable TV has only limited deployment and adoption. In many, the wired fixed telephony incumbent controls the largest mobile operator. The wired incumbent is, to all intents and purposes, the only game in town.
In the United States, competition between the cable TV industry (and broadcast satellite) and the telephony world provides a richer tapestry. Voice substitution with a still robustly competitive mobile industry serves as a further competitive check. Moreover, while long distance appears to be disappearing as a distinct market with the absorption of AT&T and MCI, it may be too early to predict exactly how things will play out in the end.
With all of this in mind, we proceed to consider likely developments in various segments of the U.S. telecommunications market going forward. Firstly, on the wired broadband side, competitive providers have never been a major force. Moreover, the FCC’s systematic elimination of pro-competitive regulation appears to have been effective – the market share of competitive providers (CLECs) has been flat or slightly declining over the past several years. The following graph of the relative proportions of DSL lines provided by RBOCs, other ILECs, and CLECs is based on FCC data.
[Chart: high-speed ADSL lines by provider type, based on FCC data]
As a percentage of all ADSL lines, CLEC lines were at 5.4% in June of 2003. They have steadily declined since, arriving at a paltry 4.3% by December of 2004.
[Chart: CLEC percentage of ADSL high-speed lines]
To put these figures into a European context, consider that the corresponding overall figure for the EU25 is 30% (and increasing over time), and that by this metric the United States has achieved less competitive market entry for wired ADSL broadband services than 21 of the 25 EU member states, including all EU15 member states. The best that can be said of these results is that the United States has achieved a higher percentage of competitive DSL penetration to date than Slovenia, Estonia, Cyprus and Latvia.
This competitive DSL supply at the wholesale level is essential in most European countries, as there tend to be few alternatives to the wired telephony network. France, for example, has obtained superb results in recent years thanks to shared access and bitstream access, which collectively represent a third of the market. Cable modem competition could not have driven sufficient competition in France – only 9% of the domestic broadband market reflects alternatives other than DSL.
[Charts: DSL retail lines market share, July 2004; French wired broadband adoption, July 2004]
At the same time, European member states that are fortunate enough to enjoy significant cable TV (or fiber) broadband deployment tend to experience exceptionally good broadband roll-out as a result of the combined effects of inter-modal competition and pro-competitive regulation of DSL facilities. This is especially true of Belgium, the Netherlands, and Denmark, all of which enjoy significantly higher overall broadband penetration than the United States. Their experiences would appear to contradict any suggestion that the widespread availability of cable broadband necessarily implies that the regulator should suppress the wholesale market on the DSL side.
Despite the lack of competitive ADSL supply at a wholesale level in the United States, most geographic areas enjoy a second source of broadband supply at the retail level. As of December 2004, 56.4% of the 38+ million high-speed lines (over 200 Kbps in at least one direction) in the United States were based on a cable TV service (coaxial cable). This results in the gross market structure for wired and wireless broadband shown in figure 5 below.
[Figure 5: Gross market structure for wired and wireless broadband in the United States (cable modem vs DSL shares)]
Cable and telephone company executives appear increasingly inclined to view one another as their most significant competitors. In a recent interview, Ed Whitacre (CEO of SBC) said: “I think the cable companies will be the biggest competitor across the footprint”.
One might not anticipate aggressive rivalry between cable and DSL under conditions approximating duopoly; nonetheless, prices for entry-level DSL service are low by global standards. Basic DSL with unlimited usage is typically available from large RBOCs on a promotional basis at prices in the neighbourhood of USD 15 (just over EUR 12) per month.
This is, to be sure, a price for a slow service: of the 13,817,280 high-speed DSL lines reported in December 2004, only 5,695,548 provided speeds of at least 200 Kbps in both directions. This implies that 58.8% of all DSL services in the United States are slower than 200 Kbps in one direction (generally upstream). The price nevertheless remains impressive.
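The 58.8% figure follows directly from the two FCC line counts just quoted; as a quick arithmetic check (a minimal sketch, not part of the original analysis):

```python
# FCC-reported U.S. high-speed DSL line counts, December 2004 (as quoted above)
total_dsl_lines = 13_817_280   # all high-speed DSL lines
symmetric_lines = 5_695_548    # lines with at least 200 Kbps in both directions

# Share of DSL lines slower than 200 Kbps in one direction (usually upstream)
slow_share = (total_dsl_lines - symmetric_lines) / total_dsl_lines
print(f"{slow_share:.1%}")  # → 58.8%
```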
Retail prices for entry-level DSL services have been declining in the United States. Whether they will continue to do so is unclear.
At present, there are a number of possible indications of real rivalry between cable operators and wired telephony incumbents. Cable operators have gradually amassed a customer base of some 3.7 million conventional voice customers over their cable plant (FCC, 2005c). This establishes the leading cable operators as significant CLECs.
Conversely, some of the largest RBOCs have committed substantial investments in FTTH, and indications to date are that they are making those investments and offering production services to customers. They are looking to establish so-called triple-play offers, as are cable operators, and to use them to lock in customers before the market settles into a new form. Potentially, both telephony carriers and cable operators will compete for customers with a package comprising voice, data, and video.
For now, however, there are more questions than answers about the long-term prospects of rivalry between cable and telephony. Is the present rivalry real, or are operators merely putting on a show for the regulators? Is the rivalry likely to be sustainable over time? Will price competition continue, or will the industry settle into a comfortable duopoly pattern with supracompetitive rents? Will FTTH continue to roll out? Will operators succeed in getting the municipal franchise rights that they need? Will telephone carriers get the rights that they need to distribute the content that consumers want to watch? Will customers accept FTTH as a substitute for cable TV and DBS satellite? Will the incredible bandwidth of FTTH ultimately provide telephone companies with a “trump card” in their rivalry with cable operators?
Returning to the wholesale market for broadband Internet access, it is difficult to analyse the significance of media other than DSL/telephony. The FCC either fails to systematically capture and report the data that would be needed to make an assessment, or else it systematically fails to capture and report it.
It is consequently not altogether clear whether cable is relevant to the wholesale market for wired broadband (the FCC’s claims to the contrary notwithstanding). Cable operators have never been under an overall regulatory obligation to provide wholesale services to unaffiliated ISPs. Historically, the ISP Earthlink was explicitly given certain rights to third party access in the AOL-Time Warner merger, and was able to build a customer base on that foundation. The record indicates that other ISPs, who also had rights in that merger agreement, were unable to make effective use of them. There were contracts, but no subscribers to speak of.
The cable industry may have concluded other contracts as well, but there are no indications that the number of subscribers through third party access to cable is sufficient to have a meaningful economic impact on rivalry at the wholesale level.
Broadband alternatives may be important in the long run, but in the short term they do not appear to have much impact on the competitive landscape in the United States. 3G has played no significant role in the United States to date. Wireless alternatives such as WiFi and eventually WiMax might be important in the long run, but no one (including the FCC) has decent data on deployment to date, and it is doubtful whether they can provide any effective competitive constraint on wired broadband prices today.
(The WiFi service in the Starbucks coffee shop across the street from my apartment in Washington, DC, did not represent a meaningful substitute for my Verizon DSL service). Broadband over powerline might be significant in time, or it might not. All of these options have varying degrees of long term potential, but in the short term none of them provides an effective competitive constraint.
In short, the competitive landscape in the United States differs from that of most European countries in some important respects. Some consequences of current U.S. regulatory trends can be predicted with a moderate degree of confidence, but many others remain highly speculative. The FCC’s assessments in the series of proceedings noted earlier should be viewed as exceedingly optimistic and naïve, but the possibility that the FCC will ultimately get lucky cannot be excluded.
A few years ago, it appeared that telecoms regulation in the U.S. and in the EU would routinely reach similar results, despite different underlying regulatory processes. Today, by contrast, it seems clear that the U.S. is moving in a different direction than Europe, and also in a different direction than U.S. regulatory policy prior to about 2002. The U.S. is, at best, dancing to the beat of a different drummer.
On balance, the current U.S. approach seems more likely than not to lead to a less competitive environment than that enjoyed in the EU. It has already led to a vastly less competitive environment at the intramodal wholesale level. At the same time, the U.S. environment differs from that of Europe in ways that might possibly serve to mitigate the potential negative impact. Many scenarios are possible, many of them bad, but not all of them irredeemably bad.
A serious concern is that many of the actions that have been undertaken may be difficult to reverse. Even if it were to somehow find the political will, it is not clear that the FCC would have the statutory authority to effectively reverse certain of the changes that it has made. Nor is the Congress likely to provide a quick fix.
Given the complexities of the political process in the United States, and the asymmetries emerging in the profitability (and thus the lobbying capabilities) of market participants, any actions taken by Congress are more likely to exacerbate problems than to correct them. The U.S. would thus appear to be committed to a trajectory from which it would be difficult to reverse or correct its course.

References
CRANOR, Lorrie Faith & WILDMAN, Steven S. (Eds) (2003): Rethinking Rights and Regulations: Institutional Responses to New Communications Technologies, MIT Press.

Directive 2002/21/EC of the European Parliament and of the Council of 7 March 2002 on a common regulatory framework for electronic communications networks and services (Framework Directive), Official Journal of the European Communities, L108, 24 April 2002.

ECTA (European Competitive Telecommunications Association) (2004): Annual Review.

European Commission (2004): 10th Implementation Report, December.

FCC (Federal Communications Commission):
– (2002a): Office of Strategic Planning and Policy Analysis (OSP), Working Paper 36, “The Potential Relevance to the United States of the European Union’s Newly Adopted Regulatory Framework for Telecommunications”, July.
– (2002b): Cable Modem Declaratory Ruling and Notice of Proposed Rulemaking, 14 March.
– (2003): In the Matter of Review of the Section 251 Unbundling Obligations of Incumbent Local Exchange Carriers; Implementation of the Local Competition Provisions of the Telecommunications Act of 1996; Deployment of Wireline Services Offering Advanced Telecommunications Capability, generally referred to as the Triennial Review Order (hereinafter TRO), adopted 20 February 2003, released 21 August 2003, starting at §255.
– (2005a): In the Matters of Appropriate Framework for Broadband Access to the Internet over Wireline Facilities; Universal Service Obligations of Broadband Providers; Review of Regulatory Requirements for Incumbent LEC Broadband Telecommunications Services; Computer III Further Remand Proceedings: Bell Operating Company Provision of Enhanced Services; 1998 Biennial Regulatory Review – Review of Computer III and ONA Safeguards and Requirements; Conditional Petition of the Verizon Telephone Companies for Forbearance Under 47 U.S.C. § 160(c) with Regard to Broadband Services Provided Via Fiber to the Premises; Petition of the Verizon Telephone Companies for Declaratory Ruling or, Alternatively, for Interim Waiver with Regard to Broadband Services Provided Via Fiber to the Premises; Consumer Protection in the Broadband Era (hereinafter Computer Inquiries Order), adopted 5 August, released 23 September.
– (2005b): “FCC Approves SBC/AT&T and Verizon/MCI Mergers (Corrected)”, 31 October.
– (2005c): Local Telephone Competition: Status as of December 31, 2004, July, p. 3.

IEEE Communications Magazine (2005): “Broadband Adoption in Europe”, April.

KAHN, Alfred E. (2001): “Whom the Gods Would Destroy, or How Not to Regulate”, AEI-Brookings Joint Center for Regulatory Studies, p. 23.

MARCUS, J.S. (2004): “Europe’s New Regulatory Framework for Electronic Communications in Action”, presented at the 4th ZEW Conference on the Economics of Information and Communication Technologies, Mannheim, Germany, July 2004. Available at: ftp://ftp.zew.de/pub/zew-docs/div/IKT04/Paper_Marcus-Invited.pdf

MARCUS, J.S. (forthcoming): “Voice over IP (VoIP) and Access to Emergency Services: A Comparison between the U.S. and the European Union”, IEEE Communications Magazine.

TABELLINI, Guido (2002): “The Assignment of Tasks in an Evolving European Union”, Centre for European Policy Studies, January.
THE GLOBAL DEBATE ON GOVERNANCE
Creative disorder or adaptive strain?
By Rex Winsbury

It seems a world away, and a long time ago, since Nokia was a simple maker of rubber galoshes and motor car tyres (see old Nokia poster, right), even though the blue rubber-covered Nokia mobile phone which I acquired. But Nokia’s transformation is only one example of the dramatic pace of change. Today we live in interesting times for the Internet, both as a test of its technical resilience and because of the struggle over its control or neutrality.
The World Cup, which will be well on its way by the time this issue of Intermedia reaches readers, will be a big test of the Net’s resilience as millions of football fans try to watch the matches live in real-time on their laptops or mobile phones, probably in office hours when they ought not to be. The World Cup will in particular be a test of those organizations providing live online streaming of matches, and of the office networks over which many users may in practice watch them, devouring, says one IT manager, ‘gargantuan amounts of bandwidth’.
Meanwhile, in the USA, where the really critical decisions about Net policy are made because that is where the Net giants like Google, Yahoo, Microsoft and eBay are headquartered, there is at the time of writing a battle royal going on about several new pieces of legislation that are going through the US legislative procedure, alongside at least one key court case. There is to be a new telecoms bill, alongside a new copyright act, while the cable company Cablevision is being sued by Hollywood over its new time-shifting programme recording service.
Meanwhile, other battles are being fought elsewhere, such as (in Europe, and in France particularly) the battle over whether France and other countries can continue to subsidise their cinema industry, and, as part of the Television Without Frontiers discussion, the battle over whether the European rules covering traditional ‘linear’ broadcasting should be extended to new media of the ‘pull’ variety.
The telecom bill recently provoked this extraordinary letter from the eBay CEO Meg Whitman, who could be an influential voice in the Net Neutrality debate. (See the last issue of Intermedia for discussion of Net Neutrality and its importance.) In addition to heading one of the most prominent Internet companies, Whitman is also a leading Republican. She recently sent this email message to many eBay users:
Dear XXXX,

As you know, I almost never reach out to you personally with a request to get involved in a debate in the U.S. Congress. However, today I feel I must. Right now, the telephone and cable companies in control of Internet access are trying to use their enormous political muscle to dramatically change the Internet. It might be hard to believe, but lawmakers in Washington are seriously debating whether consumers should be free to use the Internet as they want in the future.

The phone and cable companies now control more than 95% of all Internet access. These large corporations are spending millions of dollars to promote legislation that would divide the Internet into a two-tiered system. The top tier would be a ‘Pay-to-Play’ high-speed toll-road restricted to only the largest companies that can afford to pay high fees for preferential access to the Net. The bottom tier – the slow lane – would be what is left for everyone else. If the fast lane is the information ‘super-highway’, the slow lane will operate more like a dirt road.

Today’s Internet is an incredible open marketplace for goods, services, information and ideas. We can’t give that up. A two-lane system will restrict innovation because start-ups and small companies – the companies that can’t afford the high fees – will be unable to succeed, and we’ll lose out on the jobs, creativity and inspiration that come with them.

The power belongs with Internet users, not the big phone and cable companies. Let’s use that power to send as many messages as possible to our elected officials in Washington. Please join me by clicking here right now to send a message to your representatives in Congress before it is too late. You can make the difference.

Meg Whitman
President and CEO, eBay Inc.
Net Neutrality

So the battle-lines have been drawn. But the attitude of the US cable and telephone companies has come under fire, not just over the principle of who controls access to the Net, but also over the price, penetration and competition in the broadband market compared to, say, Europe.
Critics of the telecom and cable companies say that broadband penetration rates in the USA are low, and prices high, by comparison, and that this is the result of lack of competition. Indeed, the FCC chairman has said that while other services such as broadband, cellular and long-distance phone calls have dropped in price over the past 10 years, cable TV rates have actually increased 80 per cent.
But the telcos and cable companies argue that the idea of Net Neutrality, while fine-sounding as a principle, is in practice a lobbying movement by the big names in e-commerce, such as eBay, Google and Amazon. What proponents of Net Neutrality argue, by contrast, is that there should be a legal provision somewhere that would prevent a broadband network from favouring content and applications in which it has a financial interest. Clearly, there are principles and business models here which will have world-wide repercussions.
The new copyright law

What the critics of the new copyright law say is that it runs the risk of stifling innovation (a common charge against almost any form of regulation), but also that almost any incidental, server, cache, network, or buffer copy of, say, a piece of music, if made over digital networks and/or on personal computers, would be subject to copyright and would have to be licensed. This would be, they argue, a danger to all types of ‘fair usage’.
The Cablevision case

Cablevision plans to operate a so-called ‘network digital video recorder’, the essence of which is that consumers would be able to record and play back programmes that are stored, not on equipment in their homes, but on the network’s own servers. Whereas recording in the home is legal as ‘fair use’, argue the Hollywood studios and TV networks, the cable operator itself has no licence to store programmes.
If Cablevision wins the case, many other cable operators are expected to do the same, to extend their appeal to customers. Such a spread would, say some, have profound effects on advertising (because ads can be fast-forwarded) and on viewing habits, not because the principle is new, but because the whole business of time-shifting so as to store and play back later, and of axing the ads, would become much simpler and more ‘in your face’ to the viewer.
London v. Brussels?

Meanwhile, in Europe, there is at the time of writing a fierce argument about how the 1989 Television Without Frontiers directive should be updated – perhaps by the end of this year. The European Commission wants to extend full broadcast-type regulations to all forms of ‘linear’ services like TV viewed over the Internet or on mobile telephones. But it also wants to institute regulation, but lighter regulation, on ‘non-linear’ services such as video-on-demand, where it is the consumer that generates the request for the material. The UK’s Ofcom, for one, thinks that this extension of regulation, however light, would (once again) stifle innovation. The Commission replies that this involves a complete misreading of the proposals and over-states the impact of the changes to the rules.
Such matters are getting quite urgent, in that technical trials are already under way to determine what might be the global standard for TV transmissions to mobile telephones. The rivals include both a Korean candidate (T-DMB, already in use in South Korea) and a UK standard (DAB-IP), which sends TV and radio signals over the Internet. Both use Digital Audio Broadcasting techniques.
Effect on the Press

Meanwhile, it is interesting to note that, overall, the fast development of the Internet has not had the devastating impact on the printed press that many expected. The World Press Association reports that in 2005 some 439 million copies of paid-for daily papers were sold, up by a small but significant half a percent and continuing a longer-term upward trend. But more striking is that in Europe the trend is downwards, and the global increase is due to rising sales in Asia, particularly in China and India. The Association was particularly preoccupied with the phenomenon of the ‘citizen journalist’, the reader who creates his or her own news (see also the article on Wikipedia in this issue of Intermedia).
The global influence of the USA

How might these developments impact the rest of us? The model below was produced by the Global Business Network for a recent meeting of top ‘futurologists’. In 10 years’ time, said a majority of the participants, we will all be in one of the bottom two squares. This is not quite as bad as it looks, since it will be a more decentralized, networked world reliant much more on bottom-up organizations. One may hope that the current round of regulatory and legal arguments in communications will be resolved in this spirit.
[Figure: Global Business Network scenario matrix. Vertical axis: centralized, hierarchical ‘top down’ versus decentralized; horizontal axis: increasing versus decreasing global economic and cultural influence of the U.S. Quadrant labels include ‘Eurasian New Deal’ and ‘American Century’.]
International Institute of Communications
FORTHCOMING IIC EVENTS
UK CHAPTER MEETING, JULY, LONDON
DIGITAL RIGHTS MANAGEMENT – CRISIS OR HOT AIR?
The next meeting of the UK Chapter will look at some of the key issues surrounding the ongoing debate about digital rights management. Details of the date, programme and speakers to be posted on www.iicom.org.
2006 INTERNATIONAL REGULATORS FORUM, 16 & 17 SEPTEMBER, CYBERJAYA, KUALA LUMPUR
Hosted by the Malaysian Communications and Multimedia Commission, this unique global Forum, open only to regulators, will look at issues such as fixed and mobile communications, NGNs and digitalisation of content, media literacy and consumer protection. The meeting is intimate, high-powered and draws speakers and participants from across the world from the telecommunications, broadcasting and converged communications sectors. For further information please email firstname.lastname@example.org.
**Book by 14 July and benefit from an early booking discount. Special rates apply for regulators attending the IIC Annual Conference as well.**
IIC 37TH ANNUAL CONFERENCE, 18 & 19 SEPTEMBER, KUALA LUMPUR
Reaping the Communications Dividend – Promoting Business, Empowering Consumers and Serving Citizens
Supported by the Malaysian Communications and Multimedia Commission and the Malaysian Government
Sponsored by Maxis, Astro and Radio Televisyen Malaysia
Official carrier Malaysia Airlines
Telecommunications, broadcasting and the media sector overall are undergoing tumultuous changes as new technology and applications come on stream. The convergence of mediums for text, sound and pictures, and the daunting speed of access to information and its dissemination, make for a world quite removed from what we have known for over half a century.
Technological change is breaking down traditional patterns of behaviour in the lives of individuals as well as communities and new lifestyles are being created. As digital technology diminishes distance and overcomes geography, the boundaries of communities and cultures, already weakened by globalisation, face further erosion by the flow from outside of more varied news, opinion and programming. Protagonists claim the result will be a wider belief in common values and the ability to give expression for the first time to the common interests of people living far apart while critics point to lost cultures and the dangers of narrowing sources of information.
Yet the possibilities now offered by new technology and applications also present for the first time the potential for those without access to information and to communications tools to benefit personally, socially and economically from them.
The 2006 Annual Conference will examine the dynamics of recent communications developments and assess their potential benefits and impact on business, consumers and citizens. It will explore the issues facing regulators, governments and the sector at large as new markets emerge and new services, content and technologies are made available to the mass market.
**Book by 14 July and benefit from an early booking discount.** For further information please email email@example.com or visit the event website www.iicom.org/conference.
TELECOMS AND NEW MEDIA FORUM, NOVEMBER, WASHINGTON, DC
Details of the date, programme and speakers to be posted on www.iicom.org.