Or, I never thought of myself as a narcissist but…
Thanks to the folks at HEAnet, here’s a link to the video of the talk “It’s peering, Jim…” that I gave at the recent INEX meeting in Dublin, where I discuss topics such as changes in the US peering community thanks to Open-IX and try to untangle what people mean when they say “Regional Peering”.
The talk lasts around 20-25 minutes and I was really pleased to get around 15 minutes of questions at the end of it.
I also provide some fairly pragmatic advice to those seeking to start an IX in Northern Ireland during the questions. 🙂
Following on from yesterday’s post on the upcoming ITU-T WCIT conference, and what that might mean for how you can use the Internet, and how much you have to pay to use it, Google have also launched their own #freeandopen campaign, with this video which is their take on what top-down Government regulation could mean for you…
With less than two weeks to the WCIT (World Conference on International Telecommunications), and what that means for the internet (see “Is the Internet Facing a Perfect Storm”), the amount of coverage online is increasing rapidly.
Basically in one sentence, some Governments would like to see the ITU (a closed, top-down, Governmental organisation) take the lead in regulating the (open, collaborative, co-operative, bottom-up) Internet.
This video from the folks at accessnow.org explains it pretty well…
Lots of people I know have been working seriously hard in the background to educate those with a vote in the ITU, and keep the Internet based around open standards and collaborative governance.
Let’s face it, if the Internet wasn’t based around open standards, so many of the things which are part of our everyday lives could be very different, or simply wouldn’t exist.
Fortunately, as far as we know in the UK, our Government is still an advocate of the “light touch” and industry self-regulation when it comes to the Internet, but that doesn’t make what’s going to go on behind closed doors in Dubai in a couple of weeks any less of a concern.
EE’s 4G service is only available in a handful of markets at the moment, and the BBC’s tech correspondent, Rory Cellan-Jones, produced several pieces for TV and radio yesterday, conducting countless speed tests along the way, which he has blogged about extensively.
Some of the comments have been that it’s no better in terms of speed than a good 3G service in some circumstances, while others complain about the monthly cost of the contracts.
Get locked-in to 4G(EE)?
The initial cost for the early adopters was always going to attract a premium, and those who want it will be prepared to pay for it. It’s also worth noting that there are no “all you can eat” data plans offered on EE’s 4G service. Everything comes with an allowance, and anything above that has to be bought as “extra”.
The most concerning thing about the commercial package is the minimum contract terms.
12 months appears to be the absolute minimum (SIM only), while 18 months seems to be the offering if you need a device (be it a phone, dongle or MiFi), and 24 month contracts are also being offered.
Pay As You Go is not (yet) being offered on EE’s 4G service, probably because, with no competition, they have no incentive to offer it.
Are EE trying to make the most of the headstart they have over competitors 3, O2 and Voda and capture those early adopters?
Rory Cellan-Jones referred in his blog to problems with reduced performance inside buildings.
A number of factors affect the propagation of radio waves and how well they penetrate buildings and other obstacles: the nature of the building’s construction (for instance, a building which exhibits the properties of a Faraday cage will block radio signals, or attenuate them to the point of uselessness), but also the frequency of the radio signal.
Longer wavelengths (lower frequencies) can travel further and are less impacted by having to pass through walls. I’m sure there’s an xkcd on this somewhere, but the best I can find is this…
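You can see the frequency effect in the standard free-space path loss formula, even before wall attenuation is considered. A minimal sketch (the distance and frequencies are just illustrative):

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (standard formula: d in km, f in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Same 2 km path from the mast at the two bands discussed:
loss_900 = fspl_db(2, 900)     # ~97.5 dB
loss_1800 = fspl_db(2, 1800)   # ~103.6 dB

# Doubling the frequency costs 20*log10(2) ~= 6 dB of extra loss,
# before you even account for walls (which attenuate higher
# frequencies more as well).
print(round(loss_1800 - loss_900, 2))
```

So, all else being equal, a 900 MHz signal arrives roughly 6 dB stronger than an 1800 MHz one over the same path, which is why the lower band reaches further and copes better with buildings.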
The reason EE were able to steal a march on the other mobile phone companies was because OFCOM (the UK regulator, which handles radio spectrum licensing for the nation) allowed EE to “refarm” (repurpose) some of their existing allocated frequencies, previously used for 2G (GSM), to support 4G. The 2G spectrum available to EE was in the 1800 MHz range, as that was the 2G spectrum allocated to EE’s constituent companies, Orange and T-Mobile.
Now, 1800 MHz does penetrate buildings, but not as well as the 900 MHz 2G spectrum allocated to Voda and O2.
Voda are apparently applying to OFCOM for authority to refarm their 900 MHz spectrum for 4G LTE. This would give a 4G service with good propagation properties (i.e. signals travel further from the mast) and better building penetration. Glossing over the (non-)availability of devices which talk LTE in the 900 MHz band, could this actually be good for extra-urban/semi-rural areas which are broadband not-spots?
Well, yes, but it might cause problems in dense urban areas, where the device density is so high that it’s necessary to have a large number of small cells in order to limit the number of devices associated with a single cell to a manageable amount – each cell can only deal with a finite number of client devices. This is already the case in places such as city centres, music venues and the like.
Ideally, a single network would have a high density of smaller cells (micro- and femto-cells) running on the higher frequency range to intentionally limit their reach (and therefore the number of connected devices) in very dense urban areas such as city centres, and a lower density of large cells (known as macro-cells) running on lower frequencies to cover less built-up areas and better manage building penetration.
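The small-cell arithmetic is simple enough to sketch. All the figures below are assumptions for illustration only (real per-cell capacity depends on the technology, bandwidth and traffic mix):

```python
import math

def cells_needed(devices_per_km2: float, area_km2: float,
                 per_cell_limit: int) -> int:
    """Minimum number of cells to serve an area, ignoring overlap
    and uneven load – a deliberately crude lower bound."""
    total_devices = devices_per_km2 * area_km2
    return math.ceil(total_devices / per_cell_limit)

# Hypothetical figures: 20,000 active devices per km^2 in a city
# centre, a 2 km^2 area, and ~1,000 devices per cell.
print(cells_needed(20_000, 2, 1_000))   # dense centre: many small cells
print(cells_needed(200, 2, 1_000))      # semi-rural: one macro-cell will do
```

Even with generous per-cell limits, a dense city centre needs tens of cells where a semi-rural area of the same size needs one – which is why short-range, higher-frequency cells suit the former and long-range, lower-frequency cells suit the latter.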
But, that doesn’t fit with our current model of how spectrum is licensed in the UK (and much of the rest of the world).
Could the system of spectrum allocation and use be changed?
One option could be for the mobile operators to all get together and agree to co-operate, effectively exchanging bits of spectrum so that they have the most appropriate bit of spectrum allocated to each base station. But this would require fierce competitors to get together and agree, so there would have to be something in it for them – the best incentive being cost savings. This is already happening to a limited extent.
The more drastic approach could be for OFCOM to decouple the operation of base stations (aka cell towers) from the provision of service – effectively moving the radio part of the service to a wholesale model. Right now, providing the consumer service is tightly coupled to building and operating the radio infrastructure, the notable exception being the MVNOs such as Virgin (among others), who don’t own any radio infrastructure, but sell a service provided over one of the main four.
It wouldn’t affect who the man in the street buys his phone service from – it could even increase consumer choice by allowing further new entrants into the market, beyond the MVNO model – but it could result in better use of spectrum which is, after all, a finite resource.
Either model could ensure that regardless of who is providing the consumer with service, the most appropriate bit of radio spectrum is used to service them, depending on where they are and which base stations their device can associate with.
The Internet has become a massive part of our everyday lives. If you walk down a British high street, you can’t fail to notice people staring into their phones rather than looking where they are going! I did see a comment on TV this week that you have a 1-in-10 chance of tripping and falling over when walking along looking at your phone and messaging…
There are massive pushes for faster access in countries which already have widespread Internet adoption, both over fixed infrastructure (such as FTTC and FTTH) and wireless (LTE, aka 4G), which at times isn’t without controversy. In the UK, the incumbent, BT, is commonly (and sometimes unfairly) criticised for trying to sweat more and more out of its copper last mile infrastructure (the wires that go into people’s homes), while not doing enough to “future-proof” and enable remote areas by investing in fibre. There have also been problems over the UK regulator’s decision to allow one mobile phone network to get a head-start on its competitors in offering LTE/4G service, using existing allocated radio frequencies (a process known as “spectrum refarming”).
Why do people care? Because the Internet helps foster growth and can reduce the costs of doing business, and that’s why developing countries are working desperately hard to drive internet adoption, along the way having to manage the threats of “interfering” actors who either don’t fully understand the Internet or fear change.
However, a bigger threat could be facing the Internet, and it’s coming from multiple angles, technical and non-technical. A perfect storm?
IPv4 Resource Exhaustion
The existing addressing (numbering) scheme used by the Internet is running out
A secondary market for “spare” IPv4 resources is developing: IPv4 addresses will acquire a monetary value, driven by the lack of IPv6 deployment
On a global level, the threat comes through the ITU, who, having disregarded the Internet as “something for academics” and not relevant to public communications back in 1988, now want to update the International Telecommunication Regulations to extend them to cover who “controls the Internet” and how.
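The scale of the IPv4 exhaustion problem above is easy to see with Python’s standard library, which can count the two address spaces directly:

```python
import ipaddress

# The entire IPv4 and IPv6 address spaces:
ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses
ipv6_total = ipaddress.ip_network("::/0").num_addresses

print(ipv4_total)              # 4294967296, i.e. 2**32
print(ipv6_total == 2**128)    # True - vastly larger

# Roughly 4.3 billion IPv4 addresses for 7+ billion people, before
# you subtract reserved ranges - hence the scarcity, the secondary
# market, and the pressure to deploy IPv6.
```

Those 2³² addresses looked inexhaustible in the 1980s; with multiple connected devices per person they plainly are not, which is what gives “spare” IPv4 blocks their market value.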
All of these things threaten some of the basic foundations of the Internet we have today:
The Internet is “open” – anyone can connect, it’s agnostic to the data which is run over it, and this allows people to innovate
The Internet is “transparent” – managed using a bottom-up process of policy making and protocol development which is open to all
The Internet is “cheap” – relatively speaking, Internet service is inexpensive
These challenges facing the Internet combine to break all of the above.
Close the system off, drive costs up, and make development and co-ordination an invite-only closed shop in which it’s expensive to participate.
Time and effort, and investing a little money (in deploying IPv6, in some regulatory efforts, and in checking your business model is still valid), are the main things which will head off this approaching storm.
Adopting IPv6 should just be a (stay in) business decision. It’s something operational and technical that a business is in control of.
But, the regulatory aspect is tougher, unless you are big enough to be able to afford your own lobbyists. Fortunately, if you live in the UK, it’s not reached “write to your MP” time, not just yet. The UK’s position remains one of “light touch” regulation, largely letting the industry regulate itself through market forces, and this is the position being advocated to the ITU. There are also some very bright, talented and respected people trying to get the message through that it’s economically advantageous not to turn the Internet into a closed, top-down system.
Nevertheless, the challenges remain very much real. We live in interesting times.