Paging Air New Zealand, please report to the naughty corner.

A lot of folks who know me will know that I’ve held Air New Zealand in high regard for several years, that I really rated their inflight product and service, and that I’d choose Air NZ for the Heathrow to LA run ahead of other airlines such as BA or Virgin, as well as use them for flying to NZ itself.

I was particularly a big fan of their Pacific Premium Economy product – loads of leg room, and an inflight service which was deserving of the title “Premium” – offering Business class meals and fine NZ wines. I thought it represented very good value for money, and made the Virgin Premium Economy product, and especially the BA World Traveller Plus product look positively economy by comparison.

On a recent trip down to New Zealand (my third in as many years) in January, for the NZNOG meeting in Wellington, I was able to experience Air New Zealand’s new long-haul Premium Economy product on their much-hyped new Boeing 777-300 aircraft, on one of its first long-haul flights.

Sadly, while the product was innovative, I was not impressed. I felt underwhelmed and disappointed with the experience compared to what I would have received on the older aircraft. What’s worse is that it wasn’t all teething troubles. Sure, there were some teething troubles, but many of the problems were what I see as basic flaws in the new product.

I found the new “SpaceSeat” anything but spacious – it felt confining, with the TV screen mere inches from your face, and at a weird angle compared to the seatback (and therefore your body), so you have to turn your neck or sit sort of twisted into a side-saddle position to try and be comfortable and watch a film at the same time.

The “seat pocket” as provided was a joke – it was made of a solid material (rather than an elasticated netting), and wasn’t even big enough to fit a book in.

The area of the aircraft I was sat in felt incredibly hot, stuffy and uncomfortable. Despite the crew setting a cooler temperature, where I was sat, it never seemed to cool down or get any sort of noticeable airflow.

I’m not normally a claustrophobic person, but I can only describe what I felt as being “freaked out” by the environment created by the cabin – from the stuffy air to the TV in your face, to the lack of space – and this was in a window seat!

I thought the personal space on the new product was rather inferior to what you get on Air NZ’s 747-400 planes.

The fact is, Air NZ have, for some reason, crammed the rows of seats in very tightly. I would probably be inclined to pay a little more if a row or so of seats was pulled out and the space spread between the remaining rows. Just to get that TV a bit further away from my face.

The inflight service was of a reduced quality compared to the old service, at least as far as I was concerned. It was very slow, due to the fiddly nature of the service, and because the seat is angled away from the aisle and sits so tight up against the back of the seat in front, it wasn’t really possible to have any sort of interaction with the crew member serving you from the aisle – that would have involved turning your head through rather more than 90 degrees.

This meant that it was much more difficult to experience the “soft side” of the service of the great Air NZ crews, as you couldn’t easily make eye-contact with them. They just became this sort of disembodied arm and hand pushing food and drinks in front of you. It also took over two (and more like three) hours to serve the main meal.

This seems to fly in the face of what Air NZ were aiming for, which was a more personal service!

It’s fair to say that on the older aircraft, Air NZ didn’t have the best PE seat in the sky, but I think they had the best Premium Economy soft product, in terms of the food and level of service. Air NZ seem to have gutted that great soft product in order to provide what they perceive as a “better seat”.

There are other comedy errors, such as the location of the Premium Economy galley (over the wing), which meant it couldn’t have a hot water tap. If passengers ordered coffee or tea, it had to be brought from the other galleys – meaning staff walking through the cabin with jugs/flasks of hot water – not made easy because the aisles have been made narrower!

The changes don’t just affect the Premium Economy product, either. The quid-pro-quo of the “Economy Skycouch” product is that the Economy cabin is seated 10-across, which doesn’t sound bad until you realise this is on a Boeing 777, on which most other airlines, including Air NZ themselves on their 777-200s, seat only 9-across.

The aisles are noticeably narrower – folk sat in aisle seats will notice they get bumped more often – as are the seats themselves. A friend travelled in the back of the 777-300 and found it unbearably uncomfortable, having to sit with their shoulders “tucked in”. I can understand this on a 20-minute commute into Central London, but not on a 13-hour flight from LA to NZ.

The seat pitch (the space between the seats) in Economy has also been cranked down from 34″ on the 747-400 to 31-32″ on the 777-300. Air NZ have gone from one of the best Economy products in the sky to one of the most unbearably cramped in one fell swoop. Feels like a step backwards, and it’s not just me. There’s plenty of discussion about it on that perennial thorn-in-the-side of airlines, Flyertalk.

When the product was launched, it was accompanied by a lot of fanfare about the months of painstaking research that had gone on behind the scenes. If there has been all this research, how can the product be full of what I (as a 100K-mile-per-year traveller) regard as such schoolboy errors?

Also interesting to observe is what I perceive to be an astroturfing campaign about how great the new products are via their social media outlets such as their Twitter account, yet nothing about the nightmares that I know for a fact they have been having with the new service. Oh, and what is it with that dreadful muppet character, Rico? How is that related to (what should be) high quality air travel?

So, not really enjoying this flight much, I contacted Air NZ to offer my feedback on the flight.

Sadly, after waiting about 6 to 8 weeks, all I got was a dreadful, bland, canned reply which basically indicated a “head in the sand” approach, that there couldn’t really be anything seriously wrong with their wonderful new product, could there, and these were all flukes which would be fixed next time I flew. Like I believed that.

They may as well have just said “You are free to take your business elsewhere”. Well, sadly, that’s exactly what I’ve done for my next trip to California.

On 1st April, the last 747-400 operated NZ1 will leave Heathrow, and the next day London gets the “downgrade” to 777-300 service. The regulars won’t know what’s hit them.

Update – Thursday 31st March 2011

So, a few folks thought I was just having a rant here. Perhaps because it sounded a bit ranty, or I wasn’t explicit about something I wanted to get across:

What’s really disappointed me here is that an organisation which seemed to be switched on, yet still able to treat its customers with good old-fashioned respect, and which in the past seemed to have a great grasp of what people wanted, could have gone off the rails quite so spectacularly with a string of apparently shallow and unpopular moves.

The problem with the IETF

There have been some good efforts recently to close the gap that’s perceived to exist between the Internet operator community and the IETF. I hope I’m not giving them the kiss of death here… 🙂

A sense of frustration had been bubbling for a while: that the IETF had become remote from the people who actually deploy the protocols, that it had become the preserve of hardware vendors who lack operational experience, and that it’s no wonder they ship deficient protocols.

But, it can’t have always been that way right? Otherwise the Internet wouldn’t work as well as it does?

Well, when the Internet first got going, the people who actually ran the Internet participated in the IETF, because they designed protocols and they hacked at TCP stacks and routing code, as well as running operational networks. Protocols were written with operational considerations to the fore. However, I think people like this are getting fewer and fewer.

As time went by and the Internet moved on, a lot of these same folk stopped running networks day-in, day-out and got jobs with the vendors, but they stayed involved in the IETF, because they were part of that community, were experienced in developing protocols, and brought operational experience to the working groups that do the development work.

The void in the Network Operations field was filled by the next generation of Network Engineers, and as time has gone by, fewer and fewer of them have been interested in developing protocols, because they were busy running their rapidly growing networks. Effectively, there had been something of a paradigm shift in the sorts of people who were running networks, compared with those who had been doing it in the past. For the Internet to grow the way it did in such a short time, something had to change, and this was it.

At the same time, the operational engineers were finding more and more issues creeping into increasingly complex protocols. That’s bad for the Internet, right? How did things derail?

The operational experience within the IETF was suffering from two things: 1) it was becoming more and more stale the longer that key IETF participants didn’t have to run networks, and 2) the operator voice present at the IETF was getting quieter and quieter, with things suggested by operators largely rejected as impractical.

Randy Bush had started to refer to it as the IVTF – implying that Vendors had “taken over”.

There have been a few recent attempts to bridge that gap – “outreach” talks and workshops at operations meetings such as RIPE and NANOG have sought to get operator input and feedback – though expressing this without frustration hasn’t always been easy.

However, it looks like we’re getting somewhere…

Rob Shakir has currently got a good Internet Draft out aimed at building a bridge between the ops community who actually deploy the gear and the folks who write the protocol specs and develop the software and hardware.

This has been long overdue and needs to work. It looks good, and is finding support from both the Vendor and Ops communities.

The “meta-problem” here is that one cannot exist without the other; it’s a symbiotic and mutually beneficial relationship that needs to work for a sustainable Internet.

I wonder if it’s actually important for people on the protocol design and vendor side to periodically work on production networks, to ensure that they have current operational knowledge rather than relying on what they had 10 years ago?

Internet Access as a right and the Egyptian Internet Shutdown

I’m not going to do any in-depth analysis (I’ll leave that to my good friends at Renesys) – it’s everywhere – but unless you’ve been comatose for the last few days, you can’t have failed to notice the situation in Egypt.

Being an internet geek, I’m still going to focus on the country’s decision to take itself offline – killing its Internet connectivity to the rest of the world.

Firstly, while it may have slowed down the ability for folks in Egypt to communicate rapidly with the rest of the world, and potentially organise demonstrations, it also seems to have managed to drive folk who might have otherwise stayed in front of their screens out onto the streets, where they can either generally protest at the Mubarak regime, or specifically protest about being isolated from something they now take for granted.

It certainly gives the Police something to do…

The “kill-switch” mechanism appears to have been pretty simplistic, and non-technical in implementation. The Renesys, RIPE Labs and other analyses suggest that the main Egyptian ISPs were called on in turn by folks from the Mukhabarat (the Egyptian equivalent of the secret service) and instructed to shut down external connectivity – by taking down interfaces or BGP peers.

The fact that no centralised technical measures were required shows that it’s not necessarily difficult for any administration to do this – either using existing instruments in law, or just having enough agents and judges to churn out the court orders and pay folks a visit.
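
As an aside, you don’t need anything sophisticated to watch this kind of shutdown from the outside. Here’s a minimal sketch in Python – assuming RIPEstat’s public “announced-prefixes” data call, which is my understanding of the kind of data behind the RIPE Labs analysis, and with the AS number being just an illustrative example – that counts the prefixes an AS is currently originating:

import json
import urllib.request

# Illustrative AS number only - substitute whichever network you want to watch.
ASN = "AS8452"

# RIPEstat's "announced-prefixes" data call (my understanding of the public API;
# check the RIPEstat documentation for the authoritative parameters).
URL = "https://stat.ripe.net/data/announced-prefixes/data.json?resource=" + ASN

with urllib.request.urlopen(URL) as response:
    result = json.loads(response.read().decode("utf-8"))

# The call returns a list of prefixes the AS is seen originating in BGP.
prefixes = result["data"]["prefixes"]
print("{0} is currently announcing {1} prefixes".format(ASN, len(prefixes)))

Poll that every few minutes and plot the count over time, and you have a poor man’s version of the graphs in the published analyses – when the number collapses towards zero, the AS has effectively vanished from the global routing table, which is what happened to the Egyptian ISPs, one after another.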

However, the other thing to consider is that some countries are now starting to treat the Internet as an essential service and almost fundamental right, like access to water and power.

I’m just on my way back from a visit to New Zealand (where I participated in the NZNOG ’11 conference). The NZ Government is currently embarking on a process of using Government subsidy – with the premise that this will get paid back over time – to bootstrap open-access FTTH deployments in major urban areas in NZ, to the extent that they should bring 75% of the country’s 4.5M inhabitants within easy reach of high-speed broadband.

The motivation behind such a move is that reliable high-speed internet access will be a cornerstone of economic growth, but that comes with the corollary that it becomes an expectation of the consumer, just like they expect the power or water supply not to go off unless it’s a genuine emergency (such as the flooding in Queensland, Australia).

However, it seems like the Government involvement could become a double-edged sword, as the investment threatens to come with various regulatory strings attached – the change in funding, from a private, entrepreneurially-built infrastructure, means that the Government feels it has a right to have a say. It remains to be seen how much of one, but there’s already dangerous talk of “mandatory peering” and that sort of thing.

As far as I can tell, there’s no talk of a massive comic-strip style busbar “kill switch” being built in NZ, and the NZ Government seem like moderate and reasonable folk, but will the investment in UFB be brought to bear when the NZGovt want Internet providers to accede to their desires in blocking content or controlling access?

Back to “Internet access as a right” for the closing few words: flipping the question on its head, how would you feel if the Government shut the power off to your neighbourhood because they felt it served their needs?

Finally, a touch of irony. I composed this blogpost from 37,000ft over Texas, thanks to the reliable and affordable in-flight internet access provided on board Virgin America.

(While on board, I saw the announcement that Mubarak has appointed his Intelligence chief as Vice President. Says a lot, right?)

That’s how richly woven into our often already complex lives ready internet access has become. Think back a few years. People’s expectations are already changing.

Update – 31st Jan 2011

Vikram Kumar, Chief Exec of InternetNZ (the association that engages in technical and public policy issues on behalf of the NZ Internet community) has just blogged an article about the possibilities for a take-down of NZ’s external connectivity to the Rest of the World. Summary: probably unlikely.

Update – 2nd Feb 2011

Internet access in Egypt was restored in the last 12 hours, and there’s coverage of this on the Renesys Blog.

Using Social Networking to build Corporate Kudos

Many companies have leapt on the Social Networking bandwagon as part of their marketing and public relations strategy. They have staff for whom posting on things like Twitter and Facebook is a major part of their day.

Why wouldn’t you? It’s an easier way of getting information out to, and interacting with, your customers (and potential customers).

Airlines have been fairly quick to catch on to this – it’s a great way of rapidly disseminating service info during disruption, and collecting rapid feedback from pax.

One of my industry colleagues recently tweeted at United Airlines because he saw something that he thought UAL HQ should know about. Obviously the stress of the weather-related disruption hitting the area was getting the better of both pax and airline employees alike…

“@UnitedAirlines your ground staff is yelling at an old man since 15 mins airport IAD gate D6 flight UA7599 Time Jan 27, 2011 1439”

UA’s Twitter person was responding within about half-an-hour…

“@mhmtkcn The Dulles manager will follow up. Today. Thanks for the heads up.”

Two things to take away:

1) Speed of contact and response – this was quicker than sending email, or probably trying to phone someone up in UAL to let them know this was happening. Getting the message to the right person isn’t always easy. The Social Media team can act as a rallying point for this info.

2) The positive response from the UA Twitter scribe – “This will be followed up today” – does a lot to show that someone in what could otherwise be perceived as a large, faceless, inaccessible, uncaring corporation does give a damn, that the user can get their attention, and get something done.

This simple action shows how social networking can bridge the communications gap that often exists between large companies and their clients, and does a lot to raise UA’s kudos amongst those who saw the message.

Comcast-NBCU Deal Approved

Yesterday, the FCC approved Comcast‘s proposed purchase of the controlling stake in NBC Universal, but with some conditions. Comcast must give up its Board position in (and control of) the online video service Hulu, despite still owning a good share of it, and accept other regulatory controls – limits on the exclusivity of NBCU-produced content, and a requirement to operate according to the FCC’s Open Internet Principles. The aim is to keep a rein on what the merged entity can and cannot do, and to retain a reasonably level playing field in the market (to the relief of other cable operators and online video distributors).

It looks like quite the regulatory straitjacket, which has no doubt also cost public money to develop, and will cost more to enforce.

Despite this, all I can hear in the back of my mind is the “thud, thud, thud” of the Stay Puft Marshmallow Man‘s footsteps, coming to crush anything in its way under its squishy feet.

Auntie Beeb on Net Neutrality

Earlier this week Andy D suggested that I might be listening to too much Radio 4.

I don’t necessarily think that’s been a bad thing, as that means I caught a couple of items on Net Neutrality. Indeed, the Beeb seems to be showing increasing interest in this area, and wouldn’t you, if there was something which threatened your editorial freedom?

Imagine for a minute that Sky Broadband subscribers got ultra-fast access to Sky News (and other News Corp) content, while poor old Auntie (among other content providers) got packet-shaped, throttled and capped to a crawl. Now you’ll see what the fuss is about.

Anyway, Radio 4 has discussed Net Neutrality on two occasions in the past few weeks.

Firstly, during the long-running consumer affairs programme “You and Yours“ on 7th October, there was a brief discussion (the link opens in BBC iPlayer) which included comments from ISOC’s Leslie Daigle.

This week, on Monday 17th October, there was a further segment on the subject (iPlayer link) in the “Click On” programme – fast forward to around the 17 minute mark – which contains the fantastic quote of “Put three geeks together in a room and you’ll get four definitions of Net Neutrality”.

I’m not sure if that says more about the issue, or more about the geeks. 🙂

There’s also a rather nice BBC blog article from Erik Huggers (Director, BBC FM&T) which neatly sums up elements from both of the above, including the point that, despite the appearance of freedom of choice and competition in the UK consumer broadband market, it isn’t all it’s cracked up to be, due to triple-play lock-ins or the sheer aggro factor.

The closing paragraph raises “thin-end-of-the-wedge” concerns – that this could gradually creep in through the back door if the regulators don’t use the tools in their power to manage this contentious issue.

While the BBC, as a major content player, do have a vested interest in preserving their editorial freedom and an equal opportunity to distribute their content, there’s a lot of sense in what they’re saying too.

Whither (UK) Regional Peering – Pt.1

Just last month, in mid-September, Andy Davidson brought up the switch at IXLeeds, the latest UK regional IXP.

You’ll note I say “the latest”, but how many non-London UK IXPs can you name off the top of your head? Not many, I’ll wager. Fewer that are still operating, too. No, the LINX PoP in Slough doesn’t count in my picture of non-London!

This is the problem: It’s often said that there isn’t the level of regional IP peering going on in the UK that there probably should be for redundancy reasons. The majority of IP peering in the UK happens in London, and when it isn’t happening in London, it’s probably happening in Amsterdam instead.

Let’s face it, on an island that’s ~15ms round-trip top-to-bottom, we’re less likely to peer to reduce latency, especially when the architecture of the incumbent wholesale DSL platform encourages networks to do little besides haul all broadband customer traffic to a central point before dispersing it.

Previous attempts at establishing regional IXPs in the UK have had varying levels of success. The most successful to date, in terms of number of participants and achieving critical mass, is probably MANAP, which was founded in Manchester in 1997.

Unlike LINX, which did survive (well, successfully resisted) a demutualisation attempt, MANAP only sort-of did. It allowed its infrastructure to be taken over by a company funded by the local Regional Development Agency, and the exchange became a service provided over infrastructure which was no longer dedicated to IXP operations, but also carried other traffic and provided other services.

The MANAP that exists today is not the same exchange; it has been subsumed into the NWIX platform and operates as Edge-IX, a distributed exchange which is present in Manchester, elsewhere in the Northwest, and in many other locations, including those in London’s Docklands that it was initially intended to provide redundancy for. It has a different flavour, and has lost some elements of its “regionality”.

What distinguishes it from a carrier, other than the Edge-IX services being run on a non-profit basis while the NWIX ones aren’t?

I’m not suggesting that this is a better or worse model, just different, and probably not regional any more. If this, i.e. reinventing yourself as an inter-regional IX, is the only way a regional IXP in the UK can survive, then we’ll find it very challenging to reach a position of sustainable regional peering in the UK. Could things have been different in Manchester?

You may be questioning what issue I have with a “wide-area” exchange point, distributed over a large geographic area. The main concern is shared fate: a disruption that would otherwise be localised can spread easily. I can probably write a whole article on that. Maybe I will another day…

So, why would a quick hop over the Pennines to Leeds be any different?

Manchester itself is also at risk of being unattractive as a location for regional IXs – with the recent purchase of IFL by Telecity Group, there is very little organisational diversity or competition in the Manchester co-lo market beyond existing facilities such as Vialtus Serverbank and recent new entrant Ice Colo. Folks in the Manchester area were very quick on the social networks to voice their fears about anticipated price rises and the limited options resulting from the lack of choice.

The Leeds scene is rather different, with lots of smaller, entrepreneurial companies active in the metro area. This is a double-edged sword: while it results in the sort of competition in the co-lo market that folks like, it also means that IXLeeds can’t be present everywhere the potential IX participants want to connect, certainly not from day one. There’s a future aspiration to expand within the metro.

One of the early strengths of IXLeeds is that it has a good community feel behind it, including the involvement of folks who have experienced peering in Manchester, while the Yorkshire RDA have been involved from the outset in getting folks together but (so far) haven’t felt the temptation to get in the driving seat, instead choosing to play the role of facilitator.

There’s a will to succeed, so hopefully they will reach the critical mass that is required to sustain the exchange.

A concern I have is the lack of international capacity into the Leeds area; Manchester is in a better position here, due to the transatlantic connectivity that arrives in the area independently of London.

That said, while international bandwidth is something of a pre-requisite for a national exchange point, is it actually necessary for a successful regional IX?

Then again – what are the success criteria for an IX, especially a regional one? A graph that forever goes up and to the right? They probably are, and should be, different from those of a national IX. Is it the regional IX not being satisfied with its lot, and wanting to be like its larger neighbour, that actually destroys it? Maybe that’s another article in itself?

I’d say it depends on how non-London-centric the early IXLeeds ISPs are, how much of their traffic is delivered locally, and how much traffic they have between each other that they might normally route through London.

If my previous experience is anything to go by – such as opening a new PoP for an already successful IX – these things usually “slow start” – so that means patience is required.

I’m going to come back to this topic in the coming weeks, I’ll try and write about some of the side issues I’ve threatened to cover above, and maybe touch on a missed opportunity.

Still think they should have called it the Rhubarb Internet Exchange. Even if it was just to confuse people. 🙂

Beginning of the end for hotspots?

The Dutch Telecoms Regulator has announced it will require Dutch hotels to register as ISPs (Slashdot article).

Despite the fact that the hotel usually doesn’t own the wifi infrastructure in the hotel, and certainly isn’t an ISP in the normal sense, the Dutch regulator’s rationale is that the hotel is reselling the ISP service – i.e. is a VISP.

I don’t see that this is always the case, as hotels in .nl, in my experience, don’t rebrand the ISP services as their own. However, they often collect the money and charge it to your room folio.

I suspect the meta question here is what does this mean for hotspots generally, especially the ones which are currently free?

Does this drive up their costs significantly enough to either a) cause free hotspots to charge or b) shut the hotspot down, because the costs of the bureaucracy aren’t recovered?

What happens if I let someone else use a MiFi that I own? Am I an ISP too?

Seems more thought is needed!