What have you been doing since your formal retirement in July of last year?
Prior to retirement, I was appointed as a visiting professor at the University of Surrey, and I’m still actively engaged with that, particularly looking at the path from 5G to 6G. I’m on the steering board for the 6G Innovation Centre, trying to help make sure we know where we’re heading and that research is aligned with standards and future needs.
In addition to that, I have a small consultancy and take on small pieces of work, spread between commercial companies and non-profit organisations. I have a particular affinity for non-profits, because that’s where I’ve been for the last 30-odd years of my career.
You have been heavily involved in the 3GPP broadband standardisation process. Can you talk about the evolution of that, and the increasing role being played by mission-critical organisations?
We’ve been working on mission-critical standards based on different mobile ‘generations’ for more than a decade now. Obviously prior to that, we had TETRA and other technologies.
Over 10 years ago, there was an ‘aha!’ moment, with the realisation that we had to move on. I then became involved in the very early discussions with TCCA members, asking: well, what can we do? We want to use the mobile generations of standards for our work, but a lot has to change.
Honestly, at that point, there was a lot of scepticism about whether that change would even be possible. So, one of my roles was to shepherd the work as a go-between for TCCA, 3GPP and ETSI.
We tried to promote the overall benefit of that approach, and make sure that all parties could find a middle ground. That’s what standardisation is all about. Everyone has to give a bit.
How similar is the current environment to what was envisaged when the standardisation process started?
That’s the interesting thing, because the whole timeline concept was entirely different from what actually happened. Here we are 10 years later, and having spent a decade writing the standards, we haven’t actually seen mass deployment of this approach within the mission-critical community.
Of course, there are very good reasons for that, and to me that’s one of the lessons that repeats itself in standardisation. Timing is everything, but you can never really predict the timing need.
What were the specific factors that caused the mission-critical timeline to change so drastically?
One of the reasons is that conditions keep changing. For instance, if we reflect on how the world looked in 2012, it’s completely different from today.
At the time, the community I was working with was very clear that this was for blue light – police, fire, ambulance – and that if we got the job right, we could then worry about other things. But now the world has changed significantly, and some major world events have occurred.
We’ve come full circle, realising that mission-critical has to be treated as a whole, and not just as a solution for blue-light services. Personally, I’m not disappointed by that.
There’s an obvious lesson to learn here. Going forward, don’t try to predict what’s going to happen, because you’re probably going to get it wrong.
What does this change in attitude and focus mean for the standardisation process in real terms?
It means that rather than writing standards that are specific to a user group, you have to take a much higher-level view. If we can describe requirements in a more generic way, and have a common solution that suits everyone, we’re going to dramatically increase the market size as well as reduce costs.
The problem with that is that it inevitably takes a little bit longer. You have to gather requirements from different communities, each of which has its own language and terminology. In every case, you end up spending time trying to distil out of those communities exactly what it is they’re trying to do.
Then you start writing standards that deliver that functionality.
What have been the other key challenges, other than the needs of individual verticals?
The size of the market is an obvious one. Most [commercial] handset vendors deal in hundreds of millions of devices, so they’re not really interested in delivering just a couple of hundred thousand.
We need to make this a market that can be supplied at reasonable cost, and the more ‘bespoke’ it becomes, the less attractive it is from an economic point of view. Otherwise, we go back to where we started, when a handset cost two or three thousand pounds – which is not really where we want to be.
With that in mind, there have also been obvious challenges in terms of functionality, the big-ticket item being device-to-device communication. That’s something which had already been discussed in relation to mobile systems and put to one side – we could never get agreement that it was a good idea.
Likewise, the idea of having nodes – base stations – that would continue to work when there was no core network attached to them. These were big changes to the architectural designs of mobile systems, and again, timing was crucial.
At the time that these ideas were being put forward, we had only just completed the set of standards called 4G. And in 4G, this had never been raised as something that was particularly needed.
Then suddenly, after that standard was completed, we got all these new [mission-critical] requirements necessitating a complete change of thinking. At which point, the community looked and said: what’s in it for us?
That was the conceptual barrier that we had to overcome, and it took quite a lot of diplomacy. There were some key characters who were instrumental in evangelising the fact that there are things in life more important than making money. And keeping civilians safe is one of those.
Did the vendors buy that argument?
They did, and not just the manufacturers but the mobile operators as well. A lot of that was also to do with the appearance of governments in 3GPP, who started to realise that if they wanted things to change, they would have to be vocal in promoting that change.
From the manufacturer point of view, it’s quite natural that if your government has a particular view, you as an industry within that nation pay attention to it. And we could see that happening. The classic case was FirstNet in the US, which was very government-led.
3GPP is quite a brutal place because it has such a massive work programme. The leaders have to carefully consider how much time they devote to each subject, otherwise there would be endless talk with no progress.
At what point did next-generation mission-critical communications become a real point of interest for governments?
There were a couple of major public safety incidents – for instance, the terrible ferry disaster that took place in Korea in 2014. [One of the lessons] of that incident was that the communications were a mess. The Korean government said to industry: we don’t care how you do it, but you’re going to solve it.
The only sensible way to do that was by aligning to the new standard which was being developed in 3GPP. Sometimes it takes terrible events like that to make people realise that it’s not all about profit and loss.
And of course, the times we’re in now – with the conflicts that are going on – place even greater stress on our mission-critical systems. This is why we’re seeing more thinking about the use of drones and satellites.
They were always considered just a nice-to-have in the past, not really part of the mainstream. Wind the clock forward and all of a sudden they’re among the most important pieces of work on the programme. The focus has changed considerably over the past 10 years.
You mentioned that you’re involved in the development of the standardisation for 6G. How far along is that, and in what ways is it likely to resemble what we’ve seen over the past decade with 4G and 5G?
Honestly, it’s history repeating itself. There are certain things which need to be in place when you start a mobile ‘generation’. And if you want to have standards available by 2030, you need to start that roughly six to eight years beforehand.
In the first instance, you need a framework characterising what it is you’re trying to achieve, which has just been delivered by the ITU [International Telecommunication Union]. That sets out a very broad picture of what we’re trying to do, and how we’re going to get there. It’s agreed by participating governments, so it has huge international support.
That was the trigger for 3GPP to define its own timeline for how the standards are going to be written. We’ve already scheduled the first hearing for market representatives for May, to get their input on what they might need. That’s together with representatives from various R&D programmes.
The first meeting within 3GPP itself will then take place in March of next year. That will take the form of a workshop where we actually start to gather 6G requirements and understand how they would fit into some sort of programme.
So, the work is already beginning. What we don’t want to see, however, is a mad rush. We have the rest of this decade to complete the work – but history tells us that it takes several years to agree and write standards that are fit for commercialisation.
What has the ITU framework laid out?
Of course, there are the usual things like speed and performance. But there are other interesting metrics as well, such as energy efficiency and the sustainability of future systems. That has become a huge subject, and will continue to be.
6G also takes us away from ‘communications’ into a new world that you might call sensing. This is one of the big philosophical changes.
Alongside a system that’s fundamentally designed for people or machines to communicate with each other, we envisage one that will be designed for sensing. Location and obtaining information from devices will be crucial.
We also envisage devices that have no onboard power systems, but instead power themselves through energy harvesting or some similar mechanism. At worst, you’ll have to change the battery every 10 years.
It will be interesting to have a study on what 6G could potentially do for mission-critical communications. I would expect that to start happening now.
Do you see parallels with the 4G/5G standardisation process in terms of verticals’ involvement?
The community that we have currently will need to continue ad infinitum, because you don’t want mission-critical systems getting old and out of date. They need to be as current as possible.
That means continually upgrading systems to make sure they’re leading-edge, something which our mission-critical technologies should always be. That requires constant involvement [on the part of mission-critical stakeholders] in the setting of requirements.
I would absolutely expect organisations such as TCCA to be in the discussion from the very beginning.
With that in mind, what do you see as being the key points of learning from the previous 10 years? Is there anything that the sector needs to do differently going forward?
The lesson to learn is that you’re there for the long term, so you need to think with a long-term view.
Honestly, I want to really congratulate the industry on staying with it, because it was very difficult to get their voice heard.
They’ve made a huge impact. And now mission critical has become so much more important than it was, say, 10 years ago. All credit to them.