
Bell Labs: We Will Need Billions Of Small Cells

Since the birth of the very first automated cellular network in 1979, the main priority for operators has been to increase coverage. But over the next five years, the focus will slowly but surely shift to capacity.

The need for more bandwidth is mainly the result of the proliferation of devices that can consume high-quality digital content, and the arrival of the Internet of Things. The challenge is to rapidly increase the capacity of existing networks while keeping costs down, so that operators can afford the upgrade. Spectrum management is also an issue, with regulators like Ofcom maintaining an iron grip on the radio waves.

This is where small cells will save the day. Holger Claussen, head of the Autonomous Networks and Systems Research department at Alcatel-Lucent’s Bell Labs, told TechWeekEurope that in the near future, network operators will have to deploy hundreds of small cells every day just to keep up with the demand for bandwidth, and it’s only innovation in this field that can stave off a capacity crisis.

The past

Bell Laboratories was established in 1925 as Bell Telephone Laboratories, named after Alexander Graham Bell. Part of AT&T for most of its life, it has given the world such building blocks of modern life as the transistor, the laser, the C++ programming language, wireless LANs and UNIX – the ancestor of Apple’s OS X and the inspiration for Linus Torvalds’ Linux.

The lab’s scientists have won seven Nobel Prizes, most recently in 2009, when Willard Boyle and George Smith were recognised for the creation of the charge-coupled device (CCD) image sensor, used in many popular digital photography and cinematography cameras.

In 2006, Bell Laboratories’ parent company, Lucent Technologies, merged with Alcatel. Since then, the lab has moved away from basic science and focused on networking, nanotechnology and software as the R&D subsidiary of the newly created Alcatel-Lucent. The merger has also given it a European presence.

The Autonomous Networks and Systems Research department has been working with small cells for more than a decade, shaping the debate around capacity issues and solving problems that are yet to emerge. We visited Bell Labs in Dublin, where Claussen’s unit is based.

“Over many years, we have looked at self-optimisation techniques to make these cells plug-and-play deployable, and that work has led to the femtocells that we now see deployed in their millions. They integrate themselves automatically into existing cellular networks,” explains Claussen.

Today, femtocells provide much-needed coverage in rural homes and increase capacity in places with above-average numbers of users, such as shopping malls, office buildings, stadiums and city centres. But to really solve the capacity issue, we will need thousands, eventually millions, of these devices, Claussen believes.

“Within the next few years, we will probably have to increase capacity by a factor of 30. That’s a low estimate. There’s some more spectrum available for the LTE bands, but it’s clearly not enough. We have to learn to re-use spectrum, and this means small cells,” says the network expert.

Because small cells have a much smaller coverage radius, they can exploit the same slice of spectrum over and over again without interference. The most recent developments have even enabled co-channel operation, in which small cells use the same frequencies as traditional cellular base stations (also known as macrocells). Claussen says the process is tricky, but it lets operators squeeze the most out of the limited supply of radio spectrum.
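To see why shrinking the cells makes re-use work, consider a back-of-envelope sketch (our own illustration, not Bell Labs code) using a log-distance path-loss model with an assumed urban exponent of 3.5. Cell-edge signal quality depends on the ratio of the distances to the serving and interfering transmitters, not on their absolute values, so the same frequency can be re-used at much shorter range as cells shrink:

```python
import math

def path_loss_db(distance_m, exponent=3.5):
    """Log-distance path loss in dB relative to 1 m (assumed urban exponent)."""
    return 10 * exponent * math.log10(distance_m)

def edge_sinr_db(tx_power_dbm, serving_dist_m, interferer_dist_m, noise_dbm=-95.0):
    """Cell-edge SINR with one co-channel interferer transmitting at equal power."""
    signal = tx_power_dbm - path_loss_db(serving_dist_m)
    interference = tx_power_dbm - path_loss_db(interferer_dist_m)
    # Sum interference and noise in milliwatts, then convert back to dB.
    total_mw = 10 ** (interference / 10) + 10 ** (noise_dbm / 10)
    return signal - 10 * math.log10(total_mw)

# Macrocell: 20 W transmitter, 1 km radius, co-channel neighbour 2 km away.
print(edge_sinr_db(43.0, 1000.0, 2000.0))   # ~10.5 dB
# Small cell: 100 mW transmitter, 50 m radius, co-channel neighbour 100 m away.
print(edge_sinr_db(20.0, 50.0, 100.0))      # ~10.5 dB
```

Both layouts come out at roughly the same cell-edge SINR of about 10 dB, yet the small-cell grid packs around 400 times as many cells – and as many re-uses of the same spectrum – into the same area.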

The future

Even if we had enough spectrum to satisfy all our data transfer needs, we would still need to reinvent the basics of the cellular network, says Claussen. In some countries, traditional telecommunications infrastructure is responsible for up to 1.5 percent of total power consumption. Multiply that by 30, and you can see why Bell Labs network researchers are busy developing network optimisation software while, across the hall, the thermal management group is thinking up new ways to cool tower equipment without using even more energy.

Another pressing issue is planning. With so many different pieces of infrastructure, how do you know where to deploy these cells? “It mainly comes down to user localisation, building up demand maps and power maps, taking into account constraints such as backhaul connectivity and availability,” says Claussen.
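As a rough illustration of the demand-map idea, the toy Python sketch below (hypothetical, not Bell Labs’ planning software) greedily sites each new cell where it would capture the most unserved traffic on a demand grid; a real planner would also fold in the backhaul and power constraints Claussen mentions:

```python
import numpy as np

def greedy_placement(demand, radius, n_cells):
    """Greedily place small cells where they capture the most uncovered demand.

    demand  -- 2D grid of expected traffic per location (a 'demand map')
    radius  -- cell coverage radius in grid units
    n_cells -- number of cells to place
    """
    remaining = demand.astype(float)
    ys, xs = np.mgrid[0:demand.shape[0], 0:demand.shape[1]]
    sites = []
    for _ in range(n_cells):
        best, best_score = None, -1.0
        for y in range(demand.shape[0]):
            for x in range(demand.shape[1]):
                mask = (ys - y) ** 2 + (xs - x) ** 2 <= radius ** 2
                score = remaining[mask].sum()
                if score > best_score:
                    best, best_score = (y, x), score
        sites.append(best)
        mask = (ys - best[0]) ** 2 + (xs - best[1]) ** 2 <= radius ** 2
        remaining[mask] = 0.0          # demand inside the new cell is now served
    return sites

rng = np.random.default_rng(0)
print(greedy_placement(rng.poisson(3, (20, 20)), radius=3, n_cells=4))
```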

“Connecting fibre to a lot of these places is very expensive, so we are also looking at alternative backhaul technologies, such as wireless backhaul. Sometimes line of sight is OK, and then you can go with microwave or free-space optical links.

“Recently, there has been some work on large-scale antenna arrays, where you have hundreds of antennas and do beam-forming. If the number of transmitter antennas is much larger than the number of receiving terminals, you can get very high capacity. When this method is used for backhaul, a lot of channel estimation problems go away.” TechWeek saw a demonstration of this technology, courtesy of Alcatel, back in 2011.
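The capacity claim is easy to reproduce in simulation. The sketch below (our own, under a standard i.i.d. Rayleigh-fading assumption, not the demonstrated system) applies conjugate, maximum-ratio beamforming from M transmit antennas to four single-antenna terminals: as M grows well beyond the number of users, each user’s signal power grows with M while cross-user interference stays roughly flat, so the sum rate keeps climbing:

```python
import numpy as np

rng = np.random.default_rng(1)

def sum_rate(m_antennas, k_users, snr_linear=1.0, trials=200):
    """Average sum rate (bits/s/Hz) with maximum-ratio beamforming over
    i.i.d. Rayleigh channels, equal power split across users, unit noise."""
    total = 0.0
    for _ in range(trials):
        # Channel matrix: k_users x m_antennas, unit-variance complex Gaussian.
        h = (rng.standard_normal((k_users, m_antennas)) +
             1j * rng.standard_normal((k_users, m_antennas))) / np.sqrt(2)
        w = h.conj().T                      # MRT: beamform along each user's channel
        w /= np.linalg.norm(w, axis=0)      # unit-power beam per user
        g = h @ w                           # effective k x k gain matrix
        p = snr_linear / k_users
        for u in range(k_users):
            signal = p * abs(g[u, u]) ** 2
            interference = p * sum(abs(g[u, v]) ** 2
                                   for v in range(k_users) if v != u)
            total += np.log2(1 + signal / (interference + 1.0))
    return total / trials

for m in (8, 64, 256):                      # many more antennas than the 4 users
    print(m, round(sum_rate(m, k_users=4), 2))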

Claussen says small cells are going to become progressively smaller as their total number increases, because power requirements rise with (roughly) the square of the coverage radius. So in the future, we might have multiple small cells in every room to support our digital lifestyles.
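Taken at face value, that quadratic rule means every tenfold cut in radius buys a hundredfold cut in transmit power. A quick sketch of the arithmetic (our own, with the exponent left as a parameter, since real path-loss exponents run from about 2 to 4):

```python
def required_power(radius_m, exponent=2.0, p_ref=1.0):
    """Relative transmit power needed to cover a given radius, assuming power
    scales as radius**exponent (the rough quadratic rule quoted above)."""
    return p_ref * radius_m ** exponent

for r_m in (1000, 100, 10):
    print(f"{r_m:>5} m radius -> {required_power(r_m):>12,.0f} x reference power")
```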

“At some point, you end up with one user per cell, and then making the cells smaller doesn’t really help you anymore. This means you need more bandwidth. Millimetre wave might be one solution, and visible light communication is another interesting area to look at.”

To this end, Claussen’s group is cooperating with Professor Harald Haas of the University of Edinburgh on light-to-RF converters – a technology that uses an optical link to provide both data and power to a small cell. The concept builds on earlier work by Haas’s team, which used conventional LED lightbulbs to broadcast data. The university has already built a working prototype capable of transmitting information at over a gigabit per second and supporting multiple HDTV streams.
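Haas’s gigabit prototypes rely on far more sophisticated modulation (reported to use OFDM variants), but the basic idea of an LED downlink can be shown with the simplest possible scheme, on-off keying: the bulb flickers far faster than the eye can follow, and a photodiode averages and thresholds each bit period. A toy sketch, purely our illustration rather than the Edinburgh design:

```python
import numpy as np

def ook_modulate(bits, samples_per_bit=8):
    """On-off keying: drive the LED high for a 1 and low for a 0."""
    return np.repeat(np.asarray(bits, dtype=float), samples_per_bit)

def ook_demodulate(signal, samples_per_bit=8):
    """Photodiode side: average each bit period and threshold at half amplitude."""
    chunks = signal.reshape(-1, samples_per_bit)
    return [int(c.mean() > 0.5) for c in chunks]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
# Simulate the optical channel as the clean waveform plus receiver noise.
rx = ook_modulate(bits) + np.random.default_rng(2).normal(0, 0.1, 64)
assert ook_demodulate(rx) == bits
```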

At the same time, Bell Labs is developing self-learning network software that features “genetic evolution of algorithms” and can be applied to macrocells. Not only are these networks more power-efficient, they will also allow network operators to differentiate their services. So what’s more important – stability and coverage, or ultrafast speed? In about five years, the choice will be yours.
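Bell Labs has not detailed its “genetic evolution of algorithms”, but the flavour of a genetic approach is easy to sketch: encode candidate network settings as genomes, score them with a fitness function, and breed the best performers. The toy below is entirely hypothetical – its fitness function is a stand-in for a real network simulator – and evolves per-cell transmit powers that trade coverage against energy use:

```python
import random

random.seed(3)

def fitness(powers):
    """Hypothetical objective: reward coverage (saturating per cell) and
    penalise energy use -- a stand-in for a real network simulator."""
    coverage = sum(min(p, 1.0) for p in powers)
    energy = sum(powers)
    return coverage - 0.5 * energy

def evolve(n_cells=10, pop_size=30, generations=50):
    pop = [[random.uniform(0, 2) for _ in range(n_cells)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_cells)     # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n_cells)          # point mutation
            child[i] = max(0.0, child[i] + random.gauss(0, 0.2))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print([round(p, 2) for p in best], round(fitness(best), 2))
```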


Max Smolaks

Max 'Beast from the East' Smolaks covers open source, public sector, startups and technology of the future at TechWeekEurope. If you find him looking lost on the streets of London, feed him coffee and sugar.
