Wednesday, April 20, 2016

Link Budget, Joules Budget and User Capacity IV: Receiver Sensitivity and Internet of Small Things (IoST)

There are many parameters and specifications defined for quantifying the performance of a receiver design and implementation. Among them, the most notable are the reference sensitivity power level (REFSENS) and the Total Isotropic Sensitivity (TIS), which are widely used to specify how sensitive a receiver is. The more sensitive a receiver is, the less power it requires to maintain a reliable communication link and the better its performance is believed to be.

In general, REFSENS measures the performance of the receiver module, taking into account its down-conversion performance, demodulation/decoding capability and self-generated interference/noise. Per the 3GPP definition, REFSENS denotes the minimum mean power applied to each applicable receive antenna port at which the throughput meets or exceeds the requirement, namely not less than 95% of the maximum throughput of the specified reference measurement channel. See §7.3 and Annexes A.2.2 (for FDD), A.2.3 (for TDD) and A.3.2 of 3GPP TS 36.101 and 3GPP TS 36.521-1. For example, the FDD QPSK REFSENS for a two-antenna UE operating in a 10 MHz Band 13 channel is at least -94 dBm, and in a 10 MHz Band 4 channel it is -97 dBm. In both cases, 50 resource blocks with a payload of 5160 bits and 1 code block per sub-frame are allocated. This means the corresponding peak data rate is (5160+24)/0.001 = 5.184 Mbps and the maximum achievable spectral efficiency is 5,184,000 / (180,000 x 50) = 0.58 bit/second/Hz.
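For those who prefer to see the arithmetic spelled out, here is a small Python sketch of the reference-channel numbers above (the extra 24 bits correspond to the CRC assumed in the calculation):

```python
# Rough arithmetic behind the Band 13 / Band 4 REFSENS reference channel numbers.
PAYLOAD_BITS = 5160        # payload bits per 1 ms sub-frame
CRC_BITS = 24              # CRC bits assumed in the calculation above
SUBFRAME_S = 0.001         # sub-frame duration in seconds
RB_COUNT = 50              # allocated resource blocks
RB_BW_HZ = 180_000         # bandwidth of one LTE resource block

peak_rate_bps = (PAYLOAD_BITS + CRC_BITS) / SUBFRAME_S
spectral_eff = peak_rate_bps / (RB_BW_HZ * RB_COUNT)

print(f"Peak data rate: {peak_rate_bps / 1e6:.3f} Mbps")      # ~5.184 Mbps
print(f"Spectral efficiency: {spectral_eff:.2f} bit/s/Hz")    # ~0.58 bit/s/Hz
```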


From the Shannon curves plotted above, the required minimum SNR is -3.0 dB.  Therefore, for an RFIC with a noise figure of 7.0 dB, the best achievable REFSENS is
Best REFSENS = Effective_Noise_Floor + Required_SNR
             = Thermal_Noise_Floor + RFIC_Noise_Figure + Required_SNR
             = -104.5 dBm + 7.0 dB - 3.0 dB
             = -100.5 dBm
As such, there is a margin of 6.5 dB for Band 13 and 3.5 dB for Band 4.
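The same bound can be written as a few lines of Python, a minimal sketch assuming an AWGN Shannon limit, a 290 K thermal noise floor and the 7.0 dB noise figure used above:

```python
import math

def best_refsens_dbm(bandwidth_hz, spectral_eff_bps_hz, noise_figure_db):
    """Shannon-limited receiver sensitivity for a given occupied bandwidth."""
    thermal_floor_dbm = -174.0 + 10 * math.log10(bandwidth_hz)    # kT at 290 K
    snr_min_db = 10 * math.log10(2 ** spectral_eff_bps_hz - 1)    # Shannon minimum SNR
    return thermal_floor_dbm + noise_figure_db + snr_min_db

best = best_refsens_dbm(50 * 180e3, 0.58, 7.0)   # 50 RBs = 9 MHz, 0.58 bit/s/Hz
print(f"Best REFSENS:   {best:.1f} dBm")         # ~ -100.5 dBm
print(f"Band 13 margin: {-94 - best:.1f} dB")    # ~ 6.5 dB
print(f"Band 4  margin: {-97 - best:.1f} dB")    # ~ 3.5 dB
```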

Further, per 3GPP TS 36.101 and 3GPP TS 36.521-1, the QPSK REFSENS for a category 0 HD-FDD UE is -92.3 dBm in a 10 MHz Band 13 channel and -95.3 dBm in a 10 MHz Band 4 channel, both with 36 RBs (i.e., 6.48 MHz) and a target coding rate of 1/10.  The corresponding peak spectral efficiency is 0.16 bit/second/Hz and, per Shannon theory, the best category 0 REFSENS is -108.3 dBm.  The QPSK REFSENS for a category M1 HD-FDD UE is -100 dBm in a 10 MHz Band 13 channel and -103 dBm in a 10 MHz Band 4 channel, both with 6 RBs (i.e., 1.08 MHz) and a target coding rate of 1/3. The corresponding peak spectral efficiency is 0.58 bit/second/Hz, and the best HD-FDD category M1 QPSK REFSENS is -109.6 dBm, which is 1.3 dB better than that of category 0, which uses 6 times the bandwidth.
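Applying the same Shannon-limit sketch to the category 0 and category M1 reference channels gives a rough check; the small differences from the -108.3 dBm and -109.6 dBm figures above come from the rounded spectral efficiencies used here:

```python
import math

# Same Shannon-limit bound as above, applied to the Cat 0 and Cat M1
# reference channels quoted in the text (7 dB noise figure assumed as before).
def best_refsens_dbm(bw_hz, eff_bps_hz, nf_db=7.0):
    return -174 + 10 * math.log10(bw_hz) + nf_db + 10 * math.log10(2 ** eff_bps_hz - 1)

cat0 = best_refsens_dbm(36 * 180e3, 0.16)     # 36 RBs = 6.48 MHz, 0.16 bit/s/Hz
catm1 = best_refsens_dbm(6 * 180e3, 0.58)     # 6 RBs = 1.08 MHz, 0.58 bit/s/Hz
print(f"Best Cat 0  REFSENS: {cat0:.1f} dBm")     # ~ -108.2 dBm
print(f"Best Cat M1 REFSENS: {catm1:.1f} dBm")    # ~ -109.7 dBm
```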

Different from REFSENS, which quantifies the receiver sensitivity at the output of the antenna, i.e., at the antenna port and RF connector, TIS quantifies the sensitivity at the input to the antenna. If everything else is the same, the difference between the two is the antenna and its connector. Fundamentally, TIS measures the average sensitivity of a mobile device over the downlink band. It is a function of the antenna, the receiver module, and the operating environment defined by the test cases. It equals the conducted receiver sensitivity degraded by the radiation efficiency of the antenna, as well as by any other disturbances guided through the antenna.  As such, per §8.2 of 3GPP TR 37.902 and §6.8 of the CTIA Test Plan for Wireless Device Over-The-Air Performance, TIS similarly denotes the spatially averaged minimum RF power level resulting in a data throughput of not less than 95% of the maximum throughput for each test case defined for REFSENS measurements.
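As a rough numerical illustration of that relationship, not the CTIA measurement procedure itself, assume a hypothetical antenna with 50% total radiated efficiency:

```python
import math

# Rough link between conducted sensitivity and TIS, assuming the only
# degradation is the antenna's total radiated efficiency (a simplification;
# the CTIA procedure averages measured sensitivity over the sphere).
def tis_dbm(conducted_sensitivity_dbm, radiated_efficiency):
    """radiated_efficiency is a linear value in (0, 1]."""
    return conducted_sensitivity_dbm - 10 * math.log10(radiated_efficiency)

# Hypothetical example: -94 dBm conducted REFSENS and a 50% efficient antenna.
print(f"TIS ~ {tis_dbm(-94.0, 0.5):.1f} dBm")    # ~ -91.0 dBm, i.e., 3 dB worse
```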




Friday, April 8, 2016

Link Budget, Joules Budget and User Capacity III: Trade-Offs & Limits of IoT Link Budget and Battery Life

Less than two years ago, I had a client who challenged me to improve the downlink sensitivity of an LTE receiver by 10 dB while maintaining its small form factor and battery life. At the beginning, I was puzzled by her intention and tried hard to convince her that the mobile network as a whole may not benefit much from such a single improvement. There are many other factors, e.g., battery life, downlink-uplink coupling, device cost, inter-cell interference and user capacity, to consider.  One year later, I was again amazed by the rapid rise of so many internet of things (IoT) systems, each pretty much claiming a ~170 dB link budget and a ~10-year battery life, in addition to ultra-low device cost, enormous user capacity supporting tens of thousands of devices per base station per channel, and, did I mention this, in a very noisy unlicensed ISM band. goo.gl/pJb0KF.  Think about this: a household alkaline AA battery can only supply up to 3900 mWh / 8760 h = 0.45 mW of average power for one year.  On the other hand, a regular RF power amplifier can have a maximum power consumption of more than 1 Watt. (For PA power efficiency, see goo.gl/5rBk2a) Jeez, all this sounds too good to be true to me ...  There is a catch, isn't there? What are the bottom lines for the IoT link budget and battery life?
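Here is the back-of-the-envelope battery arithmetic, for the record:

```python
# Average power a typical alkaline AA battery can sustain over one year.
AA_ALKALINE_MWH = 3900          # nominal alkaline AA capacity in mWh
HOURS_PER_YEAR = 8760

avg_power_mw = AA_ALKALINE_MWH / HOURS_PER_YEAR
print(f"Average power over one year: {avg_power_mw:.2f} mW")   # ~0.45 mW
```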

By definition, the link budget is the ratio between the signal power at the receiver antenna output and that at the transmitter antenna input. It can be calculated with the Friis transmission equation, accounting for all the gains and losses across the whole transmitter and receiver chain.  From a Friis-equation and signal-processing perspective, the link budget (in dB) improves logarithmically with increasing signal spreading gain and decreasing signal bandwidth, in addition to increasing transmit power. The catch, however, is that as you increase the spreading or coding gain in the time domain, or reduce the signal bandwidth in the frequency domain, the device's battery life shrinks roughly linearly.  In other words, if you want to improve the cell coverage of an IoT system, the battery life of the served IoT devices will be shortened, unless a larger battery is used.
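A minimal sketch of this tradeoff, with an assumed transmit power consumption and baseline symbol rate purely for illustration:

```python
import math

# A spreading gain G buys 10*log10(G) dB of link budget, but the device
# transmits G times longer per bit, so transmit energy per bit grows by G.
TX_POWER_W = 0.2          # assumed power consumption while transmitting
BASE_RATE_BPS = 1000      # assumed raw bit rate before spreading

for gain in (1, 8, 64, 512):
    link_budget_gain_db = 10 * math.log10(gain)
    energy_per_bit_j = TX_POWER_W * gain / BASE_RATE_BPS
    print(f"G={gain:4d}: +{link_budget_gain_db:4.1f} dB link budget, "
          f"{energy_per_bit_j * 1e3:.1f} mJ per bit")
```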

Now the question becomes what the bottom lines or limits are for the link-budget and battery-life tradeoffs.  As we know, many communication system design limits are well modeled and determined by information theory.  From an information theory standpoint, the minimum energy-per-bit to noise power spectral density ratio, Eb/N0 = (Es/r)/N0, is bounded by the Shannon equation to be greater than -1.59 dB, whatever coding and modulation schemes are used.  This means that, for a reliable transmission, e.g., making a connection or sending a Yes or No, between a transmitter and a receiver, the received signal energy per bit must exceed the Shannon minimum energy requirement, which is 1.59 dB below the noise floor. Accordingly, if we consider an IoT system with a bandwidth of 1 kHz, a spreading gain of 64, a 4-antenna receiver and 10 dB of co-channel interference, the Shannon maximum link budget is about 177 dB. For details and other assumptions, see the spreadsheet linked below. From the example presented in the spreadsheet, a 10-year battery life is possible if an IoT device is assumed to send only 200 symbols every hour, nothing else, and is equipped with at least 7 AA Li-FeS2 batteries, each having 30% more energy than a regular alkaline AA battery.  I should also mention that, though the discussed usage case can serve as a benchmark, it is not very useful or realistic for many practical applications.
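For the curious, here is a very rough Python reconstruction of that ~177 dB number; the transmit power and noise figure below are my assumptions and may differ from what the spreadsheet uses:

```python
import math

# Rough Shannon-limit link budget using the parameters quoted above (1 kHz
# bandwidth, spreading gain 64, 4 receive antennas, 10 dB co-channel
# interference). Transmit power and noise figure are assumed values.
TX_POWER_DBM = 23.0                               # assumed transmit power
NOISE_FIGURE_DB = 5.0                             # assumed receiver noise figure
BANDWIDTH_HZ = 1e3
SPREADING_GAIN_DB = 10 * math.log10(64)           # ~18.1 dB
RX_COMBINING_GAIN_DB = 10 * math.log10(4)         # ideal 4-antenna combining
INTERFERENCE_MARGIN_DB = 10.0
SHANNON_EBN0_DB = 10 * math.log10(math.log(2))    # ~ -1.59 dB

noise_floor_dbm = -174 + 10 * math.log10(BANDWIDTH_HZ) + NOISE_FIGURE_DB
min_rx_power_dbm = (noise_floor_dbm + INTERFERENCE_MARGIN_DB + SHANNON_EBN0_DB
                    - SPREADING_GAIN_DB - RX_COMBINING_GAIN_DB)
# Lands near the ~177 dB figure quoted above under these assumptions.
print(f"Maximum link budget ~ {TX_POWER_DBM - min_rx_power_dbm:.0f} dB")
```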