Tuesday, July 17, 2018

What Does The IoT Ultra-High Density Requirement Mean?

IoT networks are expected to connect billions of devices in the next several years.  One ambitious 5G requirement is to serve the massive Internet of Things (IoT).  Accordingly, it demands ultra-low energy consumption (10+ years of battery life), deep coverage and ultra-high density (~1 million nodes per square kilometer).  As such, 5G networks require each base station to be able to receive a high volume of access requests from end-devices in a short period, say 30 minutes.  Are these requirements really necessary, and are they achievable?


These requirements clearly matter most for IoT networks operated in high-population-density cities, where a large portion of IoT end-devices will be deployed.  For example, as shown in the table below, the population density of Chennai, India is 25,854 people per square kilometer.  If we assume, on average, one IoT device per capita and an IoT base station coverage radius of 4 km, the number of IoT devices served per base station is expected to be 25,854 x 3.14 x 16 = 1,298,905.  This means that, over a 30-minute period and assuming no retransmissions, a 4 km IoT cell deployed in Chennai, India faces an expected access demand of 1,298,905 accesses per 30 minutes.

The top 20 city districts with the highest population densities. Source: Wikipedia
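
The back-of-the-envelope arithmetic above is easy to reproduce; the Python sketch below uses the one-device-per-capita and 4 km coverage-radius assumptions stated in the text, so the output is only as good as those assumptions.

import math

density = 25_854          # persons per square kilometer (Chennai, India)
radius_km = 4.0           # assumed base-station coverage radius
coverage_km2 = math.pi * radius_km ** 2

# One IoT device per capita is assumed, as in the text above.
devices_per_bs = density * coverage_km2
print(f"Coverage area: {coverage_km2:.1f} km^2")
print(f"Expected IoT devices per base station: {devices_per_bs:,.0f}")
# With pi rounded to 3.14 this is the 25,854 x 3.14 x 16 = 1,298,905 figure above.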

For LoRaWAN, on the other hand, assuming a 125 kHz channel bandwidth, a spreading factor of 12, and 23 dBm RF output, the shortest access packet is 172 ms of preamble + 262 ms of payload = 434 ms.  As such, the maximum non-collision access density is 33.2 access packets per kHz per 30 minutes, and the maximum non-collision access capacity is 331,600 accesses per 30 minutes per 10 MHz. This means that the expected access demand for Chennai, India is about FOUR times the no-collision maximum access capacity of the LoRaWAN network. In other words, LoRaWAN may not be able to handle the traffic demand of Chennai, India.

Similarly for LTE, assuming PRACH Format 3 and 23 dBm RF output, the maximum non-collision access density is 1,728 access packets per kHz per 30 minutes and the maximum non-collision access capacity is 17,280,000 accesses per 30 minutes per 10 MHz. This means that the expected access demand for Chennai, India is only 7.5% of the no-collision maximum access capacity of an LTE network.
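
For completeness, here is a small Python sketch comparing the Chennai demand against the two no-collision capacities; the LoRaWAN capacity is re-derived from the 434 ms packet duration, while the LTE figure is simply the number quoted above rather than a derivation from the PRACH parameters.

window_s = 30 * 60                 # 30-minute observation window
demand = 1_298_905                 # expected accesses per 30 minutes (from above)

# LoRaWAN: 434 ms access packet on a 125 kHz channel, 80 such channels in 10 MHz
packet_s = 0.434
channels = 10_000 / 125
lora_capacity = window_s / packet_s * channels
print(f"LoRaWAN no-collision capacity: {lora_capacity:,.0f} accesses per 30 min per 10 MHz")
print(f"Demand / LoRaWAN capacity:     {demand / lora_capacity:.1f}x")

# LTE PRACH Format 3 capacity as quoted in the text (not re-derived here)
lte_capacity = 17_280_000
print(f"Demand / LTE capacity:         {demand / lte_capacity:.1%}")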

Why is there such a big difference?  Obviously it is a consequence of the system design, which carries a large amount of overhead that is hard to avoid.



Thursday, July 13, 2017

Evolve Random Access Channels for IoT: II Existing Access Probe Designs


Access Channel Enhancements for 1x Rel. F
One Eighth Rate Access Probes for Smart Terminals
Evolve Random Access Channels for IoT: I Introduction
Evolve Random Access Channels for IoT: III Slotted ALOHA Models
Evolving Random Access Channel for IoT: IV Dumb Access Probes and Smart Access Probes


IoT traffic characteristics are known to be substantially different from those of smartphones. Mobile networks were traditionally designed and optimized to transport connection-oriented traffic, where each connection is expected to be continuous and low latency. Many IoT services, on the other hand, are intended to be low throughput, short duration and delay tolerant, and can be characterized as "connectionless." As such, while existing mobile network operators have started studying and optimizing their networks to accommodate the dynamics of growing IoT traffic, some companies have started developing possible alternatives for IoT services.



LoRaWAN

LoRaWAN (low power wide area network) is a connectivity technology proposed by the LoRa Alliance for battery-operated applications. It is designed to support only low-rate burst data packets between servers and end-devices. As such, it does not include connection-oriented signaling, in order to save power. For example, there is no synchronization mechanism between receivers and transmitters; the communications between receiver and transmitter are completely ad hoc.  Further, in order to increase its link sensitivity for extended coverage, LoRaWAN employs spread-spectrum and coding techniques to reduce the minimum demodulation and detection SINR (Signal to Interference and Noise Ratio), at the expense of increasing signal transmission duration and collision probability.  LoRaWAN uses a pure ALOHA protocol to connect end-devices with their server: it allows an end-device to start a data exchange immediately when the data arrives. For example, its low-power Class A device starts the communication by sending an uplink access probe embedded with a data payload, followed by two short downlink receive windows for waiting and receiving corresponding downlink data from the server.
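
To make the coverage-versus-airtime trade-off concrete, below is a minimal Python sketch of LoRa time-on-air following the airtime formula published in Semtech's SX127x documentation; the preamble length, header mode, coding rate and payload size are illustrative assumptions, not values taken from this post.

import math

def lora_airtime(payload_bytes, sf=12, bw_hz=125_000, n_preamble=8,
                 coding_rate=1, explicit_header=True, crc_on=True,
                 low_dr_optimize=False):
    t_sym = (2 ** sf) / bw_hz                        # symbol duration in seconds
    t_preamble = (n_preamble + 4.25) * t_sym
    ih = 0 if explicit_header else 1
    de = 1 if low_dr_optimize else 0
    crc = 1 if crc_on else 0
    num = 8 * payload_bytes - 4 * sf + 28 + 16 * crc - 20 * ih
    n_payload = 8 + max(math.ceil(num / (4 * (sf - 2 * de))) * (coding_rate + 4), 0)
    return t_preamble + n_payload * t_sym

for sf in (7, 10, 12):
    # Low data-rate optimization is typically enabled for SF11/12 at 125 kHz
    t = lora_airtime(10, sf=sf, low_dr_optimize=(sf >= 11))
    print(f"SF{sf}: about {t * 1000:.0f} ms on air for a 10-byte payload")

Moving from SF7 to SF12 buys sensitivity but stretches a 10-byte packet from tens of milliseconds to roughly a second, which is exactly why the collision probability of the pure-ALOHA access grows with the spreading factor.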


LTE RACH (Random Access Channel)

LTE can support IoT data traffic with extremely low duty cycles through its slotted-ALOHA access probes, in which active user equipments (UEs) start their transmissions synchronously on predefined access boundaries. In LTE, UEs use an uplink channel, the Physical Random Access Channel (PRACH), to request a radio resource assignment for subsequent data exchange during the initial access to the system. The PRACH is allocated on each frame with up to 16 different RACH configurations and up to 10 PRACH resources per frame per access cycle. The case of 10, with one PRACH resource in each subframe, is designed for situations in which the access load is high.  For a network with large coverage, each frame supports 3 PRACH resources, each occupying 3 consecutive subframes. Similar to a slotted-ALOHA protocol, the transmission on PRACH is shared by all active UEs within the same sector. First, a UE randomly chooses one of at most 64 preamble sequences (Msg 1) and sends it to the eNodeB.  A collision can occur at the eNodeB when two or more UEs choose identical preamble sequences and send them at the same time. Preamble transmission may also fail due to insufficient transmission power. Depending on the configuration of an eNodeB, 3GPP defines the minimum PRACH detection performance of each eNodeB. For example, the PRACH missed-detection requirement for an eNodeB with 8 Rx antennas should be met at no less than -21 dB in an additive white Gaussian noise (AWGN) channel, with a false alarm rate of less than 0.1% and a detection success rate of not less than 99%.
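
To get a feel for how preamble collisions scale with the number of contending UEs, here is a small Python sketch that assumes each UE independently picks one of the 64 preambles uniformly at random on the same PRACH opportunity; this is an idealized slotted-ALOHA style model, not a 3GPP-specified computation.

def p_ue_collides(n_ues, n_preambles=64):
    # Probability that at least one other UE picked the same preamble
    return 1.0 - (1.0 - 1.0 / n_preambles) ** (n_ues - 1)

for n in (2, 5, 10, 30, 64):
    print(f"{n:3d} contending UEs -> P(preamble collision for a given UE) = {p_ue_collides(n):.1%}")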

Wednesday, April 20, 2016

Link Budget, Joules Budget and User Capacity IV: Receiver Sensitivities for Internet of Small Things (IoST)

There are many parameters or specifications defined for quantifying the performance of a receiver design and implementation. Among them, the most notable are the reference sensitivity power level (REFSENS) and the Total Isotropic Sensitivity (TIS), which are widely used to specify how sensitive a receiver is. The more sensitive a receiver is, the less power it requires to maintain a reliable communication link and the better its performance is believed to be.

In general, REFSENS measures the performance of the receiver module, taking into account its down-conversion performance, demodulation/decoding capability and self-generated interference/noise. Per the 3GPP definition, REFSENS denotes the minimum mean power applied to each applicable receive antenna port at which the throughput shall meet or exceed the requirement, namely not less than 95% of the maximum throughput of the specified reference measurement channel. See §7.3, Annexes A.2.2 (for FDD), A.2.3 (for TDD) and A.3.2 of 3GPP TS 36.101 and 3GPP TS 36.521-1.  For example, the FDD QPSK REFSENS for a two-antenna UE operating in a 10 MHz Band 13 channel is at least -94 dBm, and in a 10 MHz Band 4 channel it is -97 dBm.  In both cases, 50 resource blocks with a payload of 5160 bits and 1 code block per subframe are allocated.  This means the corresponding peak data rate is (5160+24)/0.001 = 5.184 Mbps, and the maximum achievable spectral efficiency is 5,184,000 / (180,000 x 50) = 0.58 bit/s/Hz.


From the Shannon curves plotted above, the required minimum SNR is -3.0 dB.  Therefore, considering an RFIC with a noise figure of 7.0 dB, the best achievable REFSENS is
Best REFSENS = Effective_Noise_Floor + Required_SNR
             = Thermal_Noise_Floor + RFIC_Noise_Figure + Required_SNR
             = -104.5 dBm + 7.0 dB - 3.0 dB
             = -100.5 dBm
As such, there is a margin of 6.5 dB for Band 13 and 3.5 dB for Band 4.
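
The arithmetic above is simple enough to script; the Python sketch below assumes the 9 MHz occupied bandwidth of 50 resource blocks, the 7.0 dB RFIC noise figure and the -3.0 dB minimum SNR used in this post.

import math

bw_hz = 50 * 180_000                              # 50 RBs x 180 kHz = 9 MHz
noise_floor_dbm = -174 + 10 * math.log10(bw_hz)   # thermal noise in the occupied bandwidth
noise_figure_db = 7.0
required_snr_db = -3.0

best_refsens = noise_floor_dbm + noise_figure_db + required_snr_db
print(f"Thermal noise floor: {noise_floor_dbm:.1f} dBm")
print(f"Best-case REFSENS:   {best_refsens:.1f} dBm")
print(f"Margin vs Band 13 (-94 dBm): {-94 - best_refsens:.1f} dB")
print(f"Margin vs Band 4 (-97 dBm):  {-97 - best_refsens:.1f} dB")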

Further, per 3GPP TS 36.101 and 3GPP TS 36.521-1, the QPSK REFSENS for a category 0 HD-FDD UE is -92.3 dBm in a 10 MHz Band 13 channel and -95.3 dBm in a 10 MHz Band 4 channel, both with 36 RBs (i.e., 6.48 MHz) and a target coding rate of 1/10.  The corresponding peak spectral efficiency is 0.16 bit/s/Hz. On the other hand, per Shannon theory, the best category 0 REFSENS is -108.3 dBm.  The QPSK REFSENS for a category M1 HD-FDD UE is -100 dBm in a 10 MHz Band 13 channel and -103 dBm in a 10 MHz Band 4 channel, both with 6 RBs (i.e., 1.08 MHz) and a target coding rate of 1/3. The corresponding peak spectral efficiency is 0.58 bit/s/Hz, and the best HD-FDD category M1 QPSK REFSENS is -109.6 dBm, which is 1.3 dB better than that of category 0, even though category 0 uses six times the bandwidth.

Different from REFSENS, which quantifies the receiver sensitivity at the antenna port and RF connector (i.e., at the output of the antenna), TIS quantifies the sensitivity at the input to the antenna. If everything else is the same, the difference between the two is the antenna and the connector to it. Fundamentally, TIS measures the average sensitivity of a mobile device across the downlink band. It is a function of the antenna, the receiver module, and the operating environment defined by the test cases. It is equal to the conducted receiver sensitivity of the receiver degraded by the radiation efficiency of the antenna as well as by any other disturbances guided through the antenna.  As such, per §8.2 of 3GPP TR 37.902 and §6.8 of the CTIA Test Plan for Wireless Device Over-The-Air Performance, TIS similarly denotes the spatially averaged minimum RF power level resulting in a data throughput of not less than 95% of the maximum throughput, for each test case defined for REFSENS measurements.




Friday, April 8, 2016

Link Budget, Joules Budget and User Capacity III: Trade-Offs & Limits of IoT Link Budget and Battery Life

Less than two years ago, a client challenged me to improve the downlink sensitivity of an LTE receiver by 10 dB while maintaining its small form factor and battery life. At first I was puzzled by her intention and tried hard to convince her that the mobile network as a whole may not benefit much from such a single improvement; there are many other factors, e.g., battery life, downlink-uplink coupling, device cost, inter-cell interference and user capacity, to consider.  One year later, I was amazed by the rapid rise of so many Internet of Things (IoT) systems, each pretty much claiming a ~170 dB link budget and a ~10-year battery life, in addition to ultra-low device cost, enormous user capacity supporting tens of thousands of devices per base station per channel, and, did I mention this, all in a very noisy unlicensed ISM band. goo.gl/pJb0KF.  Think about this: a household Alkaline AA battery can only supply up to 3900 mWh / 8760 h = 0.45 mW of average power for one year.  On the other hand, a regular RF power amplifier can easily consume more than 1 Watt at maximum output. (For PA power efficiency, see goo.gl/5rBk2a.) Jeez, all this sounds too good to be true to me...  There is a catch, isn't there? What are the bottom lines for the IoT link budget and battery life?

By definition, link budget is the ratio between the signal power at the receiver antenna output and at the transmitter antenna input. It can be calculated through the Friis transmission equation, accounting for all the gains and losses across the whole transmitter and receiver chain.  From a Friis-equation and signal-processing perspective, the link budget (in dB) improves logarithmically with increasing signal spreading gain and decreasing signal bandwidth, in addition to increasing transmitted signal power.  The catch, however, is that as you increase the spreading gain or coding gain in the time domain, or reduce the signal bandwidth in the frequency domain, the device's battery life is reduced linearly.  In other words, if you want to improve the cell coverage of an IoT system, then the battery life of the served IoT devices will be shortened, unless a larger battery is used.
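
As a quick illustration of the Friis bookkeeping, the Python sketch below computes free-space path loss and the resulting received power; the carrier frequency, transmit power, antenna gains and distances are illustrative assumptions, and real deployments see considerably more loss than free space.

import math

def fspl_db(freq_hz, dist_m):
    # Free-space path loss from the Friis transmission equation
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / 3.0e8)

tx_dbm, gtx_db, grx_db = 14.0, 0.0, 0.0           # assumed transmit power and antenna gains
for d_km in (1, 5, 20):
    prx = tx_dbm + gtx_db + grx_db - fspl_db(915e6, d_km * 1e3)
    print(f"{d_km:2d} km at 915 MHz -> roughly {prx:.1f} dBm received in free space")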

Now the question becomes what the bottom lines, or limits, are for the link budget and battery life trade-offs.  As we know, many communication system design limits are well modeled and determined by information theory.  From an information theory standpoint, the minimum energy per bit to noise power spectral density ratio, Eb/N0 = (Es/r)/N0, is determined by the Shannon equation to be greater than -1.59 dB, whatever coding and modulation schemes are used.  This means that for a reliable transmission, e.g., making a connection or sending a Yes or No, between a transmitter and a receiver, the received signal energy per bit must be greater than the Shannon minimum, which is about 1.59 dB below the noise power spectral density.  Accordingly, if we consider an IoT system with a bandwidth of 1 kHz, a spreading gain of 64, a 4-antenna receiver and 10 dB of co-channel interference, the Shannon maximum link budget is about 177 dB. For details and other assumptions, see the spreadsheet linked below. From the example presented in the spreadsheet, a 10-year battery life is possible if an IoT device is assumed to send only 200 symbols every hour, nothing else, and is equipped with at least 7 AA Li-FeS2 batteries, each having 30% more energy than a regular Alkaline AA battery.  I should also mention that, though the discussed usage case can be used as a benchmark, it is not very useful or realistic for many practical applications.
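
As a rough sanity check of the battery side of the trade-off, the Python sketch below reproduces the average-power arithmetic using the battery figures quoted above; the 25 mW transmit-chain power is an invented illustrative number, not a value from the linked spreadsheet.

ALKALINE_MWH = 3_900                     # capacity of a household Alkaline AA, as above
LIFES2_MWH = ALKALINE_MWH * 1.3          # Li-FeS2 AA with "30% more energy"
HOURS_PER_YEAR = 8_760

def avg_power_budget_mw(n_batteries, capacity_mwh, years):
    # Average power the device may draw if it is to last the given number of years
    return n_batteries * capacity_mwh / (years * HOURS_PER_YEAR)

print(f"1 Alkaline AA over 1 year  : {avg_power_budget_mw(1, ALKALINE_MWH, 1):.2f} mW")
print(f"7 Li-FeS2 AA over 10 years : {avg_power_budget_mw(7, LIFES2_MWH, 10):.2f} mW")

# With an assumed 25 mW draw while transmitting, the sustainable duty cycle is
budget = avg_power_budget_mw(7, LIFES2_MWH, 10)
print(f"Allowed transmit duty cycle at 25 mW: {budget / 25:.2%}")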

Tuesday, October 22, 2013

How to Airplay Music from iPhone to Windows Computer via Bluetooth?

AirPlay Music, a.k.a. AirTunes, is part of the proprietary AirPlay protocol stack developed by Apple Inc. It is designed to use UDP and the RTSP network control protocol for streaming audio over an IP network.  So far, AirPlay can only be used between AirPlay-enabled Apple devices and/or licensed third-party audio equipment.  As such, to AirPlay music from your iPhone to a Windows laptop, you need an AirPlay audio bridge such as an Apple AirPort Express.  In this blog, I want to show you that you can do the same thing via a Bluetooth connection instead of an AirPlay audio bridge.  And it is, indeed, very simple.

Most likely your Windows computer comes with a Broadcom Bluetooth chipset.  You can confirm this through the Windows Device Manager.  If this is the case, go to the Broadcom Bluetooth Software Download page and download the updated Bluetooth for Windows software.

After installing the updated Bluetooth software, you should be able to pair your iPhone with the computer.  Then you can AirPlay music from your iPhone to the computer.


Saturday, December 1, 2012

Hack Apple TV for Watching Chinese TVs and Videos

[Hack Patriot Box Office for Watching Chinese Videos and TVs]
Figure 1.  Apple TV
Apple TV (ATV) is a multi-function set-top box designed by Apple Inc., originally as a networked media player for streaming multimedia content to a television or an external monitor.  Now it has seemingly evolved into an iPod Touch without a touch screen, but with a relatively inexpensive MSRP of $99.  (Indeed, if you have an old-generation iPod Touch, you may think about mirroring or AirPlaying it onto a television instead.)  On March 7, 2012, Apple Inc. released the 3rd generation ATV (ATV3), which includes a 32 nm ARM Cortex-A9 Apple A5 SoC (the same process as used in the 5th generation iPod Touch, the new iPad 2 and the iPad Mini) and 512 MB of mobile DDR2 memory (the same amount of memory as used in an iPhone 4).  An ATV3 has a built-in 6 W power supply and needs no cooling fan.  Unfortunately, until now there is no publicly released jailbreak available for the ATV3, so modified software or apps such as XBMC cannot run on it.

Because the Apple TV (ATV) is such an elegant device, with a small footprint and intuitive-to-use functionality, I know many people want to use it for watching all kinds of internet videos, e.g., the Chinese TV channels and videos widely available over the internet.  Below, a small hack, or trick, for this purpose will be explained.  A nice feature of this hack is that no modification of either the hardware or the software of your Apple TV is necessary.  No jailbreaking is required and, therefore, the manufacturer's warranty is fully preserved.

In this blog, I will explain a set of configurations applicable to an ATV3 with Apple TV iOS 4.4+ updates, which include the app iTunes Movie Trailers.  The set of configurations includes several ATV client-side configurations and two optional web server-side configurations.  If you can fluently read Chinese, here are references to this hack:
Before starting the configurations, it may be necessary for you to: 
  1. verify that your ATV is powered and properly connected to your television;
  2. verify that you can navigate the Apple TV interface;
  3. verify that your ATV is properly connected to your network;
  4. verify that the network is properly configured.  
For "How to start your ATV?," you may check online references such as http://support.apple.com/kb/HT2280  or http://support.apple.com/manuals/#appletv.


In fact, the original Trailers app is a link pointing to an XML file hosted on an Apple server: http://trailers.apple.com/appletv/index.xml.  Apparently, a Chinese ATV fan figured out a way to set up his own DNS server at 210.129.145.150 and to configure an ATV to redirect the original Trailers link to his own XML file.  In the following, the set of ATV client-side configurations shows you how to set up your Apple TV and redirect the original Trailers link to a new XML file hosted on a Chinese server.  After that, another set of server-side configurations shows you how to further personalize the new XML file.

ATV Client Side Configurations

  1. ATV Network Configurations.
    1. From the main Apple TV menu, choose Settings -> General -> Network -> Configure TCP/IP -> Manual
    2. Keep your current working "IP Address" unchanged
    3. Keep your current working "Subnet Mask" unchanged
    4. Keep your current working "Router" unchanged
    5. When prompted, change "DNS Address" to "210.129.145.150"
    6. Press the Menu button on the remote once to return to the previous screen.
  2. ATV iTunes Configuration
    1. From the main Apple TV menu, choose Settings -> iTunes Store. 
    2. Confirm "Location" is "United States".  Otherwise, scroll to highlight "United States" and select it.
    3. Press the Menu button on the remote once to return to the previous screen.
  3. Find the UID of your ATV and/or register an account for the later server-side configurations

    1. From the main Apple TV menu, choose Trailers. You may find that the homepage of the Apple Trailers app has been replaced by a new homepage, on which there is an "Apple TV" link listed in the middle, next to three popular Chinese video sites (iQiyi, Sohu Video and PPTV) listed at the top.
    2. From the main Apple TV menu, choose Trailers -> Apple TV -> Personal -> Personal Links
    3. Find the text "个人 xxxxxxxxx"; this xxxxxxxx is the UID of your ATV.
    4. Alternatively, from the main Apple TV menu, choose Trailers -> Apple TV -> Personal, where you can register an account.

Web Server Side Configurations (Optional)

After you finish the above client-side configuration on your Apple TV, you should be able to find several pre-loaded apps or links displayed after you click the Trailers app.  Among the pre-loaded apps, there is a "Personal app" ("个人 app") in the middle of the screen.  This "Personal app" provides a directory storing links to your favorite programs or TV sources.  Here are two approaches for configuring your "Personal app":
  1. Approach 1:  If you have found your UID from your ATV and want to use it for the server side configurations, go to http://www.atvttvv.net/login.html 
  2. Approach 2: If you have registered an account on your ATV, you can go to http://serv.tttnt.com:8580/login.html

Friday, April 27, 2012

How to Improve Forward Link Positioning for Cellular Networks? III. Hearability and Accuracy

How to Improve Forward Link Positioning ... ? I. Introduction
1x HDP Enhancements
Enhanced Location Based Services Support in cdma2000
Enhance Downlink Positioning in WiMAX/16m
How Wide A Wideband Channel Should Be?
IEEE ICC 2008 Tutorial, Location Based Services for Mobiles
Location Based Services for Mobiles: I. Introduction

Hearability Issue

Hearability of a forward-link positioning system is usually quantified by how many reference signals a terminal can utilize to make a positioning fix within a pre-defined positioning duration.  In theory, a terminal needs to measure parameters of only 4 different reference signals for a precise three-dimensional fix.  However, the more reference signals a terminal can use, the more diversity benefit it may exploit for a more accurate positioning fix.

A hearability issue of a cellular positioning network generally is a dimension limitation issue; it is mostly due to limitations of network geometry and network deployment.  In other words, it is a network issue.  For example, for a given cellular network, say a CDMA2000 1x RTT network or a WCDMA network, hearability mainly depends on the network topology and the frequency reuse factor of the cellular network.  The network topology, including network sectorization, may affect the achievable DOP (dilution of precision) values for positioning.  The frequency reuse factor may have a significant impact on the co-channel interference experienced by a terminal, which in turn relates to the positioning accuracy achievable by the terminal.  The hearability of an exemplary CDMA2000 1x RTT network is shown in Figure 1.

Figure 1. The hearability of CDMA2000 1x Pilots for AFLT, IEEE ICC 2008 "cdma2000 Highly Detectable Pilot" 
On the other hand, since the major considerations for an actual deployment of cellular network base stations are voice and data service capacity, environmental impact, financial limitations, etc., a mobile phone network is usually not optimized for mobile positioning by nature.
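
To illustrate how the network topology drives the achievable DOP, here is a small Python sketch for a 2-D position fix with an unknown clock bias; the base-station coordinates are made up for illustration, and the model ignores sectorization and measurement weighting.

import numpy as np

def gdop(bs_xy, ue_xy=(0.0, 0.0)):
    # Each row of H is the unit vector from the terminal to a base station,
    # augmented with 1 for the clock-bias term; GDOP = sqrt(trace((H^T H)^-1)).
    rows = []
    for x, y in bs_xy:
        dx, dy = x - ue_xy[0], y - ue_xy[1]
        r = np.hypot(dx, dy)
        rows.append([dx / r, dy / r, 1.0])
    H = np.array(rows)
    return float(np.sqrt(np.trace(np.linalg.inv(H.T @ H))))

well_spread = [(1000, 0), (-500, 866), (-500, -866), (0, 1500)]   # sites surrounding the terminal
clustered   = [(1000, 0), (1100, 100), (900, -100), (1200, 50)]   # sites all to one side
print(f"GDOP, well-spread sites: {gdop(well_spread):.2f}")
print(f"GDOP, clustered sites  : {gdop(clustered):.2f}")

With the same number of hearable reference signals, the clustered geometry yields a far larger GDOP, reflecting the network-geometry limitation described above.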

Accuracy Issue

The positioning accuracy of cellular-network forward-link positioning is also a dimension limitation issue.  The accuracy is mainly limited by the frequency reuse factor and the available bandwidth.  In general, for a given positioning duration, the wider the bandwidth of the received reference signals, the more uncorrelated signal samples a terminal can obtain.  On the other hand, it is known that the achievable SNR highly depends on the frequency reuse factor of the cellular network.  More particularly, the CRLB (Cramér-Rao lower bound) of the achievable positioning accuracy is asymptotically linear in the number of uncorrelated signal samples and in the SNR value in dB.
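
The bandwidth and SNR dependence can be illustrated with the textbook CRLB for time-of-arrival estimation of a known waveform in AWGN, var(tau) >= 1 / (8 * pi^2 * beta_rms^2 * SNR); the RMS bandwidths and post-processing SNR values in the Python sketch below are illustrative, and the single-path AWGN assumption ignores the multipath effects discussed next.

import math

C = 3.0e8  # speed of light in m/s

def ranging_std_m(rms_bw_hz, snr_db):
    # CRLB-limited standard deviation of a time-of-arrival based range estimate
    snr = 10 ** (snr_db / 10)
    var_tau = 1.0 / (8 * math.pi ** 2 * rms_bw_hz ** 2 * snr)
    return C * math.sqrt(var_tau)

for bw in (1.25e6, 5e6, 10e6):
    for snr_db in (0, 10, 20):
        print(f"RMS bandwidth {bw / 1e6:5.2f} MHz, SNR {snr_db:2d} dB -> "
              f"ranging std of about {ranging_std_m(bw, snr_db):6.2f} m")

Doubling the signal bandwidth halves the CRLB-limited ranging error, while every additional 10 dB of post-processing SNR improves it by a factor of about 3.2.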

In addition, from a signal processing or receiver design perspective, the correlation between received signal samples largely depends on the sampling frequency applied to the received signals and on the achievable multipath resolution.  Multipath resolution is a function of both the channel delay profile and the bandwidth of the received signals.  For example, a statistical delay profile of an exemplary cellular network is shown in Figure 2.  Additional discussion of the statistical delay profile can be found in another blog, "How Wide A Wideband Channel Should Be?".   In general, the wider the bandwidth of the transmitted signal, the higher the achievable multipath resolution.

Figure 2. A statistical model of delay spread.

Additional References

[1] E. Sousa, V. Jovanovic, C. Daigneault, "Delay spread measurements for the digital cellular channel in Toronto," IEEE Transactions on Vehicular Technology, Nov. 1994.
[2] J. Ling, D. Chizhik, D. Samardzija, R. Valenzuela, "Wideband and MIMO measurements in wooded and open areas," Lucent Bell Laboratories.
[3] K. Baum, "Frequency-Domain-Oriented Approaches for MBWA: Overview and Field Experiments," Motorola Labs, IEEE C802.20-03/19, March 2003.
[4] L. Greenstein, V. Erceg, Y. S. Yeh, M. V. Clark, "A New Path-Gain/Delay-Spread Propagation Model for Digital Cellular Channels," IEEE Transactions on Vehicular Technology, Vol. 46, No. 2, May 1997, pp. 477-485.
[5] A. Algans, K. I. Pedersen, P. Mogensen, "Experimental Analysis of the Joint Statistical Properties of Azimuth Spread, Delay Spread, and Shadow Fading," IEEE Journal on Selected Areas in Communications, Vol. 20, No. 3, April 2002, pp. 523-531.
[6] Spatial Channel Model AHG (Combined ad-hoc from 3GPP & 3GPP2), "Spatial Channel Model Text Description," 3GPP, 2003.