Estimating mobile user speed is a challenging problem with significant impact on both radio resource management and mobility management in Long Term Evolution (LTE) networks. This paper introduces two algorithms that estimate the speed of mobile user equipment (UE) with low computational requirements and without modification to either current user equipment or the 3GPP standard protocols. The proposed methods rely on uplink (UL) sounding reference signal (SRS) power measurements performed at the eNodeB (eNB) and remain efficient with large sampling periods (e.g., 40 ms or beyond). We evaluate the effectiveness of our algorithms using realistic LTE system data provided by the eNB Layer 1 team of Alcatel-Lucent. Results show that the UE speed classification required by LTE can be achieved with high accuracy. In addition, the algorithms have minimal impact on the central processing unit (CPU) and the memory of the eNB modem. They are thus practical for today's LTE networks and allow continuous, real-time UE speed estimation.
The coexistence of small cells and macro cells is a key feature of 4G and future networks. This heterogeneity, together with the increased mobility of user devices, can generate frequent handovers that may lead to an unreasonably high call drop probability or poor user experience. By performing smart mobility management, the network can proactively adapt to the user and guarantee seamless and smooth cell transitions. In this work, we introduce an algorithm that takes as input sounding reference signal (SRS) measurements available at the base station (eNodeB in 4G systems) to estimate the mobility level of the user with low computational requirements and no modification on the user device/equipment (UE) side. The performance of the algorithm is showcased using realistic data and mobility traces. Results show that UE speed can be classified into three mobility classes with accuracies of 87% for low mobility, 93% for medium mobility, and 94% for high mobility.
This paper presents an online algorithm for mobile user speed estimation in 3GPP Long Term Evolution (LTE)/LTE-Advanced (LTE-A) networks. The proposed method leverages uplink (UL) sounding reference signal (SRS) power measurements performed at the base station, also known as the eNodeB (eNB), and remains effective even under large sampling periods. Extensive performance evaluation of the proposed algorithm is carried out using field traces from a realistic environment. The online solution proves highly efficient in terms of computational requirements, estimation delay, and accuracy. In particular, we show that the proposed algorithm yields a first speed estimate after 10 seconds, with an average speed underestimation error of 14 km/h. After the first speed acquisition, subsequent speed estimates can be obtained much faster (e.g., every second) with limited implementation cost while still providing high accuracy.
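These abstracts do not spell out the estimator itself, so the following is a minimal, hypothetical Python sketch of one classical way to turn periodically sampled UL power measurements into a speed estimate: under Clarke's fading model, the lag-one autocorrelation of the power series depends on the maximum Doppler frequency through the Bessel function J0, and the Doppler frequency maps to speed for a given carrier frequency. The function name, the synthetic input, and the single-lag fit are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np
from scipy.special import j0
from scipy.optimize import brentq

C = 3e8  # speed of light (m/s)

def estimate_speed_kmh(power_samples, sampling_period_s, carrier_hz):
    """Speed estimate (km/h) from a 1-D array of periodic power samples."""
    p = np.asarray(power_samples, dtype=float)
    p = p - p.mean()
    # Lag-1 autocorrelation of the (mean-removed) power series.
    r1 = np.dot(p[:-1], p[1:]) / np.dot(p, p)
    r1 = float(np.clip(r1, 0.0, 1.0))          # guard against noise-induced out-of-range values
    target = np.sqrt(r1)                        # Clarke's model: r1 ~ J0(2*pi*f_d*tau)^2
    # Invert J0 on [0, ~first zero of J0], where it decreases monotonically from 1 to 0.
    x = brentq(lambda u: j0(u) - target, 0.0, 2.405)
    f_d = x / (2 * np.pi * sampling_period_s)   # maximum Doppler frequency (Hz)
    return f_d * C / carrier_hz * 3.6

# Toy usage: 40 ms SRS sampling period, 2.6 GHz carrier, synthetic correlated power trace.
rng = np.random.default_rng(0)
noise = rng.normal(size=500)
fake_power = 1.0 + 0.3 * np.convolve(noise, np.ones(5) / 5, mode="same")
print(estimate_speed_kmh(fake_power, 0.040, 2.6e9))
```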
This work proposes the development of an energy estimation algorithm for LTE mobile access networks. The LTE network environment and the eNodeB power consumption models were developed with a view to implementing an energy estimation algorithm that estimates the energy consumption of the LTE access network. The energy estimation algorithm for the LTE eNodeBs was developed and implemented in the MATLAB environment. The daily energy consumption of the LTE access network was simulated and analysed for a case study of 37 eNodeBs. The daily energy consumption was evaluated while varying the energy load proportionality constant over its range, and the corresponding daily minimum and maximum energy consumption of the LTE access network were obtained at the two extremes of this constant. Th...
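The constant's symbol and the numerical results are garbled in this copy of the abstract. As a hedged illustration only (an assumption, not necessarily the model used in the paper), a common linear load-proportional power model expresses eNodeB power draw through such a constant k, and the daily network energy follows by summing over eNodeBs and integrating over the day:

```latex
% Illustrative load-proportional eNodeB power model (assumption, not the paper's exact model)
P_{\mathrm{eNB}}(\rho) = P_{0} + k\,\rho\,\left(P_{\max} - P_{0}\right), \qquad 0 \le \rho \le 1,
\qquad
E_{\mathrm{day}} = \sum_{n=1}^{N_{\mathrm{eNB}}} \int_{0}^{24\,\mathrm{h}} P_{\mathrm{eNB}}\!\left(\rho_{n}(t)\right)\, \mathrm{d}t ,
```

where P_0 is the idle (zero-load) power, P_max the full-load power, ρ_n(t) the traffic load of eNodeB n at time t, and k the energy load proportionality constant (k = 0 gives a load-independent consumption, k = 1 makes consumption scale fully with load between P_0 and P_max).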
2019
In cellular networks, the emergence of machine communications such as connected vehicles increases the demand for uplink transmissions, thus degrading the quality of service per user equipment. Enforcing quality of service in such cellular networks is challenging, as radio phenomena, as well as user (and device) mobility and dynamics, are uncontrolled. To solve this issue, it is essential to estimate what the quality of a connected user's transmissions will be in the near future. For that purpose, we argue that lower-layer metrics are a key feature whose evolution can help predict the bandwidth that the considered connections can take advantage of in the following hundreds of milliseconds. The paper then describes how a 4G testbed has been deployed in order to investigate throughput prediction in uplink transmissions at a small time granularity of 100 ms. Based on lower-layer metrics (physical and MAC layers), the main supervised machine learning algorithms are used, such as...
Proceedings of the ACM SIGCOMM 2013 conference on SIGCOMM, 2013
With lower latency and higher bandwidth than its predecessor 3G networks, the latest cellular technology, 4G LTE, has been attracting many new users. However, the interactions among applications, the network transport protocol, and the radio layer remain largely unexplored. In this work, we conduct an in-depth study of these interactions and their impact on performance, using a combination of active and passive measurements. We observed that LTE has significantly shorter state promotion delays and lower RTTs than 3G networks. We discovered various inefficiencies in TCP over LTE, such as undesired slow start. We further developed a novel and lightweight passive bandwidth estimation technique for LTE networks. Using this tool, we discovered that many TCP connections significantly under-utilize the available bandwidth: on average, the bandwidth actually used is less than 50% of the available bandwidth. This causes data downloads to take longer and incur additional energy overhead. We found that the under-utilization can be caused by both application behavior and TCP parameter settings. We found that 52.6% of all downlink TCP flows have been throttled by a limited TCP receive window, and that the data transfer patterns of some popular applications are both energy and network unfriendly. All these findings highlight the need to develop transport protocol mechanisms and applications that are more LTE-friendly.
International Journal of Embedded and Real-Time Communication Systems, 2010
This article provides a detailed profiling of the layer 2 (L2) protocol processing for the 3G successor Long Term Evolution (LTE). For this purpose, the most processing-intensive part of the LTE L2 data plane is executed on top of a virtual ARM-based mobile phone platform. The authors measure the execution times as well as the maximum data rates for different system setups. The profiling is done for the uplink (UL) and downlink (DL) directions separately as well as in a joint UL and DL scenario. As a result, the authors identify time-critical algorithms in the protocol stack and check to what extent state-of-the-art hardware platforms with a single-core processor and traditional hardware acceleration concepts are still applicable for protocol processing in LTE and beyond-LTE mobile devices.
Proceedings of the 17th ACM International Symposium on Mobility Management and Wireless Access - MobiWac '19, 2019
In 4G networks, the emergence of machine communications such as connected vehicles increases the demand for uplink transmissions, thus degrading the quality of service per user equipment. Enforcing quality of service in such cellular networks is challenging, as radio phenomena, as well as user (and device) mobility and dynamics, are uncontrolled. To solve this issue, it is essential to estimate what the quality of a connected user's transmissions will be in the near future. For that purpose, we argue that radio metrics are key features whose evolution can help predict the bandwidth that the considered connections can take advantage of in the following hundreds of milliseconds. The paper then describes how a 4G testbed has been deployed in order to study the correlation between radio noise and throughput in uplink transmissions. Based on radio measurements, the main supervised machine learning algorithms, such as Random Forest and Support Vector Machine, are used to predict the uplink received bandwidth. For a specific user service, we are able to predict the end-to-end received bandwidth, i.e., the amount of data received on the server side during a specific period, at a very fine time scale of 100 ms. Results also show that uplink bandwidth predictions based on radio measurements are less accurate than the corresponding downlink predictions.
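As a rough illustration of the kind of supervised pipeline described above, the following Python sketch trains a Random Forest and an SVM regressor to predict the throughput of the next 100 ms window from radio metrics. The feature names, the trace file, and the model hyperparameters are assumptions for illustration, not the authors' exact setup.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Hypothetical trace: one row per 100 ms window with radio metrics and the
# uplink throughput observed in the following window (prediction target).
df = pd.read_csv("uplink_traces.csv")
features = ["rsrp", "rsrq", "snr", "noise"]          # assumed lower-layer metrics
X, y = df[features], df["next_window_throughput_kbps"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("Random Forest", RandomForestRegressor(n_estimators=200, random_state=0)),
                    ("SVR", SVR(C=10.0, epsilon=0.1))]:
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    print(f"{name}: MAE = {mae:.1f} kbit/s")
```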
International Journal of Electronics and Communication Engineering, 2020
Current and future cellular mobile communication networks generate enormous amounts of data. Networks have become extremely complex, with an extensive space of parameters, features, and counters. These networks are unmanageable with legacy methods, and an enhanced design and optimization approach, increasingly reliant on machine learning, is necessary. This paper proposes machine learning as a viable approach for uplink throughput prediction. LTE radio metrics, such as Reference Signal Received Power (RSRP), Reference Signal Received Quality (RSRQ), and Signal-to-Noise Ratio (SNR), are used to train models that estimate the expected uplink throughput. A high coefficient of determination of 91.2% is obtained from measurements collected with a simple smartphone application.
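The coefficient of determination reported above can be reproduced for one's own measurements with a few lines of Python; the column names, the measurement file, and the linear model below are illustrative assumptions, and the toy result has no relation to the paper's 91.2% figure.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

df = pd.read_csv("drive_test.csv")                # hypothetical smartphone measurement log
X = df[["rsrp_dbm", "rsrq_db", "snr_db"]]          # assumed radio metrics
y = df["ul_throughput_mbps"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=1)
model = LinearRegression().fit(X_train, y_train)
print(f"R^2 on held-out data: {r2_score(y_test, model.predict(X_test)):.3f}")
```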
International Journal of Computing and Network Technology (ISSN: 2210-1519). Publisher: Scientific Publishing Center, University of Bahrain. Volume 2 Issue 3. pp. 79-83, 2014
This paper analyses the performance of the radio parameters necessary for efficient Long Term Evolution (LTE) radio planning through numerous simulations in different transmission modes and network scenarios. It mainly highlights throughput and Block Error Rate (BLER) with respect to Signal-to-Noise Ratio (SNR), along with varying UE mobility, both on the physical layer and in a network context covering different simulation environments.
Computers & Electrical Engineering, 2014
Estimating average throughput and packet transmission delay for the worst-case scenario (cell-edge users) is crucial for LTE cell planners in order to preserve strict QoS for delay-sensitive applications. Cell planning techniques emphasize mostly cell range (coverage) and throughput predictions, but not delay. Cell-edge users mostly suffer from throughput reduction due to bad coverage and, consequently, unexpected uplink transmission delays. To estimate cell-edge throughput, a common practice in the literature is the use of simulation results. However, simulations are never accurate, since the MAC scheduler is a vendor-specific software implementation and not explicitly specified by 3GPP. This paper skips simulations and proposes an analytical estimation of IP transmission delay and average throughput using mathematical modeling based on probability delay analysis, thus offering cell planners a useful tool for analytical estimation of the uplink average IP transmission delay.
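The analytical model itself is not reproduced in the abstract; purely as a hedged illustration of the kind of probability-based delay analysis referred to (an assumption, not the paper's derivation), a single-queue approximation ties the mean uplink IP transmission delay of a cell-edge user to its achievable throughput and the offered load:

```latex
% Illustrative M/M/1-style approximation, not the paper's actual model
\bar{D} \;\approx\; \frac{1/\mu}{1-\rho}
\;=\; \frac{L / R_{\mathrm{edge}}}{\,1 - \lambda L / R_{\mathrm{edge}}\,},
```

where L is the mean IP packet size, R_edge the cell-edge uplink throughput, λ the packet arrival rate, μ = R_edge / L the service rate, and ρ = λ/μ the utilization.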
This is a survey paper discussing various methods used for power estimation in wireless communications. Wireless communications have become an inseparable part of our lives, and power consumption is one of the major factors that determine communication system quality. Accurate power estimation plays an important role in power control and handoff decisions in mobile communications. Window-based weighted sample average power estimators are commonly used due to their simplicity. In practice, the performance of these estimators degrades severely when they are used in the presence of correlated samples. In this paper, the performances of the local mean power estimators, namely the sample average, optimum unbiased, and maximum likelihood estimators and the Kalman filter, are analyzed in the presence of correlated samples. The variance of the estimators is used as the performance measure.
Mobile Information Systems, 2016
Power consumption is a key factor in how end users rate the quality of service in mobile networks; however, its characterization is a challenging issue due to the many parameters involved and the complexity of their dependencies. Traditional battery drain testing in the field does not provide a suitable environment for reaching accurate conclusions. In this paper we address this problem by providing a controlled environment, more compact and accurate than those currently found in the literature, designed to measure the effects that different factors have on the overall energy consumption.
A frequently asked question among LTE radio planners is how to determine the maximum acceptable LTE radio interface load that should not be exceeded in order to maintain a targeted user data throughput. This white paper summarizes some simple formulas for calculating downlink user throughput as a function of Physical Resource Block (PRB) utilization. The formulas are expressed in terms of standardized 3GPP KPIs and are hence computable from network performance counters. Examples from live LTE networks are given to illustrate their usefulness.
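The white paper's exact formulas are not quoted here; as a hedged illustration of the kind of counter-based relation it discusses (an assumption, not the paper's formula), per-user downlink throughput can be computed from data volume and active transmission time counters, and its degradation with PRB utilization ρ can be captured by a simple linear model:

```latex
% Illustrative counter-based throughput and a simple load-degradation model (assumptions)
T_{\mathrm{user}} \;=\; \frac{V_{\mathrm{DL}}}{t_{\mathrm{active}}},
\qquad
T_{\mathrm{user}}(\rho) \;\approx\; T_{\max}\,(1-\rho),
```

where V_DL is the downlink data volume delivered during the active transmission time t_active (both available from 3GPP performance counters), T_max the throughput at negligible load, and ρ the PRB utilization; the acceptable load is then the ρ at which T_user(ρ) drops to the targeted throughput.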
2013 IEEE 18th International Workshop on Computer Aided Modeling and Design of Communication Links and Networks (CAMAD), 2013
This paper proposes a new method to model the instantaneous uplink (UL) energy efficiency (EE) of a mobile terminal when it is transmitting to a base station using the Long Term Evolution (LTE) technology. It is known that the transmitted power can vary significantly, depending mainly on the modulation and coding scheme (MCS) used, the path loss between the user equipment (UE) and the eNodeB (eNB), and the number of resource blocks (RBs) used to send the data. Even though this transmitted power, also known as irradiated power, does not correspond to the final power consumption during a UL transmission, its variability has an impact on it. Unlike existing models, this paper considers the power consumed in the radio frequency chain and the power consumed by the baseband processing. The proposed model is expected to improve the accuracy of theoretical evaluations of both the power consumption and the EE of UL transmissions in LTE systems.
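As a hedged sketch of the decomposition described above (the paper's concrete coefficients and functional forms are not reproduced), the instantaneous UL energy efficiency can be written as the delivered bit rate divided by a total consumed power split into an RF-chain term driven by the irradiated power and a baseband term:

```latex
% Illustrative decomposition, not the paper's exact model
P_{\mathrm{cons}} \;=\; P_{\mathrm{RF}}\!\left(P_{\mathrm{tx}}(\mathrm{MCS},\, \mathrm{path\ loss},\, N_{\mathrm{RB}})\right) \;+\; P_{\mathrm{BB}},
\qquad
\mathrm{EE} \;=\; \frac{R_{\mathrm{UL}}}{P_{\mathrm{cons}}} \quad [\mathrm{bit/J}],
```

where R_UL is the instantaneous UL data rate, P_tx the irradiated power, P_RF(·) the RF-chain consumption as a function of it, and P_BB the baseband processing consumption.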
2012 IEEE International Symposium on Performance Analysis of Systems & Software, 2012
With the proliferation of mobile phones and other mobile internet appliances, the application area of baseband processing continues to grow in importance. Much academic research addresses the underlying mathematics, but little has been published on the design of systems to execute baseband workloads. Most systems research is conducted within companies who go to great lengths to protect their intellectual property. We present an open-source LTE Uplink Receiver PHY benchmark with a realistic representation of the baseband processing of an LTE base station, and we demonstrate its usefulness in investigating resource management strategies to conserve power on a TILEPro64. By estimating the workload of each subframe and using these estimates to control power-gating, we reduce power consumption by more than 24% (11% on average) compared to executing the benchmark with no estimation-guided resource management. By making available a benchmark containing no proprietary algorithms, we enable a broader community to conduct research both in baseband processing and on the systems that are used to execute such workloads.
IEEE Transactions on Vehicular Technology, 2015
Wireless Personal Communications, 2020
Future highly densified wireless networks come with high handoff rates, which require knowledge of the mobile speed. Mobile speed estimation is crucial for optimizing handover in order to reduce call drops and network signaling flow, optimize traffic scheduling, improve quality of service, and achieve resource optimization, mobility load balancing, channel quality feedback enhancement, and energy efficiency. In this paper, we present a low-complexity mobile speed estimation model using the count of peaks and troughs of the received signal envelope. We simulated the model in Matlab® and our results show that the model has a maximum error of 0.25 m/s. The model has two advantages. First, it does not require measurement of the received signal power; it only counts envelope peaks and troughs. Second, the model is independent of the DC offset inherent in radio receivers. However, the model has one limitation: it does not give the crossing component of the mobile's velocity.
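As a rough, hypothetical Python sketch of a peak/trough-counting estimator in the spirit of the model above: in Rayleigh fading the rate of envelope maxima grows with the maximum Doppler frequency f_d = v/λ, so counting peaks and troughs over a known duration and applying a calibration constant yields a speed estimate. The constant ALPHA, the sampling parameters, and the synthetic envelope are assumptions for illustration, not values from the paper.

```python
import numpy as np
from scipy.signal import find_peaks

ALPHA = 1.0          # assumed peaks-per-second per Hz of Doppler (calibration constant)
C = 3e8              # speed of light (m/s)

def speed_from_envelope(envelope, sample_rate_hz, carrier_hz):
    """Estimate speed (m/s) from the fading envelope by counting peaks and troughs."""
    env = np.asarray(envelope, dtype=float)
    peaks, _ = find_peaks(env)
    troughs, _ = find_peaks(-env)
    duration_s = len(env) / sample_rate_hz
    # Average the peak and trough rates; both carry the same Doppler information.
    rate = 0.5 * (len(peaks) + len(troughs)) / duration_s
    f_d = rate / ALPHA                     # assumed linear mapping from count rate to Doppler
    wavelength = C / carrier_hz
    return f_d * wavelength

# Example with a synthetic envelope (two tones plus noise), 1 kHz sampling, 2 GHz carrier.
t = np.arange(0, 2.0, 1e-3)
env = np.abs(np.cos(2 * np.pi * 20 * t) + 0.5 * np.cos(2 * np.pi * 33 * t)) + 0.05 * np.random.randn(t.size)
print(speed_from_envelope(env, 1e3, 2e9))
```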
Computer and Information Science, 2008
Accurate power estimation plays an important role in power control and handoff decisions in mobile communications. Window-based weighted sample average power estimators are commonly used due to their simplicity. In practice, the performance of these estimators degrades severely when they are used in the presence of correlated samples. In this paper, the performances of three local mean power estimators, namely the sample average, optimum unbiased, and maximum likelihood estimators, are analysed in the presence of correlated samples. The variance of the estimators is used as the performance measure. Finally, the simulation results show that the optimum unbiased and maximum likelihood estimators perform very well compared to the sample average estimator.
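For reference, the simplest of the estimators compared above, the window-based sample average, is easy to sketch in Python; the window length and the first-order autoregressive generator of correlated samples below are illustrative assumptions, not the paper's simulation setup.

```python
import numpy as np

def sample_average_power(samples, window):
    """Sliding-window sample-average estimate of the local mean power."""
    p = np.abs(np.asarray(samples)) ** 2
    kernel = np.ones(window) / window
    return np.convolve(p, kernel, mode="valid")

# Correlated Rayleigh-like samples via a first-order autoregressive filter (assumed model).
rng = np.random.default_rng(0)
n, rho = 10_000, 0.9
w = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
h = np.zeros(n, dtype=complex)
for i in range(1, n):
    h[i] = rho * h[i - 1] + np.sqrt(1 - rho**2) * w[i]

est = sample_average_power(h, window=64)
print("estimator mean:", est.mean(), "estimator variance:", est.var())
```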