Papers by Milena Radenkovic

IOSR Journal of Engineering, 2013
This paper explores the feasibility and performance characteristics of delay tolerant data dissemination in highly congested and dynamic urban areas. We consider London during the Olympic Games as a highly challenging scenario, since it adds significantly increased traffic and communication demands, over a short period of time, to an already saturated and densely populated city. We investigate two different mobility scenarios run on the map of the London Olympic area, a Random Movement Scenario and a Work Day Movement Scenario, to test how effective opportunistic data transmission can be. We focus on intelligent probabilistic forwarding mechanisms and compare them to other state-of-the-art delay tolerant protocols across a range of metrics such as delivery success, latency and overheads. Our results show that Spray and Wait has the best performance in the Random Scenario but the worst in the Work Day Movement Scenario, whereas the intelligent probabilistic protocols perform worst in the Random Scenario but outperform Spray and Wait in the Work Day Movement Scenario.
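Intelligent probabilistic forwarding of the kind compared above typically maintains per-destination delivery predictabilities in the style of PRoPHET. As an illustration of the general idea (not the paper's exact protocol; the constants and method names below are assumptions), a minimal sketch of the delivery-predictability bookkeeping:

```python
class ProbabilisticNode:
    """Sketch of PRoPHET-style delivery predictability (illustrative constants)."""
    P_INIT, BETA, GAMMA = 0.75, 0.25, 0.98

    def __init__(self, node_id):
        self.node_id = node_id
        self.pred = {}  # destination id -> delivery predictability in [0, 1]

    def on_encounter(self, other):
        # Direct update: meeting a node raises the predictability of reaching it.
        p_old = self.pred.get(other.node_id, 0.0)
        self.pred[other.node_id] = p_old + (1.0 - p_old) * self.P_INIT
        # Transitive update: destinations `other` can reach become more
        # reachable through it.
        for dest, p_other in other.pred.items():
            if dest == self.node_id:
                continue
            p_old = self.pred.get(dest, 0.0)
            p_new = p_old + (1 - p_old) * self.pred[other.node_id] * p_other * self.BETA
            self.pred[dest] = max(p_old, p_new)

    def age(self, elapsed):
        # Predictabilities decay between encounters.
        for dest in self.pred:
            self.pred[dest] *= self.GAMMA ** elapsed

    def should_forward(self, other, dest):
        # Forward a bundle only to nodes more likely to deliver it.
        return other.pred.get(dest, 0.0) > self.pred.get(dest, 0.0)
```

With this rule, a node that regularly meets the destination (or meets nodes that do) accumulates a higher predictability and attracts copies of the bundle, which is why such protocols benefit from the routine structure of the Work Day Movement Scenario.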

2009 Fifth International Conference on Wireless and Mobile Communications, 2009
We are concerned with routing data in networks whose topology ranges from dense to sparse and from mostly connected to mostly disconnected. Connection-oriented routing algorithms developed for connected environments fail in disconnected environments due to the instability of these networks. Similarly, forwarding strategies designed to disseminate data hop-by-hop in disconnected environments fail in connected environments, as they send more packets than required, resulting in congestion. Our algorithm exploits the recurring patterns in connectivity arising from the typical routine structure of day-to-day life. We present a functionality overview of the three components of our proposal: contact-driven source routing, disconnection tolerant data forwarding and packet scheduling for energy efficiency. We evaluate our early emulations (simulations in ns-2 with real world data) of disconnection tolerant data forwarding; our results show that source routing can be extended to improve its performance in disconnected environments.

Frontiers in Marine Science, Jan 19, 2023
The ocean at depths ranging from ~100 m to ~1000 m (defined here as the intermediate water), though poorly understood compared to the sea surface, is a critical layer of the Earth system where many important oceanographic processes take place. Advances in ocean observation and computer technology have allowed ocean science to enter the era of big data (to be precise, big data for the surface layer, small data for the bottom layer, with the intermediate layer sitting in between) and have greatly promoted our understanding of near-surface ocean phenomena. During the past few decades, however, the intermediate ocean has also been undergoing profound changes because of global warming, the research and prediction of which are of intense concern. Due to the lack of three-dimensional ocean theories and field observations, how to remotely sense the intermediate ocean from space has become a very attractive but challenging scientific issue. With the rapid development of the next generation of information technology, artificial intelligence (AI) has built a new bridge from data science to marine science (called Deep Blue AI, DBAI), which acts as a powerful weapon to extend the paradigm of modern oceanography in the era of the metaverse. This review first introduces the basic prior knowledge of water movement in the ~100 m ocean and vertical stratification within the ~1000 m depths, as well as the data resources provided by satellite remote sensing, field observation, and model reanalysis for DBAI. Then, three universal DBAI methodologies, namely, associative statistical, physically informed, and mathematically driven neural networks, are elucidated in the context of intermediate ocean remote sensing.
Finally, the unique advantages and potential of DBAI in data mining and knowledge discovery are demonstrated in a top-down, "surface-to-interior" way via several typical examples in physical and biological oceanography. Keywords: deep blue artificial intelligence, intermediate ocean, ocean remote sensing, associative statistical neural network, physically informed neural network, mathematically driven neural network

Energy-Aware Opportunistic Charging and Energy Distribution for Sustainable Vehicular Edge and Fog Networks
The fast-growing popularity of electric vehicles (EVs) poses complex challenges for the existing power grid infrastructure to meet the high demand at peak charging hours. Discovering and transferring energy amongst EVs in mobile vehicular edges and fogs is expected to be an effective solution for bringing energy closer to where the demand is, improving scalability and flexibility compared to traditional charging solutions. In this paper, we propose a fully-distributed energy-aware opportunistic charging approach which enables a distributed multi-layer adaptive edge cloud platform for sustainable mobile autonomous vehicular edges that host dynamic on-demand virtual edge containers of on-demand services. We introduce a novel Reinforcement Learning (Q-learning) based SmartCharge algorithm formulated as a finite Markov Decision Process. We define multiple edge energy states, transitions and possible actions of edge nodes in dynamic complex network environments, which are adaptively resolved by multilayer real-time multidimensional predictive analytics. This allows SmartCharge edge nodes to more accurately capture, predict and adapt to dynamic spatial-temporal energy supply and demand, as well as mobility patterns, when energy peaks are expected. More specifically, SmartCharge edge nodes are able to autonomously and collaboratively understand when (how soon) and where geo-temporal peaks are expected to happen, thus enabling better local prediction and more accurate global distribution of energy resources. We provide a multi-criteria evaluation of SmartCharge against competitive protocols over real-world San Francisco cab mobility traces and in the presence of real-world users' energy interest traces driven by the Foursquare San Francisco dataset.
We show that SmartCharge successfully predicts and mitigates congestion at peak charging hours, reduces the waiting time between vehicles sending energy demand requests and being successfully charged, and significantly reduces the total number of vehicles in need of energy.
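The abstract formulates SmartCharge as Q-learning over a finite Markov Decision Process of edge energy states. A minimal tabular Q-learning sketch of that formulation, with invented placeholder states, actions and rewards rather than the paper's actual model:

```python
import random

class QLearningAgent:
    """Tabular Q-learning sketch for an edge node choosing charging actions.
    The state/action sets and rewards below are illustrative placeholders,
    not SmartCharge's actual design."""

    def __init__(self, states, actions, alpha=0.1, gamma=0.9, epsilon=0.1):
        self.actions = list(actions)
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon
        self.q = {(s, a): 0.0 for s in states for a in actions}

    def choose(self, state):
        # Epsilon-greedy: explore occasionally, otherwise exploit.
        if random.random() < self.epsilon:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q[(state, a)])

    def update(self, s, a, reward, s_next):
        # Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        best_next = max(self.q[(s_next, a2)] for a2 in self.actions)
        self.q[(s, a)] += self.alpha * (reward + self.gamma * best_next - self.q[(s, a)])

# Illustrative edge energy states and charging actions.
states = ["surplus", "balanced", "deficit"]
actions = ["share_energy", "request_energy", "idle"]
agent = QLearningAgent(states, actions, epsilon=0.0)

# Toy experience: sharing energy while in surplus is rewarded.
for _ in range(200):
    agent.update("surplus", "share_energy", reward=1.0, s_next="balanced")
    agent.update("surplus", "idle", reward=0.0, s_next="surplus")
```

After enough toy experience the agent's greedy policy in the "surplus" state is to share energy; in the paper this role is played by geo-temporal supply/demand predictions rather than a hand-crafted reward.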

Energy-Aware Opportunistic Charging and Energy Distribution for Sustainable Vehicular Edge and Fog Networks
2020 Fifth International Conference on Fog and Mobile Edge Computing (FMEC), 2020
Towards Peer-to-Peer Access Grid
Springer eBooks, 2005
This paper describes the design, development and evaluation of an interest-driven overlay on top of our congestion aware forwarding protocol (CAFe) for social opportunistic networks. We show that P2P filecasting on top of CAFe achieves a high success ratio of answered queries and high availability of intermediary nodes while maintaining fast downloads.
Towards Peer-to-Peer Access Grid
Lecture Notes in Computer Science, 2005

IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
A subsurface chlorophyll maximum is an important ecological feature of planktonic ecosystems. Although vertical profiles can be determined with biogeochemical (BGC)-Argo buoys, this method cannot meet the ocean observation requirement of high-resolution spatiotemporal measurements. Here, we demonstrate that deep learning can proficiently fill in these observational gaps when combined with sea surface data from ocean color radiometry. First, the sparse vertical profile data of BGC-Argo are fused with sea surface data to construct a benchmark dataset for deep learning. Second, encouraged by the idea of dense numerical representations, a comprehensive model combining coupled embedding with a bidirectional gated recurrent unit is proposed to invert the vertical profile from BGC-Argo and satellite data. Then, an in-depth spatiotemporal analysis of the subsurface chlorophyll maximum phenomenon is performed with both the parametric equation method and the deep learning method. Finally, extensive experiments in the Northwest Pacific were conducted to demonstrate the effectiveness of the proposed methodology. The results indicate that the proposed method can compensate for the lack of sparse in situ observations of chlorophyll concentration; the coefficient of determination is increased by more than 20%. This study is of great significance to marine ecology and provides important insight into artificial intelligence in the study of subsurface oceanic phenomena. Index Terms: Biogeochemical (BGC)-Argo, bidirectional gated recurrent unit (bi-GRU) network, deep embedding, remote sensing, subsurface chlorophyll maxima.

IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Wild fish recognition is a fundamental problem of ocean ecology research and contributes to the understanding of biodiversity. Given the huge number of wild fish species and the presence of unrecognized categories, the problem is in essence one of open-set fine-grained recognition. Moreover, the unrestricted marine environment makes the problem even more challenging. Deep learning has been demonstrated to be a powerful paradigm for image classification tasks. In this article, the wild fish recognition deep neural network (termed WildFishNet) is proposed. Specifically, an open-set fine-grained recognition neural network with a fused activation pattern is constructed to implement wild fish recognition. First, three different reciprocal inverted residual structural modules are combined by neural structure search to obtain the best feature extraction performance for fine-grained recognition; next, a new fusion activation pattern of softmax and openmax functions is designed to improve open-set recognition ability. Then, experiments are conducted on the WildFish dataset, which consists of 54,459 unconstrained images covering 685 known classes and one open-set unrecognized category. Finally, the experimental results are analyzed comprehensively to demonstrate the effectiveness of the proposed method. The in-depth study also shows that artificial intelligence can empower marine ecosystem research.
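The fused softmax/OpenMax activation pattern can be illustrated with a greatly simplified sketch: known-class logits are damped by per-class typicality weights and the removed mass is routed to an extra "unknown" output. The weighting scheme below is an assumption standing in for OpenMax's Weibull-based scores, not WildFishNet's actual formulation:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def openmax_like(logits, weights):
    """Simplified OpenMax: damp each known-class logit by a weight in [0, 1]
    (playing the role of OpenMax's Weibull-based typicality scores) and route
    the removed mass to an extra 'unknown' pseudo-logit."""
    damped = [x * w for x, w in zip(logits, weights)]
    unknown = sum(x * (1 - w) for x, w in zip(logits, weights))
    return softmax(damped + [unknown])  # last entry = P(unknown)

def classify(logits, weights, threshold=0.5):
    """Return a known-class index, or 'unknown' if the open-set path wins
    or no class is confident enough."""
    probs = openmax_like(logits, weights)
    best = max(range(len(probs)), key=probs.__getitem__)
    if best == len(logits) or probs[best] < threshold:
        return "unknown"
    return best
```

A typical sample (weights near 1) keeps its softmax decision, while an atypical one (low weights) shifts its probability mass to the unknown category, which is the behaviour the fused activation pattern aims for.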
Instant deep sea debris detection for maneuverable underwater machines to build sustainable ocean using deep neural network
Science of The Total Environment
Fourth edition of the International Conference on the Innovative Computing Technology (INTECH 2014), 2014
MANETs are open and cooperative networks that can be formed quickly and without any complex infrastructure. These characteristics are very useful for fast and easy connectivity; however, they also pose severe security threats. In this paper, we focus only on the security threats posed to the most popular MANET routing protocol, AODV, by black hole and flooding attacks. A simulation study has been conducted in ns-3 to compare the performance of the preventive schemes FAP and AMTT under flooding and black hole attacks on MANETs. The performance is analyzed in terms of throughput, message delay and routing overhead.
Lecture Notes in Computer Science, 2004

Cognitive radio based smart grid: The future of the traditional electrical grid
Ad Hoc Networks, 2016
The traditional electrical grid is currently undergoing a range of modernization efforts and becoming a smarter grid [1]. In the traditional electrical grid, energy is distributed from the generation plants to the consumers via large nationwide transmission and distribution networks. Information monitoring and management in these traditional electrical networks is typically limited to the distribution networks that distribute electrical power within a city to individual consumers. Due to rising demand, aging infrastructure, reliability concerns, and the emergence of renewable energy sources, the smart grid (SG) concept is being introduced [2]. Typically, there are three architectural building blocks of the smart grid. First, Home Area Networks (HANs), which connect the devices within the consumer premises, such as smart meters, distributed renewable energy sources, and plug-in electric vehicles. Second, Neighborhood Area Networks (NANs), which interconnect multiple HANs and communicate the collected information to Wide Area Networks (WANs). Third, WANs, which serve as the communication backbone. The smart grid will be equipped with state-of-the-art information and communication technologies (ICT) and smart devices, such as smart meters, wireless sensor nodes, and load balancing through real-time demand side management, pervasive computing, sensing devices, broadband communication, and intelligent management techniques [3-10]. Additionally, wireless sensor nodes together with actuator networks can be very useful for accessing remote sites and places where human intervention is not possible [11,12]. Such information and communication technologies have the potential to significantly improve the efficiency, effectiveness, reliability, sustainability, and stability of the electrical grid.
The smart grid will adopt several communication technologies to fulfill the wide range of functionalities expected from the modern electricity grid. These include both wired and wireless technologies, such as Bluetooth,

2020 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), 2020
We trained artificial neural network (ANN) models to classify peripheral blood mononuclear cells (PBMC) in chronic lymphoid leukemia (CLL) patients. The classification task was to determine differences in gene expression profiles in PBMC pre-treatment (with ibrutinib) and on days 30, 120, 150, and 280 after the start of treatment. Twelve datasets representing clinical samples, containing a total of 48,016 single-cell profiles, were used to train and test ANN models to classify the progress of therapy by gene expression changes. The accuracy of ANN classification was >92% in internal cross-validation. External cross-validation, using independent datasets for training and testing, showed a classification accuracy for post-treatment PBMCs of more than 80%. To the best of our knowledge, this is the first study to demonstrate the potential of ANNs with 10x single-cell gene expression data for detecting changes during treatment of CLL.

e-Science and the Grid are not the same; the large-scale movement of data and the exploitation of computation are not the same as the creation, performance and management of an in silico experiment. The notion of marshalling resources and creating virtual organisations begins to bring in a flavour of science, but something more is needed over and above the classic Grid to enable e-Science. This paper looks at the requirements of e-Science from the user's perspective. The myGrid project aims to provide a toolkit of services that comprise the Information Grid and the applications that sit thereupon. The aim is to provide a set of services with the facilities to enable bioinformaticians (in particular) to perform in silico experiments using applications built upon components from a Grid-enabled middleware layer. This paper introduces the myGrid project and explores the nature of an in silico experiment for the bioinformatics domain. The paper then reviews the general user requirements of an empirical e-Scientist. We then introduce a biological scenario, where bench experiments are coupled to in silico experiments, which we have used to drive the user requirements capture in myGrid. Then the myGrid workbench, an application that demonstrates the functionality of myGrid, is reviewed. Finally, we match the current status of myGrid to our general requirements and explore how we can use the current implementation to drive the capture of further, more detailed user requirements.

A scaleable and adaptive audio service to support large scale collaborative work and entertainment
We describe an audio service for collaborative network applications, designed to support many simultaneous audio sources and to operate across the Internet. Our service introduces and exploits a new technique called distributed partial mixing to dynamically adapt to varying numbers of speakers and to network congestion. A collection of networked partial audio mixers is arranged as a distributed graph. Each partial mixer adaptively mixes subsets of its input audio streams into one or more mixed streams, which it forwards along with any unmixed streams. This reduces network traffic, but at the cost of also reducing audio quality. A wide range of network experiments demonstrates how distributed partial mixing dynamically manages the trade-off between responsiveness, stability and TCP-fairness, thus performing effective congestion control while maximising audio quality for the end user. Distributed partial mixing has been integrated within a new-generation large-scale collaborative virtual platform (MASSIVE-3) and has since been in use for Inhabited TV applications.
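The core decision in distributed partial mixing, which input streams to collapse into one mixed stream under an outgoing-bandwidth budget, can be sketched for a single mixer node as follows. The energy ranking and the budget model are illustrative assumptions, not the system's actual adaptation policy:

```python
def partial_mix(streams, max_outputs):
    """Mix the quietest input streams into one and forward the rest unmixed.

    `streams` maps stream id -> list of PCM samples for one frame.
    `max_outputs` is the outgoing-stream budget imposed by congestion
    control (illustrative; the real system adapts it to network feedback).
    Returns a dict of output streams; key 'mix' holds the mixed stream.
    """
    if len(streams) <= max_outputs:
        return dict(streams)  # enough bandwidth: forward everything unmixed

    # Rank streams by frame energy so the most prominent speakers stay unmixed.
    def energy(samples):
        return sum(s * s for s in samples)

    ranked = sorted(streams, key=lambda sid: energy(streams[sid]), reverse=True)
    keep = ranked[:max_outputs - 1]      # reserve one output slot for the mix
    to_mix = ranked[max_outputs - 1:]

    frame_len = len(next(iter(streams.values())))
    mixed = [sum(streams[sid][i] for sid in to_mix) for i in range(frame_len)]

    out = {sid: streams[sid] for sid in keep}
    out["mix"] = mixed
    return out
```

Lowering `max_outputs` under congestion trades audio quality (more streams collapsed into the mix) for reduced traffic, which is exactly the trade-off the abstract describes the mixers managing dynamically.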

Journal of Sensor and Actuator Networks
The increased interest in autonomous vehicles has led to the development of novel networking protocols in VANETs. In such a widespread safety-critical application, security is paramount to the implementation of the networks. We view new autonomous vehicle edge networks as opportunistic networks that bridge the gap between fully distributed vehicular networks based on short-range vehicle-to-vehicle communication and cellular-based infrastructure for centralized solutions. Experiments are conducted using opportunistic networking protocols to provide data to autonomous trams and buses in a smart city. Attacking vehicles enter the city aiming to disrupt the network and cause harm to the general public. In the experiments, the number of vehicles and the attack length are varied to investigate the impact on the network and vehicles. Considering different measures of success as well as computational expense, measurements are taken from all nodes in the network across different lengths of attack...

This thesis is concerned with supporting natural audio communication in collaborative environments across the Internet. Recent experience with Collaborative Virtual Environments, for example to support large on-line communities and highly interactive social events, suggests that in the future there will be applications in which many users speak at the same time. Such applications will generate large and dynamically changing volumes of audio traffic that can cause congestion, and hence packet loss, in the network and so seriously impair audio quality. This thesis reveals that no current approach to audio distribution can combine support for large numbers of simultaneous speakers with TCP-fair responsiveness to congestion. A model for audio distribution called Distributed Partial Mixing (DPM) is proposed that dynamically adapts both to varying numbers of active audio streams in collaborative environments and to congestion in the network. Each DPM component adaptively mixes subsets of its...
Proceedings of the Workshop on Grid Computing and Its Application to Data Analysis (GADA) PC Co-chairs Message