Papers by Panagiotis Tsakalides

IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Regions around the world experience adverse conditions induced by climate change, which pose severe risks to the normal and sustainable operation of modern societies. Extreme weather events, such as floods, rising sea levels, and storms, stand as characteristic examples that impair the core services of the global ecosystem. Floods in particular have a severe impact on human activities, hence early and accurate delineation of the disaster is a top priority, since it provides environmental, economic, and societal benefits and eases relief efforts. In this work, we introduce OmbriaNet, a deep neural network architecture based on Convolutional Neural Networks (CNNs) that detects changes between permanent and flooded water areas by exploiting the temporal differences among flood events captured by different sensors. To demonstrate the potential of the proposed approach, we generated OMBRIA, a bitemporal and multimodal satellite imagery dataset for image segmentation through supervised binary classification. It consists of a total of 3,376 images: Synthetic Aperture Radar (SAR) imagery from Sentinel-1 and multispectral imagery from Sentinel-2, accompanied by ground-truth binary images produced from expert-derived data provided by the Emergency Management Service of the European Space Agency Copernicus Program. The dataset covers 23 flood events around the globe, from 2017 to 2021. We collected, co-registered, and pre-processed the data in Google Earth Engine. To validate the performance of our method, we performed different benchmarking experiments on the OMBRIA dataset and compared against several competitive state-of-the-art techniques.
The experimental analysis demonstrated that the proposed formulation is able to produce high-quality flood maps, achieving superior performance over the state-of-the-art. We provide the OMBRIA dataset, as well as the OmbriaNet code, at: https://github.com/geodrak/OMBRIA.

Sensors
Forecasting the values of essential climate variables like land surface temperature and soil moisture can play a paramount role in understanding and predicting the impact of climate change. This work concerns the development of a deep learning model for analyzing and predicting spatial time series, considering both satellite-derived and model-based data assimilation processes. To that end, we propose the Embedded Temporal Convolutional Network (E-TCN) architecture, which integrates three different networks, namely an encoder network, a temporal convolutional network, and a decoder network. The model accepts as input satellite- or assimilation-model-derived values, such as land surface temperature and soil moisture, with monthly periodicity, going back more than fifteen years. We compare our model's results with those of the state-of-the-art model for spatiotemporal data, the ConvLSTM model. To quantify performance, we explore different cases of spatial resolution, spatial region ext...
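The temporal convolutional component of architectures like E-TCN is built on causal dilated convolutions: the output at time t depends only on present and past inputs, and the dilation widens the receptive field exponentially with network depth. A minimal illustrative sketch, not the authors' implementation; the filter weights and dilation below are placeholders:

```python
def causal_dilated_conv(x, weights, dilation=1):
    """Causal 1D convolution: the output at time t combines inputs at
    t, t - d, t - 2d, ... only (implicit zero padding at the start),
    so no future sample ever leaks into the prediction."""
    out = []
    for t in range(len(x)):
        acc = 0.0
        for i, w in enumerate(weights):
            j = t - i * dilation
            if j >= 0:
                acc += w * x[j]
        out.append(acc)
    return out

# With dilation 2, the same 2-tap filter reaches twice as far back:
# stacking such layers with dilations 1, 2, 4, ... covers long
# monthly histories with few parameters.
```

Stacking a few of these layers is what lets a TCN cover a fifteen-year monthly history without recurrent connections.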

Remote Sensing
Multispectral sensors constitute a core earth observation imaging technology generating massive high-dimensional observations acquired across multiple time instances. The collected multi-temporal remote sensed data contain rich information for Earth monitoring applications, from flood detection to crop classification. When classifying such naturally multidimensional data, conventional low-order deep learning models unavoidably toss away valuable information residing across the available dimensions. In this work, we extend state-of-the-art convolutional network models based on the U-Net architecture to their high-dimensional analogs, which can naturally capture multi-dimensional dependencies and correlations. We introduce several model architectures, both of low as well as of high order, and we quantify the achieved classification performance vis-à-vis the latest state-of-the-art methods. The experimental analysis on observations from Landsat-8 reveals that approaches based on low-...

Abstract—In this paper, a multichannel version of the sinusoids plus noise model (also known as deterministic plus stochastic decomposition) is proposed and applied to spot microphone signals of a music recording. These are the recordings captured by the various microphones placed in a venue, before the mixing process produces the final multichannel audio mix. Coding these microphone signals makes them available to the decoder, allowing for interactive audio reproduction, which is a necessary component in immersive audio applications. The proposed model uses a single reference audio signal in order to derive a noise signal per spot microphone. This noise signal can significantly enhance the sinusoidal representation of the corresponding spot signal. The reference can be one of the spot signals or a downmix, depending on the application. Thus, for a collection of multiple spot signals, only the reference is fully encoded (e.g., as an MP3 monophonic signal). For the remaining spo...

2020 54th Asilomar Conference on Signals, Systems, and Computers, 2020
In this paper, we discover the special properties of neurons in terms of compression. Neurons are... more In this paper, we discover the special properties of neurons in terms of compression. Neurons are able to transform a visual stimulus into a sequence of discrete biphasic events, called spikes trains, forming the neural code. The neural spike generation properties are beneficial to image processing community as the neural code is very compact, yet informative enough, to be used in the input stimulus recovery. We show that the spike-based compression enables to improve the reconstruction quality in time which is a completely novel feature compared to compression standards. In addition, we mathematically prove that the proposed neuro-inspired mechanism behave either as a uniform or a non-uniform quantizer depending on its parameter. Last but not least, we build an end-to-end spike-based coding/decoding architecture that first transforms an image with a DCT filter and then, it generates spikes to compress the transformed coefficients. Based on these spike trains we reconstruct the inpu...
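The quantizer behaviour mentioned above can be illustrated with a perfect integrate-and-fire neuron: for a constant non-negative stimulus, the number of spikes emitted over a fixed window is essentially floor(intensity × duration / threshold), i.e. the spike count is a uniform quantization of the stimulus intensity. A toy sketch under that simplifying assumption, not the paper's exact neuron model; the time step and parameters are illustrative:

```python
def spike_count(intensity, threshold, duration, dt=1e-3):
    """Perfect integrate-and-fire neuron: the membrane potential grows
    linearly with a constant non-negative input; each threshold
    crossing emits a spike and subtracts the threshold (reset).
    The count over the window acts as a uniform quantizer of
    the intensity."""
    potential, spikes = 0.0, 0
    for _ in range(int(round(duration / dt))):
        potential += intensity * dt
        if potential >= threshold:
            spikes += 1
            potential -= threshold
    return spikes
```

Varying the threshold (or a nonlinearity in the integration) is what moves the encoder between uniform and non-uniform quantization regimes.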

2019 IEEE 19th International Conference on Bioinformatics and Bioengineering (BIBE), 2019
Dyslexia is a neurodevelopmental learning disorder that affects the speed and precision of word recognition, thereby obstructing reading fluency as well as text comprehension. Although it is not an oculomotor disease, readers with dyslexia have shown different eye movements than typically developing subjects during text reading. The majority of existing screening techniques for dyslexia detection employ features associated with the aberrant visual scanning of text seen in dyslexia, while completely ignoring the behavior of the underlying data-generating dynamical system. To address this problem, this work proposes a novel self-tuned architecture for feature extraction by modeling directly the inherent dynamics of wearable sensor data in higher-dimensional phase spaces via multidimensional recurrence quantification analysis (RQA) based on state matrices. Experimental evaluation on real data demonstrates the improved recognition accuracy of our method when...
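RQA quantifies how often a system revisits neighbourhoods of its phase space. Its basic ingredient is the recurrence matrix of a delay-embedded series; the fraction of point pairs closer than a radius eps is the recurrence rate, the simplest RQA measure. A minimal scalar sketch for orientation only; the paper's multidimensional, state-matrix variant generalizes this, and eps, dim, and delay below are placeholders:

```python
def recurrence_rate(series, eps, dim=2, delay=1):
    """Delay-embed a scalar series into a `dim`-dimensional phase
    space, then return the fraction of point pairs whose Euclidean
    distance falls below `eps` (the recurrence rate)."""
    n = len(series) - (dim - 1) * delay
    points = [tuple(series[i + k * delay] for k in range(dim))
              for i in range(n)]
    close = 0
    for i in range(n):
        for j in range(n):
            d = sum((a - b) ** 2 for a, b in zip(points[i], points[j])) ** 0.5
            if d < eps:
                close += 1
    return close / (n * n)
```

Features such as determinism and laminarity are built from the same matrix by counting diagonal and vertical line structures.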

This paper describes the implementation of an Internet of Things (IoT) and Open Data infrastructure by the Institute of Computer Science of the Foundation for Research and Technology—Hellas (FORTH-ICS) for the city of Heraklion, focusing on the application of mature research and development outcomes in a Smart City context. These outcomes mainly fall under the domains of Telecommunication and Networks, Information Systems, Signal Processing and Human Computer Interaction. The infrastructure is currently being released and becoming available to the municipality and the public through the Heraklion Smart City web portal. It is expected that in the future such infrastructure will act as one of the pillars for sustainable growth and prosperity in the city, supporting enhanced overview of the municipality over the city that will foster better planning, enhanced social services and improved decision-making, ultimately leading to improved quality of life for all citizens and visitors.

2021 29th European Signal Processing Conference (EUSIPCO)
In recent decades, many studies have explored the potential of complex network approaches to characterize time series generated by dynamical systems. Along these lines, Visibility Graph (VG) and Horizontal Visibility Graph (HVG) networks have contributed to an important yet difficult problem in bioinformatics, the classification of the secondary structure of low-homology proteins. In particular, each protein is represented as a two-dimensional time series that is later transformed, using either VG or HVG, into two independent graphs. However, this is an inefficient way of processing multidimensional time series, as it fails to capture the correlation between the two signals while also increasing the time and memory complexity. To address this issue, this work proposes four novel VG- and HVG-based frameworks that deal directly with the multidimensional time series. Each of the methods generates a unique graph following a different visibility rule concerning only the relation between pairs of intensities of the multidimensional time series. Experimental evaluation on real protein sequences demonstrates the superiority of our best scheme, with respect to both accuracy and computational time, when compared against the state-of-the-art.
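For a scalar series, the Horizontal Visibility Graph connects samples i and j whenever every sample strictly between them lies below both endpoints; the paper's contribution is joint visibility rules over the channels of a multidimensional series, but the scalar rule they build on can be sketched directly:

```python
def horizontal_visibility_edges(y):
    """Horizontal Visibility Graph of a scalar series: nodes i < j are
    connected iff every intermediate sample is strictly lower than
    both y[i] and y[j]. Adjacent samples are always connected
    (the intermediate range is empty)."""
    n = len(y)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[k] < min(y[i], y[j]) for k in range(i + 1, j)):
                edges.append((i, j))
    return edges
```

The (natural) VG replaces the horizontal criterion with a straight-line-of-sight test; either way, graph statistics such as the degree distribution become features of the original series.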

While single-class classification has been a highly active topic in optical remote sensing, much less effort has been devoted to the multi-label classification framework, where pixels are associated with more than one label, an approach closer to reality than single-label classification. Given the complexity of this problem, identifying representative features extracted from raw images is of paramount importance. In this work, we investigate feature learning as a feature extraction process in order to identify the underlying explanatory patterns hidden in low-level satellite data for the purpose of multi-label classification. Sparse autoencoders composed of a single hidden layer, as well as stacked in a greedy layer-wise fashion, form the core concept of our approach. The results suggest that learning such sparse and abstract representations of the features can aid in both remote sensing and multi-label problems. The results presented in the paper correspond to a novel real d...

Signal Process., 2021
Conventional compressive sampling (CS) primarily assumes light-tailed models for the underlying signal and/or noise statistics. Nevertheless, this assumption breaks down when operating in impulsive environments, where non-Gaussian infinite-variance processes arise for the signal and/or noise components. This drives traditional linear sampling operators to failure, since the gross observation errors are spread uniformly over the generated compressed measurements, masking the critical information content of the observed signal. To address this problem, this paper exploits the power of symmetric alpha-stable (SαS) distributions to design a robust nonlinear compressive sampling operator capable of suppressing the effects of infinite-variance additive observation noise. Specifically, a generalized alpha-stable matched filter is introduced for generating compressed measurements in a nonlinear fashion, which achieves increased robustness to impulsive observation noise, thus subse...
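A classical nonlinear operator from the SαS toolbox, related in spirit to the matched filtering described above, is the sample myriad: the maximum-likelihood location estimate under Cauchy (α = 1) noise, which, unlike the sample mean, is essentially unaffected by infinite-variance outliers. An illustrative grid-search sketch; the paper's generalized matched filter is more elaborate, and the linearity parameter k and grid below are placeholders:

```python
import math

def sample_myriad(samples, k, grid_steps=2000):
    """Sample myriad: the beta minimizing sum(log(k^2 + (x - beta)^2)).
    As k -> infinity it tends to the sample mean; as k -> 0 it becomes
    highly resistant to impulsive outliers. A coarse grid search is
    used here purely for illustration."""
    lo, hi = min(samples), max(samples)
    best_b, best_cost = lo, float("inf")
    for s in range(grid_steps + 1):
        b = lo + (hi - lo) * s / grid_steps
        cost = sum(math.log(k * k + (x - b) ** 2) for x in samples)
        if cost < best_cost:
            best_cost, best_b = cost, b
    return best_b
```

On the sample set [1.0, 1.1, 0.9, 100.0], the mean is pulled to about 25.75 by the single impulse, while the myriad stays near 1.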
We describe new methods for modeling the amplitude statistics of airborne radar clutter by means of alpha-stable distributions. We develop target angle and Doppler maximum-likelihood-based estimation techniques from radar measurements retrieved in the presence of impulsive noise, thermal jamming, or clutter modeled as a multivariate sub-Gaussian random process. We derive the Cramér-Rao bounds for the additive sub-Gaussian interference scenario to assess the best-case estimation accuracy which can be achieved. Finally, we introduce a new joint spatial and Doppler frequency high-resolution estimation technique based on the fractional lower-order statistics of the measurements of a radar array. The results are of great importance in the study of space-time adaptive processing (STAP) for airborne pulse-Doppler radar arrays operating in impulsive interference environments.
Compressed sensing is an attractive compression scheme due to its universality and lack of complexity on the sensor side. In this work we demonstrate how it could be used in a wireless sensor network. We consider a sensor network that tracks the location of a subject wearing a device that periodically transmits an audio signal. Through simulations and measurements of a simple system, we illustrate that dramatic compression can be achieved.

The spectral dimension of hyperspectral imaging (HSI) systems plays a fundamental role in numerous terrestrial and earth observation applications, including spectral unmixing, target detection, and classification, among others. However, in several cases the spectral resolution of HSI systems is sacrificed for the sake of spatial resolution, as in the case of snapshot spectral imaging systems that acquire the 3D datacube simultaneously. We address these limitations by introducing an efficient post-acquisition spectral resolution enhancement scheme that synthesizes the full spectrum from only a few acquired spectral bands. To achieve this goal we utilize a regularized sparse-based learning procedure where the relations between high- and low-spectral-resolution hyper-pixels are efficiently encoded via a coupled dictionary learning scheme. Experimental results and quantitative validation on data acquired by NASA's EO-1 mission's Hyperion sensor demonstrate the potential of the propos...

2011 19th European Signal Processing Conference, 2011
Accurate indoor localization is a significant task for many ubiquitous and pervasive computing applications, with numerous solutions based on IEEE 802.11, Bluetooth, ultrasound, and infrared technologies having been proposed. The inherent sparsity present in the problem of location estimation motivates in a natural fashion the use of the recently introduced theory of compressive sensing (CS), which states that a signal having a sparse representation in an appropriate basis can be reconstructed with high accuracy from a small number of random linear projections. In the present work, we exploit the framework of CS to perform accurate indoor localization based on signal-strength measurements, while significantly reducing the amount of information transmitted from a wireless device with limited power, storage, and processing capabilities to a central server. Equally importantly, the inherent property of CS acting as a weak encryption process is demonstrated by showing that the proposed approach...
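The core idea admits a 1-sparse toy version: the device's position is an indicator vector over N grid cells, the server receives only M ≪ N random projections of it, and recovery reduces to picking the dictionary column best correlated with the measurement (a single matching-pursuit step). A hedged sketch under noiseless, exactly-1-sparse assumptions, not the paper's full reconstruction pipeline:

```python
import random

def locate(phi, y):
    """Given an M x N random projection matrix phi and a measurement
    y = phi @ x for a 1-sparse indicator x, return the index of the
    column of phi with the largest normalized correlation with y
    (one matching-pursuit step)."""
    n = len(phi[0])
    best_j, best_c = 0, -1.0
    for j in range(n):
        col = [row[j] for row in phi]
        c = abs(sum(a * b for a, b in zip(col, y)))
        norm = sum(a * a for a in col) ** 0.5
        if norm > 0:
            c /= norm
        if c > best_c:
            best_c, best_j = c, j
    return best_j

# Toy scenario: 20 grid cells observed through 8 random projections.
random.seed(1)
M, N, true_cell = 8, 20, 7
phi = [[random.gauss(0.0, 1.0) for _ in range(N)] for _ in range(M)]
y = [row[true_cell] for row in phi]  # measurement of the indicator
```

With noisy, k-sparse measurements, the single correlation step is replaced by iterative or convex sparse recovery, but the compression-at-the-device principle is the same.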

2020 28th European Signal Processing Conference (EUSIPCO), 2021
The systematic collection of data has become an intrinsic process of all aspects of modern life. From industrial to healthcare machines and wearable sensors, an unprecedented amount of data is becoming available for mining and information retrieval. In particular, anomaly detection plays a key role in a wide range of applications, and has been studied extensively. However, many anomaly detection methods are unsuitable in practical scenarios, where streaming data of large volume arrive in nearly real time at devices with limited resources. Dimensionality reduction has been used extensively to enable efficient processing for numerous high-level tasks. In this paper, we propose a computationally efficient, yet highly accurate, framework for anomaly detection of streaming data in lower-dimensional spaces, utilizing a modification of the symbolic aggregate approximation for dimensionality reduction and a statistical hypothesis test based on the Kullback-Leibler divergence.
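The pipeline above has two ingredients: a SAX-style symbolization that maps samples to a small alphabet, and a Kullback-Leibler test between the symbol histogram of a reference window and that of the incoming window. A minimal sketch of both; the breakpoints, smoothing constant, and alert threshold are illustrative, and the paper modifies the standard SAX step:

```python
import math

def sax_symbols(series, breakpoints):
    """Map each (already z-normalized) sample to a symbol index:
    the number of ordered breakpoints the sample exceeds."""
    return [sum(1 for b in breakpoints if x > b) for x in series]

def histogram(symbols, n_symbols):
    """Relative frequency of each symbol in a window."""
    counts = [0] * n_symbols
    for s in symbols:
        counts[s] += 1
    return [c / len(symbols) for c in counts]

def kl_divergence(p, q, eps=1e-9):
    """Smoothed KL divergence between two symbol histograms; a large
    value flags a distributional change, i.e. a potential anomaly."""
    return sum(pi * math.log((pi + eps) / (qi + eps))
               for pi, qi in zip(p, q))
```

In a streaming deployment, the reference histogram is maintained over a trailing window and an alert is raised whenever the divergence of the current window exceeds a calibrated threshold.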

2020 28th European Signal Processing Conference (EUSIPCO), 2021
As the fields of brain-computer interaction and digital monitoring of mental health are rapidly evolving, there is an increasing demand to improve the signal processing module of such systems. Specifically, electroencephalography (EEG) is among the best non-invasive modalities for collecting brain signals. However, in practice, the quality of the recorded EEG signals is often deteriorated by impulsive noise, which hinders the accuracy of any decision-making process. Previous methods for denoising EEG signals primarily rely on second-order statistics for the additive noise, which is not a valid assumption when operating in impulsive environments. To alleviate this issue, this work proposes a new method for suppressing the effects of heavy-tailed noise in EEG recordings. To this end, the spatio-temporal interdependence between the electrodes is first modelled by means of graph representations. Then, the family of alpha-stable models is employed to fit the distr...

2018 26th European Signal Processing Conference (EUSIPCO), 2018
This paper addresses the problem of compressively sensing a set of temporally correlated sources, in order to achieve faithful sparse signal reconstruction from noisy multiple measurement vectors (MMV). To this end, a simple sensing mechanism is proposed, which does not require the restricted isometry property (RIP) to hold near the sparsity level, while providing additional degrees of freedom to better capture and suppress the inherent sampling noise effects. In particular, a reduced set of MMVs is generated by projecting the source signals onto random vectors drawn from isotropic multivariate stable laws. Then, the correlated sparse signals are recovered from the random MMVs by means of a recently introduced sparse Bayesian learning algorithm. Experimental evaluations on synthetic data with varying numbers of sources, correlation values, and noise strengths reveal the superiority of our proposed sensing mechanism when compared against well-established RIP-based compressive sen...

2020 28th European Signal Processing Conference (EUSIPCO), 2021
Prediction of protein structural classes from amino acid sequences is a challenging and important problem, as it informs the analysis of protein function, interactions, and regulation. The majority of existing prediction methods for low-homology sequences utilize a large number of features and require an exhaustive search for optimal parameter tuning. To address this problem, this work proposes a novel self-tuned architecture for feature extraction by modeling directly the inherent dynamics of the data in higher-dimensional phase space via chaos game representation (CGR) and generalized multidimensional recurrence quantification analysis (GmdRQA). Experimental evaluation on a real benchmark dataset demonstrates the superiority of the herein proposed architecture when compared against the state-of-the-art unidimensional RQA; notably, our method achieves similar performance in a data-driven manner at a smaller computational cost.

2020 28th European Signal Processing Conference (EUSIPCO)
The ever-increasing volume and complexity of time series data, emerging in various application domains, necessitate efficient dimensionality reduction for facilitating data mining tasks. Symbolic representations, among them symbolic aggregate approximation (SAX), have proven very effective in compacting the information content of time series while exploiting the wealth of search algorithms used in the bioinformatics and text mining communities. However, typical SAX-based techniques rely on a Gaussian assumption for the underlying data statistics, which often deteriorates their performance in practical scenarios. To overcome this limitation, this work introduces a method that negates any assumption on the probability distribution of time series. Specifically, a data-driven kernel density estimator is first applied on the data, followed by Lloyd-Max quantization to determine the optimal horizontal segmentation breakpoints. Experimental evaluation on distinct datasets demonstrates the superiority of our method, in terms of reconstruction accuracy and tightness of lower bound, when compared against the conventional and a modified SAX method.
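The breakpoint-design step can be sketched with the classical Lloyd-Max iteration run directly on the samples (equivalently, 1-D k-means): alternate nearest-level assignment with centroid updates, then read the decision breakpoints off as midpoints between adjacent representation levels. An illustrative sketch under a uniform-weight empirical distribution, leaving out the paper's kernel density estimation stage:

```python
def lloyd_max(samples, levels, iters=50):
    """Lloyd-Max design of a scalar quantizer from data: alternate
    (1) assigning each sample to its nearest representation level and
    (2) moving each level to the mean of its cell. Returns the
    levels - 1 decision breakpoints (cell boundaries)."""
    xs = sorted(samples)
    # Initialize levels at evenly spaced positions in the sorted data.
    reps = [xs[int((i + 0.5) * len(xs) / levels)] for i in range(levels)]
    for _ in range(iters):
        cells = [[] for _ in range(levels)]
        for x in xs:
            nearest = min(range(levels), key=lambda i: abs(x - reps[i]))
            cells[nearest].append(x)
        reps = [sum(c) / len(c) if c else reps[i]
                for i, c in enumerate(cells)]
    return [(reps[i] + reps[i + 1]) / 2 for i in range(levels - 1)]
```

The resulting breakpoints replace the fixed Gaussian-quantile breakpoints of conventional SAX, which is what frees the method from the Gaussian assumption.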