
Review of Papers with Use of LSTMs for Time-Series Analysis
Determine Novelty in Own Design
Preliminary Draft

Conclusions from Study
• Recurrent Neural Networks (RNNs) and their offshoot, Long Short-Term Memory (LSTM) networks, are preferred for time-series (sequence) data, since they
• are capable of automatically learning features from sequence data
• support multivariate data
• can output variable-length sequences that can be used for multi-step forecasting
• allow detection of anomalies in time-series data
• provide prediction of events
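The multi-step forecasting mentioned above starts with framing the raw series as supervised samples. A minimal sketch of this windowing (the function name, window lengths, and random data are illustrative, not from any of the reviewed papers):

```python
import numpy as np

def make_windows(series, n_in, n_out):
    """Frame a 2-D time series (timesteps x features) into supervised
    samples: each input is n_in past steps; each target is the next
    n_out values of the first feature (a multi-step forecast)."""
    X, y = [], []
    for i in range(len(series) - n_in - n_out + 1):
        X.append(series[i:i + n_in])
        y.append(series[i + n_in:i + n_in + n_out, 0])
    return np.array(X), np.array(y)

# Hypothetical example: 100 timesteps, 5 sensor channels
data = np.random.rand(100, 5)
X, y = make_windows(data, n_in=14, n_out=7)
print(X.shape, y.shape)  # (80, 14, 5) (80, 7)
```

Arrays of shape (samples, timesteps, features) like `X` are the standard input layout for LSTM layers.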
• Unique/Novel Aspects
• Use of RNN-LSTMs to predict the likelihood of occurrence of afflictions in a specific body segment
• Approach is to detect anomalies in the capacitance data being obtained
• A customized time-series dataset is built for training the RNN-LSTM network
• Predicts likely affliction points/locations on a subject's body prior to onset of infirmity – hence preventive in nature
• For a new patient, use a naive Bayes LSTM to compensate for epistemic uncertainty; once adequate data pertaining to the patient is available, switch to a customized stacked LSTM
• Customized stacked LSTMs are utilized for focused monitoring of individual body segments using anomaly detection in the time-sequence data – one LSTM per body segment
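The switch from the naive Bayes LSTM to the customized stacked LSTM can be sketched as a simple data-volume gate. The threshold value and model names here are placeholders, not parameters from the design:

```python
def choose_model(n_patient_samples, min_samples=500):
    """Sketch of the proposed switch: with little patient-specific data,
    fall back to the Bayesian model (LSTM-Gauss-NBayes style) to absorb
    epistemic uncertainty; once enough samples exist, use the patient's
    customized stacked LSTM. min_samples is a hypothetical threshold."""
    if n_patient_samples < min_samples:
        return "lstm_gauss_nbayes"
    return "stacked_lstm"

print(choose_model(120))   # lstm_gauss_nbayes
print(choose_model(2000))  # stacked_lstm
```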
• Main combined dataset – for overall prediction

  Time | Head | Shoulder | Spine | Left Flank | Right Flank | Prediction
  X1 … Xn: capacitance values of each individual array patch, one row per time step
• Individual dataset of each patch/group of patches in a segment, used to predict anomalies and hence the likelihood of affliction:

  LSTM-1 (Head): Time | Head | Prediction, rows X1 … Xn – predicts anomalies indicating likelihood of affliction in the respective patch
  LSTM-2 (Shoulder): Time | Shoulder | Prediction, rows X1 … Xn – predicts anomalies indicating likelihood of affliction in the respective patch
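Deriving the per-segment datasets from the main combined dataset is a column split. A minimal sketch (segment names follow the table above; the array contents are random stand-ins for the capacitance readings):

```python
import numpy as np

SEGMENTS = ["head", "shoulder", "spine", "left_flank", "right_flank"]

def split_by_segment(combined):
    """Given the combined array (timesteps x 5 segment capacitance
    columns), return one univariate series per segment; each series
    would feed its own LSTM (LSTM-1 for head, LSTM-2 for shoulder, ...)."""
    return {name: combined[:, i] for i, name in enumerate(SEGMENTS)}

combined = np.random.rand(200, 5)   # hypothetical capacitance readings
per_segment = split_by_segment(combined)
print(per_segment["head"].shape)    # (200,)
```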
Papers studied so far
• Long Short-Term Memory Network Design for Analog Computing
• Zhou Zhao, Ashok Srivastava, Lu Peng, and Qing Chen, Louisiana State University
• ACM Journal on Emerging Technologies in Computing Systems, Vol. 15, No. 1, Article 13, January 2019
• ABSTRACT
• Presents an analog integrated-circuit implementation of an LSTM network that is compatible with digital CMOS technology.
• Uses multiple-input floating-gate MOSFETs both as the front end to obtain converted analog signals and as the differential pairs in the proposed analog multipliers.
• An analog crossbar is built from the analog multipliers, performing matrix and bitwise multiplications.
• Shows that using current signals as internal transmission signals can largely reduce computation delay compared to a digital implementation.
• Introduces analog blocks that act as the activation functions of the algorithm.
• In the back end of the design, current comparators make the output readable by external digital systems.
• The LSTM network is designed with a matrix size of 16 × 16 in TSMC 180 nm CMOS technology.
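For reference, the computation that the paper's analog crossbar and activation blocks implement is the standard LSTM cell step. A numpy sketch of that step (the weight layout and random values are illustrative, not taken from the paper's circuit):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x, h, c, W, U, b):
    """One LSTM step. W: (4n, m) input weights, U: (4n, n) recurrent
    weights, b: (4n,) biases, stacked as [input, forget, cell, output]
    gates. The matrix-vector products are what a crossbar computes in
    analog; sigmoid/tanh correspond to analog activation blocks."""
    n = h.size
    z = W @ x + U @ h + b
    i = sigmoid(z[0:n])          # input gate
    f = sigmoid(z[n:2*n])        # forget gate
    g = np.tanh(z[2*n:3*n])      # candidate cell state
    o = sigmoid(z[3*n:4*n])      # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

m, n = 16, 16                    # matching the paper's 16 x 16 matrix size
rng = np.random.default_rng(0)
h, c = lstm_cell(rng.standard_normal(m), np.zeros(n), np.zeros(n),
                 rng.standard_normal((4*n, m)) * 0.1,
                 rng.standard_normal((4*n, n)) * 0.1,
                 np.zeros(4*n))
print(h.shape, c.shape)  # (16,) (16,)
```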
• INTRO
• A recurrent neural network (RNN) is an artificial neural network (ANN) that can process real-time data (Pearlmutter 1989).
• Hardware implementations of LSTM mostly use FPGAs (Guan et al. 2017a; Guan et al. 2017b; Ferreira and Fonseca 2016). The computation speed of LSTM in FPGA-based implementations is …
• https://machinelearningmastery.com/how-to-develop-lstm-models-for-multi-step-time-series-forecasting-of-household-power-consumption/
• Jason Brownlee
• This tutorial is divided into nine parts; they are:

• Problem Description
• Load and Prepare Dataset
• Model Evaluation
• LSTMs for Multi-Step Forecasting
• LSTM Model With Univariate Input and Vector Output
• Encoder-Decoder LSTM Model With Univariate Input
• Encoder-Decoder LSTM Model With Multivariate Input
• CNN-LSTM Encoder-Decoder Model With Univariate Input
• ConvLSTM Encoder-Decoder Model With Univariate Input
IoT Data Analytics Using Deep Learning
• Xiaofeng Xie, Di Wu, Siping Liu, Renfa Li
• The authors are with the Key Laboratory for Embedded and Networking Computing of Hunan Province, Hunan University. Di Wu is the corresponding author (Email: [email protected]).

Abstract:
• Deep learning is a popular machine learning approach which has achieved a lot of progress in all traditional machine learning areas. Internet of Things (IoT) and Smart City deployments are generating large amounts of time-series sensor data in need of analysis. Applying deep learning to these domains has been an important topic of research. The Long Short-Term Memory (LSTM) network has been proven to be well suited for dealing with and predicting important events with long intervals and delays in the time series. LSTM networks have the ability to maintain long-term memory.
• In an LSTM network, a stacked LSTM hidden layer also makes it possible to learn high-level temporal features without the fine tuning and preprocessing that other techniques would require. In this paper, we construct a long short-term memory (LSTM) recurrent neural network and use a training set of normal time series to build a prediction model. We then use the prediction errors from this model to construct a Gaussian naive Bayes model that detects whether the original sample is abnormal. This method is called LSTM-Gauss-NBayes for short. We use three real-world datasets, each of which involves long-term time dependence, short-term time dependence, or even very weak time dependence. The experimental results show that LSTM-Gauss-NBayes is an effective and robust model.
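The second stage of LSTM-Gauss-NBayes — fitting a Gaussian to prediction errors observed on normal data and flagging unlikely errors — can be sketched without the LSTM itself. A simple k-sigma rule stands in for the paper's naive Bayes decision, and the residuals here are synthetic:

```python
import numpy as np

def fit_gaussian(errors):
    """Fit mean/std of prediction errors observed on *normal* data."""
    return errors.mean(), errors.std()

def is_anomaly(error, mu, sigma, k=3.0):
    """Flag a sample whose prediction error is unlikely under the
    fitted Gaussian (a k-sigma rule standing in for the paper's
    Gaussian naive Bayes decision)."""
    return abs(error - mu) > k * sigma

rng = np.random.default_rng(1)
normal_errors = rng.normal(0.0, 0.1, size=1000)   # hypothetical residuals
mu, sigma = fit_gaussian(normal_errors)
print(is_anomaly(0.05, mu, sigma))  # False
print(is_anomaly(1.5, mu, sigma))   # True
```

In the paper's pipeline the `normal_errors` would come from running the trained LSTM over the normal training series.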
https://analyticsindiamag.com/anomaly-detection-in-temperature-sensor-data-using-lstm-rnn-model/

Anomaly detection has been used in various data mining applications to find the anomalous activities present in the available data. With the advancement of machine learning techniques and developments in the field of deep learning, anomaly detection is in high demand nowadays, because machine learning algorithms can now be run on heavy datasets and generate more accurate results. Anomaly detection is no longer limited to detecting fraudulent customer activities; it is also being applied in industrial applications in full swing.

In manufacturing industries where heavy machinery is used, anomaly detection is applied to predict abnormal machine behaviour from sensor readings. For example, based on the temperature data read through the sensors, a possible failure of the system can be predicted. The article discusses how to detect anomalies present in temperature data available in time-series format; the data is captured from the sensors of an internal component of a large industrial machine.
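The temperature use case above can be illustrated with a rolling-statistics detector: flag readings that deviate sharply from the trailing window. This is a simple stand-in for the article's LSTM-based detector, with synthetic data and an injected spike:

```python
import numpy as np

def rolling_anomalies(temps, window=24, k=3.0):
    """Flag readings that deviate from the trailing rolling mean by
    more than k rolling standard deviations — a crude stand-in for
    comparing each reading against an LSTM's prediction."""
    flags = np.zeros(len(temps), dtype=bool)
    for t in range(window, len(temps)):
        past = temps[t - window:t]
        mu, sigma = past.mean(), past.std()
        if sigma > 0 and abs(temps[t] - mu) > k * sigma:
            flags[t] = True
    return flags

# Hypothetical sensor: steady ~60 degrees with noise, plus one spike
temps = np.full(200, 60.0) + np.random.default_rng(2).normal(0, 0.5, 200)
temps[150] += 20.0
flags = rolling_anomalies(temps)
print(bool(flags[150]))  # True
```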
