
A Novel Activity Detection Method

Gismy George, P.G. Student, Department of ECE, Ilahia College of, Muvattupuzha, Kerala, India

ABSTRACT: This paper presents an approach for recognizing the activity states of older adults, allowing a caregiver to monitor their activities remotely. Data may be collected using web cameras. First, silhouettes are extracted from each video frame. Next, features are extracted from the frames containing silhouettes; here the Zernike moment feature is used. Neural network classification is then performed on the image moments in order to detect the activities, with the network trained using the back-propagation algorithm. The approach described here can detect several different activity states, such as bending, being upright, and being on the floor, and an alarm can be rung in case of emergency. The method also addresses privacy concerns, since the person monitoring the activity sees only silhouettes instead of real images.

KEYWORDS: silhouette extraction; Zernike moment; back propagation

I. INTRODUCTION

Human activity recognition has long been an important focus of research in computer vision. It is important in various applications such as surveillance, robot learning, and human-computer interaction, as well as physical function monitoring, and several approaches have been proposed to identify different activities from video sequences. The activity detection method described in this project can be used to monitor the activity of older adults in their apartments while addressing their privacy concerns. In this system, background subtraction techniques are used to separate the foreground from the background, and the resulting silhouettes are then taken as input to the automatic activity segmentation system. Privacy is maintained by using silhouettes instead of raw images for further analysis. A neural network is employed to classify and detect the activities; the network was trained using the back-propagation algorithm.
An alarm can also be given in case of emergency.

II. PROPOSED METHOD

The aim of this detection method is to monitor the activities of older adults in their apartments, especially fall conditions. Mainly, activities such as sitting, standing, and lying are detected. No body-worn sensors are required, and the method also addresses the residents' privacy concerns. Fixed fisheye-lens Unibrain web cameras, fitted with a viewing angle of 180 degrees, are used to capture video of the individuals' activities. This video data is the primary input for the detection process. A video is a collection of frames; here the video input is given at a rate of 30 frames per second. Copyright to IJIRSET DOI:10.15680/IJIRSET.2015.0410134
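As a minimal sketch of how each incoming frame might be screened for a moving person, the snippet below models every pixel with a single adaptive Gaussian. This is a deliberate simplification (the paper's silhouette-extraction step uses a mixture of Gaussians per pixel), and the function names, learning rate, and threshold are illustrative assumptions, not taken from the paper.

```python
import math

def update_background(mean, var, pixel, alpha=0.05, k=2.5):
    """Single-Gaussian-per-pixel background test and update.
    A pixel is foreground if it lies more than k standard deviations
    from the background mean. Returns (is_foreground, new_mean, new_var)."""
    foreground = abs(pixel - mean) > k * math.sqrt(var)
    if not foreground:
        # Pixel fits the background model: adapt mean, then variance, toward it.
        mean = (1 - alpha) * mean + alpha * pixel
        var = (1 - alpha) * var + alpha * (pixel - mean) ** 2
    return foreground, mean, var

def extract_silhouette(frame, means, vars_):
    """Classify each pixel of a grayscale frame (list of lists) as
    foreground (1) or background (0), updating the model in place."""
    mask = []
    for i, row in enumerate(frame):
        mask_row = []
        for j, p in enumerate(row):
            fg, means[i][j], vars_[i][j] = update_background(
                means[i][j], vars_[i][j], p)
            mask_row.append(1 if fg else 0)
        mask.append(mask_row)
    return mask
```

Run over the 30 frames per second mentioned above, the accumulated mask is the binary silhouette passed to the next stage; foreground pixels are deliberately excluded from the model update so a person standing still does not fade into the background too quickly.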

Fig. 1: Block Diagram of the Proposed System

A. Silhouette Extraction

The main advantage of this method is its privacy consideration, and silhouette extraction serves this purpose. A silhouette is a single-colour image with no features within it, so the person's privacy can be maintained. The accuracy of silhouette extraction depends on how well the background is modelled. Background subtraction is a background change detection technique; the implementation here uses a mixture of Gaussians. First, a statistical model of the scene is built. The intruding object, or foreground, is then detected by finding the parts of the image that do not fit the statistical model. In the background model, each pixel is modelled as a mixture of Gaussians.

B. Extraction of the Image Moments

Fig. 2: A Frame After Silhouette Extraction (right)

After obtaining the silhouettes from the image sequence, the next step in the algorithm is extracting image moments, as shown in the block diagram. Image moments are applicable in a wide range of applications such as pattern

recognition and image encoding. In this detection process, we use the Zernike moments, which are often used as shape descriptors of images. The Zernike polynomials in polar coordinates are given by

$V_{mn}(r, \theta) = R_{mn}(r)\, e^{jn\theta}$

where the orthogonal radial polynomial is defined by

$R_{mn}(r) = \sum_{s=0}^{(m-|n|)/2} (-1)^s \, \frac{(m-s)!}{s!\,\left(\frac{m+|n|}{2}-s\right)!\,\left(\frac{m-|n|}{2}-s\right)!}\, r^{m-2s}$

with $|n| \le m$ and $m - |n|$ even. For a discrete image, if $P_{xy}$ is the current pixel intensity (0 or 1 for binary images), the Zernike moments are given by

$Z_{mn} = \frac{m+1}{\pi} \sum_{x} \sum_{y} P_{xy}\, V_{mn}^{*}(r, \theta), \quad x^2 + y^2 \le 1$

Moments were used in this experiment with order m = 4 and angular dependence n = 2. The Zernike orthogonal moments offer high performance in terms of noise resilience, information redundancy, and reconstruction capability.

C. Classification Using Neural Network

A neural network is a computational structure inspired by the study of biological neural processing. Neural networks are among the most effective learning methods for approximating real-valued, discrete-valued, and vector-valued functions, and they are widely used in classification and regression problems such as handwritten character recognition, spoken word recognition, and face recognition. One of the most useful neural networks for function approximation is the multilayer perceptron (MLP). An MLP has an input layer, one or more hidden layers, and an output layer. A node in an MLP network consists of a summer and a nonlinear activation function g.

Fig. 3: A Multilayer Perceptron Network With One Hidden Layer
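Stepping back to the moment-extraction step of Section B, the formulas given there can be implemented directly. The pure-Python sketch below computes $Z_{mn}$ of a binary silhouette; the mapping of pixel coordinates into the unit disc is one common convention and an assumption here, not a detail taken from the paper.

```python
import cmath
import math

def radial_poly(m, n, r):
    """Zernike radial polynomial R_mn(r), with m the order and n the
    angular dependence (m - |n| must be even, |n| <= m)."""
    n = abs(n)
    total = 0.0
    for s in range((m - n) // 2 + 1):
        coeff = ((-1) ** s * math.factorial(m - s)
                 / (math.factorial(s)
                    * math.factorial((m + n) // 2 - s)
                    * math.factorial((m - n) // 2 - s)))
        total += coeff * r ** (m - 2 * s)
    return total

def zernike_moment(image, m, n):
    """Z_mn of a binary image (list of lists of 0/1), with pixel
    centres mapped into the unit disc; pixels outside it are skipped."""
    h, w = len(image), len(image[0])
    z = 0.0 + 0.0j
    for y in range(h):
        for x in range(w):
            # Map the pixel centre to [-1, 1] x [-1, 1].
            xn = (2 * x - w + 1) / (w - 1)
            yn = (2 * y - h + 1) / (h - 1)
            r = math.hypot(xn, yn)
            if r > 1.0 or not image[y][x]:
                continue
            theta = math.atan2(yn, xn)
            # Accumulate P_xy times the conjugate of V_mn(r, theta).
            z += radial_poly(m, n, r) * cmath.exp(-1j * n * theta)
    return (m + 1) / math.pi * z
```

With m = 4 and n = 2 as in the experiment, the magnitude of the resulting complex moment is rotation-invariant, which is what makes it a useful posture descriptor.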

The weights w are multiplied with the inputs x to the neuron and summed together with the constant bias term θ; the resulting value n is the input to the activation function g. An MLP network is formed by connecting several such nodes in parallel and in series, giving a nonlinear parameterized map from input space to output space. The activation functions g are usually assumed to be the same in each layer and known in advance. Given input-output data, finding the best MLP network is formulated as a data-fitting problem, where the parameters to be determined are the weights and biases. The procedure for using a neural network includes the following steps. First, the designer fixes the structure of the MLP network architecture: the number of hidden layers and the number of neurons (nodes) in each layer. The activation functions for each layer are also chosen at this stage, that is, they are assumed to be known. Many algorithms exist for determining the network parameters; one of the best known is the back-propagation algorithm. Back-propagation training takes place in three stages:

1. Feed-forward of the input training pattern.
2. Back-propagation of the associated error.
3. Weight adjustment.

During the feed-forward stage, each input neuron receives an input signal and passes it to each hidden neuron, which computes its activation and passes it on to the output units, which in turn compute their activations to obtain the net output. During training, the net output is compared with the target value and the corresponding error is calculated. From this error, an error factor is obtained and used to distribute the error back to the hidden layer; the weights are then updated accordingly. The average squared error between the network outputs a and the target outputs t is defined as

$E = \frac{1}{N} \sum_{i=1}^{N} (t_i - a_i)^2$

The feed-forward back-propagation network has no feedback connections, but errors are back-propagated during training.
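The three training stages just listed can be sketched for a small network with one hidden layer of sigmoid nodes. This is a minimal illustrative implementation; the network sizes, learning rate, and weights in it are made up for the example and are not the paper's actual network.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, w_h, b_h, w_o, b_o):
    """Stage 1: feed-forward of the input pattern through one hidden layer."""
    h = [sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
         for w, b in zip(w_h, b_h)]
    o = sigmoid(sum(wi * hi for wi, hi in zip(w_o, h)) + b_o)
    return h, o

def train_step(x, t, w_h, b_h, w_o, b_o, lr=0.1):
    """Stages 2 and 3: back-propagate the output error, then adjust weights.
    Lists are updated in place; the new output bias is returned."""
    h, o = forward(x, w_h, b_h, w_o, b_o)
    # Error factor at the output (the derivative of sigmoid is o * (1 - o)).
    delta_o = (o - t) * o * (1 - o)
    # Distribute the error back to each hidden neuron.
    delta_h = [delta_o * w_o[j] * h[j] * (1 - h[j]) for j in range(len(h))]
    # Weight adjustment by gradient descent.
    for j in range(len(h)):
        w_o[j] -= lr * delta_o * h[j]
        for i in range(len(x)):
            w_h[j][i] -= lr * delta_h[j] * x[i]
        b_h[j] -= lr * delta_h[j]
    return b_o - lr * delta_o
```

Repeating `train_step` over all training patterns for many iterations drives the average squared error down, which is exactly what the training curve in Section III tracks.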
After training, the weights and biases, tuned through several iterations, are stored. When a video input is given, the activities are classified by the network, and each activity is displayed in its own colour. An alarm can also be provided in case of emergency.

III. EXPERIMENT AND RESULTS

In this detection method, input videos with walking, bending, and lying activities were given. The feed-forward back-propagation network used has 4 hidden layers; the first, second, third, and fourth hidden layers contain 10, 100, 80, and 3 neurons respectively, as shown in Fig. 4. The transfer function selected was the log-sigmoid. During training of the network, 10,000 iterations were performed.
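The architecture just described (hidden layers of 10, 100, 80, and 3 log-sigmoid neurons, with the 3-neuron layer producing one score per activity) can be sketched as a forward pass. The input dimensionality, the random initial weights, and the sample feature vector below are illustrative assumptions, since the trained weights and Zernike features are not reproduced in the paper.

```python
import math
import random

random.seed(0)

def logsig(x):
    # Log-sigmoid transfer function, as selected in the experiment.
    return 1.0 / (1.0 + math.exp(-x))

def make_layer(n_in, n_out):
    # Small random weights and zero biases (illustrative initialisation).
    return ([[random.uniform(-0.1, 0.1) for _ in range(n_in)]
             for _ in range(n_out)],
            [0.0] * n_out)

def forward(x, layers):
    for weights, biases in layers:
        x = [logsig(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(weights, biases)]
    return x

# Layer sizes from the experiment; the 4-dimensional input
# (a Zernike-feature vector) is an assumption for the sketch.
sizes = [4, 10, 100, 80, 3]
layers = [make_layer(sizes[i], sizes[i + 1]) for i in range(len(sizes) - 1)]

scores = forward([0.2, -0.1, 0.4, 0.0], layers)
labels = ["walking", "bending", "lying"]
activity = labels[max(range(3), key=lambda i: scores[i])]
```

In the trained system, the index of the largest of the three output scores selects the activity label, which in turn selects the display colour for the frame.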

Fig. 4: Network Training Result

The input to the training stage was a collection of walking, bending, and lying frames. Performance is measured using the mean squared error; the best training performance, 0.073145, was reached at epoch 9996. Fig. 5 shows the performance curve.

Fig. 5: Performance Curve Produced by the Network (mean squared error, log scale, versus training epoch, over 10,000 epochs)

After training, the updated weights stored in the network leave it ready to respond to any set of unknown inputs. When video inputs were given at a rate of 30 frames per second, the activities were identified by the network and each activity was displayed in its own colour: walking in blue, bending in green, and lying in red, as shown in Fig. 6. An alarm can also be provided in case of emergency.

Fig. 6: Three of the Resulting Frames After Classification Using Neural Network

IV. CONCLUSION

The detection method can remotely monitor the activity states of older adults in their apartments, and their privacy is maintained by the use of silhouettes. Walking, bending, and lying postures are clearly classified and identified by the neural network, and an alarm can also be sent in an emergency condition.

REFERENCES

[1] T. Banerjee, J. M. Keller, Z. Zhou, M. Skubic, and E. Stone, "Day or night activity recognition from video using fuzzy clustering techniques," IEEE Trans. Fuzzy Systems, vol. 22, no. 3, pp. 483-493, June 2013.
[2] T. Banerjee, J. M. Keller, Z. Zhou, M. Skubic, and E. Stone, "Activity segmentation of infrared images using fuzzy clustering techniques," presented at the World Conf. Soft Comput., San Francisco, CA, USA, May 2011.
[3] H. A. Rowley, S. Baluja, and T. Kanade, "Neural network-based face detection," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 20, no. 1, pp. 23-38, Jan. 1998.
[4] R. Babuska, P. J. van der Veen, and U. Kaymak, "Improved covariance estimation for Gustafson-Kessel clustering," in Proc. IEEE Int. Conf. Fuzzy Syst., 2002, pp. 1081-1085.
[5] C. W. Chong, P. Raveendran, and R. Mukundan, "A comparative analysis of algorithms for fast computation of Zernike moments," Pattern Recognit., vol. 36, pp. 731-742, 2003.
[6] T. Banerjee, "Activity segmentation with special emphasis on sit-to-stand analysis," Master's thesis, Dept. Electr. Comput. Eng., Univ. Missouri, Columbia, MO, USA, 2010.
[7] H. Fang and H. Lei, "BP neural network for human activity recognition in smart home," in Proc. IEEE Int. Conf. Comput. Sci. Serv. Syst., 2012, pp. 1034-1037.