
1 Backpropagation Neural Network for Soil Moisture Retrieval Using NAFE05 Data: A Comparison of Different Training Algorithms
Soo See Chai, Department of Spatial Sciences, Curtin University of Technology

2 Content
- Neural Network and Soil Moisture Retrieval
- Backpropagation Neural Network
- Training of the Neural Network
- Testing Results
- Q and A

3 Neural Network for Soil Moisture Retrieval
- The radiometric signature of a vegetation-covered field reflects the integrated response of the soil and vegetation system to the observing microwave system.
- Surface parameters and radiometric signatures: (figure)

4 Backpropagation Neural Network (figure)

5 Different Backpropagation Training Algorithms
- The different training algorithms have a variety of computation and storage requirements.
- No one algorithm is best suited to all locations.
- MATLAB provides 11 different training algorithms.
- Reviewed so far: basic gradient descent and the Levenberg-Marquardt (LM) algorithm.
- How do the other algorithms perform?

6 Data Preparation
- Roscommon area: 1/11, 8/11, 15/11
- Determine the area coordinates (Roscommon): top latitude -32.15380, bottom latitude -32.18370, left longitude 150.120, right longitude 150.46900
- MATLAB: cut out the area and extract the fields from the PLMR file
- Copy the latitude, longitude, brightness temperature and altitude data into Excel
- Extract the aircraft altitude of the medium-resolution mapping, which is around 1050 m to 1270 m ASL

7 Roscommon 1/11, Roscommon 8/11, Roscommon 15/11 (figures)

8 Example: (figure)

9 A Bit of Statistics…
- Find the minimum and maximum of the average Tb for each data set.
- Next, find the range (max - min).
- Find the width of each class (3 classes: training, validation and testing): range / 3.
- Find the starting and ending point of each class.
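The binning steps above can be sketched as follows; the sample Tb values are made up for illustration:

```python
# Split a set of average Tb values into three equal-width classes over
# [min, max], as described on this slide. Sample Tb values are illustrative.

tb = [240.0, 252.5, 258.0, 261.0, 267.5, 270.0]  # average Tb per data set (K)

lo, hi = min(tb), max(tb)
width = (hi - lo) / 3          # range / 3
bounds = [(lo + i * width, lo + (i + 1) * width) for i in range(3)]

# Assign each value to class 0, 1 or 2 (clamping the maximum into the last class).
classes = [min(int((v - lo) // width), 2) for v in tb]
print(bounds)
print(classes)  # [0, 1, 1, 2, 2, 2]
```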

10 We now have:
- Group 1 of dates 1/11, 8/11 and 15/11 (combined: GRP1)
- Group 2 of dates 1/11, 8/11 and 15/11 (combined: GRP2)
- Group 3 of dates 1/11, 8/11 and 15/11 (combined: GRP3)
- GRP1 is randomly divided into three subsets: 60% for training, 30% for validation and 10% for testing.
- The same is done for GRP2 and GRP3.
- All training data go into one file, all validation data into one file, and all testing data into one file.
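The random 60/30/10 division described above can be sketched like this; the group contents are dummy indices, and the fixed seed just makes the example reproducible:

```python
# Sketch of the 60% / 30% / 10% random split described on this slide.

import random

def split_group(items, seed=0):
    """Shuffle a group and split it into 60% training, 30% validation, 10% testing."""
    rng = random.Random(seed)
    shuffled = items[:]
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(0.6 * n)
    n_val = int(0.3 * n)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_val],
            shuffled[n_train + n_val:])

grp1 = list(range(50))  # 50 dummy samples standing in for GRP1
train, val, test = split_group(grp1)
print(len(train), len(val), len(test))  # 30 15 5
```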

11 Training: K-Fold Cross Validation
- The number of data points is small, so K-fold cross validation is used to obtain a more reliable accuracy estimate.
- Training data + validation data = 112 samples.
- With 8-fold cross validation, each fold uses 14 samples for validation and 98 for training.
- To make sure the folds are random enough, the data are shuffled before each run.
- Example: the first and second runs hold out different validation/training partitions (diagram).
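The fold construction above can be sketched as index bookkeeping: 112 shuffled samples split into 8 folds of 14, each fold serving once as the validation set:

```python
# Sketch of the 8-fold cross-validation scheme on this slide.

import random

def kfold_indices(n=112, k=8, seed=0):
    """Yield (train_idx, val_idx) pairs for k-fold cross validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    fold = n // k  # 14 samples per fold
    for i in range(k):
        val = idx[i * fold:(i + 1) * fold]
        train = idx[:i * fold] + idx[(i + 1) * fold:]
        yield train, val

folds = list(kfold_indices())
print(len(folds), len(folds[0][0]), len(folds[0][1]))  # 8 98 14
```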

12 Training: NN Parameter Determination
- A series of trial-and-error experiments, selecting the configuration with the lowest RMSE.
- When the best configuration is found, save its input weights, layer weights and biases so the same NN can be reused with the other training algorithms.
- Fixed layers: 3 (1 input, 1 hidden, 1 output)
- Input: H-polarized brightness temperature (TbH) and physical soil temperature at 4 cm
- Hidden layer: sigmoid function
- Output layer: linear function, giving soil moisture (% v/v)
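A minimal forward-pass sketch of this fixed architecture, in Python rather than MATLAB: 2 inputs, 4 sigmoid hidden neurons (the count chosen on the next slide) and one linear output. The weights and the crude input scaling are arbitrary placeholders, not the saved W2/LW2/B2 values:

```python
# Forward pass of a 2-input, 4-hidden (sigmoid), 1-output (linear) network,
# matching the fixed architecture described above. Weights are placeholders.

import math

IW = [[0.5, -0.3], [0.1, 0.8], [-0.6, 0.2], [0.4, 0.4]]  # input -> hidden weights
B1 = [0.1, -0.2, 0.05, 0.0]                              # hidden biases
LW = [0.7, -0.5, 0.3, 0.2]                               # hidden -> output weights
B2 = 0.1                                                 # output bias

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(tbh, soil_temp):
    """Sigmoid hidden layer, then a linear output layer."""
    # Crude input scaling so the sigmoid does not saturate; in practice the
    # inputs would be normalised using statistics of the training data.
    x = [tbh / 300.0, soil_temp / 30.0]
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(IW, B1)]
    return sum(w * h for w, h in zip(LW, hidden)) + B2

sm = forward(tbh=265.0, soil_temp=22.0)
print(sm)  # an untrained (so meaningless) soil-moisture value
```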

13 Experiments carried out. Decision:
- Learning rate, lr = 0.005
- Momentum, mc = 0.4
- Input weights, iw = W2.mat
- Layer weights, lw = LW2.mat
- Bias, b = B2.mat
- No. of hidden neurons = 4
- No. of epochs = 200
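To make the chosen lr and mc concrete, here is a sketch of a gradient-descent-with-momentum weight update using those values. MATLAB's traingdm update rule differs slightly in how it weights the gradient term; the common textbook form below is used purely as an illustration:

```python
# One-weight illustration of gradient descent with momentum, using the
# lr = 0.005 and mc = 0.4 chosen on this slide. Simplified textbook form,
# not the exact traingdm formulation.

LR = 0.005   # learning rate
MC = 0.4     # momentum constant

def momentum_step(w, grad, velocity):
    """One momentum update; returns (new_weight, new_velocity)."""
    velocity = MC * velocity - LR * grad
    return w + velocity, velocity

w, v = 1.0, 0.0
for grad in [2.0, 2.0, 2.0]:   # constant dummy gradient
    w, v = momentum_step(w, grad, v)
print(round(w, 6))  # 0.9604
```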

14 Testing Result I – Roscommon

No.  Backpropagation Algorithm                                                                         RMSE (%)
1.   Batch gradient descent with momentum (traingdm)                                                   4.86
2.   Gradient descent with adaptive learning rate (traingda)                                           5.34
3.   Gradient descent with momentum and adaptive learning rate (traingdx)                              4.88
4.   Resilient backpropagation (trainrp)                                                               4.93
5.   Conjugate gradient backpropagation with Fletcher-Reeves updates (traincgf), stopped at epoch 2    4.82
6.   Conjugate gradient backpropagation with Polak-Ribiére updates (traincgp), stopped at epoch 2      4.83
7.   Conjugate gradient backpropagation with Powell-Beale restarts (traincgb), stopped at epoch 2      4.83
8.   Scaled conjugate gradient backpropagation (trainscg)                                              5.77
9.   Quasi-Newton algorithm: BFGS (trainbfg)                                                           3.93
10.  Quasi-Newton algorithm: one-step secant (trainoss)                                                5.51
11.  Levenberg-Marquardt (trainlm)                                                                     4.04
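The RMSE values above compare retrieved and in-situ soil moisture. A minimal sketch of the metric itself, with made-up sample values:

```python
# Root mean square error between retrieved and observed soil moisture.
# The sample values below are illustrative, not NAFE'05 data.

import math

def rmse(predicted, observed):
    """RMSE between two equal-length sequences."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(observed))

pred = [10.2, 14.8, 20.1, 25.5]   # retrieved soil moisture (% v/v)
obs  = [12.0, 15.0, 19.0, 24.0]   # in-situ soil moisture (% v/v)
print(round(rmse(pred, obs), 2))  # 1.3
```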

15 Conclusions
- The different backpropagation training algorithms give different but broadly similar accuracy results.
- The training data are representative of the testing data.

16 Questions
- Is the NN architecture transferable?
- Is the number of data points a factor contributing to the accuracy of the retrieval?
- Would adding ancillary data (besides soil temperature), such as vegetation water content and land cover information, help?
- Would adding V-polarized brightness temperature as an input help?
- Should these data be added directly, or should the NN be left to account for them?

