Prediction of Draping Behaviour of Handloom Cotton Fabrics by Artificial Neural Network Ashis Mitra Department of Silpa-Sadana, Visva-Bharati University, Sriniketan, Birbhum, West Bengal, India, 731236. firstname.lastname@example.org
Woven Textile Fabric A woven fabric is made up of two mutually perpendicular (orthogonal) sets of yarns. The longitudinal set of yarns is termed the warp and the transverse set the weft. The linear density (i.e., coarseness or fineness) of a yarn is expressed as its yarn count; Ne is one of the units of yarn count. The relative closeness of yarns in a fabric is expressed in terms of ends per inch or EPI (for warp yarns) and picks per inch or PPI (for weft yarns). Fig. 1: Plain woven fabric showing interlacement of warp and weft yarns
Fabric Drape Fabric drape is one of the most important quality features for assessing fabric performance in apparel. Drape affects the aesthetics of fabrics used in garments. Drape is the term used to describe the way a fabric hangs down under its own weight in folds. It indicates the conformity of garments to body contours [1-2].
Fabric Drapability Drape is generally expressed by the so-called ‘drape coefficient’. The higher the drape coefficient, the lower the fabric drapability, or the lower the propensity to drape. Knitted fabrics generally exhibit lower drape coefficients or higher propensity to drape than woven fabrics [1-2].
Materials & Methods A) Sample Preparation The test samples for this study consist of 25 plain woven cotton handloom fabrics, manufactured on a semi-automatic handloom. During manufacture of the samples, the fabric constructional parameters, namely EPI, PPI, warp count and weft count, were varied as much as possible so that the samples cover a wide range of variability. B) Fabric Construction Parameters Thread density (i.e. EPI and PPI) was measured using a pick glass. Yarn count (Ne) was determined on a direct yarn count balance.
Evaluation of Fabric Drape Coefficient The Cusick drape tester was used to measure the fabric drape coefficient. Fig. 2 shows the principle of the Cusick drape test. Fig. 2: Principle of Cusick Drape Test
Drape Coefficient is calculated using the following equation:

Drape coefficient (%) = (mass of the paper ring corresponding to the draped shadow / mass of the full paper ring) × 100

Fig. 3: Drape test, top view of draped fabric Fig. 4: Paper ring Fig. 5: Paper ring with draped image
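The mass-based Cusick calculation can be sketched in code (a minimal illustration, not part of the original slides; the function name and the sample masses are assumptions for demonstration):

```python
# Cusick paper-ring method: the drape coefficient is the mass of the paper
# under the traced draped shadow, as a percentage of the full ring's mass.

def drape_coefficient(shadow_mass_g: float, full_ring_mass_g: float) -> float:
    """Drape coefficient [%] from paper-ring masses (Cusick method)."""
    return 100.0 * shadow_mass_g / full_ring_mass_g

# Hypothetical masses for illustration:
print(round(drape_coefficient(3.2, 4.57), 2))  # → 70.02
```

A higher shadow mass relative to the full ring means a stiffer, less drapable fabric, consistent with the interpretation of the drape coefficient given earlier.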
Table 1: Summary statistics of fabric constructional parameters and drape coefficient

            EPI      PPI      Warp count [Ne]   Weft count [Ne]   Drape coefficient [%]
Minimum     31.20    24.40     5.60              6.10             63.34
Maximum    101.50   115.50    85.20             90.30             80.09
Mean        69.66    69.79    33.88             37.89             70.07
SD          11.44    22.85    14.01             15.50              5.56
CV%         16.43    32.74    41.36             40.92              7.94
Artificial Neural Network ANNs are computational systems, either in hardware or in software, which imitate the behaviour of biological neurons by using a large number of interconnected artificial neurons. The use of ANN is a rather new approach to computing in textile engineering, as the ANN possesses unique capabilities such as prediction, pattern recognition, generalization, fault tolerance, and high-speed information processing. In an ANN, each neuron or processing element receives a signal from the neurons of the previous layer, and each of these signals is then multiplied by a separate weight known as a synaptic weight. The weighted inputs are then summed up and passed through a transfer function, which converts the output to a fixed range of values [3-4].
….. Contd. The output of the transfer function is then transmitted to the neurons of the next layer. Finally, the output is produced and is compared with the expected output. The error signal, thus generated, is used to optimize the synaptic weights connecting the neurons. An ANN learns from real-life examples known as 'training data', and during learning (which may be supervised or unsupervised/self-organizing) the weights connecting the neurons in different layers are optimized so that the error signal reduces in each iterative step [3-4]. Once sufficiently trained, the ANN can be utilized to predict unknown instances by presenting testing datasets as the input.
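The per-neuron computation described above (a weighted sum of incoming signals passed through a transfer function) can be sketched as follows; the weights, bias and inputs here are made-up illustrative values:

```python
import math

def neuron_output(inputs, weights, bias):
    # Weighted sum of incoming signals (multiplied by synaptic weights),
    # then the log-sigmoid transfer function, which maps the sum into (0, 1).
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))

print(round(neuron_output([0.5, 0.2], [0.4, -0.3], 0.1), 4))  # → 0.5597
```

This output would in turn be passed on as an input signal to the neurons of the next layer.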
Back Propagation Algorithm Training occurs in two phases, namely, the forward pass and the backward pass. In the forward pass, a set of experimental data is presented to the network as input, a set of outputs is produced, and the error vector is calculated according to the following equation:

E = (1/P) Σ_{j=1}^{P} E_j,  where E_j = (1/2) Σ_{k=1}^{S} (T_k − O_k)²

where E is the error vector, E_j is the error associated with the j-th pattern, P is the total number of training patterns, T_k and O_k are the target and predicted outputs at output node k, and S is the total number of output nodes.
In the backward pass, the error signal is propagated backwards through the network and the synaptic weights are adjusted in such a manner that the error signal reduces in each iteration step. The corrections required in the synaptic weights between the output and the hidden layers are carried out by a delta rule, which may be expressed as follows for the log-sigmoid transfer function [3, 5]:

ΔW_jk = η δ_k O_j,  where δ_k = (T_k − O_k) O_k (1 − O_k)

where W_jk is the weight connecting neuron j of the hidden layer and neuron k of the output layer, ΔW_jk is the correction applied to W_jk at a particular iteration, η is the learning rate and O_j is the output of neuron j.
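The delta rule for an output node with a log-sigmoid transfer function can be written directly in code (a sketch of the standard rule; the function and variable names are assumptions):

```python
def delta_rule_update(target, output, hidden_output, learning_rate=0.5):
    # delta_k = (T_k - O_k) * O_k * (1 - O_k): the error at output node k,
    # scaled by the derivative of the log-sigmoid transfer function.
    delta = (target - output) * output * (1.0 - output)
    # Weight correction dW_jk = eta * delta_k * O_j.
    return learning_rate * delta * hidden_output

print(round(delta_rule_update(1.0, 0.6, 0.8), 4))  # → 0.0384
```

Repeating this correction over many iterations drives the error signal down, which is exactly what the backward pass described above does.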
Structure of ANN and its Parameters
Input layer: Ends per inch (EPI), Picks per inch (PPI), Warp count (Ne), Weft count (Ne)
Hidden layer
Output layer: Drape coefficient
Structure of ANN and its Parameters Six network structures having 5 to 10 nodes in the single hidden layer were tried for predicting the fabric drape coefficient. The learning rate and momentum were optimized by the trial-and-error method to 0.5 and 0.2, respectively. To overcome the problem of an under-trained or over-trained network, training was stopped as soon as the MSE of the testing dataset reached its minimum level. From the available 25 data sets, 20 data sets were used for ANN training, and 5 data sets were used for testing or validation of the ANN models.
Structure of ANN and its Parameters 8 nodes in the single hidden layer gave the best prediction results after 4500 epochs/iterations. Training was done with the back-propagation algorithm developed by Rumelhart et al. [5], using the EasyNN Plus software. The log-sigmoid transfer function used in the hidden and output layers is as follows:

y_o = 1 / (1 + e^(−I))

where y_o is the transformed output from the node and I is the weighted sum of inputs to the node.
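A pure-NumPy sketch of a 4-8-1 back-propagation network like the one described above is given below. This is not the EasyNN Plus implementation used in the study, and the data are synthetic stand-ins (NOT the measured fabric samples), with inputs and targets scaled to 0-1; only the architecture, the log-sigmoid transfer function, the learning rate (0.5) and the epoch count (4500) follow the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: 20 samples of [EPI, PPI, warp count, weft count],
# scaled to 0-1, with a synthetic (not measured) target in (0, 1).
X = rng.random((20, 4))
y = (0.3 * X[:, 0] + 0.2 * X[:, 1] + 0.3 * X[:, 2] + 0.2 * X[:, 3]).reshape(-1, 1)

# 4-8-1 network with log-sigmoid activations in hidden and output layers.
W1 = rng.normal(0, 0.5, (4, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros((1, 1))
eta = 0.5  # learning rate from the slides

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

mse_first = None
for epoch in range(4500):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)
    O = sigmoid(H @ W2 + b2)
    err = y - O
    if epoch == 0:
        mse_first = float(np.mean(err ** 2))
    # Backward pass: delta rule for log-sigmoid units.
    d_out = err * O * (1 - O)
    d_hid = (d_out @ W2.T) * H * (1 - H)
    W2 += eta * H.T @ d_out; b2 += eta * d_out.sum(axis=0)
    W1 += eta * X.T @ d_hid; b1 += eta * d_hid.sum(axis=0)

mse_last = float(np.mean((y - sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)) ** 2))
print(mse_last < mse_first)  # training reduces the MSE
```

In the study itself, training would additionally be stopped once the testing-set MSE reached its minimum, to avoid over-training.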
ANN Model for Predicting Drape After completion of training, the unseen testing data sets were presented to the trained ANN for the prediction of fabric drape coefficient. Statistical parameters calculated: Correlation coefficient (R) Mean absolute error% Mean squared error (MSE).
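The three statistics can be computed as below (a self-contained sketch; the per-sample absolute error follows the definition implied by Table 4, i.e. |actual − predicted| / actual × 100):

```python
import math

def prediction_metrics(actual, predicted):
    """Return (R, mean absolute error %, MSE) for paired observations."""
    n = len(actual)
    mse = sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n
    mae_pct = sum(abs(a - p) / a for a, p in zip(actual, predicted)) * 100.0 / n
    mean_a = sum(actual) / n
    mean_p = sum(predicted) / n
    cov = sum((a - mean_a) * (p - mean_p) for a, p in zip(actual, predicted))
    denom = math.sqrt(sum((a - mean_a) ** 2 for a in actual) *
                      sum((p - mean_p) ** 2 for p in predicted))
    r = cov / denom  # Pearson correlation coefficient
    return r, mae_pct, mse
```

For a perfect fit (actual equal to predicted) this returns R = 1 with zero MAE% and zero MSE; squaring R gives the coefficient of determination R² reported in Tables 2 and 3.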
Table 2: Prediction Performance of Different ANN Models during Training

                                       Nodes in hidden layer
Statistical parameter                5       6       7       8       9       10
Correlation coefficient (R)        0.909   0.959   0.943   0.953   0.890   0.932
Coefficient of determination (R²)  0.826   0.920   0.889   0.908   0.792   0.869
Mean absolute error %              2.499   1.761   2.093   1.876   3.016   2.721
Mean squared error (MSE)           5.789   2.623   3.706   2.991   8.417   6.011
Table 3: Prediction Performance of Different ANN Models during Testing

                                       Nodes in hidden layer
Statistical parameter                5        6        7        8        9        10
Correlation coefficient (R)        0.912    0.877    0.883    0.933    0.892    0.880
Coefficient of determination (R²)  0.832    0.769    0.780    0.870    0.796    0.774
Mean absolute error %              5.019    4.398    4.569    4.781    5.869    5.323
Mean squared error (MSE)          15.198   10.547   11.893    2.334   20.269   17.259
Observations: All the models show very high prediction accuracy (R > 0.87) in both training and testing data sets. The mean error of prediction is always lower than 5.90% in the testing data sets. Tables 2 & 3 show that the ANN model with 8 nodes in the hidden layer exhibited the overall best performance. For this ANN model, the R² values in the training and testing data sets were 0.908 and 0.870 respectively. The comparable values of the coefficient of determination in the training and testing data sets imply good generalization of the ANN model.
Fig. 6: Prediction performance of ANN model in training data. All training samples, except sample nos. 14 and 15, show good association between the actual and predicted drape coefficient values.
Fig. 7: Prediction performance of ANN model in testing data. Test samples 1 and 5 show rather high prediction errors; the likely reason is the limited number of data sets used for training the ANN model.
Prediction Performance of ANN Model Prediction accuracy of the accepted ANN model (with 8 nodes in hidden layer) is exceptionally good. The correlation coefficients for this ANN model are 0.953 for training and 0.933 for unseen testing data sets, the mean absolute error of prediction being only 1.876% for training and 4.781% for unseen testing data sets.
Table 4: Detailed Prediction Results for Training Data

Training    Actual drape      Predicted drape    Absolute
sample no.  coefficient [%]   coefficient [%]    error [%]
 1          65.90             68.61              4.112
 2          67.62             68.20              0.858
 3          68.28             69.97              2.475
 4          66.00             68.26              3.424
 5          80.09             79.28              1.011
 6          75.76             75.92              0.211
 7          63.34             64.11              1.216
 8          79.53             77.84              2.125
 9          78.43             79.19              0.969
10          64.81             64.95              0.216
11          68.76             69.16              0.582
12          65.08             64.98              0.154
13          67.01             65.90              1.656
14          70.97             67.03              5.552
15          66.65             70.14              5.236
16          68.09             67.90              0.279
17          68.31             67.49              1.200
18          80.00             77.68              2.900
19          65.48             64.97              0.779
20          77.94             79.94              2.566

Observation: no training sample shows a prediction error of more than 6%.
Impact of Input Parameters on Drape For judging the relative importance of input parameters, an input-saliency test was carried out by eliminating one designated input from the optimized ANN model at a time. ANN training was then initiated with the same learning rate and momentum, and continued up to the same number of iterations as done for the optimized network. The difference in the prediction performance was measured by the percentage change in the value of MSE for the testing data sets. A higher percentage change in MSE signifies higher importance or saliency of the eliminated input and vice versa.
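The input-saliency procedure described above can be sketched as a simple loop. The helper `train_and_test_mse(kept_columns)` is hypothetical: it stands for retraining the ANN on the listed input columns (with the same learning rate, momentum and number of iterations) and returning the testing-set MSE:

```python
INPUTS = ["EPI", "PPI", "warp count", "weft count"]

def input_saliency(train_and_test_mse, baseline_mse):
    """Rank inputs by the % increase in testing MSE when each is removed."""
    changes = {}
    for i, name in enumerate(INPUTS):
        kept = [j for j in range(len(INPUTS)) if j != i]
        mse = train_and_test_mse(kept)  # retrain without input i
        changes[name] = 100.0 * (mse - baseline_mse) / baseline_mse
    # A larger % increase in MSE means a more salient (important) input.
    return sorted(changes.items(), key=lambda kv: -kv[1])

# Illustration using the MSE values reported in Table 5
# (baseline testing MSE 2.334 for the 8-node model, from Table 3):
table5_mse = {0: 10.257, 1: 49.290, 2: 119.503, 3: 19.734}
ranking = input_saliency(
    lambda kept: table5_mse[(set(range(4)) - set(kept)).pop()], 2.334)
print([name for name, _ in ranking])  # → ['warp count', 'PPI', 'weft count', 'EPI']
```

The resulting order reproduces the ranking reported in the study.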
Table 5: Impact of Input Parameters on Fabric Drape Coefficient

Parameter     ANN model MSE   % increase in MSE   Rank
EPI            10.257           339.45             4
PPI            49.290          2011.83             2
Warp count    119.503          5020.08             1
Weft count     19.734           745.51             3

Ranking of input parameters as determined by the input saliency test of the ANN model reveals: warp count, PPI and weft count are the first three contributors to the fabric drape coefficient, in descending order of importance.
Conclusion: The drape coefficient of handloom cotton plain fabrics has been predicted with the help of ANN models using four basic fabric constructional parameters as inputs. The ANN model is capable of predicting the drape coefficient with a very high degree of accuracy. Optimized ANN models can predict the drape coefficient with a mean absolute error lower than 5% in the testing data sets. Extreme prediction errors are also very rare in the case of the ANN model. The ANN models demonstrated high predictive power, as they can easily handle the non-linear relationships prevailing between the fabric parameters and the fabric properties.
….Contd. Ranking of the fabric constructional parameters has been carried out by conducting an input saliency test. Warp count, PPI and weft count are found to be the first three contributors to the fabric drape coefficient, in descending order of importance. From the experimental results and the foregoing discussion, it can be concluded that the ANN model not only exhibits very high predictive power but also possesses the capability to handle noisy data, which is inherent in handloom fabrics. Therefore, ANN appears to be a promising, if not ideal, data modelling tool for handloom fabric engineering.
References:
1. H.M. Behery, Effect of Mechanical and Physical Properties on Fabric Hand, Woodhead Publishing Ltd., Cambridge, England, 2005.
2. B.P. Saville, Physical Testing of Textiles, Woodhead Publishing Ltd., Cambridge, England, 1999.
3. A. Majumdar, Res. J. Text. Apparel, vol. 3, p. 86, 2010.
4. R. Chattopadhyay and A. Guha, Textile Progress, vol. 38, p. 5, 2004.
5. D.E. Rumelhart, G. Hinton, and R.J. Williams, "Learning Internal Representations by Error Propagation", in Parallel Distributed Processing, vol. 1, pp. 318-362, 1986.