Foreign-language translation: Evaluating different pattern recognition techniques for apple classification


Document preview from the archive:
Evaluation of different pattern recognition techniques for apple sorting

İ. Kavdır (a,*), D.E. Guyer (b)
(a) Department of Agricultural Machinery, Çanakkale …
(b) Department of Biosystems and Agricultural …
* Corresponding author.

Biosystems Engineering 99 (2008) 211–219. Received 4 November 2006; accepted 24 September 2007; available online 28 November 2007.

Abstract (fragmentary in this preview): … handling systems, consistency is the most important requirement … algorithms have been studied for classification of agricultural … the advantage the artificial classifiers provide in classification of agricultural commodities, in addition to the advantages of automated classification operations over conventional manual … products. The number of features plays a key role in determining the efficiency of the pattern classification in terms of time and accuracy.

Nomenclature
w_m — class m
x_ij — original value of the jth feature of the ith pattern
x'_ij — normalised value of the jth feature of the ith pattern
ε — estimate of the true error rate
m_i — mean of class i
S_i — covariance matrix for class i
Subscripts: i — index for the test patterns; j — index for the features; k — index for the training patterns

From the literature review (fragmentary): … (Luo et al., 1999); in both studies, non-parametric classification approaches performed better than statistical methods, although the difference was not significant in the potato classification study (Kirsten et al., 1997). Kim et al. (2000) applied linear and non-linear recognition models to the classification of fruit. Various feature extraction and dimensionality reduction techniques were performed on the spectral data obtained from visible and near-infrared spectra. Linear pattern recognition techniques such as linear discriminant analysis (LDA) and non-linear techniques based on multi-layer perceptrons (MLPs) were used to classify the products; the non-linear approaches produced superior classification results. Penza et al. (2001) used pattern recognition techniques to classify food, beverages and perfumes; successful results were obtained with PCA and cluster analysis methods. Leemans et al. (2002) developed an on-line fruit grading system based on external quality features of apples using quadratic discriminant analysis and neural networks (NNs); both grading algorithms gave similar results (79% and 72% for the two varieties studied). Similarly, Hahn et al. (2004) used discriminant analysis and NNs to detect Rhizopus stolonifer spores on tomatoes using spectral reflectance; the NN classifier outperformed the discriminant analysis approach.

3. Materials and methods

3.1. Data acquisition

Nine features were measured on Golden Delicious apples: hue angle (for colour), shape defect, circumference, firmness, weight, blush percentage (red natural spots on the surface of the apple), russet (a natural net-like formation on the surface of an apple), bruise content and number of natural defects. Firmness was measured with a Magness–Taylor (MT) tester, driving an 11 mm diameter probe to a depth of about 8 mm (Effegi-McCormick, Yakima; FT-327). Colour was measured with a CR-200 Minolta colorimeter in the L, a, b domain, where L is the lightness factor and a and b are the chromaticity coordinates (Ozer et al., 1995). The hue angle, tan^-1(b/a), which was used to represent the colour of the apples, has been shown to be the best representation of human recognition of colour (Hung et al., 1993). The sizes of the surface defects (natural and bruises) were determined using a special figure template consisting of a number of holes of different diameters. A shape defect (lopsidedness) was measured with a Mitutoyo electronic calliper (Mitutoyo Corporation) as the ratio of the maximum height of the apple to the minimum height. The maximum circumference was measured with a Cranton circumference measuring device (Cranton Machinery Co.). Weight was measured with an electronic scale (model CT1200-S, serial no. 3403, capacity 1200±0.1 g). Programming for the classifiers was done in Matlab.
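The hue-angle transform above is a one-line conversion of the colorimeter's a and b readings. A minimal sketch follows (in Python rather than the authors' Matlab, with hypothetical readings); atan2 is used so the angle lands in the correct quadrant when a is negative, which a plain tan^-1(b/a) does not guarantee:

```python
import numpy as np

def hue_angle(a: float, b: float) -> float:
    """Hue angle tan^-1(b/a), in degrees, from chromaticity (a, b)."""
    # arctan2 resolves the quadrant even for negative a (green tones)
    return float(np.degrees(np.arctan2(b, a)))

# Hypothetical colorimeter readings for a greenish-yellow surface:
print(hue_angle(a=-5.2, b=42.7))  # ~97 degrees
```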
3.2. Data collection and handling

The number of apples used for each class was determined by the availability of suitably featured apples in the set collected for this study. The total number of apples was 181, covering three classes: bad (class-3), medium (class-2) and good (class-1) quality. The size of the pattern matrix was therefore 181 × 9, where nine is the number of features. Eighty of the apples were kept at room temperature for four days after harvest, while another 80 were kept in a cooler at about 3 °C for the same period, without any quality pre-sorting, to create colour variation on the apple surfaces. In addition, 21 of the apples were harvested before the others and kept for a further 15 days at room temperature, to create variation in the appearance of the apples to be tested.

The apples were graded first by a human expert and then by the classification algorithms developed. The expert was trained on the external quality criteria for good, medium and bad apple groups defined by the USDA standards (USDA, 1976). These standards define the quality criteria explicitly, so it is quite straightforward for an expert to follow and apply them. Extremely large or small apples had already been excluded by the handling personnel. The apples were graded by the human expert into three quality groups depending on the expert's experience, expectations and the USDA standards (USDA, 1976). The numbers of apples determined for each quality group by the human expert are given in Table 1 (not reproduced in this preview).

3.3. Classification algorithms

3.3.1. Pre-processing of data and feature selection

Each feature was normalised using Eq. (1) below to eliminate the unit differences between features:

$x'_{ij} = \frac{x_{ij} - m_j}{S_j}$,   (1)

where $x_{ij}$ is the original value of the jth feature of the ith pattern, $x'_{ij}$ the normalised value of the jth feature of the ith pattern, $m_j$ the mean of the jth feature and $S_j$ the standard deviation of the jth feature:

$m_j = \frac{1}{n} \sum_{i=1}^{n} x_{ij}$   (2)

and

$S_j = \left[ \frac{1}{n} \sum_{i=1}^{n} (x_{ij} - m_j)^2 \right]^{1/2}$.   (3)

Three feature sets were used in the classification application: two sub-groups with four and five features (listed under Table 2) and the full nine-feature set.
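A minimal sketch of this pre-processing step (Python/NumPy here, not the authors' Matlab code; the small matrix is hypothetical). Note that `np.std` divides by n by default, matching Eq. (3):

```python
import numpy as np

def normalise(X: np.ndarray) -> np.ndarray:
    """Z-score normalisation of a pattern matrix, Eqs (1)-(3).

    X holds one pattern per row and one feature per column
    (181 x 9 in the study); each feature is centred on its mean
    m_j and scaled by its standard deviation S_j.
    """
    m = X.mean(axis=0)   # Eq (2): per-feature mean
    S = X.std(axis=0)    # Eq (3): per-feature std (ddof=0, i.e. 1/n)
    return (X - m) / S   # Eq (1)

# Hypothetical 4-pattern, 3-feature matrix:
X = np.array([[100.0, 7.5, 0.2],
              [120.0, 6.9, 0.4],
              [ 90.0, 8.1, 0.1],
              [110.0, 7.2, 0.3]])
print(normalise(X).round(2))
```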
Different classification algorithms were applied to classify the apples once the quality features had been measured and the apples graded by the expert, and the performances of the classifiers were compared with one another and with the expert. Parametric and non-parametric classifiers were used. Parameter estimation in the parametric approach was done using maximum-likelihood estimation, and the estimated parameters were then put into the plug-in decision rule (PDR). For the non-parametric approach, the K-NN decision rule with different values of K (1, 2, 3), a decision tree (DT) classifier formed in S-plus (Venables …) and multi-layer perceptron (MLP) neural networks were used.

Figs. 1–3 (scatter plots, not reproduced in this preview) visualise the three quality classes of apples in two-dimensional projections of the four-, five- and nine-dimensional feature spaces, respectively; the training data were transformed using both linear discriminant analysis and principal component analysis, and the first two linear discriminant variables and the first two principal components were plotted for the quality groups (G, good, class-1; M, medium, class-2; B, bad, class-3).

3.3.2. Plug-in decision rule (PDR) classifier

Under a multivariate Gaussian assumption, the class-conditional density of a d-dimensional pattern $x = (x_{i1}, x_{i2}, \ldots, x_{id})$ for class $w_i$ is

$p(x \mid w_i) = \frac{1}{(2\pi)^{d/2} |S_i|^{1/2}} \exp\left\{ -\frac{1}{2} (x - m_i)^T S_i^{-1} (x - m_i) \right\}$,   (4)

where d is the number of measurements. The quadratic form $(x - m_i)^T S_i^{-1} (x - m_i)$ is the squared Mahalanobis distance, which, through the exponential component of the Gaussian distribution function, makes the probability of a pattern small if it is far from the class mean. The prior probabilities $P(w_1)$, $P(w_2)$ and $P(w_3)$ were determined from the training samples of each class. The unknown parameters $m_i$ and $S_i$, the mean and covariance matrix of class i, were estimated from the $n_i$ training samples of that class using maximum-likelihood estimation:

$m_i = \frac{1}{n_i} \sum_{x \in w_i} x$,   (5)

$S_i = \frac{1}{n_i} \sum_{x \in w_i} (x - m_i)(x - m_i)^T$.   (6)

Using these parameters, the discriminant function for each class was calculated using Eq. (7):

$g_i(x) = -\frac{1}{2} (x - m_i)^T S_i^{-1} (x - m_i) - \frac{1}{2} \ln |S_i| + \ln P(w_i)$.   (7)

The test pattern was then assigned to the class whose discriminant function gave the highest output:

assign x to class $w_m$ if $g_m(x) > g_k(x)$ for all $k \neq m$,   (8)

where m and k index the candidate classes.
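The plug-in decision rule of Eqs (4)–(8) amounts to quadratic discriminant analysis with maximum-likelihood parameter estimates. A self-contained sketch, assuming NumPy arrays and non-singular class covariances (which may fail for a class as small as class-3; the paper does not show the original Matlab implementation):

```python
import numpy as np

def fit_pdr(X: np.ndarray, y: np.ndarray) -> dict:
    """ML estimates per class: mean m_i (Eq (5)), covariance S_i
    (Eq (6)) and prior P(w_i) from class frequencies."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        m = Xc.mean(axis=0)
        S = (Xc - m).T @ (Xc - m) / len(Xc)  # divides by n_i, not n_i - 1
        params[c] = (m, S, len(Xc) / len(y))
    return params

def pdr_assign(x: np.ndarray, params: dict):
    """Assign x to the class maximising g_i(x), Eqs (7)-(8)."""
    def g(m, S, prior):
        diff = x - m
        maha = diff @ np.linalg.solve(S, diff)  # squared Mahalanobis distance
        return -0.5 * maha - 0.5 * np.log(np.linalg.det(S)) + np.log(prior)
    return max(params, key=lambda c: g(*params[c]))
```

Keeping a separate covariance matrix $S_i$ per class is what makes the rule quadratic; forcing a shared covariance would reduce it to linear discriminant analysis.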
3.3.3. K-nearest-neighbour (K-NN) classifier

The nearest-neighbour classifier is a non-parametric classifier which makes no assumption about the form of the class-conditional density; it assigns a pattern the label most frequently represented among the K nearest samples in the training set (Alchanatis et al., 1993), where K is the number of nearest neighbours. To assign a test pattern to a class, the entire training set is used, i.e. the distances between the test pattern and each of the training patterns have to be measured. Although training this classifier is quite simple, it requires a large computer memory, since the information on every sample in the training set must be stored. Various metrics can be used to measure the similarity between patterns; in this study the Euclidean distance was used. In the 1-NN classifier, the test pattern was assigned to the class containing the training pattern closest to it. The 2-NN and 3-NN classifiers were also tested; in these, the pattern was assigned to the class holding the majority among the K nearest neighbours. When each of the nearest training patterns came from a different class in the 2-NN and 3-NN classifiers, the test pattern was assigned to the class with the member closest to the test pattern, i.e. the 1-NN procedure was applied. The Euclidean distance is expressed as

$d_E(i, k) = \left[ \sum_{j=1}^{d} (x_{ij} - x_{kj})^2 \right]^{1/2}$,   (9)

where d is the number of features, $d_E$ the Euclidean distance between patterns i and k, $x_{ij}$ the position of the test pattern and $x_{kj}$ the position of the training pattern.
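A sketch of the K-NN rule with the tie-break described above (when all K neighbours disagree, revert to 1-NN). Python/NumPy again, with hypothetical array names; distances follow Eq. (9):

```python
import numpy as np
from collections import Counter

def knn_assign(x, X_train, y_train, k=3):
    """Majority vote among the k nearest training patterns
    (Euclidean distance, Eq (9)); if every neighbour comes from a
    different class, fall back to the single nearest neighbour."""
    d = np.sqrt(((X_train - x) ** 2).sum(axis=1))
    nearest = np.argsort(d)[:k]           # indices, closest first
    label, count = Counter(y_train[nearest]).most_common(1)[0]
    if count == 1:                        # all k neighbours disagree
        return y_train[nearest[0]]        # 1-NN fallback
    return label
```

With k=2 and two neighbours from different classes, `count == 1` triggers the fallback, exactly the behaviour described for the 2-NN and 3-NN classifiers.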
3.3.4. Decision tree (DT) classifier

The non-parametric and hierarchical DT classifier was formed by splitting subsets of the training data into descendant subsets. A binary tree structure was used to form the DT, which was generated in S-plus (Venables …).

3.3.5. Multi-layer perceptron (MLP) neural network classifier

The MLP consists of an input layer with a number of neurons equal to the number of features, one or more hidden layers and an output layer; with no hidden layer, the perceptron can only perform linear tasks. The number of neurons in the input layer was four, five or nine (all the features measured) for the three feature sets (Fig. 4). The number of hidden layers and the number of neurons in them were selected by trial and error: experiments started with a single hidden layer containing different numbers of neurons and continued with two hidden layers. Eight neurons in the first hidden layer and four in the second produced the best results of the trial-and-error experiments carried out to find the optimal structure of the NNs. A non-linear "tansig" transfer function was used in the hidden layers. One neuron was used in the output layer, with a linear transfer function, "purelin"; different types of function were tried in the search for the transfer function most effective on classification success, and "purelin" was the most efficient in the output neuron. Values of 0.25 and 0.0025 were selected, respectively, as the learning-rate and momentum coefficients, again by trial and error. The maximum number of iterations in training was set to 3000, and the error-rate convergence criterion for the NNs to stop learning was 0.02. The NN Toolbox in Matlab was used to implement the classifier (Howard …).

Fig. 4 – Schematic display of the MLP neural network classifier (not reproduced in this preview).
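The network itself was built with the Matlab NN Toolbox. As a rough scikit-learn stand-in (an assumption, not the authors' code), the same topology and training constants map onto `MLPRegressor`, whose `tanh` hidden units and identity output mirror "tansig" and "purelin":

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# 8 + 4 hidden neurons, tanh ("tansig") hidden units, a single
# linear ("purelin") output neuron, SGD with the reported learning
# rate, momentum and iteration cap. The exact learning dynamics
# differ from the Matlab toolbox, so results will not match exactly.
mlp = MLPRegressor(hidden_layer_sizes=(8, 4),
                   activation="tanh",
                   solver="sgd",
                   learning_rate_init=0.25,
                   momentum=0.0025,
                   max_iter=3000)

# Hypothetical usage on the normalised pattern matrix, with the
# quality class (1, 2 or 3) as the single numeric target:
# mlp.fit(X_train, y_train)
# y_pred = np.rint(mlp.predict(X_test)).clip(1, 3)
```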
Error estimation was performed on the classification results obtained from training and testing the classifiers. In evaluating a classifier, half of the data was used to train it and half to test it (Table 1). The error rate for the NN classifier was calculated at the end of the training and testing procedures from the differences between the target and calculated outputs. Classification accuracy for the classifiers studied was calculated using

$\varepsilon = \frac{e}{N}$,   (10)

where $\varepsilon$ is the estimate of the true error rate, N the total number of samples and e the number of misclassified patterns.

4. Results and discussion

The classification results obtained using one parametric and five non-parametric classification algorithms with three different feature sets are given in Tables 2 and 3. It should be noted that the successes of the classifiers are compared against the classification produced by the human expert. Subjectivity is involved in the expert's classification even though the expert followed the USDA standards; an expert therefore cannot be expected to achieve a 100% correct classification. Possible measurement errors from the devices should also be considered. Also, the data used in this study were from a single variety of apple, in which the morphological properties did not differ … [text truncated in this preview]. The number of patterns per class was not equal: 128 patterns in class-1, 44 in class-2 and only nine in class-3. This situation may have led to poor training performance, especially for class-3. Different feature sets were tested on the classifiers; the results obtained from these applications are reported in the following sections.

Table 2 – Performances of the classifiers using different numbers of features

  Classifier                    Classification success, %
                                4 features(a)   5 features(b)   9 features(c)
  Plug-in decision rule             67.78           72.22           72.22
  1-Nearest neighbour               75.56           81.11           75.56
  2-Nearest neighbour               75.56           81.11           75.56
  3-Nearest neighbour               75.56           75.56           73.33
  Decision tree classifier(d)       75.56             –               –
  Multi-layer perceptron            83.33           88.89           90.00

  (a) Colour, shape defect, weight, russeting.
  (b) Colour, shape defect, firmness, weight, russeting.
  (c) Colour, shape defect, circumference, firmness, weight, blush percentage, russeting, size of bruises, size of natural defects.
  (d) Used only with the sub-set including 4 features.

Table 3 – Confusion matrices for the results obtained in testing(a); each block lists true classes 1–3 (rows) against assigned classes 1–3 (columns)

  Plug-in decision rule
    4 features: 42 22 0 / 3 19 0 / 0 4 0     error rate 0.322
    5 features: 44 20 0 / 1 21 0 / 0 4 0     error rate 0.278
    9 features: 56 8 0 / 13 9 0 / 3 1 0      error rate 0.278
  1-Nearest neighbour
    4 features: 58 5 1 / 8 9 5 / 1 2 1       error rate 0.244
    5 features: 60 4 0 / 7 11 4 / 1 1 2      error rate 0.189
    9 features: 60 4 0 / 15 7 0 / 1 2 1      error rate 0.244
  2-Nearest neighbour
    4 features: 58 5 1 / 8 9 5 / 1 2 1       error rate 0.244
    5 features: 60 4 0 / 7 11 4 / 1 1 2      error rate 0.189
    9 features: 60 4 0 / 15 7 0 / 1 2 1      error rate 0.244
  3-Nearest neighbour
    4 features: 63 1 0 / 16 5 1 / 3 1 0      error rate 0.244
    5 features: 64 0 0 / 17 4 1 / 4 0 0      error rate 0.244
    9 features: 64 0 0 / 20 2 0 / 3 1 0      error rate 0.267
  Decision tree classifier(b)
    4 features: entries garbled in this preview       error rate 0.244
  Multi-layer perceptron
    4, 5, 9 features: entries garbled in this preview error rates 0.167, 0.111, 0.100

  (a) Numbers on the diagonal of each block show the number of correctly classified patterns for the corresponding class, while the off-diagonal numbers show the misclassified patterns.
  (b) Used only with four features.

4.1. Effect of different feature sub-sets on classification performance

Different subsets of the nine initial features were tested on the classifiers (four-, five- and nine-featured; Table 2); the nine-featured sub-set contained all the features originally measured. Using the five-featured sub-set decreased the time needed for the classification applications and resulted in the same or higher classification successes compared with the nine-featured sub-set. Using nine features in the NN classifier, however, resulted in the highest classification success.

4.2. Comparison of classification algorithms in terms of classification performance

When the classification results were compared in terms of the successes of the algorithms, MLP NNs yielded the most successful results for all the feature sets studied. This may be attributed to the ability of this classifier to capture the non-linearities between the features and the output classes. The most successful result (90%, Table 2) with the NN classifier was obtained using nine features; improved classification results can therefore be expected with more features, as in the case of the nine-featured set in the NN classifier. However, MLP NNs using a five-featured sub-set produced a classification perfor… [text truncated in this preview]
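Eq. (10) applied to one block of Table 3 makes a quick sanity check. The sketch below (Python/NumPy; rows are true classes, columns assigned classes) reproduces the reported 0.189 error rate for the 1-NN classifier on the five-feature set:

```python
import numpy as np

def error_rate(confusion: np.ndarray) -> float:
    """Eq (10): e/N, with e the off-diagonal (misclassified) count
    and N the total number of test patterns."""
    return float((confusion.sum() - np.trace(confusion)) / confusion.sum())

# 1-NN, five-feature set, transcribed from Table 3:
c = np.array([[60,  4, 0],
              [ 7, 11, 4],
              [ 1,  1, 2]])
print(round(error_rate(c), 3))  # 0.189, matching the reported value
```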