This code performs fruit classification based on colour and shape using a Naive Bayes classifier. Two steps are needed to obtain results: training and testing. In training, you click the Training Data button, and the program trains on the images already placed in the training folder. In testing, you simply select a fruit image, and the program compares it with the trained fruits and then shows the results.
The main window has two parts. The first is the image training panel, where the fruits are trained and the grayscale and binary images are produced. The second is the testing panel, which shows two results: at the bottom left there are four fields for numbers (three for the RGB colour values and one for the diameter) plus one field for the true name of the fruit, and the second result is the probability answer. Figure (3) shows the test results for an apple after training on the whole set of fruits.
Figure (1) – Main Window
About the Code:
There are two main parts of the code. In the first, we train on the fruits and extract the features we need to recognize each fruit at the test stage. At the test stage, we use the small library of features built in the first stage to display the true answer, and then apply the Naive Bayes classifier algorithm to guess the kind of fruit.
When we click the Training Data button, the program extracts the features from the fruits and produces the grayscale and binary images, as in figure (2).
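The feature-extraction step can be sketched as follows. This is a hedged illustration, not the original program's code: the luminance weights, the background threshold, and the circle-area approximation of the diameter are all assumptions.

```python
import numpy as np

def extract_features(rgb):
    """Extract the colour and shape features used for training.

    rgb: H x W x 3 uint8 array; background pixels are assumed near-black.
    Returns (mean_r, mean_g, mean_b, diameter_in_pixels).
    """
    # Grayscale image via the usual luminance weights.
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    # Binary image: fruit pixels vs. dark background (threshold is an assumption).
    mask = gray > 40
    # Mean RGB over the fruit region only.
    mean_rgb = [float(rgb[..., c][mask].mean()) for c in range(3)]
    # Diameter approximated from the mask area, treating the fruit as a circle.
    diameter = 2.0 * np.sqrt(mask.sum() / np.pi)
    return (*mean_rgb, diameter)

# Synthetic "apple" for demonstration: a red disc on a black background.
yy, xx = np.mgrid[:100, :100]
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[(yy - 50) ** 2 + (xx - 50) ** 2 <= 30 ** 2] = (200, 30, 30)

r, g, b, d = extract_features(img)
print(r, g, b, round(d, 1))  # mean RGB (200, 30, 30), diameter close to 60 px
```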
Figure (2) – Training the Fruits
After that we test some kind of fruit, say an apple. After clicking the Test button, the program extracts the information from the fruit image, compares it with the library built by the first part of the code, and shows the real name of the fruit along with other information about its colours. The guessed name of the fruit appears in the last field of the window, as in figure (3).
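The comparison against the trained library can be sketched with a Gaussian naive Bayes scoring rule. The library contents below (per-class means and variances of the R, G, B and diameter features) are hypothetical values, and the uniform class prior is an assumption.

```python
import math

# Hypothetical feature library produced by the training step:
# per class, the mean and variance of each feature (R, G, B, diameter).
library = {
    "apple":  {"mean": [190.0, 40.0, 40.0, 60.0], "var": [100.0, 60.0, 60.0, 16.0]},
    "banana": {"mean": [210.0, 200.0, 60.0, 35.0], "var": [90.0, 90.0, 70.0, 9.0]},
}

def log_gaussian(x, mu, var):
    """Log of a univariate Gaussian density."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def classify(features):
    """Pick the class maximizing the log naive Bayes score.

    Features are assumed conditionally independent given the class,
    with a uniform prior over classes.
    """
    scores = {}
    for name, stats in library.items():
        scores[name] = sum(
            log_gaussian(x, mu, var)
            for x, mu, var in zip(features, stats["mean"], stats["var"])
        )
    return max(scores, key=scores.get)

print(classify([195.0, 35.0, 45.0, 58.0]))  # → apple
```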
Figure (3) – Results
As we can see, the guessed result is exactly correct.
About the Classification Method:
Pattern recognition is a branch of machine learning that focuses on the recognition of patterns and regularities in data, although it is in some cases considered to be almost synonymous with machine learning. Pattern recognition systems are in many cases trained from labeled "training" data (supervised learning), but when no labeled data are available, other algorithms can be used to discover previously unknown patterns (unsupervised learning).
Naive Bayes classifier
Naive Bayes classifiers are a family of simple probabilistic classifiers based on applying Bayes' theorem with strong (naive) independence assumptions between the features.
Naive Bayes has been studied extensively since the 1950s. It was introduced under a different name into the text retrieval community in the early 1960s and remains a popular (baseline) method for text categorization: the problem of judging documents as belonging to one category or another (such as spam or legitimate, sports or politics, etc.) with word frequencies as the features. With suitable pre-processing, it is competitive in this domain with more advanced methods, including support vector machines. It also finds application in automatic medical diagnosis.
Naive Bayes classifiers are highly scalable, requiring a number of parameters linear in the number of variables (features/predictors) in a learning problem. Maximum-likelihood training can be done by evaluating a closed-form expression, which takes linear time, rather than by the expensive iterative approximation used for many other types of classifiers. In the statistics and computer science literature, naive Bayes models are known under a variety of names, including simple Bayes and independence Bayes. All these names reference the use of Bayes' theorem in the classifier's decision rule, but naive Bayes is not (necessarily) a Bayesian method.
Naive Bayes is a simple technique for constructing classifiers: models that assign class labels to problem instances, represented as vectors of feature values, where the class labels are drawn from some finite set. It is not a single algorithm for training such classifiers, but a family of algorithms based on a common principle: all naive Bayes classifiers assume that the value of a particular feature is independent of the value of any other feature, given the class variable. For example, a fruit may be considered to be an apple if it is red, round, and about x cm in diameter.
A naive Bayes classifier considers each of these features to contribute independently to the probability that this fruit is an apple, regardless of any possible correlations between the colour, roundness, and diameter features.
For some types of probability models, naive Bayes classifiers can be trained very efficiently in a supervised learning setting. In many practical applications, parameter estimation for naive Bayes models uses the method of maximum likelihood; in other words, one can work with the naive Bayes model without accepting Bayesian probability or using any Bayesian methods.
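The closed-form maximum-likelihood fit mentioned above amounts, for a Gaussian naive Bayes model, to computing per-class, per-feature means and variances plus class priors; no iterative optimization is needed. A minimal sketch with made-up toy data:

```python
import statistics

# Toy training data: (feature vector, label). The feature values here are
# purely illustrative, not real measurements.
samples = [
    ([190.0, 42.0], "apple"),
    ([200.0, 38.0], "apple"),
    ([180.0, 46.0], "apple"),
    ([215.0, 198.0], "banana"),
    ([205.0, 202.0], "banana"),
]

def fit(samples):
    """Closed-form ML estimates: class prior, feature means and variances."""
    params = {}
    for label in {lab for _, lab in samples}:
        rows = [f for f, lab in samples if lab == label]
        cols = list(zip(*rows))  # transpose: one sequence per feature
        params[label] = {
            "prior": len(rows) / len(samples),
            "mean": [statistics.mean(c) for c in cols],
            "var": [statistics.variance(c) for c in cols],
        }
    return params

params = fit(samples)
print(params["apple"]["mean"])  # → [190.0, 42.0]
```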
Despite their naive design and apparently oversimplified assumptions, naive Bayes classifiers have worked quite well in many complex real-world situations. In 2004, an analysis of the Bayesian classification problem showed that there are sound theoretical reasons for the apparently implausible efficacy of naive Bayes classifiers. Still, a comprehensive comparison with other classification algorithms in 2006 showed that Bayes classification is outperformed by other approaches, such as boosted trees and random forests.
An advantage of naive Bayes is that it only requires a small amount of training data to estimate the parameters necessary for classification. The Naive Bayes classifier is a statistical classification method. It is based on Bayes' theorem of conditional probabilities:

P(A|B) = P(B|A) · P(A) / P(B)
The goal of this classifier is to find two probabilities:
P(S|D) – the probability that a given document D belongs to the class S, i.e. spam.
P(¬S|D) – the probability that a given document D does not belong to the class S, i.e. is not spam.
If P(S|D) is greater, the document is classified as spam; otherwise it is classified as ham.
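This spam/ham comparison can be sketched with word frequencies as features. Everything below is a made-up illustration: the vocabulary, the per-class word probabilities, and the spam prior are all assumed values, and log-probabilities are used to avoid numerical underflow.

```python
import math

# Hypothetical per-class word likelihoods: word -> (P(word|spam), P(word|ham)).
word_probs = {
    "offer": (0.30, 0.02),
    "meeting": (0.01, 0.20),
    "free": (0.25, 0.05),
}
p_spam = 0.4  # assumed prior P(S)

def is_spam(words):
    """Compare log P(S|D) against log P(not S|D) under the naive assumption
    that words occur independently given the class."""
    log_s = math.log(p_spam)
    log_h = math.log(1 - p_spam)
    for w in words:
        if w in word_probs:
            ps, ph = word_probs[w]
            log_s += math.log(ps)
            log_h += math.log(ph)
    return log_s > log_h

print(is_spam(["free", "offer"]))  # → True
print(is_spam(["meeting"]))        # → False
```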