
Analysis of Different Ways for Improving the Speed and Accuracy of Image Classification
Abstract: Today, the focus of machine learning algorithms is on learning features from unlabeled data. As the size and complexity of datasets grow, the speed and accuracy of the learning algorithm become increasingly important. There are different methods and classifiers for image classification. The support vector machine (SVM) is one of the most widely used algorithms for image classification, but the time taken for image classification with SVM is large. To obtain faster results, the use of GPUs is very important: with the help of GPUs, the training and image classification time is reduced. SVM is mainly used for classification of data, and it constructs hyperplanes separating the different labels. Another method for image classification is the Extreme Learning Machine (ELM), which contains only three layers: one input layer, one hidden layer and one output layer. This paper analyzes two classifiers, SVM and ELM, for image classification, describes the methodologies to implement those classifiers, and ends with a comparison between them.
Keywords: High Performance Computing, Unsupervised Feature Learning (UFL), Extreme Learning Machine (ELM), Radial Basis Function (RBF), Support Vector Machine (SVM).
1 Introduction
There are different ways to classify images, such as minimum distance, maximum likelihood, neural networks and support vector machines. There are also unsupervised classifiers based on clustering algorithms, such as K-Means, k-NN, K-Medoid and ISODATA [1]. For image classification using a neural network, applying the appropriate classification technique is very important to obtain faster results. The kernel-based support vector machine (SVM) is an effective technique for categorizing images, and this classifier is used in many applications, such as recognition in remote sensing. However, an individual classifier on its own gives only average results, and as the size of the dataset increases, the time required for classification also increases. To get better results with large datasets, there is now a trend toward combining multiple classifiers, for example neural network classifiers together with an SVM classifier.
The aim of this paper is therefore to analyze different combinations of classifiers and their use for image classification.
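As a point of reference for the SVM baseline discussed above, the following is a minimal sketch (not the implementation of any of the cited papers) of training a kernel SVM on pre-extracted image feature vectors, assuming scikit-learn is available; the feature arrays and parameter values here are placeholders.

```python
# Minimal kernel-SVM baseline sketch (illustrative only).
# Assumes X_train, X_test are (n_samples, n_features) feature arrays
# and y_train holds integer class labels.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# RBF-kernel SVM; C and gamma are placeholder values, not tuned.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))

X_train = np.random.rand(100, 64)          # stand-in feature vectors
y_train = np.random.randint(0, 10, 100)    # stand-in labels
X_test = np.random.rand(20, 64)

clf.fit(X_train, y_train)                  # training time grows quickly with dataset size
y_pred = clf.predict(X_test)               # per-sample prediction
```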

2 Literature Survey


According to Le Hoang Thai, Tran Son Hai and Nguyen Thanh Thuy [1], ANN and SVM used together produce much better classification results in terms of speed and accuracy. The paper treats image feature extraction as the fundamental step in image classification. The classification technique consists of two layers: the first layer consists of k ANNs that give classification results based on the feature vector, while the second layer collects all the results from the first layer and uses an SVM classifier to integrate them into the final classification result.
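One rough way to approximate this two-layer design with off-the-shelf tools is a stacking ensemble in which several small neural networks form the first layer and an SVM integrates their outputs. The sketch below uses scikit-learn; the number of ANNs, their sizes and all parameters are illustrative assumptions, not those of the cited paper.

```python
# Sketch of a two-stage classifier: several ANNs in the first layer,
# an SVM integrating their outputs in the second layer.
import numpy as np
from sklearn.ensemble import StackingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# First layer: k small ANNs (k = 3 here, purely illustrative).
ann_layer = [
    (f"ann{i}", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=i))
    for i in range(3)
]

# Second layer: an SVM combines the ANN predictions into the final result.
two_stage = StackingClassifier(estimators=ann_layer, final_estimator=SVC(kernel="rbf"))

X = np.random.rand(200, 64)               # stand-in feature vectors
y = np.random.randint(0, 5, 200)          # stand-in labels
two_stage.fit(X, y)
print(two_stage.predict(X[:5]))
```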

Mahmood Yousefi-Azar and Mark D. McDonnell [2] combine supervised and unsupervised techniques to cluster images. Their approach uses the k-means clustering algorithm and is not restricted to RGB colors; it also works with the Lab color representation. The combination of an unsupervised feature learning algorithm with an extreme learning machine outperforms other traditional methods.

According to Dao Lam and Donald Wunsch [3], UFL-ELM classification gives better results than SVM and other approaches. In UFL-ELM the features are extracted from the data itself rather than hand-crafted as in traditional methods, and the classifier is then trained using ELM to obtain the desired solution. This method is easy to use and speeds up the training of the data.

Zuo Bai, Guang-Bin Huang, Danwei Wang, Han Wang and M. Brandon Westover [4] observe that traditional classification methods require large storage space and testing time; to reduce them, a sparse ELM method is developed. In this method a new algorithm is developed for efficient training of the data, because of which the time and complexity are reduced significantly. This sparse ELM gives a faster training speed than other methods.

Dao Lam and Donald Wunsch [5] suggested a better and faster way to perform image classification. An unsupervised feature learning algorithm is used for learning the features, and RBF-ELM is used for the subsequent classification of the data: once the features are derived by the algorithm, they are fed to the RBF-ELM. This approach gives better results, but to improve the training and testing time a parallel approach is suggested, implemented as a CUDA kernel. With the help of the CUDA kernel it gives results about 20 times faster than a CPU and other parallel approaches.

3 Methodology
There are different methodologies for classifying images using a neural network architecture, such as SVM, Sparse-ELM and UFL-ELM, but the most promising methodology for image classification is RBF-ELM combined with a parallel architecture such as a CUDA kernel.
The main reason to use UFL is that it gives far better results than traditional methods. A classifier gives better results only when it has a lot of data for training and testing. This methodology uses a large amount of data for training and testing, and it also uses a GPU architecture for more speed and accuracy in the results [5].
The first task of image classification is to input the unlabeled image dataset and derive features from it. For deriving the features a well-known UFL algorithm is used, namely k-means UFL. Deriving the features requires extracting patches from the dataset; after extraction the patches are preprocessed, and then the k-means algorithm is applied to obtain the centroids [9].
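As a concrete illustration of this pipeline, the following is a minimal sketch of k-means UFL in the spirit of Coates et al. [9]: patch extraction, per-patch normalization and centroid learning. The patch size, number of patches and number of centroids are placeholder values, and the k-means loop is a plain NumPy version rather than an optimized implementation.

```python
# Sketch of k-means unsupervised feature learning: random patches are
# extracted from unlabeled images, normalized, and clustered into centroids.
import numpy as np

def extract_patches(images, patch=6, n_patches=10000, rng=np.random.default_rng(0)):
    """images: (N, H, W, C) array of unlabeled images."""
    N, H, W, C = images.shape
    out = np.empty((n_patches, patch * patch * C))
    for p in range(n_patches):
        n = rng.integers(N)
        r, c = rng.integers(H - patch + 1), rng.integers(W - patch + 1)
        out[p] = images[n, r:r + patch, c:c + patch].ravel()
    return out

def normalize_patches(P, eps=10.0):
    """Per-patch brightness/contrast normalization before clustering."""
    P = P - P.mean(axis=1, keepdims=True)
    return P / np.sqrt(P.var(axis=1, keepdims=True) + eps)

def kmeans_centroids(P, k=100, iters=10, rng=np.random.default_rng(0)):
    """Plain Lloyd's iterations; returns the k learned centroids."""
    C = P[rng.choice(len(P), k, replace=False)].copy()
    for _ in range(iters):
        d = (P * P).sum(1)[:, None] - 2 * P @ C.T + (C * C).sum(1)[None, :]
        labels = d.argmin(1)
        for j in range(k):
            members = P[labels == j]
            if len(members):
                C[j] = members.mean(0)
    return C

# Example: 500 random 32x32 RGB arrays stand in for an unlabeled image dataset.
images = np.random.rand(500, 32, 32, 3)
centroids = kmeans_centroids(normalize_patches(extract_patches(images)))
```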
The RBF-ELM algorithm is used to improve the classification performance: a radial basis function (such as a Gaussian) is used as the activation function of the hidden layer of the neural network. ELM uses only three layers to obtain the output: one input layer, a single hidden layer and one output layer. The input weights (hidden-layer centres) are assigned randomly depending on the dataset, and the output weights are computed from the hidden-layer output [3], [5].
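A minimal sketch of such an RBF-ELM follows, assuming randomly chosen training samples as hidden-layer centres, Gaussian activations and a regularized least-squares solve for the output weights; the hidden-layer size, width gamma and regularization strength are placeholder values, not those of the cited work.

```python
# Sketch of an RBF-ELM: random hidden-layer centres, Gaussian activations,
# output weights solved in closed form (no iterative training of the hidden layer).
import numpy as np

def rbf_hidden(X, centres, gamma):
    """Gaussian hidden-layer output H, shape (n_samples, n_hidden)."""
    sq = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def train_rbf_elm(X, T, n_hidden=200, gamma=0.1, reg=1e-3, rng=np.random.default_rng(0)):
    """X: (n, d) features; T: (n, n_classes) one-hot targets."""
    centres = X[rng.choice(len(X), n_hidden, replace=False)]   # random, not tuned
    H = rbf_hidden(X, centres, gamma)
    # Output weights: beta = (H^T H + reg*I)^-1 H^T T
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ T)
    return centres, beta

def predict(X, centres, beta, gamma=0.1):
    return rbf_hidden(X, centres, gamma) @ beta   # argmax over columns gives the class

# Tiny example with stand-in feature vectors and 5 classes.
X = np.random.rand(500, 64)
y = np.random.randint(0, 5, 500)
T = np.eye(5)[y]
centres, beta = train_rbf_elm(X, T)
pred = predict(X, centres, beta).argmax(1)
```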
With the help of this methodology image classification gives better results, but it still takes considerable time to train and test the data in the neural network. Using a GPU for image classification with a neural network gives far better results than other traditional methods. There are different mechanisms for parallelization: one is to use the multiple cores of the system, and the other is to use an explicit parallel programming architecture.
When using the CUDA architecture, memory management and choosing the right portion of the program to execute on the GPU are very important: detect the part of the program to be parallelized and then apply the proper parallelization technique to improve the performance. Finally, this CUDA-kernel RBF-ELM architecture for image classification gives results about 20 times faster than the other approaches [5].
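The natural parallelization target here is the hidden-layer computation. As one hedged illustration (not the implementation from [5]), the sketch below writes the Gaussian hidden-layer computation as a CUDA kernel using Numba, assuming the feature matrix X and the centres C fit in GPU memory.

```python
# Sketch of computing the RBF hidden-layer matrix on the GPU with a CUDA kernel
# (Numba); each thread fills one entry H[i, j] = exp(-gamma * ||X[i] - C[j]||^2).
import math
import numpy as np
from numba import cuda

@cuda.jit
def rbf_hidden_kernel(X, C, gamma, H):
    i, j = cuda.grid(2)
    if i < X.shape[0] and j < C.shape[0]:
        d = 0.0
        for k in range(X.shape[1]):
            diff = X[i, k] - C[j, k]
            d += diff * diff
        H[i, j] = math.exp(-gamma * d)

X = np.random.rand(1024, 64).astype(np.float32)   # stand-in feature vectors
C = np.random.rand(256, 64).astype(np.float32)    # stand-in hidden-layer centres
H = np.zeros((X.shape[0], C.shape[0]), dtype=np.float32)

threads = (16, 16)
blocks = ((X.shape[0] + 15) // 16, (C.shape[0] + 15) // 16)
d_X, d_C, d_H = cuda.to_device(X), cuda.to_device(C), cuda.to_device(H)
rbf_hidden_kernel[blocks, threads](d_X, d_C, np.float32(0.1), d_H)
H = d_H.copy_to_host()
```

Actual speedups depend on problem size and memory transfers; the 20x figure reported in [5] refers to that paper's specific implementation and hardware.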

4 Conclusion
This review paper analyzes different image classification techniques, briefly describes how they work, and identifies which technique is best among them. RBF-ELM, which uses only three layers (one input, one hidden and one output) with randomized input weights, gives faster results than traditional methods. There are popular algorithms such as SVM, but RBF-ELM with a CUDA kernel gives better performance with improved speed and accuracy.

References
1. Thai, Le Hoang, et al. "Image Classification Using Support Vector Machine and Artificial Neural Network." International Journal of Information Technology and Computer Science, no. 5, Feb. 2012, pp. 32-38.
2. Yousefi-Azar, Mahmood, and Mark D. McDonnell. "Semi-Supervised Convolutional Extreme Learning Machine." 2017 International Joint Conference on Neural Networks (IJCNN), 2017, pp. 1-7.
3. Lam, Dao, and Donald Wunsch. "Unsupervised Feature Learning Classification Using an Extreme Learning Machine." The 2013 International Joint Conference on Neural Networks (IJCNN), 2013.
4. Bai, Zuo, et al. "Sparse Extreme Learning Machine for Classification." IEEE Transactions on Cybernetics, no. 10, 2014.
5. Lam, Dao, and Donald Wunsch. "Unsupervised Feature Learning Classification With Radial Basis Function Extreme Learning Machine Using Graphic Processors." IEEE Transactions on Cybernetics, no. 1, 2017.
6. Huang, Guang-Bin, et al. "Extreme Learning Machine: A New Learning Scheme of Feedforward Neural Networks." 2004 IEEE International Joint Conference on Neural Networks, 2004.
7. Ranzato, Marc'Aurelio, et al. "Unsupervised Learning of Invariant Feature Hierarchies with Applications to Object Recognition." 2007 IEEE Conference on Computer Vision and Pattern Recognition, 2007.
8. Le, Quoc V. "Building High-Level Features Using Large Scale Unsupervised Learning." 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, 2013.
9. Coates, A., A. Y. Ng, and H. Lee. "An Analysis of Single-Layer Networks in Unsupervised Feature Learning." International Conference on Artificial Intelligence and Statistics, Fort Lauderdale, 2011, pp. 215-223.
