libsvm for MATLAB
libsvm is a great tool for SVM, as it is very easy to use and well documented. The libsvm package webpage is maintained by Chih-Chung Chang and Chih-Jen Lin of NTU, and can be found here. I made this tutorial as a reminder for myself when I need to use it again. All the credit goes to the libsvm developers. Here is how you can cite libsvm:
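(For convenience, the entry below is the standard citation listed on the libsvm site; double-check the site for the current version.)

@article{CC01a,
  author  = {Chang, Chih-Chung and Lin, Chih-Jen},
  title   = {{LIBSVM}: A library for support vector machines},
  journal = {ACM Transactions on Intelligent Systems and Technology},
  volume  = {2},
  number  = {3},
  pages   = {27:1--27:27},
  year    = {2011}
}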
Content

In this short tutorial, the following topics will be discussed:

- How to install libsvm for MATLAB on a Unix machine
- Linear-kernel SVM for binary classification
- Kernel SVM for binary classification
- Cross validation for C and gamma
- Multiclass SVM: one-vs-rest (OVR)
- More ready-to-use MATLAB examples
- Available MATLAB codes to download
Here is how to install the toolbox

Just read the README file in the package; it's very easy. You can do it both in a terminal and in the MATLAB workspace. On an Ubuntu machine, just make sure you have gcc on your machine. If not, you need to install it using the command below:

sudo apt-get install build-essential g++
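If you prefer to compile inside MATLAB instead, the package ships a make script in its matlab subdirectory; assuming the toolbox was unpacked to ../libsvm-3.12 as in the code below, the steps look roughly like this:

% compile the MEX interface from within MATLAB
cd ../libsvm-3.12/matlab
make   % builds the svmtrain, svmpredict, libsvmread, and libsvmwrite MEX files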
Basic SVM: Linear-kernel SVM for binary classification

Below is the first code to run. The code is for binary classification and uses c = 1, gamma (g) = 0.07, and '-b 1', which turns on the probability output.

% This code just simply runs the SVM on the example data set "heart_scale",
% which is scaled properly. The code divides the data into 2 parts:
%   train: 1 to 200
%   test: 201:270
% Then plot the results vs their true class. In order to visualize the high
% dimensional data, we apply MDS to the 13D data and reduce the dimension
% to 2D.

clear
clc
close all

% addpath to the libsvm toolbox
addpath('../libsvm-3.12/matlab');

% addpath to the data
dirData = '../libsvm-3.12';
addpath(dirData);

% read the data set
[heart_scale_label, heart_scale_inst] = libsvmread(fullfile(dirData,'heart_scale'));
[N D] = size(heart_scale_inst);

% Determine the train and test index
trainIndex = zeros(N,1); trainIndex(1:200) = 1;
testIndex = zeros(N,1); testIndex(201:N) = 1;
trainData = heart_scale_inst(trainIndex==1,:);
trainLabel = heart_scale_label(trainIndex==1,:);
testData = heart_scale_inst(testIndex==1,:);
testLabel = heart_scale_label(testIndex==1,:);

% Train the SVM
model = svmtrain(trainLabel, trainData, '-c 1 -g 0.07 -b 1');

% Use the SVM model to classify the data
[predict_label, accuracy, prob_values] = svmpredict(testLabel, testData, model, '-b 1'); % run the SVM model on the test data
% ================================
% ===== Showing the results ======
% ================================

% Assign color for each class
% colorList = generateColorList(2); % This is my own way to assign the color... don't worry about it
colorList = prism(100);

% true (ground truth) class
trueClassIndex = zeros(N,1);
trueClassIndex(heart_scale_label==1) = 1;
trueClassIndex(heart_scale_label==-1) = 2;
colorTrueClass = colorList(trueClassIndex,:);

% result class
resultClassIndex = zeros(length(predict_label),1);
resultClassIndex(predict_label==1) = 1;
resultClassIndex(predict_label==-1) = 2;
colorResultClass = colorList(resultClassIndex,:);

% Reduce the dimension from 13D to 2D
distanceMatrix = pdist(heart_scale_inst,'euclidean');
newCoor = mdscale(distanceMatrix,2);

% Plot the whole data set
x = newCoor(:,1);
y = newCoor(:,2);
patchSize = 30; % max(prob_values,[],2);
colorTrueClassPlot = colorTrueClass;
figure; scatter(x,y,patchSize,colorTrueClassPlot,'filled');
title('whole data set');

% Plot the test data
x = newCoor(testIndex==1,1);
y = newCoor(testIndex==1,2);
patchSize = 80*max(prob_values,[],2);
colorTrueClassPlot = colorTrueClass(testIndex==1,:);
figure; hold on;
scatter(x,y,2*patchSize,colorTrueClassPlot,'o','filled');
scatter(x,y,patchSize,colorResultClass,'o','filled');

% Plot the training set
x = newCoor(trainIndex==1,1);
y = newCoor(trainIndex==1,2);
patchSize = 30;
colorTrueClassPlot = colorTrueClass(trainIndex==1,:);
scatter(x,y,patchSize,colorTrueClassPlot,'o');
title('classification results');

The result shows:

optimization finished, #iter = 137
nu = 0.457422
obj = -76.730867, rho = 0.435233
nSV = 104, nBSV = 81
Total nSV = 104
Accuracy = 81.4286% (57/70) (classification)

The whole data set is plotted:
The classification results might look like this:

The unfilled markers represent data instances from the train set. The filled markers represent data instances from the test set; the fill color represents the class label assigned by the SVM, whereas the edge color represents the true (ground truth) label. The marker size of a test instance represents the probability that the instance is assigned its corresponding class label: the bigger, the more confident.
Kernel SVM for binary classification
Now let's apply some kernel to the SVM. We use almost the same code as before; the only exception is that the train data,
trainData, is replaced by its kernelized version [(1:200)' trainData*trainData'], and the test data, testData, is replaced by its kernelized version [(1:70)' testData*trainData'], as shown below.

% Train the SVM
model = svmtrain(trainLabel, [(1:200)' trainData*trainData'], '-c 1 -g 0.07 -b 1 -t 4');

% Use the SVM model to classify the data
[predict_label, accuracy, prob_values] = svmpredict(testLabel, [(1:70)' testData*trainData'], model, '-b 1'); % run the SVM model on the test data

The complete code can be found here. The resulting clusters are shown in the figures below.

'Linear' kernel:
optimization finished, #iter = 403796
nu = 0.335720
obj = -67.042781, rho = 1.252604
nSV = 74, nBSV = 60
Total nSV = 74
Accuracy = 85.7143% (60/70) (classification)

'Polynomial' kernel:
optimization finished, #iter = 102385
nu = 0.000001
obj = -0.000086, rho = 0.465342
nSV = 69, nBSV = 0
Total nSV = 69
Accuracy = 72.8571% (51/70) (classification)

'RBF' kernel:
optimization finished, #iter = 372
nu = 0.890000
obj = -97.594730, rho = 0.194414
nSV = 200, nBSV = 90
Total nSV = 200
Accuracy = 57.1429% (40/70) (classification)

'Sigmoid' kernel:
optimization finished, #iter = 90
nu = 0.870000
obj = -195.417169, rho = 0.999993
nSV = 174, nBSV = 174
Total nSV = 174
Accuracy = 60% (42/70) (classification)

'MLP' kernel:
optimization finished, #iter = 1247
nu = 0.352616
obj = -68.842421, rho = 0.552693
nSV = 77, nBSV = 63
Total nSV = 77
Accuracy = 82.8571% (58/70) (classification)
Linear kernel SVM: 85.7% accuracy
Polynomial kernel SVM: 72.86% accuracy
RBF kernel SVM: 57.14% accuracy
Sigmoid kernel SVM: 60% accuracy
MLP kernel SVM: 82.86% accuracy
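Note that the precomputed-kernel route ('-t 4') above is only one way to do this; the same kind of comparison can be sketched with libsvm's built-in kernels by just changing '-t' in the option string. A minimal sketch (reusing trainLabel/trainData/testLabel/testData from the earlier code; an MLP-style kernel has no built-in '-t' code and would still need the precomputed route):

% try libsvm's built-in kernels by switching the -t option
kernelName = {'linear','polynomial','RBF','sigmoid'};
for t = 0:3
    model = svmtrain(trainLabel, trainData, ['-q -c 1 -g 0.07 -t ', num2str(t)]);
    fprintf('--- %s kernel ---\n', kernelName{t+1});
    [predict_label, accuracy, dec_values] = svmpredict(testLabel, testData, model);
end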
Cross validation of C and gamma
The options for svmtrain:

n-fold cross validation: n must be >= 2

Usage: model = svmtrain(training_label_vector, training_instance_matrix, 'libsvm_options');
libsvm_options:
-s svm_type : set type of SVM (default 0)
    0 -- C-SVC
    1 -- nu-SVC
    2 -- one-class SVM
    3 -- epsilon-SVR
    4 -- nu-SVR
-t kernel_type : set type of kernel function (default 2)
    0 -- linear: u'*v
    1 -- polynomial: (gamma*u'*v + coef0)^degree
    2 -- radial basis function: exp(-gamma*|u-v|^2)
    3 -- sigmoid: tanh(gamma*u'*v + coef0)
    4 -- precomputed kernel (kernel values in training_instance_matrix)
-d degree : set degree in kernel function (default 3)
-g gamma : set gamma in kernel function (default 1/num_features)
-r coef0 : set coef0 in kernel function (default 0)
-c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)
-n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)
-p epsilon : set the epsilon in loss function of epsilon-SVR (default 0.1)
-m cachesize : set cache memory size in MB (default 100)
-e epsilon : set tolerance of termination criterion (default 0.001)
-h shrinking : whether to use the shrinking heuristics, 0 or 1 (default 1)
-b probability_estimates : whether to train a SVC or SVR model for probability estimates, 0 or 1 (default 0)
-wi weight : set the parameter C of class i to weight*C, for C-SVC (default 1)
-v n : n-fold cross validation mode
-q : quiet mode (no outputs)

In this example, we will use the option enforcing n-fold cross validation in svmtrain, which is simply done by putting '-v n' in the parameter section, where n denotes n-fold cross validation. Here is an example of using 3-fold cross validation:

param = ['-q -v 3 -c ', num2str(c), ' -g ', num2str(g)];
cv = svmtrain(trainLabel, trainData, param);

In the example below, I will show nested cross validation. First, we search for the optimal parameters (c and gamma) on a big scale; then the search space is narrowed down until satisfied. The results are compared with the first experiment, which does not use the optimal parameters. The full code can be found here.

Big-scale parameter search
Medium-scale parameter search
Small-scale parameter search
Accuracy = 84.29%, which is better than the 81.43% accuracy obtained with the not-really-optimal parameters c = 1 and gamma = 0.07 in the previous experiment.
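To make the nested (coarse-to-fine) search concrete, here is a minimal sketch of one level of the grid search using the '-v' option; the big/medium/small scales above just rerun this loop with log2c/log2g restricted to a neighborhood of the current best. The exact ranges here are illustrative, not the ones used in the linked code:

% one level of grid search over (C, gamma) using 3-fold cross validation
bestcv = 0;
for log2c = -5:2:15
    for log2g = -15:2:3
        param = ['-q -v 3 -c ', num2str(2^log2c), ' -g ', num2str(2^log2g)];
        cv = svmtrain(trainLabel, trainData, param);   % with -v, returns the CV accuracy
        if cv > bestcv
            bestcv = cv; bestc = 2^log2c; bestg = 2^log2g;
        end
    end
end
fprintf('best c = %g, g = %g, CV accuracy = %g%%\n', bestc, bestg, bestcv);
% a finer pass repeats this with log2c/log2g near log2(bestc), log2(bestg)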
Multiclass SVM
Naturally, SVM is a binary classification model; how can we use SVM in the multiclass scenario? In this example, we will show you how to do multiclass classification using libsvm. A simple strategy is to do binary classification one pair at a time. Here we will use the one-versus-rest (OVR) approach. In fact, we can just use the original codes (svmtrain and svmpredict) from the libsvm package to do the job by writing a "wrapper code" that calls the original code one pair at a time. The good news is that the libsvm tutorial page already provides a wrapper code to do so. Yes, we will just use it properly.

Just download the demo code from the end of this URL, which says:

[trainY trainX] = libsvmread('./dna.scale');
[testY testX] = libsvmread('./dna.scale.t');
model = ovrtrain(trainY, trainX, '-c 8 -g 4');
[pred ac decv] = ovrpredict(testY, testX, model);
fprintf('Accuracy = %g%%\n', ac * 100);

The codes ovrtrain and ovrpredict are the wrappers.
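For intuition, the wrappers are very thin. Below is a minimal sketch of what ovrtrain/ovrpredict do internally, following the wrapper posted on the libsvm page (the names here carry a "Sketch" suffix to make clear this is an illustration, not the actual downloaded files):

function model = ovrtrainSketch(y, x, cmd)
% train one binary SVM per class: class i vs. the rest
labelSet = unique(y);
models = cell(length(labelSet), 1);
for i = 1:length(labelSet)
    % relabel: 1 for class i, 0 for everything else
    models{i} = svmtrain(double(y == labelSet(i)), x, cmd);
end
model = struct('models', {models}, 'labelSet', labelSet);
end

function [pred, ac, decv] = ovrpredictSketch(y, x, model)
% evaluate every binary model and take the class with the largest decision value
nClass = length(model.labelSet);
decv = zeros(length(y), nClass);
for i = 1:nClass
    [~, ~, d] = svmpredict(double(y == model.labelSet(i)), x, model.models{i});
    % flip the sign if libsvm treated "rest" as the positive class
    decv(:, i) = d * (2*model.models{i}.Label(1) - 1);
end
[~, idx] = max(decv, [], 2);
pred = model.labelSet(idx);
ac = mean(pred == y);
end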
You can also do the cross validation from the demo code below, where get_cv_ac is again a wrapper code.

bestcv = 0;
for log2c = -1:2:3,
  for log2g = -4:2:1,
    cmd = ['-q -c ', num2str(2^log2c), ' -g ', num2str(2^log2g)];
    cv = get_cv_ac(trainY, trainX, cmd, 3);
    if (cv >= bestcv),
      bestcv = cv; bestc = 2^log2c; bestg = 2^log2g;
    end
    fprintf('%g %g %g (best c=%g, g=%g, rate=%g)\n', log2c, log2g, cv, bestc, bestg, bestcv);
  end
end
The fully implemented code can be found here. Results show (rows 1-2000: training set):

The one-vs-rest multiclass SVM results. Here we do parameter selection on the train set, yielding the accuracy for each class:

class 1: Accuracy = 94.3508% (1119/1186) (classification)
class 2: Accuracy = 95.4469% (1132/1186) (classification)
class 3: Accuracy = 94.1821% (1117/1186) (classification)
overall: Accuracy = 94.0135%

The best parameters are c = 8 and gamma = 0.0625. Note that when the parameters are not selected properly, say c = 8 and gamma = 4, the accuracy is as low as 60%. So parameter selection is really important!
More examples
You may find the following examples useful. Each code is built for a specific application, which the reader might find useful to download and tweak, just to save developing time.
- Big picture: In this scenario, I compiled an easy example to illustrate how to use SVM through the full process. The code contains:
  - data generation
  - determining train and test data sets
  - parameter selection using n-fold cross validation, both semi-manual and automatic approaches
  - training the SVM model using the one-versus-rest (OVR) approach
  - using the SVM model to classify the test set in OVR mode
  - making a confusion matrix to evaluate the results
  - showing the results in an informative way
  - displaying the decision boundary on the feature space

- Reporting results using n-fold cross validation: In case you have only 1 data set (i.e., there is no explicit train or test set), n-fold cross validation is a conventional way to assess a classifier. The overall accuracy is obtained by averaging the accuracy over each of the n folds of the cross validation. The observations are separated into n equal folds; the code uses n-1 folds to train the SVM model, which is then used to classify the remaining fold according to standard OVR. The code can be found here.

- Using multiclass OVR SVM with a kernel: So far I haven't shown the usage of OVR SVM with a specific kernel ('-t x'). In fact, you can add the kernel to any OVR code and it will work. The complete code can be found here. For parameter selection using cross validation, we use the code below to calculate the average accuracy cv. You can just add '-t x' to the code:
  cmd = ['-q -c ', num2str(2^log2c), ' -g ', num2str(2^log2g), ' -t 0'];
  cv = get_cv_ac(trainLabel, [(1:NTrain)' trainData*trainData'], cmd, Ncv);
  Training: just add '-t x' to the training code:

  bestParam = ['-q -c ', num2str(bestc), ' -g ', num2str(bestg), ' -t 0'];
  model = ovrtrainBot(trainLabel, [(1:NTrain)' trainData*trainData'], bestParam);

  Classification: the '-t x' is already included in the variable model, so you don't need to specify '-t x' again when classifying:

  [predict_label, accuracy, decis_values] = ovrpredictBot(testLabel, [(1:NTest)' testData*trainData'], model);
  [decis_value_winner, label_out] = max(decis_values, [], 2);

  However, I found that the code can be very slow in the parameter selection routine when the number of classes and the number of cross validation folds are big (e.g., Nclass = 10, Ncv = 3). I think the slow part might be caused by [(1:NTrain)' trainData*trainData'], which can be huge. Personally I like to use the default kernel (RBF), for which we don't need to make the kernel matrix X*X', which might contribute to a pretty quick speed.

- Complete example for classification using n-fold cross validation: This code works on a single data set where the train and test sets are combined within one single set. More details can be found here.

- Complete example for classification using train and test data sets separately: This code works on a data set where the train and test sets are separated; that is, train the model using the train set and use the model to classify the test set. More details can be found here.

- How to obtain the SVM weight vector w: Please see the example code and discussion from StackOverflow.
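For the last item, the gist of the StackOverflow recipe is that, for a linear kernel, the primal weight vector can be recovered from the model's support vectors and coefficients. A minimal sketch, assuming a model trained with '-t 0':

% recover w and b from a linear-kernel libsvm model, so that f(x) = x*w + b
w = model.SVs' * model.sv_coef;
b = -model.rho;
% libsvm orders classes by first appearance in the training labels; flip the
% sign if the first label seen was the negative class
if model.Label(1) == -1
    w = -w; b = -b;
end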
List of available matlab codes
Each entry below lists the code name, the type (binary/multiclass), the parameter selection method, whether the data are separated train/test or handled by n-fold cross validation, the kernel, the dataset, and a description.

demo_libsvm_test1.m
- type: binary; parameter selection: no, manually; data: separated; kernel: default (RBF); dataset: heart_scale
- description: This code shows the simple (perhaps simplest) usage of the libsvm package to train and classify. Very easy to understand. The code just simply runs the SVM on the example data set "heart_scale", which is scaled properly, and divides the data into 2 parts (train: 1 to 200, test: 201:270). It then plots the results vs their true class. In order to visualize the high-dimensional data, we apply MDS to the 13D data and reduce the dimension to 2D.

demo_libsvm_test2.m
- type: binary; parameter selection: no, manually; data: separated; kernel: specified; dataset: heart_scale
- description: Identical to _test1 except that it shows how to specify the kernel (e.g., '-t 4') in the code.

demo_libsvm_test3.m
- type: binary; parameter selection: semi-automatic, but the code is still not compact; data: separated; kernel: default; dataset: heart_scale
- description: Identical to _test1 except that it includes a routine searching for good parameters c and gamma.

demo_libsvm_test4.m
- type: multiclass, OVR; parameter selection: semi-automatic; data: separated; kernel: default; dataset: dna_scale
- description: This code shows how to use libsvm for the multiclass, more specifically one-vs-rest (OVR), scenario. For both training and classifying, we adopt the OVR wrapper codes posted on the libsvm website: 1. ovrtrain.m and 2. ovrpredict.m, respectively.

demo_libsvm_test5.m
- type: multiclass, OVR; parameter selection: multi-scale automatic but not perfect; data: separated; kernel: default; dataset: 10-class spiral
- description: Here both the train and test sets are generated from the 10-class spiral made available here. The data set is very intuitive. In this code, we also make a routine to determine the optimal parameters automatically: the user can guess an initial parameter, and the routine will keep improving it. Here we also modify the original train and classify functions a bit: 1. ovrtrainBot.m <- ovrtrain.m, 2. ovrpredictBot.m <- ovrpredict.m. Furthermore, the confusion matrix is shown in the results. We also plot the decision values in the feature space, just to give an idea of how the decision boundary looks.

demo_libsvm_test6.m
- type: multiclass, OVR; parameter selection: no, manually; data: leave-one-out n-fold cross validation; kernel: default; dataset: 10-class spiral
- description: In this code we illustrate how to perform classification using n-fold cross validation, which is a common methodology to use when the data set does not have explicit training and test sets. Such data sets usually come as a single set, and we need to separate them into n equal parts/folds. The leave-one-out n-fold cross validation classifies the observations in a fold k using the model trained from {all}-{k}, and repeats the process for all k. The user is required to separate the data into n folds by assigning a "run" label to each observation; observations with identical run numbers are grouped together into a fold. It is preferable to have observations from all the classes within each fold; in fact, assigning the run number to each observation randomly is fine as well.

demo_libsvm_test7.m
- type: multiclass, OVR; parameter selection: multi-scale automatic, quite perfect; data: separated; kernel: default and specific are fine here; dataset: 10-class spiral
- description: This code is developed based on _test5. What we add are: 1. a better automatic cross validation routine than _test5.m, 2. a kernel-specific code snippet. We found that being kernel-specific is much slower than using the default (without '-t x'). At this point, I prefer using the default kernel.

demo_libsvm_test8.m
- type: multiclass, OVR; parameter selection: multi-scale automatic, quite perfect; data: separated; kernel: default and specific are fine here; dataset: 10-class spiral
- description: The code is developed based on _test7. The improvement is that the automatic cross validation for parameter selection is made into a function, which is much more convenient. The function is automaticParameterSelection.

demo_libsvm_test9.m
- type: multiclass, OVR; parameter selection: multi-scale automatic, quite perfect; data: leave-one-out n-fold cross validation; kernel: default; dataset: 10-class spiral
- description: This code is an excellent example of complete code for classification using n-fold cross validation and automatic parameter selection. The code is developed based on _test8; the difference is that we put the n-fold classification (from _test6) into a function: classifyUsingCrossValidation.

demo_libsvm_test10.m
- type: multiclass, OVR; parameter selection: multi-scale automatic, quite perfect; data: separated; kernel: default; dataset: 10-class spiral
- description: This code is an excellent example of complete code for classification on a train/test-separated data set with automatic parameter selection. The code is developed based on _test8 and _test9.

demo_libsvm_test11.m
- type: multiclass, OVR; parameter selection: multi-scale automatic, quite perfect; data: separated; kernel: default and specific are fine here; dataset: 3-class ring
- description: This code is developed based on _test10, except that the code is made to work for any kernel. However, the results are not good at all, and the run time is not good either. We found a better way using multiclass pairwise SVM, which is the default multiclass SVM approach in the libsvm package. In the next version (_test12), we will test the pairwise SVM.

demo_libsvm_test12.m
- type: multiclass, pairwise (default method for multiclass in the libsvm package); parameter selection: multi-scale automatic, quite perfect; data: separated; kernel: default and specific kernels are fine here; dataset: 4-class spiral
- description: The code is developed based on _test11. I figured out that the functions svmtrain and svmpredict, as originally implemented in libsvm, support multiclass pairwise SVM. We don't even need to make the kernel matrix ourselves; all you need to do is just pick your kernel '-t x' and parameters '-c y -g z', and you will get the results. In this regard, I make another version of the parameter selection routine using cross validation, automaticParameterSelection2.m (only slightly different from automaticParameterSelection.m), which calls the n-fold cross validation classification routine svmNFoldCrossValidation.m. I would say this is the best code so far to run on a separated data set, as it provides the parameter selection routine and the train and classification routines. Very easy to follow.

demo_libsvm_test13.m
- type: multiclass, pairwise; parameter selection: multi-scale automatic, quite perfect; data: leave-one-out n-fold cross validation; kernel: default and specific kernels are fine here; dataset: 4-class ring
- description: The code is developed based on _test12. The only difference is that this code uses n-fold cross validation when classifying the "single" data set, i.e., the data set where both the train and test sets are combined together, as often found when the number of observations is limited. This is the best code to use on a single data set with n-fold cross validation classification.

All the code can be found in the zip file here.
Subpages (8): Classify using n-fold cross validation; Complete example; Complete example for classification using n-fold cross validation; Complete example on classification in the most general scenario; demo_libsvm_crossvalidation.m; demo_libsvm_kernel.m; demo_libsvm_ovr_multiclass.m; Use Multiclass OVR SVM with cross validation and kernel specification
Attachments: libsvm-3.12.zip (840k), use_libsvm.zip (1597k)