From: David Cohen on 10 Oct 2009 08:11

Hello all,

I have built a neural network (RBF) based on data I have gathered, using SVMTrain, and I have also tested the results from SVMTrain on other data sets and got satisfactory results.
Now I would like to take the SVMStruct I got back from the SVMTrain procedure and extract the information from it in order to build an approximation function.
How can this be done?

Since my data consists of 4 input parameters, I expect to get some function of those parameters with coefficients and weights. Am I correct?

Any help will be greatly appreciated.
From: Bruno Luong on 10 Oct 2009 11:31

"David Cohen" <dudu.cohen(a)gmail.com> wrote in message <haptkl$k6q$1(a)fred.mathworks.com>...
> Hello all,
>
> I have built a neural network (RBF) based on data I have gathered, using SVMTrain, and I have also tested the results from SVMTrain on other data sets and got satisfactory results.
> Now I would like to take the SVMStruct I got back from the SVMTrain procedure and extract the information from it in order to build an approximation function.
> How can this be done?

Do you mean you want to know what formula svmclassify uses?

For any vector x, the class is the sign of the algebraic distance:

c(x) = sign(d(x))

The algebraic distance d(x) of x from the hyperplane (in the *feature* space), scaled by the margin, is:

d(x) = sum K(xi,x)*(alphai.*yi) - 0.5*sum K(xi,s)*(alphai.*yi)

where
  xi      are the training points
  s       is (any) support vector
  alphai  are the dual variables
  K(.,.)  is the kernel
  x       is the point to classify

Notes:
- Matlab returns (alphai.*yi) after training.
- The second sum, which uses *one* support vector in the formula, can be calculated more accurately by replacing it with the mean of the same quantity over *all* support vectors.

But why not simply use svmclassify?

Bruno
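To make the formula concrete, here is a minimal Matlab sketch of d(x). The variable names (Xtrain, alpha_y, sv_idx, sigma) are placeholders rather than toolbox outputs, and the Gaussian (RBF) kernel is only an example; substitute whatever kernel your training actually used.

% Assumed: Xtrain is n-by-4 (training points, one per row),
% alpha_y is n-by-1 with alpha_y(i) = alphai*yi as returned by training,
% sv_idx holds the indices of the support vectors (rows with nonzero alpha_y),
% x is the 1-by-4 point to classify, sigma is the RBF width.
K = @(a,b) exp(-sum((a-b).^2)/(2*sigma^2));    % example RBF kernel

dx = 0;                                        % first sum: sum_i K(xi,x)*(alphai*yi)
for i = 1:size(Xtrain,1)
    dx = dx + alpha_y(i)*K(Xtrain(i,:), x);
end

s  = Xtrain(sv_idx(1),:);                      % one (arbitrary) support vector
ds = 0;                                        % second sum: sum_i K(xi,s)*(alphai*yi)
for i = 1:size(Xtrain,1)
    ds = ds + alpha_y(i)*K(Xtrain(i,:), s);
end

d = dx - 0.5*ds;                               % algebraic distance
c = sign(d);                                   % predicted class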
From: David Cohen on 10 Oct 2009 12:19

Thank you Bruno,

I am not using svmclassify because this has to be implemented in C++ code as part of a larger project. I just use SVMTrain in order to find out the formula I'll need.

I still don't exactly understand which formula to extract. Can I use only one support vector (any one)? What exactly is K? I have used the default SVMTrain parameters:

[AlphaY, SVs, Bias, Parameters, nSV, nLabel] = SVMTrain(Samples, Labels);
From: Bruno Luong on 10 Oct 2009 14:02

"David Cohen" <dudu.cohen(a)gmail.com> wrote in message <haqc5l$cab$1(a)fred.mathworks.com>...
> Thank you Bruno,
>
> I am not using svmclassify because this has to be implemented in C++ code as part of a larger project. I just use SVMTrain in order to find out the formula I'll need.
>
> I still don't exactly understand which formula to extract. Can I use only one support vector (any one)? What exactly is K? I have used the default SVMTrain parameters.

K(.,.) is the kernel you are using to train. Look at the documentation to see which one is used by default.

As I said, any support vector would do, but it's better to use all of them.

Bruno
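Putting the two replies together, a sketch of the "use all support vectors" variant, written against the output names of the SVMTrain call above, might look like the following. The assumed layout (SVs with one support vector per row, AlphaY(j) = alpha_j*y_j) and the RBF kernel are guesses about the toolbox, not documented facts; verify them by comparing against svmclassify on a few known points before porting to C++.

% Assumed: SVs is nSV-by-4, AlphaY is nSV-by-1 with AlphaY(j) = alpha_j*y_j,
% x is the 1-by-4 point to classify, sigma is the RBF width used in training.
K = @(a,b) exp(-sum((a-b).^2)/(2*sigma^2));

dx = 0;                                  % first sum; points that are not support
for j = 1:size(SVs,1)                    % vectors have alpha = 0, so summing over
    dx = dx + AlphaY(j)*K(SVs(j,:), x);  % the support vectors alone suffices
end

ds = zeros(size(SVs,1),1);               % second sum, evaluated at every support vector
for k = 1:size(SVs,1)
    for j = 1:size(SVs,1)
        ds(k) = ds(k) + AlphaY(j)*K(SVs(j,:), SVs(k,:));
    end
end

d = dx - 0.5*mean(ds);                   % mean over all support vectors, as suggested
class_of_x = sign(d);
% If you confirm the sign convention of the returned Bias for your toolbox,
% you can also compare this against sign(dx + Bias).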