From: Charles on 28 Apr 2010 17:49

Hi guys,

I have a pattern recognition problem that I want to solve. My data come in pairs; a simplified example is A=[1:10]; B=[10:20]. There are many such A's and B's, i.e. for each A there is a corresponding B, and there exists a relationship between each pair of A and B.

What I want is to use the data pairs [A(i), B(i)] as a training set for a supervised learning algorithm, so that at the end, if I give the algorithm a new A(j), I receive an output B(j) that approximates the true value, based on the result of the training.

Please, I am not the best programmer around, and I would appreciate it if there is a Matlab function that will accomplish this task, or if someone could provide me with code. Thanks guys for the job you are doing helping people out.

Charles.
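A minimal sketch of the kind of supervised fit being asked about, assuming the A-to-B relationship can be approximated by a low-order polynomial (the training pairs and the underlying relationship below are made up for illustration; only base MATLAB's polyfit/polyval are used):

```matlab
% Hypothetical training pairs: one B(i) for each A(i).
A = 1:10;
B = 2*A + 5;             % assumed underlying relationship (for the example)

% "Training": fit a polynomial model B ~ polyval(p, A).
p = polyfit(A, B, 1);    % degree 1 here; raise the degree for curved data

% "Prediction": approximate B(j) for a new input A(j).
Aj = 11;
Bj = polyval(p, Aj);     % gives 27 for this exactly linear example
```

For relationships too complex for a polynomial, the Neural Network Toolbox offers genuinely supervised learners (e.g. a feed-forward network created with newff and fitted with train), but for smooth one-to-one mappings a simple fit like the above is often enough.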