%%%%%% LOAD DATA %%%%%%%
matrix = (load('podaci_564.mat').data)';
input = matrix(:, 1:2);
output = matrix(:,3);
negative = input(output == 0, :);
positive = input(output == 1, :);
%a - visualize data classes
figure('Name', 'Data classes');
%class 0 - red crosses
plot(negative(:,1), negative(:,2), 'r+');
hold on;
%class 1 - blue circles
plot(positive(:,1), positive(:,2), 'bo');
%pause;
hold off;
%NOTE - the first 450 data samples are class 0, the next 114 are class 1,
%so shuffle before splitting off the test set
%b - make test dataset (20% of original size)
shuffled_idx = randperm(length(input));
test_size = round(length(input) * 0.2);
test_input = input(shuffled_idx(1:test_size), :);
test_output = output(shuffled_idx(1:test_size), :);
%neural network creation
first_input_col = input(:,1)';
second_input_col = input(:,2)';
Pr = minmax([first_input_col; second_input_col]);
Ni_1 = [2 3];
Ni_2 = [2 3 1];
%the output layer size is included in Ni_2 along with the hidden layers
%trainlm can run out of memory. Possible workarounds:
%1. Keep trainlm but reduce memory use (at the cost of speed) by setting net.trainParam.mem_reduc to 2 or more (see help trainlm).
%2. Use trainbfg, slower but more memory-efficient than trainlm - not available here.
%3. Use trainrp, slower but more memory-efficient than trainbfg - not available here.
net = newff(Pr, Ni_2, {"tansig","purelin","logsig"}, "trainlm", "learngdm", "mse");
net.trainParam.mem_reduc = 2;
net.trainParam.mu = 0.0001;
%Y = sim(net, input');
%plot(input', output', input', Y, 'o');
%disp(net.inputs{1}.size);
net.trainParam.epochs = 50;
net = train(net, input', output');
%{
Y = sim(net, input');
plot(input', output', input', Y, 'o');
%}
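The test set built earlier (test_input, test_output) is never used in the paste. A minimal evaluation sketch, assuming the trained net and test variables above exist, and assuming the logsig output is thresholded at 0.5 for a binary class decision:

```octave
% Evaluate the trained network on the held-out test set.
Y_test = sim(net, test_input');                      % network outputs, 1 x Ntest
predicted = Y_test >= 0.5;                           % threshold logsig output at 0.5
accuracy = mean(predicted(:) == test_output(:));     % fraction of correct labels
printf('Test accuracy: %.2f%%\n', 100 * accuracy);
```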