% Untitled | subasah | May 2nd, 2020
%clc;clear
%https://www.mathworks.com/help/deeplearning/examples/transfer-learning-using-alexnet.html
%AlexNet, ResNet, DarkNet importer, GoogLeNet, CNN, TensorFlow and Keras models
%deepNetworkDesigner
digitDatasetPath = fullfile('C:\Users\UTStudent\Desktop\salari\salari-project-here\skin-cancer-dataset');

%The number of classes is determined by the number of subfolders in this
%directory; each subfolder name becomes a class label.
imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders',true,'LabelSource','foldernames');

%AlexNet expects inputs of size [227 227 3] (inputSize = net.Layers(1).InputSize).
%Resize every image to that size when it is read from disk.
imds.ReadFcn = @(loc) imresize(imread(loc),[227 227]);

%Divide the data into training and validation data sets.
%Use 70% of the images for training and 30% for validation.
%splitEachLabel splits the image datastore into two new datastores.
[imdsTrain,imdsValidation] = splitEachLabel(imds,0.7,'randomized');

%Load the pretrained AlexNet neural network. AlexNet is trained on more than
%one million images and can classify images into 1000 object categories, such
%as keyboard, mouse, pencil, and many animals.
%As a result, the model has learned rich feature representations for a wide range of images.
net = alexnet;

%The last three layers of the pretrained network net are configured for 1000 classes.
%These three layers must be fine-tuned for the new classification problem. Extract
%all layers, except the last three, from the pretrained network.
layersTransfer = net.Layers(1:end-3);
%Transfer the layers to the new classification task by replacing the
%last three layers with a fully connected layer, a softmax layer, and a
%classification output layer. Specify the options of the new fully connected
%layer according to the new data: set the fully connected layer to have the
%same size as the number of classes in the new data.
numClasses = numel(categories(imdsTrain.Labels));

%To learn faster in the new layers than in the transferred layers, increase
%the WeightLearnRateFactor and BiasLearnRateFactor values of the fully connected layer.

layers = [
    layersTransfer
    fullyConnectedLayer(numClasses,'WeightLearnRateFactor',20,'BiasLearnRateFactor',20)
    softmaxLayer
    classificationLayer];

options = trainingOptions('rmsprop', ...
    'MiniBatchSize',20, ...
    'MaxEpochs',6, ...
    'InitialLearnRate',1e-4, ...
    'Shuffle','every-epoch', ...
    'ValidationData',imdsValidation, ...
    'ValidationFrequency',3, ...
    'Verbose',false, ...
    'Plots','training-progress');
inputSize = net.Layers(1).InputSize;
%The network requires input images of size 227-by-227-by-3, but
%the images in the image datastore can have different sizes. Use an
%augmented image datastore to automatically resize the training images.
%Specify additional augmentation operations to perform on the training images:
%randomly flip the training images along the vertical axis, and randomly
%translate them up to 35 pixels horizontally and vertically. Data augmentation
%helps prevent the network from overfitting and memorizing the exact details
%of the training images.

pixelRange = [-35 35];
imageAugmenter = imageDataAugmenter( ...
    'RandXReflection',true, ...
    'RandXTranslation',pixelRange, ...
    'RandYTranslation',pixelRange);
augimdsTrain = augmentedImageDatastore(inputSize(1:2),imdsTrain, ...
    'DataAugmentation',imageAugmenter);
augimdsValidation = augmentedImageDatastore(inputSize(1:2),imdsValidation);

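%Optional sanity check (an addition, not part of the original script): read
%one augmented mini-batch and display it to confirm that the flips and
%translations look reasonable before spending time on training.
previewBatch = read(augimdsTrain);   %table whose 'input' column holds the images
montage(previewBatch.input)          %show the augmented mini-batch in one figure
reset(augimdsTrain)                  %rewind so training starts from the first batch
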
%Train the network that consists of the transferred and new layers.
%By default, trainNetwork uses a GPU if one is available (requires Parallel Computing
%Toolbox™ and a CUDA® enabled GPU with compute capability 3.0 or higher). Otherwise, it uses a CPU.
%You can also specify the execution environment by using the 'ExecutionEnvironment' name-value pair argument of trainingOptions.
netTransfer = trainNetwork(augimdsTrain,layers,options);

%Classify the validation images using the fine-tuned network.
[YPred,scores] = classify(netTransfer,augimdsValidation);
YValidation = imdsValidation.Labels;
accuracy = sum(YPred == YValidation)/numel(YValidation) %0.9934
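
%A single accuracy number can hide per-class errors; a confusion matrix shows
%which classes are mistaken for each other. (This chart is an addition, not
%part of the original script; confusionchart requires R2018b or later.)
figure
confusionchart(YValidation,YPred)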

%% Save the network. The saved model can then be loaded anywhere in MATLAB
%% and used to classify new skin-cancer images.
save('skin-cancer-model-alexnet.mat','netTransfer');

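%For later reuse, the saved model can be loaded and applied to a single new
%image like this (a sketch; 'new-lesion.jpg' is a hypothetical file name, and
%the image must be RGB and resized to AlexNet's 227-by-227 input).
loaded = load('skin-cancer-model-alexnet.mat');           %struct with field netTransfer
newImage = imresize(imread('new-lesion.jpg'),[227 227]);  %match the network input size
newLabel = classify(loaded.netTransfer,newImage)
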
%% Try to classify something else
idx = randperm(numel(imdsValidation.Files),4);
figure
for i = 1:4
    subplot(2,2,i)
    I = readimage(imdsValidation,idx(i));
    actualLabel = imdsValidation.Labels(idx(i));
    predictedLabel = classify(netTransfer,I);
    imshow(I)
    title(['Predicted: ' char(predictedLabel) ', Actual: ' char(actualLabel)])
end