started: working on the efficient net literature review
All checks were successful
continuous-integration/drone/push Build is passing
This commit is contained in:
parent 2223b3e302
commit e0d93ea366
main.bib | 8 ++++++++
@@ -251,3 +251,11 @@ year = 1998
 archivePrefix={arXiv},
 primaryClass={cs.CV}
 }
+@misc{efficient-net,
+  title={EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks},
+  author={Mingxing Tan and Quoc V. Le},
+  year={2020},
+  eprint={1905.11946},
+  archivePrefix={arXiv},
+  primaryClass={cs.LG}
+}
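The new entry is picked up by biblatex and cited in the body of the report. A minimal sketch of the usage, assuming the preamble already loads biblatex (the report's \printbibliography call suggests it does):

% In the text (as the new paragraph in this commit does):
EfficientNet \cite{efficient-net} is a deep convolutional neural network ...

% In the references section, already present in this report:
\printbibliography[heading=none]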
@@ -80,6 +80,7 @@
 \item Create a system to distribute the load of training the models among multiple services.
 \end{itemize}
 
+\pagebreak
 \section{Literature and Technical Review}
 This section reviews existing technologies in the market that perform image classification. It also reviews current image classification techniques that meet the requirements of the project. This review also analyses methods used to distribute the learning across multiple physical machines, and how to spread the load so that minimal reloading of the models is required when running them.
 
@@ -142,12 +143,12 @@
 % This needs some work in terms of grammar
 ResNet works by creating shortcuts between sets of layers; the shortcuts allow residual values from previous layers to be used in the upper layers. The hypothesis is that it is easier to optimize the residual mappings than the linear mappings.
 The results of the challenge proved that using the residual values improved the training of the model.
-It's important to note that using residual networks tends to give better, the deeper the model is. While this could have a negative impact in performance, the number of parameters per layer does not grow that steeply in ResNet when comparing it with other architectures as it uses other optimizations such as $1x1$ kernel sizes, which are more space efficient. Even with these optimizations, it can still achieve incredible results. Which might make it a good contender to be used in the service as one of the predefined models to use to try to create the machine learning models.
+It's important to note that residual networks tend to give better results the more layers the model has. While this could have a negative impact on performance, the number of parameters per layer does not grow as steeply in ResNet as in other architectures, since it uses optimizations such as $1 \times 1$ kernel sizes, which are more space efficient. Even with these optimizations, it can still achieve impressive results, which makes it a good contender to be one of the predefined models the service uses to create machine learning models.
 
 
-% RestNet-152
-% EddicientNet
+% EfficientNet
+EfficientNet \cite{efficient-net} is a deep convolutional neural network that was able to achieve $84.3\%$ top-1 accuracy while being ``$8.4$x smaller and $6.1$x faster on inference than the best existing ConvNet''. EfficientNets\footnote{The family of models that use the techniques described in \cite{efficient-net}.} are models that, instead of just increasing the depth or the width of the network, scale depth, width, and input resolution at the same time by a constant factor.
 
 
 % \subsection{Efficiency of transfer learning}
 
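To make the shortcut idea concrete, the following is a minimal residual block sketch in PyTorch. It is an illustration under assumptions rather than code from the project: the channel count stays fixed, and the batch normalisation and $1 \times 1$ projection shortcuts that full ResNet blocks use when shapes change are omitted.

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Simplified residual block: output = ReLU(x + F(x))."""

    def __init__(self, channels: int):
        super().__init__()
        # Two 3x3 convolutions form the residual mapping F(x).
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The shortcut adds the input back onto the block's output, so the
        # stacked layers only need to learn the residual F(x) = H(x) - x
        # rather than the full mapping H(x).
        return self.relu(x + self.conv2(self.relu(self.conv1(x))))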
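The compound scaling described in the new paragraph can be written out explicitly. In the EfficientNet paper, a single coefficient $\phi$ scales network depth, width, and input resolution together, with the base ratios $\alpha$, $\beta$, $\gamma$ fixed by a small grid search:

\begin{align*}
\text{depth: } & d = \alpha^{\phi}\\
\text{width: } & w = \beta^{\phi}\\
\text{resolution: } & r = \gamma^{\phi}\\
\text{subject to } & \alpha \cdot \beta^{2} \cdot \gamma^{2} \approx 2, \quad \alpha \geq 1,\ \beta \geq 1,\ \gamma \geq 1
\end{align*}

The constraint means each unit increase in $\phi$ roughly doubles the FLOPS, since cost grows linearly with depth but quadratically with width and resolution.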
@@ -162,7 +163,7 @@
 
 % There are also unsupervised learning methods that do not have a fixed number of classes. While this method would work as an expandable model method, it would not work for the purpose of this project. This project requires that the model has a specific set of labels, which does not work with unsupervised learning and its unlabelled data. Some techniques that are used for unsupervised learning might be useful in the process of creating expandable models.
 
-
+\pagebreak
 \section{Problem Analysis \& Design Choices}
 
 \subsection{Structure of the Service}
@@ -243,6 +244,7 @@
 \appendix
 \newpage
 
+\pagebreak
 \section{References}
 \printbibliography[heading=none]
 