More work on the literature review
All checks were successful
continuous-integration/drone/push Build is passing
This commit is contained in:
parent f50c55b748
commit 94ac62fefa
35 main.bib
@@ -193,4 +193,37 @@ year = 1998
pages={248--255},
year={2009},
organization={Ieee}
}

@article{resnet-152,
  author       = {Qilong Wang and
                  Banggu Wu and
                  Pengfei Zhu and
                  Peihua Li and
                  Wangmeng Zuo and
                  Qinghua Hu},
  title        = {ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks},
  journal      = {CoRR},
  volume       = {abs/1910.03151},
  year         = {2019},
  url          = {http://arxiv.org/abs/1910.03151},
  eprinttype   = {arXiv},
  eprint       = {1910.03151},
  timestamp    = {Mon, 04 Dec 2023 21:30:01 +0100},
  biburl       = {https://dblp.org/rec/journals/corr/abs-1910-03151.bib},
  bibsource    = {dblp computer science bibliography, https://dblp.org}
}

@article{efficientnet,
  author       = {Mingxing Tan and
                  Quoc V. Le},
  title        = {EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks},
  journal      = {CoRR},
  volume       = {abs/1905.11946},
  year         = {2019},
  url          = {http://arxiv.org/abs/1905.11946},
  eprinttype   = {arXiv},
  eprint       = {1905.11946},
  timestamp    = {Mon, 03 Jun 2019 13:42:33 +0200},
  biburl       = {https://dblp.org/rec/journals/corr/abs-1905-11946.bib},
  bibsource    = {dblp computer science bibliography, https://dblp.org}
}
@@ -123,11 +123,14 @@
The system will use supervised models to classify images, combining several types of models: neural networks, convolutional neural networks, deep neural networks and deep convolutional neural networks.

These types were chosen because they have had great success in other image classification challenges, for example the ImageNet challenge \cite{imagenet}, which has ranked a wide range of models on classifying 14 million images. The contest ran from 2010 to 2017.

The models that participated in the contest tended to use increasingly deep convolutional neural networks; of the many models produced, a few landmark models achieved high accuracies, including AlexNet \cite{krizhevsky2012imagenet}, VGG, ResNet-152 \cite{resnet-152} and EfficientNet \cite{efficientnet}.
% TODO find vgg to cite

These models can be used in two ways in the system: they can be used to generate the system's models via transfer learning, or their structure can serve as the basis for a completely new model.
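
To make the first of those two uses concrete, here is a minimal transfer-learning sketch in Python with TensorFlow/Keras. It is not taken from the report; the choice of EfficientNet-B0, the 224x224 input size and the num_classes value are illustrative assumptions.

# Transfer-learning sketch (illustrative assumptions, not the report's code):
# reuse ImageNet weights as a frozen feature extractor and train only a new
# classification head on the target dataset.
import tensorflow as tf

num_classes = 10  # hypothetical number of target classes

base = tf.keras.applications.EfficientNetB0(
    include_top=False, weights="imagenet", input_shape=(224, 224, 3)
)
base.trainable = False  # keep the pre-trained convolutional features fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, validation_split=0.1, epochs=5)

Fine-tuning could later unfreeze some of the base layers with a lower learning rate, but the frozen-base setup above is the simplest form of transfer learning.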

% TODO compare the models

\subsection{Creation Models}
The models that I will be creating will be Convolutional Neural Networks (CNNs) \cite{lecun1989handwritten,fukushima1980neocognitron}.
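
As a rough illustration of the kind of model this subsection refers to, the sketch below defines a small CNN in the same TensorFlow/Keras style as above; the layer sizes, input shape and number of classes are assumptions made for the example, not the architecture the report will actually use.

# Small CNN sketch (illustrative layer sizes, not the report's architecture):
# two convolution/pooling stages followed by a dense classifier head.
import tensorflow as tf

num_classes = 10  # hypothetical number of target classes

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(64, 64, 3)),
    tf.keras.layers.MaxPooling2D(),   # halve the spatial resolution
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.summary()  # prints the layer structure and parameter counts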