This commit is contained in:
parent 16aafe65b8
commit 7323be8dbc

13	.drone.yml

@@ -37,6 +37,17 @@ steps:
       - pdflatex --shell-escape report.tex
       - pdflatex --shell-escape report.tex
       - cd -

+  - name: Build worktable
+    commands:
+      - cd worktable
+      - pdflatex --shell-escape worktable.tex
+      # Prepare bib
+      - /usr/bin/vendor_perl/biber worktable
+      # Compile twice for the table of contents and for bib text
+      - pdflatex --shell-escape worktable.tex
+      - pdflatex --shell-escape worktable.tex
+      - cd -
 #
 # - name: Generate text
 #   commands:
@@ -51,7 +62,7 @@ steps:
       - tea login add --url https://git.andr3h3nriqu3s.com --token "$TOKEN"
       - tea r rm -y current || echo "Release not found"
       # - tea r c --title "Latest Report" --asset report/report.pdf --asset upds-1/UPDS12-1.pdf --asset upds-2/UPDS12-2.pdf --asset results.txt --asset poster/poster.pdf current
-      - tea r c --title "Latest Report" --asset projectsynopsis/project-synopsis.pdf --asset report/report.pdf current
+      - tea r c --title "Latest Report" --asset projectsynopsis/project-synopsis.pdf --asset report/report.pdf --asset worktable/worktable.pdf current

   - name: Remove current on failure
     environment:
@@ -146,15 +146,15 @@
 ResNet works by creating shortcuts between sets of layers; the shortcuts allow residual values from previous layers to be used in the upper layers. The hypothesis is that it is easier to optimize the residual mappings than the linear mappings.

 The results of the challenge proved that using the residual values improved the training of the model.
 It is important to note that residual networks tend to give better results the more layers the model has. While this could have a negative impact on performance, the number of parameters per layer does not grow as steeply in ResNet as in other architectures, because it uses optimizations such as $1 \times 1$ kernel sizes, which are more space efficient. Even with these optimizations it can still achieve impressive results, which might make it a good contender to be used in the service as one of the predefined models for creating the machine learning models.
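A minimal NumPy sketch of the shortcut idea described above (the function names, weight shapes, and values are illustrative, not taken from the ResNet paper): the block's input is added back onto its transformed output, so the layers only have to learn the residual.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    """Fully-connected residual block: computes relu(f(x) + x) instead of relu(f(x))."""
    out = relu(x @ w1)    # first transformation
    out = out @ w2        # second transformation (the residual mapping)
    return relu(out + x)  # shortcut: add the input back before the activation

# Toy usage: with w2 = 0 the residual mapping vanishes and the block
# passes the (rectified) input straight through the shortcut.
x = np.ones(4)
w1 = np.eye(4)
w2 = np.zeros((4, 4))
y = residual_block(x, w1, w2)
print(y)  # [1. 1. 1. 1.]
```

This is why deep stacks of such blocks remain trainable: if a block's weights contribute nothing, the identity shortcut still carries the signal through unchanged.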


+% MobileNet

 % EfficientNet
 EfficientNet \cite{efficient-net} is a deep convolutional neural network that was able to achieve $84.3\%$ top-1 accuracy while being ``$8.4$x smaller and $6.1$x faster on inference than the best existing ConvNet''. EfficientNets\footnote{The family of models that use the techniques described in \cite{efficient-net}.} are models that, instead of just increasing the depth or the width of the model, increase all of these parameters at the same time by a constant value. By not scaling only the depth, EfficientNets are able to acquire more information about the images, especially when the image size is taken into account.
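For reference, the compound scaling rule defined in \cite{efficient-net} ties the three scaling factors to a single user-chosen coefficient $\phi$:

```latex
% Compound scaling, as defined in the EfficientNet paper:
% a single coefficient \phi scales depth, width and resolution together.
\begin{align*}
\text{depth: } d = \alpha^{\phi}, \quad
\text{width: } w = \beta^{\phi}, \quad
\text{resolution: } r = \gamma^{\phi} \\
\text{s.t. } \alpha \cdot \beta^{2} \cdot \gamma^{2} \approx 2, \quad
\alpha \ge 1,\ \beta \ge 1,\ \gamma \ge 1
\end{align*}
```

The constraint keeps the total FLOPS of the scaled model at roughly $2^{\phi}$ times the baseline, since FLOPS grow linearly with depth but quadratically with width and resolution.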
 To test their results, the EfficientNet team created a baseline model which used the mobile inverted bottleneck, MBConv \cite{inverted-bottleneck-mobilenet}, as its building block. The baseline model was then scaled using the compound method, which resulted in better top-1 and top-5 accuracy.
 While EfficientNets are smaller than their non-EfficientNet counterparts, they are more computationally intensive: a ResNet-50 scaled using the EfficientNet compound scaling method is $3\%$ more computationally intensive than a ResNet-50 scaled using only depth, while improving the top-1 accuracy by only $0.7\%$. As the model will be trained and run multiple times, decreasing the computational cost might be a better overall target for sustainability than being able to offer higher accuracies.
 Even though scaling using the EfficientNet compound method might not yield the best results, using some of the EfficientNets that were optimized by the team would be optimal; for example, EfficientNet-B1 is both small and efficient while still obtaining $79.1\%$ top-1 accuracy on ImageNet, and realistically the datasets that this system will process will be smaller and more scope-specific than ImageNet.


 % \subsection{Efficiency of transfer learning}

 % \subsection{Creation Models}
80	worktable/worktable.tex  Normal file

@@ -0,0 +1,80 @@
+%%% Preamble
+\documentclass[11pt, a4paper]{article}
+
+\usepackage[english]{babel} % English language/hyphenation
+\usepackage{url}
+\usepackage{tabularx}
+\usepackage{pdfpages}
+\usepackage{float}
+
+\usepackage{graphicx}
+\graphicspath{ {../images for report/} }
+\usepackage[margin=2cm]{geometry}
+
+\usepackage{hyperref}
+\hypersetup{
+    colorlinks,
+    citecolor=black,
+    filecolor=black,
+    linkcolor=black,
+    urlcolor=black
+}
+
+\usepackage{cleveref}
+
+%%% Custom headers/footers (fancyhdr package)
+\usepackage{fancyhdr}
+\pagestyle{fancyplain}
+\fancyhead{} % No page header
+\fancyfoot[L]{} % Empty
+\fancyfoot[C]{\thepage} % Page numbering
+\fancyfoot[R]{} % Empty
+\renewcommand{\headrulewidth}{0pt} % Remove header underlines
+\renewcommand{\footrulewidth}{0pt} % Remove footer underlines
+\setlength{\headheight}{13.6pt}
+
+% numeric
+\usepackage[style=ieee,sorting=none,backend=biber]{biblatex}
+\addbibresource{../main.bib}
+
+% Write the approved title of your dissertation
+\title{Automated image classification with expandable models}
+
+% Write your full name, as in University records
+\author{Andre Henriques, 6644818}
+
+\date{}
+
+%%% Begin document
+\begin{document}
+\begin{tabular}{ |m{0.5\textwidth}|m{0.5\textwidth}| }
+\hline
+Month & Goals \\
+\hline
+February & \begin{itemize}
+    \item Add API support
+    \item Start working on expandable model generation
+    \item Start documenting the design process
+    \item Improve the literature review
+    \item Start documenting results
+\end{itemize} \\
+\hline
+March & \begin{itemize}
+    \item Create systems to expand and contract the expandable models
+    \item Continue documenting the design process
+    \item Review draft submissions
+\end{itemize} \\
+\hline
+April & \begin{itemize}
+    \item Finish the basic final report
+    \item Create systems to expand and reduce expandable models
+\end{itemize} \\
+\hline
+May & \begin{itemize}
+    \item Finish and submit the final report
+\end{itemize} \\
+\hline
+\end{tabular}
+\end{document}