Merge branch 'doc/mmp-final-report' into develop
Merge of doc/mmp-final-report into develop. Closes #138
cgddrd committed May 10, 2015
2 parents 261a8c4 + 99d580e commit 10e3096
Showing 76 changed files with 29,146 additions and 485 deletions.
50 changes: 43 additions & 7 deletions documentation/final_report/Appendix1/appendix1.tex
@@ -1,13 +1,49 @@
\chapter{Third-Party Code and Libraries}

%If you have made use of any third party code or software libraries, i.e. any code that you have not designed and written yourself, then you must include this appendix.
%
%As has been said in lectures, it is acceptable and likely that you will make use of third-party code and software libraries. The key requirement is that we understand what is your original work and what work is based on that of other people.
%
%Therefore, you need to clearly state what you have used and where the original material can be found. Also, if you have made any changes to the original versions, you must explain what you have changed.
%
%As an example, you might include a definition such as:
%
%Apache POI library -- The project has been used to read and write Microsoft Excel files (XLS) as part of the interaction with the client's existing system for processing data. Version 3.10-FINAL was used. The library is open source and it is available from the Apache Software Foundation
%\cite{apache_poi}. The library is released using the Apache License
%\cite{apache_license}. This library was used without modification.

\textbf{OpenCV} \cite{opencv}

Originally developed by Intel, \textit{OpenCV} is one of the most popular open source computer vision and machine learning libraries currently available, providing over 2500 optimised functions that implement some of the most renowned computer vision algorithms including SIFT (Scale Invariant Feature Transform)\footnote{\url{http://docs.opencv.org/modules/nonfree/doc/feature_detection.html}}, Viola-Jones face detection\footnote{\url{http://docs.opencv.org/doc/tutorials/objdetect/cascade_classifier/cascade_classifier.html}} and Lucas-Kanade optical flow\footnote{\url{http://docs.opencv.org/modules/video/doc/motion_analysis_and_object_tracking.html#calcopticalflowpyrlk}}.

Available as open source under the BSD license \cite{opencv-lic}.
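
As an illustration of the style of call made through the Python bindings (a minimal sketch only, not code taken from the project; the file names are placeholders), normalised cross-correlation template matching can be performed as follows:

\begin{verbatim}
import cv2

# Placeholder file names; both images are loaded as greyscale.
search_image = cv2.imread('search.png', cv2.IMREAD_GRAYSCALE)
template = cv2.imread('template.png', cv2.IMREAD_GRAYSCALE)

# Slide the template over the search image, scoring every position
# with normalised cross-correlation (higher score = better match).
scores = cv2.matchTemplate(search_image, template, cv2.TM_CCORR_NORMED)

# Recover the best-scoring position (top-left corner of the match).
_, max_score, _, max_loc = cv2.minMaxLoc(scores)
print(max_loc, max_score)
\end{verbatim}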

\textbf{Numpy} \cite{numpy}

Third-party extension library providing optimised array-type objects in addition to a wide range of mathematical operations associated with numeric collections including matrix manipulation\footnote{\url{http://docs.scipy.org/doc/numpy/reference/routines.linalg.htm}}, statistical analysis\footnote{\url{http://docs.scipy.org/doc/numpy/reference/routines.statistics.html}} and Fourier transforms\footnote{\url{http://docs.scipy.org/doc/numpy/reference/routines.fft.html}}.

One of the core data structures provided by \textit{Numpy} is the multidimensional array\footnote{\url{http://docs.scipy.org/doc/numpy/reference/generated/numpy.ndarray.html}}, which happens to be the data structure selected to represent images within the Python bindings for the \textit{OpenCV} library \cite{opencv}.

It was utilised extensively throughout the project, both to provide statistical analysis of results and to work with the Python bindings for \textit{OpenCV}.

Available as open source under the BSD license \cite{numpy-lic}.
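
The sketch below (illustrative only; the file name is a placeholder) shows why the two libraries pair naturally: images returned by the \textit{OpenCV} Python bindings are plain \textit{Numpy} arrays, so the standard statistical routines apply to them directly.

\begin{verbatim}
import cv2
import numpy as np

# Placeholder file name; OpenCV returns the image as a numpy.ndarray.
image = cv2.imread('frame.png', cv2.IMREAD_GRAYSCALE)

print(type(image))    # <class 'numpy.ndarray'>
print(image.shape)    # (rows, columns) for a greyscale image

# Simple statistical analysis of the pixel values.
print(np.mean(image), np.std(image))
\end{verbatim}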

\textbf{Cython} \cite{cython}

Provided facilities to translate computationally expensive functions, originally written in Python, into optimised C code. The library supplies its own dialect of Python which, while almost identical, allows developers to add C-specific annotations (such as static type declarations for variables) so that code can be fully translated from Python to C without the developer having to write any C themselves. \textit{Cython} source files are distinguished from ``true" Python source files through the use of the \textit{``.pyx"} file extension (as opposed to the \textit{``.py"} extension for Python source files).

Available as open source under the Apache Software license \cite{cython-lic}.
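
A minimal, hypothetical \textit{``.pyx"} sketch (not taken from the project code) illustrating the dialect described above, in which C type declarations are added to otherwise ordinary Python:

\begin{verbatim}
# example.pyx -- hypothetical sketch, not project code.
def sum_of_squares(double[:] values):
    # 'cdef' declares statically-typed C variables.
    cdef double total = 0.0
    cdef Py_ssize_t i
    for i in range(values.shape[0]):
        total += values[i] * values[i]
    return total
\end{verbatim}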

\textbf{Matplotlib} \cite{matplotlib}

Plotting library for Python providing high-level support for producing a wide range of scientific graphs and figures in both 2D and 3D. It was used extensively throughout this project for depicting experiment results and analysing method behaviour (e.g. plotting the behaviour of template matching similarity scores).

Available as open source under the Python Software Foundation and BSD licenses \cite{matplotlib-lic}.
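
A minimal sketch (illustrative only, with hypothetical data) of the kind of figure produced, e.g. plotting a template matching similarity score against vertical displacement:

\begin{verbatim}
import matplotlib.pyplot as plt

# Hypothetical similarity scores for successive displacements.
scores = [0.42, 0.55, 0.71, 0.93, 0.88, 0.64]

plt.plot(range(len(scores)), scores, marker='o')
plt.xlabel('Vertical displacement (pixels)')
plt.ylabel('Similarity score')
plt.title('Template matching similarity against displacement')
plt.show()
\end{verbatim}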

\textbf{iPython} \cite{ipython}

Provides an interactive environment for running computational operations where a dedicated graphical user interface may not be appropriate. It also provides the ability to implement, run and share computational tasks and experiments within a web-based environment (known as an \textit{iPython notebook}\footnote{\url{http://ipython.org/notebook.html}}).

Within this project, it provided the ability to both run experiments and view their results within the same testing environment.

Available as open source under the Revised BSD license \cite{ipython-lic}.
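
The sketch below (illustrative only; \texttt{run\_experiment} is a hypothetical project function) shows the style of interactive use made of a notebook cell:

\begin{verbatim}
# Render Matplotlib figures directly within the notebook.
%matplotlib inline

# Interactively time a computational task.
%timeit run_experiment()
\end{verbatim}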
256 changes: 165 additions & 91 deletions documentation/final_report/Appendix2/appendix2.tex
@@ -1,94 +1,168 @@
\chapter{Code samples}

\begin{algorithm}
\caption{Generalised approach to performing template matching as implemented within the experiments for this investigation.}
\label{appen:code1}
\begin{algorithmic}[1]

\Procedure{Template\_Matching}{\textit{template\_image}, \textit{search\_image}}

\State let \textit{high\_score} = $-1$
\State let \textit{high\_score\_position} = $(-1, -1)$ \Comment{Initialise high score and position.}
\State let \textit{template\_image\_height} = len(\textit{template\_image})
\State let \textit{template\_image\_width} = len(\textit{template\_image}[0])\\\\

\Comment{Convolve the template image through the image we are searching.}
\For{$i \coloneqq 0$ \textbf{to} (len(\textit{search\_image}) - 1) - \textit{template\_image\_height} \textbf{step} $1$}
\For{$j \coloneqq 0$ \textbf{to} (len(\textit{search\_image[0]}) - 1) - \textit{template\_image\_width} \textbf{step} $1$} \\

\State let \textit{current\_window} = window of \textit{search\_image} with the dimensions of \textit{template\_image}, anchored at $(i, j)$
\State let \textit{current\_match\_score} = CHECK\_SIMILARITY(\textit{template\_image}, \textit{current\_window}) \\

\If{(\textit{high\_score} == $-1$) \textbf{or} (\textit{current\_match\_score} \textbf{is} better than \textit{high\_score})}

\State \textit{high\_score} = \textit{current\_match\_score}
\State \textit{high\_score\_position} = $(i, j)$

\EndIf \\

\EndFor
\EndFor \\

\\ \Return \textit{high\_score\_position} \\
\EndProcedure

\end{algorithmic}
\end{algorithm}
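
For reference, a minimal Python sketch of the same generalised procedure is given below. It is illustrative only and assumes a hypothetical \texttt{check\_similarity} function for which a higher score indicates a better match.

\begin{verbatim}
def template_matching(template_image, search_image, check_similarity):
    # Illustrative sketch only: exhaustively slide the template over
    # the search image and return the best-scoring (row, column).
    t_height, t_width = template_image.shape[:2]
    s_height, s_width = search_image.shape[:2]

    high_score = None
    high_score_position = (-1, -1)

    for i in range(s_height - t_height + 1):
        for j in range(s_width - t_width + 1):
            current_window = search_image[i:i + t_height, j:j + t_width]
            score = check_similarity(template_image, current_window)

            # Assumes a higher score indicates a better match.
            if high_score is None or score > high_score:
                high_score = score
                high_score_position = (i, j)

    return high_score_position
\end{verbatim}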

\begin{algorithm}
\caption{Non-Exhaustive Search of Localised Search Window}
\label{appen:code2}
\begin{algorithmic}[1]

\Procedure{Localised\_Search\_NonExhaustive}{\textit{template\_patch}, \textit{search\_column}}

\State let \textit{high\_score} = $-1$
\State let \textit{vertical\_displacement} = $0$
\State let \textit{template\_patch\_height} = len(\textit{template\_patch})\\

\LineComment{Localised search window originates relative to the top of the extracted template patch within the image.}\\
\For{$i \coloneqq 0$ \textbf{to} (len(\textit{search\_column}) - 1) - \textit{template\_patch\_height} \textbf{step} $1$} \\

\State let \textit{current\_match\_score} = CHECK\_SIMILARITY(\textit{template\_patch}, window of \textit{search\_column} with the height of \textit{template\_patch}, anchored at offset $i$) \\

\If{(\textit{high\_score} == $-1$) \textbf{or} (\textit{current\_match\_score} \textbf{is} better than \textit{high\_score})} \\

\LineComment{If no high score has been set (i.e. the search has just begun) or the new score is deemed ``better" than the previous high score, then we have found the new best match.} \\
\State \textit{high\_score} = \textit{current\_match\_score}
\State \textit{vertical\_displacement} = $i$


\Else

\LineComment{Otherwise, if the new score is \textit{worse}, then stop the search at this point.}

\State \Return \textit{vertical\_displacement}

\EndIf

\EndFor

\\ \Return \textit{vertical\_displacement} \\
\EndProcedure

\end{algorithmic}
\end{algorithm}
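
Again for reference, a minimal Python sketch of the non-exhaustive localised search is given below (illustrative only; \texttt{check\_similarity} is the same hypothetical scoring function, with higher scores indicating better matches).

\begin{verbatim}
def localised_search_non_exhaustive(template_patch, search_column,
                                    check_similarity):
    # Illustrative sketch only: slide the patch down a single column,
    # stopping as soon as the similarity score stops improving.
    patch_height = template_patch.shape[0]

    high_score = None
    vertical_displacement = 0

    for i in range(search_column.shape[0] - patch_height + 1):
        current_window = search_column[i:i + patch_height]
        score = check_similarity(template_patch, current_window)

        if high_score is None or score > high_score:
            high_score = score
            vertical_displacement = i
        else:
            # Score got worse: terminate the search early.
            return vertical_displacement

    return vertical_displacement
\end{verbatim}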

%\section{Random Number Generator}
%
%The Bays-Durham shuffle ensures that the pseudo-random numbers used in the simulation are further shuffled, ensuring minimal correlation between subsequent random outputs \cite{NumericalRecipes}.
%
%\begin{verbatim}
% #define IM1 2147483563
% #define IM2 2147483399
% #define AM (1.0/IM1)
% #define IMM1 (IM1-1)
% #define IA1 40014
% #define IA2 40692
% #define IQ1 53668
% #define IQ2 52774
% #define IR1 12211
% #define IR2 3791
% #define NTAB 32
% #define NDIV (1+IMM1/NTAB)
% #define EPS 1.2e-7
% #define RNMX (1.0 - EPS)
%
% double ran2(long *idum)
% {
% /*---------------------------------------------------*/
% /* Minimum Standard Random Number Generator */
% /* Taken from Numerical Recipes in C */
% /* Based on Park and Miller with Bays Durham Shuffle */
% /* Coupled Schrage methods for extra periodicity */
% /* Always call with negative number to initialise */
% /*---------------------------------------------------*/
%
% int j;
% long k;
% static long idum2=123456789;
% static long iy=0;
% static long iv[NTAB];
% double temp;
%
% if (*idum <=0)
% {
% if (-(*idum) < 1)
% {
% *idum = 1;
% }else
% {
% *idum = -(*idum);
% }
% idum2=(*idum);
% for (j=NTAB+7;j>=0;j--)
% {
% k = (*idum)/IQ1;
% *idum = IA1 *(*idum-k*IQ1) - IR1*k;
% if (*idum < 0)
% {
% *idum += IM1;
% }
% if (j < NTAB)
% {
% iv[j] = *idum;
% }
% }
% iy = iv[0];
% }
% k = (*idum)/IQ1;
% *idum = IA1*(*idum-k*IQ1) - IR1*k;
% if (*idum < 0)
% {
% *idum += IM1;
% }
% k = (idum2)/IQ2;
% idum2 = IA2*(idum2-k*IQ2) - IR2*k;
% if (idum2 < 0)
% {
% idum2 += IM2;
% }
% j = iy/NDIV;
% iy=iv[j] - idum2;
% iv[j] = *idum;
% if (iy < 1)
% {
% iy += IMM1;
% }
% if ((temp=AM*iy) > RNMX)
% {
% return RNMX;
% }else
% {
% return temp;
% }
% }
%
%\end{verbatim}
%
28 changes: 28 additions & 0 deletions documentation/final_report/Appendix3/appendix3.tex
@@ -0,0 +1,28 @@
\chapter{Additional Figures}

\begin{figure}[ht!]
\centering
\includegraphics[scale=0.4]{images/burndown.png}
\caption{Burndown chart indicating project progress over the previous 90-day period. Courtesy: \protect\url{http://burndown.io/#cgddrd/cs39440-major-project}.}
\label{fig:burndown}
\end{figure}

\begin{figure}[ht!]
\centering
\includegraphics[scale=0.5]{images/throughput.png}
\caption{Throughput chart describing rate of task completion in terms of task weighting over the previous eight weeks. Courtesy: \protect\url{https://waffle.io/cgddrd/cs39440-major-project/metrics/throughput}.}
\label{fig:throughput}
\end{figure}

\begin{landscape}

\begin{figure}[ht!]
\centering
\includegraphics[scale=0.38]{images/tse_class_diagram}
\caption{UML class diagram describing the structure of the `Terrain Shape Estimation' Python-based library utilised extensively throughout the investigation for providing common functionality shared across experiments.}
\label{fig:class}
\end{figure}

\end{landscape}

