diff --git a/tex/chapter/200-background-network.tex b/tex/chapter/200-background-network.tex index 845fead..508e6ab 100644 --- a/tex/chapter/200-background-network.tex +++ b/tex/chapter/200-background-network.tex @@ -82,7 +82,7 @@ \subsubsection{State-based Intrusion Detection Systems} Unfortunately, the high complexity of tracking the state of each and every connection while simultaneously verifying against a profile is a major drawback of this approach. It requires a lot of processing resources and memory capacity. Further, this method is only able to detect violations against a protocol's fundamental behaviour. Additionally, the state-based analysis works best with stateful, connection-oriented protocols; this does not include protocols relying on short, broadcasted, and self-contained commands as they can be found in \gls{bas}. \parencite[p.~306]{Whitman2009} - + \subsubsection{Anomaly-based Intrusion Detection Systems (A-IDS)} \label{sec:background:network:ids:anomaly} @@ -156,7 +156,8 @@ \subsection{Flow Monitoring} Common protocols used for flow monitoring are \gls{netflow} \parencite{Claise2004} and \gls{ipfix} \parencite{Claise2013}. %\todo{not predominantly used for security, but network congestion avoidance/mitigate.} -\begin{figure} +\newpage +\begin{figure}[h] \centering \includegraphics[width=\textwidth]{figures/200-netflow-architecture.pdf} \caption[Simplified flow monitoring system architecture]{Simplified flow monitoring system architecture. \parencite[cf.][]{Hofstede2014} \todo{adjust font size}} @@ -245,7 +246,7 @@ \section{Anomaly, Outlier, and Novelty Detection} It also requires pre-labelled data, but compared to the second type it only works with \emph{normal} labelled data points. Consequently, the full gamut of \emph{normality} is required to train a precise model. On the other hand, no \emph{outlying} data points are required for training, which is highly beneficial in certain data-sparse scenarios, when abnormal data is difficult to obtain. However, the training data must not contain outliers, otherwise they will be assumed to be a part of \emph{normality}. Generally, the aim of this approach is to establish a tight boundary around \emph{normality} and is therefore suitable for static as well as dynamic data. -If a new data point lies outside of the this boundary it is considered an \emph{outlier}, if not it is part of the \emph{normal} data. +If a new data point lies outside of this boundary it is considered an \emph{outlier}; if not, it is part of the \emph{normal} data. \parencite{Hodge2004} % ------------------------------------------------------------------------------ \newpage @@ -310,6 +311,7 @@ \subsubsection{Local Outlier Factor} \newpage \begin{wrapfigure}{l}{0.5\textwidth} \centering + \vspace{-5pt} \includegraphics[width=0.5\textwidth,trim={12mm 5mm 15mm 10mm},keepaspectratio,clip]{figures/200-background-lof.pdf} \caption[Example visualisation of the LOF]{Visualisation of the \gls{lof} in a two dimensional vector space. \emph{Green}: training data, \emph{Red}: outlier from test data, \emph{White}: inlier from test data. Background indicates the calculated \emph{LOF} value.} \label{fig:background:network:novelty:lof} @@ -320,6 +322,7 @@ \subsubsection{Local Outlier Factor} An example of the \gls{lof} calculation in a two dimensional vector space containing random test data is illustrated in Figure~\ref{fig:background:network:novelty:lof}.
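A minimal sketch of how such an \gls{lof}-based novelty detector can be fitted on clean training data and then used to score unseen points is given below. It assumes scikit-learn's \code{LocalOutlierFactor} in novelty mode and randomly generated two-dimensional data; it is purely illustrative and not the code used to produce the figure.
\begin{verbatim}
# Minimal LOF novelty-detection sketch (assumes scikit-learn >= 0.20).
# Illustrative only -- not the code behind the figure.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(42)

# "Normal" training data: two Gaussian clusters in a 2-D vector space.
train = np.vstack([
    rng.normal(loc=(-2.0, -2.0), scale=0.3, size=(100, 2)),
    rng.normal(loc=(2.0, 2.0), scale=0.3, size=(100, 2)),
])

# Test data: mostly inliers plus a few far-away points.
test = np.vstack([
    rng.normal(loc=(2.0, 2.0), scale=0.3, size=(20, 2)),
    rng.uniform(low=-6.0, high=6.0, size=(5, 2)),
])

# novelty=True fits on clean data and scores unseen points afterwards.
lof = LocalOutlierFactor(n_neighbors=20, novelty=True, contamination=0.05)
lof.fit(train)

labels = lof.predict(test)            # +1 = inlier, -1 = outlier
scores = lof.decision_function(test)  # negative values indicate outliers
for point, label, score in zip(test, labels, scores):
    print(point, "outlier" if label == -1 else "inlier", round(score, 3))
\end{verbatim}
Points with a negative decision function value lie outside the learned boundary and are flagged as outliers.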
The background in this figure indicates the \gls{lof} values for every possible point, whereby a blue background indicates values outside of the threshold for \emph{normality} and a red one marks values within those borders. +\vspace{8pt} Moreover, the ability of \gls{lof} to work on unclean data and to account for different densities in clusters through locality makes it a good fit for \gls{ids} solutions, as \textcite{Lazarevic2003} shows. \textcite{Zanero2004} also conducted an experiment on using unsupervised learning algorithms for intrusion detection. They found that proximity-based approaches like \gls{knn} need to address the problem of locality for some observations in order to achieve more precise predictions. The \gls{lof}, as presented by \textcite{Breunig2000}, focusses on this aspect. diff --git a/tex/chapter/300-prior-work.tex b/tex/chapter/300-prior-work.tex index 497d1cb..69ad7b7 100644 --- a/tex/chapter/300-prior-work.tex +++ b/tex/chapter/300-prior-work.tex @@ -41,11 +41,13 @@ The part of their research regarding flow analysis focuses on \gls{bac} over \gls{ip} and employs a simple volumetric approach (measuring the throughput) and calculates the entropy (cf. Section~\ref{sec:background:network:novelty:entropy}) of the flow data. As a result, \textcite{Celeda2012} were able to detect and identify several attacks and a botnet in \gls{bac} installations, based on flow data. -\begin{wrapfigure}{r}{0.5\textwidth} +\begin{wrapfigure}{r}{0.6\textwidth} + \vspace{-18pt} \centering - \includegraphics[width=0.45\textwidth,keepaspectratio]{figures/300-Pan2014-architecture.png} + \includegraphics[width=0.6\textwidth,keepaspectratio]{figures/300-Pan2014-architecture.png} \caption[Anomaly detection framework architecture by Pan, Hairi, and Al-Nashif]{Anomaly detection framework architecture by \textcite{Pan2014}} \label{fig:background:prior-work:pan-architecture} + \vspace{-20pt} \end{wrapfigure} Further investigations into \gls{bac} security are published by \textcite{Pan2014}. They present \enquote{a framework for a rule based anomaly detection} system in \gls{bas}, using \gls{bac} as an example. @@ -57,7 +59,8 @@ The algorithm, first proposed by \textcite{Cohen1995}, is used in a two-class version on pre-labelled data and was applied by \textcite{Pan2014} to more than 7000 data points, resulting in a baseline model with 20 rules. These rules are consequently used during the detection phase and applied to every flow-frame within a time window to improve the detection rate. In case the rule framework detects an attack, the malicious packet flow is handed over to the attack classification module, which uses a decision table to classify the attack based on three attributes: the targeted protocol layer, the attack motivation, and the victim device. -The consideration of the targeted protocol layer accounts for different kinds of vulnerabilities within the protocol stack. \textcite{Pan2014} specifically focus on \gls{bac}'s \gls{apdu} and \gls{npdu}, which corresponds to the application and network layer. +The consideration of the targeted protocol layer accounts for different kinds of vulnerabilities within the protocol stack. \textcite{Pan2014} specifically focus on \gls{bac}'s \gls{apdu} and \gls{npdu}, which correspond to the application and network layer, respectively. + Further, the attack classifier accounts for different attack motivations.
This includes \emph{reconnaissance attacks}, which aim to collect information about the network and its traffic, \emph{device access attacks}, representing attempts to access devices without permission, and finally \emph{\gls{dos} attacks}, where the network and devices are saturated with useless commands to disturb normal operation. The last attribute for attack classification is the targeted device, which uses domain knowledge about the network to assign roles to devices. Finally, the classifications from the baseline model and the attack classifier are passed to the action handler module, which is designed to automatically trigger suitable mitigating measures. This includes extracting useful information, dropping packets or suspending connections based on a severity level of the attack, and producing an understandable alert message comprised of the prior gathered information. diff --git a/tex/chapter/400-methods.tex b/tex/chapter/400-methods.tex index 157ecb6..58ebe20 100644 --- a/tex/chapter/400-methods.tex +++ b/tex/chapter/400-methods.tex @@ -64,7 +64,7 @@ Finally, the last category describes \emph{reconnaissance attacks}. These attacks comprise unauthorised detection and mapping of the network and its behaviour. Here only active sweeping approaches are considered, where an attacker probes each individual device in an address range. Passive eavesdropping is not considered as it cannot be detected on higher protocol levels due to the bus character of the network. (cf. Section~\ref{sec:background:bas:knx:topo}) -\section{Generating a Test Dataset including malicious activities} +\section[Generating a Test Dataset including malicious activities]{Generating a Test Dataset including\\ malicious activities} \label{sec:methods:gen-test} As \gls{bas} automation systems are only seldom considered within threat models, monitoring systems are rarely installed, if at all. @@ -81,7 +81,7 @@ \section{Generating a Test Dataset including malicious activities} Second, a \gls{dos} attack is performed starting at 2017-02-13 09:00 and targeting the entire line \code{3.4}. The attack is performed in three bursts of 15 minutes with five-minute breaks in between. In the \gls{dos} attack a flood of \code{A\_Restart} telegrams with \code{SYSTEM} priority is sent, which in reality would cause all targeted devices to restart continuously. Additionally, this blocks all other traffic, since the \code{SYSTEM} priority is the highest specified. During the attack the telegrams were injected with a maximum of \(500 \ \sfrac{telegrams}{min}\). As a third attack scenario, a device scan over the entire possible \gls{knx} address space was performed, starting from 2017-02-13 21:00. -To determine if a device is present, the management \gls{apci}\break\code{A\_DEVICE\_DESCRIPTOR\_READ} is send to all addresses. Every \gls{knx} device is required to implement certain management routines, among them the query for the device descriptor. \code{A\_DEVICE\_DESCRIPTOR\_READ} is ideal since the requesting telegram does not require any parameters and the response only contains two bytes of additional payload. By choosing a request which is adds as little overhead as possible, the throughput is increased and effectively reducing the time required for the scan. \parencite[cf.][p.~46]{DIN_EN_50090-4-1} +To determine if a device is present, the management \gls{apci} \code{A\_DEVICE\_DESCRIPTOR\_READ} is sent to all addresses.
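A hypothetical sketch of how such a sweep over the \gls{knx} individual address space could be generated is shown below. The helper \code{send\_telegram} and the chosen source address are illustrative assumptions only; the actual injection scripts are the ones provided on the data disk (Appendix~\ref{app:disk}).
\begin{verbatim}
# Hypothetical device-scan generator (illustrative sketch only).
import time
from dataclasses import dataclass

MAX_RATE = 500           # telegrams per minute, as used for the injection
DELAY = 60.0 / MAX_RATE  # pause between telegrams to respect that rate

@dataclass
class Telegram:
    source: str
    destination: str
    apci: str

def send_telegram(telegram: Telegram) -> None:
    """Placeholder for the actual bus or dataset-injection interface."""
    print(f"{telegram.source} -> {telegram.destination}: {telegram.apci}")

def scan_address_space(source: str = "3.4.254") -> None:
    # Sweep the whole individual address space: 4-bit area, 4-bit line
    # and 8-bit device part, i.e. 2^16 addresses in total.  At 500
    # telegrams/min the full sweep takes roughly 2.2 hours.
    for area in range(16):
        for line in range(16):
            for device in range(256):
                destination = f"{area}.{line}.{device}"
                send_telegram(Telegram(source, destination,
                                       "A_DEVICE_DESCRIPTOR_READ"))
                time.sleep(DELAY)

if __name__ == "__main__":
    scan_address_space()
\end{verbatim}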
Every \gls{knx} device is required to implement certain management routines, among them the query for the device descriptor. \code{A\_DEVICE\_DESCRIPTOR\_READ} is ideal since the requesting telegram does not require any parameters and the response only contains two bytes of additional payload. By choosing a request which adds as little overhead as possible, the throughput is increased, effectively reducing the time required for the scan. \parencite[cf.][p.~46]{DIN_EN_50090-4-1} As in the \gls{dos} attack, the telegrams are injected with a maximum of \(500 \ \sfrac{telegrams}{min}\). Finally, two new rogue devices are introduced with the addresses \code{3.6.26} and \code{3.5.18} during the entire day of 2017-02-14. @@ -90,12 +90,12 @@ \section{Generating a Test Dataset including malicious activities} The scripts to generate the malicious traffic and the datasets themselves can be found on the data disk in Appendix~\ref{app:disk}. +\newpage \section{Evaluating the Detection Results} \label{sec:methods:eval} For each crafted attack, the different anomaly detection algorithms are benchmarked with regard to their ability to detect them. This ability is classified by the following criteria: - \begin{enumerate} \item General ability to detect the attack \item Differentiation from background noise of the detection results diff --git a/tex/chapter/500-concept.tex b/tex/chapter/500-concept.tex index ddc9f38..7482529 100644 --- a/tex/chapter/500-concept.tex +++ b/tex/chapter/500-concept.tex @@ -87,7 +87,7 @@ \section{Monitoring Pipeline} \begin{figure}[h] \centering \includegraphics[width=\textwidth]{figures/300-concept-architecture.pdf} - \caption[Pipeline Architecture]{Architecture of the monitoring pipeline \todo{explain symbols.} \todo{information flow back to agents, for time sync.}} + \caption[Pipeline Architecture]{Architecture of the monitoring pipeline concept. \todo{explain symbols.} \todo{information flow back to agents, for time sync.}} \label{fig:concept:architecture} \end{figure} @@ -119,7 +119,7 @@ \section{Monitoring Pipeline} \begin{figure} \centering \includegraphics[width=\textwidth]{figures/500-knx-demo-topo-with-agents.pdf} - \caption[KNX network topology with Agents and Collector]{Exemplary logical topology of a \gls{knx} network with deployed Agents and one Collector.} + \caption[KNX network topology with Agents and Collector]{Exemplary logical topology of a \gls{knx} network with multiple deployed Agents and one Collector.} \label{fig:concept:network} \end{figure} @@ -312,6 +311,7 @@ \subsection{Generating the Feature Vector} For the construction of a feature vector, normally only the features with higher variance would be chosen, since the remaining fields seem to stay constant and therefore do not add any additional information. However, in anomaly detection, normally stable features can also be of interest, since a change in them would most certainly indicate an anomaly (a small illustration follows below).
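The following toy calculation (with invented numbers) illustrates this trade-off: a field that never changes in normal traffic, such as, hypothetically, the share of telegrams sent with \code{SYSTEM} priority, would be dropped by a purely variance-based selection, although a sudden change in it is one of the clearest anomaly signals.
\begin{verbatim}
# Toy illustration with invented values -- not measured data.
import numpy as np

# Fraction of telegrams per window sent with SYSTEM priority.
normal_windows = np.array([0.0, 0.0, 0.0, 0.0, 0.0])  # normal operation
dos_window = 0.95                                      # during a DoS burst

print("variance in normal traffic:", normal_windows.var())  # 0.0
print("deviation of the DoS window:",
      abs(dos_window - normal_windows.mean()))               # 0.95
\end{verbatim}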
+ As a compromise between both points of view the following fields were selected as feature vector dimensions: \begin{itemize} @@ -424,13 +425,6 @@ \subsection{The Support Vector Machine Analyser} \subsection{The Entropy Analyser} \label{sec:concept:anal:entropy} -\begin{figure}[h] - \centering - \includegraphics[]{figures/300-time-slots.pdf} - \caption{Example of shifted time slots used in the entropy analyser module.} - \label{fig:concept:time-slots} -\end{figure} - \begin{comment} \begin{itemize} \item cf.~Section~\ref{sec:background:network:novelty:entropy} @@ -459,6 +453,14 @@ \subsection{The Entropy Analyser} The base-model is generated during a dedicated training phase and contains a \glsfirst{pmf} for every dimension of the feature vector. Only the time dimension is excluded, since it is continuous. Seasonal sensitivity is instead achieved, as described in Section~\ref{sec:background:network:features:time}, by dividing one period into multiple time chunks. Each of these chunks equates to one sub-model, which represents the activity during this time slot. To reduce hard breaks at the end of each chunk, another set of chunks is used, shifted by half the chunk length. Hence every point in time within the season period falls into two chunks; with one-hour chunks, for example, 09:40 lies in both the 09:00--10:00 chunk and the shifted 09:30--10:30 chunk. (see Figure~\ref{fig:concept:time-slots}) +\begin{figure}[h] + \centering + \includegraphics[]{figures/300-time-slots.pdf} + \caption{Example of shifted time slots used in the entropy analyser module.} + \label{fig:concept:time-slots} +\end{figure} + +\newpage Further, two types of baseline models will be trained: one general world-view model and many Agent-specific models. The world-view model is used to identify general abnormal behaviour in the network and can be seen as a general-purpose, less sensitive model. The Agent-specific models, on the other hand, are specialised in the traffic and behaviour unique to one Agent. Therefore, they are able to identify local anomalies, which might be completely normal when seen by another Agent. @@ -473,6 +475,7 @@ \subsection{The Entropy Analyser} \textcite{Toshniwal2014}, who proposed this concept initially, use a fixed number of clusters, which are used to try-fit new observations into them. If a new observation does not fit into any of these clusters, it is considered an outlier, whereby the change in the calculated entropy is used to decide whether an observation fits or not. Further, \textcite{Toshniwal2014} only keep a sliding window of data in what would be the baseline model. This comes with the earlier described disadvantage of continuous training: an attacker can alter the modelled \emph{normality} by slowly injecting malicious packets. +\newpage \section{Monitoring and Alerting} \label{sec:concept:mon} diff --git a/tex/chapter/600-prototype-implementation.tex b/tex/chapter/600-prototype-implementation.tex index 731b4c1..65b31b5 100644 --- a/tex/chapter/600-prototype-implementation.tex +++ b/tex/chapter/600-prototype-implementation.tex @@ -90,6 +90,7 @@ \section{The Collector Module} In case an Agent's window is never received by the Collector, it waits a configurable timeout of about $60$ seconds before relaying this window anyway. This ensures that a time slot is analysed even when an Agent fails, regardless of the failure mode. As this is an anomaly, which can be easily queried in the monitoring and alerting system, it is also detected there and consequently not handled in the Collector apart from a warning in the log.
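The described relay-with-timeout behaviour can be sketched roughly as follows; the snippet is an illustrative asyncio-style outline with assumed names, not the Collector's actual code.
\begin{verbatim}
# Illustrative sketch of the Collector's relay-with-timeout behaviour.
# Class, function and parameter names are assumptions, not the real API.
import asyncio

RELAY_TIMEOUT = 60.0  # seconds to wait for missing Agent windows

async def collect_and_relay(queues: dict[str, asyncio.Queue], relay) -> None:
    """Wait for one window per Agent, but never longer than RELAY_TIMEOUT."""
    windows: dict[str, dict] = {}

    async def receive(agent_id: str, queue: asyncio.Queue) -> None:
        windows[agent_id] = await queue.get()

    tasks = [asyncio.create_task(receive(a, q)) for a, q in queues.items()]
    done, pending = await asyncio.wait(tasks, timeout=RELAY_TIMEOUT)

    for task in pending:            # Agents that did not deliver in time
        task.cancel()
    missing = set(queues) - set(windows)
    if missing:
        print(f"warning: no window received from {missing}, relaying anyway")

    await relay(windows)            # hand the (possibly incomplete) slot on
\end{verbatim}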
+\newpage \section{The Agent Simulator} \label{sec:impl:agent} @@ -128,6 +129,7 @@ \section{The Agent Simulator} The statistical window is not based on the feature vector (see Section~\ref{sec:concept:anal:feature-vector}). Instead, the appearances of certain features are counted, e.g. the source address \code{1.1.15} appeared $15$ times. All further processing, like vectorising and normalising, is done in the individual Agent modules. +\newpage \section{The Analyser Base Module} \label{sec:impl:base} @@ -185,7 +187,6 @@ \section{The Address Analyser Module} During the operational phase, the model containing all sets is loaded. Following this, all addresses in incoming windows are compared to their respective address set. If an unknown address is discovered, a counter is increased. After a window is processed, a measurement containing these counters per Agent is pushed to the \gls{influxdb}, which contains: - \begin{itemize} \item number of unique unknown source addresses \item number of unique unknown destination addresses @@ -206,9 +207,9 @@ \section{Converting a Window into a Feature Vector} Within the proposed concept this means that a statistical window has to be transformed into a numerical vector representation. The basic principle of this is described in Section~\ref{sec:concept:anal:feature-vector}, whereas this section focusses on the details of how each feature is encoded. It should be noted that a window contains discrete or categorical statistical data describing a set of events, instead of just a single event, meaning the window produced by an Agent contains a number of occurrences over the period of the window. + The feature vector, however, will contain a normalised excerpt of the events' features, whereby each feature is encoded as one or more dimensions. This excerpt contains a set of low-level, application-independent, and easy to measure fields: - \begin{itemize} \item seconds of the week \item source address @@ -226,6 +227,7 @@ \section{Converting a Window into a Feature Vector} \[ \dfrac{2 \cdot \begin{pmatrix}1 \\ 0\end{pmatrix} + 3 \cdot \begin{pmatrix}0 \\ 1\end{pmatrix}}{2 + 3} = \begin{pmatrix}0.4 \\ 0.6\end{pmatrix} \] + The result is a vector encoding the probability of each bit occurring within this specific window. The goal was to reduce the number of dimensions that would have been necessary if the entirety of possible addresses were mapped into the feature space -- each address using one dimension to encode the probability of occurrence. For the two address types, this would result in $2 \cdot 2^{16} = 2 \cdot 65536 = 131072$ dimensions. By applying this adapted form of the hashing trick (cf. Section~\ref{sec:background:network:features:hashing}) it was possible to reduce it to $2 \cdot 16 = 32$ dimensions. diff --git a/tex/chapter/700-results.tex b/tex/chapter/700-results.tex index 9a1a553..f450bd3 100644 --- a/tex/chapter/700-results.tex +++ b/tex/chapter/700-results.tex @@ -13,7 +13,6 @@ \section{Conducting the Performance Experiment} Before injecting and analysing the data, it was split into three parts. The first two weeks are dedicated to training the baseline models for the Analyser modules. The following week was used as validation to ensure the algorithms are properly fitted to the data. Finally, the last week was modified to contain four scenarios which alter the behaviour of the line, as described in Section~\ref{sec:methods:gen-test}.
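Referring back to the occurrence-weighted address encoding described above, the mapping from per-window address counts to the 16 dimensions used per address type can be sketched as follows; bit order and function names are illustrative assumptions, not the prototype's exact code.
\begin{verbatim}
# Sketch of the occurrence-weighted bit encoding for 16-bit addresses.
import numpy as np

ADDRESS_BITS = 16

def encode_addresses(counts: dict[int, int]) -> np.ndarray:
    """Map {raw 16-bit address: occurrences} to per-bit probabilities."""
    vector = np.zeros(ADDRESS_BITS)
    total = sum(counts.values())
    if total == 0:
        return vector
    for address, count in counts.items():
        bits = [(address >> bit) & 1 for bit in range(ADDRESS_BITS)]
        vector += count * np.array(bits, dtype=float)
    return vector / total

# Toy window: address 0b01 seen twice, address 0b10 seen three times.
print(encode_addresses({0b01: 2, 0b10: 3})[:2])  # -> [0.4 0.6]
\end{verbatim}
With the windows encoded in this way, the modified test week described above can be fed to the Analyser modules.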
\newpage These modifications are meant to resemble plausible attacks on \gls{bas} networks and therefore include: - \begin{enumerate} \item Injecting unusual network traffic by copying traffic from another day and time \item Performing a \gls{dos} attack @@ -29,11 +28,11 @@ \section{Conducting the Performance Experiment} In this experiment the feature to simulate multiple Agents, using filter rules, was not used, because the dataset was recorded on only a single line. Therefore, a further separation seemed infeasible, as it would complicate the evaluation of the detection results without providing more insights. +\newpage \section{Experiment Results} \label{sec:results:results} Finally, in this section the detection results of the proposed algorithms are examined using the graphs generated with \gls{grafana}. They will be used to determine the quality of the detection results based on five criteria: - \begin{enumerate} \item General ability to detect the attack \item Differentiation from background noise of the detection results @@ -137,7 +136,7 @@ \subsection{Detection of Unusual Network Traffic} 3. Response time & 0s & 0s & n/a & n/a \\\midrule 4. Persistence & n/a & n/a & n/a & n/a \\\bottomrule \end{tabularx} - \caption[Detection results of unusual traffic]{Detection results of unusual traffic. Averages are compared to the validation dataset.} + \caption[Detection results of unusual traffic]{Detection results of unusual traffic. Difference in averages is calculated against the validation dataset.} \label{tab:results:unusual} \end{table} @@ -215,7 +214,7 @@ \subsection{Detection of a DoS Attack} 3. Response time & 0s & 0s & 0s & n/a \\\midrule 4. Persistence & $100$\% & $100$\% & $100$\% & n/a \\\bottomrule \end{tabularx} - \caption[Detection results of the DoS attack]{Detection results of the \gls{dos} attack.} + \caption[Detection results of the DoS attack]{Detection results of the \gls{dos} attack. Difference in averages is calculated against the validation dataset.} \label{tab:results:dos} \end{table} @@ -295,7 +294,7 @@ \subsection{Detection of a Network Scan} 3. Response time & n/a & 0s & 0s & n/a \\\midrule 4. Persistence & n/a & $100$\% & $100$\% & n/a \\\bottomrule \end{tabularx} - \caption[Detection results of the network scan]{Detection results of the network scan.} + \caption[Detection results of the network scan]{Detection results of the network scan. Difference in averages is calculated against the validation dataset.} \label{tab:results:scan} \end{table} @@ -381,7 +380,7 @@ \subsection{Detection of New Devices} 3. Response time & n/a & ($4.5$h) & ($6.83$h) & n/a \\\midrule 4. Persistence & n/a & ($0.54$\%) & ($3.23$\%) & n/a \\\bottomrule \end{tabularx} - \caption[Detection results of new devices]{Detection results of two new devices in the network.} + \caption[Detection results of new devices]{Detection results of two new devices in the network.
Difference in averages is calculated against the validation dataset.} \label{tab:results:newdevice} \end{table} diff --git a/tex/chapter/appendix.tex b/tex/chapter/appendix.tex index 6616e7f..ebc1624 100644 --- a/tex/chapter/appendix.tex +++ b/tex/chapter/appendix.tex @@ -240,8 +240,8 @@ \section{Setup Required Services} Username: \code{admin}\\ Password: \code{testpw} \item Import the pre-made dashboards, by navigating to \emph{create} shown as a plus on the left side, and then \emph{Import} - \item Upload \code{src/bas-observer/docker/grafana-bob-dashboard-avg.json} - \item Repeat the upload for \code{src/bas-observer/docker/grafana-bob-dashboard-sum.json} + \item Upload and import\\\code{src/bas-observer/docker/grafana-bob-dashboard-avg.json} + \item Repeat the upload for \\\code{src/bas-observer/docker/grafana-bob-dashboard-sum.json} \end{enumerate} \section{Train the Models} diff --git a/tex/figures/300-concept-architecture.pdf b/tex/figures/300-concept-architecture.pdf index 323b0c6..91442f7 100644 Binary files a/tex/figures/300-concept-architecture.pdf and b/tex/figures/300-concept-architecture.pdf differ diff --git a/tex/figures/300-concept-architecture.xml b/tex/figures/300-concept-architecture.xml new file mode 100644 index 0000000..4bb31a3 --- /dev/null +++ b/tex/figures/300-concept-architecture.xml @@ -0,0 +1 @@ +7V1Zl6LIEv419Th92JVHt7KoEaxyQ3hjawTZjuICv/5GsiiLdll73R5mTndrkmvEF5HJFyHckT33ONwowYr3dcO5IzD9eEf27wiCwEgc/kElUVqCYySWlpgbS8/KzgVTKzbyilnpztKNbali6PtOaAXlQs33PEMLS2XKZuMfytV++0551EAxjVrBVFOceqlo6eEqK8UZ9nzhwbDMVTZ0m2ilF1RFW5sbf+dl490R5O/kv/Syq+R9ZQvdrhTdPxSKyMEd2dv4fph+co89w0HCzcWWtru/cvU0743hhbc0ILMFbcMoX7uhgyiyr/4mXPmm7ynO4FzaTdZnoB4w+LYKXQc+4vDROFrhEhX/ounsq5R8ZfOvT8bGco3Q2GSNYZqbKGuSf5VO1/QO0iV81Rxlu7W02cry0gv3lpMPahthGGUQUnahD0XnaY98P8jq1UWTSWvr7zZatvh2BjZlYxpZrUxCSCyFZpk4h4YPy9lEUGFjOEpo7csIUjIgmqd6Z2XAh0wfl3WTzWWvOLus044Jc9/WNeY4YBhIMwCnwEjk5e+gl+5hZYXGNFCS5R3AWMv6uiqSvbEJjeMfl5sbe4tMm2SmzmTAPhTMJgf7qmgx2PsFROF1UbwfvHgJuXQRpP9gvzCsXcLpuUId2l8PX7YOXwr7LvyyNfzW1JXj1XITh9xFuLPADY8U1XCe/K0VWj6IrK/6Yei7hQodxzLRhRDJp5s072yDdCtAwlfyL7+tI1J2Nxuhryuhckd20q/E/XZv3hHdI4CA6D09CIQcdSlVPO60OFhLMWYpDxNM6/v7EdnFNfewU8lHb0RM7BGx2Moi7qjeJB7Fgx0/bVvcwypUh3Q8dgX7afro6w+Tw9hq7yXy0ZGWk0B3F7ZK4KFK0PHIZSM5YndaxJ/beY9r2S6OqZN6RJN8RO81V9vzszU9nrYPvNWGVngkD6VQI52dPrynRiIdcxFnGkN8q3o8o5GyV5wD9ESOPC0bF9r3O4cRidZ7asNy7grTHzrMKGKhtrbTYz5db8wdoP4e9clZJ/nEKjEJtCG7VmbFOQt7eegc0LWRJ+z15aMtizLMX3dGLu3oPXawGDzvNQLaLbtQdx3yPXqtD58L8nN2CinY0rLr1NZQuJbLUAI9aGDKErGIR8T5OsyVVMQJpvQxi5/Nd0I8oMYzDVNFzOKGK0cRdV8vXzsm1x7kQF7qPZU0Wc7umHyvcxSmFMnPTFqIn81ifRiDkJePsSKyu6cpdxzZA+jbWcO8Y93VMGE2j6FOoLphLBH3B3kWFOsfRvY6Gtmc+W9/YfPEnOKHz1uuD/KeaSHfN+GPFED7gqyDtUoIyZphDMcYCr4k0p48xT31wTkoPRqNaz1Zkm0MBy2u12nLw4WrRd1AhrUmn+NnXOhr1iimXI5crcZRxwTsn/Q2dumVKi7Wuus4etZOcu9jecqZ2vDek0Vhrw7ZCOZvy276P8zRAqysYW2A8eNeAxxzsGLQBCFYnCkhXQ0XkZroycEMwAw/pQ4V2RwkUXdKdXqX62jEaiUPWRLmEFWuMyrBhlCOj2wJ6eKEB7BLCsox6DMaz9Y46KZ6nUY6hOugW6RrrnJ9Ara8CGWRxkbLxU5ZTmLQQaiCXBSR3miEsNKGcwbqRJq72Ok9fCstHQfJQh3e0/KSQ3iJFZi3TD6m+nDvtxoB83gQfJXUPV10QIZgi94k0sXq/HQc8OyMxEcrweapjhzAWKG0fPQUkQr1IXtA2IKxXGX5HKrifSSDXYzEoyN7zwjTtTaa63jKw+VroDsM+gyzPouyBvui1yqphTrhgA0je+FArhW5ezq0mez1CPdlMR9ncBD6g2v1qEQPdufqeAk2Zua19jD3VaCKDiOL6ZoK9mBy/WPRnlKZT7sHlRQCfejsVdDXaMYRI3seCzZP8dMDsn+cn/G4YK9hTDZSxAXyu9jJjvsnu0/7m4GNzbSK/o57iQidot6LNohsteAXGZlcBPJwDjbHxoC1QCUotJYd+B8K1g42zG/52WAnzNbgzzUaZAV+1kF70ipfB8w9kWEyZtof8kl0Ra5b8CmbRFauvFIfBPCdcPrsPloCJc22aGYVD5EikXuAEcFSc+8HUgKtdmjwjFuwoqMwM5GlUWOwEBUhHixHOq9kyyeWg0ZN+xP6c4KvWBN4GlshFkiyriQekRfav1abiceeSQfw2tior+3G/TmNpIA8U31e
A5j7hXlFr5pXu7gTljxq0esOcQdQtIaxVxl6Qhk+y4AsKUFQDblo12fm68mwVI70MxTssaMDAram5k7csXsPXmKBgV6sat10V+AZGbwyeAfsyXVC5FXTE0EIO/yhjU45Yxft3PhK75dQmo0H170JrSGEovG8R/DKLJyShNKurT2A1++lJxgYE/4IIaAZtdvPSZD58PgIc/Bqei3ME/SIqVC/tuYea0txsAQdbspyOuyTOTmPg4ldkHePLci3qAetott7SxYPLLc+rgxxEXFVT+XqsJMtDqrIgqWfvHvuATKvD20eJhHsznEiZ6inPzyuVE+A6xPn3x5GjKdUBDsxMZ4NtqOZBDsWT477Ejmerqt693LvIswGBJwZK9aZ+A7kj3pdTHPvdxqBNIfBHgEzBr//bx/8hi2B7ZsxH/NB7hOSNv05WCr0OUSIx0ze1uAcw8XjPkeN+uDPLYoA3xeNe90Vh8rsOZyFnuH7Af5QOLIq3jbROenA9TvmaGbu+Ng8wEq2fOKv1kfw9dCvCVaoweek30I/aTtoD2Xgb/smAdfhbDSg4bqp9joR8ibj/hoTpofw1Gevy6D9VH9wtkgLsuts1b5/OpOMCRnQdQjP5wn/OCYFkA+7kWe+jc7A6M+pbIpn50UfnckIIYa52hJeuI6Q62juEe0t2Nh+bmnkJFLBp4/EyQqdSTTvcWXMgh2c8Z1Cuz3I2UaoH9tVJMFZCs7FEjnZaxZuw3mZAHSAFc2TfauyPyAkHOC6X0UWQi5YiI/ONoL9jI/7oOVonSMyaQPIwUDLJtK7Ou1EArpvQfK2TdDTHPY76cjHHMWlsocz0DNe0MOxUE6MethZJ9atuCi1z3WNv7ItzAnhS4NdpdH/K/WPCwX9j9HZoc+RQgznCFsD/Zuwe0s0V9Tt9JB+Buxww5tsuNj2ZM+36LjgK7YFH2IaMGeQMw3rQ+cTpFOGIzIt26GriMftSGThvgB2/Ic1w89CwMTjXiHmDBenOub7p7L8rMrwFpKVBnOCe5RzmxDpFu5jtoqIB3rfxwp6hJ0a7reWj5G0XDOqy+7kYrsCzirnK7gnondw34SpJBei+wPdvcdA1056pq6etd7nzZ+zO5vcm6/Bmz+DtcPdP9K2RYFGnmHv6Zw0yxe1PLhJyzGPzm9F1Nyi4XhwRF4GyunzzjKP+Kir/McsGc5XWlHrcGKR8xrxIeFankxEkmXE2ftYWpJgSiwt3mrVaNq8SpGlzcveQwPizBeztNglgvbrudiceC2SsTj10WRs1vTJt2AqJ31TLaysbxb7RbTpFk6lf5Nkucd03lknZ9WCzJSoUC1AFbY15Z/mfRse6Jd5YRRCC27H/SkQqKh5D9hluWataKwiH4auhy2wC2EL6gPCFjhZE0DPdxyYvr+pSeJsCPib4jn0FZFdF01NECxdlwP5EXKgfqQcTjHrLxME/VlB2KKP/IEeEr8QbaVb3xWuwuvxVs777eyO/e7VuJUWORZoYfMyJNVUXSP1A2OvTMXLs3XIMhd8GP0BkCU+JfT6fwBZ8sKm3v4uyFKf4jh+/tEqPzGU4tzEd2kht6dGC6AF8tu08NXJID9ZC59zm3H9duDyXQhTOWUTLFHU6Iv1SZqsICCdwVvvPJj2uyGi+a6lZZ/tnRvkbT3fM/KiVLEfAqhTeuAnJBRtw42/NuDE7W8SWZBY8t/pSp74eEZiBU5vgibDfJE/yAdv0o+a9KMm/ahJP2rSj5r0oyb9qEk/atKPmvSjJv2oST9q0o+a9KMm/ahJP2rSj5r0oyb96HPTj2iS+kWXqeAWW4tVflYCEnkDE/jZCSctplVeP3Yp4eRCfsFHJJzc8kvMd+ZXnGR8Mya+ML8ifxxB8ZfUur6Bko6nONHW+PB0k1eI4+vTTch6/tH32wNTT0j8NHsg64lHHw4A8oqkfoQ91DPwBl648YPoM03idol8g0kwP88kyPwBEF9iEq3PNwnmiqR+hEnUk79G4/vPNIfbpfH15pAjqomdNrHT74+dxnC3GXdowe7g9dgpij8MMLhGXomdxnBnGfP9NQF3nWax/guxU+oT46ZHXVzgr4uZcn97vBSxBJfipUdhpu0S7qi/vhQvDXn7GfAhUeOZRFb6RjIHzCSxUFx1HUwDv5BiqNQPDnfb4A+ew4xJB3tKmXcUDYK573VCj5RSfKoS21zCXTfCY48zZRhnJOIrsJtAtXDwNTJ2MR66fAxkYnX5mvi4B9vZq7O0z2J0QIfrSJajpR7IDxMfyVSYzYlrsUpluMjHOYINVLmhc0wzsSfA8NXx1ig6VbWLU3uYO4U4K8NFUakar9Q/nmMf0xSloBlMElG0cIBzdiUSeSF+kjGQp5gL94C8brACtOLIG3DVunmc6BRHooM01sMii9sg7/nUY2G3CFzoL5SWz3UmDI2IamAG8kvpiJG8zPzj7E/7zMRG7A5qN8IFDPGEU8CFXI+7FiNagT48OnXe9bCXCJ5ZEGykVJkcMpnTdDKQKoxOQb5lTXiVuDBIcLEFv8QnkSOrU8WIp4GEtSEOPmd+so2c9SvkFlTYwpQjhuuYPAWfiOzYnkdCvCZHPQzlOWCwN4HdzoO69he23stYRfS5PziU9gR7vRPSPQGD8lJcH3zzUehLBwHtjwmmtR2MkYwLdfMcCzIbG+U+YIZ4RH454XLHlpB5SamVMIG9Eut2YurGFp9xu50ze5fZ5NjuwFpz7pcrsHs46Bf86tAJwd4ZocALg2XEyvA+AhnS4NP3qjsvtCvwy/UIdMbJV9ZR6eNt8z0CDhaAkw4jeQtML8wX1rCTxWCviBSTctYfvs4kVjCbwh6OWGN7DfrWtvwMzXV+4G3YS+4lk7dhT7c1HOVHVZlo2IdfxZJy5X0VPi8S+0X7BthfPV8iZcmL+/wOPjs67OtJHg2cJ1/HxVfmn+SyAK7XmCkkrPszAeuMR32EcwnWx8F5sivC2QMx2yTsk0m2QzVeUYkV/Dku0aOInNHm7aq+YU/oB5a+nBTjElvYAxzFZQPVDlJm2/7oeEY9H6QLfs00ixkr0AOcaFBsf4Eyo2ypkjMhEewOoVmH09BIRLu+vNIfFpF82mVB0l7SLybMOORZ0GlQAOnicDo58rGGCdEBeQ9CmHUiQKC5sODfKXyP5yjqVPcm//9WWM4rKcYN2sl9SEx9TJSAbLdLBAjeYmp3vK0Lv2fKeZJ33fHStRvc/8SvB/JfzRRTtHO6/Bt+z9TkbDe8Q8M7NLxDwzs0vEPDOzS8Q8M7NLxDwzs0vEPDOzS8w6fxDgxV+Zk6/oW8w3/14WgXnqOSpyV9A+9wQ0Zcwzs0vEPDOzS8Q8M7NLxDwzs0vEPDOzS8Q8M7NLxDwzv8bVb4bbxDq/6Dj0/jHVr/Ud6BusA7fN/zW2/4IVrDOzS8Q8M7NLxDwzs0vEPDOzS8Q8M7NLxDwzs0vEPDO/xtVvhdvANJ1p/F9Gm8w/sfwf/ie2PwEu9wpiHO75T5hvfGXOAd8pv9b3hXRp1m+CT25xveh8HWJf3xb+h50/sw2tn7L06EX/pEmKvvw6jWx0msouX
3vQ8je73GXwkD8me8AoWtvgKlTfxR5dX6+AsQaZF/rF+GyM2tT8tLjaj2Xsi3YI36i7HW+rEuh8Urez1F/hFP1fo4jn+sy6k/Z2q4UX4rnlKDx3vf+PjaY1GrVfa25Nc9b4m54XlLV4GtK9vVSUwFkZTeHfSal/2cT1XE9dfx/WKZckSnlX99MjYWSAC9FDFt80ozLxvqRSt602v62pe1X1AvfeHQm5e994W8dMWuqg9tu/IG3lpH1TfXsrd57Nf6DTo/RuXzpf+8b9XqZznUH/bqrvojPHu+99sydxslCYBVzQWMPizbw8bYWnH2MD6Es0wEUJvu3tF9FBGDjWCb7gmogZIF0Bzjd3g1rrYFl2R55izZMf6hToi98DzAGmavuiKKIMtKrgeG2Us3aB/hioiXXVHhSYg3P/bwNctv5ZPIPXG7/esLH5fL3JAN/obN6QVEpGv+Cc9CZOpR6emCv/vYZyG+URpf/yxE5oZHgzYx+iZG38Tomxh9E6NvYvRNjL6J0Tcx+iZG38Tomxh9E6P/26zwFTH6dzFAtRg9e4kC+awoPfO3/jqgxiMXSeP8Tv8GLb2TDa4SXBRO/GIook238PTvmzjdt9Co7LsVq/mupWWf7Z0b5G093zPyolQdN8OgFKn4YL2/JvBxBSMvxBpSuvDjMVKNzWKfw/SzRHkcCqOK3b1YP5/XzRFq7A8R6pchDV83vh8Wq2+UYMX7uoFq/A8= \ No newline at end of file diff --git a/tex/master-thesis-peters.tex b/tex/master-thesis-peters.tex index 4f1c8c6..f280605 100644 --- a/tex/master-thesis-peters.tex +++ b/tex/master-thesis-peters.tex @@ -220,7 +220,7 @@ \chapter*{Selbstständigkeitserklärung} Hiermit erkläre ich, Martin Peters, dass ich die vorliegende Masterarbeit selbständig angefertigt habe. Es wurden nur die in der Arbeit ausdrücklich benannten Quellen und Hilfsmittel benutzt. Wörtlich oder sinngemä\ss \ übernommenes Gedankengut habe ich als solches kenntlich gemacht. - \vspace{15mm} + \vspace{25mm} %\hspace{5mm} \hfil diff --git a/tex/tables/100-knx-ack.tex b/tex/tables/100-knx-ack.tex index 41603dd..eb1a591 100644 --- a/tex/tables/100-knx-ack.tex +++ b/tex/tables/100-knx-ack.tex @@ -15,6 +15,6 @@ BUSY & 1 & 1 & 0 & 0 & 0 & 0 & 0 & 0 \\\midrule NACK + BUSY & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\\bottomrule \end{tabularx} - \caption[\knx acknowledge telegram]{\knx short acknowledge telegram.} + \caption[KNX acknowledge telegram]{\gls{knx} short acknowledge telegram.} \label{tab:background:bas:knx:proto:ack} \end{table} \ No newline at end of file diff --git a/tex/tables/100-knx-ctrl.tex b/tex/tables/100-knx-ctrl.tex index 2af54a9..4765860 100644 --- a/tex/tables/100-knx-ctrl.tex +++ b/tex/tables/100-knx-ctrl.tex @@ -17,6 +17,6 @@ Poll Telegram & 1 & 1 & 1 & 1 & 0 & 0 & \multicolumn{2}{c|}{ } \\\midrule Acknowledge Telegram & \multicolumn{2}{c|}{ } & 0 &0 & \multicolumn{2}{c|}{ } & \multicolumn{2}{c|}{ } \\\bottomrule \end{tabularx} - \caption[\knx CTRL Byte]{\knx CTRL Byte. Telegram Type (TT), Repeat (R), Acknowledge (A), and Priority (P). cf. \textcite{Sokollik2017}} + \caption[KNX CTRL Byte]{\gls{knx} CTRL Byte. Telegram Type (TT), Repeat (R), Acknowledge (A), and Priority (P). cf. \textcite{Sokollik2017}} \label{tab:background:bas:knx:proto:ctrl} \end{table} \ No newline at end of file diff --git a/tex/tables/100-knx-ctrle.tex b/tex/tables/100-knx-ctrle.tex index 0194dc6..c0f899b 100644 --- a/tex/tables/100-knx-ctrle.tex +++ b/tex/tables/100-knx-ctrle.tex @@ -13,6 +13,6 @@ Bit & 7 & 6 & 5 & 4 & 3 & 2 & 1 & 0 \\\midrule Function & AT & \multicolumn{3}{c|}{Hops} & \multicolumn{4}{c|}{EFF} \\\bottomrule \end{tabularx} - \caption[\knx CTRLE Byte]{\knx CTRLE Byte. Address Type (AT), Hop Count, and Extended Frame Format (EFF). cf. \textcite{Sokollik2017}} + \caption[KNX CTRLE Byte]{\gls{knx} CTRLE Byte. Address Type (AT), Hop Count, and Extended Frame Format (EFF). cf. 
\textcite{Sokollik2017}} \label{tab:background:bas:knx:proto:ctrle} \end{table} \ No newline at end of file diff --git a/tex/tables/100-knx-data.tex b/tex/tables/100-knx-data.tex index 18b84a3..8de6f25 100644 --- a/tex/tables/100-knx-data.tex +++ b/tex/tables/100-knx-data.tex @@ -21,7 +21,7 @@ Bit & & & & & & & & & & & & & & & & & & & & & & & & \\\midrule Function & \multicolumn{16}{c|}{Payload $n+1$ Bytes} & \multicolumn{8}{c|}{Parity} \\\bottomrule \end{tabularx} - \caption[Standard KNX data telegram]{Standard \acs{knx} data telegram with $2$ to $16$ bytes of payload. Control Byte (CTRL) cf. Table~\ref{tab:background:bas:knx:proto:ctrl}, Source Address, Destination Address cf. Table~\ref{tab:background:bas:knx:topo:addr}, Address Type (AT), Hop Count (Hops), Payload Length (Length), Payload, and Parity.} + \caption[Standard KNX data telegram]{Standard \gls{knx} data telegram with $2$ to $16$ bytes of payload. Control Byte (CTRL) cf. Table~\ref{tab:background:bas:knx:proto:ctrl}, Source Address, Destination Address cf. Table~\ref{tab:background:bas:knx:topo:addr}, Address Type (AT), Hop Count (Hops), Payload Length (Length), Payload, and Parity.} \label{tab:background:bas:knx:proto:knx-standard} \end{table} @@ -50,6 +50,6 @@ Bit & & & & & & & & & \multicolumn{16}{c|}{ } \\\cmidrule{1-9} Function & \multicolumn{8}{c|}{Parity} & \multicolumn{16}{c|}{ } \\\bottomrule \end{tabularx} - \caption[Extended KNX data telegram]{Extended \acs{knx} data telegram with $2$ to $255$ bytes of payload. Control Byte (CTRL) cf. Table~\ref{tab:background:bas:knx:proto:ctrl}, Extended Control Byte (CTRLE) cf. Table~\ref{tab:background:bas:knx:proto:ctrle}, Source Address, Destination Address cf. Table~\ref{tab:background:bas:knx:topo:addr}, Payload Length (Length), Payload, and Parity.} + \caption[Extended KNX data telegram]{Extended \gls{knx} data telegram with $2$ to $255$ bytes of payload. Control Byte (CTRL) cf. Table~\ref{tab:background:bas:knx:proto:ctrl}, Extended Control Byte (CTRLE) cf. Table~\ref{tab:background:bas:knx:proto:ctrle}, Source Address, Destination Address cf. Table~\ref{tab:background:bas:knx:topo:addr}, Payload Length (Length), Payload, and Parity.} \label{tab:background:bas:knx:proto:knx-extended} \end{table} \ No newline at end of file diff --git a/tex/tables/100-knx-poll.tex b/tex/tables/100-knx-poll.tex index e4b7d44..10fdd7f 100644 --- a/tex/tables/100-knx-poll.tex +++ b/tex/tables/100-knx-poll.tex @@ -19,6 +19,6 @@ Bit & & & & & & & & & \multicolumn{16}{c|}{ } \\\cmidrule{1-9} Function & \multicolumn{8}{c|}{Parity} & \multicolumn{16}{c|}{ } \\\bottomrule \end{tabularx} - \caption[\knx poll telegram]{\knx poll telegram. Control Byte (CTRL) cf. Table~\ref{tab:background:bas:knx:proto:ctrl}, Source Address, Destination Address cf. Table~\ref{tab:background:bas:knx:topo:addr}, expected length of poll data (poll data), and Parity.} + \caption[KNX poll telegram]{\gls{knx} poll telegram. Control Byte (CTRL) cf. Table~\ref{tab:background:bas:knx:proto:ctrl}, Source Address, Destination Address cf. Table~\ref{tab:background:bas:knx:topo:addr}, expected length of poll data (poll data), and Parity.} \label{tab:background:bas:knx:proto:knx-poll} \end{table} \ No newline at end of file diff --git a/tex/tables/100-knx-tpci-apci.tex b/tex/tables/100-knx-tpci-apci.tex index b3f2efd..a1bc206 100644 --- a/tex/tables/100-knx-tpci-apci.tex +++ b/tex/tables/100-knx-tpci-apci.tex @@ -13,6 +13,6 @@ & DC & No & \multicolumn{4}{c|}{Seq. 
Number} & \multicolumn{10}{c|}{} \\\bottomrule \end{tabularx} - \caption[KNX TPCI/APCI structure in a standard data telegram]{\acrshort{knx} \acrshort{tpci}/\acrshort{apci} structure in a standard data telegram. Data/Control flag (DC), Numbered/Unnumbered flag (No), Sequence Number (Seq. Number).} + \caption[KNX TPCI/APCI structure in a standard data telegram]{\gls{knx} \acrshort{tpci}/\acrshort{apci} structure in a standard data telegram. Data/Control flag (DC), Numbered/Unnumbered flag (No), Sequence Number (Seq. Number).} \label{tab:background:bas:knx:comm:tpci-apci} \end{table} \ No newline at end of file