
Welcome to the benchmark wiki!

This manual leads you through the project and tool setup and explains how to use our benchmarking implementation to reproduce the results. The benchmark is set up in four steps:

  1. Set up the static analysis tools,
  2. run gradle shadowJar to generate the build,
  3. adapt the properties file and change the paths to the tools,
  4. run the tool with java -jar benchmarking.jar.
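
For orientation, the four steps roughly correspond to the following shell sketch (assuming the folder layout described below; the location of the generated benchmarking.jar may differ depending on your Gradle setup):

```sh
# Step 1: set up the individual static analysis tools as described in the sections below.
gradle shadowJar                                    # step 2: build the benchmark jar
$EDITOR data_leak_detection.properties.defaults     # step 3: adapt tool paths and enabled flags
java -jar benchmarking.jar                          # step 4: run the benchmark on apksToTest/
```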

We start with the setup of the tools included in the benchmark: COVERT, FlowDroid, HornDroid, IC3, and IccTA. The GitHub repository contains the following folders, which we used for running the analysis. You can either use our proposed folder structure or adapt the properties file with the modified paths to the required artefacts. We propose the following folders:

  • benchmark/src: Contains the source code to our own benchmarking implementation to run all tools and collect and summarise the results.
  • tools: For each tool included in the benchmark there is a separate folder. The tools’ artefacts and resources need to be placed there. During the analysis, the tools create result files in these folders, which are then used by our benchmarking implementation. The tools folder also contains a commonConfig folder holding all shared resources used for the analysis. To run the tools with shared configurations, you have to provide the following resources in the commonConfig folder:
    – android.jar: We used API level 23 for this work. There are, however, good collections of android.jar files for different API levels available online.
    – SourcesAndSinks.txt: Text file containing the list of sources and sinks to be checked for data flows during the analysis. We recommend using the SuSi tool (https://blogs.uni-paderborn.de/sse/tools/susi/) to obtain such a list based on the selected android.jar.
    – AndroidCallbacks.txt: Text file containing the list of callback methods. We recommend using an existing list, for example the one provided by FlowDroid.
    – apktool.jar: We recommend using the latest version of Apktool (https://ibotpeaches.github.io/Apktool/).
  • results: The summarised and grouped findings from our benchmarking implementation are stored here, including a summary text report, a summary csv report, and a csv file showing which tools timed out and which ones completed the analysis.
  • apksToTest: All applications that should be analysed need to be put in this folder.
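
If you want to recreate this layout from scratch, the following shell sketch creates the folders listed above (the name iccta for the IccTA folder and the tools_helper/dare folder used later for IC3's Dare dependency are assumptions based on the per-tool sections):

```sh
# Create the proposed folder structure from the project root.
mkdir -p benchmark/src \
         tools/commonConfig tools/covert tools/flowDroid tools/horndroid tools/ic3 tools/iccta \
         tools_helper/dare \
         results apksToTest
```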

COVERT Setup

Clone the benchmarking project from GitHub and navigate to the tools/covert folder inside the project. Then follow the steps below to set up COVERT.

  1. Obtain COVERT
     Note that the folder already contains certain files and folders. These should not be changed for the benchmark to work. In order for COVERT to run, you need to obtain the following artefacts and store them in the covert folder. Make sure to rename the files as indicated or adapt the running command later on. First, obtain the COVERT back-end (https://www.ics.uci.edu/~seal/projects/covert/). Then unpack the archive and extract the following files and folders to the covert folder in the benchmarking project directory:
     • covert.bat
     • covert.sh
     • appRepo (folder)
     • resources (folder)
     Copy and paste the following resources into the configCustom folder inside the covert directory. We denote the root directory of the project as ~:
     • ~/resources/AndroidPlatforms/android-8/android.jar
     • ~/resources/Covert/resources/apktool/apktool.jar
     • ~/resources/FlowDroid/resources/AndroidCallbacks.txt
     • ~/resources/FlowDroid/resources/SourcesAndSinks.txt
     These are the configurations that we change to run the tool with shared configurations. We copy the artefacts to the configCustom folder so that the user can easily switch between the original and the shared configurations.
  2. Running Command
     Go to the covert directory and make sure that all required files and artefacts are present. Copy a sample application into the app_repo folder. Then run the following command to start the analysis:

         ./covert.sh <APPLICATION_NAME>.apk

     Instead of a single application, you can also analyse multiple applications with COVERT. To do so, create an additional folder inside the app_repo folder and pass the name of that folder instead of the application name in the running command. Make sure to run the tool on a sample application and verify that the results are as expected. We recommend the SendSMS.apk provided by DroidBench (https://github.com/secure-software-engineering/DroidBench) for a test run. COVERT should detect a data leak for the sendTextMessage() method. The source code of the application is available on DroidBench.
  3. Results
     The results of the analysis are located in the app_repo folder. A new folder with the same name as the analysed application should have been created during the analysis. This folder contains the results as a .xml file.
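
As an illustration of the multi-application mode described in step 2 above, a minimal sketch (the batch folder name myApps is hypothetical):

```sh
# Run COVERT on a batch of applications instead of a single APK.
mkdir app_repo/myApps                        # hypothetical batch folder
cp ../../apksToTest/*.apk app_repo/myApps/   # collect the APKs to analyse
./covert.sh myApps                           # pass the folder name instead of an APK name
```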

FlowDroid Setup

Clone the benchmarking project from GitHub and navigate to the tools/flowDroid folder. Then follow the steps below to set up the FlowDroid tool.

  1. Obtain FlowDroid
     Note that the folder already contains certain files and folders. These should not be changed for the benchmark to work. In order for FlowDroid to run, you need to obtain the following artefacts and store them in the flowDroid folder. Make sure to rename the files as indicated or adapt the running command later on. The following additional libraries are required:
     • soot-trunk.jar
     • soot-infoflow.jar
     • soot-infoflow-android.jar
     • slf4j-api-1.7.5.jar and slf4j-simple-1.7.5.jar (libraries for logging)
     • axml-2.0.jar (Android XML parser library)
     • android.jar (Android SDK): For the analysis we use API level 23.
     Furthermore, you need to obtain the following configuration files and store them in the same folder as the artefacts above:
     • EasyTaintWrapperSource.txt (taint wrapper)
     • AndroidCallbacks.txt (Android callbacks)
     • SourcesAndSinks.txt (sources and sinks)
     Make sure to also copy and paste the SourcesAndSinks.txt and AndroidCallbacks.txt files into the configCustom folder. This is required to later create soft links that allow switching between running the tools with original or shared configurations.
  2. Running Command
     Go to the flowDroid directory and make sure that all required files and artefacts are present. Then run the following command to start the analysis:

         java -Xmx4g -cp soot-trunk.jar:soot-infoflow.jar:soot-infoflow-android.jar:slf4j-api-1.7.5.jar:slf4j-simple-1.7.5.jar:axml-2.0.jar soot.jimple.infoflow.android.TestApps.Test <PATH_TO_APPLICATION> ./android.jar > flowdroid_results.txt

     For the benchmark, we put the applications that should be analysed in the apksToTest folder. The corresponding path would be ../../apksToTest/<APPLICATION_NAME>.apk.
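
For example, a test run on the DroidBench application recommended below could look like this (a sketch; it assumes the artefact names from step 1 and that the APK has been copied into apksToTest):

```sh
# Analyse DroidBench's SharedPreferences1.apk from within tools/flowDroid.
java -Xmx4g -cp soot-trunk.jar:soot-infoflow.jar:soot-infoflow-android.jar:slf4j-api-1.7.5.jar:slf4j-simple-1.7.5.jar:axml-2.0.jar \
  soot.jimple.infoflow.android.TestApps.Test \
  ../../apksToTest/SharedPreferences1.apk ./android.jar > flowdroid_results.txt
```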

     Make sure to run the tool on a sample application and verify that the results are as expected. We recommend the SharedPreferences1.apk provided by DroidBench for a test run. FlowDroid should detect all data leaks indicated in the application's source code, which is available on DroidBench.
  3. Results
     The results of the analysis are stored in the flowdroid_results.txt file. They are also printed to the console.

HornDroid Setup

For HornDroid the setup is straightforward and well documented on its project page.

  1. Obtain HornDroid
     First, clone the HornDroid GitHub project (https://github.com/ylya/horndroid.git) and build it using mvn clean package. A new folder target is created. Move the content of this folder to the horndroid folder inside the benchmarking directory. In addition, copy and paste the following files into the horndroid/configCustom folder:
     • apktool.jar
     • Callbacks.txt
     • SourcesAndSinks.txt
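
A shell sketch of this obtain-and-build step (<BENCHMARK_ROOT> is a placeholder for your checkout of the benchmarking project):

```sh
# Clone and build HornDroid, then move the build output into the benchmark's horndroid folder.
git clone https://github.com/ylya/horndroid.git
cd horndroid && mvn clean package
mv target/* <BENCHMARK_ROOT>/tools/horndroid/
```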
  2. Running Command
     Go to the horndroid directory and make sure that all required files and artefacts are present. Then run the following command to start the analysis:

         java -jar fshorndroid-0.0.1.jar / ./apktool.jar <PATH_TO_APPLICATION>
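
If you want to try HornDroid on every application in the shared apksToTest folder, a small loop like the following can be used (a sketch; the relative path assumes the folder layout proposed in this manual):

```sh
# Run HornDroid on each APK in apksToTest, one after another.
for apk in ../../apksToTest/*.apk; do
  java -jar fshorndroid-0.0.1.jar / ./apktool.jar "$apk"
done
```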

     Make sure to run the tool on a sample application and verify that the results are as expected. We recommend the SharedPreferences1.apk provided by DroidBench for a test run. HornDroid should detect all data leaks indicated in the application's source code, which is available on DroidBench.
  3. Results
     The results of the analysis are located in the OUTPUT.report folder as a JSON file. They are also printed to the console.

IC3 Setup

The setup of IC3 also requires the installation of the Dare tool. Dare is used to decompile Android applications from the installation image to source code on which IC3 can operate.

  1. Obtain IC3
     We start by setting up the Dare tool. First, go to the Dare installation page (http://siis.cse.psu.edu/dare/installation.html) and download the latest version. After the download, unzip the installation package and create a folder named output. Move the whole content of the installation package to the tools_helper/dare folder. Now you can start with the setup of the IC3 tool. Begin by cloning the project from GitHub and building it with the following commands:

         git clone https://github.com/siis/ic3
         cd ic3
         mvn clean package -P standalone

     Move the content of the newly created target folder to the ic3 folder inside the benchmarking directory. Two folders are already present there, dareOutput and ic3output; they are used to store the results of Dare and IC3, respectively.
  2. Running Command
     Before starting the analysis, we have to run the Dare tool on the application to decompile it. To do so, run the following command from within the Dare directory:

         ./dare -d <PATH_TO_IC3_DAREOUTPUT> <PATH_TO_APPLICATION>
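
As a concrete example of this two-stage run, using the DroidBench application recommended below (a sketch; the relative paths assume the folder layout proposed in this manual, so adjust them to your checkout):

```sh
# 1) Decompile the test APK with Dare (run from tools_helper/dare);
#    the dareOutput location depends on where your ic3 folder lives.
./dare -d ../../tools/ic3/dareOutput ../../apksToTest/StartActivityForResult1.apk
# 2) Analyse the decompiled application with IC3 (run from the ic3 folder), as described below.
./runIC3.sh ../../apksToTest/StartActivityForResult1.apk
```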

     Afterwards, the folder ~/ic3/dareOutput should contain the decompiled files. Now we are all set to run the IC3 tool with the following command from within the ic3 folder:

         ./runIC3.sh <PATH_TO_APPLICATION>

     Make sure to run the tool on a sample application and verify that the results are as expected. We recommend the StartActivityForResult1.apk provided by DroidBench for a test run. IC3 should detect one data leak at the startActivityForResult() method.
  3. Results
     The results of the analysis are located in the ic3output folder as a text file.

IccTA Setup

The setup of IccTA is the most time-intensive one. It requires building the tool yourself and setting up a MySQL database to store the intermediate results. Furthermore, IccTA uses several frameworks that you can build yourself; however, pre-built versions are available online, and we use those for this benchmark.

  1. Obtain IccTA
     Begin by importing all of the following projects into Eclipse or another IDE:
     • jasmin
     • heros
     • soot
     • soot-infoflow
     • soot-infoflow-android
     • soot-infoflow-android-iccta
     Then change the build path of soot-infoflow-android-iccta to include the above projects and build each project. Afterwards, create a new folder called output_iccta inside the soot-infoflow-android-iccta directory.

  2. Database Setup
     To store intermediate results during the analysis, IccTA uses a MySQL database. Start MySQL and create the following database:

         mysql -u root -p -e 'create database cc;'
         mysql -u root -p cc < res/schema

     As a database name, you have to use cc since it is hardcoded in the provided IC3 tool. When you are done importing the IccTA schema, you have to adapt the database properties of the tool. The following files need to be updated with the correct username and password for the database:
     • ~/iccProvider/ic3/runIC3.sh
     • ~/iccProvider/ic3/runIC3Wrapper.sh
     • ~/res/jdbc.xml
     • ~/res/iccta.properties
     • ~/src/soot/jimple/infoflow/android/iccta/util/Constants
  3. Running Command
     Before starting the analysis, we have to run the IC3 tool on the application. To do so, run the following command from within the ~/iccProvider/ic3 directory:

         ./runIC3.sh <PATH_TO_APPLICATION>

     Now, go to the release directory and run the following command to start IccTA:

         java -jar IccTA.jar <PATH_TO_APPLICATION> <PATH_TO_ANDROID_SDK> -iccProvider ../iccProvider/ic3

     Make sure to run the tool on a sample application and verify that the results are as expected. We recommend the SharedPreferences1.apk provided by DroidBench for a test run. IccTA should detect all data leaks indicated in the application's source code, which is available on DroidBench.
  4. Results
     The results of the analysis are located in the output_iccta folder as a text file.
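
If you prefer not to run IccTA with the MySQL root account, you can create a dedicated user and grant it access to the cc database (a generic MySQL sketch, not something provided by IccTA; the user name iccta and <PASSWORD> are placeholders you would then enter in the files listed above):

```sh
# Create a dedicated MySQL user for IccTA and grant it access to the cc database.
mysql -u root -p -e "CREATE USER 'iccta'@'localhost' IDENTIFIED BY '<PASSWORD>';
                     GRANT ALL PRIVILEGES ON cc.* TO 'iccta'@'localhost';"
```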

A.2 Benchmarking Setup

In order for our benchmarking implementation to work, you have to make sure that all tools are working as explained in Appendix A.1.

A.2.1 Build Benchmark

After setting up the tools, you have to build the benchmark as a jar file containing all required dependencies. This can be done using our gradle script. Note that these are only dependencies such as Apache Commons or JUnit, not the tools themselves. Run the following command to build the benchmark:

    gradle shadowJar

This generates an executable jar file containing all relevant dependencies.

A.2.2 Adapt Configurations

If you have set up the tools in the same folder structure as presented in this manual, you can skip this section. If you have changed the location or name of certain artefacts, you can easily adapt the paths in our data_leak_detection.properties.defaults file. Furthermore, you can exclude certain tools from the benchmark by simply changing the enabled flag in the properties file to false.

A.2.3 Run Benchmark

Put the APK file of the application you would like to analyse in the apksToTest folder. Then run the following command to start the benchmark with the enabled tools:

    java -jar benchmarking.jar

The results of the analysis can be found in the results folder. It contains the individual tools' results as well as the summarised reports. The timedOut folder contains a list of all applications that were analysed, with an indication for each tool whether the analysis completed (indicated as 0), was interrupted with an exception (indicated as 1), or timed out (indicated as 2).
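
For a quick end-to-end check you can reuse the DroidBench applications recommended in the per-tool test runs above (a sketch; <DROIDBENCH_APKS> is a placeholder for wherever you keep the DroidBench binaries):

```sh
# Copy a few DroidBench test applications into apksToTest and run the benchmark.
cp <DROIDBENCH_APKS>/SendSMS.apk <DROIDBENCH_APKS>/SharedPreferences1.apk apksToTest/
java -jar benchmarking.jar
# The summarised reports and the timed-out/completed indicators end up in results/.
```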
