Website Fingerprinting

Website fingerprinting is a method of inspecting Tor or VPN traffic that aims to collect enough features and information from an individual session to help identify the activity of anonymized users.

Introduction

For this experiment, Tor is required. It can be installed by running the following commands:

# For Debian or Ubuntu
sudo apt install tor lynx

# For Fedora
sudo yum install tor lynx

# For ArchLinux
sudo pacman -S tor torsocks lynx

Alongside Tor we also get (or can install separately) a program called torsocks; this program is used to redirect the traffic of common programs through the Tor network. For example, it can be run as follows:

# SSH through Tor.
torsocks ssh user@host

# curl through Tor.
torsocks curl -L http://httpbin.org/ip

# Etc...

Required Python 3 Modules

pip install scikit-learn dpkt joblib

Data Collection

The data collection process requires two terminal windows side by side, as the process is fairly manual. It is also advisable to collect the fingerprints in a VM, in order to avoid capturing any unintended traffic. A script, capture.sh, listens for traffic and should be run in one of the terminals:

./pcaps/capture.sh duckduckgo.com

Once the listener is capturing traffic, run the following in the other terminal:

torsocks lynx https://duckduckgo.com

Once the website has finished loading, kill the capture process and close the browser session (by pressing the q key twice in lynx). Repeat this several times for each web page so that there is enough data.
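Before adding a capture to the training set, a quick dpkt check (an optional suggestion, not part of the repository) can confirm that it actually recorded traffic:

# count_packets.py - hypothetical helper, not part of this repository
import sys
import dpkt

# Count the packets in the capture given on the command line.
with open(sys.argv[1], "rb") as fp:
    count = sum(1 for _ in dpkt.pcap.Reader(fp))

print(f"{sys.argv[1]}: {count} packets")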

Machine Learning

Scikit-learn was used to write a k-Nearest Neighbors classifier that reads the pcap files specified in config.json; config.json can be edited to match whichever web pages were targeted for training. The training script is gather_and_train.py.
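To make the training step concrete, here is a minimal sketch of what it could look like. The config.json structure and the feature representation (sizes of the first 40 packets) are assumptions for illustration only; gather_and_train.py remains the authoritative implementation.

# Hedged sketch of the training step; not the repository's actual code.
import glob
import json

import dpkt
import joblib
from sklearn.neighbors import KNeighborsClassifier

N_PACKETS = 40  # only the first 40 packets of each capture are used


def packet_sizes(pcap_path, n=N_PACKETS):
    # Read the first n packet sizes and zero-pad shorter captures.
    sizes = []
    with open(pcap_path, "rb") as fp:
        for _ts, buf in dpkt.pcap.Reader(fp):
            sizes.append(len(buf))
            if len(sizes) == n:
                break
    return sizes + [0] * (n - len(sizes))


# Assumed shape of config.json: {"duckduckgo.com": "pcaps/duckduckgo*.pcap", ...}
with open("config.json") as fp:
    config = json.load(fp)

X, y = [], []
for label, pattern in config.items():
    for path in glob.glob(pattern):
        X.append(packet_sizes(path))
        y.append(label)

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X, y)
joblib.dump(clf, "classifier-nb.dmp")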

Scikit Learn kNN

Classifying Unknown Traffic

# python predict.py [pcap file to classify]
python predict.py xyz.pcap

Once training is done and classifier-nb.dmp has been created, the predict.py script can be run with a pcap file as its sole argument. The script loads the classifier and attempts to identify which web page the traffic originated from.

It is worth noting that only the first 40 packets of each sample are used, both to train the model and to run through the resulting classifier.
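A hedged sketch of the prediction step, mirroring the assumed feature representation from the training sketch above: load the dumped classifier with joblib, build the first-40-packet vector from the given pcap, and print the predicted label. predict.py is the actual implementation.

# Sketch only; the real logic lives in predict.py.
import sys

import dpkt
import joblib

N_PACKETS = 40


def packet_sizes(pcap_path, n=N_PACKETS):
    # Same (assumed) feature vector as in the training sketch above.
    sizes = []
    with open(pcap_path, "rb") as fp:
        for _ts, buf in dpkt.pcap.Reader(fp):
            sizes.append(len(buf))
            if len(sizes) == n:
                break
    return sizes + [0] * (n - len(sizes))


clf = joblib.load("classifier-nb.dmp")
print(clf.predict([packet_sizes(sys.argv[1])])[0])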

Visualizing the patterns

As the screenshot above shows, the packet patterns of each website separate clearly when plotted in three dimensions. The classifier looks at the data in a similar way, which is what lets it produce accurate results.

An interactive version of this graph can be found in the graphs folder.
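For readers who want to reproduce a plot like this, the following sketch scatters three simple summary features per capture using matplotlib (an extra dependency, not in the module list above). The axes chosen here are illustrative assumptions, not necessarily the features behind the graphs in the repository.

# Illustrative 3D scatter of per-capture summary features.
import glob
import json

import dpkt
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401  (registers the 3D projection)

with open("config.json") as fp:
    config = json.load(fp)  # assumed: {"label": "glob-of-pcaps", ...}

fig = plt.figure()
ax = fig.add_subplot(111, projection="3d")

for label, pattern in config.items():
    xs, ys, zs = [], [], []
    for path in glob.glob(pattern):
        with open(path, "rb") as fp:
            sizes = [len(buf) for _ts, buf in dpkt.pcap.Reader(fp)][:40]
        xs.append(len(sizes))                       # packet count
        ys.append(sum(sizes) / max(len(sizes), 1))  # mean packet size
        zs.append(sum(sizes))                       # total bytes
    ax.scatter(xs, ys, zs, label=label)

ax.set_xlabel("packets")
ax.set_ylabel("mean size (bytes)")
ax.set_zlabel("total bytes")
ax.legend()
plt.show()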

Limitations and Disclaimers

This setup was created to research website fingerprinting and how easily users of Tor or VPNs can be deanonymized. Traffic was captured and identified in a private setting and for purely academic purposes; this source code is intended for those purposes only.

Real-world traffic is never "clean", as was assumed here for simplicity. However, an entity with enough resources can isolate the desired anonymized traffic and feed it into even this simple classifier, which means it is entirely possible to use a method like this to compromise anonymized users.

References

  1. Wang, T. and Goldberg, I. (2017). Website Fingerprinting. [online] Cse.ust.hk. Available at: https://www.cse.ust.hk/~taow/wf/.
  2. Wang, T. and Goldberg, I. (2017). Improved Website Fingerprinting on Tor. Cheriton School of Computer Science. Available at: http://www.cypherpunks.ca/~iang/pubs/webfingerprint-wpes.pdf
  3. Wang, T. (2015). Website Fingerprinting: Attacks and Defenses. University of Waterloo. Available at: https://uwspace.uwaterloo.ca/bitstream/handle/10012/10123/Wang_Tao.pdf
