
Few-shot Satellite Image Classification (OPS-SAT)

Welcome to the Few-shot Satellite Image Classification (OPS-SAT) repository. Follow the steps below to get started:

Usage Guide

  1. Clone the Repository:

    git clone https://github.com/ShendoxParadox/Few-shot-satellite-image-classification-OPS-SAT.git
  2. Navigate to Repo Root Folder:

    cd Few-shot-satellite-image-classification-OPS-SAT

Using Conda

Create a Virtual Environment

Make sure you have Conda installed on your machine.

# Create a virtual environment with Python 3.9
conda create --name myenv python=3.9

# Activate the virtual environment
conda activate myenv

# Install project dependencies
pip install -r requirements.txt

Using Docker

  1. Build Docker Image:

    docker build --no-cache -t ops_sat:latest .
  2. Run Docker Container:

    docker run -it ops_sat
  3. Modify Configuration: Edit the config.json file as needed:

    nano config.json
  4. Navigate to Source Folder:

    cd src/
  5. Run OPS-SAT Development Script:

    python OPS_SAT_Dev.py
  6. Choose W&B Option: Select the desired WandB option when prompted during script execution.

  7. View Run Results: Navigate to the WandB dashboard to observe the run results.

  8. Alternatively, pull and run the prebuilt Docker image instead of building it locally:

    docker pull ramezshendy/ops_sat:latest
    docker run -it ramezshendy/ops_sat:latest

For any additional information or troubleshooting, refer to the documentation or contact the repository owner.

Config File Guide

The following parameters are set in config.json; an illustrative example file is shown after the parameter descriptions.

  • Dataset Name: The OPS-SAT case dataset
  • Dataset Variation Description: Augmented Color Corrected Synthetic Variation

Dataset Paths

  • Training/Validation Dataset Path: ../Data/Variation_Synthetic_Generation_color_corrected_Augmentation/train/
  • Test Dataset Path: ../Data/Variation_Synthetic_Generation_color_corrected_Augmentation/test/
    Set the training and test dataset paths to any of the available dataset variations in the Data folder.

Model Configuration

  • Transfer Learning: false
    When false, the model uses only ImageNet pretraining; when true, it applies transfer learning from the dataset selected below (see the sketch after this list).
  • Transfer Learning Dataset: landuse
    The available transfer learning datasets are: landuse, imagenet, opensurfaces
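
As a rough illustration of these options (a minimal sketch, not the repository's exact code), a Keras backbone can be loaded and partially frozen as follows. MobileNetV2 with ImageNet weights stands in for the project's landuse/opensurfaces checkpoints, and the number of frozen layers corresponds to the Number of Freeze Layers parameter listed under Model Parameters.

# Minimal transfer-learning sketch (illustrative only)
import tensorflow as tf

def build_model(input_shape=(200, 200, 3), num_classes=8,
                transfer_learning=True, num_freeze_layers=5, dropout=0.5):
    # Stand-in backbone; the project ships its own pretrained weights.
    base = tf.keras.applications.MobileNetV2(
        input_shape=input_shape, include_top=False, weights="imagenet")
    if transfer_learning:
        # Keep the first `num_freeze_layers` layers fixed during fine-tuning.
        for layer in base.layers[:num_freeze_layers]:
            layer.trainable = False
    return tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dropout(dropout),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])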

Model Parameters

  • Project: OPS-SAT-Thesis-Project
  • Input Shape: [200, 200, 3]
  • Number of Classes: 8
  • Dropout: 0.5
  • Output Layer Activation: Softmax
  • Model Optimizer: Adam
  • Loss Function: FocalLoss
    The implemented loss functions are: FocalLoss, SparseCategoricalCrossentropy
  • Model Metrics: [SparseCategoricalAccuracy]
  • Early Stopping:
    • Monitor: val_sparse_categorical_accuracy
    • Patience: 6
  • Model Checkpoint:
    • Monitor: val_sparse_categorical_accuracy
  • Cross Validation K-Fold: 5
  • Number of Epochs: 200
  • Batch Size: 4
  • Focal Loss Parameters:
    • Alpha: 0.2
    • Gamma: 2
      Applies only when the loss function is FocalLoss.
  • Number of Freeze Layers: 5
    Applies only when Transfer Learning is true.
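
For reference, a config.json along these lines covers the parameters above. The key names shown here are illustrative assumptions, not necessarily the exact names used by OPS_SAT_Dev.py, so check them against the file shipped in the repository.

{
  "dataset_name": "The OPS-SAT case dataset",
  "dataset_variation_description": "Augmented Color Corrected Synthetic Variation",
  "train_dataset_path": "../Data/Variation_Synthetic_Generation_color_corrected_Augmentation/train/",
  "test_dataset_path": "../Data/Variation_Synthetic_Generation_color_corrected_Augmentation/test/",
  "transfer_learning": false,
  "transfer_learning_dataset": "landuse",
  "project": "OPS-SAT-Thesis-Project",
  "input_shape": [200, 200, 3],
  "num_classes": 8,
  "dropout": 0.5,
  "output_layer_activation": "Softmax",
  "model_optimizer": "Adam",
  "loss_function": "FocalLoss",
  "model_metrics": ["SparseCategoricalAccuracy"],
  "early_stopping": {"monitor": "val_sparse_categorical_accuracy", "patience": 6},
  "model_checkpoint": {"monitor": "val_sparse_categorical_accuracy"},
  "cross_validation_k_fold": 5,
  "num_epochs": 200,
  "batch_size": 4,
  "focal_loss": {"alpha": 0.2, "gamma": 2},
  "num_freeze_layers": 5
}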

Project Structure

- /OPS-SAT-Thesis-Project
  - /Data
    - /Variation_Synthetic_Generation_color_corrected_Augmentation
      - /train
        - /Agricultural
        - /Cloud
        - /Mountain
        - /Natural
        - /River
        - /Sea_ice
        - /Snow
        - /Water
      - /test
    - /ops_sat
    - /Variation_Augmentation
    - /Variation_Original
    - /Variation_Synthetic_Generation
    - /Variation_Synthetic_Generation_color_corrected
  - /src
    - OPS_SAT_Dev.py
    - color_correction.py
    - image_augmentation.py
    - Your source code files
  - /notebooks
  - /models
    - best_weights.h5
    - fold_1_best_model_weights.h5
    - fold_2_best_model_weights.h5
    - fold_3_best_model_weights.h5
    - fold_4_best_model_weights.h5
    - fold_5_best_model_weights.h5
  - README.md
  - Dockerfile
  - config.json
  - .gitignore
  - requirements.txt

WandB dashboard

The following are examples of what can be found in the WandB dashboard after each run:

  • Class Accuracies
  • Correct Predictions
  • Wrong Predictions
  • Train - Val Accuracy
  • Train - Val Loss
  • System Charts
  • Run Info
  • Configuration Parameters
  • Model Files
  • Runs Parallel Plot
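
These panels come from standard Weights & Biases logging. As a rough sketch (assuming the wandb Python client; the metric names are hypothetical, not the exact keys used by OPS_SAT_Dev.py), a run records its configuration and metrics roughly like this:

# Illustrative W&B logging sketch (not the repository's exact code)
import json
import wandb

# config.json sits at the repository root, one level above src/
with open("../config.json") as f:
    config = json.load(f)

run = wandb.init(project="OPS-SAT-Thesis-Project", config=config)

# Logged once per epoch; drives the Train - Val Accuracy/Loss charts
wandb.log({"train_accuracy": 0.91, "val_accuracy": 0.84,
           "train_loss": 0.27, "val_loss": 0.41})

# Per-class metrics appear in the Class Accuracies panel
wandb.log({"class_accuracy/Cloud": 0.95, "class_accuracy/Snow": 0.88})

run.finish()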
