
logo.png


Kuiper

Digital Investigation Platform

What is Kuiper?

Kuiper is a digital investigation platform that provides capabilities for investigation teams and individuals to parse, search, and visualize collected evidence (which may be collected by a fast-triage script such as Hoarder). In addition, team members can collaborate on the same platform by tagging artifacts and presenting them as a timeline, as well as by setting rules to automate detection. The main purpose of this project is to streamline digital investigation activities and provide advanced analytics capabilities, with the ability to handle large amounts of data.

diagram.png

Why Kuiper?

Today there are many tools used during the digital investigation process. Although these tools help identify malicious activities and findings, digital analysts still face several shortcomings that need to be addressed:

  • Speeding up the workflow.
  • Increasing accuracy.
  • Reducing resource exhaustion.

With a large number of cases and a large number of team members, collaboration becomes hard, as does correlating events and building rules to detect malicious activities. Kuiper addresses these shortcomings.

How Does Kuiper Help Optimize the Investigation?

  • Centralized server: A single centralized server (Kuiper) does all the processing server-side, reducing the hardware resources (CPU, RAM, disk) needed by the analyst team; there is no need for powerful laptops any more. In addition, all evidence is stored on a single server instead of being copied to different machines during the investigation.
  • Consistency: Having team members rely on different parsers for the same artifacts can produce inconsistent results; using tested and trusted parsers increases accuracy.
  • Predefined rules: Defining rules in Kuiper saves a lot of time by triggering alerts on past, current, and future cases. For example, you can create a rule that flags suspicious encoded PowerShell commands across all parsed artifacts, or a suspicious binary executed from a temp folder; within Kuiper you can define these rules and more.
  • Collaboration: Browsing the parsed artifacts in the same web interface boosts collaboration among team members through tagging and the timeline feature, instead of every analyst working on his or her own machine.

Use Cases

  • Case creation: Create cases for an investigation; each case contains the list of in-scope machines.
  • Bulk evidence upload: Upload multiple files (artifacts) collected from the in-scope machines via Hoarder, KAPE, or any other channel.
  • Evidence processing: Parse the artifact files concurrently, for selected machines or for all of them.
  • Holistic view of evidence: Browse and search the parsed artifacts of all machines in the open case.
  • Rule creation: Save search queries as rules; these rules can be used to trigger alerts on future cases.
  • Tagging and timeline: Tag suspicious/malicious records and display the tagged records in a timeline. For information without records (information collected from other external sources such as firewall, proxy, or WAF logs), you can add a message to the timeline at the specific time.
  • Parser management: Collected files without a predefined parser are no longer an issue; you can write your own parser, add it to Kuiper, and it will parse these files. Read more on how to add a parser in Add Custom Parser.
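The supported way to add a parser is described in the Add Custom Parser wiki page. Purely as an illustration of the general shape (a Python function that reads one artifact file and returns a list of JSON-serializable records), here is a minimal sketch; the function name and record fields below are assumptions for this example, not Kuiper's actual interface.

```python
# Hypothetical sketch of a custom artifact parser: read a file and return
# a list of dict records that a platform like Kuiper could index.
# The function name and record fields are illustrative assumptions only.
from datetime import datetime, timezone

def my_custom_parser(file_path, parser_name="my_custom_parser"):
    records = []
    with open(file_path, "r", errors="replace") as f:
        for line_no, line in enumerate(f, start=1):
            line = line.strip()
            if not line:
                continue  # skip empty lines in the artifact
            records.append({
                # record timestamp; a real parser extracts it from the artifact
                "@timestamp": datetime.now(timezone.utc).isoformat(),
                "Data": {"line_no": line_no, "raw_line": line},
                "source": parser_name,
            })
    return records
```

A real parser would of course extract meaningful fields and timestamps from the artifact format instead of echoing raw lines.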

Examples

Create cases and upload artifacts create_cases

Investigate parsed artifacts in Kuiper create_cases

Kuiper Components

Components Overview

Kuiper uses the following components:

  • Flask: A web framework written in Python, used as the primary web application component.

  • Elasticsearch: A distributed, open source search and analytics engine, used as the primary database to store parser results.

  • MongoDB: A database that stores data in JSON-like documents that can vary in structure, offering a dynamic, flexible schema, used to store Kuiper web application configurations and information about parsed files.

  • Redis: An in-memory data structure store used as a database, cache, and message broker; in Kuiper it acts as the message broker that relays tasks to the Celery workers.

  • Celery: An asynchronous task queue/job queue based on distributed message passing, used as the main processing engine for the tasks relayed through Redis.

  • Gunicorn: A Python WSGI HTTP server that handles clients' HTTPS requests.
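Conceptually, the Redis-plus-Celery pairing above is a producer/consumer queue: the Flask application enqueues parsing tasks and worker processes pull and execute them. The toy sketch below illustrates that relay with the Python standard library only; it is not Celery code, and the task shape is invented for the example.

```python
# Toy illustration of the Redis -> Celery relay: the "web app" enqueues
# parsing tasks and a background "worker" drains them. Standard library
# only; real Kuiper uses Redis and Celery, and the task shape is invented.
import queue
import threading

task_queue = queue.Queue()  # stands in for the Redis broker
results = []                # stands in for parser output written to Elasticsearch

def worker():
    while True:
        task = task_queue.get()  # blocks, like a Celery worker waiting on Redis
        if task is None:         # sentinel value: shut the worker down
            break
        machine, artifact = task
        results.append("parsed %s for %s" % (artifact, machine))

t = threading.Thread(target=worker)
t.start()
task_queue.put(("machine-01", "MFT"))         # Flask side: enqueue tasks
task_queue.put(("machine-01", "NTUSER.DAT"))
task_queue.put(None)                          # no more work
t.join()
```

The design point is decoupling: the web application returns immediately after enqueueing, and heavy parsing happens in separate worker processes.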

Getting Started

Requirements

  • OS: 64-bit Ubuntu 18.04.1 LTS (Bionic) (preferred)
  • RAM: 4GB (minimum), 64GB (preferred)
  • Cores: 4 (minimum)
  • Disk: 25GB for testing purposes; more disk space is needed depending on the amount of data collected.
  • Docker: Docker version 20.10.17
  • Docker-Compose: docker-compose version 1.29.2

Notes

  • If you want to use more than 64GB of RAM to increase Elasticsearch performance, it is recommended to run multiple Elasticsearch nodes as a cluster on different machines instead.
  • For parsing, Celery spawns workers based on CPU cores (one worker per core). Each core parses one machine at a time; when that machine finishes, the next queued machine starts parsing. If you have a large number of machines to process at the same time, increase the number of cores.
  • To install Docker and Docker Compose on Ubuntu, run the following:
# Install Docker
sudo apt-get update
sudo apt-get install ca-certificates curl gnupg lsb-release
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt-get update
sudo apt-get install docker-ce docker-ce-cli containerd.io docker-compose-plugin
sudo docker -v

# Install Docker Compose
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
sudo docker-compose -v
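The worker-per-core note above implies a simple capacity rule of thumb: the number of sequential parsing "waves" is the number of queued machines divided by the number of cores, rounded up. The helper below is our own back-of-envelope sketch, not part of Kuiper:

```python
# Rough estimate of sequential parsing "waves" on a Kuiper host, assuming
# one Celery worker per CPU core and one machine parsed per worker at a time.
# Illustrative sketch only, not Kuiper code.
import math

def parsing_waves(machines, cores):
    if cores < 1:
        raise ValueError("need at least one core")
    return math.ceil(machines / cores)

# e.g. 10 machines on a 4-core host take about 3 waves of parsing
```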

Installation

Starting from version 2.2.0, Kuiper runs in Docker containers. There are seven Docker images:

  • Flask: the main container, which hosts the web application (check docker image).
  • MongoDB: stores the case and machine metadata.
  • Elasticsearch (es01): stores the parsed artifact data.
  • Nginx: reverse proxy for the Flask container.
  • Celery: the artifact-parsing component (check docker image).
  • Redis: queue for the Celery workers.
  • NFS (Network File System): container that holds the files shared between the Flask and Celery containers.
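The image list above corresponds to a compose topology roughly like the fragment below. This is a simplified orientation sketch only; the service names and wiring here are assumptions, and the authoritative definitions live in the repository's docker-compose.yaml.

```yaml
# Simplified sketch of the Kuiper service topology (illustration only;
# see docker-compose.yaml in the repository for the real definitions).
services:
  nginx:
    ports: ["443:443"]      # TLS entry point, proxies to flask
    depends_on: [flask]
  flask:
    depends_on: [mongodb, redis, nfs]
  celery:
    depends_on: [redis, es01, nfs]
  es01: {}     # Elasticsearch: parsed artifact data
  mongodb: {}  # case/machine metadata
  redis: {}    # broker between flask and celery
  nfs: {}      # shared files between flask and celery
```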

To start the containers, run the following commands:

sysctl -w vm.max_map_count=262144
git clone https://github.com/DFIRKuiper/Kuiper.git
cd Kuiper
docker-compose pull
docker-compose up -d

Issues

1 - Note: when you first run the containers, Elasticsearch may fail to start with the following error:

ERROR: [1] bootstrap checks failed
[1]: max virtual memory areas vm.max_map_count [65530] is too low, increase to at least [262144]

To solve the issue, run the following command (to make the setting persist across reboots, also add the line vm.max_map_count=262144 to /etc/sysctl.conf):

sysctl -w vm.max_map_count=262144

2 - Note: if you face the following issue:

Creating network "kuiper_kuiper" with driver "bridge"
Creating kuiper_es01    ... done
Creating kuiper_mongodb ... done
Creating kuiper_redis   ... done
Creating kuiper_flask   ... error
Creating kuiper_nfs     ... done
Creating kuiper_celery  ... 

ERROR: for kuiper_flask  Cannot start service flask: error while mounting volume '/var/lib/docker/volumes/kuiper_kuiper_nfs/_data': failed to mount local volume: mount :/:/var/lib/docker/vCreating kuiper_celery  ... done

ERROR: for flask  Cannot start service flask: error while mounting volume '/var/lib/docker/volumes/kuiper_kuiper_nfs/_data': failed to mount local volume: mount :/:/var/lib/docker/volumes/kuiper_kuiper_nfs/_data, data: addr=172.30.250.10: permission denied
ERROR: Encountered errors while bringing up the project.

To solve the issue, run the command again:

docker-compose up -d

Troubleshooting

To check the containers, run the command:

docker-compose ps -a

It should show results like the following:

     Name                   Command               State                         Ports                       
------------------------------------------------------------------------------------------------------------
kuiper_celery    /bin/sh -c cron && python  ...   Up                                                        
kuiper_es01      /bin/tini -- /usr/local/bi ...   Up      0.0.0.0:9200->9200/tcp,:::9200->9200/tcp, 9300/tcp
kuiper_flask     /bin/sh -c cron && gunicor ...   Up      0.0.0.0:5000->5000/tcp,:::5000->5000/tcp          
kuiper_mongodb   docker-entrypoint.sh /bin/ ...   Up      0.0.0.0:27017->27017/tcp,:::27017->27017/tcp      
kuiper_nfs       /usr/bin/nfsd.sh                 Up      0.0.0.0:2049->2049/tcp,:::2049->2049/tcp          
kuiper_nginx     /docker-entrypoint.sh ngin ...   Up      0.0.0.0:443->443/tcp,:::443->443/tcp, 80/tcp      
kuiper_redis     docker-entrypoint.sh /bin/ ...   Up      0.0.0.0:6379->6379/tcp,:::6379->6379/tcp          

If any service failed, check the logs for that service:

docker-compose logs -f --tail=100 <service>

Kuiper API

Kuiper has a limited-feature API; check the DFIRKuiperAPI repo.

Issues Tracking and Contribution

We are happy to receive any issues, contributions, and ideas.

We appreciate your sharing any parsers you develop; please send a pull request so we can add them to the parsers list.

Licenses

  • Each parser has its own license; all parsers are placed in the folder /kuiper/parsers/.

  • All files in this project are under the GPL-3.0 license unless mentioned otherwise.

Creators

Saleh Muhaysin, Twitter (@saleh_muhaysin)

Muteb Alqahtani, Twitter (@muteb_alqahtani)

Abdullah Alrasheed, Twitter (@abdullah_rush)

Kuiper's People

Contributors

0x534a, abdulrhmanalfaifi, adanbacki, blackdogbarking, dependabot[bot], dfirkuiper, heck-gd, humoud, knorahsa, leba-gd, lrapp-adan, malshalan16, mayhamad, mnr-hmm, salehmuhaysin, yasseralamri


Kuiper's Issues

Generate Timeline View csv when browsing artifacts

Hello,

Would it be possible to add the timeline export view when browsing artifacts on machines?

I like the timeline export with the views, but I'm looking for a workflow where I get that on an individual-machine basis.
Screen Shot 2022-08-05 at 15 26 18
I want to be able to click the purple export button but have it export the way the timeline export does. As it exports now, it does not include much data.
Screen Shot 2022-08-05 at 15 28 38
The above is all it includes; it doesn't seem to use any of the Timeline Views.

Problem when I run Kuiper

Hello, I'm a newbie here. When I run the $ ./kuiper_install.sh -run command, it shows me the following and I don't know how to reach the program interface. I am running 64-bit Ubuntu 18.04.1 LTS. Thanks for your attention.

(screenshots attached)

List docker-compose version

Hello,

The install guide is missing the commands to install docker-compose, as well as the recommended docker-compose version.

The most difficult part of the install process seems to be getting the right Docker version.

If you install docker-compose through apt-get, it reports a version incompatible with the YAML file.

Please list the preferred docker-compose version(s).

Thanks!

Add general Registry parser

Describe the parser
A parser that parses out all registry keys in all hives and displays values, names, etc.
As it stands, I am missing SOFTWARE keys.

Kuiper seems to parse some registry keys based on specific sources I see in regsk.

But there does not seem to be a parser that pulls out every single key from SAM, SYSTEM, SOFTWARE, etc.

Details of parser artifact file
I'm unable to pass this registry hive over at the moment, but I can find another.

sample
provide a sample of the artifact file that has the data

Already existing parser
Maybe this will help:
(https://github.com/williballenthin/python-registry)

Additional context
Add any other context or screenshots about parser.

Kuiper does not show the registry keys below; I have processed the same data with log2timeline and it processes them as shown below.

Log2timeline also includes the contents; as you can see, there is a binary in the \phone\ key (I have redacted the user).
Screen Shot 2022-09-15 at 2 20 22 PM

What content should be in .kjson?

I find that many parsers in Kuiper are built in, such as yum_sources.
(image)

I just don't understand how to create a kjson file like yum_sources.kjson. What format should it be in?

I have tried writing this content to yum_sources.kjson:

{"baseurl":"www.xxxx.com/baseurl/xxxx"}

When I upload it to Kuiper and parse it:
(image)
the kjson parser raises an error and doesn't tell me why.
The Artifacts view also shows nothing:
(image)

So, could you please write a wiki page explaining how to create a *.kjson and make sure it can be parsed by Kuiper to produce an Artifacts result? I want to write some parsers for artifacts from Linux machines, but your wiki is not easy to understand.

Error uploading Kape ZIP file

The Case panel shows this error after uploading a ZIP file from KAPE: "Error extract the zip content: compression type 9 (deflate64)"

In this case, KAPE was set up to collect artifacts and make a ZIP folder. Does Kuiper work with a vmdk container inside a ZIP file from KAPE?

Virtual Machine

Hello!
I am very much a newbie. I downloaded the Kuiper virtual machine, but I don't know how to connect to the VM from my main system in VMware. Please advise.

Create users

Dear team, thanks for creating Kuiper.

Is there any way to create new users to access the portal, and to log their actions?

Thanks in advance

Install Script

Describe the bug
gunicorn does not start because it has no certificate files.
The install script does not create certificates for gunicorn.

Additional context
In my environment I created the certificates manually.

Autoruns parser missing configuration file

Describe the bug
The Autoruns parser does not show up in the Kuiper parsers list; I think this is related to a missing configuration file.

To Reproduce
Check the parser list in Kuiper

Expected behavior
The parser shows up in Kuiper and executes correctly.

ping @mayHamad

Thanks

TypeError: string indices must be integers, not str

Hello,

When selecting all artifacts to parse, Kuiper doesn't finish parsing due to the following error in Kuiper-Celery.log:

[ERROR/ForkPoolWorker-3] Task app.controllers.parser_management.run_parserss[98d328a8-a0f7-4225-a00c-03c4a51233ec] raised unexpected: TypeError('string indices must be integers, not str',)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 385, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/flask_celery.py", line 229, in __call__
    return task_base.__call__(self, *_args, **_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 648, in __protected_call__
    return self.run(*args, **kwargs)
  File "/home/kuiper/Kuiper/app/controllers/parser_management.py", line 296, in run_parserss
    start_parsing_files(main_case_id , machine_case_id , parser_files_mapping[parser]['parser_details'] , parser_files_mapping[parser]['files_to_be_parsed'])
  File "/home/kuiper/Kuiper/app/controllers/parser_management.py", line 247, in start_parsing_files
    p = db_es.bulk_queue_push(json_res[ i : (i+chunks_size) ] ,case_id, source = parser_details['name'] , machine = machine_id , data_type=parser_details['name'] , data_path = f)
  File "/home/kuiper/Kuiper/app/database/elkdb.py", line 175, in bulk_queue_push
    push_es = self.bulk_to_elasticsearch( bulk_queue , case_id )
  File "/home/kuiper/Kuiper/app/database/elkdb.py", line 201, in bulk_to_elasticsearch
    status = err["index"]["status"]
TypeError: string indices must be integers, not str

Thank you for your effort.

Multiple Elastic Search node guide

Hello!

My Kuiper VM has 90GB of RAM assigned to it; I wanted to know the best way to add a second ES node.

"If you want to use RAM more than 64GB to increase Elasticsearch performence, it is recommended to use multiple nodes for Elasticsearch cluster instead in different machines"

I would have to edit the docker-compose.yaml, right?

Would you be able to advise on the best way to add it? I followed a guide for adding an es02 node to the YAML file, and I don't think I got it working quite right.

Health page erroring out on fresh install

Describe the bug
Hello, I have just run a new install of Kuiper following the guide and noticed that I'm getting an error on the Health page.

Maybe it's a new bug, or if I am doing something wrong please let me know.

To Reproduce

  1. Fresh install
  2. run kuiper
  3. go to health page
  4. get below error

Screen Shot 2022-10-17 at 2 31 19 PM

"flask" and "celery" not working in Kuiper Docker installation

I'm trying to install Kuiper version 2.2.0. I didn't get any errors during the Docker installation and all items installed successfully, but when I run the "docker ps -a" command I find that "kuiper_flask" and "kuiper_celery" are not running. I ran "docker-compose up -d" again but it didn't fix it.

2021-12-08_11-02-34

I also get the error "502 Bad Gateway nginx/1.21.4". I think this is because of "flask" and "celery".

image

I tried opening Kuiper with Chrome(Version 96.0.4664.45 (Official Build) (64-bit)), Edge (Version 96.0.1054.43 (Official build) (64-bit)) and Firefox(Version 95.0 (64-bit)) .

The operating system I am trying to install Kuiper is "Ubuntu 20.10"

In addition, I cannot download because the download quota from "https://mega.nz/folder/i9UjyaQQ#DK97l_CzzhCa-y_E1dPULQ" is full. Can you offer a different download option?

I look forward to your reply. I wish you good work.

Incorrect UserAssist Timestamps

Describe the bug
UserAssist timestamps are not being extracted from the artefact. When I click on the row, I can see the timestamp under a different field. Also, are the timestamps in UTC?

To Reproduce
Upload the NTUSER.DAT file

Bug in BrowserHistory (Firefox & Chrome parsers)

Describe the bug
Bug when parsing Firefox & Chrome artifacts.

  • When parsing Firefox (places.sqlite) : Browser_History Parser: list index out of range - Line No. 14
  • When parsing Chrome (History) : is not JSON serializable - pushed rec. 0

To Reproduce
Steps to reproduce the behavior:
For Firefox artifact:

  1. Any places.sqlite file where the column visit_type in the table moz_historyvisits is 9

For Chrome artifact:

  1. Any History file with the column hash (from my testing, always)

Expected behavior
No error in parsing

Kuiper Deb creation - riscv

Hi, I was trying Kuiper on RISC-V and was able to do a native build without any issue, but in these sources the Makefile doesn't have install support.
I have done the following:
*git clone https://github.com/emqx/kuiper
*cd kuiper
*make build_with_edgex

Note: in the same build environment I can run the server. I copied the binary to /usr/bin, but there is still an issue on the board, such as missing conf files.
Can you help us copy all the relevant files to their respective paths on the board?

Is there any layer support available for Yocto?

Thanks,

installation issue

Fresh OS installation of Ubuntu 18 with the latest updates.

Following exactly the same instructions, I got multiple errors. I would love to use this platform; it would be helpful if you posted a video on how to install it. Maybe something changed in the latest Ubuntu update.

thank you,

How to upload a "parsed" Hoarder zip?

After I used hoarder --no_raw_files and got the zip, I tried to upload it to Kuiper. However, nothing happened after uploading, and when I clicked process it popped up an error. May I know how to upload the "parsed" result to Kuiper?

VM Microphone Access

I see that the VM provided accesses my microphone on startup... Is there a reason for this?

Timeline Export - Missing Elements

Describe the bug
When exporting a timeline, some entries are populated with empty fields.
image

To Reproduce
Steps to reproduce the behavior:
I am not sure exactly how to reproduce this error.

In the above screenshot - we see event ID 22 shows up fine, but other event log entries do not.

The entries with empty data are always at the bottom of the excel document.

The sections that are empty in the export DO have values populated in the timeline view.

Example of missing Event IDs - but have them populated in the timeline view:
image

Example of missing UAL entries - but with data in the timeline view:
image

Expected behavior
One would assume at least the event IDs would be populated on events, even if a description can't be derived.

Screenshots
Above

Running Kuiper

Dear team,

I'm facing issues when running the tool with "./kuiper_install.sh -run"; the following is displayed:

"Usage: celery worker [OPTIONS]

Try 'celery worker --help' for help.

Error: no such option: -A"

Kindly assist

Parsing issues with both Kape and Hoarder.

I have been testing the platform today with the .ova and it works great; no problem getting it up and running. The problem I'm having right now is what seems to be some kind of import error when trying to drag and drop the KAPE zip, for example.

I tried the "KapeTriage" feature and exported it to a zip. When trying to import it directly I just get:

Error extract the zip content: 'utf8' codec can't decode byte 0x99 in position 81: invalid start byte

Any tips for how I can keep troubleshooting this?

Edit Timeline Template?

Hello,

Have a question with the Timeline/tag export sheet.

What's the best way to customize it myself?

I found timeline.xlsx in /kuiper/app/utils/build_timeline and edited it in LibreOffice (when I edit it in Excel and send it to Kuiper, Kuiper throws an error saying the ZIP was not found). I added a few columns to the template in timeline.xlsx. Then when I go to localhost:5000/case/caseID/timeline and download a new timeline, it does not use my template at all.

Any ideas?

Thanks

nginx fail to start after successful installation

Hi, I'm trying to install Kuiper in a brand new Ubuntu 18.04 VM, and after correctly installing all the requirements I receive the following nginx error:

Job for nginx.service failed because the control process exited with error code.
See "systemctl status nginx.service" and "journalctl -xe" for details.

  • ERROR: Failed reloading Nginx config (Error Code: 1).

Checking further with the command "journalctl -xe" it seems to be a certificate error:

Oct 18 11:01:08 kuiper nginx[11196]: nginx: [emerg] BIO_new_file("/home/kuiper/kuiper/cert/MyCertificate.crt") failed (SSL: error:02001002:system library:fopen:No such file or directory:fop
Oct 18 11:01:08 kuiper nginx[11196]: nginx: configuration file /etc/nginx/nginx.conf test failed
Oct 18 11:01:08 kuiper systemd[1]: nginx.service: Control process exited, code=exited status=1
Oct 18 11:01:08 kuiper systemd[1]: nginx.service: Failed with result 'exit-code'.
Oct 18 11:01:08 kuiper systemd[1]: Failed to start A high performance web server and a reverse proxy server.

The indicated certificate path /home/kuiper/kuiper/cert/MyCertificate.crt isn't present, so I tried to modify this path in the nginx conf file "kuiper-nginx.conf", without success:

server {
    listen 443 ssl;
    ssl_certificate /home/kuiper/kuiper/cert/MyCertificate.crt;
    ssl_certificate_key /home/kuiper/kuiper/cert/MyKey.key;

Can you please help me to solve this issue?

Error: "docker-compose.yaml" is unsupported.

Hi,

First of all, thank you for sharing your project.

I was trying to install this for the first time, according to:
https://github.com/DFIRKuiper/Kuiper#Installation

However, I'm facing the following issues:

root@kuiper-dfir:/opt/Kuiper# docker-compose pull
ERROR: Version in "./docker-compose.yaml" is unsupported. You might be seeing this error because you're using the wrong Compose file version. Either specify a supported version (e.g "2.2" or "3.3") and place your service definitions under the services key, or omit the version key and place your service definitions at the root of the file to use version 1.
For more on the Compose file format versions, see https://docs.docker.com/compose/compose-file/

root@kuiper-dfir:/opt/Kuiper# docker-compose up -d
ERROR: Version in "./docker-compose.yaml" is unsupported. You might be seeing this error because you're using the wrong Compose file version. Either specify a supported version (e.g "2.2" or "3.3") and place your service definitions under the services key, or omit the version key and place your service definitions at the root of the file to use version 1.
For more on the Compose file format versions, see https://docs.docker.com/compose/compose-file/

I don't have enough experience with Docker. Could you help me?

Ubuntu version:

DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=18.04
DISTRIB_CODENAME=bionic
DISTRIB_DESCRIPTION="Ubuntu 18.04.6 LTS"
NAME="Ubuntu"
VERSION="18.04.6 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.6 LTS"
VERSION_ID="18.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=bionic
UBUNTU_CODENAME=bionic

Docker:

Docker version 20.10.12, build e91ed57
docker-compose version 1.17.1, build unknown

Thanks in advance.

Regards.

Ability to Edit Machine info

Hello,

Would it be possible to add the ability to edit Machine Info?

Or maybe I just can't find it. After I upload a triage package, I'd like to rename the machine name and IP address within Kuiper.

Thanks

ImportError: No module named amcache.amcache_interface

Hello,

I am uploading a Hoarder zip file with an Amcache hive, but if I select to parse all Amcache artifacts, Kuiper does not parse any other hives and shows this error in Kuiper-celery.log:

Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 385, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/flask_celery.py", line 229, in __call__
    return task_base.__call__(self, *_args, **_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 648, in __protected_call__
    return self.run(*args, **kwargs)
  File "/home/kuiper/Kuiper/app/controllers/parser_management.py", line 296, in run_parserss
    start_parsing_files(main_case_id , machine_case_id , parser_files_mapping[parser]['parser_details'] , parser_files_mapping[parser]['files_to_be_parsed'])
  File "/home/kuiper/Kuiper/app/controllers/parser_management.py", line 215, in start_parsing_files
    parser_module = importlib.import_module('app.parsers.' + parser_folder + '.' + parser_file )
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)

I think there are two Amcache parsers in Kuiper:

image

If you disable the Amcache parser highlighted in red, it works.

Thank you so much.

Failed To parse Artifacts

Salam team,
Thanks for the tool and your efforts.
I did a clean install of Kuiper and all the services booted without issues. I accessed the web interface and uploaded multiple artifacts, but all of them failed to parse; I also tried uploading only an MFT file and it failed as well. I tried to troubleshoot from the logs, but no log shows the parsing error.

Creating kuiper_redis ... done
Creating kuiper_mongodb ... done
Creating kuiper_es01 ... done
Creating kuiper_celery ... done
Creating kuiper_flask ... done
Creating kuiper_nginx ... done

image

image

Kindly let me know where to troubleshoot the parsing errors.

OS :
Ubuntu 22.04.1 LTS
Linux 5.15.0-52-generic #58-Ubuntu SMP Thu Oct 13 08:03:55 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux

FireEye HX .mans triage collections parser

Describe the parser
mans_to_es is an open-source tool for parsing FireEye HX .mans triage collections and sending them to Elasticsearch.
A .mans file is a zipped collection of XML files that the tool parses using xmltodict.
It uses pandas and multiprocessing to speed up the parsing of the XML files.

Details of parser artifact file
The mans_to_es Python script reads the manifest.json file, which defines the linkage of the different XML files in the .mans zipped folder.
It only fully works on FireEye HX .mans files; if the analyst used the FireEye Redline tool to collect artifacts, the script works only partially because the artifact names in manifest.json differ (e.g., HX .mans has "generator": "process-handle" whereas the Redline manifest.json has "generator": "w32process-handle").

sample
https://github.com/casimkhan/samples
HX sample: 1yTQNggTPz3c3nGcaHrFco.mans (triage taken from macOS)
Redline sample from Windows: Redline-Sample.mans

already exists parser
https://github.com/LDO-CERT/mans_to_es/tree/master/mans_to_es

Additional context
Ideally, when users submit triage .zip files from KAPE or Hoarder, you could add .mans as another option; once it is in Kuiper the user can click parse, and the mans parser can parse it and add it to the timeline.

Processing the machine image stuck on 99%

I've installed the latest version of Kuiper, and when I upload a .zip output from Hoarder, the machine processing (parsers) gets stuck at 99%.
image

The screenshot is attached.
Are there any logs or anything else that can help me with this?

I need help understanding what the cause is.

Docker Windows Install

Greetings,

I am wondering if there is any guidance on using Kuiper on a docker server hosted on WSL2 (windows).

Everything boots fine except the NFS share container, which shows errors like "exportnfs - nfs not supported".

Is there any direction that you might be able to point me to for modifying the container to work on a WSL2 environment?

Error uploading Kape ZIP file

I'm trying to upload Kape ZIP file, and I'm getting an error message:
Error extract the zip content: compression type 9 (deflate64)

I've installed the latest Kuiper version.

Timeline not working

Hi,

Kuiper is working fine for us except for the timeline. It is always empty, no matter how many events have been selected in the Artifacts tab.

Where could we look to see what might be happening?

Congratulations on Kuiper; it looks like a great DFIR tool.

Best regards

YARP issue

Hello,

I've run into a few registry hives where Kuiper will list a yarp issue; here is an example for an ntuser.dat:
Screen Shot 2022-10-16 at 11 30 01 PM

I manually ran yarp on the same ntuser.dat and got this:

Screen Shot 2022-10-16 at 11 31 18 PM

I ran regipy on it and it printed with no issues: https://github.com/mkorman90/regipy

Screen Shot 2022-10-16 at 11 36 19 PM

Not sure if a yarp setting needs to be changed, or whether regipy should be used instead.

MFT parser adjustment

Describe the bug
MFT_parser, as it is now, creates only one entry per record in Kuiper but holds all 6-8 timestamps in that one entry.

Expected behavior
MFT_parser should produce an individual entry for EACH timestamp it detects. This is better for timeline analysis, and for seeing in the timeline when files were created, accessed, etc.

Screenshots
(screenshot attached)

If we look at these two time columns, the time for the "time stamp" entry might be based on Data.FNCreated, but Data.SILastAccessed shows a different timestamp, 2022-09-14.

There should be a separate entry for each of these timestamps.

So when I search for the file "soo.ps1", there should be 7-8 entries for it, with each entry being a different timestamp attribute.

The details could then include a field indicating which attribute it is: FN, SI, created, etc.
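The fan-out described above could be sketched as follows. The field names mirror the $STANDARD_INFORMATION (SI) and $FILE_NAME (FN) timestamp attributes, but the record layout here is a hypothetical illustration, not Kuiper's actual parser output:

```python
# Sketch: expand one parsed MFT record into one timeline entry per
# populated timestamp attribute, so each timestamp is searchable and
# visible individually. Field names are illustrative assumptions.
TIMESTAMP_FIELDS = [
    "SICreated", "SIModified", "SIAccessed", "SIChanged",
    "FNCreated", "FNModified", "FNAccessed", "FNChanged",
]

def expand_record(record):
    """Yield one entry per populated timestamp field in the record."""
    for field in TIMESTAMP_FIELDS:
        ts = record.get(field)
        if ts:
            yield {
                "@timestamp": ts,
                "timestamp_attribute": field,  # which attribute this entry represents
                "file_name": record.get("file_name"),
            }
```

A record carrying eight timestamps would then yield eight entries, each sortable on its own position in the timeline.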

Add horizontal scrolling to alerts overview table

Hi!

please find the detailed description of the feature request below.

Is your feature request related to a problem? Please describe.
When adding rules with longer names, descriptions, and queries, the alert overview table expands the wrapper div element with the id alerts_table_wrapper. As a result, the number of found alerts, which is placed at the end of the table, is no longer visible, and there is no scroll bar to bring it into view. The only way to see the number of found alerts is to widen the browser window.

Describe the solution you'd like
It would be preferable to add the CSS property overflow: auto to the wrapper div with the ID alerts_table_wrapper in order to make horizontal scrolling possible.

Describe alternatives you've considered
The only alternative is to manually widen the browser window so that the table fits into the wrapper div element.

Additional context
No additional context.

Thank you very much for the effort that you put into Kuiper!

Connection refused

root@debian:~/saad/Kuiper$ sudo ./kuiper_install.sh -run
Running Kuiper!
Kuiper can be accessed at http://[configuredIP]:5000
nohup: redirecting stderr to stdout
root@debian:~/saad/Kuiper$ Traceback (most recent call last):
  File "/usr/local/bin/celery", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python2.7/dist-packages/celery/__main__.py", line 16, in main
    _main()
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 322, in main
    cmd.execute_from_commandline(argv)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 496, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 288, in execute_from_commandline
    argv = self.setup_app_from_commandline(argv)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 502, in setup_app_from_commandline
    self.app = self.find_app(app)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 524, in find_app
    return find_app(app, symbol_by_name=self.symbol_by_name)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/utils.py", line 368, in find_app
    sym = symbol_by_name(app, imp=imp)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 527, in symbol_by_name
    return imports.symbol_by_name(name, imp=imp)
  File "/usr/local/lib/python2.7/dist-packages/kombu/utils/imports.py", line 57, in symbol_by_name
    module = imp(module_name, package=package, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/utils/imports.py", line 111, in import_from_cwd
    return imp(module, package=package)
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/home/root/saad/Kuiper/app/__init__.py", line 34, in <module>
    from controllers import case_management,admin_management
  File "/home/root/saad/Kuiper/app/controllers/case_management.py", line 18, in <module>
    import parser_management
  File "/home/root/saad/Kuiper/app/controllers/parser_management.py", line 18, in <module>
    from app.database.dbstuff import *
  File "/home/root/saad/Kuiper/app/database/dbstuff.py", line 27, in <module>
    MClient = MongoClient(DB_IP + ":" + str(DB_PORT) )
  File "/usr/local/lib/python2.7/dist-packages/pymongo/mongo_client.py", line 377, in __init__
    raise ConnectionFailure(str(e))
pymongo.errors.ConnectionFailure: [Errno 111] Connection refused

SIGMA RULES

is there any way to upload sigma rules to KUIPER?

Thank you

Search broken on large shard/index

Hello,

I have a large case in Kuiper with 74 hosts and have noticed that as it grew, I became unable to conduct any type of filtering in Kuiper.

The shard health for this case is yellow, and its index is 49.2 GB.

I can view the data with no filter just fine:

(screenshot attached)

But when I apply a simple filter, I get this:

(screenshot attached)

I installed an Elasticsearch health-checker plugin, and I get this error when trying to search via the plugin:

(screenshot attached)

Here are some logs from the Kuiper.log file:

"2022-11-01 05:56:26.730553","[DEBUG]","case_management.py.case_browse_artifacts_ajax[Lin.989]","case","Case[index2]: Query artifacts","{"sort": {"Data.@timestamp": {"order": "asc"}}, "query": {"query_string": {"query": "!(data_type:"tag") AND (admin)", "default_field": "catch_all"}}, "from": 0, "size": 30}"

"2022-11-01 05:56:26.731607","[DEBUG]","elkdb.py.query[Lin.211]","elasticsearch","Query to index [index2]","{"sort": {"Data.@timestamp": {"order": "asc"}}, "query": {"query_string": {"query": "!(data_type:"tag") AND (admin)", "default_field": "catch_all"}}, "track_total_hits": true, "from": 0, "size": 30}"

"2022-11-01 05:56:26.732678","[DEBUG]","case_management.py.browse_artifacts_list_ajax[Lin.891]","case","Case[index2]: Query artifacts list","{"query": {"query_string": {"query": "!(data_type:"tag") AND (admin)", "default_field": "catch_all"}}, "aggs": {"data_type": {"terms": {"field": "data_type.keyword", "order": {"_key": "asc"}, "size": 500}}}, "size": 0}"

"2022-11-01 05:56:26.733403","[DEBUG]","elkdb.py.query[Lin.211]","elasticsearch","Query to index [index2]","{"query": {"query_string": {"query": "!(data_type:"tag") AND (admin)", "default_field": "catch_all"}}, "track_total_hits": true, "aggs": {"data_type": {"terms": {"field": "data_type.keyword", "order": {"_key": "asc"}, "size": 500}}}, "size": 0}"

I've tried searching around for help with this issue. Could it be because Kuiper puts all the data into a single shard/index?

export file

Hi, the data is not output in the export file.

(screenshots attached)

Permissions problems with the logs directory

Describe the bug
After installation, on the first run of Kuiper, it raises a Python error trace regarding a lack of write permissions on the logs directory.

To Reproduce
Steps to reproduce the behavior:
  1. Install Kuiper from scratch on a brand-new machine.
  2. Run Kuiper.
  3. Observe the Python error trace.

Expected behavior
Don't get that error ;)

Fortunately, it's easy to solve: I changed the ownership of the logs directory and everything went fine.

BTW, timeline is now working fine for me :)
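A pre-flight check along these lines could avoid the traceback entirely; the `logs` path is an assumption based on the report, not a path confirmed from Kuiper's source:

```python
# Sketch: verify the logs directory exists and is writable before the app
# starts, and fail with an actionable message instead of a raw traceback.
# The default "logs" path is an assumption.
import os

def ensure_writable(path="logs"):
    """Create the directory if needed and confirm it is writable."""
    os.makedirs(path, exist_ok=True)
    if not os.access(path, os.W_OK):
        raise PermissionError(
            "cannot write to %r; fix with e.g. `sudo chown -R $USER %s`"
            % (path, path))
    return path
```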

Error extract the zip content

Error extract the zip content: 'utf8' codec can't decode byte 0xa4 in position 72: invalid start byte

Windows Server 2012
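The `0xa4` byte suggests a ZIP member name written in a legacy code page rather than UTF-8; the ZIP format's historical default encoding is CP437, where 0xa4 decodes to 'ñ'. A sketch of a fallback decoder — the fallback order is an assumption about the archiving tool that produced the file:

```python
# Sketch: decode a ZIP member name that is not valid UTF-8 by falling back
# to CP437 (the ZIP format's historical default) and then Latin-1, which
# accepts any byte. The fallback order is an assumption.
def decode_member_name(raw):
    """Decode a filename from ZIP metadata, trying UTF-8 first."""
    for enc in ("utf-8", "cp437", "latin-1"):
        try:
            return raw.decode(enc)
        except UnicodeDecodeError:
            continue
    return raw.decode("utf-8", errors="replace")
```

Applying something like this where the extractor reads member names would turn the hard failure into a best-effort decode.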

The letter 'Z' should be available as part of a case name.

Describe the bug
When creating a case with a 'z' in the name, the 'z' is removed.

To Reproduce
Steps to reproduce the behavior:

  1. Go to "Cases" on the Kuiper web UI.
  2. Click on the '+' sign.
  3. Create a case with a name containing the letter 'z', for example 'Zoo Zeeland'.
  4. Look at the name of the newly created case: 'ooeeland'.

Expected behavior
The letter 'z' should be available as part of a case name.

(screenshots attached)
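The dropped-'z' symptom looks like a character-whitelist range typo (e.g. `[a-y]` instead of `[a-z]`). Kuiper's actual sanitizer is not shown here, so this is only an illustrative sketch of a whitelist that keeps the full alphabet:

```python
# Sketch: sanitize a case name with a whitelist covering the whole
# alphabet. Illustrative only; the allowed character set is an assumption
# and Kuiper may apply further rules (e.g. lowercasing for index names).
import re

def sanitize_case_name(name):
    """Keep letters, digits, space, dash, and underscore; drop the rest."""
    return re.sub(r"[^A-Za-z0-9 _-]", "", name)
```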

Minor Installation Issue Nginx

Describe the bug
During the installation after running
sudo ./kuiper_install.sh -install

The installation seemed to hang while installing Nginx. This is because the package install needed confirmation from the user, but the prompt wasn't printed to the terminal; I only saw it in the installation log.

docker-compose issue to start all images

I installed Kuiper on a new machine with these specifications:
OS: 64-bit Ubuntu 18.04.6 LTS
RAM: 8GB
Cores: 4
Disk: 50GB

Docker Engine 20.10.12
docker-compose 1.29.2

Sadly, I have tried many times and always get errors; the latest is that not all the images started correctly:

(screenshot attached)

Any ideas how to make it work?

Option to generate new timeline instead of using version or checking for older ones

Hello,

I looked at some of the code for building the master timeline. It seems that when I hit Download Timeline and it triggers 'timeline_build_ajax', the timeline is built once. Then, unless I tag something new, hitting Download returns the previously generated timeline.

I ran into this issue because I was editing Timeline Views and adding new ones. When I wanted to test my new Timeline Views, I realized Kuiper was not picking them up because a new timeline wasn't being generated.

Could there be an option to have Kuiper generate a new timeline sheet each time, instead of checking for existing timelines?

tl;dr: how do I generate a brand-new timeline each time I export tagged items on the Timeline page?

Feature Request: Automatic Processing After Upload

Describe the solution you'd like
A setting to allow a default list of processors to launch immediately after a successful evidence upload.

Describe alternatives you've considered
Alternatively, if the API supported running a processing job, that would work too. Right now it seems only uploads are supported via the API.

Export Results Bug

Unfortunately I can not provide screenshots.

I am attempting to export results to a CSV. Everything appears to be exported, but the CSV contains only the date/timestamp columns; no other fields from the events are exported.
