dfirkuiper / kuiper
Digital Forensics Investigation Platform
root@debian:~/saad/Kuiper$ sudo ./kuiper_install.sh -run
Running Kuiper! Kuiper can be accessed at http://[configuredIP]:5000
nohup: redirecting stderr to stdout
root@debian:~/saad/Kuiper$ Traceback (most recent call last):
  File "/usr/local/bin/celery", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python2.7/dist-packages/celery/__main__.py", line 16, in main
    _main()
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 322, in main
    cmd.execute_from_commandline(argv)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 496, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 288, in execute_from_commandline
    argv = self.setup_app_from_commandline(argv)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 502, in setup_app_from_commandline
    self.app = self.find_app(app)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 524, in find_app
    return find_app(app, symbol_by_name=self.symbol_by_name)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/utils.py", line 368, in find_app
    sym = symbol_by_name(app, imp=imp)
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 527, in symbol_by_name
    return imports.symbol_by_name(name, imp=imp)
  File "/usr/local/lib/python2.7/dist-packages/kombu/utils/imports.py", line 57, in symbol_by_name
    module = imp(module_name, package=package, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/utils/imports.py", line 111, in import_from_cwd
    return imp(module, package=package)
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/home/root/saad/Kuiper/app/__init__.py", line 34, in <module>
    from controllers import case_management,admin_management
  File "/home/root/saad/Kuiper/app/controllers/case_management.py", line 18, in <module>
    import parser_management
  File "/home/root/saad/Kuiper/app/controllers/parser_management.py", line 18, in <module>
    from app.database.dbstuff import *
  File "/home/root/saad/Kuiper/app/database/dbstuff.py", line 27, in <module>
    MClient = MongoClient(DB_IP + ":" + str(DB_PORT) )
  File "/usr/local/lib/python2.7/dist-packages/pymongo/mongo_client.py", line 377, in __init__
    raise ConnectionFailure(str(e))
pymongo.errors.ConnectionFailure: [Errno 111] Connection refused
Hi,
Kuiper is working fine for us except for the timeline. It is always empty, no matter how many events have been selected in the Artifacts tab.
Where could we see what can be happening?
Congratulations on Kuiper. It looks like a great DFIR tool.
Best regards
Fresh OS installation: Ubuntu 18 with the latest updates.
I followed exactly the same instructions and got multiple errors. I would love to use this platform; it would be helpful if you posted a video showing how to install it. Maybe something changed in the latest Ubuntu update.
thank you,
Describe the bug
After installation, on the first run of Kuiper, a Python traceback is raised about missing write permissions on the logs directory.
To Reproduce
Steps to reproduce the behavior:
1. Install Kuiper from scratch on a brand-new machine
2. Run Kuiper
3. A Python traceback is raised
Expected behavior
Don't get that error ;)
Fortunately it's easy to solve: I changed the ownership of the logs directory and everything went fine.
BTW, timeline is now working fine for me :)
Is there any way to upload Sigma rules to Kuiper?
Thank you
Dears,
I'm facing issues when running the tool with "./kuiper_install.sh -run"; the following is displayed:
"Usage: celery worker [OPTIONS]
Try 'celery worker --help' for help.
Error: no such option: -A"
Kindly assist
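This error is characteristic of Celery 5.x, which moved global options such as -A in front of the subcommand, while Kuiper's scripts use the Celery 4 form `celery worker -A app`. One workaround is pinning an older Celery (Kuiper runs on Python 2, so something like `pip2 install "celery<5"`). The version difference can be sketched like this (the app name here is hypothetical):

```python
def celery_worker_argv(app_name, celery_major):
    """Build the worker command line for the installed Celery major version.

    Celery 4: celery worker -A <app>
    Celery 5: celery -A <app> worker  (global options precede the subcommand)
    """
    if celery_major >= 5:
        return ["celery", "-A", app_name, "worker"]
    return ["celery", "worker", "-A", app_name]
```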
Hi, I'm trying to install Kuiper in a brand new Ubuntu 18.04 VM, and after correctly installing all the requirements I receive the following nginx error:
Job for nginx.service failed because the control process exited with error code.
See "systemctl status nginx.service" and "journalctl -xe" for details.
Checking further with the command "journalctl -xe", it seems to be a certificate error:
Oct 18 11:01:08 kuiper nginx[11196]: nginx: [emerg] BIO_new_file("/home/kuiper/kuiper/cert/MyCertificate.crt") failed (SSL: error:02001002:system library:fopen:No such file or directory:fop
Oct 18 11:01:08 kuiper nginx[11196]: nginx: configuration file /etc/nginx/nginx.conf test failed
Oct 18 11:01:08 kuiper systemd[1]: nginx.service: Control process exited, code=exited status=1
Oct 18 11:01:08 kuiper systemd[1]: nginx.service: Failed with result 'exit-code'.
Oct 18 11:01:08 kuiper systemd[1]: Failed to start A high performance web server and a reverse proxy server.
The indicated certificate path /home/kuiper/kuiper/cert/MyCertificate.crt is not present, so I tried to modify this path in the nginx configuration file "kuiper-nginx.conf", without success:
server {
listen 443 ssl;
ssl_certificate /home/kuiper/kuiper/cert/MyCertificate.crt;
ssl_certificate_key /home/kuiper/kuiper/cert/MyKey.key;
Can you please help me to solve this issue?
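One way to unblock nginx is to create a self-signed pair at exactly the paths the config references. This sketch shells out to the openssl CLI (assumed to be installed); the helper and the common name are illustrative, not part of Kuiper:

```python
import os
import subprocess

def make_self_signed_cert(cert_path, key_path, common_name="kuiper.local", days=365):
    """Generate a self-signed certificate/key pair with the openssl CLI."""
    for path in (cert_path, key_path):
        parent = os.path.dirname(path)
        if parent:
            os.makedirs(parent, exist_ok=True)
    subprocess.check_call([
        "openssl", "req", "-x509", "-newkey", "rsa:2048", "-nodes",
        "-keyout", key_path, "-out", cert_path,
        "-days", str(days), "-subj", "/CN=" + common_name,
    ])

# e.g. make_self_signed_cert("/home/kuiper/kuiper/cert/MyCertificate.crt",
#                            "/home/kuiper/kuiper/cert/MyKey.key")
```

After generating the files, `nginx -t` should pass and the service should start.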
Hello,
I am uploading a Hoarder zip file containing an Amcache hive, but if you select to parse all Amcache artifacts in Kuiper, Kuiper does not parse any other hives and shows this error in Kuiper-celery.log:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 385, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/flask_celery.py", line 229, in __call__
    return task_base.__call__(self, *_args, **_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 648, in __protected_call__
    return self.run(*args, **kwargs)
  File "/home/kuiper/Kuiper/app/controllers/parser_management.py", line 296, in run_parserss
    start_parsing_files(main_case_id , machine_case_id , parser_files_mapping[parser]['parser_details'] , parser_files_mapping[parser]['files_to_be_parsed'])
  File "/home/kuiper/Kuiper/app/controllers/parser_management.py", line 215, in start_parsing_files
    parser_module = importlib.import_module('app.parsers.' + parser_folder + '.' + parser_file )
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
I think there are two Amcache parsers in Kuiper. If you disable the Amcache parser shown in red, it works.
Thank you so much.
Greetings,
I am wondering if there is any guidance on using Kuiper on a docker server hosted on WSL2 (windows).
Everything boots fine except the nfs share container. Errors like "exportnfs - nfs not supported" on that container.
Is there any direction that you might be able to point me to for modifying the container to work on a WSL2 environment?
Unfortunately I can not provide screenshots.
I am attempting to export results to a CSV. Everything appears to export, but the CSV only contains the date/timestamp columns; no other fields from the events are exported.
Hello!
My Kuiper VM has 90 GB of RAM assigned to it; I wanted to know the best way to add a second ES node.
"If you want to use RAM more than 64GB to increase Elasticsearch performence, it is recommended to use multiple nodes for Elasticsearch cluster instead in different machines"
I would have to edit the docker-compose.yaml, right?
Would you be able to advise on the best way to add one? I followed a guide for adding an es02 node in the yaml file, and I don't think I got it to work quite right.
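For what it's worth, a second node is usually added as a sibling service in docker-compose.yaml. A rough sketch with assumed names — the image tag, cluster name, and volume must mirror whatever the existing es01 service uses (see Elastic's multi-node compose examples):

```yaml
  es02:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.16.2  # match es01's tag
    environment:
      - node.name=es02
      - cluster.name=kuiper-cluster              # must equal es01's cluster.name
      - discovery.seed_hosts=es01
      - cluster.initial_master_nodes=es01,es02
      - "ES_JAVA_OPTS=-Xms16g -Xmx16g"           # keep each heap well under ~31 GB
    volumes:
      - es02_data:/usr/share/elasticsearch/data
```

The es01 service also needs `discovery.seed_hosts=es02` and the same two-node `cluster.initial_master_nodes` list, and the new data volume has to be declared at the bottom of the file.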
Hello,
I have a question about the Timeline/tag export sheet.
What's the best way to customize it myself?
I found timeline.xlsx in /kuiper/app/utils/build_timeline and edited it in LibreOffice (when I edit it in Excel and send it to Kuiper, Kuiper throws an error saying the ZIP was not found). I added a few columns to the template in timeline.xlsx. Then when I go to localhost:5000/case/caseID/timeline and download a new timeline, it does not use my template at all.
Any ideas?
Thanks
I see that many parsers in Kuiper are built in, such as yum_sources.
I just don't understand how to create a kjson file like yum_sources.kjson. What format should it be?
I tried writing this content to yum_sources.kjson:
{"baseurl":"www.xxxx.com/baseurl/xxxx"}
When I upload it to Kuiper and parse it, the kjson parser raises an error and doesn't tell me why. The Artifacts view also shows nothing.
So could you please write a wiki page explaining how to create a *.kjson file, and make sure it can be parsed by Kuiper and produces a result in Artifacts? I want to write some parsers for artifacts from Linux machines, but your wiki is not easy to understand.
I'm trying to install Kuiper version 2.2.0. I didn't get any errors in the Docker installation; all items installed successfully. But when I run the "docker ps -a" command, I see that "kuiper_flask" and "kuiper_celery" are not running. I ran "docker-compose up -d" again, but it didn't fix it.
I also get the error "502 Bad Gateway nginx/1.21.4". I think this is because of "flask" and "celery".
I tried opening Kuiper with Chrome(Version 96.0.4664.45 (Official Build) (64-bit)), Edge (Version 96.0.1054.43 (Official build) (64-bit)) and Firefox(Version 95.0 (64-bit)) .
The operating system I am trying to install Kuiper on is Ubuntu 20.10.
In addition, I cannot download from "https://mega.nz/folder/i9UjyaQQ#DK97l_CzzhCa-y_E1dPULQ" because the download quota is full. Can you offer a different download option?
I look forward to your reply as soon as possible. I wish you good work.
Describe the bug
During the installation after running
sudo ./kuiper_install.sh -install
The installation seemed to hang when installing Nginx. This is because it needed confirmation from the user to install the package, but the prompt wasn't printed to the terminal; I saw it in the installation log.
Hello,
I have a large case in Kuiper with 74 hosts and have noticed that as it got bigger, I became unable to conduct any type of filtering in Kuiper.
My shard health for this case is yellow, with its index being 49.2 GB.
I can view the data with no filter just fine:
But when I apply a simple filter i get this:
I installed an Elasticsearch health-checker plugin, and I get this error when trying to search via the plugin:
Here are some logs from the Kuiper.log file:
"2022-11-01 05:56:26.730553","[DEBUG]","case_management.py.case_browse_artifacts_ajax[Lin.989]","case","Case[index2]: Query artifacts","{"sort": {"Data.@timestamp": {"order": "asc"}}, "query": {"query_string": {"query": "!(data_type:"tag") AND (admin)", "default_field": "catch_all"}}, "from": 0, "size": 30}"
"2022-11-01 05:56:26.731607","[DEBUG]","elkdb.py.query[Lin.211]","elasticsearch","Query to index [index2]","{"sort": {"Data.@timestamp": {"order": "asc"}}, "query": {"query_string": {"query": "!(data_type:"tag") AND (admin)", "default_field": "catch_all"}}, "track_total_hits": true, "from": 0, "size": 30}"
"2022-11-01 05:56:26.732678","[DEBUG]","case_management.py.browse_artifacts_list_ajax[Lin.891]","case","Case[index2]: Query artifacts list","{"query": {"query_string": {"query": "!(data_type:"tag") AND (admin)", "default_field": "catch_all"}}, "aggs": {"data_type": {"terms": {"field": "data_type.keyword", "order": {"_key": "asc"}, "size": 500}}}, "size": 0}"
"2022-11-01 05:56:26.733403","[DEBUG]","elkdb.py.query[Lin.211]","elasticsearch","Query to index [index2]","{"query": {"query_string": {"query": "!(data_type:"tag") AND (admin)", "default_field": "catch_all"}}, "track_total_hits": true, "aggs": {"data_type": {"terms": {"field": "data_type.keyword", "order": {"_key": "asc"}, "size": 500}}}, "size": 0}"
I tried searching around for help with this issue; is this because Kuiper is set to put all the data in one shard/index, maybe?
Describe the bug
gunicorn does not start because it doesn't have certificate files.
The install script does not create certificates for gunicorn.
Additional context
In my environment I created the certificates manually.
Error extract the zip content: 'utf8' codec can't decode byte 0xa4 in position 72: invalid start byte
Windows Server 2012
Describe the bug
When creating a case with a 'z' in the name, the 'z' is removed.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
The letter 'z' should be available as part of a case name.
Hi!
please find the detailed description of the feature request below.
Is your feature request related to a problem? Please describe.
When adding rules with longer names, descriptions, and queries, the alert overview table expands the wrapper div element with the id alerts_table_wrapper. Thus, the number of found alerts, which is placed at the end of the table, is no longer visible, and there is no scroll bar to make it visible. The only option to see the number of found alerts is to expand the browser window.
Describe the solution you'd like
It would be preferable to add the CSS option overflow:auto to the wrapper div with the ID alerts_table_wrapper in order to make horizontal scrolling possible.
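For reference, the requested rule would look something like this (the selector comes from the issue; the exact stylesheet to patch is up to the maintainers):

```css
#alerts_table_wrapper {
  overflow: auto; /* scroll an over-wide alerts table instead of growing the page */
}
```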
Describe alternatives you've considered
The only alternative option is to manually expand the browser window so that the table fits into the wrapper div element.
Additional context
No additional context.
Thank you very much for the effort that you put into Kuiper!
Dear, thanks for creating Kuiper.
Is there any way to create new users to access the portal, and to log their actions?
Thanks in advance
Hello,
I looked at some of the code for building the master timeline. It seems that when I hit download timeline, it activates 'timeline_build_ajax' and builds the timeline once. Then, unless I tag something new, hitting download again just downloads the previously generated timeline.
I ran into this because I was editing Timeline Views and adding new ones. When I wanted to test my new Timeline Views, I realized Kuiper was not picking them up because a new timeline wasn't being generated.
Could there be an option to have Kuiper generate a new timeline sheet each time, instead of checking for existing timelines?
tl;dr: how do I generate a brand new timeline each time I export tagged items on the Timeline page?
Hello!
I am very much a newbie.
I downloaded the Kuiper virtual machine.
I don't know how to connect to the VM in VMware from the main system now. Please tell me how.
Describe the parser
A parser that parses out all registry keys in all hives and displays values, names, etc.
As it stands now, I am missing SOFTWARE keys.
Kuiper seems to parse some registry keys based on the specific sources I see in regsk.
But there does not seem to be a parser that pulls out every single key from SAM, SYSTEM, SOFTWARE, etc.
Details of parser artifact file
I am unable to pass this reg hive over at the moment, but I can find another.
sample
provide a sample of the artifact file that has the data
already exists parser
maybe this will help
(https://github.com/williballenthin/python-registry)
Additional context
Add any other context or screenshots about parser.
Kuiper does not show the registry keys below; I have processed the same data with log2timeline, and it processes them as shown below.
Log2timeline also includes the contents; as you can see, there is a binary value in the \phone\ key (I have redacted the user).
I installed Kuiper on a new machine with this setting :
OS: 64-bit Ubuntu 18.04.6 LTS
RAM: 8GB
Cores: 4
Disk: 50GB
Docker Engine 20.10.12
docker-compose 1.29.2
Sadly, I tried many times and I always get errors; the last one is that the images didn't all start correctly:
Any ideas how to make it work?
Describe the bug
Bug when parsing Firefox & Chrome artifacts.
Browser_History Parser: list index out of range - Line No. 14
is not JSON serializable - pushed rec. 0
To Reproduce
Steps to reproduce the behavior:
For the Firefox artifact: a places.sqlite file where the column visit_type in the table moz_historyvisits is 9.
For the Chrome artifact: a History file with the column hash (from my testing, always).
Expected behavior
No error in parsing
Describe the solution you'd like
A setting to allow a default list of processors to launch immediately after a successful upload of evidence.
Describe alternatives you've considered
Alternatively, if the API supported running a processing job, this could work also. Right now it seems only uploads are supported via the API.
Hello,
The install guide is missing the commands to install docker-compose, as well as the recommended docker-compose version.
The most difficult part of the install process seems to be getting the right Docker version. If you install docker-compose through apt-get, it says it's an incompatible version for the yaml.
Please list the preferred docker-compose version(s).
Thanks!
Describe the bug
When exporting a timeline, some entries are populated with empty fields.
To Reproduce
Steps to reproduce the behavior:
I am not sure exactly how to reproduce this error.
In the above screenshot - we see event ID 22 shows up fine, but other event log entries do not.
The entries with empty data are always at the bottom of the excel document.
The sections that are empty in the export DO have values populated in the timeline view.
Example of missing Event IDs - but have them populated in the timeline view:
Example of missing UAL entries - but with data in the timeline view:
Expected behavior
One would assume at least the event IDs would be populated on events, even if a description can't be derived.
Screenshots
Above
Hello,
Would it be possible to add the ability to edit Machine Info?
Or maybe I just can't find it. After I upload a triage package, I'd like to rename the machine name and IP address within Kuiper.
Thanks
Describe the bug
The Autoruns parser does not show up in the Kuiper parsers list; I think it is related to a missing configuration file.
To Reproduce
Check the parser list in Kuiper
Expected behavior
The parser shows up in Kuiper and executes correctly.
ping @mayHamad
Thanks
I see that the VM provided accesses my microphone on startup... Is there a reason for this?
I'm trying to upload a Kape ZIP file, and I'm getting an error message:
Error extract the zip content: compression type 9 (deflate64)
I've installed the latest Kuiper version.
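Compression type 9 is deflate64, which the zipfile module used by Kuiper's extractor cannot decompress, and KAPE archives sometimes contain such members. A small pre-flight check (a sketch) lists the offending entries so the archive can be re-created with plain deflate (e.g. extract with 7-Zip and re-zip) before uploading:

```python
import zipfile

def deflate64_members(zip_path):
    """Return the names of archive members stored with deflate64 (type 9)."""
    with zipfile.ZipFile(zip_path) as zf:
        return [info.filename for info in zf.infolist() if info.compress_type == 9]
```

An empty list means the archive should extract cleanly; any listed member needs to be recompressed.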
Hello,
When selecting all artifacts to parse, Kuiper doesn't finish parsing due to the following error in Kuiper-Celery.log:
[ERROR/ForkPoolWorker-3] Task app.controllers.parser_management.run_parserss[98d328a8-a0f7-4225-a00c-03c4a51233ec] raised unexpected: TypeError('string indices must be integers, not str',)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 385, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/flask_celery.py", line 229, in __call__
    return task_base.__call__(self, *_args, **_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 648, in __protected_call__
    return self.run(*args, **kwargs)
  File "/home/kuiper/Kuiper/app/controllers/parser_management.py", line 296, in run_parserss
    start_parsing_files(main_case_id , machine_case_id , parser_files_mapping[parser]['parser_details'] , parser_files_mapping[parser]['files_to_be_parsed'])
  File "/home/kuiper/Kuiper/app/controllers/parser_management.py", line 247, in start_parsing_files
    p = db_es.bulk_queue_push(json_res[ i : (i+chunks_size) ] ,case_id, source = parser_details['name'] , machine = machine_id , data_type=parser_details['name'] , data_path = f)
  File "/home/kuiper/Kuiper/app/database/elkdb.py", line 175, in bulk_queue_push
    push_es = self.bulk_to_elasticsearch( bulk_queue , case_id )
  File "/home/kuiper/Kuiper/app/database/elkdb.py", line 201, in bulk_to_elasticsearch
    status = err["index"]["status"]
TypeError: string indices must be integers, not str
Thank you for your effort.
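The last frame shows elkdb.py indexing into each bulk error item with err["index"]["status"], which blows up when Elasticsearch hands back a plain string instead of a dict. A defensive sketch of that loop (illustrative, not Kuiper's actual code):

```python
def bulk_error_statuses(errors):
    """Pull HTTP statuses out of Elasticsearch bulk error items, tolerating
    items that are plain strings rather than dicts."""
    statuses = []
    for err in errors:
        if isinstance(err, dict):
            statuses.append(err.get("index", {}).get("status"))
        else:
            statuses.append(None)  # raw string error; worth logging instead
    return statuses
```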
Describe the parser
mans_to_es is an open-source tool for parsing FireEye HX .mans triage collections and sending them to Elasticsearch.
A .mans file is a zipped collection of XML files, which the tool parses using xmltodict.
It uses pandas and multiprocessing to speed up parsing of the XML files.
Details of parser artifact file
The mans_to_es Python script reads the manifest.json file, which defines the linkage of the different XML files inside the zipped .mans folder.
It works only on FireEye HX .mans files; if an analyst used the FireEye Redline tool to collect artifacts, the script works only partially, because the artifact names in manifest.json are different (e.g. an HX .mans has "generator": "process-handle", whereas a Redline manifest.json has "generator": "w32process-handle").
sample
https://github.com/casimkhan/samples
HX sample 1yTQNggTPz3c3nGcaHrFco.mans (triage taken from Mac Os)
Redline sample from windows (Redline-Sample.mans)
already exists parser
https://github.com/LDO-CERT/mans_to_es/tree/master/mans_to_es
Additional context
Ideally, alongside triage .zip files from KAPE or Hoarder, you could add .mans as another upload option; once it is in Kuiper, the user can click parse, and the mans parser can parse it and add it to the timeline.
Describe the bug
MFT_parser, as it is now, parses only one entry per file in Kuiper but holds all 6-8 timestamps in that one entry.
Expected behavior
MFT_parser should parse an individual entry for EACH timestamp it detects. This is better for timeline analysis and for seeing when files are created, accessed, etc. in the timeline.
So if we look at these two time columns here, the time for the "time stamp" entry might be based on Data.FNCreated, but if we look at Data.SILastAccessed, it's a different timestamp, at 2022-09-14.
There should be a different entry for each of these timestamps.
So when I search for this file "soo.ps1", there should be 7-8 entries for it, with each entry being a different timestamp attribute.
Then in the details maybe it has a variable for which attribute it is: FN, SI, created, etc.
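The request can be sketched as a post-processing step over one parsed MFT record. The field names are partly assumptions: FNCreated and SILastAccessed appear in the request, the remaining SI/FN fields follow the same naming pattern, and @timestamp matches the field Kuiper's queries sort on.

```python
# Hypothetical SI/FN timestamp fields; only FNCreated and SILastAccessed are
# confirmed above, the rest follow the same naming pattern.
TIMESTAMP_FIELDS = [
    "SICreated", "SILastModified", "SILastAccessed", "SIEntryModified",
    "FNCreated", "FNLastModified", "FNLastAccessed", "FNEntryModified",
]

def explode_mft_record(record):
    """Turn one MFT record into one event per populated timestamp field."""
    events = []
    for field in TIMESTAMP_FIELDS:
        ts = record.get(field)
        if ts:
            event = dict(record)
            event["@timestamp"] = ts
            event["timestamp_type"] = field  # e.g. FNCreated vs SILastAccessed
            events.append(event)
    return events
```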
Salam Team,
Thanks for the tool and your efforts.
I did a clean install of Kuiper and all the services booted without issues. I accessed the web interface and uploaded multiple artifacts, but all of them failed to parse; I also tried uploading only an MFT file, and that failed as well. I tried to troubleshoot from the logs, but no log shows the parsing error.
Creating kuiper_redis ... done
Creating kuiper_mongodb ... done
Creating kuiper_es01 ... done
Creating kuiper_celery ... done
Creating kuiper_flask ... done
Creating kuiper_nginx ... done
Kindly let me know where to troubleshoot the parsing errors.
OS :
Ubuntu 22.04.1 LTS
Linux 5.15.0-52-generic #58-Ubuntu SMP Thu Oct 13 08:03:55 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Hello!
Can I install it on Kali Linux?
Describe the bug
UserAssist timestamps are not being extracted from the artefact. When I click on the row, I can see the timestamp under a different field. Also, are timestamps in UTC?
To Reproduce
Upload the NTUSER.DAT file
Hello,
Would it be possible to add the timeline export view when browsing artifacts on machines?
I like the timeline export with the views, but I'm looking to get a workflow where I get that on an individual machine basis.
I want to be able to click the purple export button but have it export the way the timeline export does. As it exports now, it does not include much data.
The above is all it includes. It doesn't seem to use any of the Timeline Views.
Hi, I was trying kuiper on RISC-V and was able to do a native build without any issue. But the Makefile in these sources doesn't have install support.
I have done the following:
* git clone https://github.com/emqx/kuiper
* cd kuiper
* make build_with_edgex
Note: in the same build environment I am able to run the server. I copied the binary to /usr/bin, but there is still an issue on the board, such as missing conf files.
Can you help us copy all related files to their respective paths on the board?
Is any layer support available for Yocto?
Thanks,
Case Panel shows this error, after uploading a ZIP file from Kape: "Error extract the zip content: compression type 9 (deflate64)"
In this case, Kape was set up to collect artifacts and make a ZIP folder. Does Kuiper work with a vmdk container inside a ZIP file from Kape?
I have been testing out the platform today with the .ova and it works great; no problem getting it up and running. The problem I'm having right now is that there seems to be some kind of import error when trying to just drag and drop the Kape zip, for example.
I tried the "KapeTriage" feature and exported it to a zip. When trying to import it directly, I just get:
Error extract the zip content: 'utf8' codec can't decode byte 0x99 in position 81: invalid start byte
Any tips for how I can keep troubleshooting this?
After I used hoarder --no_raw_files and got the zip, I tried to upload it to Kuiper. However, nothing happened after uploading, and when I tried to click process, it popped up an error. May I know how to upload "parsed" results to Kuiper?
Hello,
I've run into a few registry hives where Kuiper will list a yarp issue; here is an example for ntuser.dat:
I manually ran yarp on the same ntuser.dat and got this:
I ran regipy on it and it printed with no issues: https://github.com/mkorman90/regipy
I am not sure whether a yarp setting needs to be changed, or whether regipy should be used instead.
Hi,
First of all, thank you for sharing your project.
I was trying to install this from first time, according to:
https://github.com/DFIRKuiper/Kuiper#Installation
However, I'm facing the following issues:
root@kuiper-dfir:/opt/Kuiper# docker-compose pull
ERROR: Version in "./docker-compose.yaml" is unsupported. You might be seeing this error because you're using the wrong Compose file version. Either specify a supported version (e.g. "2.2" or "3.3") and place your service definitions under the services key, or omit the version key and place your service definitions at the root of the file to use version 1.
For more on the Compose file format versions, see https://docs.docker.com/compose/compose-file/
root@kuiper-dfir:/opt/Kuiper# docker-compose up -d
ERROR: Version in "./docker-compose.yaml" is unsupported. You might be seeing this error because you're using the wrong Compose file version. Either specify a supported version (e.g. "2.2" or "3.3") and place your service definitions under the services key, or omit the version key and place your service definitions at the root of the file to use version 1.
For more on the Compose file format versions, see https://docs.docker.com/compose/compose-file/
I don't have enough experience with Docker. Could you help me?
Ubuntu version:
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=18.04
DISTRIB_CODENAME=bionic
DISTRIB_DESCRIPTION="Ubuntu 18.04.6 LTS"
NAME="Ubuntu"
VERSION="18.04.6 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.6 LTS"
VERSION_ID="18.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=bionic
UBUNTU_CODENAME=bionic
Docker:
Docker version 20.10.12, build e91ed57
docker-compose version 1.17.1, build unknown
Thanks in advance.
Regards.