cuckoo-modified's People

Contributors

ameily, botherder, brad-sp, dmaciejak, doomedraven, geudrik, gtback, hughpearse, ikiril01, init99, jamu, jbremer, jekil, jgajek, keithjjones, killerinstinct, lehmz, merx1030, mschloesser-r7, pashashocky, pdelsante, r3comp1le, rep, robertsjw, seankim777, shane-carr, spender-sandbox, thorsten-sick, wmetcalf, xayon

cuckoo-modified's Issues

AWS Support

Any interest in a PR that adds AWS support to the UI/datastore?

The UI has hooks to grab information from S3 buckets, and there is an output AWS plugin that moves stuff from the local FS to S3. I've run over 12k samples through it and it seems to be working fine. I didn't patch the API for retrieval since getting data from S3 is pretty trivial. The only downside to the patch is you lose search functionality. Open to suggestions/thoughts.
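
For reference, a minimal sketch of the kind of output module described above, assuming a reporting module that subclasses Cuckoo's Report base class, boto3 for the S3 client, and a hypothetical "bucket" option in its config section (the class name and option are illustrative, not the actual patch):

# Minimal sketch only: mirror an analysis directory into an S3 bucket.
import os

import boto3

from lib.cuckoo.common.abstracts import Report
from lib.cuckoo.common.exceptions import CuckooReportError

class S3Upload(Report):
    def run(self, results):
        bucket = self.options.get("bucket")
        if not bucket:
            raise CuckooReportError("No S3 bucket configured")
        s3 = boto3.client("s3")
        task_id = results["info"]["id"]
        # Walk the analysis directory and mirror it into the bucket under the task ID.
        for root, _, files in os.walk(self.analysis_path):
            for name in files:
                path = os.path.join(root, name)
                key = "%s/%s" % (task_id, os.path.relpath(path, self.analysis_path))
                s3.upload_file(path, bucket, key)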

Analysis crashed cuckoomon

Got a signature popping on a piece of malware with the message "Crashed cuckoomon during analysis. Report this issue to the github repo"

pid: 2388
message: Exception reported at offset 0xf241 in cuckoomon itself

On win10-x64 [Errno 13] Permission denied: 'C:\\3648.ini' The package "modules.packages.regsv

Hi,

I am using a cuckoo-modified build from the following source.

https://codeload.github.com/brad-accuvant/cuckoo-modified/zip/master

I am trying to submit a sample to a Windows 10 x64 VM in VMware and am getting the following error.

See the following debug logs:

2016-03-16 12:39:33,406 [lib.cuckoo.core.guest] DEBUG: win10: analysis not completed yet (status=2)
2016-03-16 12:39:33,771 [lib.cuckoo.core.resultserver] DEBUG: New connection from: 172.16.148.130:49504
2016-03-16 12:39:33,772 [lib.cuckoo.core.resultserver] DEBUG: LogHandler for live analysis.log initialized.
2016-03-16 12:39:34,411 [lib.cuckoo.core.guest] DEBUG: win10: analysis not completed yet (status=2)
2016-03-16 12:39:35,417 [lib.cuckoo.core.scheduler] ERROR: Analysis failed: The package "modules.packages.regsvr" start function encountered an unhandled exception: [Errno 13] Permission denied: 'C:\3648.ini'
2016-03-16 12:39:35,636 [lib.cuckoo.core.plugins] DEBUG: Stopped auxiliary module: Sniffer
2016-03-16 12:39:35,637 [lib.cuckoo.core.plugins] DEBUG: Stopped auxiliary module: Tor
2016-03-16 12:39:35,637 [modules.machinery.vmware] DEBUG: Stopping vm /media/ali/I/Imported/Windows10x64/Windows10x64.vmx

Please advise how to get rid of this error.

I also tried the steps mentioned at the following link:
http://answers.microsoft.com/en-us/windows/forum/windows_7-security/windows-7-cannot-save-files-to-c-even-after-making/938f2b50-b063-475b-8c5e-905d136df2e3?tab=question&status=AllReplies&auth=1#tabs

but the issue persists.

I can successfully submit and analyze samples in Windows 7 32-bit, but this issue occurs with Windows 10 x64 on VMware.

-thanks

VM detection possible by checking the difference between CPU timestamp counters (rdtsc) forcing VM exit

I just ran the latest pafish. When following the install guide for KVM, the AV/sandbox detection countermeasures work great (thanks Brad!).

Only one method of detection is successful:

CPU VM traced by checking the difference between CPU timestamp counters (rdtsc) forcing VM exit

The relevant C code is here

I know nothing about rdtsc. Is there a way to get around this, maybe by changing the KVM configuration?

Feature Request: Re-run signatures

When tuning signatures, it's kind of a pain to keep resubmitting the files. Would it be possible to re-queue the analysis for processing, thus re-running the signatures but not having to do all the dynamic analysis again?
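
For what it's worth, one way this could work is to re-run the stored results through the plugin runners without touching the VM; a rough sketch, assuming the 1.2-era runner classes in lib/cuckoo/core/plugins.py (constructor signatures may differ between versions):

# Rough sketch: re-run processing, signatures, and reporting for an existing
# task from its stored raw logs, skipping the dynamic analysis entirely.
from lib.cuckoo.core.plugins import RunProcessing, RunSignatures, RunReporting
from lib.cuckoo.core.startup import init_modules

def reprocess(task_id):
    init_modules()                                        # load processing/signature/reporting modules
    results = RunProcessing(task_id=task_id).run()        # rebuild results from stored logs
    RunSignatures(results=results).run()                  # re-evaluate signatures only
    RunReporting(task_id=task_id, results=results).run()  # regenerate the reports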

Type exceptions Memory Error

Hi,
I'm using Cuckoo Sandbox version 2.0. It works well, but when I try to analyze a large file (e.g. 85 MB) an error occurs: unable to upload malware to analysis machine (type exceptions MemoryError).
Although I changed upload_max_size and analysis_size_limit in cuckoo.conf, the error still occurs.

Extra information:
Host: Ubuntu version 14.04 (run on ESX)
Python version 2.7
Guest: Windows7 32 bit
Any suggestion please?
Thanks,
Shaahin.

Merging Cuckoo-2.0 RC1 Changes

The Cuckoo team has released Cuckoo-2.0 RC1 today: https://cuckoosandbox.org/2016-01-21-cuckoo-sandbox-20-rc1.html

There are a lot of great features added. TL;DR:

  • Monitoring 64-bit Windows applications and samples.
  • Mac OS X, Linux, and Android analysis support.
  • Integration with Suricata, Snort, and Moloch.
  • Interception and decryption of TLS/HTTPS traffic.
  • Per analysis network routing including VPN support.
  • Over 300 signatures for isolating and identifying malicious behavior.
  • Volatility baseline capture to highlight the changes during the analysis.
  • Extraction of URLs from process memory dumps.
  • Possibility to run extra services in separate VMs next to the analysis.
  • Maliciousness scoring - does this analysis show malicious behavior?
  • Many bug fixes, improvements, tweaks and automation improvements.

Many of these features are already implemented in this fork, but we are missing cross-platform analysis, TLS/HTTPS decryption, network routing (partial), new signatures, Volatility baselines, URL extraction, and same-network services. Maybe we would benefit from listing changes that wouldn't conflict with this fork?

Feature request: tag analysis optimization

The idea here is for a user to be able to insert a certain tag, say 'network', so that processing would only analyze the network traffic for that sample. Is this already possible, or should I implement it and submit a pull request?

Would this have a significant effect on performance for large batches? What would be another way of optimizing this or similar behaviour?
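
For illustration only, a gate like the following could live at the top of a processing module's run() method; it assumes the task dict is available as self.task and its free-form options string under "options", and the "network-only" option name is made up:

# Hypothetical gate: skip a heavy processing module when the task was
# submitted with a "network-only" option/tag.
from lib.cuckoo.common.abstracts import Processing

class ExampleHeavyModule(Processing):
    def run(self):
        self.key = "example"
        options = self.task.get("options") or ""
        if "network-only" in options:
            return {}   # leave only the network analysis to do real work
        # ... normal processing would continue here ...
        return {}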

problem with latest cuckoomon DLL

The latest cuckoomon DLL somehow fails to hook APIs on samples that previously worked.
Hooking works fine using the cuckoomon DLL from the previous commit:
b713be0#diff-d5a3b633e57a670a51110586b006ba72

You may try the following samples for reference:
MD5 2444348a2db8061607e9e20ae95b003d
SHA1 05821f79ada8104be9ec2c01bdbbe7d120c6f94b
SHA256 bc2c2036664aa1954a039cbce4da4a66e492c648408ceb784acda70df43d95ef
SHA512 6a9061bcae88af68cfb6b5c578b854af903cabf2593a2c4b846b71610e29184415cc9c3b5e40fbc7733c93fe984219478ad537d36be938dc8e4067ef9235f6cc

webui analysis page lockups when cuckoo has ~20,000 tasks?

Hi,

I guess the really short version is: we're seeing extremely degraded web UI performance with large numbers of tasks in the db. What can we do to further diagnose it (other than python cuckoo.py --clean)?

The Django UI search takes, say, 2-3 minutes, and individual web UI task result pages take 30 seconds to 5 minutes for properly processed files. The API seems fine for JSON reports of the same pages. Reporting is mongo, cuckoo is mysql. Plenty of CPUs and memory on the server. Latest source, cuckoo-modified, community-modified, apt-get updates, pip lib updates.

Retention is configured but the performance for search and analysis loading is very bad.

bit more info is below

django 1.9 support?

Hi,

not sure if anyone inquired before.
Wondering whether anyone else has tried installing and using the web interface + API with Django 1.9?

I could be wrong, but this looks like it's enough to make it work for most things (just an import fix?) - or has someone tried it, looked at the web server logs afterwards, and decided there are more issues with it? The change could be compatible with older versions too (see the note after the snippets below).

web/analysis/views.py

< from wsgiref.util import FileWrapper
< #from django.core.servers.basehttp import FileWrapper

web/api/views.py

< from wsgiref.util import FileWrapper
< #from django.core.servers.basehttp import FileWrapper
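
For what it's worth, wsgiref.util.FileWrapper is part of the Python 2.7 standard library regardless of the Django version, so the plain wsgiref import should already be backward compatible. A guarded variant, if being extra conservative (just a sketch):

# Django 1.9 removed django.core.servers.basehttp.FileWrapper; older versions
# still ship it. Either branch yields an equivalent FileWrapper.
try:
    from django.core.servers.basehttp import FileWrapper  # Django < 1.9
except ImportError:
    from wsgiref.util import FileWrapper  # stdlib, always available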

thanks.
mb

KeyError suricata

Did a fresh install and the web interface fails when attempting to look at the reports with the following error:

KeyError at /analysis/1/
'suricata'
Request Method: GET
Request URL: http://192.168.1.5:8000/analysis/1/
Django Version: 1.6.1
Exception Type: KeyError
Exception Value:
'suricata'
Exception Location: /home/cuckoo/cuckoo-modified/web/analysis/views.py in report, line 672
Python Executable: /usr/bin/python
Python Version: 2.7.6

ES reporting module example guides?

Hi,

a question rather than an issue again, apologies.

Are there any good examples of configuring Elasticsearch reporting for CSB compared to mongo?
I'm just curious what the specific benefits are. Am I missing out on something amazing about ES syntax searches and possibly visualization via ELK? Or is it that people already use that stack and find it convenient to have CSB reporting slot into it?

It sounds like the CSB Django web UI will still be enabled (just using ES instead of Mongo for fetching task info?), and then people have an alternative UI, for example Kibana, for (I suppose) searching and visualization?

I'm wondering if there are any configuration caveats, performance expectations, or limitations for stored info (I recall reading in either the reporting module, the config, or the commits that there might be caveats?), and any data visualization examples (which I presume go through a UI, e.g. Kibana?).
Is anyone willing to share their experience running it as a reporting DB for larger cuckoo instances, say 20,000-500,000 tasks, and monitoring it?

I also sort of wonder at what point people would use distributed cuckoo. It seems like that's more for separating your per-task process.py instances into more manageable pieces, and it would still aggregate results to a single instance (implying MongoDB and reporting aren't really an expected bottleneck, at least for API usage)?

thanks .
mb.

Database migration fails

/data/cuckoo/utils/db_migration$ alembic upgrade head
INFO [alembic.migration] Context impl PostgresqlImpl.
INFO [alembic.migration] Will assume transactional DDL.
INFO [alembic.migration] Running upgrade 4b09c454108c -> f111620bb8, Add shrike and Parent ID Columns
Traceback (most recent call last):
File "/usr/local/bin/alembic", line 9, in
load_entry_point('alembic==0.7.5.post1', 'console_scripts', 'alembic')()
File "/usr/local/lib/python2.7/dist-packages/alembic/config.py", line 439, in main
CommandLine(prog=prog).main(argv=argv)
File "/usr/local/lib/python2.7/dist-packages/alembic/config.py", line 433, in main
self.run_cmd(cfg, options)
File "/usr/local/lib/python2.7/dist-packages/alembic/config.py", line 416, in run_cmd
**dict((k, getattr(options, k)) for k in kwarg)
File "/usr/local/lib/python2.7/dist-packages/alembic/command.py", line 165, in upgrade
script.run_env()
File "/usr/local/lib/python2.7/dist-packages/alembic/script.py", line 390, in run_env
util.load_python_file(self.dir, 'env.py')
File "/usr/local/lib/python2.7/dist-packages/alembic/util.py", line 243, in load_python_file
module = load_module_py(module_id, path)
File "/usr/local/lib/python2.7/dist-packages/alembic/compat.py", line 79, in load_module_py
mod = imp.load_source(module_id, path, fp)
File "./env.py", line 65, in
run_migrations_online()
File "./env.py", line 58, in run_migrations_online
context.run_migrations()
File "", line 7, in run_migrations
File "/usr/local/lib/python2.7/dist-packages/alembic/environment.py", line 742, in run_migrations
self.get_context().run_migrations(**kw)
File "/usr/local/lib/python2.7/dist-packages/alembic/migration.py", line 309, in run_migrations
step.migration_fn(**kw)
File "/data/cuckoo/utils/db_migration/versions/add_shrike_and_parent_id_columns.py", line 250, in upgrade
_perform(upgrade=True)
File "/data/cuckoo/utils/db_migration/versions/add_shrike_and_parent_id_columns.py", line 132, in _perform
op.drop_table("tasks")
File "", line 7, in drop_table
File "/usr/local/lib/python2.7/dist-packages/alembic/operations.py", line 962, in drop_table
self._table(name, **kw)
File "/usr/local/lib/python2.7/dist-packages/alembic/ddl/impl.py", line 190, in drop_table
self._exec(schema.DropTable(table))
File "/usr/local/lib/python2.7/dist-packages/alembic/ddl/impl.py", line 105, in _exec
return conn.execute(construct, *multiparams, **params)
File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 729, in execute
return meth(self, multiparams, params)
File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/sql/ddl.py", line 69, in _execute_on_connection
return connection._execute_ddl(self, multiparams, params)
File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 783, in _execute_ddl
compiled
File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 958, in _execute_context
context)
File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1159, in _handle_dbapi_exception
exc_info
File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/util/compat.py", line 199, in raise_from_cause
reraise(type(exception), exception, tb=exc_tb)
File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 951, in _execute_context
context)
File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/default.py", line 436, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.InternalError: (InternalError) cannot drop table tasks because other objects depend on it
DETAIL: constraint errors_task_id_fkey on table errors depends on table tasks
constraint guests_task_id_fkey on table guests depends on table tasks
constraint tasks_tags_task_id_fkey on table tasks_tags depends on table tasks
HINT: Use DROP ... CASCADE to drop the dependent objects too.
'\nDROP TABLE tasks' {}

cuckoo web interface issues

I just finished installing cuckoo and am having issues starting the web interface.
Here is the error message.

sudo /opt/cuckoo/web/manage.py runserver
Traceback (most recent call last):
File "/opt/cuckoo/web/manage.py", line 14, in
execute_from_command_line(sys.argv)
File "/usr/local/lib/python2.7/dist-packages/django/core/management/init.py", line 338, in execute_from_command_line
utility.execute()
File "/usr/local/lib/python2.7/dist-packages/django/core/management/init.py", line 303, in execute
settings.INSTALLED_APPS
File "/usr/local/lib/python2.7/dist-packages/django/conf/init.py", line 48, in getattr
self.setup(name)
File "/usr/local/lib/python2.7/dist-packages/django/conf/_init.py", line 44, in setup
self.wrapped = Settings(settings_module)
File "/usr/local/lib/python2.7/dist-packages/django/conf/__init.py", line 92, in init
mod = importlib.import_module(self.SETTINGS_MODULE)
File "/usr/lib/python2.7/importlib/init.py", line 37, in import_module
import(name)
File "/opt/cuckoo/web/web/settings.py", line 16, in
from lib.cuckoo.common.config import Config
ImportError: No module named lib.cuckoo.common.config

maec reporting module - max recursion depth exceeded errors

Hi,

we have the MAEC reporting module enabled for the time being (although thinking of migrating off it, it might take a while)...
Was wondering if anyone else was seeing the following type of errors for it:

2016-02-15 17:17:03,510 [lib.cuckoo.core.plugins] ERROR: Failed to run the reporting module "MAEC41Report":
Traceback (most recent call last):
  File "/home/kittens/utils/../lib/cuckoo/core/plugins.py", line 630, in process
    current.run(self.results)
  File "/home/kittens/utils/../modules/reporting/maec41.py", line 85, in run
    self.addActions()
  File "/home/kittens/utils/../modules/reporting/maec41.py", line 129, in addActions
    self.createProcessActions(process)
  File "/home/kittens/utils/../modules/reporting/maec41.py", line 653, in createProcessActions
    action_dict = self.apiCallToAction(call, pos)
  File "/home/kittens/utils/../modules/reporting/maec41.py", line 276, in apiCallToAction
    action_dict["associated_objects"] = self.processActionAssociatedObjects(mapping_dict["parameter_associated_objects"], parameter_list)
  File "/home/kittens/utils/../modules/reporting/maec41.py", line 381, in processActionAssociatedObjects
    self.processRegKeys(associated_objects_list)
  File "/home/kittens/utils/../modules/reporting/maec41.py", line 538, in processRegKeys
    associated_object = self.processRegKeyHandle(associated_object["properties"]["hive"], associated_object)
  File "/home/kittens/utils/../modules/reporting/maec41.py", line 558, in processRegKeyHandle
    self.processRegKeyHandle(handle_mapped_key["properties"]["hive"], current_dict)
.... about 2000 lines later: 
  File "/home/kittens/utils/../modules/reporting/maec41.py", line 558, in processRegKeyHandle
    self.processRegKeyHandle(handle_mapped_key["properties"]["hive"], current_dict)
  File "/home/kittens/utils/../modules/reporting/maec41.py", line 558, in processRegKeyHandle
    self.processRegKeyHandle(handle_mapped_key["properties"]["hive"], current_dict)
  File "/home/kittens/utils/../modules/reporting/maec41.py", line 558, in processRegKeyHandle
    self.processRegKeyHandle(handle_mapped_key["properties"]["hive"], current_dict)
  File "/home/kittens/utils/../modules/reporting/maec41.py", line 558, in processRegKeyHandle
    self.processRegKeyHandle(handle_mapped_key["properties"]["hive"], current_dict)
RuntimeError: maximum recursion depth exceeded

we seem to get them often enough - about once in 60-70 files - making logs a bit hard to read, but usually we just grep it out.
Thought I'd drop by and ask if anyone can suggest how to narrow it down a bit.

Any hints on providing more helpful info to diagnose the issue? E.g. I could put a few interim debug statements into the MAEC reporting module?

thanks
mb

Dropped Files And Memory Dumps Missing?

So I've been noticing that the dropped files do not include the embedded EXEs from some Word docs that are written to disk and executed.

Debug logs:
2016-02-10 16:32:04,204 [lib.cuckoo.core.resultserver] DEBUG: New process (pid=2900, ppid=2336, name=WINWORD.EXE, path=C:\Program Files (x86)\Microsoft Office\Office15\WINWORD.EXE)
2016-02-10 16:32:11,052 [lib.cuckoo.core.resultserver] DEBUG: New process (pid=1924, ppid=2900, name=kitabc2.exe, path=C:\Users\justin\AppData\Local\Temp\kitabc2.exe)

Additionally, this kitabc2.exe does not show up in the list of modified files and the process does not have a memory dump. The path and file are in the list of executed commands, however. Any clue as to why this might be happening?

zip and rar

I have a Windows XP VirtualBox guest with WinRAR 3.2 and python-rarfile installed, but when I submit an x.rar file, the VirtualBox VM shuts down and raises an error: Analysis failed: The package "modules.packages.rar" start function encountered an unhandled exception: Unrar not installed? (rarfile.UNRAR_TOOL='unrar'). I then followed the rarfile API and set rarfile.UNRAR_TOOL('C:\Program file\winRAR\UnRAR.exe'), but this time it raised another error: Analysis failed: The package "modules.packages.rar" start function raised an error: Unable to execute the initial process, analysis aborted.
I don't know how to deal with it.
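
For reference, rarfile expects UNRAR_TOOL to be assigned as a module-level string rather than called as a function, so something along these lines (the install path is an assumption) is the usual way to point it at the WinRAR binary:

import rarfile

# UNRAR_TOOL is a plain module attribute, not a callable; assign the path.
rarfile.UNRAR_TOOL = r"C:\Program Files\WinRAR\UnRAR.exe"

# After that, extraction works as usual:
rf = rarfile.RarFile("x.rar")
rf.extractall()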

Error: adding task to database

Hello,

When I try to submit a file, I get the following error:

lib.cuckoo.common.exceptions.CuckooDatabaseError: DB schema version mismatch: found 4b09c454108c, expected f111620bb8. Try to apply all migrations (cd utils/db_migration/ && alembic upgrade head).

When I changed the schema version in database.py to "4b09c454108c", I started getting this error:

Error: adding task to database

How do I fix this?

"ERROR :-( Error adding task to Cuckoo's database."

Hi!
I am using the latest cuckoo-modified from spender-sandbox.
I have a new cuckoo installation with VirtualBox and all the dependencies.
For some reason I get the following error when I try to submit samples on the web dashboard
via the /submit/ page.
The issue occurs when trying to submit a sample for analysis, a quarantine file, or some URLs.
I added some screenshots to demonstrate the issue.
--elad

No folders made on host

Running the latest commit (the one from a few hours ago), I noticed that no folder is created on the host when running an analysis.
Just a copy of the binary and the dump.pcap are created.
Pinging between host and guest is no problem.

Can this be solved ?

Thanks

tor.py and /usr/sbin/torstart | torstop

Looking into the Tor support: tor.py needs /usr/sbin/torstart and /usr/sbin/torstop in order to operate, but I can't find any reference to these files. Is there something missing?

sample kills communication to cuckoo server

Sample hash: 1ea976dd806b8a92baab296060b097d266f675ae8a1196bbf6d31ad583effc06

The sample executes fine, but then while it is executing the VM seems to stop listening.

2016-02-09 15:11:03,641 [lib.cuckoo.core.guest] DEBUG: Cuckoo-Dev2_Win7-32-victim1: analysis not completed yet (status=2)
2016-02-09 15:11:04,554 [lib.cuckoo.core.resultserver] DEBUG: New connection from: 192.168.253.31:49174
2016-02-09 15:11:04,556 [lib.cuckoo.core.resultserver] DEBUG: File upload request for shots/0006.jpg
2016-02-09 15:11:04,571 [lib.cuckoo.core.resultserver] DEBUG: Uploaded file length: 134172
2016-02-09 15:11:04,572 [lib.cuckoo.core.resultserver] DEBUG: Connection closed: 192.168.253.31:49174
2016-02-09 15:11:04,646 [lib.cuckoo.core.guest] DEBUG: Cuckoo-Dev2_Win7-32-victim1: analysis not completed yet (status=2)
2016-02-09 15:11:05,400 [lib.cuckoo.core.resultserver] DEBUG: Connection closed: 192.168.253.31:49166
2016-02-09 15:11:05,401 [lib.cuckoo.core.resultserver] DEBUG: Connection closed: 192.168.253.31:49167
2016-02-09 15:11:05,650 [lib.cuckoo.core.guest] DEBUG: Cuckoo-Dev2_Win7-32-victim1: error retrieving status: [Errno 111] Connection refused
2016-02-09 15:11:06,653 [lib.cuckoo.core.guest] DEBUG: Cuckoo-Dev2_Win7-32-victim1: error retrieving status: [Errno 111] Connection refused
2016-02-09 15:11:07,656 [lib.cuckoo.core.guest] DEBUG: Cuckoo-Dev2_Win7-32-victim1: error retrieving status: [Errno 111] Connection refused
2016-02-09 15:11:08,659 [lib.cuckoo.core.guest] DEBUG: Cuckoo-Dev2_Win7-32-victim1: error retrieving status: [Errno 111] Connection refused

This is from the cuckoo.log with debugging turned on. Any thoughts on where I can start digging into this?

Analyzing RTF documents

Is there some kind of script to analyze RTF documents?
When I run one I get an error stating:
" is not a supported file type, cannot extract VBA Macros "

Thanks
Miguël

Feature Request: Option to run dropped files and contacted domains through VirusTotal

Imported from brad-sp/cuckoo-modified#183

"The dropped files tab gives the ability to "Search For Analysis". A useful feature would be to have the hash of the dropped files also be queried, and the VT signature indicate if the file dropped files identified by AVs.

A similar feature for URL reputation would also be handy."

"You would almost certainly need a private API account for VT for this to work reliably, with the one built into Cuckoo or even an Intelligence API account, you would very easily hit limits that would then affect the ability to query VT results for main file submission.

That said, I'd be happy to merge a PR for it -- I think upstream Cuckoo may have added support for this already."

"Yep, have access to a VT API key"

Cuckoo segmentation fault when processing some pdf file

I tried analysing a pdf file which is 2.3MB. Other submissions work fine.
Cuckoo crashes with the following errors :

2016-03-22 18:52:49,548 [lib.cuckoo.core.plugins] DEBUG: Executing processing module "NetworkAnalysis" on analysis at "/home/user/cuckoo/storage/analyses/50"
2016-03-22 18:52:49,646 [lib.cuckoo.core.plugins] DEBUG: Executing processing module "Static" on analysis at "/home/user/cuckoo/storage/analyses/50"
2016-03-22 18:52:49,647 [modules.processing.static] DEBUG: Starting to load PDF
2016-03-22 18:52:59,252 [modules.processing.static] DEBUG: About to parse with PDFParser
2016-03-22 18:53:20,080 [lib.cuckoo.core.plugins] DEBUG: Executing processing module "Strings" on analysis at "/home/user/cuckoo/storage/analyses/50"
2016-03-22 18:53:20,393 [lib.cuckoo.core.plugins] DEBUG: Executing processing module "TargetInfo" on analysis at "/home/user/cuckoo/storage/analyses/50"
segmentation fault

It looks like TargetInfo crashes; is there a way to avoid it or disable it?

Checking if analysis already exists

This is not an issue but more a question.

Is it possible to implement a check, when adding a file for analysis, for whether an analysis of that file already exists?
If it already exists, show some kind of web page with an indication of the existing analysis and the possibility to view it or submit the file for a new analysis.

Thanks
Miguël
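
A rough sketch of what a pre-submission duplicate check could look like, assuming the Database class exposes find_sample() and list_tasks(sample_id=...) the way the 1.2-era code does (method names should be verified against lib/cuckoo/core/database.py):

# Hypothetical duplicate check: look up existing tasks for a file's SHA-256
# before adding a new one.
import hashlib

from lib.cuckoo.core.database import Database

def existing_analyses(file_path):
    sha256 = hashlib.sha256(open(file_path, "rb").read()).hexdigest()
    db = Database()
    sample = db.find_sample(sha256=sha256)
    if not sample:
        return []
    # These tasks could then be offered in the UI before resubmitting.
    return db.list_tasks(sample_id=sample.id)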

Issues Fork

Should I just re-lodge all my existing issues here instead? :)

Can we add OSX support?

The Cuckoo Sandbox guys now have Linux, Darwin, and Android analyzer modules; can you possibly just merge them here?

Would any other modifications be needed to get it working? I've done some tests with the Darwin analyzer module, and the only things I managed to pull out of samples were hosts and strings. No behavioural analysis.

TypeError in API

Hello,

I'm getting a TypeError when attempting to create a file through the API.

TypeError at /api/tasks/create/file/

demux_sample_and_add_to_db() got an unexpected keyword argument 'shrike_url'

Request Method:     POST
Request URL:    https://cuckoo.beer.io/api/tasks/create/file/
Django Version:     1.6.1
Exception Type:     TypeError
Exception Value:    

demux_sample_and_add_to_db() got an unexpected keyword argument 'shrike_url'

Exception Location:     /opt/cuckoo-modified/web/api/views.py in tasks_create_file, line 321
Python Executable:  /usr/bin/python
Python Version:     2.7.6
Python Path:    

['/usr/local/lib/python2.7/dist-packages/PyV8-1.0_dev-py2.7-linux-x86_64.egg',
 '/usr/local/lib/python2.7/dist-packages/pefile-1.2.10.post139-py2.7.egg',
 '/usr/lib/python2.7',
 '/usr/lib/python2.7/plat-x86_64-linux-gnu',
 '/usr/lib/python2.7/lib-tk',
 '/usr/lib/python2.7/lib-old',
 '/usr/lib/python2.7/lib-dynload',
 '/usr/local/lib/python2.7/dist-packages',
 '/usr/lib/python2.7/dist-packages',
 '/usr/lib/python2.7/dist-packages/PILcompat',
 '/usr/lib/python2.7/dist-packages/gtk-2.0',
 '/usr/lib/pymodules/python2.7',
 '/opt/cuckoo-modified/',
 '/opt/cuckoo-modified/web',
 '/opt/cuckoo-modified/web/..',
 '/opt/cuckoo-modified/',
 '/opt/cuckoo-modified/',
 '/opt/cuckoo-modified/',
 '/opt/cuckoo-modified/',
 '/opt/cuckoo-modified/',
 '/opt/cuckoo-modified/']

Server time:    Wed, 27 Jan 2016 13:44:49 -0500

I'll keep poking at this but figured you guys might find a solution quicker than I. Thanks for the awesome stuff!

process.py locking up on certain pdfs

hi,

we seem to have a 'lucky stream' of corrupted PCAP extracts where PDFs cause the processing to hang indefinitely... (normally executables or other PDFs submit, process, and report OK).
While I'll try to post a PCAP, that might not be possible (can Brad message me privately about that?).

Anyhow, we're running processing tasks not within the main cuckoo process, but via the utils/process.py helper.

e.g. /utils/process.py -d -p 7 auto

Eventually enough of those PDFs queue up that processing halts completely, as all of the processing workers hang once enough corrupted PDFs have been submitted.
2016-02-18 23:03:57,947 [modules.processing.static] DEBUG: Starting to load PDF
2016-02-18 23:04:00,365 [modules.processing.static] DEBUG: About to parse with PDFParser
2016-02-18 23:04:03,622 [modules.processing.static] DEBUG: About to parse with PDFParser
2016-02-18 23:04:06,989 [modules.processing.static] DEBUG: About to parse with PDFParser
2016-02-18 23:04:07,711 [modules.processing.static] DEBUG: About to parse with PDFParser
2016-02-18 23:04:18,115 [modules.processing.static] DEBUG: About to parse with PDFParser
2016-02-18 23:04:18,923 [modules.processing.static] DEBUG: About to parse with PDFParser
2016-02-18 23:04:19,794 [modules.processing.static] DEBUG: About to parse with PDFParser
^ and then nothing happens. Without extra debug statements in static.py after instantiating the PDF parser, I'm not too sure whether it freezes there or on result processing. I suspect it's in PDFParser/peepdf PDFCore.

Occasionally it seems to restart processing (probably when I kick it over) and then the same set of tasks freezes again.

As far as I can tell, static analysis happens via peepdf (which is a bit newer in their repo, by the way - using that still makes processing freeze). While on the subject: did they merge the Google Summer of Code PDF malscore changes, and did we pull them into csb or brad-csb?

So anyhow, I was hoping for a quick workaround that someone (who knows a bit more Python, and about safely killing processing tasks and marking them as processing- or reporting-failed) could help with: for example, a watchdog timer in process.py with an extra config setting, so that if processing takes over 600 seconds the task gets marked as failed and the helper process for that task ID terminates itself, preferably safely. (A rough sketch of the idea is included below.)

Thoughts/please help/questions?
Mb.
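
For illustration, a minimal sketch of the watchdog idea: run each task's processing in a child process and kill it after a configurable timeout. The 600-second value and the mark_task_failed() stub are assumptions, not existing Cuckoo code:

# Watchdog sketch for utils/process.py-style workers.
import multiprocessing

PROCESSING_TIMEOUT = 600  # seconds; would come from a new config option

def mark_task_failed(task_id):
    # Stub: a real patch would flag the task as failed in the database
    # so it is not picked up for processing again.
    print("task %d marked as failed after timeout" % task_id)

def process_with_timeout(process_func, task_id):
    worker = multiprocessing.Process(target=process_func, args=(task_id,))
    worker.start()
    worker.join(PROCESSING_TIMEOUT)
    if worker.is_alive():
        worker.terminate()   # kill the hung static/PDF processing
        worker.join()
        mark_task_failed(task_id)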

proxy support for VT lookup module?

hi,

Not sure if it'll cater for all flavours (or whether you also need to add a hosts entry if DNS is delegated to the proxy instead of being resolved by the client, i.e. if internal DNS will not resolve external names and that's delegated to the proxy). At the moment you'll get strange timeouts and processing delays in certain environments.

in any case:

https://github.com/spender-sandbox/cuckoo-modified/blob/master/modules/processing/virustotal.py

fairly sure it's just replacing
r = requests.get(url, params=data, verify=True, timeout=int(timeout))

with
proxies = {"http": "http://proxyname:8080", "https": "http://proxyname:8080"}
r = requests.get(url, params=data, verify=True, timeout=int(timeout), proxies=proxies)

Of course that's not taken from the processing config, but if there's any interest, I suspect that's not difficult to add.
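
If it were taken from the config, the change might look roughly like this; the "proxy" option name in processing.conf is made up, and self.options is assumed to expose the module's config section:

# Hypothetical variant of the lines above, reading an optional proxy setting
# from the module's configuration instead of hard-coding it.
proxy = self.options.get("proxy")
proxies = {"http": proxy, "https": proxy} if proxy else None
r = requests.get(url, params=data, verify=True,
                 timeout=int(timeout), proxies=proxies)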

thanks,
mb

API request for IOCS django error

I was getting a Django error for some samples that didn't contact any hosts, in:

web/api/views.py:1274 - KeyError exception for the 3 del's
web/api/views.py:1299 - KeyError exception for 3 buf asignments

Fixed using a try/except; maybe there is a more elegant solution to this?
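
One slightly tidier alternative to wrapping each line in try/except is to use dict defaults; a sketch only, since the actual variable and key names at those lines may differ:

# Avoid the KeyError without try/except by using dict defaults.
# "report" and the key names below are placeholders for whatever
# views.py touches at lines 1274 and 1299.
for key in ("hosts", "domains", "dns"):
    report.pop(key, None)         # delete if present, ignore if missing

buf = report.get("hosts", [])     # read with a default instead of indexing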

Admin Delete Task

If you delete a task through the Admin tab of the web interface, the corresponding reporting is not deleted from storage/analysis/TaskNumber/. In addition, if you delete the newest task and submit another one, Cuckoo will finish the reporting very quickly by processing the previously gathered output from the "deleted" task.

The delete functionality should clean up in storage/analysis/ along with checking/recreating that the symbolic link for storage/analysis/latest is pointing to the correct location.

Of course, looking through this, I find the following in web/analysis/views.py - so is this a known open issue?

@require_safe
def remove(request, task_id):
"""Remove an analysis.
@todo: remove folder from storage.
"""

File not executing during analysis

When I run a certain file, it is not executed for analysis and no screenshots are shown.

When I run it within cuckoosandbox-dev, it is executed and I see the screenshots of the execution.

What could be the problem?

Thanks
Miguël

Crash on opening IE

IE is crashing in my sandboxes during injection when using cuckoo-modified now. The VM was transferred over from another machine where it was working fine, but possibly with a slightly newer version of cuckoo-modified. I removed all plugins, tried different versions of IE in the browser, etc., and it still happens. The only time the browser crash doesn't occur is when injection is disabled. I tried the cuckoo2-dev tree (latest build as of now) and it works fine for URLs and HTML files, opening IE without a crash.

Other browsers such as Firefox and Chrome are fine.

PassiveTotal Links On Suricata Alerts Page

I happened to notice, when turning on Suricata integration, that the [VT] links for source IP, dest IP, and host appear; however, the PassiveTotal [PT] links appear to be absent even when it is enabled in auxiliary.conf.

MultiCore Processing

So I tried switching off processing as per:
https://github.com/spender-sandbox/cuckoo-modified/blob/daec595861696c3d0e73d0bee0c6e22af9603f26/docs/book/src/usage/performance.rst

"Multi-Core processing

By switching off processing ( conf/cuckoo.conf, process_results in [cuckoo]) the processing step can be done in a separate utils/process.py task running several process."

However the results never seem to get processed and each task is just listed as "processing" in the web-ui. Is there some other step I am missing here?

Office Martians not counted

Hi,

I have enabled the show office martians option and browser martians. While browser martians are shown, office martian files are not shown in the count on the Recent tab.

recent database.py changes look to make cuckoo wait indefinitely for analysis tasks?

Hi,

not sure what I'm doing wrong, but after cloning the latest master (including 464f9be),
cuckoo main (as in cuckoo.py) starts, loads modules, machinery, etc.

Then it stays indefinitely at:

2016-03-01 20:36:18,036 [lib.cuckoo.common.abstracts] DEBUG: Getting status for puppies1
2016-03-01 20:36:18,108 [lib.cuckoo.common.abstracts] DEBUG: Getting status for puppies2
2016-03-01 20:36:18,135 [lib.cuckoo.core.scheduler] INFO: Loaded 2 machine/s
2016-03-01 20:36:18,144 [lib.cuckoo.core.scheduler] INFO: Waiting for analysis tasks.
...
nothing after.

Newly submitted or already queued tasks don't seem to get picked up by cuckoo main, but the web UI displays the new tasks in the queue fine.

looking at https://github.com/spender-sandbox/cuckoo-modified/blob/master/lib/cuckoo/core/database.py

Possibly today's session-close commits, or maybe the session.query one?
Reverting today's database.py commits makes it work OK.

Running mysql + mongo for the DBs.

help extracting IoCs from tasks directly from mongo?

hello and heh.

This isn't really an issue (a user issue :D); I was just hoping to ask a question (is there a more appropriate place to?) about extracting info from the mongo reporting DB. (I'm not quite sure what the division of labor between mongo and mysql is, but I think all of the reporting info goes to mongo.)

I'd like a batch (i.e. Python script) way of extracting IoCs for a particular query.
Say the query is for a wave of malspam... let's say a combination of some or all of the following: .doc & date > a & date < b & signatures matched contains abc & signatures (name or description, or for example office martians >= 1).

Then extract specific IoCs from the task report, say IPs + domains from the network info section, preferably into a per-file CSV with basic info (say fname + sha1 + ip + domain).
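
For illustration, pulling this straight out of Mongo with pymongo would look roughly like the sketch below. The database, collection, and field names reflect a typical Cuckoo Mongo report layout, but they are assumptions and should be checked against your own report documents (and the signature name is just an example):

# Sketch: query the Mongo reporting DB for matching analyses and dump
# filename/sha1/IP/domain rows to a CSV.
import csv

from pymongo import MongoClient

db = MongoClient("127.0.0.1", 27017)["cuckoo"]

query = {
    "target.file.name": {"$regex": r"\.doc$"},
    "signatures.name": "office_martian",   # example signature name
    # add {"info.started": {"$gte": ..., "$lt": ...}} for a date window
}

with open("iocs.csv", "wb") as fh:          # "wb" for the Python 2 csv module
    writer = csv.writer(fh)
    writer.writerow(["fname", "sha1", "ip", "domain"])
    for report in db.analysis.find(query, {"target": 1, "network": 1}):
        target = report.get("target", {}).get("file", {})
        for dom in report.get("network", {}).get("domains", []):
            writer.writerow([target.get("name", ""), target.get("sha1", ""),
                             dom.get("ip", ""), dom.get("domain", "")])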

Process.py seems to hang if encounters very large strings

Some samples have 11 MB of strings; the processing module seems to hang at:

'Processing strings'

pressing ctrl+c solves the issue and the report is regenerated, although then no strings show.

Can we add a limit to the size of the strings that we process and add to mongo?
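
One hedged way to do that: check the sample size before extracting strings at all. A sketch only; the 16 MB cap (and whether it should be a config option) is an assumption:

# Hypothetical size guard for the Strings processing module: skip string
# extraction for oversized samples instead of letting processing hang.
import os

MAX_STRINGS_FILESIZE = 16 * 1024 * 1024  # 16 MB, arbitrary

def should_extract_strings(file_path):
    try:
        return os.path.getsize(file_path) <= MAX_STRINGS_FILESIZE
    except OSError:
        return False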

Sample of ISRStealer fails to communicate

I recently came across a sample of ISRStealer ( 1a5a0d2ae44eb5d19fa83dfbea710e3890eabc39700d2271dcd6807d444d024e on VirusTotal). It communicates when run in a commercial sandbox, but it doesn't create any network traffic in my instance of Cuckoo. Any ideas?

Type Error in `views.py`

I ran across this issue a while back; I'm not sure whether the issue made the transition with you.

brad-sp/cuckoo-modified@3fa0daf

The fix I proposed shouldn't be utilized as such, but it prevents the issue as a stop-gap. Thoughts as to why this might be happening?

Angler False Positive

Hello,

I seem to be getting Angler detection in the Malscore for low or even XLS files.

My last repo update was Jan-26 - was there a patch to this?

'VolatilityManager' object has no attribute 'zipdump'

Saw this in debug output with Volatility enabled:

2016-01-30 14:50:20,622 [modules.processing.memory] ERROR: Generic error executing volatility
Traceback (most recent call last):
File "/home/cuckoo/cuckoo-modified/modules/processing/memory.py", line 1155, in run
results = vol.run(manager=machine_manager, vm=task_machine)
File "/home/cuckoo/cuckoo-modified/modules/processing/memory.py", line 1068, in run
self.zipdump()
AttributeError: 'VolatilityManager' object has no attribute 'zipdump'
