smicallef / spiderfoot
SpiderFoot automates OSINT for threat intelligence and mapping your attack surface.
Home Page: http://www.spiderfoot.net
License: MIT License
After starting a new scan against my own domain, scanning begins normally. After a successful start, the URL is http://127.0.0.1:5001/startscan
. My first instinct on that page was to reload it to get the scan status. Reloading that page ends in an error:
404 Not Found
Missing parameters: modulelist,scantarget,scanname
Traceback (most recent call last):
File "/home/fgeek/utils/builds/python/2.7.5/lib/python2.7/site-packages/cherrypy/_cprequest.py", line 656, in respond
response.body = self.handler()
File "/home/fgeek/utils/builds/python/2.7.5/lib/python2.7/site-packages/cherrypy/lib/encoding.py", line 188, in __call__
self.body = self.oldhandler(*args, **kwargs)
File "/home/fgeek/utils/builds/python/2.7.5/lib/python2.7/site-packages/cherrypy/_cpdispatch.py", line 40, in __call__
raise sys.exc_info()[1]
HTTPError: (404, 'Missing parameters: modulelist,scantarget,scanname')
In my opinion, if the parameters are missing, the request should instead show the Scans page or the relevant scaninfo?id= page. It is also bad practice to show a traceback in the web UI when debugging is not enabled.
Also, because SpiderFoot uses POST requests, the web interface is harder to navigate: the user can't use the browser's back button, for example to go from a specific scan back to the list of all scans.
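Both points could be addressed with redirects: send the user back to the scan list when required parameters are missing, and use Post/Redirect/Get after a successful submission so the back button works. A minimal sketch of that handler logic, with hypothetical names (this is not SpiderFoot's actual code):

```python
def handle_startscan(params, start_scan):
    """Illustrative /startscan handler logic.

    On reload the browser re-requests the URL without the POST
    parameters; redirecting instead of raising a 404 with a traceback
    is friendlier, and redirecting after a successful POST
    (Post/Redirect/Get) makes the browser back button usable.
    'start_scan' is a hypothetical helper that kicks off the scan
    and returns its id.
    """
    required = ("scanname", "scantarget", "modulelist")
    if any(not params.get(p) for p in required):
        return ("redirect", "/")  # back to the Scans page
    scan_id = start_scan(params)
    return ("redirect", "/scaninfo?id=" + scan_id)
```

In CherryPy specifically, the redirect would be issued by raising cherrypy.HTTPRedirect with the target URL.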
I got the package via Git, but I get these errors when I try to start it.
root@Kali:~/Scripts/spiderfoot# ls
dyn evtypes.sql ext LICENSE LICENSE.tp modules README setup.py sfdb.py sflib.py sf.py sfscan.py sfwebui.py spiderfoot.schema static THANKYOU VERSION
root@Kali:~/Scripts/spiderfoot# ./sf.py
./sf.py: line 11: $'\r': command not found
./sf.py: line 14: $'\r': command not found
./sf.py: line 16: syntax error near unexpected token `('
./sf.py: line 16: `cmd_subfolder = os.path.realpath(os.path.abspath(os.path.join(os.path.split(inspect.getfile(inspect.currentframe()))[0],"ext")))'
root@Kali:~/Scripts/spiderfoot#
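The "$'\r': command not found" errors mean the file was saved or checked out with Windows (CRLF) line endings, which the shell trips over. Stripping the carriage returns fixes it; demonstrated here on a throwaway file (run the same sed on sf.py):

```shell
# Create a demo file with CRLF endings, as a git checkout with
# core.autocrlf=true might produce:
printf '#!/bin/sh\r\necho ok\r\n' > demo.sh
# Strip the trailing carriage returns (dos2unix works too):
sed -i 's/\r$//' demo.sh
sh demo.sh   # now runs without the \r errors
```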
The installation instructions are missing swig. On Debian, for example, it can be installed with: sudo apt-get install swig
. SpiderFoot needs swig to build M2Crypto.
Unhandled exception (UnboundLocalError) encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "/home/gianz/spiderfoot/sfscan.py", line 228, in startScan\n psMod.notifyListeners(firstEvent)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_ir.py", line 305, in handleEvent\n self.notifyListeners(asevt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_ir.py", line 281, in handleEvent\n self.nbreported[asn] = True\n', "UnboundLocalError: local variable 'asn' referenced before assignment\n"]
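The traceback suggests 'asn' is only assigned on some branches of sfp_ir's handleEvent before 'self.nbreported[asn] = True' runs. A hedged sketch of the usual fix, binding the name up front and checking it before use (helper names here are hypothetical, not SpiderFoot's API):

```python
def handle_asn_event(event_data, lookup_asn, nbreported):
    """Guard against UnboundLocalError by always binding 'asn' first."""
    asn = None  # bind the name before any conditional assignment
    if event_data.startswith("AS"):
        asn = event_data
    else:
        asn = lookup_asn(event_data)  # may legitimately return None
    if asn is None:
        return None  # skip reporting instead of crashing
    nbreported[asn] = True
    return asn
```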
Extraction of Human Names stops at the first character with diacritics. Example: for the name "Martin Smišnál", only "Martin Smi" is extracted.
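A plausible cause is an ASCII-only name pattern such as [A-Z][a-z]+, which stops matching at the first accented letter. A hedged illustration of the difference (the real SpiderFoot pattern may differ):

```python
import re

text = "Contact: Martin Smišnál, engineering"

# ASCII-only character classes stop at the first accented letter:
ascii_name = re.compile(r"[A-Z][a-z]+ [A-Z][a-z]+")
# [^\W\d_] matches any Unicode letter, so diacritics are kept:
unicode_name = re.compile(r"[A-Z][^\W\d_]+ [A-Z][^\W\d_]+")

print(ascii_name.search(text).group(0))    # Martin Smi
print(unicode_name.search(text).group(0))  # Martin Smišnál
```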
Currently the code to generate CSV files is this:
# Get result data in CSV format
def scaneventresultexport(self, id, type):
    dbh = SpiderFootDb(self.config)
    data = dbh.scanResultEvent(id, type)
    blob = "\"Updated\",\"Type\",\"Module\",\"Source\",\"Data\"\n"
    for row in data:
        lastseen = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(row[0]))
        escapedData = cgi.escape(row[1].replace("\n", "#LB#").replace("\r", "#LB#"))
        escapedSrc = cgi.escape(row[2].replace("\n", "#LB#").replace("\r", "#LB#"))
        blob = blob + "\"" + lastseen + "\",\"" + row[4] + "\",\""
        blob = blob + row[3] + "\",\"" + escapedSrc + "\",\"" + escapedData + "\"\n"
    cherrypy.response.headers['Content-Disposition'] = "attachment; filename=SpiderFoot.csv"
    cherrypy.response.headers['Content-Type'] = "application/csv"
    cherrypy.response.headers['Pragma'] = "no-cache"
    return blob
scaneventresultexport.exposed = True
Unfortunately, this code allows double quotes to be included in fields without escaping them. That means there's no reliable way of telling when each field ends, since splitting each row using the comma character wouldn't work either (fields may also contain commas).
One possible solution is to escape double quote characters. But a better solution, IMO, is to use a proper parser like the "csv" module from the standard library, instead of manually concatenating strings. This not only fixes parsing problems in a standardized way, it also lets you control the specific CSV dialect. I'm working on a patch right now, I'll send you a pull request when it's ready, for your consideration.
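For illustration, a sketch of the csv-module approach (Python 3 syntax, same column order as the code above). csv.writer quotes fields and doubles embedded quotes per the standard CSV convention, so no manual string surgery is needed:

```python
import csv
import io
import time

def scan_results_to_csv(rows):
    """Build the CSV export with the csv module instead of string
    concatenation. Each input row is (timestamp, data, source,
    module, type), matching the indices used in the original code."""
    buf = io.StringIO()
    writer = csv.writer(buf, quoting=csv.QUOTE_ALL)
    writer.writerow(["Updated", "Type", "Module", "Source", "Data"])
    for row in rows:
        lastseen = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(row[0]))
        writer.writerow([lastseen, row[4], row[3], row[2], row[1]])
    return buf.getvalue()
```

Fields containing quotes, commas or newlines then round-trip cleanly through any standard CSV parser.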
Due to an internal issue when processing HTTP header information, server and other data is not being processed, resulting in a reduced result set.
I encounter below error when I want to run spiderfoot:
prompt# python ./sf.py
Traceback (most recent call last):
File "./sf.py", line 113, in
sfModules[modName]['object'] = getattr(mod, modName)()
AttributeError: 'module' object has no attribute 'sfp_tcpportscan'
Does anyone have an idea how to solve this?
Unhandled exception (AddrFormatError) encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "sfscan.pyc", line 228, in startScan\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_bingsearch.pyc", line 83, in handleEvent\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_dns.pyc", line 162, in handleEvent\n', ' File "modules\sfp_dns.pyc", line 330, in processHost\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_bingsearch.pyc", line 83, in handleEvent\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_dns.pyc", line 162, in handleEvent\n', ' File "modules\sfp_dns.pyc", line 334, in processHost\n', ' File "modules\sfp_dns.pyc", line 417, in processDomain\n', ' File "modules\sfp_dns.pyc", line 330, in processHost\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_spider.pyc", line 259, in handleEvent\n', ' File "modules\sfp_spider.pyc", line 283, in spiderFrom\n', ' File "modules\sfp_spider.pyc", line 100, in processUrl\n', ' File "modules\sfp_spider.pyc", line 198, in contentNotify\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_dns.pyc", line 162, in handleEvent\n', ' File "modules\sfp_dns.pyc", line 330, in processHost\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_googlesearch.pyc", line 106, in handleEvent\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_spider.pyc", line 259, in handleEvent\n', ' File "modules\sfp_spider.pyc", line 309, in spiderFrom\n', ' File "modules\sfp_spider.pyc", line 100, in processUrl\n', ' File "modules\sfp_spider.pyc", line 198, in contentNotify\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_dns.pyc", line 162, in handleEvent\n', ' File "modules\sfp_dns.pyc", line 330, in processHost\n', ' File "sflib.pyc", line 1120, in 
notifyListeners\n', ' File "modules\sfp_bingsearch.pyc", line 104, in handleEvent\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_spider.pyc", line 259, in handleEvent\n', ' File "modules\sfp_spider.pyc", line 283, in spiderFrom\n', ' File "modules\sfp_spider.pyc", line 123, in processUrl\n', ' File "modules\sfp_spider.pyc", line 179, in linkNotify\n', ' File "sflib.pyc", line 1243, in matches\n', ' File "netaddr\strategy\ipv4.pyc", line 105, in valid_str\n', 'AddrFormatError: Empty strings are not supported!\n']
SpiderFoot 2.0 is GPLv2 licensed, but there is no LICENSE file in the package.
SpiderFoot should be able to execute several scans simultaneously, or at least have the ability to queue scans. In my opinion this is a must-have feature. It is missing from 2.0.4 at least.
Unhandled exception (UnicodeEncodeError) encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "/home/gianz/spiderfoot/sfscan.py", line 228, in startScan\n psMod.notifyListeners(firstEvent)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_bingsearch.py", line 83, in handleEvent\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 162, in handleEvent\n self.processHost(match, parentEvent, affiliate=False)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 327, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_bingsearch.py", line 83, in handleEvent\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 162, in handleEvent\n self.processHost(match, parentEvent, affiliate=False)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 331, in processHost\n self.processDomain(dom, evt)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 414, in processDomain\n self.processHost(name, domevt, affiliate=False)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 327, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_bingsearch.py", line 104, in handleEvent\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File 
"/home/gianz/spiderfoot/modules/sfp_spider.py", line 259, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 309, in spiderFrom\n freshLinks = self.processUrl(link)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 100, in processUrl\n self.contentNotify(url, fetched, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 198, in contentNotify\n self.notifyListeners(event)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 162, in handleEvent\n self.processHost(match, parentEvent, affiliate=False)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 327, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 206, in handleEvent\n self.processHost(addr, parentEvent, affiliate=False)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 327, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 241, in handleEvent\n self.processHost(sip, parentEvent, affiliate=False)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 327, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 241, in handleEvent\n self.processHost(sip, parentEvent, affiliate=False)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 327, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n 
listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 241, in handleEvent\n self.processHost(sip, parentEvent, affiliate=False)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 327, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 243, in handleEvent\n self.processHost(sip, parentEvent, affiliate=False)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 327, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 206, in handleEvent\n self.processHost(addr, parentEvent, affiliate=False)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 327, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_googlesearch.py", line 106, in handleEvent\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 259, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 283, in spiderFrom\n links = self.processUrl(startingPoint) # fetch first page\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 100, in processUrl\n self.contentNotify(url, fetched, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 198, in contentNotify\n self.notifyListeners(event)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", 
line 162, in handleEvent\n self.processHost(match, parentEvent, affiliate=False)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 327, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_bingsearch.py", line 104, in handleEvent\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 259, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 309, in spiderFrom\n freshLinks = self.processUrl(link)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 100, in processUrl\n self.contentNotify(url, fetched, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 198, in contentNotify\n self.notifyListeners(event)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_names.py", line 137, in handleEvent\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1120, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_socialprofiles.py", line 92, in handleEvent\n searchStr = sites[site][0].format(eventData).replace(" ", "%20")\n', "UnicodeEncodeError: 'ascii' codec can't encode character u'\xe9' in position 3: ordinal not in range(128)\n"]
Hi, very often when I add a new domain to scan I get:
500 Internal Server Error
The server encountered an unexpected condition which prevented it from fulfilling the request.
Traceback (most recent call last):
File "cherrypy_cprequest.pyc", line 656, in respond
File "cherrypy\lib\encoding.pyc", line 188, in call
File "cherrypy_cpdispatch.pyc", line 34, in call
File "sfwebui.pyc", line 243, in startscan
File "mako\template.pyc", line 443, in render
File "mako\runtime.pyc", line 783, in _render
File "mako\runtime.pyc", line 815, in _render_context
File "mako\runtime.pyc", line 841, in _exec_template
File "dyn_newscan_tmpl", line 64, in render_body
IndexError: string index out of range
Stupidly forgot to change this before the release, so by default it binds to 0.0.0.0. This has no functional impact on the user, but it is less secure.
Will fix in 2.0.1 along with other fixes.
If a fetch returns a 301 (Moved Permanently) redirect to a remote site, it is automatically followed and fetched, but the fetched content is then not from the target.
Seems like global config settings need a spiderfoot restart before taking effect, but this should not be the case.
Unhandled exception (TypeError) encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "/home/tom/spiderfoot/sfscan.py", line 195, in startScan\n module.start()\n', ' File "/home/tom/spiderfoot/modules/sfp_bingsearch.py", line 64, in start\n useragent=self.opts['_useragent'], timeout=self.opts['_fetchtimeout']))\n', ' File "/home/tom/spiderfoot/sflib.py", line 825, in bingIterate\n if firstPage['code'] == 400 or "/challengepic?" in firstPage['content']:\n', "TypeError: argument of type 'NoneType' is not iterable\n"]
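The TypeError suggests fetchUrl returns None on a failed fetch, and the result is used without a check. A guarded version of that condition might look like this (a sketch; the field names follow the traceback, the function name is hypothetical):

```python
def looks_like_challenge(first_page):
    """Return True if the search engine served a CAPTCHA page;
    tolerate failed fetches instead of crashing.

    fetchUrl appears to return None (or a dict whose 'content' is
    None) on error, so test that before using 'in' on the content.
    """
    if first_page is None or first_page.get("content") is None:
        return False  # treat a failed fetch as "no results"
    return first_page.get("code") == 400 or "/challengepic?" in first_page["content"]
```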
If you edit a module in the modules directory when running on Windows, it will have no effect, as py2exe has compiled them into the executable. All they are used for is getting a list of modules.
When running a scan I get this traceback:
404 Not Found
Missing parameters: modulelist,scantarget,scanname
Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/cherrypy/_cprequest.py", line 656, in respond
response.body = self.handler()
File "/usr/local/lib/python2.7/site-packages/cherrypy/lib/encoding.py", line 188, in __call__
self.body = self.oldhandler(*args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/cherrypy/_cpdispatch.py", line 40, in __call__
raise sys.exc_info()[1]
HTTPError: (404, 'Missing parameters: modulelist,scantarget,scanname')
Insert the following into the Scan Name field: "><img src="http://paste.nerv.fi/22296149-fabulous.jpeg">
and enjoy. Version 2.0.4.
References:
Should be fixed at least when authentication is implemented :)
I scanned my own site and this page was picked up as having a password field:
http://www.digininja.org/blog/burp_intruder_types.php
The field appears as text with the angle brackets HTML-encoded, so this should not be reported.
When initializing a new scan, you should use example.org or example.com as the example, because scantarget.com might be owned by a real organization. Not a big issue obviously, but Nmap, for example, used a working domain in its examples and the owner did not like it. You can guess why ;)
stacktrace:
Unhandled exception (BadZipfile) encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "/home/gianz/spiderfoot_5002/sfscan.py", line 195, in startScan\n module.start()\n', ' File "/home/gianz/spiderfoot_5002/modules/sfp_bingsearch.py", line 81, in start\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot_5002/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot_5002/modules/sfp_dns.py", line 120, in handleEvent\n self.processHost(match, parentEvent)\n', ' File "/home/gianz/spiderfoot_5002/modules/sfp_dns.py", line 271, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot_5002/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot_5002/modules/sfp_spider.py", line 250, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot_5002/modules/sfp_spider.py", line 300, in spiderFrom\n freshLinks = self.processUrl(link)\n', ' File "/home/gianz/spiderfoot_5002/modules/sfp_spider.py", line 128, in processUrl\n self.urlEvents[link] = self.linkNotify(link, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot_5002/modules/sfp_spider.py", line 181, in linkNotify\n self.notifyListeners(event)\n', ' File "/home/gianz/spiderfoot_5002/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot_5002/modules/sfp_filemeta.py", line 114, in handleEvent\n doc = openxmllib.openXmlDocument(data=ret['content'], mime_type=mtype)\n', ' File "/home/gianz/spiderfoot_5002/ext/openxmllib/init.py", line 58, in openXmlDocument\n return class_(file_, mime_type=mime_type)\n', ' File "/home/gianz/spiderfoot_5002/ext/openxmllib/document.py", line 65, in init\n openxmldoc = zipfile.ZipFile(file_, 'r', zipfile.ZIP_DEFLATED)\n', ' File "/usr/lib/python2.7/zipfile.py", line 770, in init\n self._RealGetContents()\n', ' File 
"/usr/lib/python2.7/zipfile.py", line 811, in _RealGetContents\n raise BadZipfile, "File is not a zip file"\n', 'BadZipfile: File is not a zip file\n']
Here's what I got:
Unhandled exception (AttributeError) encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "/spiderfoot/spiderfoot/sfscan.py", line 119, in startScan\n dns.resolver.restore_system_resolver()\n', "AttributeError: 'module' object has no attribute 'restore_system_resolver'\n"]
I did point this directly at an internal IP address, however... if that makes a difference. Thank you.
Hello,
I have been trying to export results to a CSV file, but with no success... :(
I get the message "Waiting to connect to localhost...", but it never finishes, and in the end no data is dumped to the CSV file.
Has anyone experienced this issue? Can someone provide some info about it, please?
Thanks!
A smallish scan I was running crashed with the following error message:
2014-01-06 10:30:09 SpiderFoot ERROR
Unhandled exception (UnicodeEncodeError) encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "sfscan.pyc", line 152, in startScan\n', ' File "modules\\sfp_googlesearch.pyc", line 167, in start\n', ' File "sflib.pyc", line 794, in notifyListeners\n', ' File "modules\\sfp_spider.pyc", line 249, in handleEvent\n', ' File "modules\\sfp_spider.pyc", line 299, in spiderFrom\n', ' File "modules\\sfp_spider.pyc", line 127, in processUrl\n', ' File "modules\\sfp_spider.pyc", line 180, in linkNotify\n', ' File "sflib.pyc", line 794, in notifyListeners\n', ' File "modules\\sfp_crossref.pyc", line 103, in handleEvent\n', ' File "sflib.pyc", line 678, in fetchUrl\n', "UnicodeEncodeError: 'ascii' codec can't encode character u'\\ufffd' in position 33: ordinal not in range(128)\n"]
Guessing the problem isn't the UnicodeEncodeError itself, as that could be the target's fault; the issue is that it brought down the whole scan.
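One way to contain that (a sketch, not SpiderFoot's actual notifyListeners) is to isolate each listener, so a single module's exception is logged rather than aborting the entire scan:

```python
def notify_listeners(listeners, event, log_error):
    """Dispatch an event to each listener; a failing module is logged
    and skipped instead of taking down the whole scan."""
    for listener in listeners:
        try:
            listener(event)
        except Exception as exc:
            name = getattr(listener, "__name__", repr(listener))
            log_error("module %s failed: %s" % (name, exc))
```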
The full certificate is not properly checked for host mismatches.
Supported versions of CherryPy, Python, etc. are not mentioned in the available documentation.
If a file extension in the filter list is found anywhere in the URL, it will get filtered out.
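A hedged sketch of the fix: test the extension against the URL path's suffix rather than as a substring of the whole URL (Python 3 urllib shown; SpiderFoot's actual filter code may differ):

```python
from urllib.parse import urlparse

def has_filtered_extension(url, filtered_exts):
    """Match only the path's file extension, not any substring of
    the URL. A substring test like '.zip' in url would wrongly
    filter http://example.com/.zipcodes/page.html."""
    path = urlparse(url).path.lower()
    return any(path.endswith("." + ext.lower()) for ext in filtered_exts)
```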
So... Python 3 is starting to be used more and more and, more tellingly, being shipped as default on many Linux distros.
I have started a fork at https://github.com/jspc/spiderfoot/tree/python3 which works under Python 3, though it is merely a WIP. This repo has no tests written for it as far as I can see (which is bad anyway), so a simple smoke test is all I've been able to do.
Alongside the Python 3 changes, this fork also adds a .gitignore file.
The main question behind this fork: should this project not be using Python 3? Why would it not use Python 3?
Unhandled exception (error) encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "/home/tom/spiderfoot/sfscan.py", line 195, in startScan\n module.start()\n', ' File "/home/tom/spiderfoot/modules/sfp_bingsearch.py", line 81, in start\n self.notifyListeners(evt)\n', ' File "/home/tom/spiderfoot/sflib.py", line 1033, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/tom/spiderfoot/modules/sfp_dns.py", line 120, in handleEvent\n self.notifyListeners(evt)\n', ' File "/home/tom/spiderfoot/sflib.py", line 1033, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/tom/spiderfoot/modules/sfp_dns.py", line 194, in handleEvent\n self.processHost(host, parentEvent)\n', ' File "/home/tom/spiderfoot/modules/sfp_dns.py", line 264, in processHost\n self.notifyListeners(evt)\n', ' File "/home/tom/spiderfoot/sflib.py", line 1033, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/tom/spiderfoot/modules/sfp_sharedip.py", line 121, in handleEvent\n self.notifyListeners(evt)\n', ' File "/home/tom/spiderfoot/sflib.py", line 1033, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/tom/spiderfoot/modules/sfp_malcheck.py", line 478, in handleEvent\n url = self.lookupItem(cid, typeId, eventData)\n', ' File "/home/tom/spiderfoot/modules/sfp_malcheck.py", line 419, in lookupItem\n return self.resourceList(cid, target, itemType)\n', ' File "/home/tom/spiderfoot/modules/sfp_malcheck.py", line 405, in resourceList\n re.match(rxTgt, line, re.IGNORECASE):\n', ' File "/usr/lib/python2.7/re.py", line 137, in match\n return _compile(pattern, flags).match(string)\n', ' File "/usr/lib/python2.7/re.py", line 242, in _compile\n raise error, v # invalid expression\n', 'error: multiple repeat\n']
Unhandled exception (IOError) encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "/home/rydell/src/git/spiderfoot/sfscan.py", line 92, in startScan\n self.sf.cachePut("internet_tlds", self.config['_internettlds'])\n', ' File "/home/rydell/src/git/spiderfoot/sflib.py", line 164, in cachePut\n fp = file(cacheFile, "w")\n', "IOError: [Errno 2] No such file or directory: u'/home/rydell/src/git/spiderfoot/cache/95c5dab788d19e124540cb1e96e6277f0871c648f4b3f2526fa1f765'\n"]
after manually creating the cache directory inside the spiderfoot directory from git, everything seems to work fine. Just thought I should report it.
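A minimal sketch of the fix (function and argument names are illustrative): create the cache directory before writing, so the first cachePut can't hit a missing-directory IOError:

```python
import os

def cache_file_path(base_dir, key_hash):
    """Return the path for a cache entry, creating the cache
    directory on first use so opening the file for writing
    cannot fail with 'No such file or directory'."""
    cache_dir = os.path.join(base_dir, "cache")
    if not os.path.isdir(cache_dir):
        os.makedirs(cache_dir)
    return os.path.join(cache_dir, key_hash)
```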
I was playing with spiderfoot and suddenly...
2013-08-14 01:45:50 SpiderFoot ERROR
Unhandled exception encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "sfscan.pyc", line 142, in startScan\n', ' File "modules\sfp_similar.pyc", line 221, in start\n', ' File "modules\sfp_similar.pyc", line 148, in scrapeDomaintools\n', ' File "modules\sfp_similar.pyc", line 204, in storeResult\n', ' File "sflib.pyc", line 534, in notifyListeners\n', ' File "modules\sfp_xref.pyc", line 129, in handleEvent\n', ' File "sflib.pyc", line 534, in notifyListeners\n', ' File "modules\sfp_dns.pyc", line 102, in handleEvent\n', ' File "sflib.pyc", line 534, in notifyListeners\n', ' File "modules\sfp_dns.pyc", line 131, in handleEvent\n', ' File "modules\sfp_dns.pyc", line 176, in processHost\n', ' File "sflib.pyc", line 534, in notifyListeners\n', ' File "modules\sfp_spider.pyc", line 240, in handleEvent\n', ' File "modules\sfp_spider.pyc", line 289, in spiderFrom\n', ' File "modules\sfp_spider.pyc", line 103, in processUrl\n', ' File "modules\sfp_spider.pyc", line 187, in contentNotify\n', ' File "sflib.pyc", line 534, in notifyListeners\n', ' File "modules\sfp_dns.pyc", line 102, in handleEvent\n', ' File "sflib.pyc", line 534, in notifyListeners\n', ' File "modules\sfp_dns.pyc", line 131, in handleEvent\n', ' File "modules\sfp_dns.pyc", line 176, in processHost\n', ' File "sflib.pyc", line 534, in notifyListeners\n', ' File "modules\sfp_ripe.pyc", line 82, in handleEvent\n', ' File "sflib.pyc", line 534, in notifyListeners\n', ' File "modules\sfp_dns.pyc", line 102, in handleEvent\n', ' File "sflib.pyc", line 534, in notifyListeners\n', ' File "modules\sfp_spider.pyc", line 240, in handleEvent\n', ' File "modules\sfp_spider.pyc", line 263, in spiderFrom\n', ' File "modules\sfp_spider.pyc", line 103, in processUrl\n', ' File "modules\sfp_spider.pyc", line 187, in contentNotify\n', ' File "sflib.pyc", line 534, in notifyListeners\n', ' File "modules\sfp_dns.pyc", line 102, in handleEvent\n', ' File "sflib.pyc", line 534, 
in notifyListeners\n', ' File "modules\sfp_spider.pyc", line 240, in handleEvent\n', ' File "modules\sfp_spider.pyc", line 289, in spiderFrom\n', ' File "modules\sfp_spider.pyc", line 90, in processUrl\n', ' File "sflib.pyc", line 466, in fetchUrl\n', "UnicodeDecodeError: 'ascii' codec can't decode byte 0xf3 in position 53: ordinal not in range(128)\n"]
Please report this as a bug: (I'm not sure if I should report this here or if there's a better place/method.)
['Traceback (most recent call last):\n', ' File "sfscan.pyc", line 228, in startScan\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_bingsearch.pyc", line 104, in handleEvent\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_spider.pyc", line 259, in handleEvent\n', ' File "modules\sfp_spider.pyc", line 309, in spiderFrom\n', ' File "modules\sfp_spider.pyc", line 100, in processUrl\n', ' File "modules\sfp_spider.pyc", line 198, in contentNotify\n', ' File "sflib.pyc", line 1120, in notifyListeners\n', ' File "modules\sfp_pageinfo.pyc", line 93, in handleEvent\n', ' File "sflib.pyc", line 146, in info\n', ' File "sflib.pyc", line 110, in _dblog\n', ' File "sfdb.pyc", line 276, in scanLogEvent\n', ' File "sflib.pyc", line 126, in fatal\n', "NameError: global name 'exit' is not defined\n"]
The command pip install M2Crypto
is not enough to get M2Crypto installed and SpiderFoot running; this is a regression. There are forks available, but I think this problem should be fixed upstream. This issue is just a heads-up for the SpiderFoot project so that you can update the instructions accordingly.
The error is: ImportError: /home/test/utils/builds/python/2.7.5/lib/python2.7/site-packages/M2Crypto/__m2crypto.so: undefined symbol: SSLv2_method
I have tested that the provided patch works.
Tested with:
I get the following error:
Unhandled exception (BaseException) encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "/home/Download/spiderfoot-2.1.0/sfscan.py", line 91, in startScan\n self.config['_internettlds'] = self.sf.optValueToData(self.config['_internettlds'])\n', ' File "/home/Download/spiderfoot-2.1.0/sflib.py", line 72, in optValueToData\n self.error("Unable to open option URL, " + val + ".")\n', ' File "/home/Download/spiderfoot-2.1.0/sflib.py", line 96, in error\n raise BaseException("Internal Error Encountered: " + error)\n', 'BaseException: Internal Error Encountered: Unable to open option URL, http://mxr.mozilla.org/mozilla-central/source/netwerk/dns/effective_tld_names.dat?raw=1.\n']
On a clean Debian system, these packages are needed to install SpiderFoot following your guide: python2.7-dev libxml2-dev libxslt1-dev swig git python python-pip
Unhandled exception encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "/root/spiderfoot/sfscan.py", line 134, in startScan\n module.start()\n', ' File "/root/spiderfoot/modules/sfp_ripe.py", line 125, in start\n self.notifyListeners(evt)\n', ' File "/root/spiderfoot/sflib.py", line 521, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/root/spiderfoot/modules/sfp_subdomain.py", line 71, in handleEvent\n self.notifyListeners(evt)\n', ' File "/root/spiderfoot/sflib.py", line 521, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/root/spiderfoot/modules/sfp_ripe.py", line 77, in handleEvent\n self.notifyListeners(evt)\n', ' File "/root/spiderfoot/sflib.py", line 521, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/root/spiderfoot/modules/sfp_subdomain.py", line 71, in handleEvent\n self.notifyListeners(evt)\n', ' File "/root/spiderfoot/sflib.py", line 521, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/root/spiderfoot/modules/sfp_dns.py", line 101, in handleEvent\n self.processHost(addr, event)\n', ' File "/root/spiderfoot/modules/sfp_dns.py", line 130, in processHost\n self.notifyListeners(evt)\n', ' File "/root/spiderfoot/sflib.py", line 521, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/root/spiderfoot/modules/sfp_spider.py", line 239, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/root/spiderfoot/modules/sfp_spider.py", line 288, in spiderFrom\n freshLinks = self.processUrl(link)\n', ' File "/root/spiderfoot/modules/sfp_spider.py", line 102, in processUrl\n self.contentNotify(url, fetched, self.urlEvents[url])\n', ' File "/root/spiderfoot/modules/sfp_spider.py", line 186, in contentNotify\n self.notifyListeners(event)\n', ' File "/root/spiderfoot/sflib.py", line 521, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/root/spiderfoot/modules/sfp_subdomain.py", line 71, in handleEvent\n 
self.notifyListeners(evt)\n', ' File "/root/spiderfoot/sflib.py", line 521, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/root/spiderfoot/modules/sfp_ripe.py", line 77, in handleEvent\n self.notifyListeners(evt)\n', ' File "/root/spiderfoot/sflib.py", line 521, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/root/spiderfoot/modules/sfp_subdomain.py", line 71, in handleEvent\n self.notifyListeners(evt)\n', ' File "/root/spiderfoot/sflib.py", line 521, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/root/spiderfoot/modules/sfp_dns.py", line 99, in handleEvent\n self.processHost(host, event)\n', ' File "/root/spiderfoot/modules/sfp_dns.py", line 130, in processHost\n self.notifyListeners(evt)\n', ' File "/root/spiderfoot/sflib.py", line 521, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/root/spiderfoot/modules/sfp_spider.py", line 239, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/root/spiderfoot/modules/sfp_spider.py", line 299, in spiderFrom\n nextLinks = self.cleanLinks(links)\n', ' File "/root/spiderfoot/modules/sfp_spider.py", line 160, in cleanLinks\n if filter(checkExts, self.opts['filterfiles']):\n', ' File "/root/spiderfoot/modules/sfp_spider.py", line 159, in <lambda>\n checkExts = lambda ext: '.' + str.lower(ext) in str.lower(str(link))\n', "UnicodeEncodeError: 'ascii' codec can't encode character u'\xae' in position 72: ordinal not in range(128)\n"]
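The UnicodeEncodeError above comes from forcing a unicode URL through str() inside the extension filter in sfp_spider.py. A unicode-safe rewrite of that check (a sketch, not the project's actual fix) could look like:

```python
def has_filtered_ext(link, filterfiles):
    # Keep the URL as text rather than coercing it with str(), which
    # raises UnicodeEncodeError on non-ASCII characters such as u'\xae'.
    lowered = (u"%s" % link).lower()
    return any(u"." + ext.lower() in lowered for ext in filterfiles)
```

Lower-casing the text object directly sidesteps the implicit ASCII encode that `str.lower(str(link))` triggers on Python 2.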
I'm not very experienced at this, but I've read a great deal of positive reviews of SpiderFoot. I am seeking to identify an email address's footprint; could someone please tell me how I could achieve this? I am not sure of the correct format in which to enter the email address, or which scan criteria to select. Many thanks in advance!
stack trace:
Unhandled exception (MemoryError) encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "/home/gianz/spiderfoot/sfscan.py", line 195, in startScan\n module.start()\n', ' File "/home/gianz/spiderfoot/modules/sfp_bingsearch.py", line 81, in start\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 120, in handleEvent\n self.processHost(match, parentEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 271, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 250, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 274, in spiderFrom\n links = self.processUrl(startingPoint) # fetch first page\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 105, in processUrl\n self.contentNotify(url, fetched, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 189, in contentNotify\n self.notifyListeners(event)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 120, in handleEvent\n self.processHost(match, parentEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 271, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 250, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 300, in spiderFrom\n freshLinks = self.processUrl(link)\n', ' File 
"/home/gianz/spiderfoot/modules/sfp_spider.py", line 128, in processUrl\n self.urlEvents[link] = self.linkNotify(link, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 181, in linkNotify\n self.notifyListeners(event)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 120, in handleEvent\n self.processHost(match, parentEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 271, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 250, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 274, in spiderFrom\n links = self.processUrl(startingPoint) # fetch first page\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 105, in processUrl\n self.contentNotify(url, fetched, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 189, in contentNotify\n self.notifyListeners(event)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 120, in handleEvent\n self.processHost(match, parentEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 271, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 250, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 300, in spiderFrom\n freshLinks = self.processUrl(link)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 105, in processUrl\n 
self.contentNotify(url, fetched, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 189, in contentNotify\n self.notifyListeners(event)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 120, in handleEvent\n self.processHost(match, parentEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 271, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 250, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 300, in spiderFrom\n freshLinks = self.processUrl(link)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 105, in processUrl\n self.contentNotify(url, fetched, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 189, in contentNotify\n self.notifyListeners(event)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 120, in handleEvent\n self.processHost(match, parentEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 271, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 250, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 300, in spiderFrom\n freshLinks = self.processUrl(link)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 105, in processUrl\n self.contentNotify(url, fetched, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", 
line 189, in contentNotify\n self.notifyListeners(event)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 120, in handleEvent\n self.processHost(match, parentEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 271, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 250, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 274, in spiderFrom\n links = self.processUrl(startingPoint) # fetch first page\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 105, in processUrl\n self.contentNotify(url, fetched, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 189, in contentNotify\n self.notifyListeners(event)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 120, in handleEvent\n self.processHost(match, parentEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 271, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 250, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 300, in spiderFrom\n freshLinks = self.processUrl(link)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 128, in processUrl\n self.urlEvents[link] = self.linkNotify(link, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 181, in linkNotify\n self.notifyListeners(event)\n', ' File 
"/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 120, in handleEvent\n self.processHost(match, parentEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 271, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 250, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 300, in spiderFrom\n freshLinks = self.processUrl(link)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 105, in processUrl\n self.contentNotify(url, fetched, self.urlEvents[url])\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 189, in contentNotify\n self.notifyListeners(event)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 120, in handleEvent\n self.processHost(match, parentEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_dns.py", line 271, in processHost\n self.notifyListeners(evt)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 1050, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 250, in handleEvent\n return self.spiderFrom(spiderTarget)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 300, in spiderFrom\n freshLinks = self.processUrl(link)\n', ' File "/home/gianz/spiderfoot/modules/sfp_spider.py", line 117, in processUrl\n links = sf.parseLinks(url, fetched['content'], self.baseDomain)\n', ' File "/home/gianz/spiderfoot/sflib.py", line 578, in parseLinks\n self.error("Error applying regex2 to: " + data)\n', 'MemoryError\n']
Unhandled exception (UnicodeEncodeError) encountered during scan. Please report this as a bug: ['Traceback (most recent call last):\n', ' File "sfscan.pyc", line 152, in startScan\n', ' File "modules\sfp_dns.pyc", line 337, in start\n', ' File "modules\sfp_dns.pyc", line 255, in processHost\n', ' File "sflib.pyc", line 812, in notifyListeners\n', ' File "modules\sfp_bingsearch.pyc", line 97, in handleEvent\n', ' File "sflib.pyc", line 812, in notifyListeners\n', ' File "modules\sfp_malcheck.pyc", line 395, in handleEvent\n', ' File "modules\sfp_malcheck.pyc", line 334, in lookupItem\n', ' File "modules\sfp_malcheck.pyc", line 247, in resourceQuery\n', "UnicodeEncodeError: 'ascii' codec can't encode characters in position 0-1: ordinal not in range(128)\n"]
Python's urllib2.urlopen() automatically follows redirects, but we need more control over this in cases where the redirecting site supplies a cookie that we don't catch.
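One way to get that control (a sketch, assuming the stock urllib2 handler API; on Python 3 the same classes live in urllib.request) is to install a redirect handler that refuses to follow 3xx responses, so the caller can inspect the Set-Cookie and Location headers itself before continuing:

```python
try:
    import urllib2  # Python 2, as SpiderFoot used at the time
except ImportError:
    import urllib.request as urllib2  # same classes on Python 3

class NoRedirectHandler(urllib2.HTTPRedirectHandler):
    """Refuse to follow redirects so the caller sees the raw 3xx."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        # Returning None makes urllib raise HTTPError for the 3xx;
        # the caller can read 'Set-Cookie' and 'Location' from the
        # exception's headers and decide how to continue.
        return None

opener = urllib2.build_opener(NoRedirectHandler())
```

Fetching through `opener.open(url)` then raises HTTPError on a redirect, and `e.headers.get('Set-Cookie')` plus `e.headers.get('Location')` supply the cookie and target for a manual follow-up request.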
When I run sf.py, this error occurs (Kali Linux, Python 2.7.3):
Starting web server at http://127.0.0.1:5001...
Traceback (most recent call last):
  File "./sf.py", line 91, in <module>
    cherrypy.engine.autoreload.unsubscribe()
AttributeError: 'module' object has no attribute 'engine'
Traceback (most recent call last):
  File "sf.py", line 64, in <module>
    mod = __import__('modules.' + modName, globals(), locals(), [modName])
  File "/usr/local/bin/spiderfoot/modules/sfp_sslcert.py", line 16, in <module>
    import M2Crypto
  File "/usr/local/lib/python2.7/dist-packages/M2Crypto/__init__.py", line 22, in <module>
    import __m2crypto
ImportError: /usr/local/lib/python2.7/dist-packages/M2Crypto/__m2crypto.so: undefined symbol: SSLv2_method
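A broken optional dependency like this M2Crypto build aborts start-up entirely because the module-loading loop lets the ImportError propagate. A defensive variant (a sketch; the `package` parameter is only for illustration) would skip the offending module instead:

```python
import importlib

def load_modules(mod_names, package="modules"):
    """Import each module defensively: a broken dependency skips that
    one module instead of aborting the whole program."""
    loaded = {}
    for name in mod_names:
        try:
            loaded[name] = importlib.import_module(package + "." + name)
        except ImportError as e:
            print("Skipping module %s: %s" % (name, e))
    return loaded
```

A skipped module can then be reported in the UI rather than taking the whole server down.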
SpiderFoot has several problems with XHTML sites. You can use http://www.nerv.fi as an example if you want. This might also be a JavaScript-related issue.
Some examples:
http://www.nerv.fi/layout/news/{$HOST}info/help in Web Technology
http://www.nerv.fi/news/2013-04-30/2013-01-29/cve-2013-0238-nervin-irc-verkon-palvelimet-paivitetaan in HTTP Headers. I don't even know where SpiderFoot got that URL, since the service doesn't have URL syntax like that. The same thing appears under Linked URL - Internal, so it might be how the parsing works.
It would be great to have the possibility of supplying a file for sub-domain enumeration (instead of just a comma-separated sub-domains field).
Local or remote file, doesn't matter :)
Just for the test, wanted to add these sub-domains: http://ethicalhack3r.co.uk/files/fuzzing/top1mil-subdomains/subdomains-top1mil-20000.txt ;)
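A helper along these lines (hypothetical name and behaviour, not part of SpiderFoot; the URL branch is a sketch) could turn such a file into the comma-separated value the sub-domains field currently expects:

```python
def load_subdomains(path_or_url):
    """Read one sub-domain per line from a local file or an http(s)
    URL and return them as a comma-separated string."""
    if path_or_url.startswith(("http://", "https://")):
        from urllib.request import urlopen  # remote word list
        data = urlopen(path_or_url).read().decode("utf-8", "replace")
    else:
        with open(path_or_url) as f:  # local word list
            data = f.read()
    # Drop blank lines and surrounding whitespace.
    subs = [line.strip() for line in data.splitlines() if line.strip()]
    return ",".join(subs)
```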
The default database shipped is created on a 64-bit system and seems to cause issues on 32-bit systems.
Ship the next release without a pre-built database, and create the database silently on first start.
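That behaviour could be sketched as follows (assuming the shipped spiderfoot.schema file contains plain SQL; the function name and paths are illustrative):

```python
import os
import sqlite3

def ensure_db(db_path, schema_path):
    """Create the SQLite database from the shipped schema if the file
    does not exist yet, instead of distributing a pre-built database."""
    if os.path.exists(db_path):
        return False  # already created on a previous start
    with open(schema_path) as f:
        schema = f.read()
    conn = sqlite3.connect(db_path)
    try:
        conn.executescript(schema)  # run all CREATE statements
        conn.commit()
    finally:
        conn.close()
    return True
```

Creating the file locally also avoids shipping a binary that was generated on one architecture and opened on another.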
File "/root/spiderfoot/modules/sfp_dns.py", line 130, in processHost\n self.notifyListeners(evt)\n', ' File "/root/spiderfoot/sflib.py", line 521, in notifyListeners\n listener.handleEvent(sfEvent)\n', ' File "/root/spiderfoot/modules/sfp_geoip.py", line 64, in handleEvent\n hostip = json.loads(res['content'])\n', ' File "/usr/lib/python2.6/json/__init__.py", line 307, in loads\n return _default_decoder.decode(s)\n', ' File "/usr/lib/python2.6/json/decoder.py", line 319, in decode\n obj, end = self.raw_decode(s, idx=_w(s, 0).end())\n', 'TypeError: expected string or buffer\n']
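The TypeError above is json.loads() being handed None when the GeoIP fetch fails. A guard like this (a sketch, with an illustrative helper name) would avoid the crash:

```python
import json

def safe_json(content):
    """Return the decoded JSON, or None if the fetch returned no body.
    json.loads raises TypeError on None and ValueError on bad JSON."""
    if content is None:
        return None
    try:
        return json.loads(content)
    except ValueError:  # malformed or empty body
        return None
```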