uafgeotools / cube_conversion

Extract waveforms and metadata from DATA-CUBE digitizers and write to miniSEED files

License: MIT License
The code gives an error because the parameter `REVERSE_POLARITY_LIST` is not defined.
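A minimal sketch of a fix, assuming the variable is meant to list digitizer IDs whose channel polarity should be flipped — the name's semantics are inferred from its spelling, not confirmed by the source:

```python
# Hypothetical fix: define the missing constant near the other module-level
# configuration in cube_convert.py. An empty list means no digitizer's
# polarity is reversed by default.
REVERSE_POLARITY_LIST = []  # e.g. ['AEX'] to flip the polarity of digitizer AEX

def apply_polarity(digitizer_id, samples):
    """Negate the waveform samples if the digitizer is flagged for reversal."""
    if digitizer_id in REVERSE_POLARITY_LIST:
        return [-s for s in samples]
    return samples
```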
Currently, because of the one-to-one relationship set up by `digitizer_sensor_pairs.json`, a setup with more than one sensor attached to a single digitizer uses the same sensor sensitivity for all three sensors. For example, imagine a three-element array configuration where digitizer AEX is attached to SN49, SN50, and SN51. With the current code, the sensitivity for SN49 would be used for all three sensors.

A possible fix for this is to allow lists (which are valid JSON) to be used in the pairs file. E.g., the top line in `digitizer_sensor_pairs.json` could be

"AEX": ["SN49", "SN50", "SN51"]

and we could put some logic in the script to select and apply the correct sensitivity to each element.
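A sketch of what that selection logic might look like. It assumes the `.pri0`/`.pri1`/`.pri2` file index corresponds to the order of sensors in the list — an open question, since that correspondence is not guaranteed — and the helper name is illustrative:

```python
import json

def sensor_for_channel(pairs, digitizer, pri_index):
    """Look up the sensor for a given digitizer channel, accepting either
    the current one-to-one form ("AEX": "SN49") or the proposed list form
    ("AEX": ["SN49", "SN50", "SN51"]).

    pri_index is 0, 1, or 2, taken from the .pri0/.pri1/.pri2 extension.
    ASSUMPTION: list order matches the breakout-box channel order, which
    is not guaranteed by the discussion above."""
    entry = pairs[digitizer]
    if isinstance(entry, list):
        return entry[pri_index]
    return entry  # single sensor: same ID for every channel

pairs = json.loads('{"AEX": ["SN49", "SN50", "SN51"], "AEY": "SN52"}')
```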
————————
However, is this even possible? We would need to know the correspondence between the `.pri0`, `.pri1`, and `.pri2` files and the sensor. Is it possible to know this, e.g., from noting how the sensors are attached to the breakout box? If not, then there's no use in implementing the above, and users should instead use something like

"AEX": "any_made_up_id"

which will result in a `KeyError` when looking up the sensitivity, and use `DEFAULT_SENSITIVITY` for all elements — this seems like the most logical behavior. Of course, this would need to be documented!
And speaking of documentation, non-UAF users who don't have their own sensor naming convention like ours will want to do the above anyway. Pinging @leighton-watson. 😃
————————
Pinging @davidfee5, @jegestrich, and @amiezzi as UAF DATA-CUBE3 users.
We've noticed at least a few times that when using `cube2mseed` we will have notable data gaps related to poor time tags in the raw CUBE files. See the attached image.

The `cube2mseed` option `fringe-samples` can correct this if you choose the `NOMINAL` or `CONSTANT` value (see the documentation for more details). Attached here is the same data converted with `fringe-samples=CONSTANT`.

I think we should update the code to allow the user to specify the `fringe-samples` value. First, we should probably verify that the timing is still OK and decide what value we actually want to use.
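A sketch of how a user-facing option might be plumbed through to the `cube2mseed` call; the helper and the exact flag spelling are assumptions based on the option name above and should be checked against the GIPPtools documentation:

```python
def build_cube2mseed_args(input_file, output_dir, fringe_samples=None):
    """Build the cube2mseed argument list, optionally forwarding a
    user-specified fringe-samples mode. Flag spellings here are
    illustrative, not verified against the GIPPtools manual."""
    args = ['cube2mseed', '--output-dir=' + output_dir]
    if fringe_samples is not None:
        if fringe_samples not in ('NOMINAL', 'CONSTANT'):
            raise ValueError('unsupported fringe-samples value: ' + fringe_samples)
        args.append('--fringe-samples=' + fringe_samples)
    args.append(input_file)
    return args
```

The list would then be handed to `subprocess.run()` in place of the currently hard-coded arguments.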
These lines of code (cube_conversion/cube_convert.py, lines 78 to 81 in 23b83a7) fail when an ID such as 5F is provided. Evidently the regex check here needs to be tweaked to allow for the numbers 0–9 as well as letters!

This requires a few mostly trivial changes, including but not limited to:

- `BITWEIGHT` ... and sticking to it
- The `*.json` supplemental files, which right now are UAF-specific!

The currently implemented method takes the median of the GPS positions. However, if there are many outliers and/or an asymmetric spread, this does not necessarily capture the most common GPS position. That could be obtained by taking the mode of the GPS positions — but taking the mode of the three directions separately does not capture the most common position.
One idea is to look at clustering of the positions in a point cloud. Since the latitude and longitude GPS positions are binned in increments of 10⁻⁵ ° and elevation is binned in increments of 1 m by default by the GPS sensor (or digitizer?), we can calculate a 3D histogram and take the position of maximum counts as the GPS location.

Another idea is to calculate a 2D histogram of latitude and longitude and average (or take the median or mode of) the elevation, since the elevation has a higher uncertainty anyway and a higher chance of being bimodal or widely spread.
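A minimal sketch of the 3D-histogram idea using NumPy; the default bin widths mirror the quantization steps mentioned above, and the sample data in the usage are made up:

```python
import numpy as np

def modal_position(lat, lon, elev, lat_step=1e-5, lon_step=1e-5, elev_step=1.0):
    """Return the (lat, lon, elev) bin containing the most GPS fixes.
    Quantizes each coordinate onto the receiver's grid and picks the 3D
    cell with the maximum count, i.e. the joint mode rather than three
    independent per-coordinate modes."""
    keys = np.column_stack((np.round(np.asarray(lat) / lat_step),
                            np.round(np.asarray(lon) / lon_step),
                            np.round(np.asarray(elev) / elev_step)))
    cells, counts = np.unique(keys, axis=0, return_counts=True)
    best = cells[np.argmax(counts)]
    return best[0] * lat_step, best[1] * lon_step, best[2] * elev_step
```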
What do you all think?
John Lyons reports the following error message:
Coordinates exported to HV.FIS8.02.DDF.json
Traceback (most recent call last):
  File "/home/jlyons/Python/cubeconversion/cube_convert.py", line 445, in <module>
    png_filename = json_filename.rstrip('.json') + '.png'
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/figure.py", line 2180, in savefig
    self.canvas.print_figure(fname, **kwargs)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/backends/backend_qt5agg.py", line 88, in print_figure
    super().print_figure(*args, **kwargs)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/backend_bases.py", line 2056, in print_figure
    **kwargs)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/backends/backend_agg.py", line 527, in print_png
    FigureCanvasAgg.draw(self)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/backends/backend_agg.py", line 388, in draw
    self.figure.draw(self.renderer)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/artist.py", line 38, in draw_wrapper
    return draw(artist, renderer, *args, **kwargs)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/figure.py", line 1709, in draw
    renderer, self, artists, self.suppressComposite)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/image.py", line 135, in _draw_list_compositing_images
    a.draw(renderer)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/artist.py", line 38, in draw_wrapper
    return draw(artist, renderer, *args, **kwargs)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/axes/_base.py", line 2645, in draw
    mimage._draw_list_compositing_images(renderer, self, artists)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/image.py", line 135, in _draw_list_compositing_images
    a.draw(renderer)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/artist.py", line 38, in draw_wrapper
    return draw(artist, renderer, *args, **kwargs)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/axis.py", line 1204, in draw
    ticks_to_draw = self._update_ticks()
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/axis.py", line 1080, in _update_ticks
    major_locs = self.get_majorticklocs()
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/axis.py", line 1325, in get_majorticklocs
    return self.major.locator()
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/ticker.py", line 1799, in __call__
    return self.tick_values(vmin, vmax)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/ticker.py", line 1808, in tick_values
    return self.raise_if_exceeds(locs)
  File "/home/jlyons/anaconda3/lib/python3.7/site-packages/matplotlib/ticker.py", line 1520, in raise_if_exceeds
    len(locs), locs[0], locs[-1]))
RuntimeError: Locator attempting to generate 8471 ticks from -39970.0 to 2380.0: exceeds Locator.MAXTICKS
The 2-D histogram plotting code places major ticks every 5 m, with minor ticks every 1 m. Usually this results in a reasonable number of ticks. However, in this case, the range of one of the axes was "-39970.0 to 2380.0" m, which is > 42 km! This is probably related to a really poor GPS lock at one point.
Enforcing maximum x- and y-axis limits will likely be a sufficient solution for this error.
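One way to enforce such limits, sketched here with a made-up window size: clip the plotted range to a fixed half-width around the median position, so a single bad GPS fix cannot blow up the tick count. The helper name and 50 m default are illustrative choices, not from the repository:

```python
import numpy as np

def clamped_limits(values, max_halfwidth=50.0):
    """Return (lo, hi) axis limits centered on the median and at most
    2 * max_halfwidth wide, so an outlier from a poor GPS lock (e.g. a
    -39970 m fix) cannot produce thousands of 5 m major ticks."""
    center = float(np.median(values))
    lo = max(float(np.min(values)), center - max_halfwidth)
    hi = min(float(np.max(values)), center + max_halfwidth)
    return lo, hi
```

The resulting pair would be passed to `ax.set_xlim()` / `ax.set_ylim()` before saving the figure.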
Right now the code automatically "removes the response" by simply dividing by a calibration value. However, we might not want to automatically do this. Potentially a future thing to address...
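For illustration, the scalar "response removal" described above amounts to a single division, which could be made opt-in; the function, parameter names, and values below are hypothetical:

```python
import numpy as np

def counts_to_physical(counts, bitweight, sensitivity, remove_response=True):
    """Convert raw digitizer counts to physical units.
    bitweight: volts per count; sensitivity: volts per physical unit
    (e.g. V/Pa for infrasound). With remove_response=False the data are
    left in volts, deferring full response removal to the user."""
    volts = np.asarray(counts, dtype=float) * bitweight
    if remove_response:
        return volts / sensitivity
    return volts
```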
I want to convert data from a DATA-CUBE. I added the digitizer code (and station where needed) to digitizer_sensor_pairs.json, digitizer_offsets.json, and sensor_sensitivities.json.

My input command for a specific day is:
python /Users/joaofontiela/geophysics/gipptools-2021.168/bin/cube_conversion/cube_convert.py -v AB01-ADY/06180000.ADY AB01-ADY/teste_arraybeja AB AB01 01 AUTO --grab-gps
and the output is:
Network code: AB
Station code: AB01
Location code: 01
Channel code: Automatic
Traceback (most recent call last):
  File "/Users/joaofontiela/geophysics/gipptools-2021.168/bin/cube_conversion/cube_convert.py", line 128, in
    raise FileNotFoundError('No raw files found.')
FileNotFoundError: No raw files found.
Does anyone have an idea on how to fix this error?
There is an error in the channel naming scheme for high sample rates. According to the SEED manual, infrasound data with sample rates ≥ 250 and < 1000 Hz should be named either "DDF" or "CDF": "D" when the sensor corner period is < 10 s and "C" when the corner period is ≥ 10 s. We had "DDF" here even though our sensors have a corner period ≥ 10 s.
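The band-code rule above can be captured in a small helper; only the 250–<1000 Hz row discussed here is implemented, and other rate ranges deliberately raise:

```python
def band_code(sample_rate, corner_period):
    """Return the SEED band code for high-sample-rate data.
    Per the SEED manual, for 250 <= sample rate < 1000 Hz the code is
    'D' when the sensor corner period is < 10 s and 'C' when it is
    >= 10 s. Other rate ranges are outside the scope of this sketch."""
    if 250 <= sample_rate < 1000:
        return 'D' if corner_period < 10 else 'C'
    raise ValueError('sample rate outside the 250-1000 Hz range covered here')
```

So a 400 Hz infrasound channel from a sensor with a corner period of 10 s or longer would be named "CDF" rather than "DDF".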