
Laika

Introduction

Laika is an open-source GNSS processing library. Laika is similar to projects like RTKlib and GPSTK, but in Python and with a strong focus on readability, usability and easy integration with other optimizers. Laika can process raw GNSS observations with data gathered online from various analysis groups to produce data ready for position/velocity estimation. Laika is designed to produce accurate results whilst still being readable and easy to use. Laika is the perfect tool to develop accurate GNSS-only or GNSS-fusion localisation algorithms.


The GNSS problem

GNSS satellites orbit the earth broadcasting signals that allow the receiver to determine the distance to each satellite. These satellites have known orbits and so their positions are known. This makes determining the receiver's position a basic 3-dimensional trilateration problem. In practice observed distances to each satellite will be measured with some offset that is caused by the receiver's clock error. This offset also needs to be determined, making it a 4-dimensional trilateration problem.

Since this problem is generally overdetermined (more than 4 satellites are available to solve the 4D problem), there are a variety of methods to compute a position estimate from the measurements. Laika provides a basic weighted least squares solver for experimental purposes. This is far from optimal due to the dynamic nature of the system, which makes a Bayesian estimator such as a Kalman filter the preferred estimator.
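
For reference, a minimal sketch of such a weighted least squares solve (not laika's own implementation, and assuming the pseudoranges have already been corrected as described below):

import numpy as np

def wls_position(sat_positions, pseudoranges, weights=None, iterations=10):
    # Iterative (Gauss-Newton) weighted least squares for the 4D trilateration
    # problem: 3 receiver ECEF coordinates plus a receiver clock bias (in meters).
    sat_positions = np.asarray(sat_positions, dtype=float)   # (N, 3) satellite ECEF positions (m)
    pseudoranges = np.asarray(pseudoranges, dtype=float)     # (N,) corrected pseudoranges (m)
    w = np.ones(len(pseudoranges)) if weights is None else np.asarray(weights, dtype=float)
    x = np.zeros(4)                                          # [x, y, z, clock bias]
    for _ in range(iterations):
        ranges = np.linalg.norm(sat_positions - x[:3], axis=1)
        residuals = pseudoranges - (ranges + x[3])
        # Jacobian: line-of-sight unit vectors plus a column of ones for the clock bias
        H = np.hstack([-(sat_positions - x[:3]) / ranges[:, None], np.ones((len(ranges), 1))])
        W = np.diag(w)
        dx = np.linalg.solve(H.T @ W @ H, H.T @ W @ residuals)
        x += dx
    return x[:3], x[3]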

However, the above description is over-simplified. Getting accurate distance estimates to the satellites and the satellites' positions from the receiver observations is not trivial. This is what we call processing of the GNSS observables, and it is this procedure that laika is designed to make easy.

Astrodog

Astrodog is the main component of laika. It is a Python object, and like the Soviet space dogs to which it owes its name, an astrodog will do everything to make the life of its owner easier: fetch and process all the necessary data to transform raw GNSS observables into usable distance measurements and satellite positions ready for position estimation.

Satellite info

Astrodog has a get_sat_info function that will provide an accurate position, velocity and clock error for any satellite at any time in history.
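
For example (mirroring the walkthrough notebook that is also quoted in the issues below; exact usage may differ between laika versions):

from datetime import datetime
from laika import AstroDog
from laika.gps_time import GPSTime

dog = AstroDog()
time = GPSTime.from_datetime(datetime(2018, 1, 7))

# RINEX3 PRNs identify satellites, e.g. 'G07' is GPS satellite 7
sat_pos, sat_vel, sat_clock_err, sat_clock_drift, ephemeris = dog.get_sat_info('G07', time)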

Pseudorange corrections

Astrodog has a get_delay function that will provide a pseudorange delay correction for any satellite at any time in history for the requested receiver position. This delay correction includes a correction for the tropospheric signal delay, ionospheric signal delay and differential code biases (DCBs) of the transmitting satellite.

This delay can either be estimated with mathematical models or with DGPS station observations; the latter is more accurate, but slower and only supported in the continental United States.
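
For example (following the same walkthrough; passing dgps=True to AstroDog, as seen in the issues below, is assumed to switch to DGPS-based corrections):

from datetime import datetime
from laika import AstroDog
from laika.gps_time import GPSTime
from laika.lib.coordinates import geodetic2ecef

dog = AstroDog()  # AstroDog(dgps=True) would use DGPS station observations instead of models
time = GPSTime.from_datetime(datetime(2018, 1, 7))

# The receiver position is given in ECEF coordinates (meters)
receiver_pos = geodetic2ecef([37.77, -122.42, 0])  # roughly San Francisco
delay = dog.get_delay('G07', time, receiver_pos)   # tropo + iono + DCB correction, in meters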

Architecture

GNSS processing requires getting data from the internet from various analysis groups such as NASA's CDDIS. AstroDog downloads files from these groups' servers when it needs them. Downloading and parsing all this data can be slow, so AstroDog caches all downloaded files locally to avoid re-downloading.

These files are then parsed by AstroDog and kept in memory. Every one of these parsed objects (DCBs, ionospheric models, satellite orbit polynomials, etc.) has a valid location area and/or a valid time window. Within those windows these objects can provide information relevant to GNSS processing.
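
A small sketch of how this looks from the user's side (the cache_dir argument and the default /tmp/gnss location are assumptions based on the issues below):

from datetime import datetime
from laika import AstroDog
from laika.gps_time import GPSTime

# Restrict the constellations and point the download cache somewhere persistent
dog = AstroDog(valid_const=['GPS', 'GLONASS'], cache_dir='/tmp/gnss/')

# The first query downloads and parses the needed orbit files; later queries that
# fall inside the same valid time window reuse the parsed objects kept in memory.
dog.get_sat_info('G07', GPSTime.from_datetime(datetime(2018, 1, 7, 0, 0)))
dog.get_sat_info('G07', GPSTime.from_datetime(datetime(2018, 1, 7, 0, 1)))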

Design principles of laika

  • Should work without configuration or setup
  • Default configuration should not compromise accuracy for anything

Laika's accuracy

To confirm the quality of laika's GNSS processing, we ran laika's processing and a simple Kalman filter (the procedure is described in the examples) on 2000 minutes of driving of a regular commute in San Francisco. The data comes from a u-blox M8 chip. The fixes computed with laika's processed data are compared to the live navigation fixes given by the u-blox chip. They are compared by looking at the standard deviation of all measured altitudes within every 5 m × 5 m square in the dataset. There is no way to compare horizontal accuracy without ground truth, but there is no reason to believe that vertical and horizontal accuracy are not equally correlated for laika-computed positions and u-blox's live positions. Data with the antenna on the roof and with the antenna inside the car are compared separately, since the results are significantly different.

(altitude distribution plot)
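
A rough sketch of that comparison metric (hypothetical east/north/altitude arrays in a local frame, not the actual evaluation script):

import numpy as np

def altitude_std_per_cell(east, north, alt, cell_size=5.0):
    # Group fixes into cell_size x cell_size meter cells and return the
    # standard deviation of altitude within every cell with more than one fix.
    cells = {}
    for e, n, a in zip(east, north, alt):
        key = (int(e // cell_size), int(n // cell_size))
        cells.setdefault(key, []).append(a)
    return np.array([np.std(v) for v in cells.values() if len(v) > 1])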

Examples

Installation

Laika runs in Python 3.8.2, and has only been tested on Ubuntu 20.04. Running in a virtual environment is recommended.

laika

If you do not yet have numpy and scipy installed, install them with pip. Having accelerated numpy will make laika much faster.

pip install numpy scipy

Then laika can be installed with

python setup.py install

The tests should now pass.

Earthdata account

It is no longer possible to download GNSS data from NASA servers without an account. You can make an account here. Then create a .netrc file in the laika folder with content:

machine urs.earthdata.nasa.gov login your_username password your_password

Notebook examples

The notebook examples require some visualisation packages. To install them, you first need

sudo apt-get install libfreetype6-dev

and then with pip

pip install -r requirements_examples.txt --user

Then you should be able to run the notebooks. The notebooks can be opened by running jupyter notebook and then navigating to the desired .ipynb file.

Useful GNSS references


laika's Issues

read_raw_ublox C1C observable std calculation

Could you please explain why in the read_raw_ublox function of raw_gnss.py you calculate the C1C standard deviation according to the transcript below?

observables_std['C1C'] = np.sqrt(i.pseudorangeStdev)*10

Why not just use i.pseudorangeStdev?

Best regards,
Giovanni.

pycurl error for SSL_CIPHER_LIST = 'DEFAULT@SECLEVEL=1' in https_download_file

Please excuse the lack of expertise. I have been trying to get Laika working and ran into an error while going through the walkthrough. Details of my system are below. Digging down into what is going on, the issue is that the function https_download_file is failing. Specifically, the setting

crl.setopt(crl.SSL_CIPHER_LIST, 'DEFAULT@SECLEVEL=1')

causes crl.perform() to fail with the error

pycurl.error: (59, 'failed setting cipher list: DEFAULT@SECLEVEL=1')

This results in the error RuntimeError: No orbit data found on either servers from get_orbit_data. The fix for me was to change the setting to

crl.setopt(crl.SSL_CIPHER_LIST, 'DEFAULT')

I guess this is due to the version of openssl that I have installed, and I have no idea what implications the above change has, but nonetheless thought it was a good idea to report the issue and what got it working.


Hardware: MacBook Pro with M1 chip
OS: MacOS 11.6
Python: 3.9.7 (installed with pyenv)
OpenSSL: 2.8.3 (installed with homebrew)
PycURL: 7.44.1 (compiled against the above library)

Happy to dig further into my setup to find the issue if it is helpful. Otherwise hopefully this helps if anyone else has the same issue.
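
For anyone hitting the same error, a sketch of a fallback (an illustration only, not laika's actual https_download_file):

import pycurl
from io import BytesIO

def fetch(url):
    buf = BytesIO()
    crl = pycurl.Curl()
    crl.setopt(crl.URL, url)
    crl.setopt(crl.WRITEDATA, buf)
    try:
        # Some TLS backends reject the SECLEVEL syntax, as described above
        crl.setopt(crl.SSL_CIPHER_LIST, 'DEFAULT@SECLEVEL=1')
        crl.perform()
    except pycurl.error:
        crl.setopt(crl.SSL_CIPHER_LIST, 'DEFAULT')
        crl.perform()
    finally:
        crl.close()
    return buf.getvalue()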

Questions regarding laika's input measurement specs for implementing DGPS

Hi, I'm trying to modify laika to create an ad-hoc DGPS system from Android phones via their Raw GNSS API. I have been experimenting with GNSS Logger app/library from Google [1]. I was trying to use this output as input to your library to measure GPS first, and then extend it to a functioning DGPS solution.

  1. I'm having trouble providing inputs to laika. The documentation mentions there are 8 values per row (in the comma2k19 dataset repository here), but the numpy arrays (raw_gnss_ublox) contain 10 values (and laika expects 10 values as input). The following is my current understanding (I calculated gpsWeek, gpsSecondsOfWeek and pseudorange using [2], from what I got out of the comma2k19 dataset docs).
return np.array([
        svid, # PRN
        gpsWeek, # week of gps_time of reception (gps_week),
        gpsSecondsOfWeek, # time of week of gps_time of reception (s),
        np.nan, # GLONASS channel, as GPS, it's nan
        pseudorange, # calculated via the android params
        -1, # pseudorange_std, if it's uncertainty, is there a reference to see how it's calculated?
        pseudorangeRateMetersPerSecond,
        pseudorange_rate_std, # pseudorangeRateUncertaintyMetersPerSecond from android API?
        -1, #  S1C --> Signal Strength ??
        -1, # L1C --> Carrier Cycles ??
    ])

(a) What is expected as input for the pseudorange_std and pseudorange_rate_std parameters? Is it an uncertainty (I assumed so, since std could mean standard deviation)? If so, is there a reference where I can see how it is calculated so I can implement it?
(b) Are the last two values necessary? I couldn't find any indication of what they are, apart from digging in the library and figuring out the keys, which I then looked up in the RINEX documentation.

  2. I'm trying to understand dgps.py. I'm trying to replace the ground station data (which I understand is downloaded from the internet) with GNSS data from a different source. I want to know what exactly is being sent to AstroDog so that I can compute it from the ad-hoc ground station.

I understand the principles of DGPS, however, the design is a bit unclear (from reading parse_dgps). From what I understand, sending back a good DGPSDelay object is all I need to do. What would that entail? Is there any other class I should be looking at?

[1] The outputs of the GNSSLogger library are as follows,

ElapsedRealtimeMillis,TimeNanos,LeapSecond,TimeUncertaintyNanos,FullBiasNanos,BiasNanos,BiasUncertaintyNanos,DriftNanosPerSecond,DriftUncertaintyNanosPerSecond,HardwareClockDiscontinuityCount,Svid,TimeOffsetNanos,State,ReceivedSvTimeNanos,ReceivedSvTimeUncertaintyNanos,Cn0DbHz,PseudorangeRateMetersPerSecond,PseudorangeRateUncertaintyMetersPerSecond,AccumulatedDeltaRangeState,AccumulatedDeltaRangeMeters,AccumulatedDeltaRangeUncertaintyMeters,CarrierFrequencyHz,CarrierCycles,CarrierPhase,CarrierPhaseUncertainty,MultipathIndicator,SnrInDb,ConstellationType,AgcDb,CarrierFrequencyHz

[2] This is my pseudorange code. raw is the GNSS output from the Android API.

WEEKSEC = 604800
SPEED_OF_LIGHT = 2.99792458e8
def compute_gps_times_and_pseudorange_vals(raw):
    # from https://www.gsa.europa.eu/system/files/reports/gnss_raw_measurement_web_0.pdf
    tRx_GNSS = float(raw['TimeNanos']) - (float(raw['FullBiasNanos']) + float(raw['BiasNanos'])) 
    gps_week = tRx_GNSS // (WEEKSEC*1e9)
    tRx = np.mod(tRx_GNSS,WEEKSEC*1e9)
    gps_seconds_of_week = np.round(tRx * 1e-9)
    tTx = float(raw['ReceivedSvTimeNanos']) + float(raw['TimeOffsetNanos'])
    pr =  (tRx - tTx) * SPEED_OF_LIGHT * 1e-9
    return pr, gps_week, gps_seconds_of_week

DGPS documentation

Hi, I've heard a lot of recommendations for your library, but I'm having a bit of a hard time operating it. I'm trying to do a coordinate correction experiment with the DGPS method. Can you please give me some examples of a simple use of the DGPS functionality?
Thanks.

get_all_sat_info speed

Is get_all_sat_info() capable of returning sat states for multiple epochs at once? I am looking to develop a GNSS simulation and couldn't figure out a way to do so (other than a list comprehension).

I may have overlooked something in the examples.
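
For context, the list-comprehension workaround mentioned above presumably looks something like this (a sketch; get_all_sat_info is assumed here to take a single GPSTime):

from datetime import datetime, timedelta
from laika import AstroDog
from laika.gps_time import GPSTime

dog = AstroDog()
start = datetime(2020, 5, 1)
epochs = [GPSTime.from_datetime(start + timedelta(seconds=30 * i)) for i in range(10)]

# One separate query per epoch; parsed orbit files are cached in memory,
# but there is no single vectorised multi-epoch call here.
sat_states = [dog.get_all_sat_info(t) for t in epochs]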

Missing dependency in setup.py

This is a pretty simple issue at first glance.

You have a missing dependency in setup.py for the library atomicwrites. The actual library calls are in downloader.py, for example:

with atomic_write(filepath_attempt, mode='w', overwrite=True) as wf:

This means that any project that uses laika as a dependency will itself fail unless it explicitly knows to install atomicwrites. Meanwhile, on the project homepage, the author of that package recommends its deprecation:

I thought it'd be a good time to deprecate this package. Python 3 has os.replace and os.rename which probably do well enough of a job for most usecases.

So, what's the need for that atomic_write function? Could it not be simply replaced by a basic wrapper using native library calls?
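
A minimal sketch of such a wrapper using only the standard library (not a drop-in replacement for everything atomicwrites does):

import os
import tempfile
from contextlib import contextmanager

@contextmanager
def atomic_write(path, mode='w', overwrite=True):
    # Write to a temporary file in the same directory, then os.replace() it into
    # place, so a partially written file is never visible at the final path.
    if not overwrite and os.path.exists(path):
        raise FileExistsError(path)
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, mode) as f:
            yield f
        os.replace(tmp_path, path)  # atomic rename on POSIX and Windows
    except BaseException:
        os.unlink(tmp_path)
        raise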

Ntrip data for correction

Will Laika support RTCM correction data? It would be really nice to have, and perhaps not too difficult to add.

Quietly ignoring Galileo, BeiDou when pull_orbit is False

If pull_orbit is set to False then Laika fetches sat info only for GPS and GLONASS, but the user doesn't get any indication that it fails for other constellations. Maybe Laika should throw an exception (NotImplemented) when the user creates an AstroDog instance with pull_orbit=False and unsupported constellations?
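
For illustration, such a guard could look roughly like this (purely a sketch; the set of constellations that work without orbit data is an assumption):

# Hypothetical check that could run in AstroDog.__init__
NAV_ONLY_CONSTELLATIONS = {'GPS', 'GLONASS'}  # assumption: only these work with pull_orbit=False

def check_constellations(valid_const, pull_orbit):
    unsupported = set(valid_const) - NAV_ONLY_CONSTELLATIONS
    if not pull_orbit and unsupported:
        raise NotImplementedError(
            f"Constellations {sorted(unsupported)} are not supported with pull_orbit=False")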

consult about the performance of NEO M8T with laika

Thanks for the great work!

In this issue, #4, anuragxel recommends the M8T.
Has anyone tested the NEO-M8T with the Laika lib?
How is the performance?
Besides the M8T, is there any other good and cheap device from u-blox (that would be compatible with Laika)?
Thanks

`url_bases` vs `url_base` mixup in `downloader.py`

The function download_and_cache_file_return_first_success accepts an array of url_bases and hands them off to download_and_cache_file, which expects just a single url_base. This causes the url concatenation to fail in download_file.

To fix this, I'd suggest something like this in download_and_cache_file_return_first_success:

if not os.path.isfile(filepath) or overwrite:
  for url_base in url_bases: # <-- critical that we iterate over these somehow
    try:
      data_zipped = download_file(url_base, folder_path, filename_zipped)
      break
    except (DownloadFailed, pycurl.error, socket.timeout):
      data_zipped = None

  if data_zipped is None:
    unix_time = time.time()
    os.makedirs(folder_path_abs, exist_ok=True)
    with atomic_write(filepath_attempt, mode='w', overwrite=True) as wf:
      wf.write(str(unix_time))
    raise DownloadFailed(f"Could not download {folder_path + filename_zipped} from {url_bases} ")

I can put up a PR if you want; just not sure what the contribution guidelines are. Thanks!

Typo on clock_correction computation

Hi,

I think you have made a typo here.

This is located in class GPSEphemeris(Ephemeris):, def get_sat_info(self, time):
You do :

...
tdiff = time - eph['toe']  # Time of clock
clock_err = eph['af0'] + tdiff * (eph['af1'] + tdiff * eph['af2'])
clock_rate_err = eph['af1'] + 2 * tdiff * eph['af2']
# Orbit propagation
tdiff = time - eph['toe']  # Time of ephemeris (might be different from time of clock)

Shouldn't it be :

...
tdiff = time - eph['toc']  # Time of clock
...

instead ?

It would make more sense (to use Toc instead of Toe to compute the clock_error) and would fit with the comments you have made.

Thanks,
Emilien

Running with live data

Hi!

I think my question may be a little bit stupid, but still, I am wondering if I can use this library for live data?

I have a ublox F9P and a kind of Raspberry Pi, and I want to get better results with GPS data; however, all the examples deal with binary files. If I want to use it with the device, should I just read from serial instead of from a file? Is it that straightforward? Or should I consider another approach?

Thanks in advance.

Satellite Outages reported as online

I'm using Laika to obtain satellite positions in the sky at different times, but I was comparing data from Laika with real measurements, and for one of the tests G28 is offline (the sensor doesn't show it, and the almanac I use with another tool similar to Laika also doesn't report it), but Laika does report it. Is there a way to get Laika to report whether the satellite is online or not? I dug through the code a bit and I can look at the healthy attribute of the PolyEphemeris, but that one still seems to be set to True regardless. Not sure if it's a bug or not, just hoping to get some help so I can use Laika to estimate satellite positions.

Thanks in advance!

Walkthrough jupyter notebook doesn't work

First time user who gets a runtime error when trying the Walkthrough jupyter notebook:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Cell In[4], line 11
      9 # We use RINEX3 PRNs to identify satellites
     10 sat_prn = 'G07'
---> 11 sat_pos, sat_vel, sat_clock_err, sat_clock_drift, ephemeris = dog.get_sat_info(sat_prn, time)
     12 print("Satellite's position in ECEF (m) : \n", sat_pos, '\n')
     13 print("Satellite's velocity in ECEF (m/s) : \n", sat_vel, '\n')

File ~/opt/anaconda3/envs/missile-tid/lib/python3.10/site-packages/laika/astro_dog.py:265, in AstroDog.get_sat_info(self, prn, time)
    263 eph = None
    264 if self.pull_orbit:
--> 265   eph = self.get_orbit(prn, time)
    266 if not eph and self.pull_nav:
    267   eph = self.get_nav(prn, time)

File ~/opt/anaconda3/envs/missile-tid/lib/python3.10/site-packages/laika/astro_dog.py:106, in AstroDog.get_orbit(self, prn, time)
    104 def get_orbit(self, prn: str, time: GPSTime):
    105   skip_download = time in self.orbit_fetched_times
--> 106   orbit = self._get_latest_valid_data(self.orbits[prn], self.cached_orbit[prn], self.get_orbit_data, time, skip_download)
    107   if orbit is not None:
    108     self.cached_orbit[prn] = orbit

File ~/opt/anaconda3/envs/missile-tid/lib/python3.10/site-packages/laika/astro_dog.py:363, in AstroDog._get_latest_valid_data(self, data, latest_data, download_data_func, time, skip_download, recv_pos)
    361   download_data_func(time, recv_pos)
    362 else:
--> 363   download_data_func(time)
    364 latest_data = get_closest(time, data, recv_pos=recv_pos)
    365 if is_valid(latest_data):

File ~/opt/anaconda3/envs/missile-tid/lib/python3.10/site-packages/laika/astro_dog.py:211, in AstroDog.get_orbit_data(self, time, only_predictions)
    209   ephems_sp3 = self.download_parse_orbit(time)
    210 if sum([len(v) for v in ephems_sp3.values()]) < 5:
--> 211   raise RuntimeError(f'No orbit data found. For Time {time.as_datetime()} constellations {self.valid_const} valid ephem types {self.valid_ephem_types}')
    213 self.add_orbits(ephems_sp3)

RuntimeError: No orbit data found. For Time 2018-01-07 00:00:00 constellations ['GPS', 'GLONASS'] valid ephem types (<EphemerisType.FINAL_ORBIT: 1>, <EphemerisType.RAPID_ORBIT: 2>, <EphemerisType.ULTRA_RAPID_ORBIT: 3>)

Output of this code block:

# For example if we want the position and speed of satellite 7 (a GPS sat)
# at the start of January 7th 2018. Laika's custom GPSTime object is used throughout
# and can be initialized from python's datetime.

from datetime import datetime
from laika.gps_time import GPSTime
time = GPSTime.from_datetime(datetime(2018, 1, 7))

# We use RINEX3 PRNs to identify satellites
sat_prn = 'G07'
sat_pos, sat_vel, sat_clock_err, sat_clock_drift, ephemeris = dog.get_sat_info(sat_prn, time)
print("Satellite's position in ECEF (m) : \n", sat_pos, '\n')
print("Satellite's velocity in ECEF (m/s) : \n", sat_vel, '\n')
print("Satellite's clock error (s) : \n", sat_clock_err, '\n\n')

# we can also get the pseudorange delay (tropo delay + iono delay + DCB correction)
# in the San Francisco area
receiver_position = [-2702584.60036925, -4325039.45362552, 3817393.16034817]
delay = dog.get_delay(sat_prn, time, receiver_position)
print("Satellite's delay correction (m) in San Fransisco \n", delay)

Which prints out:

Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18006/final/Sta19826.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18008/final/Sta19831.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18007/final/Sta19830.sp3
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1982/igs19826.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igs19830.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igs19831.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1982/igr19826.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igr19831.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igr19830.sp3.Z
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18007/rapid/Sta19830.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18008/rapid/Sta19831.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18006/rapid/Sta19826.sp3
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19831_18.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1982/igu19826_18.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19830_18.sp3.Z
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18006/ultra/Sta19826.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18007/ultra/Sta19830.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/18008/ultra/Sta19831.sp3
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19831_12.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1982/igu19826_12.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19830_12.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19831_06.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19831_00.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1982/igu19826_06.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19830_06.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1983/igu19830_00.sp3.Z
Downloading https://github.com/commaai/gnss-data/raw/master/gnss/products/1982/igu19826_00.sp3.Z

I tried signing up for an Earthdata account and installing laika from source with a .netrc file in the root folder, using python setup.py install. I also tried wiping the cache, which is at /tmp/gnss. The cache looks like this:

 ls /tmp/gnss
cddis_products   russian_products

ls /tmp/gnss/cddis_products
1982 1983

ls /tmp/gnss/cddis_products/1982
igr19826.sp3.attempt_time    igs19826.sp3.attempt_time    igu19826_00.sp3.attempt_time igu19826_06.sp3.attempt_time igu19826_12.sp3.attempt_time igu19826_18.sp3.attempt_time

Running on a Mac with python 3.10.

dog.get_delay fails : IndexError: index 4 is out of bounds for axis 0 with size 4

Hi,

First of all, thank you for the awesome repo. I would like help solving the following issue.

Simple code to reproduce

import laika
from laika import AstroDog
from laika.lib.coordinates import geodetic2ecef
from laika.gps_time import GPSTime
import datetime

constellations = ['GPS', 'GLONASS', 'GALILEO']
dog = AstroDog(valid_const=constellations, dgps=True)
prn ="G07"
pos_ecef = geodetic2ecef([46, 6, 123])
gps_time = GPSTime.from_datetime(datetime.datetime(2022, 11, 25, 9, 43, 6))

delay = dog.get_delay(prn, gps_time, pos_ecef)

print(delay)

Result :

Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/22328/rapid/Sta22374.sp3
Downloading https://github.com/commaai/gnss-data-alt/raw/master/MCC/PRODUCTS/22329/rapid/Sta22375.sp3
...
pulling from https://geodesy.noaa.gov/corsdata/coord/coord_14/ to /tmp/gnss/cors_coord/zsu4_14.coord.txt
pulling from https://geodesy.noaa.gov/corsdata/coord/coord_14/ to /tmp/gnss/cors_coord/ztl4_14.coord.txt
...
  File "/home/jonathan/Desktop/cloud_locate/laika/laika/dgps.py", line 31, in download_and_parse_station_postions
    with open(coord_file_path, 'r+') as coord_file:
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/gnss/cors_coord/ab27_14.coord.txt'

(Let's download it manually : cd /tmp/gnss/cors_coord && wget https://geodesy.noaa.gov/corsdata/coord/coord_14/ab49_14.coord.txt )
Then, relaunching the same code now gives :

Traceback (most recent call last):
  File "/tmp/test_laika.py", line 14, in <module>
    delay = dog.get_delay(prn, gps_time, pos_ecef)
  File "/home/jonathan/Desktop/cloud_locate/laika/laika/astro_dog.py", line 325, in get_delay
    return self._get_delay_dgps(prn, rcv_pos, time)
  File "/home/jonathan/Desktop/cloud_locate/laika/laika/astro_dog.py", line 340, in _get_delay_dgps
    dgps_corrections = self.get_dgps_corrections(time, rcv_pos)
  File "/home/jonathan/Desktop/cloud_locate/laika/laika/astro_dog.py", line 124, in get_dgps_corrections
    latest_data = self._get_latest_valid_data(self.dgps_delays, self.cached_dgps, self.get_dgps_data, time, recv_pos=recv_pos)
  File "/home/jonathan/Desktop/cloud_locate/laika/laika/astro_dog.py", line 361, in _get_latest_valid_data
    download_data_func(time, recv_pos)
  File "/home/jonathan/Desktop/cloud_locate/laika/laika/astro_dog.py", line 239, in get_dgps_data
    station_names = get_closest_station_names(recv_pos, k=8, max_distance=MAX_DGPS_DISTANCE, cache_dir=self.cache_dir)
  File "/home/jonathan/Desktop/cloud_locate/laika/laika/dgps.py", line 65, in get_closest_station_names
    return np.array(station_ids)[idxs]
IndexError: index 4 is out of bounds for axis 0 with size 4

Any suggestion on why this happens and how to solve it ?

GPS Prediction

Hello, can I predict GPS satellite visibility with this software?

Kalman Filter Notebook doesn't work

Hi there,

I'm just getting started with understanding how things are done with this library, but the KF and DGPS parts are especially interesting to me. Unfortunately, the notebook for the KF isn't working because of the missing laika_repo. You can also see this in the notebook on GitHub in the 8th cell.

Best regards,
J.

KeyError when reading BEIDOU or QZNSS satellite info

When I execute this code:

dog = AstroDog(valid_const=["QZNSS"])
time = datetime.datetime(2020, 5, 1)
sat_info = dog.get_sat_info("J01", time)

I got:

File "/home/lv/Projects/magisterka/src/sources/laika.py", line 26, in source
    sat_info = dog.get_sat_info(sat_prn, time)
  File "/home/lv/Projects/magisterka/src/.env/src/laika/laika/astro_dog.py", line 233, in get_sat_info
    eph = self.get_orbit(prn, time)
  File "/home/lv/Projects/magisterka/src/.env/src/laika/laika/astro_dog.py", line 95, in get_orbit
    self.get_orbit_data(time)
  File "/home/lv/Projects/magisterka/src/.env/src/laika/laika/astro_dog.py", line 174, in get_orbit_data
    self.add_ephem(ephem, self.orbits)
  File "/home/lv/Projects/magisterka/src/.env/src/laika/laika/astro_dog.py", line 143, in add_ephem
    ephems[prn].append(new_ephem)
KeyError: 'J07'

After executing this code from get_orbit_data in astro_dog.py:

file_paths_sp3_ru = download_orbits_russia(time, cache_dir=self.cache_dir)
ephems_sp3_ru = parse_sp3_orbits(file_paths_sp3_ru, self.valid_const)

then it returns items with the PRNs: ['J01', 'J02', 'J03', 'J07'].
Laika's code expects the PRN numbers to be continuous and throws a KeyError when it tries to add the "J07" ephemeris.

Another situation is when you try to fetch BeiDou satellite info. Laika has the BeiDou constellation size hardcoded as 14, but there are now 48 satellites in the constellation.
When I fetch orbits using Laika I get these PRNs: [C01, C03, C05, C06, C08, C09, C10, C11, C12, C13, C14, C16, C19, C20, C21, C22, C23, C24, C25, C26, C27, C28, C29, C30, C32, C33, C34, C35, C36, C37].

As a temporary solution you can increase the constellation sizes, but I think this should be refactored: right now Laika will crash again after the next satellite is launched. If you want, I can help you with that.

Error in EKF_sym() class - C code generation and compilation probably missing?!

Within the given Kalman filter example (Jupyter notebook), the following error occurs:

ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-2-744c422db1fc> in <module>
      3 from laika_repo.examples.kalman.gnss_kf import GNSSKalman
      4 from laika_repo.examples.kalman.kalman_helpers import run_car_ekf_offline, ObservationKind
----> 5 ekf = GNSSKalman()
      6 init_state = ekf.x
      7 init_state[:3] = est_pos

~/one/laika_repo/examples/kalman/gnss_kf.py in __init__(self, N, max_tracks)
     60     name = 'gnss'
     61     # init filter
---> 62     self.filter = EKF_sym(name, Q, x_initial, P_initial, self.dim_state, self.dim_state, maha_test_kinds=maha_test_kinds)
     63 
     64   @property

~/one/laika_repo/examples/kalman/ekf_sym.py in __init__(self, name, Q, x_initial, P_initial, dim_main, dim_main_err, N, dim_augment, dim_augment_err, maha_test_kinds)
    177     self.init_state(x_initial, P_initial, None)
    178 
--> 179     ffi, lib = wrap_compiled(name, EXTERNAL_PATH)
    180     kinds, self.feature_track_kinds = [], []
    181     for func in dir(lib):

~/one/laika_repo/examples/kalman/ffi_wrapper.py in wrap_compiled(name, directory)
     39 def wrap_compiled(name, directory):
     40   sys.path.append(directory)
---> 41   mod = __import__(name)
     42   return mod.ffi, mod.lib

ModuleNotFoundError: No module named 'gnss'

Even when I executed the code locally, without the Jupyter notebook but with a plain Python script containing the same code, the error occurred. When going through the code, I found out that the EKF_sym object instantiates a filter object when it is instantiated. The filter relies on EKF code written in C. This code is not written manually but assembled inside the gen_code function.

In the example, the compiled code is wrapped without ever existing, since the code generation function, the code compilation function and the FFI wrapper are not called during (or before) the EKF_sym object instantiation. Hence, the compiled C code with the name gnss cannot be found.

Since the code structure is not clear, I struggle with fixing this myself. Feel free to help me.
In my view, the error should occur in anybody's attempt to run the Kalman example.

Compute_station_pos with error

Hi,
in the Compute_station_pos notebook there is an error. In cell [6] the line
fix, _ = raw.calc_pos_fix(corr)
will fail with the message that raw_gnss.py doesn't contain calc_pos_fix(). What would be the recommended workaround here?

Best regards
Axel

combining different satellites

As far as I know, GPS adds noise to its signal. Is it theoretically possible to use GLONASS, Galileo and BeiDou data to get more precise data? If yes, does laika do this?

No orbit data

I just installed the laika library and everything looks OK, but I'm getting an error:
No orbit data found. For Time 2018-01-07 00:00:00 constellations ['GPS', 'GLONASS'] valid ephem types (<EphemerisType.FINAL_ORBIT: 1>, <EphemerisType.RAPID_ORBIT: 2>, <EphemerisType.ULTRA_RAPID_ORBIT: 3>)

even when I'm using exactly the same code that is provided in the Walkthrough. I must say that I'm getting this error for any constellation and time I use. I tried with Python 3.11 and 3.8.0.

live data

Hi,

I'm having trouble finding which file I'm supposed to modify if I want to run laika on real-time data captured by a software-defined radio. Any help on this?
If that is possible, what are laika's requirements in order to run on real-time SDR data from an SDR front end connected to the host PC via USB? I'm using the ADALM-Pluto SDR, which has a Python host API (PyADI), so in theory there must be a way to run laika, right?

Unexpected RuntimeException when try fetch data from future

If no ephemeris is fetched then Laika throws a RuntimeException. This is confusing to me, because in other cases, if AstroDog doesn't find sat info, it returns None.
No ephemeris is fetched, for example, when you ask for data from the future (for example the next months/years).

Maybe the user should have a way to check whether any data is available? There are a few different cases when this may occur:

  • Before the satellite is launched
  • During the mission - the satellite may be non-operational or the connection may be lost
  • The user asks for data from the future and it doesn't exist

Probably Laika shouldn't have to recognize why the data isn't available; that isn't its task. But in the first case we return None (or throw a RuntimeException if the date is before the first satellite launch), in the second we return None, and in the third we throw a RuntimeException.

I think that it should be unified or Laika should throw a custom exception.
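
For illustration, the kind of custom exception meant here could be as simple as (a sketch only):

class NoEphemerisData(RuntimeError):
    """Raised when no ephemeris covering the requested time could be fetched."""

# hypothetical use inside AstroDog instead of a bare RuntimeError or a silent None:
# raise NoEphemerisData(f"no ephemeris for {prn} at {time}")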

switch to ftp-ssl for cddis data

This will remove the need for generating a .netrc file before using this library

see python example for ftp-ssl here:
https://cddis.nasa.gov/Data_and_Derived_Products/CDDIS_Archive_Access.html

from ftplib import FTP_TLS
import sys

email = sys.argv[1]
directory = sys.argv[2]
filename = sys.argv[3]

ftps = FTP_TLS(host = 'gdc.cddis.eosdis.nasa.gov')
ftps.login(user='anonymous', passwd=email)
ftps.prot_p()
ftps.cwd(directory)
ftps.retrbinary("RETR " + filename, open(filename, 'wb').write)

Should be easy to add since ftplib is already used for normal FTP (and the API looks identical).

Great work! Ublox to laika-format converter?

Hi!
Really cool library! Would it be possible to have a script to dump Ublox raw data (ubx-raw/x) to laika format? I had a look at the 2k19 repo and it seems possible. Would you be able to share a script for that?

Thanks!
