
iblenv's Introduction

IBLENV installation guide

Unified environment and issue tracker for IBL github repositories.

Update environment

In a terminal, navigate to your working directory, i.e. the one in which you previously cloned the iblenv and iblapps repositories (typically something like int-brain-lab), and run the following commands:

conda activate iblenv
cd iblapps
git pull
cd ..
cd iblenv
git pull
pip install -r requirements.txt --upgrade

If you encounter any errors, it is recommended to follow the "Removing an old installation" instructions and then the "Install from scratch" instructions.

Install from scratch

To create the unified environment for using IBL repositories, first download and install Anaconda and git, following their installer instructions to add each to the system path. Please also ensure Anaconda is installed in your home directory. The instructions below explain how to set up and activate the unified conda environment (iblenv) and properly install multiple repositories within it.

In your git terminal, navigate to the directory in which you want to install the IBL repositories (e.g. create a folder named something like int-brain-lab and work from within it). Then run the following commands:

conda update -n base -c defaults conda
conda create --name iblenv python=3.10 --yes
conda activate iblenv
git clone https://github.com/int-brain-lab/iblapps
pip install --editable iblapps
git clone https://github.com/int-brain-lab/iblenv
cd iblenv
pip install --requirement requirements.txt
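
To verify the installation, keep the environment active and run the following in Python; this is a minimal sanity check, assuming the requirements installed without errors:

import ibllib
from one.api import ONE  # the ONE API used throughout the IBL code base
print(ibllib.__version__)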

Removing an old installation

The following command will completely remove an anaconda environment and all of its packages: conda remove --name iblenv --all

Notes:

  • Whenever you run IBL code in Python you should activate the iblenv environment, i.e. conda activate iblenv
  • If you want to launch GUIs that rely on pyqt (e.g. the IBL data exploration GUI or phy) from IPython, you should first run the IPython magic command %gui qt (see the sketch below). Additional documentation on working with iblenv is available in the IBL documentation.
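
For example, a minimal IPython session for launching a pyqt-based GUI might look like the sketch below (the phy entry point and the params.py path are assumptions, adjust them to the tool you are launching):

# In a terminal: conda activate iblenv, then start IPython.
%gui qt  # start the Qt event loop before importing any GUI code
# Launch the pyqt-based tool of your choice, e.g. phy's template GUI:
from phy.apps.template import template_gui
template_gui('path/to/params.py')  # hypothetical path to a sorting output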

Troubleshooting:

Spyder

If using Anaconda's Spyder IDE, please take note: installing Spyder in a virtual environment like iblenv causes conda to add many packages to that environment. In the case of iblenv, some of the packages installed by Spyder conflict directly with the pip-installed packages. This creates an inconsistent and unstable environment, especially when attempting any sort of update of those packages. For more information about how to work with pip within conda, see Anaconda's documentation on using pip in a conda environment.

It is not recommended to use the Spyder IDE in conjunction with iblenv. Please seek alternatives, like PyCharm or Visual Studio Code.

brotli error

If you set up this environment with an older version of Anaconda, or a version of Anaconda that has been upgraded from an older one, you may see the following error when importing the ONE API:

activate iblenv
python -c "from one.api import ONE"
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
...
File "C:\Users\username\Anaconda3\envs\iblenv\lib\site-packages\urllib3\response.py", line 396, in HTTPResponse
    DECODER_ERROR_CLASSES += (brotli.error,)
AttributeError: module 'brotli' has no attribute 'error'

The source of this issue appears to be the way Anaconda handles the brotlipy package. One potential solution is to run the following:

activate iblenv
conda install brotlipy
python -c "from one.api import ONE"

If this results in the same error, a full removal of Anaconda (Windows uninstall followed by manual removal of the various leftover files and directories) and then a fresh install of Anaconda should correct the problem.

More details can be found in the related GitHub issue.

iblenv's People

Contributors: berkgercek, gaellechapuis, iamamutt, juhuntenburg, k1o0, kdharris101, mayofaulkner, micheleangelofabbri, mschart, nbonacchi, oliche, yeebc

iblenv's Issues

[Usage question] - How to check whether a session has a dataset type associated with it

Describe the question you have
Is there a better way to check whether a session has a particular dataset associated with it (e.g. DLC: dataset_types = ['camera.dlc']) than using one.load and checking whether the output is None?

Example of query :

ds = one.load(dataset_types=dataset_types, eid=eid)
if ds[0] is None:
   print('no DS found')
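
A possible alternative, sketched below, is to list the datasets registered for the session and check for the one you need instead of loading it; this assumes the Alyx 'datasets' REST endpoint accepts a session filter and that each record exposes a dataset_type field:

# List dataset records registered for this session and check for the DLC dataset type.
dsets = one.alyx.rest('datasets', 'list', session=eid)
has_dlc = any(d['dataset_type'] == 'camera.dlc' for d in dsets)
print('DLC available' if has_dlc else 'no DLC dataset for this session')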

Is your question related to a specific product?
ONE usage

using 'atlas_acronym' filter with rest query returns false positive sessions?

I tried to query all sessions that were run on 'ephyschoiceworld' and had a probe go through 'CTX', however it seems that for at least one of the returned sessions, none of the brain regions were in 'CTX' or its children. Sorry in advance if it is me who has made a mistake somewhere!

To reproduce:

import numpy as np

from oneibl.one import ONE
import brainbox.io.one as bbone


one = ONE()

# Get all sessions that had units in cortex and were run on ephys choice world.
ses = one.alyx.rest('sessions', 'list', atlas_acronym="CTX", task_protocol='ephyschoiceworld')

# Get all eids for these sessions.
eid = [None] * len(ses)
for i, s in enumerate(ses):
    url = s['url']
    # Get start index of eid from session url.
    i_eid = [i for i in range(len(url)) if url[i] == '/'][-1] + 1
    eid[i] = url[i_eid:]

# Get brain regions for `eid[0]`
e = eid[0]  # as of this post, the first returned eid was 'b39752db-abdb-47ab-ae78-e8608bbf50ed'
chs = bbone.load_channel_locations(e)
br = np.unique([chs.probe00.acronym, chs.probe01.acronym])

It seems none of the brain regions in br belong to 'CTX'

[Usage question] - How to align trials to clusters properly?

Describe the question you have
Clusters on the same probe in the same subject on the same date have a different number of trials.

To replicate:

Example clusters:

[{'subject_uuid': UUID('b55c97f2-1024-4b16-9774-2d9049942f98'),
  'session_start_time': datetime.datetime(2020, 1, 27, 16, 52, 31),
  'probe_idx': 0,
  'cluster_id': 241,
  'insertion_data_source': 'Ephys aligned histology track',
  'ontology': 'CCF 2017',
  'acronym': 'ACAd6a'},
 {'subject_uuid': UUID('b55c97f2-1024-4b16-9774-2d9049942f98'),
  'session_start_time': datetime.datetime(2020, 1, 27, 16, 52, 31),
  'probe_idx': 0,
  'cluster_id': 242,
  'insertion_data_source': 'Ephys aligned histology track',
  'ontology': 'CCF 2017',
  'acronym': 'ACAd6a'},
 {'subject_uuid': UUID('b55c97f2-1024-4b16-9774-2d9049942f98'),
  'session_start_time': datetime.datetime(2020, 1, 27, 16, 52, 31),
  'probe_idx': 0,
  'cluster_id': 243,
  'insertion_data_source': 'Ephys aligned histology track',
  'ontology': 'CCF 2017',
  'acronym': 'ACAd6a'},
 {'subject_uuid': UUID('b55c97f2-1024-4b16-9774-2d9049942f98'),
  'session_start_time': datetime.datetime(2020, 1, 27, 16, 52, 31),
  'probe_idx': 0,
  'cluster_id': 244,
  'insertion_data_source': 'Ephys aligned histology track',
  'ontology': 'CCF 2017',
  'acronym': 'ACAd6a'},
 {'subject_uuid': UUID('b55c97f2-1024-4b16-9774-2d9049942f98'),
  'session_start_time': datetime.datetime(2020, 1, 27, 16, 52, 31),
  'probe_idx': 0,
  'cluster_id': 245,
  'insertion_data_source': 'Ephys aligned histology track',
  'ontology': 'CCF 2017',
  'acronym': 'ACAd6a'},
 {'subject_uuid': UUID('b55c97f2-1024-4b16-9774-2d9049942f98'),
  'session_start_time': datetime.datetime(2020, 1, 27, 16, 52, 31),
  'probe_idx': 0,
  'cluster_id': 246,
  'insertion_data_source': 'Ephys aligned histology track',
  'ontology': 'CCF 2017',
  'acronym': 'ACAd6a'},
 {'subject_uuid': UUID('b55c97f2-1024-4b16-9774-2d9049942f98'),
  'session_start_time': datetime.datetime(2020, 1, 27, 16, 52, 31),
  'probe_idx': 0,
  'cluster_id': 247,
  'insertion_data_source': 'Ephys aligned histology track',
  'ontology': 'CCF 2017',
  'acronym': 'ACAd6a'},
 {'subject_uuid': UUID('b55c97f2-1024-4b16-9774-2d9049942f98'),
  'session_start_time': datetime.datetime(2020, 1, 27, 16, 52, 31),
  'probe_idx': 0,
  'cluster_id': 248,
  'insertion_data_source': 'Ephys aligned histology track',
  'ontology': 'CCF 2017',
  'acronym': 'ACAd6a'},
 {'subject_uuid': UUID('b55c97f2-1024-4b16-9774-2d9049942f98'),
  'session_start_time': datetime.datetime(2020, 1, 27, 16, 52, 31),
  'probe_idx': 0,
  'cluster_id': 249,
  'insertion_data_source': 'Ephys aligned histology track',
  'ontology': 'CCF 2017',
  'acronym': 'ACAd6a'},
 {'subject_uuid': UUID('b55c97f2-1024-4b16-9774-2d9049942f98'),
  'session_start_time': datetime.datetime(2020, 1, 27, 16, 52, 31),
  'probe_idx': 0,
  'cluster_id': 250,
  'insertion_data_source': 'Ephys aligned histology track',
  'ontology': 'CCF 2017',
  'acronym': 'ACAd6a'}]

Then run this script to get the number of trials:

trials_spike_times_by_cluster = []
for test_cluster in test_clusters:
    trials_spike_times = (ephys.AlignedTrialSpikes & test_cluster & 'event="stim on"').fetch('trial_spike_times')
    trials_spike_times_by_cluster.append(trials_spike_times)
n_trials_per_cluster = [len(trials_spike_times) for trials_spike_times in trials_spike_times_by_cluster]
n_trials_per_cluster

outputs: [872, 628, 872, 846, 745, 787, 867, 853, 872, 872]

Is your question related to a specific product?
Datajoint

Additional context
Reporting on behalf of @RylanSchaeffer

[Feature request] - Add neural responses to events on GUI to align histology/ephys

Is your feature request related to a problem? Please describe.
Add RF mapping / plotting auditory response in the GUI for aligning ephys-histology.

Describe the solution you'd like
Receptive fields for single units (or multi-unit per depth, TBD); auditory responses per unit (raster + PSTH, separately for GoCue and NoiseCue)

Describe alternatives you've considered
Datajoint view, but not as practical to use during the alignment process.

Additional context
Recycle code from certification for RF.
https://github.com/int-brain-lab/iblapps/tree/develop/atlaselectrophysiology

[Usage question] - How to run scripts from paper-behavior? (Module not found error)

Describe the question you have
I am trying to run this script https://github.com/int-brain-lab/paper-behavior/blob/65035abc2d4f8d116d92a54d1b3bdd864f564d77/figure1c_number_of_mice.py#L61-L73
but for some reason, when I launch it in PyCharm (select all + Ctrl+E), this error is generated: ModuleNotFoundError: No module named 'paper_behavior_functions'.

When I try running the script with python figure1c_number_of_mice.py in the terminal, I get the error (2003, "Can't connect to MySQL server on 'localhost' ([Errno 61] Connection refused)") (I am on the university VPN).

I'm not sure how I should set things up to avoid these errors.
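
The ModuleNotFoundError usually means the paper-behavior folder is not on the Python path when the script is launched from outside the repository root. A sketch of a workaround (the clone location below is hypothetical):

import sys
from pathlib import Path

# Make the cloned paper-behavior repository importable before running the figure script.
repo = Path.home() / 'int-brain-lab' / 'paper-behavior'  # adjust to where you cloned it
sys.path.insert(0, str(repo))

import paper_behavior_functions  # should now resolve

The MySQL connection error is a separate issue: it comes from the DataJoint configuration (credentials and host, typically stored in dj_local_conf.json or set via dj.config) rather than from the import path.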

Is your question related to a specific product?
https://github.com/int-brain-lab/paper-behavior

[Usage question] - How to open 'pqt' files

DLC camera files (e.g. _ibl_leftCamera.dlc) are sometimes found in npy format and sometimes in pqt format.
How can pqt files be opened?
An example session (session_uuid) with such a file is 'fb9bdf18-76be-452b-ac4e-21d5de3a6f9f'.
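
The .pqt files are Apache Parquet tables and can be read with pandas; a minimal sketch (the file path is illustrative):

import pandas as pd

# Read a DLC trace stored as parquet (requires a parquet engine such as pyarrow).
dlc = pd.read_parquet('_ibl_leftCamera.dlc.pqt')
print(dlc.columns)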

Memory error when loading .nrrd in atlas.py

Out-of-memory error when instantiating AllenAtlas:

brain_atlas = atlas.AllenAtlas(25)

gives the following error:

MemoryError                               Traceback (most recent call last)
<ipython-input-5-9f5562a1a02c> in <module>
----> 1 b=AllenAtlas()
~\IBL\int-brain-lab\ibllib-repo\ibllib\atlas\atlas.py in __init__(self, res_um, par, scaling, mock, hist_path)
    653                 _download_atlas_flatiron(file_label, FLAT_IRON_ATLAS_REL_PATH, par)
    654             image, _ = nrrd.read(file_image, index_order='C')  # dv, ml, ap
--> 655             label, _ = nrrd.read(file_label, index_order='C')  # dv, ml, ap
    656             label = np.swapaxes(np.swapaxes(label, 2, 0), 1, 2)  # label[iap, iml, idv]
    657             image = np.swapaxes(np.swapaxes(image, 2, 0), 1, 2)  # image[iap, iml, idv]
~\Anaconda3\envs\iblenv\lib\site-packages\nrrd\reader.py in read(filename, custom_field_map, index_order)
    506     with open(filename, 'rb') as fh:
    507         header = read_header(fh, custom_field_map)
--> 508         data = read_data(header, fh, filename, index_order)
    509
    510     return data, header
~\Anaconda3\envs\iblenv\lib\site-packages\nrrd\reader.py in read_data(header, fh, filename, index_order)
    437
    438             # Decompress and append data
--> 439             decompressed_data += decompobj.decompress(compressed_data[start_index:end_index])
    440
    441             # Update start index
MemoryError:
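
One possible workaround is to instantiate the atlas at a coarser resolution, which loads much smaller volumes; a sketch, assuming the 50 µm atlas is sufficient for your use case:

from ibllib import atlas

# The 50 µm volumes are several times smaller than the 25 µm ones and may fit in memory.
brain_atlas = atlas.AllenAtlas(50)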

bb.io.extract_waveforms - regressions

From Noam

wf = bb.io.extract_waveforms(ephys_file, ts, ch, car=False)
Traceback (most recent call last):
  File "<ipython-input-827-25fa5d6dcf6c>", line 1, in <module>
    wf = bb.io.extract_waveforms(ephys_file, ts, ch, car=False)
  File "C:\Users\Steinmetz Lab User\int-brain-lab\ibllib\brainbox\io\io.py", line 74, in extract_waveforms
    file_m = s_reader.data  # the memmapped array

So there are 3 problems here:

  • there is a regression, so we need to index s_reader directly instead of looking for the data attribute
  • the CAR is done in-place; if someone runs extract_waveforms on the main server with the CAR option, we lose all of the raw data - this has to go, and we need to work on a copy instead (see the sketch below)
  • this needs a simple test (we have ephys fixtures for this) so we can at least check for regressions
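
A rough sketch of the second point, i.e. applying the common-average reference to a copy rather than to the memory-mapped raw data (array shape and names are illustrative, not the actual brainbox implementation):

import numpy as np

def car_copy(waveforms):
    # waveforms assumed shaped (n_spikes, n_samples, n_channels)
    wf = np.asarray(waveforms, dtype=np.float32).copy()  # never modify the memmap in place
    wf -= np.median(wf, axis=-1, keepdims=True)          # subtract the median across channels
    return wf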

documentation on behavior of conda-develop

Installing iblenv works great, but I've struggled to get a clear overview of the expected behavior of conda-develop. Specifically, I have found that different branches/versions of brainbox are available depending on whether I launch Python from within the ibllib-repo folder or another folder, even though iblenv is active in all cases.

Googling hasn't turned up a description of what to expect with conda-develop - is there some hidden documentation that could be linked for clarification?
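
One way to see which copy of brainbox (or ibllib) is actually imported from a given working directory is to print the module's file path; a minimal check:

import brainbox
import ibllib

# The path shows whether Python picked up a local checkout (e.g. ibllib-repo/)
# or the conda-develop / site-packages install.
print(brainbox.__file__)
print(ibllib.__file__)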

[Usage question] - How to get the brain regions after Ephys-histology alignment is done

Describe the question you have
Once the alignment between the histology and the ephys data is made (using the GUI referenced below), how can one access the brain regions newly assigned to clusters and channels? Can this be retrieved via ONE?
When multiple alignments have been done on a given ephys session, how does the query handle returning the brain regions (e.g. does it return only the brain regions from the latest alignment)?
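
For reference, a sketch of one way to pull channel locations, using the same helpers that appear elsewhere in this tracker (whether this reflects only the latest alignment is exactly the open question here; the eid is hypothetical):

from oneibl.one import ONE
import brainbox.io.one as bbone

one = ONE()
eid = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'  # hypothetical session eid
channels = bbone.load_channel_locations(eid, one=one)
print(channels['probe00'].acronym)  # brain region acronym per channel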

Is your question related to a specific product?
https://github.com/int-brain-lab/iblapps/tree/develop/atlaselectrophysiology

Additional context
None

Example bug - visualization3D_repeated_site

Cannot run example
https://github.com/int-brain-lab/ibllib/blob/develop/examples/one/histology/visualization3D_repeated_site.py

Error log

Traceback (most recent call last):
  File "/Applications/anaconda3/anaconda3/envs/iblenv/lib/python3.7/site-packages/IPython/core/interactiveshell.py", line 3326, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-2-ca435e83a3a7>", line 38, in <module>
    channels = bbone.load_channel_locations(eid=ses, one=one, probe=probe_label)[probe_label]
  File "/Users/gaelle/Documents/Git/ibllib/brainbox/io/one.py", line 69, in load_channel_locations
    channel_coord = one.load_dataset(eid=eid, dataset_type='channels.localCoordinates')
  File "/Users/gaelle/Documents/Git/ibllib/oneibl/one.py", line 240, in load_dataset
    return self._load(eid, dataset_types=[dataset_type], **kwargs)[0]
  File "/Users/gaelle/Documents/Git/ibllib/oneibl/one.py", line 298, in _load
    eid_str = eid[-36:]
TypeError: unhashable type: 'slice'

[Usage question] - Input not recognized in iblenv

When trying to run python, git and start pybpod, the input was not recognized by conda. The error message reads: "'...' (that particular input, e.g. 'start-pybpod' when I was trying to open Bpod) is not recognized as an internal or external command, operable program or batch file."

A similar error message showed up when I tried "git pull" and "python ...".

I had opened Bpod on this computer following the same process this morning and it worked.


[Bug report] - Bonsai Installer Crash

The Bonsai installer crashes when it runs as part of install.py (see the attached BonsaiInstallError screenshot).

To get this error, after deleting a previous installation's C:/iblrig, C:/iblrig_data and C:/iblrig_params folders, I cloned the iblrig repository, entered the cloned folder and ran 'python install.py'

If I then try to run a protocol that uses Bonsai (e.g. _iblrig_calibration > frame2TTL), I get another crash:

Unhandled Exception: System.IO.FileNotFoundException: Could not load file or assembly 'System.Reactive.Core, Version=2.2.5.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.
   at Bonsai.Launcher.LaunchWorkflowPlayer(String fileName, Dictionary`2 propertyAssignments)
   at Bonsai.Program.Main(String[] args)
   at Bonsai64.Program.Main(String[] args)

Kilosort2/error: out of memory

Running run_batch_ks2_ibl.m, I got this error:

Out of memory. Type "help memory" for your options.
Error in learnAndSolve8b (line 244)
    fWpc(:,:,2*size(st3,1)) = 0;
Error in run_ks2_ibl (line 73)
    rez = learnAndSolve8b(rez);
Error in run_batch_ks2_ibl (line 40)
    run_ks2_ibl(rootZ, rootH);

There is enough space on the server and I wasn't running anything else on it.

I restarted MATLAB and the computer, but got the same error. It always seems to stop at the step "4054.71 sec, 10101 / 11414 batches, 465 units, nspks: 4594.8495, mu: 13.3936, nst0: 8707, merges: 154.5829, 0.1216", then the error appears.

see slack link: https://int-brain-lab.slack.com/archives/C9L8DKZ9T/p1579015818018700

Error on Mac 10.12 when running Ephys Alignment GUI - seems related to cv2?

(iblenv) Jean-Pauls-MacBook-Pro:~ jean-paulnoel$ python Documents/int-brain-lab/iblapps/atlaselectrophysiology/ephys_atlas_gui.py
Traceback (most recent call last):
  File "Documents/int-brain-lab/iblapps/atlaselectrophysiology/ephys_atlas_gui.py", line 5, in <module>
    from atlaselectrophysiology.load_data import LoadData
  File "/Users/jean-paulnoel/Documents/int-brain-lab/iblapps/atlaselectrophysiology/load_data.py", line 4, in <module>
    import ibllib.pipes.histology as histology
  File "/Users/jean-paulnoel/Documents/int-brain-lab/ibllib-repo/ibllib/pipes/histology.py", line 11, in <module>
    from ibllib.ephys.spikes import probes_description as extract_probes
  File "/Users/jean-paulnoel/Documents/int-brain-lab/ibllib-repo/ibllib/ephys/spikes.py", line 9, in <module>
    from ibllib.ephys.sync_probes import apply_sync
  File "/Users/jean-paulnoel/Documents/int-brain-lab/ibllib-repo/ibllib/ephys/sync_probes.py", line 12, in <module>
    from ibllib.io.extractors.ephys_fpga import _get_sync_fronts, get_ibl_sync_map
  File "/Users/jean-paulnoel/Documents/int-brain-lab/ibllib-repo/ibllib/io/extractors/__init__.py", line 1, in <module>
    from ibllib.io.extractors import (biased_trials, biased_wheel, ephys_fpga,
  File "/Users/jean-paulnoel/Documents/int-brain-lab/ibllib-repo/ibllib/io/extractors/biased_trials.py", line 5, in <module>
    from ibllib.io.extractors.training_trials import (  # noqa; noqa
  File "/Users/jean-paulnoel/Documents/int-brain-lab/ibllib-repo/ibllib/io/extractors/training_trials.py", line 2, in <module>
    import cv2
  File "/Users/jean-paulnoel/anaconda3/envs/iblenv/lib/python3.7/site-packages/cv2/__init__.py", line 5, in <module>
    from .cv2 import *
ImportError: dlopen(/Users/jean-paulnoel/anaconda3/envs/iblenv/lib/python3.7/site-packages/cv2/cv2.cpython-37m-darwin.so, 2): Symbol not found: _inflateValidate
  Referenced from: /Users/jean-paulnoel/anaconda3/envs/iblenv/lib/python3.7/site-packages/cv2/.dylibs/libpng16.16.dylib (which was built for Mac OS X 10.13)
  Expected in: /usr/lib/libz.1.dylib
  in /Users/jean-paulnoel/anaconda3/envs/iblenv/lib/python3.7/site-packages/cv2/.dylibs/libpng16.16.dylib

Bonsai Crash

On running trainingChoiceWorld, I get an unhandled exception:

Unhandled Exception: System.Collections.Generic.KeyNotFoundException: The specified property 'REPortName' was not found in the workflow.
   at Bonsai.Expressions.ExpressionBuilderGraphExtensions.SetWorkflowProperty(ExpressionBuilderGraph source, String name, Object value)
   at Bonsai.Launcher.LaunchWorkflowPlayer(String fileName, Dictionary`2 propertyAssignments)
   at Bonsai.Program.Main(String[] args)
   at Bonsai64.Program.Main(String[] args)

The software displays the error, but continues to run the task.
The iPad screen does not display the stimulus, and continues to show the desktop background.
I am able to manually run multiple trials, with sound and camera sync OK - but no output to the screen.

I'm not sure what REPortName is. The rotary encoder is on port COM5, and it is recognized by Pybpod (as shown in the following console output immediately after the Bonsai crash message):

2020-09-02 14:58:58.920 INFO [params.py:135] Writing {'NAME': 'SELECT_BOARD_NAME_(e.g.[_iblrig_mainenlab_behavior_0])', 'IBLRIG_VERSION': '6.4.2', 'COM_BPOD': 'COM4', 'COM_ROTARY_ENCODER': 'COM5', 'COM_F2TTL': 'COM6', 'F2TTL_DARK_THRESH': 70.0, 'F2TTL_LIGHT_THRESH': 30.0, 'F2TTL_CALIBRATION_DATE': '2020-09-02', 'SCREEN_FREQ_TARGET': 60, 'SCREEN_FREQ_TEST_STATUS': None, 'SCREEN_FREQ_TEST_DATE': None, 'WATER_CALIBRATION_RANGE': None, 'WATER_CALIBRATION_OPEN_TIMES': None, 'WATER_CALIBRATION_WEIGHT_PERDROP': None, 'WATER_CALIBRATION_DATE': None, 'BPOD_TTL_TEST_STATUS': None, 'BPOD_TTL_TEST_DATE': None} to C:\iblrig_params\.iblrig_params.json

It wasn't immediately clear to me how the Bonsai workflow inherits the REPortName parameter, and why it isn't getting 'COM5'.
Any suggestions are welcome.

[Bug report] - phy_launcher.py error

Describe the bug
Trying to run the phy launcher, I get the error below. I have done a git pull for all folders and already followed these instructions (https://github.com/int-brain-lab/iblenv) to update my environment and install new packages.

Screenshots
Traceback (most recent call last):
  File "phy_launcher.py", line 8, in <module>
    from oneibl.one import ONE
  File "C:\Users\Julia Desk\int-brain-lab\ibllib-repo\oneibl\one.py", line 13, in <module>
    import oneibl.webclient as wc
  File "C:\Users\Julia Desk\int-brain-lab\ibllib-repo\oneibl\webclient.py", line 13, in <module>
    from alf.io import is_uuid_string
  File "C:\Users\Julia Desk\int-brain-lab\ibllib-repo\alf\io.py", line 20, in <module>
    from brainbox.io import parquet
  File "C:\Users\Julia Desk\int-brain-lab\ibllib-repo\brainbox\io\parquet.py", line 4, in <module>
    from numba import jit
ModuleNotFoundError: No module named 'numba'

Feature request: transfer all sessions in one go

Currently when you launch the transfer for the video or ephys it asks whether the path is correct for each session. This is not handy if you want to launch the transfer of a bunch of sessions because the query pops up after each session and you have to be at the computer to provide input. It would be better if it asks whether the path is correct or needs to be changed for all sessions first and then starts the transfer of all sessions in one go.

[Usage question] -how to extract data on local computer with other protocol than original ibltasks_CH

Describe the question you have
Some of us will modify protocols for individual projects. How should I run the extractor that creates the ALF files for behavioral data on my local computer? This data does not have to be uploaded to FlatIron.

For example, I created a new protocol called ibltasks_myprojectChoiceWorld, which is slightly modified from biasedChoiceWorld.

After training I would like to generate ALF files for that protocol, and for that I would like to modify the extractor to pull out the variables that are important to me and run it on my local computer.

[Usage question] - Disabling Alyx for local testing

In previous versions of iblrig, I could run a test session of trainingChoiceWorld without logging into Alyx.

With the current version, since I don't have an Alyx login, I get a crash with the following error message:
HTTPError: 400 Client Error: Bad Request for url: https://test.alyx.internationalbrainlab.org/auth-token

Is there a simple way to disable Alyx in order to run local testing?

For future versions, the intuitive place to put that configuration is in either task_settings.py of the relevant protocol, or globally from the UI's taskbar in Options > Edit user settings, where the other Alyx settings are.

[Bug report] - phylib cannot access the raw ephys file

Describe the bug
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'H:\dl62\raw_ephys_data\_spikeglx_ephysData_g0_t0.imec.ap.bin'

I used iblapps.atlaselectrophysiology.extract_files.extract_data to extract data offline, and ran into this error on a valid WindowsPath.

To Reproduce
Steps to reproduce the behavior:

  1. Open an Anaconda prompt
  2. activate iblenv
  3. from pathlib import Path
  4. from iblapps.atlaselectrophysiology.extract_files import extract_data
  5. ks_path = Path('H:\alfExtract\dl62\20190131_1\KS2'); ephys_path = Path('H:\alfExtract\dl62\20190131_1\raw_ephys_data'); out_path = Path('H:\alfExtract\dl62\20190131_1\alf')
  6. extract_data(ks_path, ephys_path, out_path) raises the error below

PermissionError Traceback (most recent call last)
in
----> 1 extract_data(ks_path, ephys_path, out_path)

~\int-brain-lab\iblapps\atlaselectrophysiology\extract_files.py in extract_data(ks_path, ephys_path, out_path)
109 if efile.get('ap') and efile.ap.exists():
110 ks2_to_alf(ks_path, ephys_path, out_path, bin_file=efile.ap,
--> 111 ampfactor=_sample2v(efile.ap), label=None, force=True)
112 extract_rmsmap(efile.ap, out_folder=out_path, spectra=False)
113 if efile.get('lf') and efile.lf.exists():

~\int-brain-lab\ibllib-repo\ibllib\ephys\spikes.py in ks2_to_alf(ks_path, bin_path, out_path, bin_file, ampfactor, label, force)
174 :return:
175 """
--> 176 m = ephysqc.phy_model_from_ks2_path(ks2_path=ks_path, bin_path=bin_path, bin_file=bin_file)
177 ephysqc.unit_metrics_ks2(ks_path, m, save=True)
178 ac = alf.EphysAlfCreator(m)

~\int-brain-lab\ibllib-repo\ibllib\ephys\ephysqc.py in phy_model_from_ks2_path(ks2_path, bin_path, bin_file)
275 sample_rate=fs,
276 n_channels_dat=nch,
--> 277 n_closest_channels=NCH_WAVEFORMS)
278 m.depths = m.get_depths()
279 return m

~\int-brain-lab\phylib\phylib\io\model.py in __init__(self, **kwargs)
259 self.offset = getattr(self, 'offset', 0)
260
--> 261 self._load_data()
262
263 #--------------------------------------------------------------------------

~\int-brain-lab\phylib\phylib\io\model.py in _load_data(self)
357
358 # Traces and duration.
--> 359 self.traces = self._load_traces(self.channel_mapping)
360 if self.traces is not None:
361 self.duration = self.traces.duration

~\int-brain-lab\phylib\phylib\io\model.py in _load_traces(self, channel_map)
490 traces = get_ephys_reader(
491 self.dat_path, n_channels_dat=n, dtype=self.dtype, offset=self.offset,
--> 492 sample_rate=self.sample_rate)
493 if traces is not None:
494 traces = traces[:, channel_map] # lazy permutation on the channel axis

~\int-brain-lab\phylib\phylib\io\traces.py in get_ephys_reader(obj, **kwargs)
492 if not klass:
493 return
--> 494 return klass(arg, **kwargs)
495
496

~\int-brain-lab\phylib\phylib\io\traces.py in __init__(self, paths, sample_rate, dtype, offset, n_channels, **kwargs)
314 self._mmaps = [
315 _memmap_flat(path, dtype=dtype, n_channels=n_channels, offset=offset)
--> 316 for path in paths]
317
318 self.sample_rate = sample_rate

~\int-brain-lab\phylib\phylib\io\traces.py in <listcomp>(.0)
314 self._mmaps = [
315 _memmap_flat(path, dtype=dtype, n_channels=n_channels, offset=offset)
--> 316 for path in paths]
317
318 self.sample_rate = sample_rate

~\int-brain-lab\phylib\phylib\io\traces.py in _memmap_flat(path, dtype, n_channels, offset)
161 n_samples = (fsize - offset) // (item_size * n_channels)
162 shape = (n_samples, n_channels)
--> 163 return np.memmap(path, dtype=dtype, offset=offset, shape=shape)
164
165

C:\ProgramData\Anaconda3\envs\iblenv\lib\site-packages\numpy\core\memmap.py in __new__(subtype, filename, dtype, mode, offset, shape, order)
273 # special case - if we were constructed with a pathlib.path,
274 # then filename is a path object, not a string
--> 275 self.filename = filename.resolve()
276 elif hasattr(fid, "name") and isinstance(fid.name, str):
277 # py3 returns int for TemporaryFile().name

C:\ProgramData\Anaconda3\envs\iblenv\lib\pathlib.py in resolve(self, strict)
1159 if self._closed:
1160 self._raise_closed()
-> 1161 s = self._flavour.resolve(self, strict=strict)
1162 if s is None:
1163 # No symlink resolution => for consistency, raise an error if

C:\ProgramData\Anaconda3\envs\iblenv\lib\pathlib.py in resolve(self, path, strict)
203 while True:
204 try:
--> 205 s = self._ext_to_normal(_getfinalpathname(s))
206 except FileNotFoundError:
207 previous_s = s

PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'H:\dl62\raw_ephys_data\_spikeglx_ephysData_g0_t0.imec.ap.bin'

Expected behavior
Error when attempting to access the file.


Desktop (please complete the following information):

  • OS: Windows 7
  • Python 3.7.4

Additional context
Running the same data extraction steps on Mac OS with Python 3.8.3 did not result in any errors.

python crashes when ONE class is instantiated with ibllib >=1.5.5

Describe the bug
When I try to instantiate a ONE class with my credentials, Python crashes consistently with ibllib==1.5.5. When I install the previous version, ibllib==1.5.4, it works fine.

To Reproduce

pip install ibllib==1.5.5
from oneibl.one import ONE
one = ONE()  # I have my credentials stored in the .one_params file

Expected behavior

  • works as expected with ibllib==1.5.4
  • python crashes with ibllib==1.5.5


Desktop

  • OS: Windows 10

[Bug report] - Tasks except for flush_water do not run after fresh install

Describe the bug
After a clean install of iblrig, running any task other than flush_water gave us the following error:

Traceback (most recent call last):
  File "C:\iblrig_params\IBL\tasks\_iblrig_tasks_trainingChoiceWorld\_iblrig_tasks_trainingChoiceWorld.py", line 13, in <module>
    from iblrig.bpod_helper import BpodMessageCreator
ModuleNotFoundError: No module named 'iblrig'
Traceback (most recent call last):
  File "C:\iblrig\scripts\bpod_lights.py", line 7, in <module>
    import iblrig.params as params
ModuleNotFoundError: No module named 'iblrig'
Traceback (most recent call last):
  File "C:\iblrig\scripts\create_session.py", line 13, in <module>
    from iblrig.poop_count import poop
ModuleNotFoundError: No module named 'iblrig'

We installed Bonsai from the IBLrig installer with no issue, but separately installing Bonsai 2.3 from the install package we used until July 2019 crashed with an error: "Could not create SSL/TLS secure channel". Bonsai 2.5.1 installed with no issue.

We tried rolling back the repository to the previous release with 'python update.py -v 6.4.1'. This crashed with the following errors (console log):

Update dump.txt

Bonsai crashed out at the end, trying to roll itself back with the following error:

Bonsai Update Crash

We also tried factory resetting the PC with a disk image from Lenovo and reinstalling all of the software. The bug persisted.

To Reproduce
Steps to reproduce the behavior:

  1. Start with a new PC (may not be necessary?), with all available Windows updates and audio and video card drivers installed
  2. Install Anaconda (We tried v2018.12 and 2020.02)
  3. Open Anaconda prompt. Type 'conda install git'
  4. cd to C:/ and run 'git clone https://github.com/int-brain-lab/iblrig.git'
  5. cd into iblrig and run 'python install.py'
  6. run pybpod
  7. Load events from the Bpod board (selecting COM port), test connect to rotary encoder (also selecting COM port)
  8. Select the default user
  9. Under subject _iblrig_test_mouse, select setup _iblrig_tasks > trainingChoiceWorld (or any other protocol besides flush_water)

Expected behavior
The software crashes immediately with the error above - No module named 'iblrig'. A complete console log is attached.
Crash console dump.txt
However, despite the crash, the command prompt hangs, and must be quit with ctrl-C

Desktop (please complete the following information):

  • OS: Win10
  • Sound card: ASUS Xonar AE, driver version 1.1.18
  • Video card: NVidia Quadro P600, driver version 451.77
  • All driver updates applied via Lenovo Vantage
  • All available Windows updates applied
  • Other software installed: Adobe Acrobat Reader DC, VLC Media Player, Point Grey FlyCap2Viewer 2.12.3.2 x86 and x64

Additional context
The command prompt hangs following the errors and needs to be quit with Ctrl-C. During the crash, the state machine's status LED remains blue and pulsing, indicating that it has not yet started a trial.

I noticed that during the initial installation (python install.py), an error was printed to the console (but the install script continued running):
ERROR: Could not install packages due to an EnvironmentError: [WinError 5] Access is denied: 'C:\Users\IBLuser\AppData\Local\Temp\pip-uninstall-_5c0l70s\_cffi_backend.cp37-win_amd64.pyd' Consider using the --user option or check the permissions.

Bug in query of sessions with an atlas acronym using Alyx REST

Querying a brain region that is at the lowest level of the hierarchy returns an empty list.

Examples:
one.alyx.rest('sessions', 'list', atlas_acronym="CA") works but one.alyx.rest('sessions', 'list', atlas_acronym="CA1") does not
one.alyx.rest('sessions', 'list', atlas_acronym="VIS") works but one.alyx.rest('sessions', 'list', atlas_acronym="VISp2/3") does not
one.alyx.rest('sessions', 'list', atlas_acronym="MBmot") works but one.alyx.rest('sessions', 'list', atlas_acronym="SNr") does not

SPIKESORTING_KS2_MATLAB Error

Describe the bug
All my recent ephys sessions show the error below for KS2 spike sorting (in Alyx: https://alyx.internationalbrainlab.org/admin-tasks/status/EphysExtractionPipeline):

2020-09-16 00:12:17,786,786 INFO [tasks.py:68] Starting job <class 'ibllib.pipes.ephys_preprocessing.SpikeSorting_KS2_Matlab'>
2020-09-16 00:28:10,705,705 INFO [ephys_preprocessing.py:132] /home/ibladmin/Documents/PYTHON/iblscripts/deploy/serverpc/kilosort2/task_ks2_matlab.sh /mnt/s/ks2m/DY_016_2020-09-15_001_probe00
2020-09-16 00:47:46,297,297 ERROR [tasks.py:76] Traceback (most recent call last):
File "/home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages/ibllib/pipes/tasks.py", line 72, in run
self.outputs = self._run(**kwargs)
File "/home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages/ibllib/pipes/ephys_preprocessing.py", line 140, in _run
raise RuntimeError('Matlab error ks2 log below:')
RuntimeError: Matlab error ks2 log below:

2020-09-16 00:47:46,298,298 INFO [tasks.py:77] Job <class 'ibllib.pipes.ephys_preprocessing.SpikeSorting_KS2_Matlab'> errored
2020-09-16 00:47:46,298,298 INFO [tasks.py:87] N outputs: 0
2020-09-16 00:47:46,298,298 INFO [tasks.py:88] --- 2128.511780977249 seconds run-time ---

[Bug report] - On specific eid : No data loading when launching TaskQC

Describe the bug
From ibllib master branch, I am trying to launch this:

from ibllib.qc.task_metrics import TaskQC
from ibllib.qc.qcplots import plot_results
eid = '695a6073-eae0-49e0-bb0f-e9e57a9275b9'
qc = TaskQC(eid)

it runs, but qc is basically empty. As a result, the plotting fails.
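
For reference, explicitly loading the data and computing the metrics looks roughly like this; a sketch, assuming the TaskQC API with load_data() and run() as in recent ibllib versions:

from ibllib.qc.task_metrics import TaskQC

eid = '695a6073-eae0-49e0-bb0f-e9e57a9275b9'
qc = TaskQC(eid)
qc.load_data()  # download/extract the trial data the metrics need
outcome, results = qc.run()  # compute metrics; qc.metrics and qc.passed should then be populated
print(outcome)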

The QC seems to have run fine previously as per Alyx: https://alyx.internationalbrainlab.org/admin/actions/ephyssession/695a6073-eae0-49e0-bb0f-e9e57a9275b9/change/

To Reproduce
Code above.

Expected behavior
Plots.

[Bug report] - cronjob on serverPC is aborted

Describe the bug
Since 09-07, no ALF data have been extracted for my behavior sessions (other data, like video, is uploaded to FlatIron). 242 tasks are 'Waiting', which I've never seen for just behavioral extraction.

To Reproduce

My globus server is running. However, I get

ibladmin 15312 15247  0 Sep07 pts/2    00:02:21 python jobs.py run /mnt/s0/Data/Subjects
ibladmin 15579 15478  0 Sep09 ?        00:00:48 python jobs.py create /mnt/s0/Data/Subjects
ibladmin 15580 15478  0 Sep09 ?        00:00:01 python jobs.py kill run
ibladmin 20446 20289  0 Sep07 ?        00:00:01 python jobs.py kill run
ibladmin 20783  2084  0 06:53 pts/6    00:00:00 grep --color=auto jobs.py
ibladmin 32466 32368  0 Sep08 ?        00:00:01 python jobs.py kill run

and

(base) ibladmin@iblserver:/mnt/cshl_grid/ibl$ cat /var/log/oneibl.log
Fetching origin
From https://github.com/int-brain-lab/iblscripts
2ae1a98..ad38ae1 master -> origin/master
e443fbf..3193087 TaskQC_test -> origin/TaskQC_test
fd967f0..5bad264 develop -> origin/develop
Already on 'master'
Your branch is behind 'origin/master' by 8 commits, and can be fast-forwarded.
(use "git pull" to update your local branch)
HEAD is now at 2ae1a98 atlas: ccf 2 xyz conversions. volumes in mlapdv contiguous c-ordering. top and bottom surfaces.
Updating 2ae1a98..ad38ae1
Fast-forward
...data_releases.py => behaviour_data_releases.py} | 3 +-
tests/test_ephys_pipeline.py | 6 +
tests/test_task_qc_extractors.py | 159 +++++++++++++++++++++
tests/test_training_audio.py | 2 +-
tests/test_training_pipeline.py | 6 +
5 files changed, 173 insertions(+), 3 deletions(-)
rename tests/{test_data_releases.py => behaviour_data_releases.py} (94%)
create mode 100644 tests/test_task_qc_extractors.py
Uninstalling ibllib-1.5.9:
Successfully uninstalled ibllib-1.5.9
Uninstalling phylib-2.3a0:
Successfully uninstalled phylib-2.3a0
Collecting git+https://github.com/cortex-lab/phylib.git@ibl_tests
Cloning https://github.com/cortex-lab/phylib.git (to revision ibl_tests) to /tmp/pip-req-build-rz138ywd
Running command git clone -q https://github.com/cortex-lab/phylib.git /tmp/pip-req-build-rz138ywd
Running command git checkout -b ibl_tests --track origin/ibl_tests
Switched to a new branch 'ibl_tests'
Branch 'ibl_tests' set up to track remote branch 'ibl_tests' from 'origin'.
Requirement already satisfied: numpy in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from phylib==2.3a0) (1.16.4)
Requirement already satisfied: scipy in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from phylib==2.3a0) (1.3.0)
Requirement already satisfied: dask in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from phylib==2.3a0) (2.9.1)
Requirement already satisfied: requests in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from phylib==2.3a0) (2.22.0)
Requirement already satisfied: tqdm in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from phylib==2.3a0) (4.36.1)
Requirement already satisfied: toolz in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from phylib==2.3a0) (0.10.0)
Requirement already satisfied: joblib in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from phylib==2.3a0) (0.14.0)
Requirement already satisfied: mtscomp in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from phylib==2.3a0) (1.0.1)
Requirement already satisfied: certifi>=2017.4.17 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from requests->phylib==2.3a0) (2019.3.9)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from requests->phylib==2.3a0) (1.25.3)
Requirement already satisfied: idna<2.9,>=2.5 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from requests->phylib==2.3a0) (2.8)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from requests->phylib==2.3a0) (3.0.4)
Building wheels for collected packages: phylib
Building wheel for phylib (setup.py): started
Building wheel for phylib (setup.py): finished with status 'done'
Stored in directory: /tmp/pip-ephem-wheel-cache-wbntpgc8/wheels/18/f9/78/8492f045f770f5d4ec34b31db6bc89967856ae20dfc3fad26f
Successfully built phylib
Installing collected packages: phylib
Successfully installed phylib-2.3a0
WARNING: You are using pip version 19.1.1, however version 20.2.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Collecting git+https://github.com/int-brain-lab/ibllib.git@master
Cloning https://github.com/int-brain-lab/ibllib.git (to revision master) to /tmp/pip-req-build-x4v1qw9x
Running command git clone -q https://github.com/int-brain-lab/ibllib.git /tmp/pip-req-build-x4v1qw9x
Requirement already satisfied: click>=7.0.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (7.0)
Requirement already satisfied: colorlog>=4.0.2 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (4.0.2)
Requirement already satisfied: flake8>=3.7.8 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (3.7.8)
Requirement already satisfied: globus-sdk>=1.7.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (1.7.1)
Requirement already satisfied: graphviz in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (0.14)
Requirement already satisfied: jupyter>=1.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (1.0.0)
Requirement already satisfied: jupyterlab>=1.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (1.2.6)
Requirement already satisfied: matplotlib>=3.0.3 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (3.1.0)
Requirement already satisfied: mtscomp>=1.0.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (1.0.1)
Requirement already satisfied: numba in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (0.50.1)
Requirement already satisfied: numpy>=1.16.4 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (1.16.4)
Requirement already satisfied: opencv-python in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (4.1.1.26)
Requirement already satisfied: pandas>=0.24.2 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (0.24.2)
Requirement already satisfied: phylib>=2.2 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (2.3a0)
Requirement already satisfied: pyarrow in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (1.0.0)
Requirement already satisfied: pynrrd>=0.4.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (0.4.0)
Requirement already satisfied: requests>=2.22.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (2.22.0)
Requirement already satisfied: scikit-learn>=0.22.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (0.22.1)
Requirement already satisfied: scipy>=1.3.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (1.3.0)
Requirement already satisfied: seaborn>=0.9.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (0.9.0)
Requirement already satisfied: tqdm>=4.32.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ibllib==1.5.9) (4.36.1)
Requirement already satisfied: pyflakes<2.2.0,>=2.1.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from flake8>=3.7.8->ibllib==1.5.9) (2.1.1)
Requirement already satisfied: mccabe<0.7.0,>=0.6.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from flake8>=3.7.8->ibllib==1.5.9) (0.6.1)
Requirement already satisfied: entrypoints<0.4.0,>=0.3.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from flake8>=3.7.8->ibllib==1.5.9) (0.3)
Requirement already satisfied: pycodestyle<2.6.0,>=2.5.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from flake8>=3.7.8->ibllib==1.5.9) (2.5.0)
Requirement already satisfied: six<2.0.0,>=1.10.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from globus-sdk>=1.7.1->ibllib==1.5.9) (1.12.0)
Requirement already satisfied: pyjwt[crypto]<2.0.0,>=1.5.3 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from globus-sdk>=1.7.1->ibllib==1.5.9) (1.7.1)
Requirement already satisfied: jupyter-console in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyter>=1.0->ibllib==1.5.9) (6.1.0)
Requirement already satisfied: notebook in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyter>=1.0->ibllib==1.5.9) (6.0.3)
Requirement already satisfied: ipywidgets in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyter>=1.0->ibllib==1.5.9) (7.5.1)
Requirement already satisfied: qtconsole in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyter>=1.0->ibllib==1.5.9) (4.6.0)
Requirement already satisfied: ipykernel in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyter>=1.0->ibllib==1.5.9) (5.1.4)
Requirement already satisfied: nbconvert in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyter>=1.0->ibllib==1.5.9) (5.6.1)
Requirement already satisfied: jupyterlab-server~=1.0.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyterlab>=1.0->ibllib==1.5.9) (1.0.6)
Requirement already satisfied: jinja2>=2.10 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyterlab>=1.0->ibllib==1.5.9) (2.11.1)
Requirement already satisfied: tornado!=6.0.0,!=6.0.1,!=6.0.2 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyterlab>=1.0->ibllib==1.5.9) (6.0.3)
Requirement already satisfied: cycler>=0.10 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from matplotlib>=3.0.3->ibllib==1.5.9) (0.10.0)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from matplotlib>=3.0.3->ibllib==1.5.9) (2.4.0)
Requirement already satisfied: kiwisolver>=1.0.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from matplotlib>=3.0.3->ibllib==1.5.9) (1.1.0)
Requirement already satisfied: python-dateutil>=2.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from matplotlib>=3.0.3->ibllib==1.5.9) (2.8.0)
Requirement already satisfied: setuptools in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from numba->ibllib==1.5.9) (41.0.1)
Requirement already satisfied: llvmlite<0.34,>=0.33.0.dev0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from numba->ibllib==1.5.9) (0.33.0)
Requirement already satisfied: pytz>=2011k in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from pandas>=0.24.2->ibllib==1.5.9) (2019.1)
Requirement already satisfied: toolz in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from phylib>=2.2->ibllib==1.5.9) (0.10.0)
Requirement already satisfied: joblib in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from phylib>=2.2->ibllib==1.5.9) (0.14.0)
Requirement already satisfied: dask in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from phylib>=2.2->ibllib==1.5.9) (2.9.1)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from requests>=2.22.0->ibllib==1.5.9) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from requests>=2.22.0->ibllib==1.5.9) (2.8)
Requirement already satisfied: certifi>=2017.4.17 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from requests>=2.22.0->ibllib==1.5.9) (2019.3.9)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from requests>=2.22.0->ibllib==1.5.9) (1.25.3)
Requirement already satisfied: cryptography>=1.4; extra == "crypto" in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from pyjwt[crypto]<2.0.0,>=1.5.3->globus-sdk>=1.7.1->ibllib==1.5.9) (2.6.1)
Requirement already satisfied: ipython in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyter-console->jupyter>=1.0->ibllib==1.5.9) (7.5.0)
Requirement already satisfied: jupyter-client in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyter-console->jupyter>=1.0->ibllib==1.5.9) (5.3.4)
Requirement already satisfied: prompt-toolkit!=3.0.0,!=3.0.1,<3.1.0,>=2.0.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyter-console->jupyter>=1.0->ibllib==1.5.9) (2.0.9)
Requirement already satisfied: pygments in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyter-console->jupyter>=1.0->ibllib==1.5.9) (2.4.2)
Requirement already satisfied: Send2Trash in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from notebook->jupyter>=1.0->ibllib==1.5.9) (1.5.0)
Requirement already satisfied: traitlets>=4.2.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from notebook->jupyter>=1.0->ibllib==1.5.9) (4.3.2)
Requirement already satisfied: jupyter-core>=4.6.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from notebook->jupyter>=1.0->ibllib==1.5.9) (4.6.1)
Requirement already satisfied: nbformat in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from notebook->jupyter>=1.0->ibllib==1.5.9) (5.0.4)
Requirement already satisfied: terminado>=0.8.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from notebook->jupyter>=1.0->ibllib==1.5.9) (0.8.3)
Requirement already satisfied: prometheus-client in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from notebook->jupyter>=1.0->ibllib==1.5.9) (0.7.1)
Requirement already satisfied: pyzmq>=17 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from notebook->jupyter>=1.0->ibllib==1.5.9) (18.1.1)
Requirement already satisfied: ipython-genutils in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from notebook->jupyter>=1.0->ibllib==1.5.9) (0.2.0)
Requirement already satisfied: widgetsnbextension~=3.5.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ipywidgets->jupyter>=1.0->ibllib==1.5.9) (3.5.1)
Requirement already satisfied: bleach in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from nbconvert->jupyter>=1.0->ibllib==1.5.9) (3.1.0)
Requirement already satisfied: pandocfilters>=1.4.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from nbconvert->jupyter>=1.0->ibllib==1.5.9) (1.4.2)
Requirement already satisfied: testpath in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from nbconvert->jupyter>=1.0->ibllib==1.5.9) (0.4.4)
Requirement already satisfied: defusedxml in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from nbconvert->jupyter>=1.0->ibllib==1.5.9) (0.6.0)
Requirement already satisfied: mistune<2,>=0.8.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from nbconvert->jupyter>=1.0->ibllib==1.5.9) (0.8.4)
Requirement already satisfied: jsonschema>=3.0.1 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyterlab-server~=1.0.0->jupyterlab>=1.0->ibllib==1.5.9) (3.2.0)
Requirement already satisfied: json5 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jupyterlab-server~=1.0.0->jupyterlab>=1.0->ibllib==1.5.9) (0.9.1)
Requirement already satisfied: MarkupSafe>=0.23 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jinja2>=2.10->jupyterlab>=1.0->ibllib==1.5.9) (1.1.1)
Requirement already satisfied: cffi!=1.11.3,>=1.8 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from cryptography>=1.4; extra == "crypto"->pyjwt[crypto]<2.0.0,>=1.5.3->globus-sdk>=1.7.1->ibllib==1.5.9) (1.12.3)
Requirement already satisfied: asn1crypto>=0.21.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from cryptography>=1.4; extra == "crypto"->pyjwt[crypto]<2.0.0,>=1.5.3->globus-sdk>=1.7.1->ibllib==1.5.9) (0.24.0)
Requirement already satisfied: pexpect; sys_platform != "win32" in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ipython->jupyter-console->jupyter>=1.0->ibllib==1.5.9) (4.7.0)
Requirement already satisfied: decorator in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ipython->jupyter-console->jupyter>=1.0->ibllib==1.5.9) (4.4.0)
Requirement already satisfied: backcall in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ipython->jupyter-console->jupyter>=1.0->ibllib==1.5.9) (0.1.0)
Requirement already satisfied: pickleshare in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ipython->jupyter-console->jupyter>=1.0->ibllib==1.5.9) (0.7.5)
Requirement already satisfied: jedi>=0.10 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from ipython->jupyter-console->jupyter>=1.0->ibllib==1.5.9) (0.13.3)
Requirement already satisfied: wcwidth in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from prompt-toolkit!=3.0.0,!=3.0.1,<3.1.0,>=2.0.0->jupyter-console->jupyter>=1.0->ibllib==1.5.9) (0.1.7)
Requirement already satisfied: ptyprocess; os_name != "nt" in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from terminado>=0.8.1->notebook->jupyter>=1.0->ibllib==1.5.9) (0.6.0)
Requirement already satisfied: webencodings in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from bleach->nbconvert->jupyter>=1.0->ibllib==1.5.9) (0.5.1)
Requirement already satisfied: attrs>=17.4.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jsonschema>=3.0.1->jupyterlab-server~=1.0.0->jupyterlab>=1.0->ibllib==1.5.9) (19.3.0)
Requirement already satisfied: pyrsistent>=0.14.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jsonschema>=3.0.1->jupyterlab-server~=1.0.0->jupyterlab>=1.0->ibllib==1.5.9) (0.15.7)
Requirement already satisfied: importlib-metadata; python_version < "3.8" in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jsonschema>=3.0.1->jupyterlab-server~=1.0.0->jupyterlab>=1.0->ibllib==1.5.9) (1.5.0)
Requirement already satisfied: pycparser in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from cffi!=1.11.3,>=1.8->cryptography>=1.4; extra == "crypto"->pyjwt[crypto]<2.0.0,>=1.5.3->globus-sdk>=1.7.1->ibllib==1.5.9) (2.19)
Requirement already satisfied: parso>=0.3.0 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from jedi>=0.10->ipython->jupyter-console->jupyter>=1.0->ibllib==1.5.9) (0.4.0)
Requirement already satisfied: zipp>=0.5 in /home/ibladmin/Documents/PYTHON/envs/iblenv/lib/python3.7/site-packages (from importlib-metadata; python_version < "3.8"->jsonschema>=3.0.1->jupyterlab-server~=1.0.0->jupyterlab>=1.0->ibllib==1.5.9) (2.2.0)
Building wheels for collected packages: ibllib
Building wheel for ibllib (setup.py): started
Building wheel for ibllib (setup.py): finished with status 'done'
Stored in directory: /tmp/pip-ephem-wheel-cache-xb4262aj/wheels/16/26/c4/9fe3563f3a12839a93e2cad59f3132c2a278dd25a657971670
Successfully built ibllib
Installing collected packages: ibllib
Successfully installed ibllib-1.5.9
WARNING: You are using pip version 19.1.1, however version 20.2.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Namespace(action='kill', dry=False, folder='create', restart=False)
status/kill request sent, waiting for job response
b'ACK STOP'
Job terminated successfully
Namespace(action='create', dry=False, folder='/mnt/s0/Data/Subjects', restart=False)
Connected to https://alyx.internationalbrainlab.org as anneu
(the "Connected to https://alyx.internationalbrainlab.org as anneu" line repeats 95 more times in the log)
ABORT !!

[Feature request] - Add and label vertical ticks (iblapps Bpod QC Viewer)

Is your feature request related to a problem? Please describe.
Information is missing to identify QC problems when using the Bpod viewer in iblapps.

Describe the solution you'd like

  • Vertical lines indicating errorCue trigger times and stimOffTrigger_times.
  • To facilitate visualization, use the same color for trigger and actual time ticks, but a different plotting style (e.g. dashed line for trigger, solid line for actual time).
  • Similarly, use the same color for items belonging to the same event type, but add markers on top of the vertical ticks (e.g. trial start/end: green, dot/triangle; stimOn/Freeze/Off: blue, dot/circle/triangle) to differentiate amongst them and make the plot easier to read.
  • Make the vertical ticks, as well as the legend ticks, slightly thicker so the colors are easier to see.
  • If possible, plot the legend outside the graph (see the sketch after this list).
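
For illustration only, a minimal matplotlib sketch of the proposed styling (hypothetical event times, not iblapps code): one color per event type, a dashed vertical line for the trigger time, a solid line for the actual time, and the legend placed outside the axes.

import matplotlib.pyplot as plt

# Hypothetical example times in seconds; one color per event type.
events = {
    'stimOn': {'trigger': 1.00, 'actual': 1.02, 'color': 'tab:blue'},
    'errorCue': {'trigger': 2.50, 'actual': 2.53, 'color': 'tab:red'},
}

fig, ax = plt.subplots()
for name, ev in events.items():
    # Dashed = trigger time, solid = actual time, same color within an event type.
    ax.axvline(ev['trigger'], color=ev['color'], linestyle='--', linewidth=2, label=f'{name} trigger')
    ax.axvline(ev['actual'], color=ev['color'], linestyle='-', linewidth=2, label=f'{name} actual')
ax.set_xlabel('time (s)')
ax.legend(bbox_to_anchor=(1.02, 1), loc='upper left')  # legend outside the plot area
fig.tight_layout()
plt.show()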

Describe alternatives you've considered
No alternative is possible other than providing this display.

Additional context
Current display (using iblapps > develop branch) attached.

[Bug report] - Issue with launching phy/extracting waveforms for phy

When trying to launch phy, I get the errors below.

I just updated iblenv and I am up to date on the develop branch of iblapps.
I have gotten this for two sessions: '-e aad23144-0e52-4eac-80c5-c4ee2decb198 -p probe01', and '-s KS022 -d 2019-12-10 -n 1 -p probe00'

Traceback (most recent call last):
  File "C:\Installs\envs\iblenv\lib\site-packages\joblib\func_inspect.py", line 281, in filter_args
    arg_dict[arg_name] = arg_defaults[position]
IndexError: tuple index out of range

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "int-brain-lab\iblapps\launch_phy\phy_launcher.py", line 100, in <module>
    launch_phy(str(args.probe_label), eid=str(args.eid))
  File "int-brain-lab\iblapps\launch_phy\phy_launcher.py", line 74, in launch_phy
    gui = controller.create_gui()
  File "C:\Installs\envs\iblenv\lib\site-packages\phy\apps\base.py", line 1638, in create_gui
    self.supervisor.attach(gui)
  File "C:\Installs\envs\iblenv\lib\site-packages\phy\cluster\supervisor.py", line 942, in attach
    gui=gui, sort=gui.state.get('ClusterView', {}).get('current_sort', None))
  File "C:\Installs\envs\iblenv\lib\site-packages\phy\cluster\supervisor.py", line 760, in _create_views
    gui, data=self.cluster_info, columns=self.columns, sort=sort)
  File "C:\Installs\envs\iblenv\lib\site-packages\phy\cluster\supervisor.py", line 916, in cluster_info
    return [self.get_cluster_info(cluster_id) for cluster_id in self.clustering.cluster_ids]
  File "C:\Installs\envs\iblenv\lib\site-packages\phy\cluster\supervisor.py", line 916, in <listcomp>
    return [self.get_cluster_info(cluster_id) for cluster_id in self.clustering.cluster_ids]
  File "C:\Installs\envs\iblenv\lib\site-packages\phy\cluster\supervisor.py", line 745, in get_cluster_info
    out[key] = func(cluster_id)
  File "C:\Installs\envs\iblenv\lib\site-packages\phy\utils\context.py", line 154, in memcached
    out = f(*args, **kwargs)
  File "C:\Users\Steinmetz Lab User\int-brain-lab\iblapps\launch_phy\phy_plugin.py", line 81, in mean_amp_true
    n_spikes_waveforms=100)['data'].data
  File "C:\Installs\envs\iblenv\lib\site-packages\joblib\memory.py", line 568, in __call__
    return self._cached_call(args, kwargs)[0]
  File "C:\Installs\envs\iblenv\lib\site-packages\joblib\memory.py", line 483, in _cached_call
    func_id, args_id = self._get_output_identifiers(*args, **kwargs)
  File "C:\Installs\envs\iblenv\lib\site-packages\joblib\memory.py", line 589, in _get_output_identifiers
    argument_hash = self._get_argument_hash(*args, **kwargs)
  File "C:\Installs\envs\iblenv\lib\site-packages\joblib\memory.py", line 583, in _get_argument_hash
    return hashing.hash(filter_args(self.func, self.ignore, args, kwargs),
  File "C:\Installs\envs\iblenv\lib\site-packages\joblib\func_inspect.py", line 288, in filter_args
    _function_called_str(name, args, kwargs))
ValueError: Wrong number of arguments for _get_waveforms_with_n_spikes(self, cluster_id, n_spikes_waveforms, batch_size_waveforms, current_filter=None):
     _get_waveforms_with_n_spikes(<phy.apps.template.gui.TemplateController object at 0x0000024EA4391848>, 0, n_spikes_waveforms=100) was called.

[Bug report] - TaskQC failing on example session eid

Describe the bug
I tried this code:

from ibllib.qc.task_metrics import TaskQC
from ibllib.qc.qcplots import plot_results
eid = 'fb70ebf7-8175-42b0-9b7a-7c6e8612226e'
qc = TaskQC(eid)
plot_results(qc)

using the master branch of ibllib, and I get this error:

Traceback (most recent call last):
  File "/Users/gaelle/opt/anaconda3/envs/iblenv/lib/python3.7/site-packages/IPython/core/interactiveshell.py", line 3343, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-2-e0e8434bbe0b>", line 5, in <module>
    plot_results(qc)
  File "/Users/gaelle/Documents/Git/int-brain-lab/ibllib-repo/ibllib/qc/qcplots.py", line 31, in plot_results
    n_trials = qc_obj.extractor.data['intervals_0'].size
KeyError: 'intervals_0'

To Reproduce
Launch the code above.

Expected behavior
Plots should appear.

Additional context
This specific eid used to work and was previously given as an example.

[Bug report] - TaskQC(eid) returns empty qc.extractor for specific eid

Describe the bug

Using the ibllib master branch and running this code:

from ibllib.qc.task_metrics import TaskQC
from ibllib.qc.qcplots import plot_results
eid = 'c6db3304-c906-400c-aa0f-45dd3945b2ea'
qc = TaskQC(eid)

The last line runs, but qc.extractor is empty. As a result, no plotting can be done.
This is odd, as the QC seems to have run, as displayed on Alyx: https://alyx.internationalbrainlab.org/admin/actions/ephyssession/c6db3304-c906-400c-aa0f-45dd3945b2ea/change/

To Reproduce
Run the code above.

Expected behavior
qc.extractor should not be empty.
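
For quick triage of both TaskQC reports above, a hedged diagnostic sketch (it only uses the attributes and the 'intervals_0' key that appear in the code and tracebacks above, and is not an official ibllib recipe) would be:

from ibllib.qc.task_metrics import TaskQC
from ibllib.qc.qcplots import plot_results

eid = 'c6db3304-c906-400c-aa0f-45dd3945b2ea'  # eid from the report above
qc = TaskQC(eid)

# plot_results reads qc.extractor.data['intervals_0'], so check that it is present first.
data = getattr(qc.extractor, 'data', None) if qc.extractor is not None else None
if data is not None and 'intervals_0' in data:
    plot_results(qc)
else:
    print('Extractor data is missing or incomplete; plot_results would raise here.')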

[Feature request] - Accommodate recordings with fewer than 385 channels in the atlaselectrophysiology tool

Is your feature request related to a problem? Please describe.
The current data extraction and plots in the GUI only accommodate recordings with 385 channels. This should be made flexible depending on the number of channels used in the recording.

Describe the solution you'd like
Example raw ephys and meta data for a recording with 302 channels are provided here to test the implementation (nSavedChans=302 in the .meta file).

Describe alternatives you've considered
N/A

Additional context
Current error when extracting from raw recordings with fewer than 385 channels:

----> 1 extract_data(ks_path, ephys_path, out_path)

~/Desktop/int-brain-lab/iblapps/atlaselectrophysiology/extract_files.py in extract_data(ks_path, ephys_path, out_path)
110 ks2_to_alf(ks_path, ephys_path, out_path, bin_file=efile.ap,
111 ampfactor=_sample2v(efile.ap), label=None, force=True)
--> 112 extract_rmsmap(efile.ap, out_folder=out_path, spectra=False)
113 if efile.get('lf') and efile.lf.exists():
114 extract_rmsmap(efile.lf, out_folder=out_path)

~/Desktop/int-brain-lab/iblapps/atlaselectrophysiology/extract_files.py in extract_rmsmap(fbin, out_folder, spectra)
82
83 # crunch numbers
---> 84 rms = rmsmap(fbin, spectra=spectra)
85 # output ALF files, single precision with the optional label as suffix before extension
86 if not out_folder.exists():

~/Desktop/int-brain-lab/iblapps/atlaselectrophysiology/extract_files.py in rmsmap(fbin, spectra)
40 # loop through the whole session
41 for first, last in wingen.firstlast:
---> 42 D = sglx.read_samples(first_sample=first, last_sample=last)[0].transpose()
43 # remove low frequency noise below 1 Hz
44 D = dsp.hp(D, 1 / sglx.fs, [0, 1])

~/Desktop/int-brain-lab/ibllib-repo/ibllib/io/spikeglx.py in read_samples(self, first_sample, last_sample, channels)
128 if channels is None:
129 channels = slice(None)
--> 130 return self.read(slice(first_sample, last_sample), channels)
131
132 def read_sync_digital(self, _slice=slice(0, 10000)):

~/Desktop/int-brain-lab/ibllib-repo/ibllib/io/spikeglx.py in read(self, nsel, csel, sync)
109 """
110 darray = np.float32(self._raw[nsel, csel])
--> 111 darray *= self.channel_conversion_sample2v[self.type][csel]
112 if sync:
113 return darray, self.read_sync(nsel)

ValueError: operands could not be broadcast together with shapes (131072,302) (385,) (131072,302)
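
As a starting point for making the channel count flexible, here is a minimal sketch (it assumes only the standard SpikeGLX .meta format of one key=value pair per line; the file name in the usage comment is hypothetical) that reads nSavedChans from the .meta file instead of assuming 385 channels:

from pathlib import Path

def read_n_saved_chans(meta_file):
    # Parse a SpikeGLX .meta file (plain text, one key=value pair per line)
    # and return the saved channel count.
    for line in Path(meta_file).read_text().splitlines():
        key, _, value = line.partition('=')
        if key.strip() == 'nSavedChans':
            return int(value.strip())
    raise KeyError(f'nSavedChans not found in {meta_file}')

# Hypothetical usage:
# n_channels = read_n_saved_chans('raw_ephys.ap.meta')  # returns 302 for the example data above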

[Bug report] - Error when disabling AUTOMATIC_CALIBRATION setting

In each protocol's task_settings.py file, setting AUTOMATIC_CALIBRATION to 'False' used to allow the trainingChoiceWorld protocol to run without a calibration table on the local machine. This was useful for bench testing.

In the current version, this results in a crash:

Traceback (most recent call last):
  File "C:\iblrig_params\IBL\tasks\_iblrig_tasks_trainingChoiceWorld\_iblrig_tasks_trainingChoiceWorld.py", line 21, in <module>
    sph = SessionParamHandler(task_settings, user_settings)
  File "C:\iblrig_params\IBL\tasks\_iblrig_tasks_trainingChoiceWorld\session_params.py", line 81, in __init__
    self.CALIB_FUNC_RANGE = adaptive.init_calib_func_range()
  File "C:\iblrig\iblrig\adaptive.py", line 72, in init_calib_func_range
    min_open_time = PARAMS["WATER_CALIBRATION_RANGE"][0]
TypeError: 'NoneType' object is not subscriptable
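
A possible defensive fix, shown here only as a hedged sketch (the real init_calib_func_range in iblrig/adaptive.py takes no arguments and reads a global PARAMS), is to fall back to a default range when no water-calibration parameters are available:

# Hedged sketch, not the actual iblrig code.
# Assumes WATER_CALIBRATION_RANGE holds [min_open_time, max_open_time].
DEFAULT_CALIB_FUNC_RANGE = [0.0, 1.0]  # hypothetical default open-time range in seconds

def init_calib_func_range(params):
    # Guard against the parameters being None, e.g. on a bench-test rig with
    # AUTOMATIC_CALIBRATION = False and no saved calibration table.
    if params is None or params.get("WATER_CALIBRATION_RANGE") is None:
        return DEFAULT_CALIB_FUNC_RANGE
    min_open_time, max_open_time = params["WATER_CALIBRATION_RANGE"]
    return [min_open_time, max_open_time]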

[Bug report] - Local server doesn't restart - unrecognized mount option

On one of the local servers, adding the SSD drive used as spike-sorting scratch space ended up preventing the server from rebooting.
Instructions to fix in emergency mode:

nano /etc/fstab

Then change the line from:
/dev/nvme1n1p1 /mnt/h0 ext4 0 0

to
/dev/nvme1n1p1 /mnt/h0 ext4 defaults,nofail 0 0

And reboot the computer.
If it still doesn't work, you can also comment out the line by adding a # at the start of it and reboot.
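
Before rebooting, it is usually worth validating the edited /etc/fstab with sudo mount -a, which tries to mount every filesystem listed there and reports any error without requiring a reboot.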

allen_structure_tree.csv not included in ibllib/atlas

When trying to run

from ibllib.pipes import histology

I get the error:

In [1]: from ibllib.pipes import histology

FileNotFoundError Traceback (most recent call last)
in <module>
----> 1 from ibllib.pipes import histology

~/opt/anaconda3/envs/iblenv/lib/python3.7/site-packages/ibllib/pipes/histology.py in <module>
15
16 # origin Allen left, front, up
---> 17 brain_atlas = atlas.AllenAtlas(res_um=25)
18
19

~/opt/anaconda3/envs/iblenv/lib/python3.7/site-packages/ibllib/atlas/atlas.py in __init__(self, res_um, par, scaling, mock, hist_path)
697 image = np.swapaxes(np.swapaxes(image, 2, 0), 1, 2) # image[iap, iml, idv]
698 # resulting volumes origin: x right, y front, z top
--> 699 regions = regions_from_allen_csv(FILE_REGIONS)
700 xyz2dims = np.array([1, 0, 2])
701 dims2xyz = np.array([1, 0, 2])

~/opt/anaconda3/envs/iblenv/lib/python3.7/site-packages/ibllib/atlas/atlas.py in regions_from_allen_csv(csv_file)
746 :return: BrainRegions object
747 """
--> 748 df_regions = pd.read_csv(csv_file)
749 # converts colors to RGB uint8 array
750 c = np.uint32(df_regions.color_hex_triplet.map(

~/opt/anaconda3/envs/iblenv/lib/python3.7/site-packages/pandas/io/parsers.py in parser_f(filepath_or_buffer, sep, delimiter, header, names, index_col, usecols, squeeze, prefix, mangle_dupe_cols, dtype, engine, converters, true_values, false_values, skipinitialspace, skiprows, skipfooter, nrows, na_values, keep_default_na, na_filter, verbose, skip_blank_lines, parse_dates, infer_datetime_format, keep_date_col, date_parser, dayfirst, cache_dates, iterator, chunksize, compression, thousands, decimal, lineterminator, quotechar, quoting, doublequote, escapechar, comment, encoding, dialect, error_bad_lines, warn_bad_lines, delim_whitespace, low_memory, memory_map, float_precision)
674 )
675
--> 676 return _read(filepath_or_buffer, kwds)
677
678 parser_f.__name__ = name

~/opt/anaconda3/envs/iblenv/lib/python3.7/site-packages/pandas/io/parsers.py in _read(filepath_or_buffer, kwds)
446
447 # Create the parser.
--> 448 parser = TextFileReader(fp_or_buf, **kwds)
449
450 if chunksize or iterator:

~/opt/anaconda3/envs/iblenv/lib/python3.7/site-packages/pandas/io/parsers.py in __init__(self, f, engine, **kwds)
878 self.options["has_index_names"] = kwds["has_index_names"]
879
--> 880 self._make_engine(self.engine)
881
882 def close(self):

~/opt/anaconda3/envs/iblenv/lib/python3.7/site-packages/pandas/io/parsers.py in _make_engine(self, engine)
1112 def _make_engine(self, engine="c"):
1113 if engine == "c":
-> 1114 self._engine = CParserWrapper(self.f, **self.options)
1115 else:
1116 if engine == "python":

~/opt/anaconda3/envs/iblenv/lib/python3.7/site-packages/pandas/io/parsers.py in __init__(self, src, **kwds)
1889 kwds["usecols"] = self.usecols
1890
-> 1891 self._reader = parsers.TextReader(src, **kwds)
1892 self.unnamed_cols = self._reader.unnamed_cols
1893

pandas/_libs/parsers.pyx in pandas._libs.parsers.TextReader.__cinit__()

pandas/_libs/parsers.pyx in pandas._libs.parsers.TextReader._setup_parser_source()

FileNotFoundError: [Errno 2] File /Users/ckrasnia/opt/anaconda3/envs/iblenv/lib/python3.7/site-packages/ibllib/atlas/allen_structure_tree.csv does not exist: '/Users/ckrasnia/opt/anaconda3/envs/iblenv/lib/python3.7/site-packages/ibllib/atlas/allen_structure_tree.csv'
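
A quick way to confirm whether the installed package actually ships the file (a sketch only; it uses the package location reported in the traceback):

from pathlib import Path
import ibllib.atlas as atlas

# The atlas code expects the CSV next to the ibllib.atlas package files.
csv_path = Path(atlas.__file__).parent / 'allen_structure_tree.csv'
print(csv_path, 'exists' if csv_path.exists() else 'is missing from the installed package')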
