"Permission denied" when using se.NwbRecordingExtractor #85

Closed
NilsNyberg opened this issue Dec 3, 2020 · 9 comments

@NilsNyberg

NilsNyberg commented Dec 3, 2020

Hi everyone,

I am trying to build a new spike sorting pipeline using SpikeInterface.

For my recordings I am using the Open Ephys acquisition system, and I am saving the recorded data in the NWB format.

Lastly, I am using the "NWB_Developer_Breakout_Session_Sep2020" tutorial as a starting point. However, I only get as far as the 2nd cell in the Jupyter notebook.

I am running 64-bit Windows 10 Enterprise, and I am launching the Jupyter notebook from the Anaconda Prompt (Miniconda3) with administrative privileges.

Below is the code. I have simply put the experiment_1.nwb file output by the Open Ephys system, together with the settings.xml file, in a new folder called 'nwb-dataset'. This folder is in the same folder as the Jupyter tutorial notebook (i.e., next to the 'open-ephys-dataset' folder, and the original code runs fine on that folder):

import spikeinterface
import spikeinterface.extractors as se 
import spikeinterface.toolkit as st
import spikeinterface.sorters as ss
import spikeinterface.comparison as sc
import spikeinterface.widgets as sw
import matplotlib.pyplot as plt
import numpy as np
%matplotlib notebook

-------------------------------------------------------------------------

#recording_folder = 'open-ephys-dataset/'
#recording = se.OpenEphysRecordingExtractor(recording_folder)

recording_folder = 'nwb-dataset/'
recording = se.NwbRecordingExtractor(recording_folder)

Here is the error message:

---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
<ipython-input-5-5b1002091086> in <module>
      3 
      4 recording_folder = 'nwb-dataset/'
----> 5 recording = se.NwbRecordingExtractor(recording_folder)

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\spikeextractors\extractors\nwbextractors\nwbextractors.py in __init__(self, file_path, electrical_series_name)
    156         se.RecordingExtractor.__init__(self)
    157         self._path = str(file_path)
--> 158         with NWBHDF5IO(self._path, 'r') as io:
    159             nwbfile = io.read()
    160             if electrical_series_name is not None:

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\pynwb\__init__.py in __init__(self, **kwargs)
    244             elif manager is None:
    245                 manager = get_manager()
--> 246         super(NWBHDF5IO, self).__init__(path, manager=manager, mode=mode, file=file_obj, comm=comm)
    247 
    248     @docval({'name': 'src_io', 'type': HDMFIO, 'doc': 'the HDMFIO object for reading the data to export'},

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\hdf5\h5tools.py in __init__(self, **kwargs)
     66         self.__mode = mode
     67         self.__file = file_obj
---> 68         super().__init__(manager, source=path)
     69         self.__built = dict()       # keep track of each builder for each dataset/group/link for each file
     70         self.__read = dict()        # keep track of which files have been read. Key is the filename value is the builder

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\io.py in __init__(self, **kwargs)
     15         self.__built = dict()
     16         self.__source = getargs('source', kwargs)
---> 17         self.open()
     18 
     19     @property

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\hdf5\h5tools.py in open(self)
    682             else:
    683                 kwargs = {}
--> 684             self.__file = File(self.source, open_flag, **kwargs)
    685 
    686     def close(self):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\h5py\_hl\files.py in __init__(self, name, mode, driver, libver, userblock_size, swmr, rdcc_nslots, rdcc_nbytes, rdcc_w0, track_order, fs_strategy, fs_persist, fs_threshold, **kwds)
    425                                fapl, fcpl=make_fcpl(track_order=track_order, fs_strategy=fs_strategy,
    426                                fs_persist=fs_persist, fs_threshold=fs_threshold),
--> 427                                swmr=swmr)
    428 
    429             if isinstance(libver, tuple):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\h5py\_hl\files.py in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
    188         if swmr and swmr_support:
    189             flags |= h5f.ACC_SWMR_READ
--> 190         fid = h5f.open(name, flags, fapl=fapl)
    191     elif mode == 'r+':
    192         fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)

h5py\_objects.pyx in h5py._objects.with_phil.wrapper()

h5py\_objects.pyx in h5py._objects.with_phil.wrapper()

h5py\h5f.pyx in h5py.h5f.open()

OSError: Unable to open file (unable to open file: name = 'nwb-dataset/', errno = 13, error message = 'Permission denied', flags = 0, o_flags = 0)

If I instead use the following code:

#recording_folder = 'open-ephys-dataset/'
#recording = se.OpenEphysRecordingExtractor(recording_folder)

recording_folder = 'nwb-dataset/experiment_1.nwb'
recording = se.NwbRecordingExtractor(recording_folder)

I get the following error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-5-f36c0438bc0a> in <module>
      3 
      4 recording_folder = 'nwb-dataset/experiment_1.nwb'
----> 5 recording = se.NwbRecordingExtractor(recording_folder)

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\spikeextractors\extractors\nwbextractors\nwbextractors.py in __init__(self, file_path, electrical_series_name)
    157         self._path = str(file_path)
    158         with NWBHDF5IO(self._path, 'r') as io:
--> 159             nwbfile = io.read()
    160             if electrical_series_name is not None:
    161                 self._electrical_series_name = electrical_series_name

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\hdf5\h5tools.py in read(self, **kwargs)
    412                                        % (self.source, self.__mode))
    413         try:
--> 414             return call_docval_func(super().read, kwargs)
    415         except UnsupportedOperation as e:
    416             if str(e) == 'Cannot build data. There are no values.':  # pragma: no cover

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in call_docval_func(func, kwargs)
    403 def call_docval_func(func, kwargs):
    404     fargs, fkwargs = fmt_docval_args(func, kwargs)
--> 405     return func(*fargs, **fkwargs)
    406 
    407 

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\io.py in read(self, **kwargs)
     34             # TODO also check that the keys are appropriate. print a better error message
     35             raise UnsupportedOperation('Cannot build data. There are no values.')
---> 36         container = self.__manager.construct(f_builder)
     37         return container
     38 

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\build\manager.py in construct(self, **kwargs)
    236                 # we are at the top of the hierarchy,
    237                 # so it must be time to resolve parents
--> 238                 result = self.__type_map.construct(builder, self, None)
    239                 self.__resolve_parents(result)
    240             self.prebuilt(result, builder)

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\build\manager.py in construct(self, **kwargs)
    850         if build_manager is None:
    851             build_manager = BuildManager(self)
--> 852         obj_mapper = self.get_map(builder)
    853         if obj_mapper is None:
    854             dt = builder.attributes[self.namespace_catalog.group_spec_cls.type_key()]

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\build\manager.py in get_map(self, **kwargs)
    770             data_type = self.get_builder_dt(obj)
    771             namespace = self.get_builder_ns(obj)
--> 772             container_cls = self.get_cls(obj)
    773         # now build the ObjectMapper class
    774         mapper = self.__mappers.get(container_cls)

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\build\manager.py in get_cls(self, **kwargs)
    699         data_type = self.get_builder_dt(builder)
    700         if data_type is None:
--> 701             raise ValueError("No data_type found for builder %s" % builder.path)
    702         namespace = self.get_builder_ns(builder)
    703         if namespace is None:

ValueError: No data_type found for builder root

Thank you in advance, and please let me know if you require any further information from me.

Best,
Nils

@alejoe91
Member

alejoe91 commented Dec 4, 2020

Hi @NilsNyberg

What version of h5py are you using? If it's >= 3, can you try again after downgrading to 2.10.0? The new version of h5py introduced some API changes at different levels and many packages still need to be updated.
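For reference, a quick way to check which h5py version the active environment is using (a minimal check from Python, nothing beyond h5py itself):

import h5py

# If this prints 3.x, downgrading should be as simple as:
#   pip install h5py==2.10.0
print(h5py.__version__)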

Let me know if that works

Alessio

@NilsNyberg
Author

I was using the newest version, but I have now tried downgrading to 2.10.0. I confirmed via pip show h5py that the downgrade was successful. I also opened the Anaconda console with administrative privileges again. However, when I run the code I get the same error message.

I can also mention that I have tried different NWB files of different sizes (ranging from 0.66 MB to 20+ GB) and I get the same error message on all of them. I haven't had a chance to try another computer yet, but I'm going to try this afternoon.

I have attached the message again, in case you want to have a look:

---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
<ipython-input-5-5b1002091086> in <module>
      3 
      4 recording_folder = 'nwb-dataset/'
----> 5 recording = se.NwbRecordingExtractor(recording_folder)

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\spikeextractors\extractors\nwbextractors\nwbextractors.py in __init__(self, file_path, electrical_series_name)
    156         se.RecordingExtractor.__init__(self)
    157         self._path = str(file_path)
--> 158         with NWBHDF5IO(self._path, 'r') as io:
    159             nwbfile = io.read()
    160             if electrical_series_name is not None:

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\pynwb\__init__.py in __init__(self, **kwargs)
    244             elif manager is None:
    245                 manager = get_manager()
--> 246         super(NWBHDF5IO, self).__init__(path, manager=manager, mode=mode, file=file_obj, comm=comm)
    247 
    248     @docval({'name': 'src_io', 'type': HDMFIO, 'doc': 'the HDMFIO object for reading the data to export'},

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\hdf5\h5tools.py in __init__(self, **kwargs)
     66         self.__mode = mode
     67         self.__file = file_obj
---> 68         super().__init__(manager, source=path)
     69         self.__built = dict()       # keep track of each builder for each dataset/group/link for each file
     70         self.__read = dict()        # keep track of which files have been read. Key is the filename value is the builder

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\io.py in __init__(self, **kwargs)
     15         self.__built = dict()
     16         self.__source = getargs('source', kwargs)
---> 17         self.open()
     18 
     19     @property

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\hdf5\h5tools.py in open(self)
    682             else:
    683                 kwargs = {}
--> 684             self.__file = File(self.source, open_flag, **kwargs)
    685 
    686     def close(self):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\h5py\_hl\files.py in __init__(self, name, mode, driver, libver, userblock_size, swmr, rdcc_nslots, rdcc_nbytes, rdcc_w0, track_order, **kwds)
    406                 fid = make_fid(name, mode, userblock_size,
    407                                fapl, fcpl=make_fcpl(track_order=track_order),
--> 408                                swmr=swmr)
    409 
    410             if isinstance(libver, tuple):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\h5py\_hl\files.py in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
    171         if swmr and swmr_support:
    172             flags |= h5f.ACC_SWMR_READ
--> 173         fid = h5f.open(name, flags, fapl=fapl)
    174     elif mode == 'r+':
    175         fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)

h5py\_objects.pyx in h5py._objects.with_phil.wrapper()

h5py\_objects.pyx in h5py._objects.with_phil.wrapper()

h5py\h5f.pyx in h5py.h5f.open()

OSError: Unable to open file (unable to open file: name = 'nwb-dataset/', errno = 13, error message = 'Permission denied', flags = 0, o_flags = 0)

Any other suggestions for what might be causing the permission error?

@alejoe91
Member

alejoe91 commented Dec 4, 2020

Can you try to open it directly with pynwb?

import pynwb

with pynwb.NWBHDF5IO('file-path.nwb', 'r') as io:
    nwbfile = io.read() 

If this fails as well, then the problem is somewhere in the NWB file itself. Maybe you don't have the right file permissions?

@NilsNyberg
Author

NilsNyberg commented Dec 4, 2020

I get the "No data_type found for builder root" error (I have attached it below). Doing a quick google search and found this: NeurodataWithoutBorders/pynwb#1077 suggesting this is likely due to the file being in NWB 1.0 and not 2.0. Does spikeinterface only work with NWB 2.0? If so, is there any way to get spikeinterface to work with NWB 1.0 as well?Afaik that is the only version of NWB that can be outputted by Open Ephys recordings (you can always record in another format and convert to NWB later I suppose, but would be great if spikeinterface would work with all the formats directly outputted by open ephys).

Using the following code

import pynwb
fpath = 'E:/spiketutorials/NWB_Developer_Breakout_Session_Sep2020/nwb-dataset/experiment_1.nwb'
with pynwb.NWBHDF5IO('experiment_1.nwb', 'r') as io:
    nwbfile = io.read() 

gives

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-9-d7594c4c141d> in <module>
      2 fpath = 'E:/spiketutorials/NWB_Developer_Breakout_Session_Sep2020/nwb-dataset/experiment_1.nwb'
      3 with pynwb.NWBHDF5IO('experiment_1.nwb', 'r') as io:
----> 4     nwbfile = io.read()

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\hdf5\h5tools.py in read(self, **kwargs)
    412                                        % (self.source, self.__mode))
    413         try:
--> 414             return call_docval_func(super().read, kwargs)
    415         except UnsupportedOperation as e:
    416             if str(e) == 'Cannot build data. There are no values.':  # pragma: no cover

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in call_docval_func(func, kwargs)
    403 def call_docval_func(func, kwargs):
    404     fargs, fkwargs = fmt_docval_args(func, kwargs)
--> 405     return func(*fargs, **fkwargs)
    406 
    407 

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\io.py in read(self, **kwargs)
     34             # TODO also check that the keys are appropriate. print a better error message
     35             raise UnsupportedOperation('Cannot build data. There are no values.')
---> 36         container = self.__manager.construct(f_builder)
     37         return container
     38 

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\build\manager.py in construct(self, **kwargs)
    236                 # we are at the top of the hierarchy,
    237                 # so it must be time to resolve parents
--> 238                 result = self.__type_map.construct(builder, self, None)
    239                 self.__resolve_parents(result)
    240             self.prebuilt(result, builder)

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\build\manager.py in construct(self, **kwargs)
    850         if build_manager is None:
    851             build_manager = BuildManager(self)
--> 852         obj_mapper = self.get_map(builder)
    853         if obj_mapper is None:
    854             dt = builder.attributes[self.namespace_catalog.group_spec_cls.type_key()]

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\build\manager.py in get_map(self, **kwargs)
    770             data_type = self.get_builder_dt(obj)
    771             namespace = self.get_builder_ns(obj)
--> 772             container_cls = self.get_cls(obj)
    773         # now build the ObjectMapper class
    774         mapper = self.__mappers.get(container_cls)

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
    559             def func_call(*args, **kwargs):
    560                 pargs = _check_args(args, kwargs)
--> 561                 return func(args[0], **pargs)
    562         else:
    563             def func_call(*args, **kwargs):

c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\build\manager.py in get_cls(self, **kwargs)
    699         data_type = self.get_builder_dt(builder)
    700         if data_type is None:
--> 701             raise ValueError("No data_type found for builder %s" % builder.path)
    702         namespace = self.get_builder_ns(builder)
    703         if namespace is None:

ValueError: No data_type found for builder root 

@alejoe91
Member

alejoe91 commented Dec 4, 2020

@bendichter maybe it'd make sense to keep back-compatibility with NWB 1.0? But I guess that would require having an old version of pynwb installed? Is there a way to open 1.0 files with the current pynwb version?

@NilsNyberg at the moment only NWB 2.0 is supported, unfortunately. Sorry about that!
For now I recommend saving Open Ephys data to binary and using the OpenEphysRecordingExtractor.
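Something along these lines should then work (a minimal sketch; the folder name is just a placeholder for wherever the binary-format Open Ephys recording ends up):

import spikeinterface.extractors as se

# Placeholder path: point this at the folder containing the binary-format
# Open Ephys recording.
recording_folder = 'open-ephys-binary-dataset/'
recording = se.OpenEphysRecordingExtractor(recording_folder)

# Quick sanity check on the loaded recording
print(recording.get_num_channels(), recording.get_sampling_frequency())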

@NilsNyberg
Author

Thanks for the help (again!) - yeah, I think for future-proofing I'd better move to binary instead... It's a shame Open Ephys does not yet offer the option to save directly to NWB 2.0, but I think it's a work in progress at least.

@alejoe91
Member

alejoe91 commented Dec 4, 2020

No worries at all :)

@bendichter
Collaborator

@alejoe91, unfortunately there is no way to open NWB 1.0 files with pynwb, and they are unlikely to support it going forward. It would be nice to support Open Ephys data in spikeextractors, particularly data saved in a version of NWB; it would be easiest to do this via h5py. @NilsNyberg, can you share an example file?
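In the meantime, here is a minimal sketch of inspecting such a file directly with h5py to confirm which NWB version it was written with (the exact group layout of Open Ephys NWB 1.0 files would still need to be checked by hand):

import h5py

with h5py.File('experiment_1.nwb', 'r') as f:
    # NWB 1.x stores the version string as a root-level dataset,
    # NWB 2.x stores it as a root attribute, so check both.
    if 'nwb_version' in f:
        print('nwb_version dataset:', f['nwb_version'][()])
    print('root attributes:', dict(f.attrs))
    f.visit(print)  # list every group/dataset path in the file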

@samuelgarcia
Member

Can we close this?
