bilelomrani1 / s5p-tools

Python scripts to download and preprocess air pollution concentration level data acquired from the Sentinel-5P mission


Exception during conversion to L3

3zero2 opened this issue · comments

My query requires a download of 160 GB and 1556 products. After downloading the first 4 files, the script tries to convert them to L3, during which it crashes with the following error:

```
Traceback (most recent call last):
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/file_manager.py", line 199, in _acquire_with_cache_info
    file = self._cache[self._key]
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/lru_cache.py", line 53, in __getitem__
    value = self._cache[key]
KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/home/wrfchem/Desktop/s5p-tools/L2_data/L2__AER_AI/S5P_NRTI_L2__AER_AI_20210703T152705_20210703T153205_19284_01_010400_20210703T160645.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "s5p-request.py", line 411, in <module>
    num_workers=args.num_workers,
  File "s5p-request.py", line 228, in main
    for filename in L2_files_urls
  File "s5p-request.py", line 228, in <listcomp>
    for filename in L2_files_urls
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/api.py", line 500, in open_dataset
    **kwargs,
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 558, in open_dataset
    autoclose=autoclose,
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 380, in open
    return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 328, in __init__
    self.format = self.ds.data_model
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 389, in ds
    return self._acquire()
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/netCDF4_.py", line 383, in _acquire
    with self._manager.acquire_context(needs_lock) as root:
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/file_manager.py", line 187, in acquire_context
    file, cached = self._acquire_with_cache_info(needs_lock)
  File "/home/wrfchem/anaconda3/envs/Sentinel5P/lib/python3.7/site-packages/xarray/backends/file_manager.py", line 205, in _acquire_with_cache_info
    file = self._opener(*self._args, **kwargs)
  File "netCDF4/_netCDF4.pyx", line 2291, in netCDF4._netCDF4.Dataset.__init__
  File "netCDF4/_netCDF4.pyx", line 1855, in netCDF4._netCDF4._ensure_nc_success
FileNotFoundError: [Errno 2] No such file or directory: b'/home/wrfchem/Desktop/s5p-tools/L2_data/L2__AER_AI/S5P_NRTI_L2__AER_AI_20210703T152705_20210703T153205_19284_01_010400_20210703T160645.nc'
```

Re-running the command downloads another 4 files and then crashes again. Any idea what is going wrong here please @bilelomrani1 ? I can't seem to follow what's happening.

Is downloading just 4 files and then trying to convert to L3 normal behaviour? Or should it download all the files before converting to L3?

Thank you.

> Is downloading just 4 files and then trying to convert to L3 normal behaviour? Or should it download all the files before converting to L3?

This is not normal behaviour. As of right now, all L2 products should be downloaded before the resampling starts. Looking at the traceback, it seems the script proceeds to open L2 files that failed to download. Can you post your query so that I can try to reproduce your issue?
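A minimal sketch of the kind of guard this implies: before resampling to L3, keep only the L2 products that actually exist on disk instead of passing every expected path to `xarray.open_dataset`. `existing_products` is a hypothetical helper for illustration, not the repository's actual code.

```python
# Hypothetical guard: filter out L2 products whose download failed
# before handing the file list to the L3 resampling step.
import tempfile
from pathlib import Path


def existing_products(expected_paths):
    """Return only the expected L2 paths present on disk, warning about the rest."""
    paths = [Path(p) for p in expected_paths]
    present = [p for p in paths if p.is_file()]
    missing = [p for p in paths if not p.is_file()]
    for p in missing:
        print(f"Warning: skipping missing L2 product {p.name}")
    return present


# Demo: one real temporary file plus a path that was never downloaded
with tempfile.NamedTemporaryFile(suffix=".nc") as tmp:
    found = existing_products([tmp.name, "/no/such/dir/S5P_missing.nc"])

print(len(found))  # 1
```

A check like this turns a hard crash into a warning, at the cost of silently producing an L3 product from fewer granules, so logging the skipped files matters.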

Thank you for the reply @bilelomrani1

The command I'm using is:

```
python s5p-request.py L2__AER_AI --aoi newmap.geojson --qa 80 --num-workers 5
```

The geojson file is attached: newmap.geojson.zip

Thank you.

I updated the script; your command now works properly on my machine. Please let me know if it works fine on yours.

Thank you @bilelomrani1. The new update works as expected.