kadlu issues: https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues

# Issue #15: Straightforward retrieval of interpolation grid from Ocean class
*Oliver Kirsebom, 2021-07-14.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/15

When fetching environmental data (temperature, salinity, bathymetry, waves, etc.), we should include an automatic check of how well the fetched data cover the geographic region of interest. If the coverage is low, a warning should be issued.
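One simple way to quantify coverage, sketched here with an illustrative helper (not part of kadlu), is to bin the fetched coordinates over the requested bounding box and count the non-empty cells:

```python
import numpy as np

def coverage_fraction(lat, lon, south, north, west, east, nbins=10):
    """Fraction of the nbins x nbins cells spanning the bounding box
    that contain at least one fetched data point."""
    counts, _, _ = np.histogram2d(
        lat, lon, bins=nbins, range=[[south, north], [west, east]])
    return np.count_nonzero(counts) / counts.size

# Example: fetched points clustered in one quarter of the requested box
np.random.seed(0)
lat = np.random.uniform(44.0, 44.5, 100)
lon = np.random.uniform(-60.0, -59.5, 100)
frac = coverage_fraction(lat, lon, south=44, north=45, west=-60, east=-59)
if frac < 0.5:
    print(f"warning: fetched data cover only {frac:.0%} of the region")
```

A threshold such as 0.5 would of course need tuning per data source.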
Before attempting to implement our own coverage-measure algorithm, we should check whether scipy has something we can use.

# Issue #47: Precipitation data
*Oliver Kirsebom, 2021-07-14. Assignee: Matthew.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/47

To do:
1. [x] Identify an online source of precipitation data with adequate resolution and coverage (ERA5?)
2. [x] Implement fetch and loading
3. [ ] Add query method in ocean module
# Issue #48: Seafloor lithology data
*Oliver Kirsebom, 2020-10-09. Assignee: Matthew.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/48

To do:
1. [ ] Read the paper [https://pubs.geoscienceworld.org/geology/article-lookup/43/9/795](https://pubs.geoscienceworld.org/geology/article-lookup/43/9/795)
2. [ ] Obtain the seafloor lithology data from the paper (netcdf format, freely available), see [https://www.earthbyte.org/seafloor-lithology-of-the-ocean-basins/](https://www.earthbyte.org/seafloor-lithology-of-the-ocean-basins/)
3. [ ] Implement loading method in Kadlu

# Issue #60: Geophony output format
*Oliver Kirsebom, 2020-04-16.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/60

As we begin implementing functionality for storing/loading geophony modelling results, it will be important to align our efforts with what has already been done by the UQAR team in relation to the Ocean Soundscape Atlas. Below is a description, provided by Pierre, of their format:
We are using the netCDF-4 classic format, which is based on HDF5 and allows files bigger than 2 GB.
In terms of metadata, we follow the CF and ACDD conventions in those files. As I said in yesterday's meeting, we have some in-depth documentation and a summary spreadsheet, which are already available here:
* [Document](https://docs.google.com/document/d/1KsEpqzOrLRPKob6MEimKzi3-H3mcSFwTZO---aRbkVw/edit)
* [Spreadsheet](https://docs.google.com/spreadsheets/d/1-hFnUodF2_52p3SqI6qRIgAKdYzb90B-3iTYWskiUy8/edit#gid=1357152032)
The next thing we will have to decide is what kind of data to transfer between us so that we can use it for the Soundscape Atlas. I don't know exactly what kind of outputs you will be able to produce; I would guess you can make something like 3D maps of the geophony sounds (in dB) for a particular time and frequency (like we do here for the shipping noise). In our case, after checking with the team here (that I've also added in CC), what we would ultimately need on our side is what we call the risk maps. Those are the risks of exceeding a certain sound pressure level threshold, which is basically 1 minus the CDF (the cumulative distribution function) of the sound pressure levels over all the time steps you have over a day or another period of time (and that for each day, 3D point and frequency). The periods of time we use are daily, monthly and annually. We usually make one file per day for the daily risks, one file per month for the monthly risks, etc., even if we have a time dimension in our files. When we compute the CDF, we use that opportunity to add an error distribution over our results, such as a Gaussian distribution or a chi-square distribution. From those risks/CDFs we will be able to compute the quantile maps and the impact risk maps that require the geophony noise.
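For reference, the core of that computation (the exceedance probability, i.e. 1 minus the empirical CDF, per threshold) can be sketched in a few lines; the SPL values below are made up for illustration:

```python
import numpy as np

# Made-up SPL values (dB) at one grid point / frequency over the
# time steps of one day
spl = np.array([95.0, 101.2, 99.8, 103.5, 97.1, 100.4])

# Threshold axis as described above: -15 to 250 dB in 0.5 dB steps
thresholds = np.arange(-15.0, 250.5, 0.5)

# Risk of exceeding each threshold = 1 - empirical CDF
# (fraction of time steps with SPL above the threshold)
risk = (spl[:, None] > thresholds[None, :]).mean(axis=0)

idx = np.searchsorted(thresholds, 100.0)
print(f"P(SPL > 100 dB) = {risk[idx]:.2f}")  # 3 of the 6 time steps exceed 100 dB
```

The error-distribution smoothing Pierre mentions would replace the hard comparison with a convolution over the assumed error kernel.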
We understand that making those files is an extra step and that they can take a good amount of storage space (in our case, with the shipping noise, it takes about 12 TB of storage alone). So, if you prefer, we could make those files from your sound pressure level outputs (we have a very similar netCDF format for them) once they are completed. This may also allow us to reduce the size of the data to transfer.
Finally, it might be good to share the same coordinates/dimensions (or some of them), such as the frequencies, the depth scale and the 2D grid. Note that this is not at all a necessity, since you might have already decided on the coordinates/dimensions you are using and might have different needs than us. We can always interpolate the data if needed. Here is the list of dimensions we are currently using:
* Frequencies: 16, 20, 40, 63, 125, 200, 1000, 10000 Hz
* Depth scale: 0.5, 1.5, 2.5, 5, 7.5, 10, 15, 20, 25, 50, 75, 100, 150, 200, 250, 300, 350, 400, 450, 500, 535 meters
* 2D coordinates: we use a GEBCO grid, regular in lat-lon, with a step of 30 arc seconds (or 1/120 degree). It makes "pixels" of about 0.6 km by 0.9 km. I've uploaded to the Google Drive an empty netCDF file which contains the grid we are using (along with the other dimensions and the metadata fields).
* Sound pressure level thresholds (the x axis of the CDFs): -15 to 250 dB with a step of 0.5 dB
* Time scale: we use 30 minutes between time steps for the shipping noise, since we have a lot of variability, but it doesn't really matter what you choose on your side since it will be converted into daily CDFs anyway. We will need the data for the year 2013 if possible, since we have computed the shipping noise for that same year (actually, we are about to finish this; we still have to complete the high frequencies with Bellhop).

# Issue #65: Unable to fetch CHS bathymetry
*Matthew, 2021-07-14.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/65

It appears that the API endpoint is down; all searches return "failed to execute query". The same issue appears to happen in the CHS web form for manual data retrieval: https://gisp.dfo-mpo.gc.ca/arcgis/rest/services/FGP/CHS_NONNA_100/ImageServer/query
I've contacted CHS to request support on the issue.

# Issue #70: Problem with fetch-load tutorial
*Oliver Kirsebom, 2021-07-16. Assignee: Matthew.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/70

Hi @matt_s , I am attempting to run the fetch_load tutorial. However, when I try to load the GEBCO data, the kernel dies. I'm guessing this is due to lack of memory, but I'm not sure.
This is the block of code that causes the problem:
```python
import kadlu
from datetime import datetime

kwargs = dict(
    south=47, west=-63,
    north=49, east=-61,
    bottom=0, top=0,
    start=datetime(2013, 1, 1), end=datetime(2013, 1, 7))
bathy1, lat1, lon1 = kadlu.load(source='gebco', var='bathymetry', **kwargs)
waveheight2, lat2, lon2, epoch2 = kadlu.load(source='era5', var='waveheight', **kwargs)
```

# Issue #77: Python implementation of RAM
*Oliver Kirsebom, 2020-12-02.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/77

If the PE solver currently implemented in Kadlu proves too slow for large-scale calculations, it could be supplemented by this solver, which might be faster: [https://github.com/marcuskd/pyram](https://github.com/marcuskd/pyram)

# Issue #78: New data source of potential interest
*Oliver Kirsebom, 2021-02-23.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/78

Cristina Tollefsen (scientist at DRDC) has suggested [CONCEPTS](http://science.gc.ca/eic/site/063.nsf/eng/h_97620.html) as a potential source of environmental data for Kadlu. CONCEPTS stands for Canadian Operational Network of Coupled Environmental PredicTion Systems.

# Issue #79: Potential Data Source -- NEMO from SalishSeaCast
*Jillian Anderson, 2022-02-07.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/79

## NEMO & SalishSeaCast Data
As we have spoken about in some of our recent meetings, I've been working on integrating data from the [SalishSeaCast](https://salishsea.eos.ubc.ca/erddap/index.html) with the Northern Strait of Georgia (NSOG) range test data we have. In particular, I have been using the hindcast data from the NEMO v19-05 model. This data is stored in the datasets labelled as "Green, Salish Sea,..." on [ERDDAP server's griddap page](https://salishsea.eos.ubc.ca/erddap/griddap/index.html?page=1&itemsPerPage=1000).
Adding support for this dataset to kadlu could pave the way for the easy addition of other ERDDAP servers, since many (from my data scouring over the last couple of months) appear to share the same quirks.
## Determining Bounds
The one trick I encountered is that these environmental data are indexed by gridX & gridY coordinates rather than longitude/latitude. This was a problem, since the bounds we have been using to access data in kadlu etc. all use lat/lon. So, to figure out what the grid boundaries should be, I first needed to download the [geolocation & bathymetry dataset](https://salishsea.eos.ubc.ca/erddap/griddap/ubcSSnBathymetryV17-02.html). This dataset provides the mapping from (gridX, gridY) to (longitude, latitude). Using this, I was able to find the gridX & gridY ranges (and even the maximum depth) that I could use to query the ERDDAP data servers.
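That lookup can be sketched as follows; the longitude/latitude arrays here are mocked with a small synthetic grid, whereas in practice they would come from the geolocation & bathymetry dataset above:

```python
import numpy as np

# Mock (gridY, gridX) -> (lon, lat) mapping; the real arrays come from
# the ubcSSnBathymetryV17-02 dataset.
grid_y, grid_x = np.mgrid[0:100, 0:50]
lon = -125.0 + 0.01 * grid_x + 0.002 * grid_y   # synthetic, slightly sheared grid
lat = 48.0 + 0.01 * grid_y

def grid_bounds(lon2d, lat2d, west, east, south, north):
    """Smallest gridX/gridY index ranges covering a lon/lat bounding box."""
    inside = ((lon2d >= west) & (lon2d <= east) &
              (lat2d >= south) & (lat2d <= north))
    ys, xs = np.nonzero(inside)
    if ys.size == 0:
        raise ValueError("bounding box does not intersect the model grid")
    return xs.min(), xs.max(), ys.min(), ys.max()

x0, x1, y0, y1 = grid_bounds(lon, lat,
                             west=-124.895, east=-124.705,
                             south=48.205, north=48.395)
print(f"gridX [{x0}:{x1}], gridY [{y0}:{y1}]")
```

Because the NEMO grid is curvilinear, the resulting index box is the smallest rectangle covering the lon/lat box, so it may include some points just outside the requested bounds.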
## Downloading Data
Once you know the bounds you want to use, accessing the data is relatively easy. The simplest way is to programmatically construct the data URL for each variable and download the data directly from the server.
Once it's downloaded, you can load it into an xarray Dataset (assuming you are downloading NetCDF files), add in lon/lat coordinates by inner-merging with the bathymetry data, and then save that to disk. I've been saving to NetCDF files, because that fits with the rest of the range-driver pipeline, but it could obviously be saved to a database instead.
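The URL-construction step can be sketched like this; the dataset ID, variable name and index ranges below are placeholders, while the query syntax follows the standard ERDDAP griddap form:

```python
def griddap_url(server, dataset, var, time_start, time_end,
                y0, y1, x0, x1, depth=None):
    """Build an ERDDAP griddap download URL for NetCDF output.
    Dimension order (time[, depth], gridY, gridX) matches the
    SalishSeaCast datasets; other servers may order dimensions differently."""
    dims = [f"[({time_start}):({time_end})]"]
    if depth is not None:
        dims.append(f"[({depth}):({depth})]")
    dims += [f"[{y0}:{y1}]", f"[{x0}:{x1}]"]
    return f"{server}/griddap/{dataset}.nc?{var}{''.join(dims)}"

url = griddap_url(
    "https://salishsea.eos.ubc.ca/erddap",
    "ubcSSg3DTracerFieldsHourlyV19-05",  # placeholder dataset ID
    "salinity",                          # placeholder variable name
    "2017-01-01T00:00:00Z", "2017-01-02T00:00:00Z",
    y0=600, y1=700, x0=100, x1=200, depth=0)
print(url)
```

The resulting file can then be opened with `xarray.open_dataset` and inner-merged with the bathymetry dataset as described above.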
### Jupyter Notebook
Also, I've attached the Jupyter Notebook I have been using to download the data. It's pretty bare bones and doesn't do a good job of handling the errors that can come from the ERDDAP servers (e.g. specifying depth bounds when a variable doesn't have them, or 502 errors), but it should give an idea of what the download process looks like.
[NEMO_Data_Collection.ipynb](/uploads/b77392b19e7d8526d5d3c1e5e0bf659f/NEMO_Data_Collection.ipynb)

# Issue #80: Unexpected output when running tutorial
*Nikita Kovaloff, 2021-05-07.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/80
```
import kadlu
import numpy as np
from datetime import datetime
C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\data_sources\data_util.py:44: UserWarning: storage location not configured. storage location will be set to C:\Users\s2380\kadlu_data\
warnings.warn(f'{msg} storage location will be set to {storage_location}')
C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\data_sources\data_util.py:44: UserWarning: storage location not configured. storage location will be set to C:\Users\s2380\kadlu_data\
warnings.warn(f'{msg} storage location will be set to {storage_location}')
# ocean boundaries:
bounds = dict(
    start=datetime(2015, 1, 9), end=datetime(2015, 1, 9, 12),
    south=47.8, north=48.8,
    west=-63.4, east=-61.8,
    top=0, bottom=100
)
bounds
{'start': datetime.datetime(2015, 1, 9, 0, 0),
 'end': datetime.datetime(2015, 1, 9, 12, 0),
 'south': 47.8,
 'north': 48.8,
 'west': -63.4,
 'east': -61.8,
 'top': 0,
 'bottom': 100}
o = kadlu.Ocean(**bounds) # instantiate ocean with null values
o.bathy(lat=[48.5, 48.1], lon=[-63, -62.5]) # query bathymetric interpolator for values at given coordinates
array([0., 0.])
o = kadlu.Ocean(load_salinity='hycom', **bounds) # instantiate interpolator with HYCOM salinity data
o.salinity(lat=[48.5, 48.1], lon=[-64, -62.5], depth=[0, 10]) # query interpolator for values at given coordinates
C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\data_sources\data_util.py:44: UserWarning: storage location not configured. storage location will be set to C:\Users\s2380\kadlu_data\
warnings.warn(f'{msg} storage location will be set to {storage_location}')
array([31.29669351, 31.44977994])
o.waveheight(lat=[48.5, 48.1], lon=[-64, -62.5]) # query waveheight interpolator: values remain null
array([0., 0.])
sources = dict(
    load_temp='hycom')
o = kadlu.Ocean(**sources, **bounds)
o
<kadlu.geospatial.ocean.Ocean at 0x267584ccee0>
o.waveheight(lat=[48.5, 48.1], lon=[-64, -62.5]) # query waveheight interpolator
array([0., 0.])
```
##### The expected output for the last line is: array([1.90879009, 2.17006324])

# Issue #84: Error loading GEBCO bathymetry on Windows
*Nikita Kovaloff, 2021-05-13.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/84
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\data_sources\gebco.py", line 54, in fetch_bathymetry
raise NotImplementedError('no windows support for GEBCO\'s zip format. '
NotImplementedError: no windows support for GEBCO's zip format. manually unzip file f{storage_cfg()}/gebco_2020_geotiff.zip and run function again
gully = kadlu.Ocean(**kwargs, **data_sources)
2021-05-08 02:50:33 untested on this platform!
Traceback (most recent call last):
File "<ipython-input-3-de47c1356ea5>", line 1, in <module>
gully = kadlu.Ocean(**kwargs, **data_sources)
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\ocean.py", line 135, in __init__
fetchmap(callback=load_map[f'{v}_{load_arg.lower()}'])
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\index.py", line 85, in __call__
return list(self.__call_generator__(callback=callback, **passkwargs))
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\index.py", line 92, in __call_generator__
if not self.serialized(kwargs, seed): self.insert_hash(kwargs, seed, callback(**passkwargs, **kwargs))
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\data_sources\gebco.py", line 91, in load_bathymetry
self.fetch_bathymetry(south=south, north=north, west=west, east=east)
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\data_sources\gebco.py", line 58, in fetch_bathymetry
unzipper(callback=subprocess.run, args=(f'unzip -D -n {os.path.join(storage_cfg(), "gebco_2020_geotiff.zip")} -d {storage_cfg()}'.split()))
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\index.py", line 85, in __call__
return list(self.__call_generator__(callback=callback, **passkwargs))
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\index.py", line 92, in __call_generator__
if not self.serialized(kwargs, seed): self.insert_hash(kwargs, seed, callback(**passkwargs, **kwargs))
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\subprocess.py", line 505, in run
with Popen(*popenargs, **kwargs) as process:
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 108, in __init__
super(SubprocessPopen, self).__init__(*args, **kwargs)
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\subprocess.py", line 951, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\subprocess.py", line 1420, in _execute_child
hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
FileNotFoundError: [WinError 2] The system cannot find the file specified
gully = kadlu.Ocean(**kwargs, **data_sources)
Traceback (most recent call last):
File "<ipython-input-4-de47c1356ea5>", line 1, in <module>
gully = kadlu.Ocean(**kwargs, **data_sources)
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\ocean.py", line 135, in __init__
fetchmap(callback=load_map[f'{v}_{load_arg.lower()}'])
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\index.py", line 85, in __call__
return list(self.__call_generator__(callback=callback, **passkwargs))
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\index.py", line 92, in __call_generator__
if not self.serialized(kwargs, seed): self.insert_hash(kwargs, seed, callback(**passkwargs, **kwargs))
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\data_sources\gebco.py", line 91, in load_bathymetry
self.fetch_bathymetry(south=south, north=north, west=west, east=east)
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\data_sources\gebco.py", line 54, in fetch_bathymetry
raise NotImplementedError('no windows support for GEBCO\'s zip format. '
NotImplementedError: no windows support for GEBCO's zip format. manually unzip file f{storage_cfg()}/gebco_2020_geotiff.zip and run function again
gully = kadlu.Ocean(**kwargs, **data_sources)
2021-05-08 02:54:10 untested on this platform!
Traceback (most recent call last):
File "<ipython-input-5-de47c1356ea5>", line 1, in <module>
gully = kadlu.Ocean(**kwargs, **data_sources)
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\ocean.py", line 135, in __init__
fetchmap(callback=load_map[f'{v}_{load_arg.lower()}'])
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\index.py", line 85, in __call__
return list(self.__call_generator__(callback=callback, **passkwargs))
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\index.py", line 92, in __call_generator__
if not self.serialized(kwargs, seed): self.insert_hash(kwargs, seed, callback(**passkwargs, **kwargs))
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\data_sources\gebco.py", line 91, in load_bathymetry
self.fetch_bathymetry(south=south, north=north, west=west, east=east)
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\geospatial\data_sources\gebco.py", line 58, in fetch_bathymetry
unzipper(callback=subprocess.run, args=(f'unzip -D -n {os.path.join(storage_cfg(), "gebco_2020_geotiff.zip")} -d {storage_cfg()}'.split()))
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\index.py", line 85, in __call__
return list(self.__call_generator__(callback=callback, **passkwargs))
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\kadlu\index.py", line 92, in __call_generator__
if not self.serialized(kwargs, seed): self.insert_hash(kwargs, seed, callback(**passkwargs, **kwargs))
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\subprocess.py", line 505, in run
with Popen(*popenargs, **kwargs) as process:
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 108, in __init__
super(SubprocessPopen, self).__init__(*args, **kwargs)
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\subprocess.py", line 951, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "C:\Users\s2380\anaconda3\envs\kadlu_env\lib\subprocess.py", line 1420, in _execute_child
hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
FileNotFoundError: [WinError 2] The system cannot find the file specified
kadlu.storage_cfg()
Out[6]: C:\Users\s2380\Desktop\Dal\Acoustics\Thesis\Kadlu\

# Issue #85: Error raised when computing transmission loss
*Nikita Kovaloff, 2021-05-26.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/85

![image](/uploads/cb548902cb774f03de50c4ac8fbfd389/image.png)
```
import kadlu
kadlu.storage_cfg()
import numpy as np
from datetime import datetime
import matplotlib.pyplot as plt
kwargs = dict(
    south=44, west=-61,
    north=46, east=-59,
    bottom=126, top=0,
    start=datetime(2015, 8, 18), end=datetime(2015, 8, 25)
)
data_sources = dict(load_temp='hycom', load_salinity='hycom')
station2 = kadlu.Ocean(**kwargs, **data_sources)
seafloor = {'sound_speed':1700,'density':1.5,'attenuation':0.5}
sound_source = {'freq': 100, 'lat': 45.43, 'lon': -59.76, 'source_depth': 126}
transm_loss = kadlu.transmission_loss(seafloor=seafloor,
                                      propagation_range=30,
                                      **sound_source,
                                      **kwargs,
                                      **data_sources)
transm_loss.calc(rec_depth=[0.1, 30], vertical=True, nz_max=1000)
```

# Issue #88: Issue in plot_export_tutorial and transm_loss_tutorial
*Oliver Kirsebom, 2021-07-15. Assignee: Matthew.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/88

The plot_export_tutorial and transm_loss_tutorial are not working with the latest version of Kadlu. They give the following error:
```
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-3-9326560b7891> in <module>
----> 1 kadlu.plot2D(var='salinity', source='hycom', **kwargs)
AttributeError: module 'kadlu' has no attribute 'plot2D'
```

# Issue #91: issue in geophony_tutorial
*Oliver Kirsebom, 2021-07-16. Assignee: Matthew.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/91

Attempting to initialize kadlu.Ocean results in the following error:
```
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-3-90ca8fdb71cc> in <module>
7
8 # initialize Ocean instance
----> 9 gully = kadlu.Ocean(**bounds, **data_sources)
~/src/anaconda3/envs/kadlu_env/lib/python3.9/site-packages/kadlu/geospatial/ocean.py in __init__(self, south, west, north, east, bottom, top, start, end, **loadvars)
126 for key in loadvars.keys():
127 if not key.lstrip('load_') in vartypes and not key == 'load_precip_type':
--> 128 raise TypeError(f'{key} is not a valid argument. valid datasource args include:\n{", ".join([f"load_{v}" for v in vartypes])}')
129 load_args = [loadvars[f'load_{v}'] if f'load_{v}' in loadvars.keys() else 0 for v in vartypes]
130
TypeError: load_windspeed is not a valid argument. valid datasource args include:
load_bathymetry, load_flux_ocean, load_flux_waves, load_precipitation, load_salinity, load_snowfall, load_stress_ocean, load_temperature, load_water_u, load_water_uv, load_water_v, load_wavedir, load_wavedirection, load_waveheight, load_waveperiod, load_wind_u, load_wind_uv, load_wind_v
```

# Issue #96: Issue loading Ifremer data source
*Matthew, 2022-01-13.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/96

# Issue #97: Run jupyter notebook tutorials as part of CI pipeline
*Oliver Kirsebom, 2022-02-17.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/97

Hey @matt_s ,
How about we try to run the jupyter notebook tutorials as part of the CI pipeline?
I have been working on a bash script to accomplish this for the Ketos tutorials. See here:
https://gitlab.meridian.cs.dal.ca/public_projects/ketos_tutorials/-/blob/ci_script/tutorials/create_database_simpler/run_notebook.sh
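For Kadlu, a minimal version of such a CI step might look like the following sketch (the notebook path and timeout are placeholders; it assumes jupyter and nbconvert are available in the CI image):

```shell
#!/bin/bash
# Execute every tutorial notebook; any cell that raises makes
# nbconvert (and hence the CI job) fail.
set -euo pipefail

for nb in docs/tutorials/*.ipynb; do
    echo "running $nb"
    jupyter nbconvert --to notebook --execute \
        --ExecutePreprocessor.timeout=600 \
        --output "executed_$(basename "$nb")" \
        "$nb"
done
```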
What do you think?

# Issue #98: Benchmarking Kadlu PE solver
*Oliver Kirsebom, 2022-05-26.* https://gitlab.meridian.cs.dal.ca/public_projects/kadlu/-/issues/98

I have written a small script that benchmarks Kadlu's PE solver against the exact analytical solution for the Lloyd's Mirror problem. The transmission loss curves are in nice agreement, see below. It would be nice to include this somewhere in the docs, and perhaps also in the form of a Jupyter Notebook tutorial.
Python script: [test.py](/uploads/045e23a64b1b4f91ee9e4ca90834cfc1/test.py)
Results:
![test](/uploads/bea7c2692546d760ca32297101665ecd/test.png)