Activity for NCO netCDF Operators

  • palle posted a comment on discussion Open Discussion

    @zender thanks for the detailed feedback. I will run more tests next week and get back to you with the results. My guess about the fill and missing values is that you could have an ocean or terrestrial model that doesn't simulate anything on land, or vice versa. Maybe it's for the ocean/land mask. Just a guess.

  • Charlie Zender posted a comment on discussion Open Discussion

    I spoke too soon. The "no variables fit criteria for processing" error is correct. The initial dataset has time as a fixed dimension, not a record dimension, so ncra will not work. To average over the fixed dimension time use ncwa -a time in.nc out.nc. Or change time to a record dimension first, as described in the manual. I take back the statement that NCO was at fault for that. The error message clearly describes the problem.
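
    A minimal sketch of both routes described above (an assumption that the fixed dimension is literally named time; --mk_rec_dmn is the ncks switch that promotes a fixed dimension to a record dimension):
        # Either average over the fixed dimension directly ...
        ncwa -a time in.nc out.nc
        # ... or promote time to a record dimension first, then use ncra
        ncks -O --mk_rec_dmn time in.nc tmp.nc
        ncra -O tmp.nc out.nc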

  • Charlie Zender posted a comment on discussion Open Discussion

    I read the rest of your original post (OP). The large values are due to the missing_value issue, for which I provided a workaround. I'm hopeful that upgrading the NCO will eliminate the segfaults. The last remaining issue is about "no variables fit criteria for processing". This is an NCO issue that I will work on in the near future. Not sure why it hasn't shown up before...

  • Charlie Zender posted a comment on discussion Open Discussion

    I don't have the answers to all these questions. The presence of NC_STRING is difficult for many applications to handle, though it's working fine for me on MacOS with NCO 5.3.6. There is, however, one crucial problem with the metadata. Namely, the geophysical field contains both a missing_value and a _FillValue attribute and the values of these attributes are distinct. It is silly to have a distinct missing_value for model data since models can predict values in all gridcells. I do not know what the...
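
    The workaround mentioned above is not quoted in this excerpt; one possible approach (an assumption, not necessarily the exact command given in the thread) is to delete the conflicting missing_value attribute so that _FillValue alone marks missing data:
        # Delete missing_value from every variable (a blank variable name means all variables)
        ncatted -O -a missing_value,,d,, in.nc out.nc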

  • palle posted a comment on discussion Open Discussion

    Hi again, I managed to install 5.3.6. I tried to skip removing the string variables but I still get a segmentation fault. Caught signal 11 (Segmentation fault: invalid permissions for mapped object at address 0x7ff80903b39f) ==== backtrace (tid: 8619) ==== 0 /home/a/miniconda3/envs/ncoenv/bin/../lib/./././libucs.so.0(ucs_handle_error+0x2fd) [0x7f91fe33e84d] 1 /home/a/miniconda3/envs/ncoenv/bin/../lib/./././libucs.so.0(+0x2fa3f) [0x7f91fe33ea3f] 2 /home/a/miniconda3/envs/ncoenv/bin/../lib/./././libucs.so.0(+0x2fc0a)...

  • Charlie Zender posted a comment on discussion Open Discussion

    You can either run on a different machine with a newer version, figure out how to install a newer version on your current machine, or build from source. Your sysadmin may know how to update the current version with sudo apt ... but I do not. Until/unless you upgrade, you may not be able to get this stuff working. I admit that I thought 5.2.1 would be new enough to include all the relevant fixes for NC_STRING but I may be wrong about that.

  • palle posted a comment on discussion Open Discussion

    Hi Charlie, Thanks for looking into this. I have version 5.2.1:
        sudo apt install nco
        Reading package lists... Done
        Building dependency tree... Done
        Reading state information... Done
        nco is already the newest version (5.2.1-1build2).
        0 upgraded, 0 newly installed, 0 to remove and 207 not upgraded.
        $ ncks --version
        NCO netCDF Operators version 5.2.1 "Shabu Shabu" built by buildd on lcy02-amd64-080 at Apr 1 2024 07:01:37
        ncks version 5.2.1
    I am on WSL on Windows 10. Does this mean I have to build my...

  • Charlie Zender posted a comment on discussion Open Discussion

    Hello palle, Thanks for your report. I can reproduce some though not all of the issues you encounter. For example, this reports no errors for me: ncwa -O -a time,member ~/ph_CMIP6_historical_mon_185001-201412.nc ~/foo.nc What version of NCO are you using? Please upgrade to the latest possible. I will refrain from saying more about these files until you confirm the NCO version. Charlie

  • palle modified a comment on discussion Open Discussion

    Hi All, I am trying to perform several operations with the .nc files associated with the latest IPCC report (AR6) that are used in the IPCC Atlas. The files are here: https://digital.csic.es/handle/10261/332744 and the problem is the same on all of them. What I am trying to do is, for example, select only March over a period of consecutive years and average over these years and over the model ensemble. Another operation is averaging over consecutive years and the ensemble. If I use the command ncwa...

  • palle posted a comment on discussion Open Discussion

    Hi All, I am trying to perform several operations with the .nc files associated with the latest IPCC report (AR6) that are used in the IPCC Atlas. The files are here: https://digital.csic.es/handle/10261/332744 and the problem is the same on all of them. What I am trying to do is, for example, select only March over a period of consecutive years and average over these years and over the model ensemble. Another operation is averaging over consecutive years and the ensemble. If I use the command ncwa...

  • NCO netCDF Operators released /nco-5.3.6.tar.gz

  • NCO netCDF Operators released /nco-5.3.5.tar.gz

  • Andreas Dobler posted a comment on discussion Open Discussion

    Great! Thanks for the efforts, really appreciated!

  • Charlie Zender posted a comment on discussion Open Discussion

    I confirm that the patch made to netcdf-c today fixes the issue raised in the original post. All hail Dave Allured for figuring out that issue and posting the patch.

  • Andreas Dobler posted a comment on discussion Open Discussion

    Thanks for your help! I have opened a GitHub issue now. https://github.com/Unidata/netcdf-c/issues/3154 Feel free to comment or follow :)

  • Charlie Zender posted a comment on discussion Open Discussion

    The plot thickens. Since the R implementation is completely independent of NCO except for relying on the same netCDF-C library, it's unlikely that the issue is due to problems in R or NCO. You have enough to submit a GitHub issue to netCDF. I haven't the foggiest idea why it works in Python. More eyes on the issue would be good and would lead to its eventual solution, I'm sure.

  • Andreas Dobler posted a comment on discussion Open Discussion

    If I do this in Python:
        import netCDF4
        file = "one-letter-att.nc"
        with netCDF4.Dataset(file, 'r+') as nc_file:
            nc_file.history = ""
    it works:
        ncks -M one-letter-att.nc | grep history
        :history = "" ;
    If I do it in R:
        library(ncdf4)
        nc <- nc_open("one-letter-att.nc", write=T)
        ncatt_put(nc, 0, "history", "")
        nc_close(nc)
    it doesn't work:
        ncks -M one-letter-att.nc | grep history
        :history = "c" ;

  • Charlie Zender posted a comment on discussion Open Discussion

    This is odd. I can reproduce the problem with your input dataset (and any input dataset) that is netCDF4. However, netCDF3 datasets (including yours, if I first convert it to netCDF3) do not produce this problem. Definitely a bug. NCO uses the same source code regardless of filetype but libnetcdf.a uses different internal routines to do this on netCDF3 vs. netCDF4 files. Behavior is consistent with an off-by-one error in the netCDF4 version of nc_put_att_char(). I hate to point fingers, though, and...
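
    A possible stopgap while the library bug is open (an assumption based on the netCDF3 observation above, not a fix prescribed in the thread; it assumes the file contains no netCDF4-only types):
        ncks -3 one-letter-att.nc tmp3.nc                  # convert to netCDF3, which uses the unaffected code path
        ncatted -h -O -a history,global,o,c,"" tmp3.nc     # set the empty attribute there
        ncks -4 tmp3.nc fixed.nc4                          # convert back to netCDF4 if needed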

  • Andreas Dobler posted a comment on discussion Open Discussion

    Hi. I have upgraded to ncatted version 5.3.4 but the problem still occurs. Note: it only happens when I try to set an empty attribute AFTER I have set a one-letter attribute. Original test-file (attached):
        ncks -M in.nc | grep history
        :history = "dummy" ;
    Changing this to an empty string works:
        ncatted -h -O -a history,global,o,c,"" in.nc works.nc
        ncks -M works.nc | grep history
        :history = "" ;
    Setting a one-letter attribute:
        ncatted -h -O -a history,global,o,c,"a" in.nc one-letter-att.nc
        ncks -M...

  • Charlie Zender posted a comment on discussion Open Discussion

    Thanks for reporting this. I cannot replicate it with the latest version (5.3.4) so I suggest you upgrade. If the problem still occurs, please LMK and send me the file you use and I will look into it further.
        zender@spectral:~$ ncatted -h -O -a history,global,o,c,"" ~/nco/data/in.nc ~/foo.nc
        zender@spectral:~$ ncks -M ~/foo.nc | grep history
        :history = "" ;
        zender@spectral:~$

  • Andreas Dobler posted a comment on discussion Open Discussion

    Hi. I've just found this strange behaviour and was wondering if someone can replicate it and whether there is a fix for it. Setting the global history attribute to a single letter, e.g. "a" (but it happens with all attributes):
        ncatted -h -O -a history,global,o,c,"a" $file
        ncdump -h $file | grep history
        :history = "a" ;
    Set it to blank:
        ncatted -h -O -a history,global,o,c,"" $file
        ncdump -h $file | grep history
        :history = "c" ;
    There is always a "c"! When I set it to something longer, e.g. ncatted -h -O...

  • Charlie Zender posted a comment on discussion Open Discussion

    I'm glad this workaround works for you. Per-chunk access would indeed be a nice feature but is not well supported by the netCDF API, AFAIK. In any case I wanted to point out that the slow speed is mainly due to the compression/decompression, so you might consider moving to Zstandard instead of DEFLATE. Just add --cmp=shf|zstd.
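
    For example (the codec string is copied from the comment above; the file names are placeholders):
        # Same ensemble average, but writing shuffle + Zstandard output instead of DEFLATE
        nces -O -4 --cmp='shf|zstd' member_*.nc ens_avg_zstd.nc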

  • Johannes Timm posted a comment on discussion Open Discussion

    Hi Charlie, Thanks for your detailed reply. I'm going the route of reading subsets of the variables, since I assume fewer issues when merging them afterwards. I did a test with only one variable (i.e., -v var1), which succeeded, used about 82 GB of RAM, and took 1h12min. I suspect the memory requirements for the other variables will be very similar. A naive extrapolation, 82 GB x 23 = 1.89 TB, might have been the practical requirement for my dataset. I suspect I misunderstood the internal workings of nces,...

  • Charlie Zender posted a comment on discussion Open Discussion

    Hi Johannes, This is a good question. nces computes its statistics one file at a time, not one gridpoint at a time, as explained here. This can require a lot of RAM, as much as twice the size of an uncompressed input dataset. 1 TB of RAM is more than that, so I suspect the uncompressed files are significantly larger than the compressed files. In such a case the OOM error can only be circumvented by computing the ensemble average of subsets of each dataset with, e.g., nces -L 7 -4 -D 5 -v var1...var5...
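
    A sketch of that variable-subsetting strategy (variable and file names hypothetical; ncks -A appends the second partial result into the first):
        # Ensemble-average a few variables at a time, then merge the partial results
        nces -O -4 -L 7 -v var1,var2,var3 member_*.nc ens_sub1.nc
        nces -O -4 -L 7 -v var4,var5      member_*.nc ens_sub2.nc
        ncks -A ens_sub2.nc ens_sub1.nc    # ens_sub1.nc now holds the ensemble mean of all five variables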

  • Johannes Timm posted a comment on discussion Open Discussion

    Hi, I'm encountering out-of-memory problems with version 5.3.4 (conda) when computing an ensemble average (and other statistics) of a larger number of model realisations. I have 42 compressed netCDF files (each 127 GB) of the same size (variables, dimensions), which contain 23 4D double variables. The dimensions are: time=UNLIMITED (672 currently), lon=630, lat=387, layer_number=25. Due to land points the files contain a significant amount of fill values. Due to compression the individual...

  • NCO netCDF Operators released /nco-5.3.4.tar.gz

  • Charlie Zender posted a comment on discussion Open Discussion

    The two main ways NCO could support LZ4 are via the CCR and via netCDF. IIRC the CCR needs some updates before it supports LZ4, but most of the work has been done. The code to support the CCR LZ4 is already in NCO (though untested). The easiest way for NCO would be if libnetCDF added LZ4 support similar to its support of Zstandard and Bzip2. Adding that feature is fairly labor-intensive since it touches many parts of the code. Anyway, once CCR and/or libnetCDF support LZ4 then NCO support would only...

  • Dan Kokron posted a comment on discussion Open Discussion

    Given that the HDF5 LZ4 filter appears to be fixed (https://github.com/HDFGroup/hdf5_plugins/pull/186), what steps are needed before LZ4 can be used from NCO?

  • Charlie Zender posted a comment on discussion Developers

    Glad to hear it!

  • Erika posted a comment on discussion Developers

    Thank you Charlie! That worked! I was stymied again by the environment element of conda.

  • Charlie Zender posted a comment on discussion Developers

    My understanding is that this will not happen if all of your packages (including libgsl) are from conda-forge. This works for thousands of users. My suspicion is unchanged since 3 years ago---I suspect you are mixing packages from within and without conda-forge or other sources (e.g., MacPorts, HomeBrew). One way to get it right is to create a conda environment for just NCO:
        conda create -n nco -c conda-forge nco
        conda activate nco

  • Erika modified a comment on discussion Developers

    Hi! It's now 3 years from this original post. I just installed NCO in my home directory on an HPC through "conda install conda-forge::nco" and got this same error: ncks: error while loading shared libraries: libgsl.so.25: cannot open shared object file: No such file or directory I have libgsl.so.27 .... I could 'downgrade' as suggested above to 25, but wanted to ping this again as others might have this issue if libgsl.so.27 is more common. Adding - I'm not finding much help online on how to do this...

  • Erika posted a comment on discussion Developers

    Hi! It's now 3 years from this original post. I just installed NCO in my home directory on an HPC through "conda install conda-forge::nco" and got this same error: ncks: error while loading shared libraries: libgsl.so.25: cannot open shared object file: No such file or directory I have libgsl.so.27 .... I could 'downgrade' as suggested above to 25, but wanted to ping this again as others might have this issue if libgsl.so.27 is more common

  • Nir Krakauer posted a comment on discussion Help

    Thank you, Charlie! It worked for me with slight modification:
        ncremap -G latlon=72,144#lat_typ=uni#lon_typ=grn_wst -g ~/grd_72x144.nc
        ncremap -G latlon=181,360#lat_typ=cap#lon_typ=grn_ctr -g ~/grd_181x360.nc
        ncremap -R '--rgr lat_nm=Y#lon_nm=X#lat_dmn_nm=Y#lon_dmn_nm=X' --grd_src=${HOME}/grd_181x360.nc --grd_dst=${HOME}/grd_72x144.nc mean_f.nc mean_regrid.nc

  • Charlie Zender posted a comment on discussion Help

    Hi Nir, Your syntax is incorrect. The -g option expects the name of the grid file to output. Please read https://acme-climate.atlassian.net/wiki/spaces/DOC/pages/754286611/Regridding+E3SM+Data+with+ncremap#Prototypical-Regridding-IV:-Manual-Grid-file-Generation These commands should work for you:
        ncremap -G latlon=72,144#lat_typ=uni#lon_typ=grn_wst -g ~/grd_72x144.nc
        ncremap -R '--rgr lat_nm=Y#lon_nm=X#lat_dmn_nm=Y#lon_dmn_nm=X' -d ~/mean_f.nc -g ~/grd_181x360.nc
        ncremap --grd_src=${HOME}/grd_181x360.nc...

  • Nir Krakauer posted a comment on discussion Help

    I'm trying to regrid some forecasts from the IRI Data Library (sample posted at https://nirkrakauer.net/misc/mean_f.nc ) using ncremap --alg_typ=ncoaave -R "--rgr lat_nm=Y --rgr lon_nm=X" -G latlon=181,360#lat_typ=cap#lon_typ=grn_ctr -g latlon=72,144#lat_typ=uni#lon_typ=grn_wst mean_f.nc mean_regrid.nc and the result is Input #00: mean_f.nc ncks: ERROR nco_grd_nfr() unable to identify latitude and/or longitude variable. HINT: Potential causes and workarounds for this include: 1. Coordinate variables...

  • Charlie Zender posted a comment on discussion Help

    Your memory is sharp! I second the thanks to you for the references to older discussions. I have no plans to change NCO's behavior regarding NUL termination. There sure is a lot to say about NUL (meant ironically).

  • Dave Allured posted a comment on discussion Help

    I remember this. Over the years, NCO responded to various and contradictory user requests, to either include or omit trailing NUL's on strings of CHAR type. NCO versions went back and forth at least twice, IIRC. It is not helpful at all, that the Netcdf Users Guide advocates no less than THREE different and potentially conflicting ways to terminate CHAR strings. https://docs.unidata.ucar.edu/nug/current/best_practices.html#bp_Strings-and-Variables-of-type-char Here are a couple of the old conversations,...

  • Charlie Zender posted a comment on discussion Help

    Yes, the trailing NUL character in character attributes is a "feature" that I added to NCO in 2012, as shown in the ChangeLog file:
        2012-09-03 Charlie Zender <zender@uci.edu>
        * Restore trailing NUL to append mode, yet overwrite previous trailing NUL
        2012-09-02 Charlie Zender <zender@uci.edu>
        * Add trailing NUL character to ncatted in create/modify/overwrite modes
    I cannot remember what prompted me to make this change. BTW, congratulations for having a 14-year old version of NCO running (until rec...

  • Matt Thompson posted a comment on discussion Help

    This is an odd one discovered by a user of some data that colleagues of mine produce. Note: my guess is this isn't actually an issue in NCO, but probably a difference in HDF5/netCDF, but we want to make sure. Namely, we recently had an OS upgrade on our cluster to SLES15, and this forced quite a few older running models to move to newer compilers, etc. And with that, newer HDF5/netCDF/NCO because, well, I couldn't figure out how to build 10-year-old libraries with recent compilers. So, essentially, what...

  • Charlie Zender posted a comment on discussion Help

    Try this and see if it fixes things: ncremap -P elm -m ${output_dir}${mapfile} -i ${input_dir}${lnd_input_file} -o ${output_dir}${lnd_output_file}

  • Yuan Sun posted a comment on discussion Help

    Dear NCO staff, I tried to use E3SMv2 ensemble data, where the land model output remains on the ne30 grid. Do you know how to get a map for regridding it to 0.9x1.25 (CESM's default 1-degree mesh)? I attached my bash commands below, but the plot shows a difference and I do not know how to fix it. I am using NCO 5.2.7. Thanks for any comments. Best, Yuan
        module load jaspy
        home_path='/gws/nopw/j04/duicv/yuansun/'
        drc_in_e3sm=${home_path}dataset/e3sm/inputdata/share/meshes/homme/
        drc_in_cesm=${home_path}dataset/inputdata/share/scripgrids/...

  • NCO netCDF Operators released /nco-5.3.3.tar.gz

  • henry Butowsky posted a comment on discussion Developers

    Hi Todd, Not sure exactly what you want? For now I will assume you want an array of the number of uwnd observations for each timestamp, totaled over lat/lon:
        nbr_good[time,lat,lon]=0;
        where(uwnd>=1) nbr_good=1;
        nbr_good_month=nbr_good.total($lat,$lon);
    ...Henry
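
    A sketch of how the snippet above might be invoked, plus a variant (an assumption, not part of Henry's reply) that totals over time to produce the per-gridcell map Todd asked about; it assumes uwnd has dimensions (time,lat,lon):
        # Count of observations per timestep, summed over lat/lon (Henry's snippet)
        ncap2 -O -s 'nbr_good[time,lat,lon]=0; where(uwnd>=1) nbr_good=1; nbr_good_month=nbr_good.total($lat,$lon);' uwnd.nobs.nc counts.nc
        # Map of the number of months with data at each gridcell (total over time instead)
        ncap2 -O -s 'nbr_good[time,lat,lon]=0; where(uwnd>=1) nbr_good=1; nbr_map=nbr_good.total($time);' uwnd.nobs.nc map.nc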

  • Todd Mitchell modified a comment on discussion Developers

    Hi Henry, A few years later, I have bumped into your solution for me. I would like to generate a map of the number of months with data from monthly data, uwnd(time,lat,lon). 'uwnd' is a monthly number of observations variable. I tried ncap2 -h -s 'where(uwnd>=1)uwnd=1;nbr_good=uwnd.total();' uwnd.nobs.nc b.nc which yielded a single value, nbr_good, and not a map. What do I modify to get a map of the number of months with data? Thank you!

  • Todd Mitchell modified a comment on discussion Developers

    Hi Henry, A few years later, I have bumped into your solution for me. I would like to generate a map of the number of months with data from monthly data, uwnd(time,lat,lon). 'uwnd' is a monthly number of observations variable. I tried ncap2 -h -s 'where(uwnd>=1)uwnd=1;nbr_good=uwnd.total();' uwnd.nobs.nc b.nc which yielded a single value nbr_good, and not a map. What do I modify to get a map of the number of months with data? Thank you!

  • Todd Mitchell posted a comment on discussion Developers

    Hi Henry, A few years later, I have bumped into your solution for me. I would like to generate a map of the number of months with data from monthly data, uwnd(time,lat,lon). 'uwnd' is a monthly number of observations variable. I tried ncap2 -h -s 'where(uwnd>=1)uwnd=1;nbr_good=uwnd.total();' uwnd.nobs.nc b.nc which yielded a scalar, nbr_good. What do I modify to get a map of the number of months with data? Thank you!

  • Charlie Zender posted a comment on discussion Developers

    Thanks Dave. That looks like a super helpful guide to this long-awaited migration. I wouldn't have known about it except for you.

  • Dave Allured posted a comment on discussion Developers

    Posting for info only, in case anyone wants to work on antlr2 --> antlr4 migration. https://tomassetti.me/migrating-from-antlr2-to-antlr4/

  • Dan Kokron posted a comment on discussion Open Discussion

    ncpdq, I'll try that. Thanks!

  • Charlie Zender posted a comment on discussion Open Discussion

    The segfault is expected. Sorry. I suggest you use ncpdq to permute the files so that time is the first dimension. Then concatenate with ncrcat. Then permute back to the desired order. http://nco.sf.net/nco.html#dmn_cat
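
    A sketch of that sequence using the file and dimension names from the original post below (treat the exact dimension ordering as an assumption):
        # Make time the leading dimension in each file
        ncpdq -O -a time,batch test_RecTime.nc  test1_perm.nc
        ncpdq -O -a time,batch test2_RecTime.nc test2_perm.nc
        # Concatenate along the record dimension, then restore the original order
        ncrcat -O test1_perm.nc test2_perm.nc merge_perm.nc
        ncpdq -O -a batch,time merge_perm.nc merge2.nc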

  • Dan Kokron posted a comment on discussion Open Discussion

    I have two files that I want to concatenate along a record dimension. The record dimension is not the slowest dimension. The first file looks like the following. The second file has 244 times that pick up where the first file leaves off.
        dimensions:
            batch = 1 ;
            time = UNLIMITED ; // (245 currently)
            lat = 181 ;
            lon = 360 ;
            level = 13 ;
        variables:
            float \10m_u_component_of_wind(batch, time, lat, lon)
        ncrcat test_RecTime.nc test2_RecTime.nc merge2.nc
        Segmentation fault (core dumped)
    The core dump isn't...

  • Dan Kokron modified a comment on discussion Open Discussion

    Converting to netCDF3, doing the variable and dimension conversions then converting back to netCDF4 does allow me to move forward. ncks -3 -C -x -v number,lat_bnds,lon_bnds,gw,expver,area test.nc sst.nc ncrename -O -d latitude,lat -d longitude,lon -d valid_time,time sst.nc ncrename -O -v latitude,lat -v longitude,lon -v valid_time,time sst.nc ncks -4 sst.nc sst.nc4

  • Dan Kokron posted a comment on discussion Open Discussion

    Converting to netCDF3, doing the variable conversions then converting back does allow me to move forward. ncks -3 -C -x -v number,lat_bnds,lon_bnds,gw,expver,area test.nc sst.nc ncrename -O -d latitude,lat -d longitude,lon -d valid_time,time sst.nc ncrename -O -v latitude,lat -v longitude,lon -v valid_time,time sst.nc ncks -4 sst.nc sst.nc4

  • Charlie Zender posted a comment on discussion Open Discussion

    p.s. Feel free to weigh-in with what you think should be done on the netCDF-C issue.

  • Charlie Zender posted a comment on discussion Open Discussion

    Unfortunately this appears to be a manifestation of the dreaded netCDF rename bug: http://nco.sf.net/nco.html#bug_nc4_rename https://github.com/Unidata/netcdf-c/issues/597 You can determine whether this is the case by transferring all the variables to a netCDF3 file and then renaming. That will not be possible with string variables, though.

  • Dan Kokron posted a comment on discussion Open Discussion

    Forgot to mention my build particulars. ncrename -r NCO netCDF Operators version 5.2.4 "Kamehameha I" Linked to netCDF library version 4.9.2 compiled Dec 23 2024 16:26:39 . . Check _FillValue Yes http://nco.sf.net/nco.html#mss_val Community Codec Repo No http://github.com/ccr/ccr DAP support No http://nco.sf.net/nco.html#dap Debugging: Custom No Pedantic, bounds checking (slowest execution) Debugging: Symbols No Produce symbols for debuggers (e.g., dbx, gdb) GNU Scientific Library Yes http://nco.sf.net/nco.html#gsl...

  • Peter Miller posted a comment on discussion Help

    See related post: https://sourceforge.net/p/nco/discussion/9830/thread/6027c6a3/?limit=25 I had the same problem with ncra not recognising the fill value when it was NaN, so I changed it to -999: ncap2 --script 'myvar=myvar; myvar.change_miss(-999)' in_file.nc out_file.nc

  • Peter Miller posted a comment on discussion Help

    Thanks, Chris (from many years ago). I had the same problem with ncra not recognising the fill value when it was NaN, so I changed it to -999: ncap2 --script 'myvar=myvar; myvar.change_miss(-999)' in_file.nc out_file.nc

  • Dan Kokron modified a comment on discussion Open Discussion

    I have a file with a time dimension and variable called "valid_time". I'd need to rename both of these because I will be adding the sst variable to another file with a time dimension and variable called "time" ncdump -h test.nc | grep valid_time valid_time = 6023 ; string expver(valid_time) ; float sst(valid_time, latitude, longitude) ; int64 valid_time(valid_time) ; valid_time:long_name = "time" ; valid_time:standard_name = "time" ; valid_time:calendar = "proleptic_gregorian" ; valid_time:units...

  • Dan Kokron modified a comment on discussion Open Discussion

    I have a file with a time dimension and variable called "valid_time". I'd need to rename both of these because I will be adding the sst variable to another file with a time dimension and variable called "time" ncdump -h test.nc | grep valid_time valid_time = 6023 ; string expver(valid_time) ; float sst(valid_time, latitude, longitude) ; int64 valid_time(valid_time) ; valid_time:long_name = "time" ; valid_time:standard_name = "time" ; valid_time:calendar = "proleptic_gregorian" ; valid_time:units...

  • Dan Kokron posted a comment on discussion Open Discussion

    I have a file with a time dimension and variable called "valid_time". I'd need to rename both of these because I will be adding the sst variable to another file with a time dimension and variable called "time" ncdump -h test.nc | grep valid_time valid_time = 6023 ; string expver(valid_time) ; float sst(valid_time, latitude, longitude) ; int64 valid_time(valid_time) ; valid_time:long_name = "time" ; valid_time:standard_name = "time" ; valid_time:calendar = "proleptic_gregorian" ; valid_time:units...

  • Charlie Zender posted a comment on discussion Open Discussion

    Hi Sylvain, The newly released NCO 5.3.2 works (for me) on all ECMWF files. Various bugs have been fixed in the last two years. I suggest you try it again if you are still looking for a solution. Charlie

  • NCO netCDF Operators released /nco-5.3.2.tar.gz

  • Cristian Rendina posted a comment on discussion Open Discussion

    Hi, OK, solved. The test works with version 5.0.6 (default on Ubuntu 22) and 5.2.1 (default on Ubuntu 24). My Ubuntu is still 20, so the default version was 4.9.1, which I suppose has the bug. SOLVED.

  • Cristian Rendina posted a comment on discussion Open Discussion

    Hi, I still don't get the right result. I will write out my steps again: 1) Given single-time-step WRF output, first I extract t2m and the coordinate variables:
        ncks -v T2,XLAT,XLONG wrfout_d01_2025-01-30_00\:00\:00 wrfout_d01.nc
    2) Then I use ncwa (Time is with a capital 'T', not 't'):
        ncwa -a Time wrfout_d01.nc test1.nc
    3) Then I regrid:
        ncremap -i test1.nc -v T2 -G 'latlon=487,756#snwe=34.47,49.08,1.18,23.85' -o test2.nc
    Here is the output of ncdump; it seems correct to me (except it is without a time dimension, but it is not...

  • Matt Thompson posted a comment on discussion Help

    Thanks for finding it! I figured it was some group thing as I don't often see them in my work, but others have files with them. I'll try and do a snapshot if I have a chance, but my guess is you might release before then given how busy things are for me now.

  • Charlie Zender posted a comment on discussion Open Discussion

    Erik, I'm glad to be of service.

  • Erik posted a comment on discussion Open Discussion

    Hi Charlie, thanks so much for the continued support. I just checked by building nco from the current Github repository, and can confirm it works! Thanks very much.

  • Charlie Zender posted a comment on discussion Open Discussion

    I identified the problem and committed a fix that works for me. Please test the 5.3.2-alpha02 snapshot if you are able. Otherwise the change will appear as-is in 5.3.2 in a few weeks. Some notes: 1. The manual is correct---invoke with ncremap --ps_nm=lnsp. 2. When interpolating ECMWF/IFS data the only surface pressure in the input data file should be lnsp. Do not place a duplicate copy of PS in the input data file. The desired output surface pressure can be placed in PS in the vertical grid file....
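
    A minimal sketch of note 1 (file names taken from Erik's commands elsewhere in this thread; only --ps_nm=lnsp is the new ingredient):
        # Vertically interpolate ECMWF/IFS data whose surface-pressure variable is lnsp
        ncremap --vrt_fl=era5_light.nc --ps_nm=lnsp cams_light.nc cams_remapped.nc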

  • Charlie Zender posted a comment on discussion Open Discussion

    I can reproduce the behavior you post above. I am investigating further...

  • Cristian Rendina modified a comment on discussion Open Discussion

    Ok, I will try in the next few days and report the results. Thanks. - edit: sorry, maybe I was not clear, but I already did this:
        ### time average
        ncwa -a Time wrfout_d01.nc test1.nc
        ### regrid
        ncremap -i test1.nc -v T2 -G 'latlon=487,756#snwe=34.47,49.08,1.18,23.85' -o test2.nc
    That is why I also tried an input with XLAT and XLONG with a time dimension, which gave me that error. But I suppose I should leave T2 with the time dimension and only XLAT and XLONG without the time dimension?

  • Cristian Rendina posted a comment on discussion Open Discussion

    Ok, I will try in the next few days and report the results. Thanks.

  • Charlie Zender posted a comment on discussion Open Discussion

    ncks: ERROR nco_grd_nfr() reports an identified grid variable (XLAT with rank 3 and/or XLONG with rank 3) has rank greater than two---grid variables currently must have rank 1 or 2. You must eliminate the (degenerate) time dimension from the grid variables so their rank changes from 3 to 2. Use ncwa -a time for this. The inputs are only one timestep long so the time dimension is extraneous and breaks the regridder.

  • Cristian Rendina posted a comment on discussion Open Discussion

    Hi Charlie, Thank you for the quick response. I read your response above; that is why I asked about time averaging, since my original WRF output, wrfout_d01_2025-01-30_00\:00\:00, is ALREADY 1 time step. That is the problem. I just extracted t2m since the full file is very large (1.5 GB). So I do not see how to resolve this part, unless, like the user who opened the post, I time-average XLAT and XLONG for every time step.

  • Charlie Zender posted a comment on discussion Open Discussion

    I copy here my response above: 2) Note that time-averaging a temporally varying grid seems like a bad idea. I suggest you hyperslab the input file into individual timesteps and regrid each of those separately to see if that helps. Do not time-average grid information for time-varying grids. Regrid each timestep separately to the same output grid, then time-average the outputs. This is because the regridding weights are changing every timestep. Re-read this message: ncks: ERROR nco_grd_nfr()...
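
    A sketch of that per-timestep approach for the WRF case in this thread (a bash loop; the index range and output names are hypothetical):
        # Regrid each timestep separately, then average the regridded outputs
        for idx in 0 1 2; do
          ncks -O -d Time,${idx} wrfout_d01.nc step_${idx}.nc
          ncwa -O -a Time step_${idx}.nc step_${idx}_2d.nc   # remove the now-degenerate Time dimension
          ncremap -i step_${idx}_2d.nc -v T2 -G 'latlon=487,756#snwe=34.47,49.08,1.18,23.85' -o rgr_${idx}.nc
        done
        nces -O rgr_0.nc rgr_1.nc rgr_2.nc T2_time_mean_regridded.nc   # average across the regridded files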

  • Charlie Zender posted a comment on discussion Help

    I fixed the problem in the latest NCO snapshot. It works for me. Please test if possible. Otherwise the fix will be in NCO 5.3.2 to be released in the next two weeks.

  • Cristian Rendina posted a comment on discussion Open Discussion

    Hello, Same problem here. These are the commands I used:
        ### subset of original file to temperature 2m
        ncks -v T2,XLAT,XLONG wrfout_d01_2025-01-30_00\:00\:00 wrfout_d01.nc
        ### time average
        ncwa -a Time wrfout_d01.nc test1.nc
        ### regrid
        ncremap -i test1.nc -v T2 -G 'latlon=487,756#snwe=34.47,49.08,1.18,23.85' -o test2.nc
    Attached image and input file (the subset). My version is 4.9.1, I have only nco algorithm installed.
        ncremap --config
        ncremap, the NCO regridder and grid, map, and weight-generator,...

  • Charlie Zender posted a comment on discussion Help

    Thanks for providing an example. I can reproduce the behavior you report with 5.3.1. My hunch is that ncks produces an ERROR because the lossy compression routine is unhappy with the dimensions being in a different group than the variables (although this should be perfectly legal). I will investigate further....

  • Matt Thompson modified a comment on discussion Help

    @zender Find attached a subsetted version of the file produced by: ncea -h -g geolocation_data -v latitude,longitude,height,range -d number_of_lines,1,2 -d number_of_pixels,1,2 -4 VNP03IMG.A2012019.1042.002.2020318134543.nc -o subset_small.nc4 It seems to show the same issue: > ncks --version NCO netCDF Operators version 5.3.1 "Dark Sunset (Palisades Lost)" built by mathomp4 on gs6101-bucy.gsfc.nasa.gov at Jan 15 2025 09:40:36 ncks version 5.3.1 > ncks -h -g geolocation_data -v latitude,longitude...

  • Matt Thompson posted a comment on discussion Help

    @zender Find attached a subsetted version of the file produced by: ncea -h -g geolocation_data -v latitude,longitude,height,range -d number_of_lines,1,100 -d number_of_pixels,1,100 -4 VNP03IMG.A2012019.1042.002.2020318134543.nc -o subset.nc4 It seems to show the same issue: > ncks --version NCO netCDF Operators version 5.3.1 "Dark Sunset (Palisades Lost)" built by mathomp4 on gs6101-bucy.gsfc.nasa.gov at Jan 15 2025 09:40:36 ncks version 5.3.1 > ncks -h -g geolocation_data -v latitude,longitude -4...

  • Charlie Zender posted a comment on discussion Help

    Hi Matthew, No one has reported this behavior before, so it's definitely not a FAQ. Please add the smallest datasets possible that reproduce the above behavior and I will investigate further. CZ

  • Matt Thompson posted a comment on discussion Help

    Charlie, et al, I'm sure this is a FAQ, but I'm a bit stumped. To wit, on an old OS we have an old NCO and: > ncks --version NCO netCDF Operators version 5.1.5 "Rhubarb Pi" built by mathomp4 on discover13 at Jun 5 2023 09:19:25 ncks version 5.1.5 > ncks -h -v latitude,longitude -4 --baa=4 --ppc dfl=6 VNP03IMG.A2012019.1042.002.2020318134543.nc -o test_sles12.nc4 > echo $? 0 On the new machine, I have NCO 5.3.1 and when I try the same command: > ncks --version NCO netCDF Operators version 5.3.1 "Dark...

  • Dan Kokron posted a comment on discussion Open Discussion

    You understood correctly. We do currently use ncrcat --rec_apn for step 2. Thanks for confirming there is no prepend option anywhere in nco.

  • Charlie Zender posted a comment on discussion Open Discussion

    I have a hard time visualizing what you are trying to do, but I think I understand. You want to keep the last N records of an N+1-record-long file, but you do not want to have to copy them in order to create a new file because that is time-consuming due to compression/decompression etc. So now you're wondering if, instead of appending to the end of the file as in Step 2, you can instead overwrite the first timestep of the existing file with your latest output, chronological ordering be damned. Overwriting...

  • Dan Kokron posted a comment on discussion Open Discussion

    We have a workflow where we continually replace the oldest data in one file with current information from another file. The current workflow looks like: 1) create a file with the latest information, 2) append that information onto the existing archive file, which creates a file with N+1 time records, 3) copy the newest N time records to another file. The dimensions of the resulting archive file (N=120):
        dimensions:
            xa = 2048 ;
            ya = 1536 ;
            lead_time_01h = 241 ;
            lead_time_06h = 40 ;
            time = UNLIMITED ; // (120 currently)...
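
    A sketch of steps 2 and 3 with NCO operators (file names hypothetical; --rec_apn, mentioned elsewhere in this thread, appends records to an existing file in place):
        # Step 2: append the newest output onto the archive, giving N+1 time records
        ncrcat --rec_apn latest.nc archive.nc
        # Step 3: copy the newest N records (drop index 0, the oldest) into a fresh file
        ncrcat -O -d time,1, archive.nc archive_trimmed.nc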

  • Charlie Zender posted a comment on discussion Open Discussion

    These messages reached me while I was traveling. I will look into them in the next weekish, and try to fix any misbehavior I find. CZ

  • Erik posted a comment on discussion Open Discussion

    In the meantime, I've installed/compiled v 4.9.0 using https://packages.spack.io/package.html?name=nco , so that works for me for the time being. I just couldn't seem to get ncremap to work with the current version, I tried various input formats (structured, unstructured, giving lnsp a time and lev_2 dimension, changing its name, ...) but none of those options seemed to work.

  • Erik posted a comment on discussion Open Discussion

    Hi Charlie -- when using version 4.9.0 the code given above does the job:
        ncremap --vrt_fl=era5_light.nc cams_light.nc cams_remapped.nc
    but the updated version 5.2.9 (as well as 5.3.1) instead gives the following error:
        Input #00: /home/erik-koene/NCO/cams_light.nc
        Vertical : era5_light.nc
        ncks: INFO nco_ntp_vrt() reports no variables fit vertical interpolation criteria. The output file will consist of 4 geophysical variables that are copied directly from the input (because they lack the vertical...

  • Charlie Zender posted a comment on discussion Help

    Good job figuring that all out. Using trintbilin for state variables and traave for fluxes should work fine. E3SM internally now uses a sophisticated variation of this, invoked via CAAS (Clip And Assured Sum), which is both monotone and conservative, but it's not yet supported in ncremap.
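
    A hypothetical sketch of that split (variable names and grid files are placeholders; the -a/--alg_typ values are the ones named above, and the --grd_src/--grd_dst usage follows other commands in this digest):
        # Bilinear-type remap for state variables, conservative remap for fluxes
        ncremap -a trintbilin -v T,Q       --grd_src=src_grd.nc --grd_dst=dst_grd.nc in.nc state_rgr.nc
        ncremap -a traave     -v FSNT,FLNT --grd_src=src_grd.nc --grd_dst=dst_grd.nc in.nc flux_rgr.nc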

  • pgf posted a comment on discussion Help

    Hi there, I'm trying to figure out what is the best way to remap CAM-SE variables on the newer horizontal grids to a rectilinear grid in order to compute zonal means, using ncremap (v5.3.0 from conda-forge) for the remapping. I had a look at the documentation for both ncremap and E3SM (Regridding E3SM Data with ncremap and related pages) which as far as I understand moved to similar grids. Specifically, the version of CAM-SE I'm using mention this file: mesh_atm = /data/inputs/CESM/inputdata/share/meshes/ne30pg3_ESMFmesh_cdf5_c20211018.nc...

  • Dan Kokron posted a comment on discussion Help

    Understood.

  • Charlie Zender posted a comment on discussion Help

    Theoretically they are exactly what you want. But they are a long-term, not an immediate, solution. It takes a long time to implement extensions to the netCDF API. You need to learn the netCDF and HDF5 APIs, write the extension, debug, write unit tests, debug, then submit for feedback/review.

  • Dan Kokron posted a comment on discussion Help

    Thanks for the clarification. Do those H5Dread_chunk() and H5Dwrite_chunk() functions seem like a possible solution/optimization for my scenario?

  • Charlie Zender posted a comment on discussion Help

    My guess is that the restriction on variable-length datatypes refers to NC_VLEN and NC_COMPOUND (and possibly NC_STRING) types, and not to all variables that contain a record dimension. In other words, the type of a single element must be known at compile time for the HDF5 routines to work.

  • Charlie Zender posted a comment on discussion Help

    No, NCO uses the netCDF API. Unidata (or you) would need to add new functions to the API in order for NCO to support them. If support were added to the netCDF API, it would be fun and easy to support those new functions in NCO.

  • Dan Kokron posted a comment on discussion Help

    Hmmm, I don't think I can use those for my situation. My record variable is unlimited. From https://support.hdfgroup.org/documentation/hdf5/latest/group___h5_d.html#gac1092a63b718ec949d6539590a914b60 "Note H5Dread_chunk() and H5Dwrite_chunk() are currently not supported with parallel HDF5 and do not support variable-length datatypes."

  • Dan Kokron posted a comment on discussion Help

    The netCDF folks pointed me to HDF5. I opened a discussion at https://forum.hdfgroup.org/t/transfer-records-to-another-file-without-decompress-recompress/12960, to which I got this response: "H5Dread_chunk() and H5Dwrite_chunk() allow raw chunk data to be accessed and written while bypassing some or all compression filters." Does ncrcat use those?

  • NCO netCDF Operators released /nco-5.3.1.tar.gz
