The AIRS decoder is installed on hazel.mmm:/hazel2/auligne/AIRS_ret/decode_l2_airs (built against the ZLIB, JPEG, HDF4 and HDF-EOS libraries).

Two scripts, adapted from Hui-Chuan's, are located on hazel.mmm:/hazel2/auligne/AIRS_ret/:

  • get_airx2ret.csh connects to the NASA FTP site and downloads HDF files for several time windows. The files are stored under ./AIRX2RET/*DATE.tar.gz
  • airx2ret.csh cycles through several dates, untars the right file, and launches the AIRS decoder. Output is in "little_r" format under ./airs_r (a sketch of this cycling logic follows below)
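
The cycling step can be sketched roughly as follows. This is a minimal Python illustration of what airx2ret.csh does, not the script itself; the decoder binary name and the date list are hypothetical.

    # Minimal sketch (Python, for illustration) of the airx2ret.csh logic:
    # pick the tarball for each date, untar it, run the decoder on each HDF
    # granule, and collect the little_r output under ./airs_r.
    # The decoder binary name and the cycle dates are hypothetical.
    import subprocess
    import tarfile
    from pathlib import Path

    DECODER = "/hazel2/auligne/AIRS_ret/decode_l2_airs/decode_airs"  # hypothetical name
    DATES = ["2006100100", "2006100106"]                             # example cycle dates

    for date in DATES:
        # Tarballs are stored as ./AIRX2RET/*DATE.tar.gz
        tarball = next(Path("AIRX2RET").glob(f"*{date}.tar.gz"))
        workdir = Path("work") / date
        workdir.mkdir(parents=True, exist_ok=True)

        # Untar the orbit-by-orbit HDF granules for this time window.
        with tarfile.open(tarball, "r:gz") as tf:
            tf.extractall(workdir)

        # Launch the AIRS decoder on each granule; it writes little_r records.
        for hdf in sorted(workdir.glob("*.hdf")):
            subprocess.run([DECODER, hdf.name], check=True, cwd=workdir)

        # Collect the little_r output under ./airs_r
        outdir = Path("airs_r")
        outdir.mkdir(exist_ok=True)
        for rec in workdir.glob("*.r"):
            rec.rename(outdir / rec.name)
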
The following modifications have been made:
  • Read the TAirStdErr/H2OMMRStdErr estimated observation errors from the NASA HDF files and write them into the little_r file under U/V wind (see the field-reading sketch after this list)
  • Read numCloud from the NASA HDF files (number of retrieved cloud layers: 0, 1 or 2)
  • Removed the check that used H2OMMRStdErr to discard some humidity observations
  • Write Q directly instead of converting Q to RH and writing RH
  • Write a single file airx2ret_DATE.r containing both polar-day and polar-night observations, with a different quality flag for T:
    • "0" = polar night
    • "1" = polar day (NB: "1" becomes "4" after OBSPROC)
  • Write numCloud into the quality flag for height Z (0, 1 or 2)
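
For reference, here is a minimal sketch, assuming the pyhdf package, of how the fields named above can be read from an AIRX2RET granule. The granule file name is a placeholder, the field names are taken from the list above, and the solar-zenith-angle day/night criterion is an assumption; verify the names against an actual file (e.g. with hdp dumpsds) before relying on this.

    # Minimal sketch, assuming pyhdf, of reading the fields the modified
    # decoder uses from an AIRX2RET granule. Field names follow the list
    # above; the file name and day/night criterion are hypothetical.
    from pyhdf.SD import SD, SDC

    granule = SD("AIRS.2006.10.01.001.L2.RetStd.hdf", SDC.READ)  # placeholder name

    tair_err  = granule.select("TAirStdErr").get()    # estimated T obs error, per level
    q_err     = granule.select("H2OMMRStdErr").get()  # estimated H2O MMR obs error, per level
    num_cloud = granule.select("numCloud").get()      # retrieved cloud layers: 0, 1 or 2

    # Quality flags written into the single airx2ret_DATE.r file:
    #   T flag: 0 = polar night, 1 = polar day (OBSPROC later turns "1" into "4")
    #   Z flag: numCloud (0, 1 or 2)
    solzen = granule.select("solzen").get()           # solar zenith angle (deg)
    t_flag = (solzen < 90.0).astype(int)              # assumed day/night criterion
    z_flag = num_cloud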

AIRS Level 2 (retrievals) HDF decoder

get_airx2ret.csh

Hui-Chuan:

I have a script (attached, or in /mmmtmp/hclin/get_airx2ret.csh) to download AIRS retrieval data in HDF format from the NASA ftp and group orbit-by-orbit files into tar files according to specified time windows.

I use the 'wget' utility I installed on my Mac to download data.

The script is kind of slow. You might have a better way of downloading data. The idea of my script is to first check the time info of the file in its corresponding *.xml to decide if the file falls into the desired time frame.

Another script (decode_airx2ret_v5.csh) is used to convert HDF format to little_r format. The decoder src is in DATC_branch_code/convertor/decode_l2_airs, but I don't think it works on IBM...

I can process the data for you, since the procedure is already set up on my Mac.
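
The *.xml time check Hui-Chuan describes could look roughly like this. This is a sketch assuming the NASA metadata carries RangeBeginningDate/Time and RangeEndingDate/Time tags, which should be confirmed against an actual companion .xml file.

    # Sketch of the time-window test in get_airx2ret.csh: parse a granule's
    # companion *.xml metadata and keep the granule only if its time range
    # overlaps the desired window. The Range*Date/Time tag names are an
    # assumption about the NASA metadata layout.
    import xml.etree.ElementTree as ET
    from datetime import datetime

    def granule_in_window(xml_path, win_start, win_end):
        root = ET.parse(xml_path).getroot()
        beg = datetime.fromisoformat(
            f"{root.findtext('.//RangeBeginningDate')}T"
            f"{root.findtext('.//RangeBeginningTime')}".rstrip("Z"))
        end = datetime.fromisoformat(
            f"{root.findtext('.//RangeEndingDate')}T"
            f"{root.findtext('.//RangeEndingTime')}".rstrip("Z"))
        # Keep the granule if any part of its time range overlaps the window.
        return beg <= win_end and end >= win_start

    # Example: a 6 h window centred on 2006-10-01 00 UTC
    keep = granule_in_window("granule.hdf.xml",
                             datetime(2006, 9, 30, 21), datetime(2006, 10, 1, 3))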
