...

Also, the sorters naturally filter out samples whose times are out of range. If the log file has messages about dropping samples outside the time window, check that the DSM time is correct and synchronized; the DSM clock only has to be off by a few seconds for its samples to be dropped by the sorter.
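
As a quick check, the DSM clock can be compared against the server clock and its NTP status inspected. This is only a sketch: the hostname dsm1 is a placeholder, and it assumes the DSMs run ntpd.

Code Block
# Print the server's UTC time, then the DSM's UTC time and its NTP peers;
# "dsm1" is a placeholder hostname, not from the project configuration.
date -u; ssh dsm1 'date -u; ntpq -p'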

Near real-time rsync

The rsync_dsms.py script uses rsync to synchronize the data files from the USB archive of each DSM into the same directory as the real-time network stream, /data/isfs/projects/Perdigao/raw_data, except that the data file names begin with the DSM name and are not compressed. The script removes all the data files which were synchronized successfully except for the most recent.

The script runs continuously, keeping at most 4 rsyncs running in parallel, pulling data from each DSM every hour, and retrying failed rsyncs every 30 minutes. The status of each DSM rsync is written to a JSON file in Perdigao/ISFS/logs. A second script, rsync_status.py, checks whether each rsync has succeeded within the last 2 hours, and if not reports a critical check failure to Nagios.
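
A minimal sketch of the core synchronization step follows. The DSM names and the remote USB archive path are assumptions (only the local raw_data directory is named above), and it uses GNU xargs to cap the parallelism at 4 the way the script does:

Code Block
# Pull each DSM's USB archive into the raw_data directory, running at
# most 4 rsyncs at a time; DSM names and the remote path are placeholders.
printf '%s\n' dsm1 dsm2 dsm3 dsm4 |
  xargs -P 4 -I{} rsync -a \
    {}:/media/usbdisk/projects/Perdigao/raw_data/ \
    /data/isfs/projects/Perdigao/raw_data/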

A crontab entry restarts the rsync process if it ever stops running.
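
Such a watchdog entry might look like the following sketch; the check interval, script path, and log location are assumptions, not taken from the actual crontab.

Code Block
# Every 5 minutes, restart rsync_dsms.py if no instance is running;
# the script and log paths here are placeholders.
*/5 * * * * pgrep -f rsync_dsms.py > /dev/null || /opt/local/bin/rsync_dsms.py >> $HOME/rsync_dsms.log 2>&1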

Raw Data Transfer to EOL

...

The barolo:/scr/tmp filesystem was selected because it had 9.6T available at the time, and because these raw data do not need to be backed up. Santiago also turned off the scrubber for /scr/tmp/isfs/perdigao. The /scr/tmp location will serve as a backup for the on-site data, and as a staging location for moving the raw data to HPSS during the project.

The BitTorrent mirror was replaced with a 15-minute rsync loop script, running in the isfs account on barolo, so that the real-time network stream files could be updated continuously. BitTorrent only updates files after they have been quiescent for 15 minutes, so the file currently being written is not transferred until it is closed, and the real-time network stream at EOL could be as much as 4 hours behind.
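
The replacement loop could be as simple as this sketch; the source host and the directory layout, apart from /scr/tmp/isfs/perdigao named above, are assumptions.

Code Block
# Pull the real-time network stream from the field server every 15 minutes;
# the "ustar" source host and exact paths are placeholders.
while true; do
    rsync -a ustar:/data/isfs/projects/Perdigao/raw_data/ \
        /scr/tmp/isfs/perdigao/raw_data/
    sleep 900
done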

Data backups to USB on ustar

Two multi-terabyte USB disk drives were attached to ustar. A crontab entry mirrors the entire /data/isfs/ filesystem onto each USB drive every 6 hours; a sketch of such an entry follows the code block below. The laptop /data partition is not large enough to hold the entire project's data, so older raw and merged data were aged off that disk manually as it filled up. The iss_catalog.py script was used to maintain a database of all the data files and their time coverage, and its delete operation was then used to age off the oldest data:

Code Block
# Scan the project data tree and record every data file and its time coverage in the catalog database.
/opt/local/iss-system/iss/src/iss_catalog/iss_catalog.py scan --context=perdigao/ustar --scan /data/isfs/projects/Perdigao --db ~/ustar.db --verbose
# Delete the oldest raw and merged data files to free space, using the same catalog database.
/opt/local/iss-system/iss/src/iss_catalog/iss_catalog.py delete --time 20000101, --size 100000 --context=perdigao/ustar --db $HOME/ustar.db --categories raw/nidas --categories merge
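
For reference, the 6-hourly USB mirror mentioned above could be expressed as crontab entries like these; the USB mount points are assumptions, not the actual configuration.

Code Block
# Mirror the whole /data/isfs/ filesystem onto each USB drive every 6 hours;
# /media/usb0 and /media/usb1 are placeholder mount points.
0 */6 * * * rsync -a /data/isfs/ /media/usb0/isfs/
0 */6 * * * rsync -a /data/isfs/ /media/usb1/isfs/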

Data processing

Once a complete data file has been synchronized from a DSM (i.e., it is not the file currently being written), all the samples acquired by that DSM for that time period reside on ustar, in either the USB archive stream or the network archive stream. These two streams are merged to generate the most complete sample archive.
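
A hedged sketch of the merge step, assuming the NIDAS nidsmerge utility with -i/-o options; the option usage and file names here are placeholders, not the project's actual invocation.

Code Block
# Merge the USB archive stream and the network archive stream for one
# file period into a single, most-complete archive file; all file names
# below are placeholders.
nidsmerge -i dsm1_20170517_120000.dat -i isfs_20170517_120000.dat \
          -o merge/isfs_20170517_120000.dat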

...