...

Note: You can set up cylc-flow and cylc-uiserver in separate virtual environments. If you install cylc-flow in your JEDI-Skylab virtual environment, you can then install cylc-uiserver in a separate virtual environment and use that environment to launch the GUI. Follow the steps in "Adding cylc to your workflow" and then "Setting up venv for cylc GUI".
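A sketch of that two-venv layout (the paths here are examples, not required locations):

```shell
# Example layout only: the JEDI_ROOT and cylc-venv paths are placeholders.
JEDI_ROOT="${JEDI_ROOT:-$HOME/jedi}"
python3 -m venv "$JEDI_ROOT/venv"      # JEDI-Skylab venv; install cylc-flow here
python3 -m venv "$HOME/cylc-venv"      # separate venv; install cylc-uiserver here
# Then, with each venv's own pip:
#   "$JEDI_ROOT/venv/bin/pip" install cylc-flow
#   "$HOME/cylc-venv/bin/pip" install cylc-flow cylc-uiserver
```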

...

  1. Activate your Skylab virtual environment, if you haven't already. 
    Code Block
    languageshell
    source $JEDI_ROOT/venv/bin/activate
  2. Force-reinstall cylc-flow. Note: cylc is installed in spack-stack, but it has some compatibility issues, so it is easiest at this point to reinstall it in your venv. 
    Code Block
    languageshell
    pip install cylc-flow --force-reinstall
  3. (Optional, as needed) rsync is required for the workflow. If which rsync does not return the application (for example, on a brand-new OrbStack machine):
    Code Block
    languageshell
    which rsync
    sudo su
    apt install -y rsync
    exit
  4. Check the cylc location and test with skylab/experiments/workflow-engine-test.yaml: 
    Code Block
    languageshell
    which cylc
    create_experiment.py skylab/experiments/workflow-engine-test.yaml
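A quick way to confirm that the cylc on your PATH comes from the venv (assuming JEDI_ROOT points at your JEDI root; the default below is only an example):

```shell
# Sanity check (sketch): does the cylc on PATH resolve inside $JEDI_ROOT/venv?
JEDI_ROOT="${JEDI_ROOT:-$HOME/jedi}"   # example default; adjust to your setup
cylc_path="$(command -v cylc || true)"
case "$cylc_path" in
    "$JEDI_ROOT"/venv/*) msg="cylc resolves from the venv: $cylc_path" ;;
    "")                  msg="warning: cylc not found on PATH" ;;
    *)                   msg="warning: cylc found outside the venv: $cylc_path" ;;
esac
echo "$msg"
```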

Setting up venv for cylc GUI (ex: mac):

  1. The UI needs Python 3.9, so install it: 
    Code Block
    languageshell
    brew install python@3.9
  2. Clear the Python environment variables:
    Code Block
    languageshell
    module purge
    unset PYTHONPATH
    unset PYTHONHOME
  3. Create a venv without spack-stack:
    Code Block
    languageshell
    python3.9 -m venv --system-site-packages cylc-venv
  4. Activate venv:
    Code Block
    languageshell
    source cylc-venv/bin/activate
  5. Install cylc:
    Code Block
    languageshell
    pip install cylc-flow
    pip install cylc-uiserver
    pip install cylc-rose metomi-rose
  6. Install optional extras:
    Code Block
    languageshell
    pip install 'cylc-flow[tutorial]'
    pip install 'cylc-uiserver[hub]'
  7. Install Graphviz:
    Code Block
    languageshell
    brew install graphviz
  8. To test the GUI:
    Code Block
    languageshell
    cylc gui

Setting up cylc localhost configuration:

Because cylc ignores PYTHONPATH, to run Skylab with the correct virtual environment you need to add a global.cylc file that runs an init-script before runtime to activate the JEDI venv. This file goes in ~/.cylc/flow/global.cylc. The install block is optional for now, but it sets your cylc work directory and run directory to mimic ecflow. Note: these will automatically put a cylc-run directory under the parent EWOK_WORKDIR and EWOK_FLOWDIR directories.

vi ~/.cylc/flow/global.cylc

Code Block
languageshell
title~/.cylc/flow/global.cylc
[install]
    [[symlink dirs]]  
        [[[localhost]]]
            work = ${EWOK_WORKDIR}
            run = ${EWOK_FLOWDIR}

[platforms]
    [[localhost]]
        hosts = localhost
        job runner = background
        global init-script = source ${JEDI_ROOT}/venv/bin/activate

Setting up cylc HPC configuration:

Similar to the localhost setup, you will need to add or update ~/.cylc/flow/global.cylc. The install block is optional for now, but it sets your cylc work directory and run directory to mimic ecflow. Note, these will automatically put a cylc-run directory under the parent EWOK_WORKDIR and EWOK_FLOWDIR directories. Example of global.cylc file for HPCs that use slurm/sbatch for jobs:

vi ~/.cylc/flow/global.cylc

Code Block
languageshell
title~/.cylc/flow/global.cylc
[install]
    [[symlink dirs]]  
        [[[localhost]]]
            work = ${EWOK_WORKDIR}
            run = ${EWOK_FLOWDIR}

[platforms]
    [[localhost]]
        hosts = localhost
        job runner = background
        global init-script = source ${JEDI_ROOT}/venv/bin/activate

    [[compute]]
        hosts = localhost
        job runner = slurm
        install target = localhost
        global init-script = """
            source ${JEDI_ROOT}/venv/bin/activate
            export SLURM_EXPORT_ENV=ALL
            export HDF5_USE_FILE_LOCKING=FALSE
            ulimit -s unlimited || true
            ulimit -v unlimited || true
        """
Hercules Note:

On Hercules, it appears that the aws package is not found when only running source ${JEDI_ROOT}/venv/bin/activate. It is therefore best to source your setup.sh script instead of just the virtual environment: replace global init-script = source ${JEDI_ROOT}/venv/bin/activate with global init-script = source ${JEDI_ROOT}/setup.sh (or wherever you keep setup.sh). Then comment out all of the ecflow lines in setup.sh.
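For reference, a sketch of what the adjusted localhost entry might look like on Hercules (the setup.sh location is an assumption; use your own path):

```
[platforms]
    [[localhost]]
        hosts = localhost
        job runner = background
        global init-script = source ${JEDI_ROOT}/setup.sh
```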

Discover via spack-stack:

  1. Load spack-stack modules
    Code Block
    languageshell
    #!/usr/bin/env bash
    
    # Initialize modules
    source $MODULESHOME/init/bash
    
    # Load python dependencies
    echo "Using SLES15 modules"
    module use /discover/swdev/jcsda/spack-stack/scu17/modulefiles
    module use /gpfsm/dswdev/jcsda/spack-stack/scu17/spack-stack-1.9.0/envs/ue-intel-2021.10.0/install/modulefiles/Core
    module load stack-intel/2021.10.0
    module load stack-intel-oneapi-mpi/2021.10.0
    module load stack-python/3.11.7
    module load py-pip/23.1.2

  2. Load cylc module and test
    Code Block
    languageshell
    # Load the cylc module
    
    module use -a /discover/nobackup/projects/gmao/advda/swell/dev/modulefiles/core/
    module load cylc/sles15_8.4.0
    
    # Run cylc command
    cylc "$@"

  3. You might need to create a file called $HOME/bin/cylc (containing the step-2 commands) and make it executable in order to run locally: chmod +x $HOME/bin/cylc
    1. Note: this was not necessary when setting up and running on discover-mil with spack-stack 1.9.0 intel.
  4. Add the ~/.cylc/flow/global.cylc example shown above for HPCs that use slurm/sbatch for job submission.
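The wrapper file from step 3 can be created like this, a sketch that reuses the module lines from step 2 (module path as given above):

```shell
# Sketch: create an executable $HOME/bin/cylc wrapper using the step-2 commands.
mkdir -p "$HOME/bin"
cat > "$HOME/bin/cylc" <<'EOF'
#!/usr/bin/env bash
# Load the cylc module, then forward all arguments to cylc.
module use -a /discover/nobackup/projects/gmao/advda/swell/dev/modulefiles/core/
module load cylc/sles15_8.4.0
cylc "$@"
EOF
chmod +x "$HOME/bin/cylc"
# Make sure $HOME/bin is on your PATH, e.g. export PATH="$HOME/bin:$PATH"
```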

FAQ

How to remove an experiment from ecflow?

...