...
Note: You can set up cylc-flow and cylc-uiserver in separate virtual environments. If you install cylc-flow in your JEDI-Skylab virtual environment, you can then install cylc-uiserver in a separate virtual environment and use that environment to launch the GUI. Follow the steps in "Adding cylc to your workflow" and then "Setting up venv for cylc GUI".
...
- Activate your Skylab virtual environment, if you haven't already.
```shell
source $JEDI_ROOT/venv/bin/activate
```
- Force-reinstall cylc-flow. Note: cylc is installed in spack-stack, but it has some compatibility issues, so it is easiest at this point to re-install it in your venv.

```shell
pip install cylc-flow --force-reinstall
```
- (Optional, as needed) rsync is required for the workflow. If which rsync does not return the application (e.g., on a brand-new OrbStack machine), install it:

```shell
sudo su
apt install -y rsync
exit
```
- Check cylc location and test with skylab/experiments/workflow-engine-test.yaml:
```shell
which cylc
create_experiment.py skylab/experiments/workflow-engine-test.yaml
```
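The PATH checks in the steps above (which rsync, which cylc) can be wrapped in a small helper if you find yourself repeating them. A minimal sketch; the need function name is made up for illustration, demonstrated here on a tool that exists and one that does not:

```shell
# Sketch: report whether a required tool (e.g. cylc, rsync) is on PATH.
# The "need" helper name is hypothetical, not part of Skylab.
need() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "ok: $1 -> $(command -v "$1")"
    else
        echo "missing: $1"
    fi
}
need sh                # present on any POSIX system
need no-such-tool-xyz  # demonstrates the failure branch
```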
Setting up venv for cylc GUI (example: macOS):
- The UI needs Python 3.9, so install it:

```shell
brew install python@3.9
```
- Clear the module environment and Python path variables:

```shell
module purge
unset PYTHONPATH
unset PYTHONHOME
```
- Create a venv without spack-stack:

```shell
python3.9 -m venv --system-site-packages cylc-venv
```
- Activate the venv:

```shell
source cylc-venv/bin/activate
```
- Install cylc:

```shell
pip install cylc-flow
pip install cylc-uiserver
pip install cylc-rose metomi-rose
```
- Install optional extras:

```shell
pip install 'cylc-flow[tutorial]'
pip install 'cylc-uiserver[hub]'
```
- Install Graphviz:

```shell
brew install graphviz
```

- To test the GUI:

```shell
cylc gui
```
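Before launching the GUI, it can be worth confirming the venv actually contains the packages installed above. A sketch, assuming pip points at the cylc-venv environment:

```shell
# Sketch: confirm the GUI venv has the cylc packages installed.
# Prints matching packages, or a notice if none are found.
pip list 2>/dev/null | grep -Ei 'cylc-(flow|uiserver)' \
    || echo "cylc packages not found - is cylc-venv activated?"
```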
Setting up cylc localhost configuration:
Cylc ignores PYTHONPATH, so in order to run Skylab with the correct virtual environment you need to add a global.cylc file that runs an init-script before runtime to activate the JEDI venv. This file goes in ~/.cylc/flow/global.cylc. The install block is optional for now, but it sets your cylc work and run directories to mimic ecflow. Note: these will automatically put a cylc-run directory under the parent EWOK_WORKDIR and EWOK_FLOWDIR directories.
```shell
vi ~/.cylc/flow/global.cylc
```
```
[install]
    [[symlink dirs]]
        [[[localhost]]]
            work = ${EWOK_WORKDIR}
            run = ${EWOK_FLOWDIR}
[platforms]
    [[localhost]]
        hosts = localhost
        job runner = background
        global init-script = source ${JEDI_ROOT}/venv/bin/activate
```
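If you prefer to script this step, the configuration above can be written into place directly. A minimal sketch that backs up any existing file first; the heredoc content simply reproduces the localhost global.cylc shown above:

```shell
# Sketch: write the localhost global.cylc shown above into place,
# backing up any existing copy first.
CONF_DIR="$HOME/.cylc/flow"
mkdir -p "$CONF_DIR"
if [ -f "$CONF_DIR/global.cylc" ]; then
    cp "$CONF_DIR/global.cylc" "$CONF_DIR/global.cylc.bak"
fi
cat > "$CONF_DIR/global.cylc" <<'EOF'
[install]
    [[symlink dirs]]
        [[[localhost]]]
            work = ${EWOK_WORKDIR}
            run = ${EWOK_FLOWDIR}
[platforms]
    [[localhost]]
        hosts = localhost
        job runner = background
        global init-script = source ${JEDI_ROOT}/venv/bin/activate
EOF
echo "wrote $CONF_DIR/global.cylc"
```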
Setting up cylc HPC configuration:
Similar to the localhost setup, you will need to add or update ~/.cylc/flow/global.cylc. The install block is optional for now, but it sets your cylc work and run directories to mimic ecflow. Note: these will automatically put a cylc-run directory under the parent EWOK_WORKDIR and EWOK_FLOWDIR directories. Example global.cylc for HPCs that use slurm/sbatch for jobs:
```shell
vi ~/.cylc/flow/global.cylc
```
```
[install]
    [[symlink dirs]]
        [[[localhost]]]
            work = ${EWOK_WORKDIR}
            run = ${EWOK_FLOWDIR}
[platforms]
    [[localhost]]
        hosts = localhost
        job runner = background
        global init-script = source ${JEDI_ROOT}/venv/bin/activate
    [[compute]]
        hosts = localhost
        job runner = slurm
        install target = localhost
        global init-script = """
            source ${JEDI_ROOT}/venv/bin/activate
            export SLURM_EXPORT_ENV=ALL
            export HDF5_USE_FILE_LOCKING=FALSE
            ulimit -s unlimited || true
            ulimit -v unlimited || true
        """
```
Hercules Note:
On Hercules, it appears that the aws package is not found when only running source ${JEDI_ROOT}/venv/bin/activate, so it is best to source your setup.sh script instead of just the virtual environment. Replace global init-script = source ${JEDI_ROOT}/venv/bin/activate with global init-script = source ${JEDI_ROOT}/setup.sh (or wherever you keep setup.sh). Then comment out all of the ecflow lines in setup.sh.
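Commenting out the ecflow lines can be scripted with sed. A sketch, run here against a throwaway demo file (the demo contents are hypothetical; point the command at your real setup.sh, and adjust the pattern if your ecflow lines are named differently):

```shell
# Sketch: comment out every ecflow-related line in a setup.sh-style file.
# A throwaway demo file is used here; replace $SETUP with your real setup.sh.
SETUP=$(mktemp)
cat > "$SETUP" <<'EOF'
module load ecflow
export ECF_PORT=3141
source ${JEDI_ROOT}/venv/bin/activate
EOF
# Prefix matching lines with "# "; a .bak backup is kept alongside.
sed -E -i.bak '/ecflow|ECF_/ s/^/# /' "$SETUP"
cat "$SETUP"
```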
Discover via spack-stack:
- Load spack-stack modules:

```shell
#!/usr/bin/env bash
# Initialize modules
source $MODULESHOME/init/bash
# Load python dependencies
echo "Using SLES15 modules"
module use /discover/swdev/jcsda/spack-stack/scu17/modulefiles
module use /gpfsm/dswdev/jcsda/spack-stack/scu17/spack-stack-1.9.0/envs/ue-intel-2021.10.0/install/modulefiles/Core
module load stack-intel/2021.10.0
module load stack-intel-oneapi-mpi/2021.10.0
module load stack-python/3.11.7
module load py-pip/23.1.2
```
- Load the cylc module and test:

```shell
# Load the cylc module
module use -a /discover/nobackup/projects/gmao/advda/swell/dev/modulefiles/core/
module load cylc/sles15_8.4.0
# Run cylc command
cylc "$@"
```
- You might need to create a wrapper script at $HOME/bin/cylc containing the lines above, and make it executable in order to run locally: chmod +x $HOME/bin/cylc
- Note: I did not have to do this when setting up and running on discover-mil with spack-stack 1.9.0 (intel).
- Add the example ~/.cylc/flow/global.cylc (see above) for HPCs that use slurm/sbatch for job submission.
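Creating the wrapper script can also be scripted. A sketch that writes the module-load snippet above into $HOME/bin/cylc and marks it executable (the module paths are the Discover ones from the step above; whether the wrapper resolves before the module's own cylc depends on your PATH order):

```shell
# Sketch: create the $HOME/bin/cylc wrapper from the snippet above
# and make it executable.
mkdir -p "$HOME/bin"
cat > "$HOME/bin/cylc" <<'EOF'
#!/usr/bin/env bash
# Load the cylc module
module use -a /discover/nobackup/projects/gmao/advda/swell/dev/modulefiles/core/
module load cylc/sles15_8.4.0
# Run cylc command
cylc "$@"
EOF
chmod +x "$HOME/bin/cylc"
echo "wrapper ready: $HOME/bin/cylc"
```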
FAQ
How to remove an experiment from ecflow?
...