Spack
Latest revision as of 04:02, 24 November 2024

Introduction

Spack is a software package setup and build system. It can replace the UPS, MRB (Multi-Repo Build) and Muse "package managers". Spack was developed for the supercomputer environment and is in common use there today. The driving force that inspired spack is the need to install many packages while coordinating their dependencies.

The computing division will provide their software (art, ifdhc) within the spack framework. They have requested that we adopt spack as far as possible, so that the lab can converge to one package manager, which will simplify maintenance and support. It will also prepare us to be more efficient in our use of supercomputers in the future. The lab has decided to end support for UPS and only provide software in the spack framework starting with the adoption of the AlmaLinux operating system, which will be fully adopted by the hard deadline of 6/30/2024.

Spack is designed to manage your code from the GitHub repo, through the build, to installation in the final location. However, for a few reasons, we are adopting spack in two phases. For historical reasons, these are called phase 2 and phase 3. In phase 2, we build our Offline repos locally, driven by the muse scripts, and link to the art suite libraries which are delivered in spack format. This has the primary advantage that all muse commands and functionality stay the same.

Phase 3, which is full spack adoption, we expect will come in two forms. The first is "bare spack" for experts who are comfortable working with spack directly, and the second is a "wrapped" version with a Muse-like script which will hide most details for beginners and casual code users. A basic build in bare spack is being used in the online setting.

Mu2e computing management plans to stay at Phase 2 until the following prerequisites for advancing to Phase 3 are achieved:

  1. Usage of spack by both CSAID and Mu2e is stable, robust and easy enough for novice users.
  2. There is support for building code on our interactive machines and running it in grid jobs. This is needed for most development workflows.
  3. All functionality currently provided by Muse is made available in the new environment.

Phase 2 Usage

The initial setup, which works for sl7 (with UPS) and al9 (with spack) is

source /cvmfs/mu2e.opensciencegrid.org/setupmu2e-art.sh

If you have chosen to use the recommended Mu2e login scripts (see Shells), this can be abbreviated to mu2einit. After this, muse will be in your path and will work normally. You can "muse setup" and "muse build" as usual. You can also make tarballs, submit to the grid, or access musings, like "muse setup SimJob".

To access the data-handling tools, both SAM and the new (metacat) tools, you can

muse setup ops

which can be run with or without setting up a Muse Offline build.

We are preparing

muse setup ana

to provide access to a python analysis tool suite, like pyana.

Minimal Commands

listing packages

spack find -lv <packagename>

or with the dependencies (things it depends on)

spack find -lvd <packagename>

just print the version

spack find --format "{version}" root

what environment is active

spack env status

locate a package

spack location -i art

include path

export ART_INC=$(spack location -i art)/include
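Output from these commands can be captured with shell command substitution, as the ART_INC line above does. Here is a minimal sketch of the same pattern, with a stand-in function in place of `spack location -i art` (the prefix it prints is invented for the demo):

```shell
# Stand-in for `spack location -i art`, which prints the install prefix
# on a single line; this path is made up for illustration.
fake_spack_location() { echo /cvmfs/demo/packages/art/3.14.03; }

# Same command-substitution pattern as the ART_INC export above.
export ART_INC=$(fake_spack_location)/include
echo "$ART_INC"   # prints /cvmfs/demo/packages/art/3.14.03/include
```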


Local Offline build

Generally on al9, we can build Offline, TrkAna and other repos using muse, which drives the build via Scons and links to the art and root provided in spack format on cvmfs. For repos which do not have Scons capability and can only be built with cmake, we must use the full spack machinery. The most common situation is when a user needs to build KinKal or artdaq-core-mu2e locally with Offline, in order to develop new code in these repos together. The following will create a spack working area, check out repos locally, and build them locally using spack driving cmake. To first order, no muse command will work here.

This procedure is all together here to allow cut-and-paste and will be explained a bit more below. It assumes you are in the build area, such as your app directory.

export MYSS=kk
export MYENV=dev

mu2einit

smack subspack $MYSS
cd $MYSS

smack subenv $MYENV
source ./setup-env.sh
spack env activate $MYENV

# if you want kinkal checked out locally (optional)
spack rm kinkal
spack add kinkal@main
spack develop kinkal@main

# if you want adcm checked out locally (optional)
spack rm artdaq-core-mu2e
spack add artdaq-core-mu2e@develop
spack develop artdaq-core-mu2e@develop

# add these locally
spack add Offline@main +g4
spack develop Offline@main
spack add production@main
spack develop production@main
spack add mu2e-trig-config@main
spack develop mu2e-trig-config@main

spack concretize -f 2>&1 | tee c_$(date +%F-%H-%M-%S).log
spack install       2>&1 | tee i_$(date +%F-%H-%M-%S).log

spack env deactivate
spack env activate $MYENV

The "develop" repos are checked out in $MYENV. After editing code, you can simply

spack install 2>&1 | tee i_$(date +%F-%H-%M-%S).log

The code is installed in $SPACK_LOCAL.
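The logging idiom in these install commands deserves a word: `2>&1` folds stderr into stdout so compiler errors are captured too, and `tee` both shows the output and saves it to a file whose name carries a timestamp, so repeated builds keep separate logs. A minimal sketch with an ordinary command standing in for `spack install`:

```shell
log=i_$(date +%F-%H-%M-%S).log              # e.g. i_2024-11-24-04-02-00.log
# stderr is folded into stdout, then tee shows and saves both streams:
{ echo "compiling demo"; echo "warning: demo" >&2; } 2>&1 | tee "$log"
grep -c "warning: demo" "$log"              # prints 1: stderr reached the log
```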

This build area has a "view" which gathers your bin and include directories together in one directory to simplify your paths. This is supposed to be refreshed with every install command, but sometimes it isn't. So if the build ran and it doesn't look like all your changes are there, you can try to refresh the view.

rm -rf $SPACK_ENV/.spack-env/view $SPACK_ENV/.spack-env/._view
spack env view regenerate $MYENV

Other non-obvious errors might be helped with a cleanup (this won't delete code in your env or installed packages)

spack clean -a

or refreshing the load

spack uninstall Offline
spack install Offline

When returning to the area in a new process, you should do

mu2einit
cd $MYSS
source ./setup-env.sh
spack env activate $MYENV
  • smack is a script which simply runs some common combinations of commands; it has built-in help
    • subspack creates a local spack build area backed by builds on cvmfs
    • subenv creates a local environment based on the current complete set of built mu2e code, equivalent to a muse envset: art, root, KinKal, BTrk and artdaq-core-mu2e
  • an environment limits you to a selected set of code packages and versions
  • rm, add, and develop respectively remove the package from the env, add it to the env, and check it out locally
  • @main means the head of the main branch
  • +g4 means build with Geant4 (do not skip it like we do for the trigger, which would be ~g4)
  • concretize means find a set of packages and versions which meets all demands; you only need to rerun it after changing the packages involved
  • install means compile and install as needed
  • to develop artdaq-core-mu2e, replace the kinkal lines with
spack rm artdaq-core-mu2e
spack add artdaq-core-mu2e@develop
spack develop artdaq-core-mu2e@develop
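For orientation, the add and develop choices are recorded in the environment's spack.yaml file inside the environment directory. The fragment below is illustrative only: it is an assumption of roughly what such a file contains after the commands above, not a copy from a real Mu2e environment (exact keys, versions and paths will differ).

```yaml
# Illustrative only: approximate spack.yaml content after add/develop.
spack:
  specs:
  - Offline@main +g4
  - kinkal@main
  develop:
    Offline:
      spec: Offline@main
      path: Offline
    kinkal:
      spec: kinkal@main
      path: kinkal
```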


Spack performs a rather large number of operations behind each command, and it is trying to figure it all out for you, so sometimes it will produce strange errors. It is probably best to post on the Slack channel help-bug.

Using root with Pre Built Dictionaries

In the pre-spack era, run-time resolution of .so library dependencies in art jobs was done using LD_LIBRARY_PATH. In the spack era, run-time resolution of library dependencies in art jobs is done using RPATHs and LD_LIBRARY_PATH is no longer needed. Therefore LD_LIBRARY_PATH is not defined in the default Mu2e AL9 environment.

However, root uses LD_LIBRARY_PATH for a second purpose: at run time it finds root dictionary files using LD_LIBRARY_PATH. If you run root in the default Mu2e spack environment, root will not be able to find dictionaries. This will cause errors running root on TrkAna and Stntuple format root files and on any other files that require dictionaries.

The following is a hack that will work until a better solution is developed. Before you run root, type:

 export LD_LIBRARY_PATH=${CET_PLUGIN_PATH}

You should only take this step when it is needed. There is a chance that this may sometimes interfere with subsequently running art jobs in the same shell (we have not yet fully understood this issue). You can protect against that by doing the following prior to running art jobs:

 unset LD_LIBRARY_PATH

You can read about CET_PLUGIN_PATH at SearchPaths#CET_PLUGIN_PATH.
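A standard shell alternative, if you want to avoid exporting LD_LIBRARY_PATH into the whole shell at all, is to set it for the single root invocation only, e.g. `LD_LIBRARY_PATH=${CET_PLUGIN_PATH} root -l myfile.root` (myfile.root is a placeholder). The scoping is demonstrated below with a stand-in value and `sh -c` in place of root:

```shell
unset LD_LIBRARY_PATH                       # the AL9 default: not defined
CET_PLUGIN_PATH=/tmp/demo-plugins           # stand-in value for this demo
# The assignment applies only to the one child process:
LD_LIBRARY_PATH=${CET_PLUGIN_PATH} sh -c 'echo "child sees: $LD_LIBRARY_PATH"'
echo "parent still: ${LD_LIBRARY_PATH:-unset}"
```

The child prints the stand-in path while the calling shell keeps LD_LIBRARY_PATH unset, so art jobs run afterwards are unaffected.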


Geant4 names

Geant4 data packages have different names in spack. From Julia:

g4able is G4ABLA
g4emlow is G4EMLOW
g4neutron is G4NDL
g4nucleonsxs is G4SAIDDATA
g4nuclide is G4ENSDFSTATE 
g4photon is PhotonEvaporation
g4pii is G4PII
g4radiative is RadioactiveDecay
g4tendl is G4TENDL
g4particlexs is G4PARTICLEXS
g4incl is G4INCL

G4NEUTRONXS appears to be obsolete.

References

  • lab intro talk: https://docs.google.com/presentation/d/16CxC40Hs0V0m7pwTQaSoUdo-gqy5UqojNZY824AiCBY/edit
  • spack official docs: https://spack.readthedocs.io/en/latest/index.html and fora: https://spack.io/
  • public packages: https://packages.spack.io/
  • lab spack wiki and tutorial: https://fifewiki.fnal.gov/wiki/Spack
  • CSAID has "spack" and "fnal-spack-team-mu2e" channels (invite only)
  • githubs
    • FNALssi (spack, spack tools): https://github.com/FNALssi