An example and exploration of whether, and how, Rez could be used to version both software and project configurations.
Table of contents
- History
- Prelude
- Features
- Workflow
- How it works
- Requirement Network
- Conditional Requirements
- This Repository
- Usage
- Architecture
- Missing
- FAQ
This repo is complex; simpler prior working versions are preserved as tags you can check out, each with a README detailing what is available and what is missing.
```shell
$ git clone https://github.com/mottosso/rez-for-projects.git
$ cd rez-for-projects
$ git checkout 1.0
```
Version | Description |
---|---|
1.0 | Initial working version |
1.1 | Conditional requirements with `@late` and `private_build_requires` |
1.2 | Got rid of `rezbuild.py` dependency, in favor of `build_command` |
1.3 | Added Workflow section and refactored directory layout |
Broadly speaking, Rez is used to optimise parallelism amongst humans in collaborative endeavors, especially those in visual effects. Every human added to a project comes with some amount of overhead to tooling and communication. At a certain scale - beyond 10-50 humans - that overhead needs management, and that's where Rez comes in.
User Stories
These are some specific scenarios that Rez, in particular through this project, addresses.
| # | As a.. | I want.. | So that.. |
|---|---|---|---|
1 | developer | to publish updates to my software | artists can use it |
2 | developer | to preserve prior updates | I can rollback if necessary |
3 | developer | to indicate which versions are stable | artists can choose between latest or safest |
4 | developer | to indicate which package depends on which other package | I can ensure compatibility |
5 | developer | to resolve an environment whereby all version requirements are fulfilled | I can develop tools that depend on it |
6 | developer | to be able to work in parallel with another developer on the same project | neither of us has to wait for the other |
7 | artist | pipeline to get out of my way | I can focus on my work |
8 | artist | software to run fast | I can focus on my work |
9 | supervisor | multiple developers able to work on a software project in parallel | I can get the most bang for the buck |
10 | supervisor | to track who published what and when | I know who to ask about updates or issues |
11 | developer | Git tags associated with software version numbers | there is a single source of truth as to what is released and what is not |
12 | supervisor | to have my show locked off from pipeline updates | nothing new gets broken during a crunch |
13 | developer | to add a comment to a published package | I can communicate to others what the changes were and why |
14 | developer | to release packages on a per-show basis | other shows are unaffected by potentially breaking changes |
15 | developer | to group related packages on disk | they are more easily browsed through via e.g. Windows Explorer or Nautilus |
16 | td | to share scripts with my colleagues without having to know Git or Rez | I can avoid a PhD in pipeline to share my work |
This repo assumes an experienced level of familiarity with Rez.
Rez is primarily a framework for resolving dependencies required for building software, which helps explain why a build step is required per default, why a `CMakeLists.txt` is presumed to reside alongside a package definition, and why requirements default to being resolved at build-time rather than run-time; `x` can't be built without `y` having been built first.
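That distinction can be sketched in a package definition (a hypothetical example; the names `x`, `y` and the versions are placeholders, not packages from this repo):

```python
# Hypothetical package.py for a package "x"
name = "x"
version = "1.0.0"

# Needed both to build and to run x; y must have been built first
requires = ["y-1"]

# Needed only while building x, never part of the runtime resolve
private_build_requires = ["cmake-3"]
```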
This repo is different. It (mis)uses Rez primarily for resolving environments, in particular those involved in launching software for an appropriate context given some VFX project or asset within that project.
It's not all upside down however; a lot of packages do contain Python modules or compiled Maya plug-ins in which case build system muscles are flexed in full.
Warnings
The community explicitly points out that Rez is not well suited for this purpose.
- "Rez is not a production environment management system" (Old, Rez-1 documentation)
- "Rez was not designed to manage production environments" (Allan, Google Groups conversation)
- "Rez makes a clear distinction between configuration management and package management" (Allan, Google Groups conversation)
But if someone told you "ice cream wasn't designed for chocolate lava cake", would you listen? :)
Motivation
So why do it? Because:
- Complex use of a single system > simple use of multiple systems
- Complex use of a simple system > simple use of a complex system
- Complex use of an established system > simple use of an ad-hoc system
- Complex use of a system with a community > simple use of a solo-developed system
So with all that out of the way, let's have a look at what's possible!
Studio-wide environment
| `base` |
| --- |
| A top-level package represents the studio itself, containing environment and requirements passed down to every software and project package. |
Per-project environment
| `alita` |
| --- |
| Every show is represented by a Project Package that encapsulates each unique requirement and environment variable, augmenting the studio-wide package. |
Free-form Overrides
| `alita` |
| --- |
| Every project provides "free-form overrides": read/write directories of scripts, plug-ins, shelves etc. for DCCs like Maya. Any artist may add or share scripts this way; it is a way for those not involved with Rez or Git to contribute and share code with co-workers. |

alita/package.py

```python
def commands():
    if "maya" in request:
        # Refers to a location outside of the package, that may or may not exist
        env["PYTHONPATH"].prepend("{env.PROJECTPATH}/maya/scripts")
        env["PYTHONPATH"].prepend("{env.PROJECTPATH}/maya/shelves")
```

This idea is mostly relevant to studios in the 1-100 size, where there aren't enough developers to justify a release cycle for any minor change, and less suitable in the 100-1,000 range where reliability trumps speed.

**A word of caution** A consequence of this feature is that you can never be sure that what works today, given a fixed set of project requirements and versions, will work tomorrow, as there is only ever one version of these globally accessible free-form overrides.
Third-party services
| `ftrack` |
| --- |
| This project refers to ftrack for production tracking, but the same applies to any external or internal service; the package merely includes appropriate environment variables that point to the remote URI. For security, the API key necessary for actually logging in and reading/writing information is provided separately at the OS level. This also helps when the key needs to change or refresh for whatever reason; there is only ever one valid key at any point in time, so no versioning is required. |

```shell
$ set FTRACK_API_KEY=xyz123
$ re ...
```

| `gitlab` |
| --- |
| Like |
Per-package combination environment
| `alita` |
| --- |
| Some requirements only make sense in conjunction with two or more packages. For example, requesting |
External and internal packages
| `pip`, `core_pipeline` |
| --- |
| Packages developed internally are managed on GitLab, cloned onto the local disk of a developer, and released on creating a new tag via the GitLab web-based UI. External packages from `$ rez pip --install Qt.py` |
Self-contained packages
| `core_pipeline` |
| --- |
| Some packages in this project reference an external payload, like |
Reference packages
| `maya` |
| --- |
| To save on disk space and avoid accessing static or large files over a potentially slow network connection, some packages carry their payload separate from their metadata. See Reference Packages for more. |
Cross-platform application packages
| `maya` |
| --- |
| The |
Multi-versioned application packages
| `maya` |
| --- |
| Because major versions of DCCs update independently, packages like |

Which means
The following documents how developers and artists interact with Rez and each other. Every release is accompanied by a mandatory develop stage; that is, no developer works directly against the files accessible to Rez and the wider audience.
- Every Rez package ends up in the `release_package_path/` directory, which is an example of where you would host shared packages within a single local area network.
- Every Rez package is developed in 1 of 3 ways:
  - GitLab tagging
  - `rez build --install --prefix`
  - `rez pip --install --release`
1. Releasing via GitLab
In most cases, you'll be editing an internal project. In this example, we'll pretend |
1.1 Prerequisites
|
1.2 Develop Getting started on fixing a bug or implementing a feature involves an edit-and-install cycle.
Whenever |
1.3 Release Once happy, you're ready to release.
A |
2. Releasing without GitLab
The |
2.1 Develop Development work remains unchanged from the previous tutorial, except:
The package is available in your |
2.2 Release Again, similar to the previous tutorial.
The package is now available in your |
3. Releasing `pip` packages

3.1 Develop Install any package from

```shell
$ rez pip --install Qt.py
```

The package is available in your

3.2 Release Just append `--release`.

```shell
$ rez pip --install --release Qt.py
```

The package is now available in your |
The project defines 3 types of Rez packages.
Package Type | Description | Examples |
---|---|---|
`software` | Self-contained software distribution | `qt_py`, `pyblish_base`, `pip` |
`reference` | A software package whose payload resides elsewhere | `maya`, `python` |
`bundle` | Combines two or more packages | `alita`, `lotr` |
Terminology Reference

Software Package | Reference Package | Bundle Package |
---|---|---|
A self-contained Rez package, with payload and metadata residing within the package itself. | A package whose metadata references an external payload, such as Maya or Python. | A package that combines two or more packages, and may or may not carry a payload. |
Software Package:

```
~/
  packages/
    core_pipeline/
      python/               # Payload
        core_pipeline/
          __init__.py
          lib.py
          util.py
      package.py            # Metadata
      rezbuild.py
```

Reference Package:

```
/opt/
  maya2018/                 # Payload
    bin/
      maya

~/
  packages/
    maya/
      2018/
        package.py          # Metadata
```

Bundle Package:

```
~/
  packages/
    alita/
      python/               # Payload
      maya/
      package.py            # Metadata
      rezbuild.py           # Metadata
```
Here's an example of how one request is resolved.
```
$ re alita maya

                           _______
                          |       |
                        . | ~2018 | .
                        . |_______| .   weak reference
                        .           .
                     ___.___      _.____
                    |       |    |      |
$ command-line >    | alita |    | maya |
                    |_______|    |______|
                      /    \         \
- - - - - - - - - - -/- - - \- - - - -\- - - - - - - - - - - - -
                 ___/__    __\_____    _\_________
                |      |  |        |  |           |
                | base |  | python |  | maya_base |   resolved
                |______|  |________|  |___________|
                   |            \
                 __|__    _______\_______
                |     |  |               |
                | pip |  | core_pipeline |
                |_____|  |_______________|
```
Some combinations of packages give rise to intelligent behavior.
```
$ re alita maya

                    ________________________
                   |                        |
$ command-line >   |      alita + maya      |
                   |________________________|
                     /      \       \      \
- - - - - - - - - - /- - - - \- - - -\- - - \- - - - - - - - - -
                  _/___    _\___   __\______    \______
                 |     |  |     | |         |  |       |
                 | ... |  | ... | | pyblish |  | mgear |   resolved
                 |_____|  |_____| |_________|  |_______|
```
Because `maya` was included, `alita` imbues it with extra requirements.
Properties

- Resolving an environment with only `maya` yields a "vanilla" environment whereby the latest version of Maya is present.
- Resolving an environment with only `alita` yields a "vanilla" environment whereby the latest version of this project and its environment is present.

Additionally

- Resolving an environment with both `maya` and `alita` yields an environment whereby:
  - A specific version of `maya` is present, one compatible with the project, via the weak reference `~maya-2018`
  - A specific set of requirements is included, relevant to both the project and application, such as `mGear` or `pyblish`, via the `@late` decorator of `requires()`
Specific version of Maya to a given project

```python
# alita/package.py
name = "alita"
version = "1.0"

requires = [
    "~maya-2018",
]
```
Specific set of requirements to a given combination of project and application

```python
# alita/package.py
name = "alita"
version = "1.0"

@late()
def requires():
    if in_context() and "maya" in request:
        return ["mgear-1"]

    if in_context() and "nuke" in request:
        return ["optflow-3.5"]

    return []
```
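To make the late-binding concrete, here is a plain-Python simulation (not the Rez API) of how the `requires()` above reacts to different requests; `in_context()` is omitted for simplicity, so only the request check is modelled:

```python
def late_requires(request):
    """Mimic the @late requires() above, given a list of requested packages."""
    if "maya" in request:
        return ["mgear-1"]
    if "nuke" in request:
        return ["optflow-3.5"]
    return []

# Requesting alita with maya pulls in mGear; with nuke, optflow; alone, nothing extra
print(late_requires(["alita", "maya"]))  # ['mgear-1']
print(late_requires(["alita", "nuke"]))  # ['optflow-3.5']
print(late_requires(["alita"]))          # []
```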
This repository combines several aspects normally kept separate in an actual Rez-ified production environment. For example, the `dev/` directory is typically local to a developer's machine, and the `local_packages_path/` is typically `~/packages`. And so forth. See the table below or the READMEs contained in each sub-directory for details.
Directory | Description |
---|---|
`.rez/` | Private files to this repo |
`dev/` | Representation of a local development directory |
`local_packages_path/` | Representation of the default `~/packages` directory |
`remote_packages_path/` | Representation of a shared location for released packages |
dev/
Local development directory.
Directory | Description |
---|---|
`core-pipeline/` | Representation of an internal project, hosted on e.g. GitLab |
`maya-base/` | |
`mgear/` | Representation of an external project, hosted on GitHub |
`pip/` | External project, temporarily hosted locally for release |
`rez-bundles/` | Internal project, containing all projects and applications |
dev/rez-bundles/
Internal mono-repo of projects and applications.
Directory | Description |
---|---|
`alita/` | DCC and software requirements, and environment for the Alita project |
`lotr/` | Likewise, but for Lord of the Rings |
`base/` | Common studio environment |
`maya_base/` | Common studio environment for Maya |
`maya/` | System reference to Maya-2017-2019 |
`nuke/` | System reference to Nuke-11v3.2 |
`python/` | System reference to Python-2.7 and -3.6 |
Prerequisites

- Windows, Linux or OSX
- bleeding-rez
- `python` available on your PATH
- `rez` available on your PATH
Install
On either Windows or Unix, run the below.

> Don't forget about `--recursive`, due to the `rez-bundles` submodule.
```shell
$ set PATH=<-- path/to/rez/Scripts/rez -->;%PATH%
$ git clone --recursive https://github.com/mottosso/rez-for-projects.git
$ cd rez-for-projects
$ ./build_all
```
The build script will make contained packages available for resolve
Now enter a shell.
```
$ ./shell

==============================
 Welcome to rez-for-projects!

 This demo illustrates how projects
 and software can happily co-exist
 with Rez.
==============================

Usage
-----
$ rez env               # Establish a Rez environment
$ re                    # ..using an alias
$ re alita              # In a given project
$ re alita maya         # With a given application
$ rez build --install   # Edit and release new package
$ ri                    # ..using an alias
$
```
The shell script configures Rez to look for packages in this repository, exposes the aliases `re` and `ri` for common Rez commands, and provides you with a greeting message. It does not implement any custom behavior; everything is native to Rez.
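The aliases themselves could be as simple as the following sketch (hypothetical; the actual definitions live in the repo's `shell` script):

```shell
# Assumed definitions -- see the repo's shell script for the real ones
alias re="rez env"
alias ri="rez build --install"
```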
- `base` is required by every project, and defines general variables accessible to all projects and DCCs, such as `PROJECTS_PATH`, which is an absolute path to where projects are stored relative to a given platform, e.g. `/mnt/projects` on Linux
- `maya`, `nuke` are standalone DCCs, installed on the local system and referenced by a package
- `core_pipeline` represents a shared, common library used on all shows and all DCCs
- `maya_base` likewise represents shared Maya requirements and environment variables
- `alita` and `lotr` are "configurations", in that they represent a project, rather than software
- `alita` is associated to version 2018 of Maya, via a "weak reference"
- Combinations of two or more packages result in a specific list of requirements and environment variables via the `@late` decorator.
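As an illustration, a `base`-style package could define `PROJECTS_PATH` per platform roughly like this (a hedged sketch, not the actual `base/package.py` of this repo; `env` and `system` are objects Rez injects into `commands()` at resolve time, so this is a package definition, not a standalone script):

```python
name = "base"
version = "1.0.0"

def commands():
    # "system" and "env" are provided by Rez inside commands()
    if system.platform == "windows":
        env["PROJECTS_PATH"] = "p:/projects"    # example path, assumed
    else:
        env["PROJECTS_PATH"] = "/mnt/projects"
```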
Here are some of the things I'd like to happen but haven't figured out how to do yet.

- **Cascading Overrides** If `/projects/alita/rez` is on the `REZ_PACKAGE_PATH`, then the contained `maya/package.py` should add to the studio-wide Maya configuration for this project. Similar to how CSS works.
  - Could potentially be implemented by having every project require a stub `project_override` package, that per default does nothing, but can be implemented elsewhere and added to the `REZ_PACKAGES_PATH`.
  - Another, less appealing way is by "subclassing" a project, e.g. `alita_override`, of which the original `alita` package is a requirement, along with additional requirements and environment variables. The downside of this is (a) you need one package for each permutation and (b) the user would need to stop typing `alita maya` and start typing `alita_override maya`, which is error-prone and tedious on both developer and user ends.
| Cons | Pros |
|---|---|
| **Out of Sync** A package can be installed, but the content it references can be missing; e.g. a user may not have Maya 2018 installed. | **Reach** Create packages out of software you wouldn't normally be able to, due to software, system or permission restrictions. |
| **Lack of Granularity** A new version of a package doesn't affect the content, which complicates updates to the payload. | **Space** Large, rigid packages like Maya would otherwise build up large requirements for your file server and archiving solution. |
| | **Performance** Running multi-gigabyte software from a network location isn't healthy for your fellow DevOps engineers. |
| | **Iteration time** As the package contains solely requirements and environment variables, releasing a configuration package can be instantaneous. |
Reference Packages are a way to utilise Rez for software that is otherwise impractical or impossible to confine into a package; for example, it may be too large, or dependent on its native installation path, such as Maya (large) or PyQt4 (absolute paths embedded in its linked libraries).
How it works
With Rez, each package consists of two parts.
- Metadata
- Payload
Some packages carry both definition and content, like `core_pipeline`, such that whenever a new version of this package is made, its content is updated too.
```
.packages/
  core_pipeline/
    2.1.0/
      python/       # content
      package.py    # definition
```
Other packages reference something on the local system instead.
```
.packages/
  maya/
    2018.0.1/
      package.py ---.    # definition
                    .
                    v
c:\program files\autodesk\maya2018\bin\maya.exe    # content
```
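A reference package definition matching the layout above could look roughly like this (a hypothetical sketch; the version and install path are examples, and `env` is injected by Rez into `commands()`):

```python
# maya/package.py -- metadata only; the payload stays where the
# installer put it and is merely referenced via PATH
name = "maya"
version = "2018.0.1"

def commands():
    env["PATH"].prepend("c:/program files/autodesk/maya2018/bin")
```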
I.e. why not store them with the project, and reference that?
By keeping a Rez package self-contained:
- You enable versioning of project-specific payload
- You avoid package and payload from getting out of sync
- You enable re-use of a package
Consider the following example.
```
packages/
  alita/
    package.py

/
  projects/
    alita/
      scripts/
        maya/
```

```python
# package.py
def commands():
    env["PYTHONPATH"] = "${PROJECT_PATH}/scripts/maya"
```
This package cannot exist without an externally set PROJECT_PATH
environment variable. Without it, the environment cannot be entered, and yet the package can still exist on your REZ_PACKAGES_PATH
, sending mixed messages about its availability.
If instead scripts for Maya were contained within the package itself..
```
packages/
  alita/
    scripts/
      maya/
```

```python
# package.py
def commands():
    env["PYTHONPATH"] = "{root}/scripts/maya"
```
Then a package being available means payload being available too, and if you wanted to reuse this package in some other project, you could.