100% Validated Badge

The Value of Validation

Grant Viklund
4 min read · Oct 6, 2022

Having now spent time in the software world, I can see much more clearly the solution to many of the issues we encountered in production. Adopting parts of the CI/CD (Continuous Integration / Continuous Deployment) workflow makes sense and should be used in asset production. The concept is simple: validations define a contract that states a certain level of expected reliability and functionality, and this contract can be checked at any point in the pipeline. When a new asset is ready to be published for downstream consumption, the whole suite of validations can be run to confirm the contract is met and the asset is ready to deploy. If an issue is discovered further down the pipeline at a later point, new validations can be created to keep it from happening again.
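
To make the contract idea concrete, here is a minimal sketch in Python of what it might look like: a set of validation functions that must all pass before an asset is published. The Asset structure, the function names, and the specific checks are hypothetical, chosen only to illustrate the pattern.

```python
# A minimal sketch of a validation "contract" for an asset.
# The Asset structure and the individual checks are hypothetical;
# a real pipeline would validate scene files, rigs, materials, etc.

from dataclasses import dataclass, field


@dataclass
class Asset:
    name: str
    polygon_count: int
    textures: list = field(default_factory=list)


def validate_has_name(asset):
    assert asset.name, "Asset must have a name"


def validate_polygon_budget(asset):
    assert asset.polygon_count <= 100_000, "Asset exceeds polygon budget"


def validate_textures_exist(asset):
    assert asset.textures, "Asset must reference at least one texture"


# The "contract": every validation must pass before the asset ships.
CONTRACT = [validate_has_name, validate_polygon_budget, validate_textures_exist]


def is_ready_to_publish(asset):
    failures = []
    for check in CONTRACT:
        try:
            check(asset)
        except AssertionError as err:
            failures.append(f"{check.__name__}: {err}")
    return failures  # an empty list means the contract is met


if __name__ == "__main__":
    asset = Asset(name="hero_robot", polygon_count=80_000,
                  textures=["robot_diffuse.png"])
    problems = is_ready_to_publish(asset)
    print("PASS" if not problems else "\n".join(problems))
```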

Tests can be broken into various categories (Unit, Regression, Integration, etc.), but they share common tooling and can be run selectively or together as a suite. Tools common to the software world, like Jenkins, GitHub Actions, and Buildkite, help manage deployments and can handle all the steps needed to make an asset ready, including running the validation suite on each update. The setup is completely reconfigurable as needs change, and it removes the burden on the user of having to remember every step needed to move an asset into the pipeline. The less hand holding, the more predictable and reliable the solution.
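
As a rough illustration of how those categories can map onto off-the-shelf tooling, pytest markers let the same suite be sliced by category or run as a whole. The marker names and the checks below are hypothetical stand-ins for real asset validations.

```python
# Hypothetical validations tagged by category with pytest markers.
# Run selectively:  pytest -m unit        (or -m regression, -m integration)
# Run everything:   pytest
# (Markers should be registered in pytest.ini to silence warnings.)

import pytest


@pytest.mark.unit
def test_asset_name_is_lowercase():
    asset_name = "hero_robot"  # stand-in for a real asset lookup
    assert asset_name == asset_name.lower()


@pytest.mark.regression
def test_rig_joint_count_unchanged():
    # Guards against a bug seen before: joints silently added to a rig.
    expected_joints = 42
    current_joints = 42  # stand-in for querying the rig
    assert current_joints == expected_joints


@pytest.mark.integration
def test_asset_loads_in_downstream_tool():
    loaded = True  # stand-in for opening the asset in a downstream app
    assert loaded
```

With a setup like this, a CI service such as Jenkins or GitHub Actions simply invokes `pytest` (or `pytest -m regression`) on each update, so the selective-or-whole-suite choice lives in configuration rather than in anyone's memory.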

Rather than relying on manual validation checks (themselves prone to unintentional error) and each studio building its own validation solution, having a common toolset to construct them makes sense. There is high value in automating the steps when publishing an asset, so there is also high value in lowering the barrier and resistance to building them (look at Python’s testing tools to see how easy validations can be to implement). With distributed teams becoming the norm, a portable solution is becoming a must to guarantee that assets will work across tooling, standards, and geography while avoiding impacts to production.

The open source framework ripe for this type of tooling is Pixar’s USD. USD already relies on a schema format designed to provide a standard description for the elements in a scene. These definitions give the platform flexibility and let users expand its capabilities with their own schemas when necessary. By standardizing the structure, tools can react accordingly (or not at all) and the scene stays intact.
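
For readers new to USD, here is a small example using Pixar’s pxr Python bindings that shows how schemas make a scene queryable in a standard way: a tool can ask what a prim is and read its data through the schema’s typed API instead of guessing. The stage contents are made up for illustration.

```python
# How USD schemas let tools inspect a scene in a standard way.
# Requires Pixar's USD Python bindings (the pxr module).

from pxr import Usd, UsdGeom

# Build a tiny stage in memory; a real pipeline would open an asset file.
stage = Usd.Stage.CreateInMemory()
sphere = UsdGeom.Sphere.Define(stage, "/Asset/Ball")
sphere.GetRadiusAttr().Set(2.0)

for prim in stage.Traverse():
    # The schema tells any tool what this prim is...
    print(prim.GetPath(), prim.GetTypeName())
    # ...and provides a typed API for reading its data.
    if prim.IsA(UsdGeom.Sphere):
        print("  radius:", UsdGeom.Sphere(prim).GetRadiusAttr().Get())
```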

So, why not take this one step further and extend the schema format with the ability to intelligently validate its data? I have been thinking about this for a while now, and hearing the topic brought up at the USD Council meeting at SIGGRAPH 2022 reaffirmed my belief that this is a necessary feature, one that would pay big dividends for anyone using the framework. Similar to open source repos on GitHub that display stats & badges demonstrating the condition of the code, asset stores that sell models & rigged models could demonstrate the reliability of an asset using common standards. Buyers hiring service providers could create validations for partner studios to build assets against, ensuring their specific requirements are met. Studios could guarantee that a new update to a character rig will propagate properly down the production pipeline. Assets available in the metaverse could be checked before inclusion in your worlds. And with Materials… ‘nuf said.
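
To be clear, nothing like this ships in USD today (the closest existing tool I’m aware of is the usdchecker utility, which checks baseline compliance). The sketch below only imagines what schema-aware validation could look like, with validators registered per schema type and run across a stage; every name in it is invented.

```python
# A hypothetical sketch of schema-aware validation: validators are
# registered against schema types and run across a whole stage.
# This is NOT an existing USD feature; all names here are invented.

from pxr import Usd, UsdGeom

VALIDATORS = {}  # schema class -> list of validation functions


def validator(schema_cls):
    """Register a validation function for a given schema type."""
    def register(func):
        VALIDATORS.setdefault(schema_cls, []).append(func)
        return func
    return register


@validator(UsdGeom.Mesh)
def mesh_has_points(prim):
    mesh = UsdGeom.Mesh(prim)
    if not mesh.GetPointsAttr().HasAuthoredValue():
        return f"{prim.GetPath()}: mesh has no authored points"


def validate_stage(stage):
    """Run every registered validator against matching prims."""
    errors = []
    for prim in stage.Traverse():
        for schema_cls, checks in VALIDATORS.items():
            if prim.IsA(schema_cls):
                errors.extend(filter(None, (check(prim) for check in checks)))
    return errors  # an empty list would earn the "100% validated" badge


if __name__ == "__main__":
    stage = Usd.Stage.CreateInMemory()
    UsdGeom.Mesh.Define(stage, "/Asset/Body")  # no points authored -> fails
    for problem in validate_stage(stage):
        print(problem)
```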

Just as USD provides a standard set of element descriptions for scenes, a common set of tools for building portable validations will help ensure that assets meet expectations as closely as possible. Having this common standard also means that I can run the validations on my end just as easily as you can on yours. I would no longer have to send out my suite of proprietary tests and hope that everything works on your end. This not only saves me from shipping potentially vulnerable code, it is also one less application my studio has to support.

As the production world shifts to a distributed one, making sure the data we work with is as common, transparent, and interoperable as possible is going to be the new normal. Larger studios have had the resources to tackle this with proprietary solutions in the past, but with the growth of collaboration between companies, it is now a necessity for all parties, one that doesn’t always get the attention it deserves. So as the USD standard continues to evolve, there are good opportunities to look at what other industries use to make their products more reliable. With that in mind, the argument for a quality validation solution for production assets becomes a low-lift, high-value one to adopt.
