Houdini Solaris

source: https://vimeo.com/375782717?embedded=true&source=vimeo_logo&owner=1723479

  1. Switch interface from Build to Solaris
  2. under the obj network, build a Geometry node and inside it get a sphere primitive > change its type to mesh

  1. in the viewport, select the top row of polygons
  2. either hit Tab in the viewport to add a group node

  1. or in the group node properties, hit the arrow to select the top row of polygons. after selecting, hit Enter to confirm the selection
  2. name the group top

  1. for the bottom row, use a group combine node
  2. set the name of this group combine > specify it as Equals all but “top” (the group name)

  1. end the geo with a null > rename the null to OUT

  1. switch the network to stage (no need to get a lopnetwork in stage) or bring a lopnetwork node inside the obj network

  1. it is possible to continue in the obj network, but it is highly recommended to switch to the stage network for Solaris
  2. get a lop network node
  3. inside the lopnetwork > get a SOP import to import the sphere and other objects. Can also get a USD import

  1. in the sop import, click on the arrow to open the floating operator chooser, then choose the sphere OUT from the list
  2. scroll down to subset groups. type in both group names; because there are 2 groups, the import group section won’t pick up both of them, so it needs to use subset groups instead
  3. the scene graph tree will now show the sphere mesh primitive and its geometry subsets

  1. for this demo, add a few more object geometries in the obj network
  2. add those into the lopnetwork by using SOP imports. again, they could also be USD imports from production

  1. transform them a little off the ground if needed
  2. add a SOP create grid for the ground and rename it
  3. put down a merge node to merge them all
  1. add the material library node

  1. get 5 principled shaders inside the material library for this demo
  2. name them and set them all to different colours

  1. in the attributes of the material library node > hit auto-fill materials
  2. the materials should now show up in the scene graph
  1. add an assign material node

  1. drag and drop the geo and the material from the scene graph to the assign material slots
  2. hit + to add more material assign slots
An alternative way is to activate assign to geometry in the material library and drag the object to the geo path, so it uses one less node in the tree. However, the material library will not be as friendly if it needs to be passed on to another scene file.
  1. add a light, change the light type if needed
  2. move it to a desired location
  1. add a dome light > change its colour
  2. merge the lights with the assign material node

  1. add the Karma renderer node after
  2. and now the viewport should be able to switch to karma view
  1. in the Karma node attribute, switch it to XPU for a faster render result
  1. for a floating renderview, go to window > new floating panel
  2. switch the panel to scene view
  1. change the floating panel network to obj/lopnet or stage/lopnet, depending on where the lopnet graph sits
  2. and then the view can be set to karma
  1. back to the graph
  2. add another light to the merge. change its light color.
  3. switch the light to disk

BigPS: better scene graph hierarchy. By default the SOP import node will not put the imported geo inside a group. It is good practice to put the imported geo inside a geo group, like in Katana.

  1. For all of the SOP import nodes, put /geo/$OS in the import path prefix

light linking:

  1. add a light linking node before karma render node

  1. in the light linking node > on the right side (the geometry), select the geo(s) and create a collection

  1. drag the collection or just the light into the center rules column
  2. then drag the light into the rules column to either link or unlink, to perform the light linking

Light Mixer:

  1. add a light mixer node after the light
  2. in the properties of the mixer node, drag and drop the lights from the list onto the right side
  3. change to the sliders view to adjust each light’s properties
  4. the star column is for solo mode
  • a red dot indicates it has been edited; right click to revert the changes

Snap shots: these store not only the render but the whole network. any value, even the node structure, is stored with the snapshot

  1. click on the little screen split bar to pop the snapshot section out
  2. click on snap to store one
  1. right click and revert network to this snapshot to go back to that setting. it is not limited to the light mixer but covers everything in the stage network

render to disk

  1. in order to render to disk, a camera needs to be added
  2. in the karma render settings node > set the output picture location; by default $HIP points to wherever the scene is saved
  3. set the camera, resolution, primary samples, and the min and max samples for the secondary
  4. under image output and filters > the denoiser options require an nVidia GPU, but they work for both CPU and XPU renders
  1. under render > scheduler > shows the progress of the render and the jobs being lined up
  1. the USDrender_rop node has the render to disk button
  2. the location should create an extra render folder, as the output picture path suggests

ps. if for whatever reason the camera gets tilted:

  • focus reset cam: space + g

Collection and Instancer:

  1. create a cube, sphere, torus in obj network
  1. SOP import all 3 of them into the lopnetwork in the stage network
  2. add a SOP Create grid just to demonstrate the stage manager
  3. merge them together
  4. add an instancer
  1. inside the instancer, some sort of points are needed for the objects to sit on. for now, put an add node in there so it has at least 1 point to work with (a VEX alternative is sketched below)
  2. click on preview on the add node and go back out
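
A minimal VEX alternative to the add node, assuming a detail-mode wrangle inside the instancer; it just creates one point at the origin for the prototypes to sit on:

    // Attribute Wrangle (Run Over: Detail) - stand-in for the Add SOP
    // creates a single point so the instancer has at least one point to work with
    addpoint(0, {0, 0, 0});
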
  1. in the instancer, switch the prototype index to Index for preview
  2. drag the index number to see each object

Stage manager: put geo into groups

  1. connect the stage manager after the merge
  2. in the options, hit the plus to add a folder
  3. name the folder and put some of the objects into the folder

To utilize it, the folder needs to be converted into a collection


Collection:

  1. add a collection node after stagemanager
  2. in the collection option > set a collection name
  3. inside primitives, type in the folder name with slashes around it (/folder name/) > put a star at the end to grab all the geo inside the folder
  4. now the objects inside the folder have become a collection, and that collection can be used in the instancer
  1. connect the instancer back
  2. set the prototype primitives to %collectionname (% followed by the collection name)
  3. now when dragging the index number, it will only show the objects that are within the folder and the collection

Instancer (advanced):

  1. now switch the prototype index to Random
  2. and inside the instancer get a grid or any other object that can provide more points; the instancer will scatter the objects from the collection across them
  1. furthermore, adding a scatter node gives further control over how many objects scatter across the object’s points

Attribute Wrangle:

  1. set the prototype index to index attribute in order to use the attribute wrangle
  2. go into the instancer node and add an attribute wrangle node at the end
  3. in the VEXpression, type in: i@index = 1;
  4. this will now show index 1; if it is changed to 2 it will display only index 2, just like dragging the slider from the outside, but now controlled by an attribute expression. that way, it can be mapped to a noise value
  1. add an attribute noise node before the attribute wrangle
  2. set the name to c and make it a float
  3. set the operation to add, and the range values to min 0 max 1
  1. in the attribute wrangle, change the expression to: i@index = chramp("i", @c);
  2. click on the create parameter button on the right to create a ramp
  3. set the point on the left to value 0 and the right point to value 3, so based on the ramp it scatters across the 0 – 3 range driven by the noise
  1. for finer adjustment, constant-value points can be added to the ramp to fully control how much of index 1, 2 and 3 is mapped across the ramp
  2. also add another VEX line to control the scale: @pscale = chramp("ps", @c); (see the sketch after this list)
  3. press the create spare parameter button on the right
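
Putting both lines together, a minimal sketch of the wrangle inside the instancer, assuming the upstream attribute noise wrote the float attribute c in the 0–1 range and the ramps are named as above:

    // Attribute Wrangle (Run Over: Points), placed after the attribute noise node
    // "i" and "ps" are the ramp parameters created with the create spare parameter button
    i@index  = chramp("i", @c);   // remap the noise to a prototype index (ramp values 0-3)
    f@pscale = chramp("ps", @c);  // remap the same noise to a per-point scale
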
  1. further adjust the attribute noise element size and roughness
  1. for random orientation, an attribute randomize can be used (a VEX alternative is sketched below)
  2. set the attribute name to orient and the dimensions to 4
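
A rough VEX stand-in for the attribute randomize node, in case a wrangle is preferred; it builds a (not perfectly uniform) random quaternion per point:

    // Attribute Wrangle (Run Over: Points) - alternative to the attribute randomize node
    // random Euler angles per point, converted to the 4-component orient quaternion
    vector ang = set(rand(@ptnum + 0.13), rand(@ptnum + 0.37), rand(@ptnum + 0.71)) * 2 * PI;
    p@orient = eulertoquaternion(ang, 0);
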
  1. close out the instancer’s network with the output node at the end

Assign Material to poly groups for instancer: Experimental

  1. for each object > create a group node > select the polys and put them into the group (previously covered) > give the group a name
  1. back to Stage / Lopnetwork
  2. create a material library and make a few materials, one for each type; in this case there are 3 materials: cube_mtl, sphere_mtl and torus_mtl
  1. for the assign material node > put that before the stagemanager and collection nodes
  2. drag and drop the mesh group for the material assignment and the material accordingly
  1. for the instancer > plug into both the primary and secondary inputs. for some reason the secondary input doesn’t carry over the material library in the scene graph, but with only the primary input the material assign needs to include the instancer path. so plugging into both the primary and secondary inputs seems the only way (further investigation needed)
  1. lastly, add a light / camera / karma node for the render

Randomize basecolor attribute parameter:

  1. this example will randomize the basecolor of the materials
  2. go into the instancer
  3. add an attribute randomize
  4. set the attribute name to basecolor
  1. get back outside, to the stage/lopnet instancer node
  2. in point attributes to copy > add basecolor at the end
  3. make sure to use karma CPU to render; as of 19.5, XPU doesn’t support the randomized basecolor attribute

Assign randomized material on instancer with attribute: (might not work, seems like a bug)

  1. add an attribute wrangle and assign a vector attribute with the value of 1; in this example for the cube, type in: v@cubeVector_attribute = 1; (see the sketch below)
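
A minimal sketch of that SOP-level wrangle, repeated per object with its own attribute name (the names here are only this example's):

    // Point Wrangle on the cube geometry (Run Over: Points)
    // assigning a float to a vector attribute sets all three components to 1
    v@cubeVector_attribute = 1;
    // on the sphere and torus, use e.g. v@sphereVector_attribute / v@torusVector_attribute
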
  1. back to stage / lopnetwork / all the SOP import nodes
  2. check on the attributes > add the existing attribute that has just been assigned to the object
  1. inside the material library > add a bind node and plug it into the basecolor
  2. rename the bind node to the custom vector attribute name
  1. inside the instancer > in the attribute randomize, put the custom attribute name cubeVector_attribute into the attribute name
  1. in the instancer node > point attributes to copy > should also be the custom attribute name cubeVector_attribute

USD export possibilities:

Set up the geo with multiple groups and an additional attribute

  1. make sure the geo is not a primitive, otherwise it will not import into the lopnetwork correctly > in this example of a primitive sphere, switch it to a geo mesh
  1. set up multiple groups with unique names
  1. use an attribute create node to assign one of the groups an attribute name
  2. close it with the OUT null
  3. save the houdini file

lop network multiple variations of the sop import:

  1. in stage > lop network > sop import, load OUT_sphere and make sure it is loaded as a reference
  2. note that by default the path in the scene graph is set to /sopimport1
  1. change the primitive path to a desired hierarchy, in this case /geo/sphere/
  2. note from the scene graph that the geo is now placed inside as such
  3. uncheck the import path prefix
  1. check on the import group and assign the group names to bring in all the groups
  1. for the USD file layer save path, save where the houdini file is and create a geo folder to put this usd file in
  1. assign the groups into subset groups for material assignment
  1. load the attribute that was created
  1. name this sop import as the original

set up an individual hierarchy of the groups as subcomponents

  1. duplicate the original sopimport by alt-dragging
  2. change it to load just the specific group, and set the output name to that specific group as well
  3. connect them one into another so the scene graph has a proper hierarchy, just like loading 2 different objects into the scene graph
  1. however, they are loaded in as “component”, which is fine, but they can be redefined if needed
  1. get a configure primitive node
  2. put in the path of the things that need to be changed; in this case it is everything after the sphere group, so it will be /geo/sphere/*
  3. and change the kind to subcomponent

setup USD variant export

  1. get the add variants to existing primitive node
  2. close the previous chain with an out null
  3. connect the OUT null to the begin and end of the variant block
  4. add a null in between for the name of the variant
  1. add a prune for the branched-out variant
  2. prune out the groups that aren’t needed
  1. also get a null for the name of the variant
  1. connect that to the end variant block
  2. set the primitive path the same as how it was set up in the scene graph
  3. name the variant set to the name of this asset
  1. add a set variant node
  1. the pull-down of the variant set should have the set that was just assigned from the block
  2. after that, the variant names should show the 2 names from the variant block nulls; pick the one that should be the default when it loads
  1. add a configure layer node
  1. pick the default primitive from the drop down
  2. then pick the save path from the drop down
  1. end with a USD Rop node to write out the file
  1. make sure “use relative paths” is enabled
  2. set the output file name
  3. hit save to disk
  4. the primary one will be outside of the subfolder and the variants will be inside the subfolder

note: currently I have no idea why it writes a usdnc file instead of just usd (likely because non-commercial Houdini licenses save USD with the .usdnc extension)


USD UV Texture: a texture loading node that can be used for Karma and materialX. However, I haven’t found a supporting article about the usage and benefits of this image node

fallback colour is the default colour used when no file is loaded


MaterialX: an open-source material standard that can be supported and rendered by different engines, and it can also be stored in USD. MaterialX has its standard surface uber shader that covers pretty much all attributes. However, if users want to create their own, they can use a subnet node and put the desired attributes in it to create a custom shader.

  1. inside the material library, get a subnet node. It is recommended to rename it
    • MaterialX has its own version of the subnet that only allows materialX nodes inside the materialX subnet
  1. the auto-fill materials in the material library may not pick up a subnet that doesn’t have a unique name. However, it can always be filled in manually even if the subnet was not renamed
  1. inside the subnet there are input and output nodes. Most importantly, make sure the custom node is connected to the output node
    • the MaterialX subnet has slightly different nodes inside, as it already has a displacement output prepared
  • also note that inside the materialX subnetwork the output node usage can be changed, eg: for a volume shader, displacement shader, etc
  1. since a subnet can build any type of custom shader, including a Karma shader, to utilize a materialX shading network get a materialX surface and a materialX surface material node
  1. for a base material, get a diffuse BSDF node
  2. the diffuse BSDF includes the diffuse colour and its roughness value
    • in a materialX subnet, there is no need to put the materialX surface material in between
  1. to add specular reflection, the diffuse BSDF node needs to be layered (vertical blend) with a dielectric node
  2. the dielectric BSDF node has the attributes for IOR, reflection, roughness, etc
  1. however, for a metallic material a conductor BSDF is needed. It has the IOR and extinction coefficient attributes, as well as the normal input for it
  1. to be more artist friendly, get the artistic IOR node for calculating the correct IOR and extinction coefficient values
  1. to transition between a dielectric and a metallic material, a materialX mix (horizontal) node is needed
  1. after the mix (horizontal blend), an additional dielectric BSDF node can be added with a layer node for an additional clear coat blend (vertical blend)
  2. additional nodes can be added as vertical blends, such as sheen and thin film
  1. back in the subnet node, custom parameters can be added by clicking the gear icon
  2. the parameter page will pop up. simply drag and drop the parameters to be exposed to the editor. Put in the name (parameter name) and label (the display name for the user)
  1. hitting apply will have the parameters exposed
  2. users can always right click on a parameter name in the editor to group/tab the parameters for further UI adjustment

materialX image: this is the image node for materialX

  • be sure the signature type fits the input. eg, for roughness the image signature should be switched to float, for a normal to vector 3, etc

materialx normal map: a normal map node just like in every other render engine

  • be sure the input image node is set to vector 3. the default normal value should be 0.5, 0.5, 1
  • the normal map will somewhat blur away if subsurface is fully in use. this can be balanced by reducing the subsurface gain, or simply by using a displacement map

Material Geometry Property Value: this is a primvar node, and it is also available for UV coords: switch the signature to vector 2 and the geomprop to uv, then it will read the texture coords for the image node. Some render engines need this to map the texture properly.

  1. if the geo has some attributes defined in the obj network, those attributes can be recalled by the geompropvalue node
  2. select the mesh in the scene graph and look into the scene graph details
  3. put any primvar name into the geompropvalue node’s geomprop input and define the signature type

MaterialX absolute value: this node gives a pure value. there is no need to assign a name and make it an attribute, or to stick an extra value onto the shader; it is purely a value node that can be linked into any shader as a master control value. It can give out color/vector/float values.


mtlxTextureCoord and mtlxUSDTransform2d: these 2 combined perform a 2d texture placement where the user can rotate, scale and translate the UVs. Instead of using the geom property value, this setup will be more useful in most cases


Procedural 3D nodes and the materialX position node: all the procedural 3d nodes, like the noise nodes, require a position input. The MaterialX position node is like a 3D placement node in Maya

  1. picking either model or object will make the texture move with the model when it moves
  1. add a multiply node to multiply its position value to get a tighter procedural noise

Export: it is better to include a materialX surface material node, even on a standard surface uber shader, for export. Simply right click on any node and hit save as materialX. Other applications like Maya/Clarisse will be able to load it.


focus view: space + f
home view: space + g


Material library / Material network: a material library inside stage is the same as a material network in obj, but if the user chooses to put a material network inside obj, that material network will later need to be imported into the stage or lopnetwork

  1. inside obj network, get a material network
  2. create some materials inside it
  3. go to stage network
  4. get a scene import (material) node
  5. by default the node is set to import everything in the additional materials
  6. or the list of materials to import can be manually enabled or disabled by clicking the bundle chooser button on the right

VEX-based normal map node: displacement texture

  1. get a displacement texture
  2. switch the texture type to normal for normal map

MaterialX Displacement: It is best to put materialX shader in a materialX subnet. MaterialX Subnet has a proper setup for displacement.

  • make sure the image node output is set to float
  • houdini’s scale is 1 unit = 1 meter, so the displacement scale should be set to 0.01
  • the materialX displacement doesn’t have any default height option, so a remap is needed
  • depending on whether the displacement map is half float or full float, the remap needs to be adjusted accordingly. if it is a baked 32-bit displacement map with negative values, then it shouldn’t need a remap node
  • some of the Megascans assets require a color offset on their displacement map. to do that, use a color correct node and reduce the gain to 0.75 for a better result

ps. in houdini 20, for the remap set to colour4 (FA); somehow the float remap will be gone when a triplanar image node is plugged in


USD export: source from NINE-BETWEEN https://youtu.be/cE-Bbdspu_8?si=phqFLsGokzNE9o3

  1. assuming there are 2 variants of the asset inside the geometry node
  1. inside stage, get a component builder
  2. in the first component geometry node, change the source to external SOP and pick which variant is to be saved
  3. rename the component geo node with a _variant suffix for practical reasons
  1. inside the material library
  2. create a material
  3. rename it, and make sure the material is flagged with the orange icon on the right
  1. since the material is renamed properly, it will show its name in the scene graph
  2. the component material node is like the assign material node
  3. the material path of the component material node will be filled in with the name of the renamed material automatically by the default script
  4. the script grabs the first material, sorted in alphabetical order
  5. however, since the component material node works like the assign material node, it should be possible to assign those manually
  6. make sure to rename the component material node
  1. lastly, rename the component output name, as it will become the file name of the USD
  2. it is recommended to generate a thumbnail by checking the “view thumbnail camera” button, setting the viewport to Karma and hitting the generate thumbnail button
  3. hit save to disk and it will generate 5 files: the geo usd, the material usd, a usd that references the two, a container usd, and a png for the thumbnail

Reference node: to load USD file

  1. simply select the file from disk in the file pattern input

export USD assets with multiple variants:

  1. duplicate or get another component geo node and select the desired obj output
  2. add a component geometry variant node; it is like a merge node
  1. add an extra material
  2. make sure to auto-fill the materials
  1. for the component material node:
  2. instead of using the script for the material path, drag one of the materials onto it. leave the primitives as the script so it applies that material to both objects
  3. make another component material node and drag the other material over to the material path, so all objects will have the 2nd material as well
  4. wire that into the graph; now both objects should have both materials assigned to them
  1. to export this USD file, make sure the component output name is no longer just the single asset variant name; choose a name that can represent all the variants
  2. save the file and it will create another folder that contains the 4 new USDs

explore variants node

  1. add the explore variants node to check
  2. set the mode to variants
  3. set the layout to stacked XY
  4. give it a bit of spacing
  5. this can be used after the reference node to check the output USD file

Layout node: to reference USD and populate in a scene

  1. it is recommended to prepare assets using the solaris layout asset gallery
  2. in the solaris layout, add a new layout asset gallery pane
  1. by default it will load the default houdini asset database
  2. create a new database
  3. browse a location and save a .db file; it is recommended to save it with the usd export asset folder

  1. now the gallery is empty
  2. hit the + icon to add an asset
  3. load the usd file with the thumbnail that was generated during the usd export
  4. then the asset will be added to the gallery
  1. get a geo to scatter or place the objects on; in this case there is a grid
  2. get the layout node
  3. in the layout node, drag and drop the asset from the gallery
  1. make sure the layout node is selected
  2. select the asset from the list
  3. pick a tool; in this case use Place to place it manually
  4. hit enter or the show handle button on the left
  5. start dragging on the plane to place the asset. hold and drag to adjust the size
  1. in the options, the method by default is Point Instancer. this is a more memory-conservative way, because no matter how many assets are placed it is considered a single layout item
  1. however, it can always be split into multiple individual items by choosing instanceable reference; then it will split into multiple items under layout
  1. by RMB-clicking on an item, the user can choose variants
  2. it will automatically add a setvariant node into the scene
  1. or this action can be done manually by placing a set variant node
  2. drag and drop the item whose variant is to be switched
  3. select the variant set, either geo or material
  4. and choose the variant name or pick it from the drop down menu

Instancer and collection with USD: source:NINE-Between example

  1. get an instancer; whether the geo is fed from outside the instancer or built inside is fine
  2. if the geo is fed from outside, get a lopimport node inside the instancer
  1. pick a LOP path
  2. then pick the primitive within it.
  1. get an unpack USD and switch the output to polygons, to define some attributes
  1. drop an attribute paint node to paint a mask: hit the icon on the right and press enter, or the geo select button on the left, and start painting
  2. inside the attribute paint node, go to attribute and give the attribute a name. by default it is set to mask
  1. get a scatter and align node
  2. adjust the coverage if needed
  3. or utilize the mask by changing the point count method to by coverage
  4. check on the density attribute and pick the mask, or type in the name of the painted mask
  1. going back to the stage network
  2. add a reference node and feed it into the instancer
  3. pick the USD file at the file pattern.
  4. then it should scatter with the USD
  1. to randomize the variants, add an explore variants node
  2. leave the mode on duplicate variants
  1. add a collection node in between
  2. give the collection a name
  1. drag and drop the path of the desired reference asset into the primitives field
  2. add /* at the end of the path to indicate everything within the reference folder
  1. going back to the instancer node
  2. at prototype primitives > add or replace the collection from the drop down menu
  3. uncheck only copy specified prototype primitives

material library Visualize node: to preview a node or a chain

  1. select any node and press X to activate the visualize node

Edit material network: the node that allows reference-editing materials for lighting

  1. add the edit material network node after the reference
  2. drag and drop the material to edit into the material path
  3. hit load
  1. inside the edit material node there are 2 parts, split by the collect node
    • the top part is for reference-editing the shader
    • the bottom part is a basic preview shader, in case the engine can’t read any of the shaders
  2. everything is grayed out; check the attribute that needs to be edited and edit it
  3. inside this edit material node, users can add or remove nodes in the shader if needed
  • there is an environment variable that makes the viewport always display the preview shader to save some resources. will google that later

Load a simulation into solaris and export usd after lighting and scene assembly:
source from Nine between: https://www.youtube.com/watch?v=cE-Bbdspu_8&t=7382s

sublayer node: it is like a reference node, but it can load multiple references and offload them when they are not needed.

  1. load the USDs with a sublayer node in the stage network
  2. lopimport it in the obj network
  3. unpack the usd so it can be edited again
  4. convert it to vdb for collision
  5. convert it back to poly
  1. do the particle simulation thing
  2. most important is to set attributes
  1. make sure the simulation has P (position), pscale (scale), v (velocity)
  2. click the little trail icon on the side of the obj network viewport to view the velocity
  3. hit D for the display options; under geo, set the point size to view pscale
  4. pscale and v will be used in the render. if they are not defined, they default to a value of 1, which will make the particles look huge in the render
  5. also make sure to delete the attributes that are not needed, to reduce the USD size afterwards (see the sketch after this list)
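
A minimal point wrangle sketch for that attribute check, dropped after the sim; the default pscale value here is only an assumption:

    // Point Wrangle (Run Over: Points) after the simulation, before export
    // make sure the render attributes exist so they don't fall back to a value of 1
    if (!haspointattrib(0, "pscale")) f@pscale = 0.05;  // assumed particle size
    if (!haspointattrib(0, "v"))      v@v = {0, 0, 0};
    // unneeded attributes can then be dropped with an attribute delete node to keep the USD small
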
  1. back to the stage network
  2. bring in the simulation with a sop import node
  3. make sure the import path prefix has /sim/$OS to put the simulation in a group
  4. add a material library and a light
  1. assign the whole sim group to the shader, either directly in the material library or with an assign material node
  1. add velocity motion blur in the Karma render settings
  2. since the point primitives (the simulation particles) have velocity, under karma render settings > camera effects > switch velocity blur to velocity blur. it will then take that attribute into use. There will be no effect if there is no v attribute (may need to rename the v attribute to velocities)
  • It is common to add multiple Karma render settings nodes for different render resolutions, for preview and final renders

USD Rop node: outputs the render information, the whole lopnetwork, into USD so that even a different render engine can render with this information.

  1. set a frame range; by default it renders only a single frame
  2. for the output file name, there is no need to add .$F. in the file name to output a file every frame, unless it has too many frames or needs to be split into 2 renders
  3. “flush data after each frame” may also be switched on when splitting usd files, for clearing caches

MaterialX Volume shader:

  • the velocity is set the same as the density so the velocity doesn’t exceed the density
  • make sure there is a density attribute, because it will be needed for a primvar later
  1. bring in the sim as it was in the previous section
  2. get a materialX subnet; since the subnet’s default output is a surface shading output, change it to volume
  1. add a materialX volume
    • vdf: for smoke (density)
    • edf: for fire (emission)
  • note: if not using a mtlxSubnet, then just start the shader with the mtlxvolume under the material library
  1. assign the volume primitive in the material library or an assign material node
  1. add a materialX anisotropic VDF node and increase the size. there should be voxels showing in the render. this is just to visualize the bounds of the voxels
  • something to note: somehow Karma doesn’t refresh when rendering a volume shader. restart the render if it doesn’t render anything
  1. get a materialX geom prop value to recall the primvar: the density attribute from the simulation
  2. connect it into both the anisotropic absorption and scattering. that will replace the values that were punched in earlier
  1. add multiply nodes to further control the density value
  1. note that the anisotropic node accepts vector 3 data, so that means it can bring in colour
  2. multiply the density with a constant colour
  3. bring in a materialX constant, switch the output to colour and multiply it with the primvar-multiplied value for both absorption and scattering
  4. the colour for absorption is extinction based so it renders the opposite colour; the scattering input is not

Camera:

  • beware that the numbers in the near clipping range and the focal length value may blow up. reset them if needed
  • it is suggested to keep the near clipping range low, otherwise stuff will get cut out
  • depth of field is controlled by F-stop and focus distance. F-stop 0 means no DoF
  • with the transform handle active and the camera selected, hitting shift+F will show the focal plane
  • shift+click anywhere again to focus on the clicked spot

AOV

  1. check on the passes if needed and they will show in the render output
  1. under the aov tab, there is a section for extra render vars
  2. hit the + and type in the name of the aov, or select one from the drop down on the right

LPE tags: similar to light partitions, which can be stored along with the passes

  1. when split per LPE tag is activated in the AOV, go inside the light, under the karma tab
  2. under LPE tag, switch to set or create
  3. type in $OS (or a custom name) to set the tag as the name of the light
  4. it will add another pass for each aov that has split per LPE checked on
  1. by default the denoiser is only added to the C aov. click on the left to add the denoiser to more aovs

USD Render Rop: a render node

  1. after the render settings are done
  2. set the render to the current frame or a range of frames in the rop node, then hit render to disk
  3. if multiple render settings nodes have been added, the render settings drop down should have the different render settings to select. make sure the other render settings primitive is set to a different name
  4. render delegate is where the render engine is chosen

Configure Layer: a layer node that outputs different path commands for USD, to split different categories of the scene into different USDs

  1. typing either start new layer or configure layer will bring in the same node; start new layer just has some different default settings
  1. by default, under the scene graph layers tab, any object either generated by houdini or referenced in will be set to implicit in the save control
  1. bringing in a configure layer node and setting its save path will turn the asset into explicit
  2. it is good practice to separate the configure layer node with a null to make the graph more readable
  1. another configure layer can be given to all the lights, and another layer to all the cameras
  2. when they merge with a merge node, set the merge style to separate layers
  1. for assets being scattered by layout and instances, the configure layer can be set to flatten layer so it combines the whole thing together
  2. note that the referenced-in assets will not be saved
  1. at the end, for the USD Rop node, change the save style to flatten implicit layers
  2. then hit save to disk
  1. the ROP will save a container usd file that contains the locations of the other branches’ USDs
  2. work with a software engineer to set up a better database folder structure for the pipeline

Bring pivot and object bottom to ground level:

option 1: use a match size node > set justify Y to min > leave X and Z at center

option 2: in a transform node, set translate X and Z to -$CEX and -$CEZ, and translate Y to -bbox(0, D_YMIN), to move the object to the center above the ground. then set the pivot translate to the opposite: $CEX and $CEZ for X and Z, and bbox(0, D_YMIN) for Y
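
A VEX alternative sketch for the same idea; note it uses the bounding-box centre rather than the point centroid that $CEX/$CEZ give, and it moves the points instead of setting a pivot:

    // Point Wrangle (Run Over: Points): centre the object over the origin
    // and drop its bounding-box bottom to ground level
    vector bbmin, bbmax;
    getbbox(0, bbmin, bbmax);
    vector centre = (bbmin + bbmax) * 0.5;
    @P -= set(centre.x, bbmin.y, centre.z);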


Show UV space: in sop (obj) > select geo > Space+5

Leave a comment