Friday 10 May 2019

Liang–Barsky Line Clipping And Why You Need Not Bother

I spent numerous hours working out how to efficiently clip strands/polylines in VEX for a project with dense strand meshes. I thought I'd been really clever implementing the Liang-Barsky algorithm on the NDC cube, and then I stumbled into the problem of how to interpolate all the existing attributes at the intersection points. When I asked for help on various forums, Matt Estela pointed out that the same effect could be achieved with four Clip SOPs in NDC space, with the additional bonus of 'free' attribute interpolation. I can't bear the thought of all that time going to waste, so here is a VEX implementation of Liang-Barsky - maybe someone can find some kind of use for it!
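For anyone curious about the algorithm itself, here's a minimal Python sketch of Liang-Barsky clipping a segment against an axis-aligned box (defaulting to the NDC cube). This isn't the VEX from the hip file - just the same idea in a testable form; the function and parameter names are my own:

```python
def clip_segment(p0, p1, lo=(-1.0, -1.0, -1.0), hi=(1.0, 1.0, 1.0)):
    """Liang-Barsky: clip the segment p0->p1 against an axis-aligned box.

    Returns (t0, t1), the parametric range of the segment that survives,
    or None if the segment lies entirely outside the box.
    """
    t0, t1 = 0.0, 1.0
    for axis in range(3):
        d = p1[axis] - p0[axis]
        # Each axis contributes two slab planes: (p, q) per the usual
        # Liang-Barsky formulation, with t = q / p at each plane.
        for p, q in ((-d, p0[axis] - lo[axis]), (d, hi[axis] - p0[axis])):
            if p == 0.0:
                if q < 0.0:          # parallel to the slab and outside it
                    return None
            else:
                t = q / p
                if p < 0.0:          # entering the slab: raise t0
                    if t > t1:
                        return None
                    t0 = max(t0, t)
                else:                # leaving the slab: lower t1
                    if t < t0:
                        return None
                    t1 = min(t1, t)
    return t0, t1
```

The returned t0/t1 are exactly what you'd need for the attribute problem above - lerp every attribute along the segment at those parameters (which is what the Clip SOP gives you for free).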

File is here.

Monday 6 May 2019

Cantor Pairing For Impact Data And Generating Dynamic Bullet Constraints

There have been some fantastic resources for demonstrating how to create dynamic constraints in Bullet - most notably Rich Lord's stuff and, as usual, examples on Matt Estela's site.

The process of establishing which collisions took place between which objects can, however, be quite tricky and computationally intensive - particularly as the Impact Data that records this stuff contains many duplicates of impacts between the same pair of objects across multiple substeps. Usually you'd use a few For Loops to rationalise and structure this data.

The scene below uses a simple pairing formula called Cantor Pairing to essentially encode a collision between a pair of objects into a single number - it's then very easy to see if that collision has happened before and/or to remove duplicates of that collision. It also seems to be quite fast.
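The pairing formula itself is tiny: for non-negative integers a and b, the Cantor pairing is (a + b)(a + b + 1)/2 + b. Here's a Python sketch of the idea (the hip file does this in VEX over the impact points; the names below are mine). Sorting the pair first makes the encoding symmetric, so an impact of A against B and B against A collapse to the same key:

```python
def cantor_pair(a, b):
    """Encode a pair of non-negative ints as a single unique int.
    Sorting first makes (a, b) and (b, a) produce the same key."""
    a, b = min(a, b), max(a, b)
    return (a + b) * (a + b + 1) // 2 + b

# Deduplicate a stream of (objA, objB) impacts, e.g. across substeps:
seen = set()
unique_impacts = []
for pair in [(3, 7), (7, 3), (3, 7), (2, 5)]:
    key = cantor_pair(*pair)
    if key not in seen:
        seen.add(key)
        unique_impacts.append(pair)
# unique_impacts is now [(3, 7), (2, 5)]
```

Because each pair maps to one integer, "has this collision happened before?" becomes a single set lookup rather than a nested loop over previous impacts.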

The sample scene does that encoding and then establishes constraints dynamically at impact points. I think (hope) it's as simple as it can get but, as always with Houdini, there are probably better ways.

hipfile.

Sunday 5 May 2019

Weighted Average of Quaternions Without Flipping?

If you look around the Internet for the best mechanism to create an average quaternion, and particularly a weighted average of multiple quaternions you will almost always end up at this pdf from NASA.

A simpler way of averaging quaternions is just to sequentially lerp or slerp between pairs of quaternions, adjusting the weights at each step, but this may or may not be accurate depending on the order in which you select the pairs. There's some debate about the best way to do this.

An additional problem with simply lerping quaternions is their "double-cover" property: two different quaternions (negatives of each other) represent the same 3D rotation. A naive lerp between them depends on which of those 'doubles' each quaternion actually is - you can go the long way round or the short way round, and looking at the orientation in the viewport you have no idea which one is being used. If, say, you have two similar rotations but one is stored as something close to the negative of the other, you can get very fast rotations through nearly 180 degrees which look like flips. Slerp solves this problem, but only between pairs (it also provides constant angular velocity as the interpolant increases, which normal lerp doesn't).
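The double-cover fix inside a typical slerp is just a dot-product sign check: if the quaternions point into opposite hemispheres, negate one before blending. A Python sketch (quaternions as (x, y, z, w) tuples; VEX's built-in slerp does the equivalent):

```python
import math

def slerp(q0, q1, t):
    """Slerp between unit quaternions, always taking the short arc."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:
        # Double-cover: q1 and -q1 are the same rotation, so flip q1
        # to the hemisphere of q0 and the blend goes the short way round.
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:
        # Nearly parallel: plain lerp then renormalise is numerically safer.
        out = tuple((1 - t) * a + t * b for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in out))
        return tuple(c / n for c in out)
    theta = math.acos(dot)
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(a * s0 + b * s1 for a, b in zip(q0, q1))
```

With that sign check in place, feeding in either 'double' of q1 gives the same interpolated rotation - which is exactly the guarantee a component-wise lerp can't make.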

When you use primuv to interpolate quaternions across a polygon in Houdini, that interpolation is a simple component-wise lerp of the quaternion values - you cannot guarantee that the blend will take the shortest arc (in which case you'll get strange intermediate rotations).

I found some Python code implementing the NASA paper and made a scene to compare and contrast the different types of interpolation. The NASA results look way better than the naive primuv-style lerp but, since it's Python/NumPy, it's not that fast.
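For reference, the core of the NASA method is small: accumulate the weighted outer products of the quaternions into a symmetric 4x4 matrix and take the eigenvector with the largest eigenvalue. A NumPy sketch of that idea (not the exact code in the scene; names are mine):

```python
import numpy as np

def weighted_avg_quaternions(quats, weights):
    """Weighted quaternion average per the NASA paper: the average is the
    eigenvector of M = sum(w * outer(q, q)) with the largest eigenvalue.
    Since outer(q, q) == outer(-q, -q), the double-cover problem vanishes."""
    M = np.zeros((4, 4))
    for q, w in zip(quats, weights):
        q = np.asarray(q, dtype=float)
        M += w * np.outer(q, q)
    vals, vecs = np.linalg.eigh(M)       # M is symmetric, so eigh
    return vecs[:, np.argmax(vals)]      # eigh returns unit eigenvectors
```

A nice property: feed it a quaternion and its negative (the same rotation twice) and it still returns that rotation, where a naive component-wise average would collapse to zero.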

File is here.