Bendy Bones: implement a new curve-aware vertex to segment mapping mode. #110758

Merged
Alexander Gavrilov merged 1 commit from angavrilov/blender:pr-bbone-curved-mapping into main 2023-09-26 14:17:29 +02:00

Currently vertices are mapped to B-Bone segments without taking the
rest pose curve into account. This is very simple and fast, but causes
poor deformations in some cases where the rest curvature is significant,
e.g. mouth corners in the new Rigify face rig.

This patch implements a new mapping mode that addresses this problem.
The general idea is to do an orthogonal projection on the curve. However,
since there is no analytical solution for Bézier curves, it uses the
segment approximation:

  • First, boundaries between segments are used for a binary space
    partitioning search to narrow down the mapping to one segment.
  • Then, a position on the segment is chosen via linear
    interpolation between the BSP planes.
  • Finally, to remove the sharp discontinuity at the evolute surface
    a smoothing pass is applied to the chosen position by blending to
    reduce the slope around the planes previously used in the BSP search.

In order to make per-vertex processing faster, a new array with the
necessary vectors converted to the pose space, as well as some
precomputed coefficients, is built.
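
As a rough illustration of the three steps above and of the precomputed per-bone data, here is a minimal Python sketch. It is not the actual C implementation from the patch, it omits the smoothing pass, and all names, the choice of tangents as plane normals, and the example bone shape are illustrative assumptions only:

```python
import numpy as np

def curved_segment_map(co, joints, plane_normals):
    """Map a pose-space point `co` to (segment_index, head_tail_factor).

    joints        -- (n+1, 3) rest positions of the b-bone segment joints.
    plane_normals -- (n+1, 3) unit normals of the boundary plane at each
                     joint (here taken as the curve tangent there), assumed
                     precomputed once per bone in pose space.
    """
    n = len(joints) - 1

    # Step 1: binary space partitioning search over the joint planes,
    # narrowing the point down to a single segment in O(log n) steps.
    lo, hi = 0, n
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if np.dot(co - joints[mid], plane_normals[mid]) >= 0.0:
            lo = mid  # point lies on the tail side of this plane
        else:
            hi = mid  # point lies on the head side of this plane

    # Step 2: signed distances to the two bounding planes give the linear
    # interpolation factor along the chosen segment.
    d_lo = np.dot(co - joints[lo], plane_normals[lo])
    d_hi = -np.dot(co - joints[hi], plane_normals[hi])
    denom = d_lo + d_hi
    t = 0.5 if abs(denom) < 1e-9 else float(np.clip(d_lo / denom, 0.0, 1.0))

    # Step 3 (omitted here): blend the result to reduce the slope around
    # the planes used in the search, softening the evolute discontinuity.
    return lo, t

# Example: a quarter-circle rest curve approximated by 4 segments.
angles = np.linspace(0.0, np.pi / 2.0, 5)
joints = np.stack([np.sin(angles), 1.0 - np.cos(angles), np.zeros(5)], axis=1)
tangents = np.stack([np.cos(angles), np.sin(angles), np.zeros(5)], axis=1)
print(curved_segment_map(np.array([0.6, 0.1, 0.0]), joints, tangents))
```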

The new mode is implemented as a per-bone option in order to ensure
backward compatibility, and also because the new mode may not be
optimal for all cases due to the difference in performance, and
complications like the smoothed but still present mapping
discontinuities around the evolute surface.

| Straight (original) mapping | Curved (new) mapping | Comparison of deformations |
| -------- | -------- | -------- |
| ![Straight](https://projects.blender.org/attachments/4a528ff8-76d4-47db-97f8-f4124b95ce95) | ![Curved](https://projects.blender.org/attachments/e186bb8e-e10b-42e2-90ee-ea9e3e6bd2f4) | ![Comparison](https://projects.blender.org/attachments/4834731d-db0e-41c7-ba9b-b93da24b8569) The setup emulates a mouth corner. |

The colors above represent the head-tail position along the curve that a vertex would be mapped to, with a darker blue or red fill used for points mapped exactly to the start or end of the curve. In the case of the curved mode, beyond the picture bounds to the left all lines eventually align parallel to the middle (green) line.

This [Blender File](https://projects.blender.org/attachments/cb1eb40d-2682-41bb-8b70-e579fcb68b4a) reproduces the mapping dynamically using a viewport node material (it supports between 2 and 14 segments), and was used to render the first two pictures. The file also contains a script that tests the actual implemented mapping by checking the result of `PoseBone.bbone_segment_index` for a number of empties placed based on the image rendered by the shader model. There are tests for the 4, 8 and 14 segment cases, but the segment count has to be changed manually.

This [Blender File](https://projects.blender.org/attachments/d4e1bdfa-64a9-402f-961a-3c9718532b14) contains a simple deformation test case and was used to generate the third image.

Alexander Gavrilov added this to the Animation & Rigging project 2023-08-03 15:29:59 +02:00
Alexander Gavrilov requested review from Nathan Vegdahl 2023-08-03 15:30:27 +02:00
Alexander Gavrilov requested review from Sybren A. Stüvel 2023-08-03 15:30:27 +02:00
Alexander Gavrilov requested review from Demeter Dzadik 2023-08-03 15:30:27 +02:00
Author
Member

@blender-bot package

Member

Package build started. [Download here](https://builder.blender.org/download/patch/PR110758) when ready.

Member

I've only skimmed the code so far. But there are some higher-level things I'd like to discuss:

  1. I could be mistaken, but based on [your visualizations](https://projects.blender.org/attachments/3a073250-e965-4725-aa0e-3a0d1de86310) it looks like the new mapping has singularities that are resolved by introducing arbitrary-ish discontinuities. Maybe that's unavoidable. But if there's a mapping that e.g. blends gracefully and is smooth everywhere, so that everything is mapped in a clear and well-behaved way regardless of distance from the b-bone, I think that would be good. If that's not feasible, then oh well. But I think it's worth some investigation.
  2. As someone who lived through Blender not taking b-bone curvature into account at all (i.e. vertices would be deformed *even in the rest pose*, because the deformation code didn't account for b-bones being curved in the rest pose), I feel like the naming could maybe be a bit more descriptive. "Straight mapping" sounds to me like it's computing the mapping with the b-bone itself in a straightened-out pose, like in the bad old days. Maybe we could call these "parallel mapping" and "curved mapping" instead? Then again, maybe it only feels confusing to me because of my memory of those old issues.

In any case, the functionality definitely looks useful to me! So from my perspective, it's mainly a matter of making sure we make the right set of trade offs in the solution here.

Author
Member
> 1. I could be mistaken, but based on [your visualizations](https://projects.blender.org/attachments/3a073250-e965-4725-aa0e-3a0d1de86310) it looks like the new mapping has singularities that are resolved by introducing arbitrary-ish discontinuities. Maybe that's unavoidable. But if there's a mapping that e.g. blends gracefully and is smooth everywhere, so that everything is mapped in a clear and well-behaved way regardless of distance from the b-bone, I think that would be good. If that's not feasible, then oh well. But I think it's worth some investigation.

This discontinuity is fundamental and is essentially caused by the loss of 1:1 continuous, monotonic and unambiguous correspondence between orthogonal projection and nearest point on curve when the original point is further than the radius of curvature. You would observe the exact same jumpy behavior from Shrinkwrap when your points are too far inside the target mesh.

Also, the location of the discontinuities is not arbitrary: they are located on the [evolute](https://en.wikipedia.org/wiki/Evolute) of the bone curve. It's just that since the bone shape is approximated with segments, the evolute also becomes segmented in a weird way instead of smooth.

Finally, note that this mapping has to be recomputed for every vertex every time the Armature modifier runs. Thus, it cannot be too complicated to compute.

Edit: but, in practice, I wonder if this discontinuity even matters? What would be the practical use case where you would want the bone to actually deform any vertices located in the problem area? In my view of the use case for this, the bone is a part of a loop surrounding some kind of hole in the mesh, so the middle would be empty.

> 2. As someone who lived through Blender not taking b-bone curvature into account at all (i.e. vertices would be deformed *even in the rest pose*, because the deformation code didn't account for b-bones being curved in the rest pose), I feel like the naming could maybe be a bit more descriptive. "Straight mapping" sounds to me like it's computing the mapping with the b-bone itself in a straightened-out pose, like in the bad old days. Maybe we could call these "parallel mapping" and "curved mapping" instead? Then again, maybe it only feels confusing to me because of my memory of those old issues.

Mapping here has nothing to do with deformation. It essentially corresponds to different Auto Weight algorithms, but for b-bone segments (and in the case of segments you unfortunately don't have the option of manual weight painting).

Edit: However, in a way your impressions are actually right here I think - the deformation itself was fixed, but segment weighting was still based on the straightened out bone shape all the way up until now.

Member

> This discontinuity is fundamental

Right, I'm aware that it's fundamental with this exact mapping. What I was suggesting is looking for a different mapping that's roughly equivalent near the bone where it matters, but transitions to something smooth where the discontinuity would otherwise be so that vertices in that region are also "well behaved" in some sense.

A trivial example of this kind of "getting rid of discontinuities" approximation is the smooth absolute value function `sqrt(x*x + n)`, when you want something that approximates an absolute value but is continuous. And maybe such an approximation isn't possible in this case (it's certainly a much more involved mapping than the absolute value function!), in which case oh well. But if it is possible, I think that would be preferable.
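
To make that concrete, here is a tiny sketch of the idea (the constant `n` below is just an arbitrary illustration value controlling how wide the rounding is):

```python
import math

def smooth_abs(x, n=0.01):
    # sqrt(x*x + n) approaches |x| away from zero but rounds off the kink at 0.
    return math.sqrt(x * x + n)

for x in (-1.0, -0.1, 0.0, 0.1, 1.0):
    print(f"x = {x:+.1f}   |x| = {abs(x):.3f}   smooth_abs = {smooth_abs(x):.3f}")
```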

Having said that, that could also be a future addition. If we have multiple mappings anyway, we can just add another to the list if we find a better one in the future. But I think taking a bit of time to try to figure that out now would still be worthwhile.

> Also, the location of the discontinuities is not arbitrary, they are located on the evolute of the bone curve.

I could definitely be wrong here, but it looks to me like the discontinuities aren't actually on the exact evolute. As you mention, the b-bone is segmented, and it's that segmentation here that introduces the (possible) arbitrariness I was referring to. In deviating from the analytic evolute, you have to make some choices, I think?

But I don't think that's a meaningful issue or anything. The main point was what I mentioned above, about looking for a different mapping that wouldn't have the discontinuities at all (if possible).

Author
Member

> A trivial example of this kind of "getting rid of discontinuities" approximation is the smooth absolute value function `sqrt(x*x + n)`, when you want something that approximates an absolute value but is continuous. And maybe such an approximation isn't possible in this case (it's certainly a much more involved mapping than the absolute value function!), in which case oh well. But if it is possible, I think that would be preferable.

How do you imagine that kind of thing applying here at all? What you need to do here is map the vertex to a point on the curve (basically Head Tail slider on a Copy Transforms constraint), because each vertex is weighted to exactly two adjacent segment joints with some weight split between them.

> I could definitely be wrong here, but it looks to me like the discontinuities aren't actually on the exact evolute. As you mention, the b-bone is segmented, and it's that segmentation here that introduces the (possible) arbitrariness I was referring to. In deviating from the analytic evolute, you have to make some choices, I think?

It's not really arbitrary; it is caused by the first step of narrowing down the mapping to just one segment (pair of adjacent joints). Like I said, this mapping has to be computed from scratch for every vertex every time it is deformed, and thus has to be as fast as possible. For performance reasons, this first step is therefore done by binary partitioning of space using normal planes at joints. This means:

  • Cuts are done using just those planes, i.e. the boundary is polygonal and not smooth.
  • Planes are tried in binary search order, i.e. the first cut is done using the middle joint, then the middle on each side, and so on. This ensures that the cost is log(number of segments), but as a result the resolution of the evolute approximation decreases exponentially as you move away from the end of the bone: you get one or two pieces a single b-bone segment wide, then two segments wide, then four, etc. (see the sketch below).
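
To make the search order concrete, here is a small illustrative sketch (hypothetical indexing, not the patch code) printing which joint planes the search consults for an 8-segment bone:

```python
def planes_visited(num_segments, target_segment):
    """Joint planes tested before the binary search settles on one segment."""
    lo, hi, visited = 0, num_segments, []
    while hi - lo > 1:
        mid = (lo + hi) // 2
        visited.append(mid)
        if target_segment >= mid:
            lo = mid
        else:
            hi = mid
    return visited

for seg in range(8):
    print(seg, planes_visited(8, seg))
# Every lookup tests only log2(8) = 3 planes: joint 4 first, then 2 or 6, etc.
```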

Do you have any reply re whether you can think of any practical mesh, rig and weight paint where this discontinuity would actually matter, i.e. you would have vertices weighted to that b-bone in that area?

Member

> How do you imagine that kind of thing applying here at all?

I wasn't suggesting using the smooth absolute value function—that obviously doesn't apply. It was an example of taking a mapping with a discontinuity (in that case a first derivative discontinuity) and creating an approximation without one. What I was suggesting is seeing if we can figure out a continuous approximation of the current mapping in the PR.

Again, maybe that isn't possible or feasible. And I'm certainly not suggesting blocking this PR on it (apologies if I gave that impression). But I do think it would be preferable if possible, and is worth at least a cursory exploration before landing this.

> Do you have any reply re whether you can think of any practical mesh, rig and weight paint where this discontinuity would actually matter

The issue with discontinuities is that they tend to make things finicky, and that's usually (though not always) undesirable. The shrinkwrap modifier is a good example of this, where its functionality is good, but it's extremely finicky to use in non-trivial cases. And sometimes that's just unavoidable, which is fine. Better to have useful-but-finicky functionality than no functionality at all.

But as a rule of thumb I think it's good to avoid discontinuities in these kinds of things when feasible. So I'm just suggesting some exploration into that here.

Author
Member

> I wasn't suggesting using the smooth absolute value function—that obviously doesn't apply. It was an example of taking a mapping with a discontinuity (in that case a first derivative discontinuity) and creating an approximation without one. What I was suggesting is seeing if we can figure out a continuous approximation of the current mapping in the PR.

I mean, what would even be the logic, and how would it look? Maybe you can draw on that picture to show it.

The logic of this mapping is that it does an orthogonal projection on the curve, with some approximation because of the segment representation to simplify and speed up the math. This discontinuity happens because there are points where there is no solution since the curve has ends, so they are mapped to the nearest end of the curve as a fallback, and you get locations where one point has a solution in the middle of the curve, while an adjacent point has no solution.

> The issue with discontinuities is that they tend to make things finicky, and that's usually (though not always) undesirable. The shrinkwrap modifier is a good example of this, where its functionality is good, but it's extremely finicky to use in non-trivial cases. And sometimes that's just unavoidable, which is fine. Better to have useful-but-finicky functionality than no functionality at all.

Well, at least one point where the problem is smaller here compared to shrinkwrap is that this mapping normally isn't animated. You can imagine this discontinuity as a cut in a rubber sheet deformed by the bone; I think it would be pretty obvious if it happens in a rig, and you can remove weight from this bone to avoid it.

Also, as seen in the image, the discontinuity always starts from the boundary planes of the curve ends, and I'm pretty sure that it is guaranteed to always be outside the volume limited by the planes of both ends (so e.g. for a bone in a mouth corner, never assign inner bottom lip vertices to the top lip bone and vice versa, and you should be safe).

I also made a blend file that replicates the lines of the above picture dynamically using geometry nodes (but without the color fill), so you can use it to play with the boundary for different bone shapes.

Member

> I mean, what would even be the logic, and how would it look?

My initial thought would be something like the lines (well, planes really) curving to become parallel before they intersect. Realistically, I think that would be difficult to implement efficiently, so I'm not necessarily suggesting that. But that's an example of the kind of thing I'm thinking about.

I think for a more feasible mapping, it would probably need to be something that takes good advantage of the BSP. We know the length of each b-bone segment, and in general a BSP can tell you what side of the dividing plane a point is on and its minimum distance from that plane. And with that kind of information available, a signed-distance-field-esque approach probably has the most potential. So that's along the lines of what I'm thinking, basically.

> Well, at least one point where the problem is smaller here compared to shrinkwrap is that this mapping normally isn't animated.

Isn't it implicitly animated with shape keys? That comes up a lot with e.g. corrective shape keys, face rigs, etc.

Author
Member

> And with that kind of information available, a signed-distance-field-esque approach probably has the most potential.

The problem with that kind of thing is how wide to make it without extra options so that it is useful - too narrow a transition would still be a jump, while too wide an area would cause muddled deformation. Also, a point can be close to multiple planes and a careless approach can cause multiple small fractured discontinuities instead of a single easily understandable one...

> Isn't it implicitly animated with shape keys? That comes up a lot with e.g. corrective shape keys, face rigs, etc.

True, I forgot about that. However, at least shape keys and the extent of their application are normally controlled by the rigger, so they can test that everything works well and an animator can't inadvertently mess it up 😄

And, in the way I imagine this being used (primarily lips, i.e. the bone is at most a quarter of a loop around a hole, and more likely 1/8th), the discontinuity would happen in an area where no vertices assigned to the bone would reasonably occur. I think something extending across the hole from one lip to another (like fangs, tusks or something) would likely warrant having their own deform bones anyway for control or rigidity.

Member

> The problem with that kind of thing is how wide to make it without extra options so that it is useful

Sure, but that's the kind of thing I'm suggesting exploring. If you just don't want to do that exploration, that's totally fair. But I think it would be worth doing if you're up to it. And maybe nothing useful comes out of it, but also maybe something does.

> However, at least shape keys and the extent of their application are normally controlled by the rigger, so they can test that everything works well and an animator can't inadvertently mess it up 😄

Yeah, that's true. But also, maybe there's a rig where I want the benefits of the better b-bone deformation and also want to allow the animators to push the shape keys to extremes sometimes. I mean, it's totally okay to basically say, "Hey, you can do one or the other, but not both." Sometimes that's just a reality of engineering features. But if feasible, it would be nice to not force users into that choice.

Member

Here's an example that demonstrates artifacts from the discontinuities in the mapping.

Blend file:
[bbone_deform_artifacts.blend](/attachments/900276fd-f97a-44fb-a6d2-51ba8c04536c)

Movie (notice the mouth corners):

<video src="/attachments/659251d5-cdb0-4c2b-a222-e486cb283882" title="bbone_deform_artifacts.webm" controls></video>
This is based on the [Rain asset](https://studio.blender.org/characters/5f1ed640e9115ed35ea4b3fb/v2/), with that asset's actual cheek puff shape keys. However, the armature deformation setup is tossed together by me with auto-weighting, because the original rig was built in a way that wouldn't really be affected by the new deform mode (the b-bones were disconnected around the lip corners, and were generally fairly straight).

Nevertheless, I think this is reasonably representative, because it demonstrates that real production shape keys can and will move vertices enough to hit the discontinuity. That could perhaps be mitigated with more careful weight painting and bone placement... but then the rigger may also have to make choices between weights and bone placement that give the best deformations vs those which avoid the discontinuities of the mapping when shapes get applied.

Again, I'm not suggesting that a smooth mapping is necessarily achievable. But I also don't think we've explored it enough to reasonably rule it out either. And this test makes me more convinced that it really would be a good idea to do that exploration before landing this.

Author
Member

After much experimenting using a shader model, I think I came up with a smoothing scheme without glitches.

![Screenshot_20230816_125607.jpg](/attachments/4cd4f059-3089-4222-a596-a0bc3b41774a)

I wonder if the coefficients in the yellow and blue nodes are the best, and whether the purple one should use Smooth Step or not.

Alexander Gavrilov force-pushed pr-bbone-curved-mapping from 857dd786d2 to 241b7ba307 2023-08-16 15:01:18 +02:00
Author
Member

@blender-bot package

Member

Package build started. [Download here](https://builder.blender.org/download/patch/PR110758) when ready.

Author
Member

> Here's an example that demonstrates artifacts from the discontinuities in the mapping

So, like I commented above and in the chat, I implemented a smoothed mapping (with a blend file containing a material-based model for math testing, experiments, and making pictures). However, it does not solve the problem in your test case - although vertices don't jump any more, they still move unnaturally. I think this is unavoidable through smoothing, because of how big the magnitude of the shape change actually is compared to the size of the bone.

I think this test is an instance where the test itself needs fixing. The shape key is actually pulling the mesh away from the bone structure in the mouth corner - I'd say it's not what it should be doing, especially in such a delicate area. Also, the mouth corner itself isn't placed in the best way: it should probably be a bit higher to match the edge loop, and more forward (outward) to reduce the curve the corner bones take if you look from above. Some rest pose ease values could also use reducing.

I found that some tweaks to the position of the mouth corner bone joint and ease values seemed to mostly fix the problem.

Member

This is awesome! Thanks for putting in the time and effort.

I'm running into a weird bug with the latest build, though. When I open the file, some polygons around the mouth are missing. But it's inconsistent. Every time I open the file (no saving, literally just opening the same file) different faces are missing:

| file opened once | file opened again |
|----|----|
| ![missing_polys_01.png](/attachments/7ca740ba-c8eb-4457-85d8-53c0f9cd4016) | ![missing_polys_02.png](/attachments/55da6852-4a53-4af3-94e5-da70451d2521) |

Sometimes they're missing from the corners of the mouth, sometimes from the lips, and sometimes no faces are missing and it's fine.

That aside, this is looking really good.

> However, it does not solve the problem in your test case - although vertices don't jump any more, they still move unnaturally.

I agree that the vertices still move weird. But that's also the case (although to a slightly lesser extent) even with the straight mapping. I think that's just fundamental to any technique where weights are derived from vertex position.

The benefit of the smooth mapping (or at least the benefit I was looking for) isn't that it prevents all weirdness, it's that it prevents *popping* artifacts, which are immediately noticeable and disturbing even when fairly minor/subtle. A bit of weird movement can be smoothed over without too much effort, whereas resolving popping artifacts is a huge pain because they're quite noticeable even when made smaller.

For example:

<video src="/attachments/81cb838d-8297-402e-9fbe-5c060baabba9" title="smooth_vs_pop.webm" controls></video>
Both of these have a subtle corrective smooth modifier applied to mitigate the artifacts. The left is with the new smooth mapping, the right is with the old discontinuous mapping. Unless you're paying close attention to the corners of the mouth, the left one looks fine. But the right one is still visually disturbing even when just casually noticing the mouth corners.

So the smooth mapping changes the problem from something that's really painstaking to address into something that's quite reasonable to address.

> I think this test is an instance where the test itself needs fixing.

This test certainly isn't great rigging work, ha ha! I tossed it together quickly for illustrative purposes. However, I also didn't place the bones to intentionally trigger the artifact. I did try to place them as I actually would for a real rig, just very hastily done.

And in the general case, bone placement won't necessarily help you here, because moving a bone closer to some vertices moves it further away from others. So you're always playing a kind of balancing game. Additionally, bone placement affects deformations in more ways than just how the vertices get weighted to the b-bone segments, and the best deformations in those other respects may be better with the bones placed more center-mass rather than close to the mesh surface.

In any case, I'm pretty happy with the functionality now. Although it should probably get a bit more thorough testing. @Mets when you're back, can you take a crack at doing some rigging experiments with this, and see how it goes?

And I'll give this a proper code review soon as well.

Again, this is great work @angavrilov. Thanks a bunch!

Author
Member

> And in the general case, bone placement won't necessarily help you here, because moving a bone closer to some vertices moves it further away from others. So you're always playing a kind of balancing game. Additionally, bone placement affects deformations in more ways than just how the vertices get weighted to the b-bone segments, and the best deformations in those other respects may be better with the bones placed more center-mass rather than close to the mesh surface.

Rather than moving the bones closer to the mesh surface, the tweaks I tried focus more on moving the corner vertically to match the position of the corner edge loop, and laterally to change the angle of the plane the corner bones lie in so that it points more forward than sideways (basically unbending the shape of the mouth corner bones a bit when viewed from above). This reduces the effect where moving the vertices forward in the global sense makes their projection on the mouth surface defined by the bone appear to move 'inward' from the corner into the middle of the mouth. After all, the discontinuity happens for vertices on the opposite side vertically when you move inward.

Alexander Gavrilov force-pushed pr-bbone-curved-mapping from 241b7ba307 to 2b4bde578f 2023-08-17 15:31:56 +02:00
Author
Member

@blender-bot package

Member

Package build started. [Download here](https://builder.blender.org/download/patch/PR110758) when ready.

Alexander Gavrilov force-pushed pr-bbone-curved-mapping from 2b4bde578f to 5327a7d5d8 2023-08-18 18:53:54 +02:00
Author
Member

@blender-bot package

I tweaked the math to reduce the dependency of the width of the smoothing band on the number of segments by scaling the radius appropriately.

Member

Package build started. [Download here](https://builder.blender.org/download/patch/PR110758) when ready.

Alexander Gavrilov force-pushed pr-bbone-curved-mapping from 5327a7d5d8 to 479d0b8fd0 2023-08-19 21:12:00 +02:00
Author
Member

@blender-bot package

Found a way to make the mapping isolines bordering the pure blue and red areas even smoother:

| Before | After |
| -------- | -------- |
| ![Before](https://projects.blender.org/attachments/11648a55-fcab-4d1a-be98-2c6f41d0ead8) | ![After](https://projects.blender.org/attachments/e186bb8e-e10b-42e2-90ee-ea9e3e6bd2f4) |
Member

Package build started. [Download here](https://builder.blender.org/download/patch/PR110758) when ready.

Author
Member

@Mets @nathanvegdahl In the interest of final polishing, I would like some opinions on these questions regarding configuring the smoothing math, using the attached file containing the material based model of the mapping. The bone is keyframed in the file to allow viewing how the mapping responds to changes in the bone shape by just stepping through the timeline (note that this represents different rest shapes of the bbone in the actual implementation).

  1. Which falloff smoothing method in the purple box should be used?

    This is a tradeoff of removing kinks in the mapping isolines (caused by first derivative discontinuities) at the cost of introducing more easing into the smoothed transition where it touches areas mapped completely to one of the curve ends. The non-smoothed approach has very visible angles located where the lines cross the smoothing band border parallel to the middle green line, but the smoothed transition is relatively uniform. Half Smoothstep smoothes out those kinks, but there is significant non-uniformity in the gradient. Piecewise Smoothstep is a compromise that splices a section of a parabola before the threshold and a straight line after it in a way that ensures a continuous first derivative; it basically smoothes the kinks from a sharp angle into a curved section whose size depends on how big the threshold is (zero threshold is no smoothing), and only adds easing to the outermost band (a sketch of this piecewise form appears after this list).

    Note that this does nothing to remove kinks caused by the original mapping rather than smoothing - they can be seen where the lines cross planes perpendicular to the curve at its ends.

    This smoothing makes the picture nicer (I plan to use it in the docs as well), but I have a feeling it may not be really important for the actual deformation.

  2. Should the smoothing radius depend on the length of the curve, or the uncurved bone (should the Multiply in the green box be unmuted)?

    Basically, should the width of the smoothed transition between the ends of the curve where the lines converge far to the left of the image depend on the length of the curve itself, or the distance between the head and tail of the bone. Depending on the length of the curve like now seems to be more natural, as the spacing between lines where they cross the curve obviously depends on that, but maybe the other way makes sense more? Note especially the difference in behavior in frames 35-40.

  3. What should be the final smoothing radius specified in the blue node?

    This is the actual value, the interpretation of which depends on the previous two points. Increasing it makes the transition smoother, but can cause the actual end points of the bone to be mapped further into the curve when the end curvature is high. Also, higher smoothing reduces the apparent width because of increased easing and requires increasing this value to maintain the same spacing in the middle of the transition.

  4. Is the value in the yellow node the best?

    This basically controls how strongly lines are pushed away from the center, compared to the smoothing that mostly affects the outer boundary.
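
For reference, here is one possible reading of the Piecewise Smoothstep falloff from point 1, applied to a simple ramp with a kink. The exact form used in the node setup may differ; this is only an assumption for illustration:

```python
def ramp(x):
    return max(x, 0.0)  # sharp first-derivative kink at x = 0

def piecewise_smoothstep_ramp(x, h):
    """Parabola below the threshold `h`, straight line above it, spliced so
    the first derivative stays continuous; h = 0 reduces to the plain ramp."""
    if h <= 0.0 or x <= 0.0:
        return ramp(x)
    if x < h:
        return x * x / (2.0 * h)   # slope 0 at x = 0, slope 1 at x = h
    return x - h / 2.0             # continues with slope 1

for x in (-0.2, 0.0, 0.1, 0.25, 0.5, 1.0):
    print(f"x = {x:+.2f}  ramp = {ramp(x):.3f}  "
          f"smoothed = {piecewise_smoothstep_ramp(x, 0.5):.3f}")
```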

Member

Thanks for putting together this awesome blend file! Really helps.

  1. Unless there's a demonstrable quality difference between the methods, I think we should go with the fastest executing one, which I assume is non-smoothed(?). The important thing is that the discontinuities are removed, which is already accomplished by non-smoothed.
  2. I think unmuted is probably a bit better, because it seems to behave more consistently as the curve of the b-bone changes.
  3. Do you mean the Max Width vs Curve Length value node? 0.25 as you currently have it set seems fine to me. Tweaking it higher or lower gives results that might be better in specific cases, but 0.25 seems like a decent all-arounder. If that's not good enough, then adjusting it automatically based on maximum curvature in the b-bone might be worth exploring.
  4. Do you mean the Scale asymptote node? It basically seems like the larger the value, the better, with rapidly diminishing returns. So maybe 8.0? But the value you have set (3.0) also seems fine.

(As a side note: please use unique labels to specify what nodes you're talking about. Also using colors is great, to help people scan for them quickly. But without a label I'm a bit unsure "Did I really find the specific node he was talking about?" And on top of that, I'm partially color blind.)

Member

Also, I notice that in the "After" image, it looks like there's now a slight discontinuity on the outside edges, which isn't in the "Before". Is that just an artifact in the visualization that's not actually in the mapping?

Author
Member

> Thanks for putting together this awesome blend file! Really helps.

I just polished up further a bit what I've been using to design the smoothing math in the first place. It would have been impossible to come up with a good solution without a quick and easy way to test it. 😄

> 1. Unless there's a demonstrable quality difference between the methods, I think we should go with the fastest executing one, which I assume is non-smoothed(?). The important thing is that the discontinuities are removed, which is already accomplished by non-smoothed.

Of course smoothing has performance impact, but it's not much compared to the arctangent used by the slope based factor.

There is a demonstrable quality difference in the pictures, but probably not in the deformations. It's probably better to think about whether a more linear weight interpolation may be preferable, or a heavily eased one.

> 2. I think **un**muted is probably a bit better, because it seems to behave more consistently as the curve of the b-bone changes.

Well, basically the idea is: "vs Bone Length" unmuted is to keep the width of the smoothed band consistent relative to the length of the bone, but the ratio of compression of the bands may change depending on how curved the bone is (since at the point where they cross the curve their width depends on the length of the curve). The muted way keeps the ratio of compression of the bands consistent, but the actual width will depend on the curvature.
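
A minimal sketch of the two alternatives described above, with all names invented here (the actual node setup and implementation may differ):

```c++
/* Sketch only: how the smoothing band width would scale under each option. */
static float band_width_unmuted(float max_width, float straight_length)
{
  /* "vs Bone Length" unmuted: a fixed fraction of the straight head-tail length,
   * so the band stays consistent relative to the bone, while its compression
   * where it crosses the curve varies with how curved the bone is. */
  return max_width * straight_length;
}

static float band_width_muted(float max_width, float arc_length)
{
  /* Muted: a fixed fraction of the rest-curve length, so the compression ratio
   * of the bands stays constant, while the absolute width follows the curvature. */
  return max_width * arc_length;
}
```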

> 3. Do you mean the Max Width vs Curve Length value node? 0.25 as you currently have it set seems fine to me. Tweaking it higher or lower gives results that might be better in specific cases, but 0.25 seems like a decent all-arounder. If that's not good enough, then adjusting it automatically based on maximum curvature in the b-bone might be worth exploring.

Muted "vs Bone Length" already adjusts it based on curvature effectively (by not deliberately adjusting it back to compensate for it), so if you think vs Bone Length is better because of consistency, then keeping this fixed makes more sense.

> 4. Do you mean the Scale asymptote node? It basically seems like the larger the value, the better, with rapidly diminishing returns. So maybe 8.0? But the value you have set (3.0) also seems fine.

That constant is basically used as in arctan(3.0 * arg), so increasing it naturally gives diminishing returns. I think I probably chose 3 as a nice round value about where the diminishing returns start.

> Also, I notice that in the "After" image, it looks like there's now a slight discontinuity on the outside edges, which isn't in the "Before". Is that just an artifact in the visualization that's not actually in the mapping?

Do you mean darker blue and red areas? I just added separate shading for exact 0 or 1 values to show where the boundary lies, since drawing lines doesn't work well for those boundaries (note how ugly they are in this file, where they are enabled for 'debug'). The Before picture is equivalent to disabling Smooth Falloff and setting "Max Width" to 0.2 instead of 0.25 (to roughly keep similar line density in the middle of the transition).

Did you check other segment counts? You can set the bone to any even value between 2 and 14 and it should work, although I found a bug in the "vs Bone Length" bit so download the attachment to this comment to fix segment counts above 8.

Member

> Of course smoothing has performance impact, but it's not much compared to the arctangent used by the slope based factor.

Yeah, that's a good point. Having said that, I was playing around with the nodes like a dumb idiot (no real logic or understanding behind it), and I stumbled upon this, which removes the arctangent entirely:

![no_arctan.png](/attachments/99af412a-2937-4227-a9b6-dd0b78e3bc61)

It just replaces the whole arctangent construction with a clamped multiplier by a large value. And it appears to give essentially identical results to using the arctangent with a large value in Scale asymptote.

> Do you mean darker blue and red areas? I just added separate shading for exact 0 or 1 values to show where the boundary lies

Got it. Awesome.

> The muted way keeps the ratio of compression of the bands consistent, but the actual width will depend on the curvature.

The way in which unmuted feels more consistent to me is that it does a better job of ensuring the curved mapping within a consistent range around the bone, before starting to transition to the straightened mapping. This is especially obvious comparing between the quarter-circle and half-circle examples. And I think that's probably what we want for this, since the whole point is the curved mapping, so doing our best to "guarantee it" (on a best-effort basis) within a certain range of the bone seems like it makes sense.

That's also why I was wondering about adaptively changing that value, because it still doesn't do a great job of that in the ease-up example. Which makes me wonder if there's some other criteria that can give us even more consistent results in that sense.

> Did you check other segment counts?

I did, yes. It seems pretty consistent/reasonable across the different segment counts, so I'm pretty happy with that.

> The Before picture is equivalent to disabling Smooth Falloff and setting "Max Width" to 0.2 instead of 0.25 (to roughly keep similar line density in the middle of the transition).

Ah, interesting. Now that you mention it, setting it to 0.2 does feel a little more consistent to me across the examples. So maybe that's actually the value we should go with. (Unless you want to take some time to explore different criteria, as mentioned earlier.)

Member

Didn't read everything but I could swear I've tested this in the past, but anyways, Nathan poked me and I tested again.

Based on all the images and the cool material set-up, I'm just gonna trust that you guys have figured out the best algorithm.

> the original rig was built in a way that wouldn't really be affected by the new deform mode (the b-bones were disconnected around the lip corners, and were generally fairly straight)

I guess it would be interesting to point out that, yes, Rain's lips are disconnected at the corners at least partially because of the issues that this patch is trying to solve. The way bendy bones behaved on mouth corners was just super wonky. At the time, I didn't even realize why this was happening.

I think the results in the demo file speak for themselves, it's a great improvement.

I have only one not-so-small suggestion, but I hate to drag out this patch even more for you, so do with it what you wish:
What about an "Automatic" setting, which will pick between "Straight" and "Curved" depending on how curved the bendy bone is in the rest pose? Just hardcode whatever threshold you find best. Maybe it's a silly idea though.

That's all I got, great patch!

Author
Member

> Yeah, that's a good point. Having said that, I was playing around with the nodes like a dumb idiot (no real logic or understanding behind it), and I stumbled upon this, which removes the arctangent entirely:

> It just replaces the whole arctangent construction with a clamped multiplier by a large value. And it appears to give essentially identical results to using the arctangent with a large value in Scale asymptote.

Actually, the point of this part is to provide a smooth transition between no smoothing when the lines are spaced widely apart outside the curve, and narrowly spaced lines that require smoothing on the inside. A huge multiplier or removing the arctangent removes this smoothing component and causes kinks as the lines cross the curve, and some wobble; so I don't think it's a good idea.

However, what works is to replace arctan(y/x) with just y/(x+y) - it provides the same function profile at a fraction of the cost. Since this is all heuristic the exact values don't matter much, the important part is the overall shape of the function curve; and these two functions are about the same.
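
A small self-contained sketch (not from the patch) comparing the two profiles; the arctangent is normalized to 0..1 so the shapes are directly comparable, and the two stay close over the whole range:

```c++
#include <cmath>
#include <cstdio>

int main()
{
  const float half_pi = 1.5707964f;
  const float ratios[] = {0.1f, 0.5f, 1.0f, 2.0f, 10.0f};
  for (const float ratio : ratios) {
    const float x = 1.0f, y = ratio;
    const float slope_atan = atan2f(y, x) / half_pi; /* Normalized arctangent profile. */
    const float slope_rational = y / (x + y);        /* Cheap replacement, already 0..1. */
    printf("y/x = %5.1f: atan %.3f, rational %.3f\n", ratio, slope_atan, slope_rational);
  }
  return 0;
}
```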

> The way in which unmuted feels more consistent to me is that it does a better job of ensuring the curved mapping within a consistent range around the bone, before starting to transition to the straightened mapping. This is especially obvious comparing between the quarter-circle and half-circle examples. And I think that's probably what we want for this, since the whole point is the curved mapping, so doing our best to "guarantee it" (on a best-effort basis) within a certain range of the bone seems like it makes sense.

I see, makes sense.

> That's also why I was wondering about adaptively changing that value, because it still doesn't do a great job of that in the ease-up example. Which makes me wonder if there's some other criteria that can give us even more consistent results in that sense.

> Ah, interesting. Now that you mention it, setting it to 0.2 does feel a little more consistent to me across the examples. So maybe that's actually the value we should go with. (Unless you want to take some time to explore different criteria, as mentioned earlier.)

I tried to factor in the distance from the BSP plane to the nearest curve end ("Pinching attempt" box in the attached file), but that didn't produce good results. I basically clamp the half-width of the smoothing band for each plane to not be greater than 0.75 of the distance from the plane to the nearest end. Maybe there is a better way to apply this info, but in this primitive attempt the effect is either too strong, or not significant enough to warrant the complexity; it also diminishes with more or fewer bone segments than the default 8...
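
A minimal sketch of that clamp, with hypothetical names (the real node setup may differ):

```c++
#include <algorithm>

/* Sketch of the "Pinching attempt" described above: never let the half-width of a
 * plane's smoothing band exceed 75% of that plane's distance to the nearest curve end. */
static float clamp_smoothing_half_width(float half_width, float dist_to_nearest_curve_end)
{
  return std::min(half_width, 0.75f * dist_to_nearest_curve_end);
}
```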

So unless you have some better ideas how to factor it in, it's probably better to just select the best overall Max Width value (0.2, 0.25 or whatever is best) and be done with it.

Changes in the file: max width to 0.2 (since you liked it better), smooth falloff to piecewise 0.2 (reduce easing but not completely), replaced arctangent, added muted pinching experiment, unmuted magenta bsp lines in the shader on the right.

Alexander Gavrilov force-pushed pr-bbone-curved-mapping from 479d0b8fd0 to ffd0468044 2023-08-24 17:13:07 +02:00 Compare
Author
Member

@blender-bot package

Updated the implementation:

  • Replaced atan2(y, x) with y/(x + y).
  • Smoothing width is now relative to the straight bone length (head-tail distance) rather than curve length.
  • Width is set to 0.222 (equivalent to the old 0.2).
  • New width-relative falloff curve.
Member

Package build started. [Download here](https://builder.blender.org/download/patch/PR110758) when ready.

Alexander Gavrilov force-pushed pr-bbone-curved-mapping from ffd0468044 to 2533389399 2023-09-04 16:18:56 +02:00 Compare
Author
Member

I rebased the branch on main.

Also, here are some per-vertex per-bone performance calculations, based purely on counting floating-point operations and assuming division is about 3 times slower than addition or multiplication. The values are compared to the cost of transforming a vertex by one ordinary bone without preserve volume:

| Operation | Min Cost | Max Cost | Comment |
| --- | --- | --- | --- |
| Deformation of a vertex by a B-Bone | 2 | 2 | |
| Deformation by a B-Bone with Preserve Volume | 1.7 | 5.32 | Depends on whether scale is used. Additional final averaging cost: 3.91 |
| Straight mapping | 0.33 | 0.33 | |
| Curved mapping, 7-14 segments | 2.56 | 5.8 | Depends on whether the point falls within the area that requires smoothing |
| Curved mapping, 15-30 segments | 3.08 | 7.13 | |
Alexander Gavrilov added the Interest / Animation & Rigging label 2023-09-07 11:54:18 +02:00
Alexander Gavrilov force-pushed pr-bbone-curved-mapping from 2533389399 to 5cfb6ab797 2023-09-12 12:06:39 +02:00 Compare
Nathan Vegdahl requested changes 2023-09-15 17:55:55 +02:00
Nathan Vegdahl left a comment
Member

This isn't a full review yet, but at least a start so we can get things moving again. I'll continue next week.

@ -1526,0 +1555,4 @@
* Normally this should not be a problem, but just in case ensure that:
* - All splits between the head and the primary point away from the head.
* - All splits between the tail and the primary point towards the tail.
* - The primary split plane conforms to both requirements.
Member

It's definitely possible to make these conditions not hold with a b-bone. And this function just returns false in that case, which isn't checked or used in the calling code.

I tried disabling the last_start < first_end if clause (lines 1578-1580), and that results in a weird mapping when the conditions don't hold, but nothing really bad happens. And the straight mapping method also results in a similarly weird mapping in these cases, so that just seems par for the course.

So I think we can just remove these checks entirely, and just accept that if people make weird extreme b-bones, they're going to end up with weird mappings.

Author
Member

The return value is not used, but failing the conditions still disables the curved mapping through side effects.

angavrilov marked this conversation as resolved
@ -1643,0 +1776,4 @@
int stack_idx[MAX_BBONE_SUBDIV];
int start, end, bias, stack_ptr = 0;
/* Do the first split at the specified joint index. */
Member

If this can all just be rolled into the main binary traversal further below, I think that would be cleaner.

Author
Member

Well, bias cannot be rolled into it cleanly, and it ensures that the ends get the maximum number of bsp splits for symmetry and smoothness.

Author
Member

Biasing toward the previous rather than the first split also seems ok, so fixed.

angavrilov marked this conversation as resolved
@ -183,0 +185,4 @@
float point[3];
float tangent[3];
/* Dot product of point and tangent to speed up distance computation. */
float plane_bias;
Member

Instead of tangent and plane_bias, maybe plane_normal and plane_offset? That feels more clear to me given how they're used.

angavrilov marked this conversation as resolved
@ -183,0 +188,4 @@
float plane_bias;
/* Inverse width of the smoothing at this level in head-tail space.
* Hack: this value is actually indexed by bsp depth (0 to bsp_depth-1), not joint index.
Member

Nit: instead of "Hack" use "Optimization".

angavrilov marked this conversation as resolved
@ -192,0 +205,4 @@
/* Index of the primary split plane joint, or -1 if linear mode. */
short bbone_primary_split;
/* Maximum number of levels in the joint BSP tree. */
short bbone_bsp_depth;
Member

Both bbone_primary_split and bbone_bsp_depth appear unnecessary to store precomputed here, based on looking at the rest of the code. bbone_primary_split can trivially be computed on the fly at the start of the binary search (and equivalent code will be run anyway at each depth for further splits). And bbone_bsp_depth only appears to actually be used within the same function that computes it, and is otherwise only used for an assert that I don't think really guards against anything meaningful.

Author
Member

bbone_primary_split can be adjusted by the above condition checks to ensure the conditions hold, even if somehow they don't hold for segments / 2.

angavrilov marked this conversation as resolved
@ -192,0 +207,4 @@
/* Maximum number of levels in the joint BSP tree. */
short bbone_bsp_depth;
/* Inverse of the total length of the segment polyline. */
float bbone_segment_scale;
Member

I think we can just call this bbone_total_length_inverse.

angavrilov marked this conversation as resolved
Alexander Gavrilov force-pushed pr-bbone-curved-mapping from 5cfb6ab797 to d552481667 2023-09-16 09:35:25 +02:00 Compare
Nathan Vegdahl requested changes 2023-09-18 14:56:19 +02:00
Nathan Vegdahl left a comment
Member

Over-all this is looking good to me. Just a handful of additional changes.

@ -1643,0 +1735,4 @@
/* Perform a BSP binary search to narrow things down to one segment.
* Checked BSP planes are stored for the smoothing pass later. */
float boundary_dist[MAX_BBONE_SUBDIV + 1];
int stack_idx[MAX_BBONE_SUBDIV];
Member

Minor readability improvement: stack_idx -> boundary_idx_stack. stack_idx reads to me like it's a single index into a stack (like stack_ptr), rather than a stack of indices into another array. And including boundary in the name makes it clearer what the indices in the stack are indexing into.

Another possibility might be joint_idx_stack, since that's (I think?) equivalent here. Although that makes the relationship to the boundary_dist array a little less clear.

angavrilov marked this conversation as resolved
@ -1643,0 +1776,4 @@
const float d1 = fabsf(boundary_dist[start]);
const float d2 = fabsf(boundary_dist[end]);
head_tail = segment_size * (start * d2 + end * d1) / (d1 + d2);
Member

I'm having a little trouble following the math on this. If I understand correctly, the goal here is to compute a [0, 1] parameter that represents the [head, tail] position across the whole length of the bone. If that's the case, this code is easier for me to follow:

```c++
const float t = d1 / (d1 + d2); /* Position between start and end. */
head_tail = (segment_size * start) + (segment_size * t); /* Position between head and tail. */
```

Maybe that's just me, but it also takes one less multiply.

Author
Member

I think I was thinking in terms of a 'weighted sum' here, hence the somewhat weird math expression.
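
For reference, the weighted-sum form and the suggested two-step form are algebraically identical (using end = start + 1 and writing s for segment_size):

```latex
s \cdot \frac{start \cdot d_2 + end \cdot d_1}{d_1 + d_2}
  = s \cdot \frac{start\,(d_1 + d_2) + d_1}{d_1 + d_2}
  = s \cdot start + s \cdot \frac{d_1}{d_1 + d_2}
```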

angavrilov marked this conversation as resolved
@ -1643,0 +1779,4 @@
head_tail = segment_size * (start * d2 + end * d1) / (d1 + d2);
}
/* Smooth the mapping to suppress discontinuities by using BSP boundaries up the stack.
Member

We should include a reference to this PR in this comment, since the rationale behind this as well as the blend file where you worked out the formulas for this are both there. And that gives people a lot more to work from if this ever needs to be revisited in the future.

angavrilov marked this conversation as resolved
@ -183,0 +184,4 @@
/* Boundary data in pose space. */
float point[3];
float plane_normal[3];
/* Dot product of point and tangent to speed up distance computation. */
Member

Now that the member names have been changed, also change this comment to match: "Dot product of point and plane_normal..."

angavrilov marked this conversation as resolved
@ -1088,0 +1090,4 @@
"STRAIGHT",
0,
"Straight",
"Fast mapping method that ignores the rest pose curvature of the B-Bone"},
Member

Maybe:

"Fast mapping that is good for most situations"

And then for the curved mapping:

"Slower mapping that gives better deformations for B-bones that are sharply curved in rest pose"

Further details can be provided in the documentation.

angavrilov marked this conversation as resolved
Alexander Gavrilov force-pushed pr-bbone-curved-mapping from d552481667 to 1257a9913c 2023-09-18 16:07:58 +02:00 Compare
Author
Member

Here is the latest blend file for completeness. It reflects the change in managing the bsp division bias value.

Nathan Vegdahl approved these changes 2023-09-18 17:41:06 +02:00
Nathan Vegdahl left a comment
Member

This looks good to me, but I'd like to get one more pair of eyes on it before merging.

@dr.sybren could you also give this a review?

Alexander Gavrilov added the Module / Animation & Rigging label 2023-09-19 11:44:50 +02:00
> • First, boundaries between segments are used for a binary space
>   partitioning search to narrow down the mapping to one segment.
> • Then, a position on the segment is chosen via linear
>   interpolation between the planes.

This references "the planes", but so far there is no mention of those. What are those planes?

> • Finally, to remove the sharp discontinuity a smoothing pass is
>   applied to the chosen position by blending to reduce the slope
>   around the planes previously used in the BSP search.

This references "the sharp discontinuity", which also hasn't been mentioned before. Maybe change that to "any sharp discontinuity" (to make it implicit that these in fact can occur) or explicitly describe that these can occur (and how/why).

> In order to make per-vertex processing faster, a new array with the
> necessary data converted to the optimal representation is built.

What is this optimal representation?

> The new mode is implemented as a per-bone option in order to ensure
> backward compatibility, and also because the new mode may not be
> optimal for all cases due to the difference in performance, and
> complications like the smoothed but still present mapping
> discontinuities around the evolute surface.

👍

Author
Member
> > • First, boundaries between segments are used for a binary space
> >   partitioning search to narrow down the mapping to one segment.
> > • Then, a position on the segment is chosen via linear
> >   interpolation between the planes.

> This references "the planes", but so far there is no mention of those. What are those planes?

"boundaries between segments are used for a binary space partitioning" = planes

Since binary space partitioning is normally done using planes.

> > • Finally, to remove the sharp discontinuity a smoothing pass is
> >   applied to the chosen position by blending to reduce the slope
> >   around the planes previously used in the BSP search.

> This references "the sharp discontinuity", which also hasn't been mentioned before. Maybe change that to "any sharp discontinuity" (to make it implicit that these in fact can occur) or explicitly describe that these can occur (and how/why).

They always occur, though; it was a major point in the history of the patch:

![img](https://projects.blender.org/attachments/3a073250-e965-4725-aa0e-3a0d1de86310)

> > In order to make per-vertex processing faster, a new array with the
> > necessary data converted to the optimal representation is built.

> What is this optimal representation?

Basically converting everything from LOCAL to POSE space.

Sybren A. Stüvel requested changes 2023-09-22 12:56:34 +02:00
Sybren A. Stüvel left a comment
Member

The code works, and I think it's really nice to have in 4.0!

Because things already work well, I mostly focused on clarity of the code & comments.

The word "inverse" is used quite a few times in the code. It seems to be used as 1/x. Mathematically, "inverse" is typically used for operations, so this would be "the inverse under multiplication". Probably better to use the word "reciprocal", as that's actually defined as 1/x.

Since this is such a novel approach (at least for Blender), and the images in this PR really help to clarify things, how would you feel about adding a new page on https://wiki.blender.org/wiki/Source/Animation and documenting the implementation there? That might be better than referring to the PR itself.

@ -1507,3 +1508,3 @@
}
static void allocate_bbone_cache(bPoseChannel *pchan, int segments)
static void allocate_bbone_cache(bPoseChannel *pchan, int segments, bool joints)

const

Document what joints actually means. Especially since segments actually means "number of segments" (and num_segments thus would be a better name), it gets confusing when joints doesn't mean "number of joints".

The value of joints is either determined by pchan->bone->bbone_mapping_mode == BBONE_MAPPING_CURVED or pchan->runtime->bbone_use_bsp_mapping. So what's the ground truth here? And why isn't that property used everywhere, given that each call has access to the pchan and thus also its runtime data?

There's another confusion in the naming. The bone property is called ..._segment_joints, but the property that determines whether it's allocated is called ..._bsp_mapping. I'm guessing that the "planes" in the "segment joints" form a space partition, but that's guesswork. It's better if related things are actually named such that that relation is clear. Or at the bare minimum, have either side's documentation mention the other.

Author
Member

> The value of joints is either determined by pchan->bone->bbone_mapping_mode == BBONE_MAPPING_CURVED or pchan->runtime->bbone_use_bsp_mapping.

It is pretty simple: when computing the data, it is using the original mapping mode. However when copying it, it copies what is there. The same copy code also uses runtime.bbone_segments instead of bone->segments.

@ -1524,2 +1525,4 @@
1 + uint(segments), sizeof(DualQuat), "bPoseChannel_Runtime::bbone_dual_quats"));
}
if (joints && !runtime->bbone_segment_joints) {

Shouldn't this be reallocated when the number of bbone segments changes?

dr.sybren marked this conversation as resolved
@ -1526,0 +1533,4 @@
}
}
static bool compute_bbone_joint_planes(bPoseChannel *pchan)

Document what the returned bool means.

This actually computes the bbone_segment_joints. Sometimes they're called "segment joints", sometimes "joint planes", and sometimes "bsp_mapping".

dr.sybren marked this conversation as resolved
@ -1526,0 +1555,4 @@
joints[i].plane_offset = dot_v3v3(joints[i].point, joints[i].plane_normal);
}
/* Precompute the (inverted) length of the curve. */

Remove the parentheses, as the word "inverted" is not something that's optional here. It'll always do that.

dr.sybren marked this conversation as resolved
@ -1526,0 +1568,4 @@
* The actual space partitioning includes two extra virtual segments for the ends. */
const int bsp_depth = int(ceilf(log2f(bone->segments + 2)));
/* Maximum half-width of the smoothing band at the primary split, in segments. */

"the" primary split -- I haven't seen anything about about "primary splits" or "secondary splits" or similar terms, so talking about the primary split gets confusing. What does it mean? What is it referring to?

"the" primary split -- I haven't seen anything about about "primary splits" or "secondary splits" or similar terms, so talking about *the* primary split gets confusing. What does it mean? What is it referring to?
dr.sybren marked this conversation as resolved
@ -1526,0 +1569,4 @@
const int bsp_depth = int(ceilf(log2f(bone->segments + 2)));
/* Maximum half-width of the smoothing band at the primary split, in segments. */
const float bone_length = len_v3v3(joints[0].point, joints[bone->segments].point);

If the points are on a curve, this is not the arclength of the bone, but the distance between its start & end point. Naming this something like straight_length and the length_sum to arc_length would make things clearer.

dr.sybren marked this conversation as resolved
@ -1526,0 +1570,4 @@
/* Maximum half-width of the smoothing band at the primary split, in segments. */
const float bone_length = len_v3v3(joints[0].point, joints[bone->segments].point);
const float max_depth_scale = bone->segments * (bone_length / length_sum) * 0.222f;

What's the magic 0.222f?

@ -1526,0 +1572,4 @@
const float bone_length = len_v3v3(joints[0].point, joints[bone->segments].point);
const float max_depth_scale = bone->segments * (bone_length / length_sum) * 0.222f;
/* Per-layer scaling factor, aiming to reduce width to 1 at innermost layer. */

What are layers here? And why is a width of 1 desirable?

@ -1526,0 +1573,4 @@
const float max_depth_scale = bone->segments * (bone_length / length_sum) * 0.222f;
/* Per-layer scaling factor, aiming to reduce width to 1 at innermost layer. */
const float scale_factor = powf(max_ff(max_depth_scale, 1.0f), 1.0f / (bsp_depth - 1));

What is the typical range of this scale factor? Is it in [0, 1] (i.e. the loop below diminishes the scale for every subsequent joint), ≥1 (i.e. the loop increases the scale for every subsequent joint), <0 (toggles back & forth between positive & negative)?

dr.sybren marked this conversation as resolved
@ -1547,2 +1608,4 @@
BKE_pchan_bbone_spline_setup(pchan, true, true, b_bone_rest);
/* Compute joint planes. */
runtime->bbone_use_bsp_mapping = use_joints && compute_bbone_joint_planes(pchan);

If the BSP mapping isn't going to get used, what's the use in allocating the "segment joints"? Shouldn't they either be allocated and used, or not allocated at all? Or if that's tricky, deallocated when it turns out they can't be used?


I still don't think runtime->bbone_use_bsp_mapping (now runtime->bbone_use_curved_mapping) is necessary (this is related to my other question about the allocation). If the mapping isn't going to be used, then the data shouldn't be kept allocated and the pointer can be nullptr. This means that you can simply use runtime->bbone_segment_boundaries != nullptr instead of having a separate boolean that needs to be kept in sync.

@ -1599,0 +1665,4 @@
runtime->bbone_use_bsp_mapping = use_bsp_mapping;
if (use_bsp_mapping) {

If !use_bsp_mapping, shouldn't this reset some of the properties that are set below?

@ -1629,2 +1703,2 @@
int *r_index,
float *r_blend_next)
/** Implementation of the Straight B-Bone segment mapping. */
static void find_bbone_segment_index_simple(const bPoseChannel *pchan,

Instead of calling this function "simple", just call it "straight", because that's what it implements.

dr.sybren marked this conversation as resolved
@ -1640,6 +1717,138 @@ void BKE_pchan_bbone_deform_segment_index(const bPoseChannel *pchan,
pchan, y / pchan->bone->length, r_index, r_blend_next);
}
/** Computes signed distance to the plane perpendicular to the joint tangent. */

What is the joint tangent? Isn't this just the segment itself, expressed as a vector in some space or other?


This question hasn't been answered yet.

@ -1641,2 +1718,4 @@
}
/** Computes signed distance to the plane perpendicular to the joint tangent. */
inline float bbone_segment_bsp_side(const bPoseChannel_BBoneSegmentJoint &joint, const float *co)

Name the function after what it does, so something like bbone_segment_bsp_signed_distance.

dr.sybren marked this conversation as resolved
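
A minimal sketch of what this signed-distance helper presumably computes, based on the plane_normal/plane_offset fields shown earlier in the review; the helper name is invented and this is not the verbatim source:

```c++
/* Signed distance from `co` to the boundary plane through `joint.point`:
 * `plane_offset` caches dot(point, plane_normal), so evaluating the plane
 * equation needs only one dot product and a subtraction. */
inline float bbone_boundary_signed_distance(const bPoseChannel_BBoneSegmentJoint &joint,
                                            const float co[3])
{
  return dot_v3v3(co, joint.plane_normal) - joint.plane_offset;
}
```
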
@ -1643,0 +1724,4 @@
}
/** Implementation of the Curved B-Bone segment mapping. */
static void find_bbone_segment_index_bsp(const bPoseChannel *pchan,

Since this implements the "curved" mode, just name it "curved" instead of "bsp".

dr.sybren marked this conversation as resolved
@ -1643,0 +1735,4 @@
/* Perform a BSP binary search to narrow things down to one segment.
* Checked BSP planes are stored for the smoothing pass later. */
float boundary_dist[MAX_BBONE_SUBDIV + 1];
int boundary_idx_stack[MAX_BBONE_SUBDIV];

What is this stack?

dr.sybren marked this conversation as resolved
@ -1643,0 +1794,4 @@
/* Boundary in the head-tail space. */
const float boundary_pos = boundary_idx * segment_size;
/* Distance of the original 3d point from the boundary plane,

Is "the original 3d point" the same as co?

dr.sybren marked this conversation as resolved
@ -1643,0 +1824,4 @@
/* Smooth the distance based coefficient around zero. */
const float dist_coeff_smooth = dist_coeff * dist_coeff * (7.0f + 1.0f) /
(7.0f * dist_coeff + 1.0f);

Where do these constants come from (including the 3.0f above)? It's fine if they were just experimentally found. Just add some documentation that explains.


The refactor to using named constants is good, but it doesn't answer the question of where these values came from.

Author
Member

It comes from what looks reasonably good in the pictures.

@ -180,6 +180,19 @@ typedef struct bPoseChannelDrawData {
struct DualQuat;
struct Mat4;
typedef struct bPoseChannel_BBoneSegmentJoint {

Given that this is already a "bone" (often also represented as joints), add some docstring that explains what a "segment joint" is. The code also refers to a plane; the relation of "segment joints" and "planes" should be documented as well.

dr.sybren marked this conversation as resolved
@ -183,0 +188,4 @@
float plane_offset;
/* Inverse width of the smoothing at this level in head-tail space.
* Optimization: this value is actually indexed by bsp depth (0 to bsp_depth-1), not joint

👍 for documenting this optimisation.
dr.sybren marked this conversation as resolved
@ -198,1 +218,4 @@
struct DualQuat *bbone_dual_quats;
/* Segment boundaries for curved mode. */
struct bPoseChannel_BBoneSegmentJoint *bbone_segment_joints;

If those are "segment boundaries", why is the struct name "joints", and not "boundaries"?

👍 for documenting that this is specific to curved mode.

If those are "segment boundaries", why is the struct name "joints", and not "boundaries"? :+1: for documenting that this is specific to curved mode.
dr.sybren marked this conversation as resolved
Alexander Gavrilov force-pushed pr-bbone-curved-mapping from 1257a9913c to f5def8ae1c 2023-09-22 14:02:49 +02:00 Compare
Alexander Gavrilov force-pushed pr-bbone-curved-mapping from f5def8ae1c to 5649b6e21d 2023-09-22 14:17:30 +02:00 Compare
Sybren A. Stüvel requested changes 2023-09-22 15:00:19 +02:00
@ -82,2 +82,4 @@
float oldlength;
/** Mapping of vertices to segments. */
char bbone_mapping_mode;

Since this is purely runtime data anyway, just use the `eBone_BBoneMappingMode` type instead of `char`.
Author
Member

What is the point of using an enum for a field that is temporary storage of a char DNA field? It means nothing.

In other words: this declaration has the same type as the corresponding permanent DNA field.

DNA not supporting enum types doesn't mean other code cannot use them. Use enum types whenever the data is of the enum type, unless there is a specific reason why this is not possible. This is not such a reason.
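For reference, the change being requested amounts to something like the following sketch (the surrounding struct is omitted and the conversion line is illustrative):

```cpp
/* Sketch of the requested change: the runtime-only declaration uses the enum
 * type, while the persistent DNA field stays a `char`. */
/** Mapping of vertices to segments. */
eBone_BBoneMappingMode bbone_mapping_mode;

/* Copying from the char DNA value would then take an explicit conversion,
 * e.g. (names illustrative):
 *   runtime.bbone_mapping_mode = eBone_BBoneMappingMode(bone->bbone_mapping_mode); */
```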
@ -1602,0 +1684,4 @@
if (use_curved_mapping) {
runtime->bbone_arc_length_reciprocal = runtime_from->bbone_arc_length_reciprocal;
memcpy(runtime->bbone_segment_boundaries,

I think a better approach would be to just copy what's in the source, instead of reevaluating whether the curved mapping should be used. In other words, call `allocate_bbone_cache(pchan, segments, false)`, then:

```cpp
runtime->bbone_use_curved_mapping = runtime_from->use_curved_mapping;
runtime->bbone_arc_length_reciprocal = runtime_from->bbone_arc_length_reciprocal;
runtime->bbone_segment_boundaries = MEM_dupallocN(runtime_from->bbone_segment_boundaries);
```

`MEM_dupallocN` is NULL-safe, so it'll just do the right thing here.
Author
Member

This whole function is clearly structured so as to avoid freeing and immediately re-allocating buffers. Otherwise it all can be changed to do free + dupalloc.

It's not *clearly* structured that way. But I get your point.
dr.sybren marked this conversation as resolved
@ -1646,0 +1753,4 @@
float boundary_dist[MAX_BBONE_SUBDIV + 1];
/* Stack of BSP plane indices that were checked in the binary search. */
int boundary_idx_stack[MAX_BBONE_SUBDIV];

Now that I understand better what this is, just use a `blender::Vector<int>` to store a variable-sized array. If you want to keep things as they are, at least move the declaration of `stack_ptr` here as well so that the relevant variables are declared together. Possibly rename `stack_ptr` (because it's not a pointer) to something like `boundary_idx_stack_top` (or `stack_top_index`, or something else non-pointery).
dr.sybren marked this conversation as resolved
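For clarity, the "keep things as they are" variant from the comment above would look roughly like this sketch (the rename is one of the suggested options, not the actual patch):

```cpp
/* Sketch: the related variables declared together, with a non-pointer name
 * for the top-of-stack index. */
float boundary_dist[MAX_BBONE_SUBDIV + 1];
int boundary_idx_stack[MAX_BBONE_SUBDIV];
int boundary_idx_stack_top = -1; /* Index of the last pushed entry, -1 when empty. */
```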
Sybren A. Stüvel approved these changes 2023-09-26 13:58:13 +02:00
Sybren A. Stüvel left a comment
Member

👍

Just one note that can be handled when landing the PR.
@ -1602,0 +1682,4 @@
memcpy(runtime->bbone_segment_boundaries,
runtime_from->bbone_segment_boundaries,
sizeof(bPoseChannel_BBoneSegmentBoundary) * (1 + segments));
}

Add an assert that `runtime->bbone_segment_boundaries == nullptr` in an `else` clause, so that it's locally clear that `!use_curved_mapping` implies that this pointer is nil.
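A sketch of what that could look like, keeping the `memcpy` quoted above (the surrounding function is abbreviated):

```cpp
if (use_curved_mapping) {
  memcpy(runtime->bbone_segment_boundaries,
         runtime_from->bbone_segment_boundaries,
         sizeof(bPoseChannel_BBoneSegmentBoundary) * (1 + segments));
}
else {
  /* Make it locally clear that this pointer is only allocated for the curved mapping. */
  BLI_assert(runtime->bbone_segment_boundaries == nullptr);
}
```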
Alexander Gavrilov force-pushed pr-bbone-curved-mapping from cac6505037 to a5224b40af 2023-09-26 14:17:02 +02:00 Compare
Alexander Gavrilov merged commit 0055ae01ab into main 2023-09-26 14:17:29 +02:00
Alexander Gavrilov deleted branch pr-bbone-curved-mapping 2023-09-26 14:17:31 +02:00