Closed-form expressions for the Distance Gradients / Hessians #87
-
Hi, thank you for your time. Are closed-form (matrix calculus) expressions for the distance gradients / Hessians written down anywhere? I derived the gradient expressions below and would like to check them against the original equations:

```cpp
VECTOR<T, 12>
Point_Plane_Distance_Gradient(const TV& p, const TV& a, const TV& b, const TV& c)
{
    TV pa = p - a;
    TV ba = b - a;
    TV ca = c - a;
    TV cb = c - b;
    TV normal = ba.Cross(ca);
    T one_over_normal_sq = 1 / normal.Magnitude_Squared();
    T one_over_normal_sq_sq = one_over_normal_sq * one_over_normal_sq;
    T p_distance = pa.Dot(normal);
    T p_distance_sq = p_distance * p_distance;
    return 2 * p_distance * one_over_normal_sq * VECTOR<T, 12>(
        normal, // g_p
        VECTOR<T, 9>(
            p_distance * one_over_normal_sq * cb.Cross(normal) - (normal + pa.Cross(cb)), // g_a
            VECTOR<T, 6>(
                ca.Cross(normal) + pa.Cross(ca) * p_distance * one_over_normal_sq, // g_b
                pa.Cross(ba) - ba.Cross(normal) * p_distance * one_over_normal_sq  // g_c
            )
        )
    );
}

VECTOR<T, 9>
Point_Line_Distance_Gradient(const TV& p, const TV& a, const TV& b) const
{
    TV pa = p - a;
    TV pb = p - b;
    TV ab = a - b;
    T over_ab_sq = 1.0 / ab.Magnitude_Squared();
    TV paXpb = pa.Cross(pb);
    T over_ab_sq_sq = over_ab_sq * over_ab_sq;
    T paXpb_sq = paXpb.Magnitude_Squared();
    TV f = 2 * ab * paXpb_sq * over_ab_sq_sq;
    return 2 * over_ab_sq * VECTOR<T, 9>(
        ab.Cross(paXpb), // g_p
        VECTOR<T, 6>(
            f + pb.Cross(paXpb), // g_a
            f + pa.Cross(paXpb)  // g_b
        )
    );
}
```
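For clarity, these are meant to be the gradients of the squared, unnormalized distances (assuming I have the toolkit's conventions right):

$$
d_{\mathrm{PT}}(p, a, b, c) = \frac{\big((p - a) \cdot n\big)^2}{\lVert n \rVert^2}, \qquad n = (b - a) \times (c - a),
$$

$$
d_{\mathrm{PE}}(p, a, b) = \frac{\lVert (p - a) \times (p - b) \rVert^2}{\lVert a - b \rVert^2}.
$$

For example, $\nabla_p d_{\mathrm{PT}} = \frac{2\,(p - a) \cdot n}{\lVert n \rVert^2}\, n$, which is the `g_p` block in the first function.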
-
All of the distance derivatives in the toolkit (except for point-point) are derived using symbolic differentiation, so unfortunately, I do not have the matrix calculus versions of these expressions on hand. I can point you to Kim and Eberle's [2022] "Dynamic Deformables" course (see "Appendix I: Computing the Derivatives of a Triangle (and Edge) Normal"). Shi and Kim [2023] also have some useful derivations in their supplemental.

If your goal is to verify your derivation, you can compare it with the existing implementation or use finite differences.

The derivatives were originally derived using the Symbolic Math Toolbox in MATLAB, but I prefer to use SymPy in Python. There is a utility file …
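In case a concrete example is useful, here is a minimal SymPy sketch of that workflow for the point-line (point-edge) case: build the squared distance symbolically, differentiate it, and sanity-check against central finite differences. This is not the toolkit's utility file; the symbol names and the (p, a, b) DOF ordering are my own assumptions.

```python
import sympy as sp

# Point p and edge endpoints a, b as symbolic 3-vectors.
p = sp.Matrix(sp.symbols("p0 p1 p2", real=True))
a = sp.Matrix(sp.symbols("a0 a1 a2", real=True))
b = sp.Matrix(sp.symbols("b0 b1 b2", real=True))

# Squared point-line distance: ||(p - a) x (p - b)||^2 / ||a - b||^2
cross = (p - a).cross(p - b)
dist_sq = cross.dot(cross) / (a - b).dot(a - b)

# Symbolic gradient with respect to the 9 DOFs, ordered (p, a, b).
x = sp.Matrix([*p, *a, *b])
grad = sp.Matrix([sp.diff(dist_sq, xi) for xi in x])

# Central finite-difference check at an arbitrary non-degenerate configuration.
vals = dict(zip(x, [0.1, 0.2, 0.3, -0.4, 0.0, 0.5, 0.7, -0.2, 0.1]))
h = 1e-6
for i, xi in enumerate(x):
    plus, minus = dict(vals), dict(vals)
    plus[xi] += h
    minus[xi] -= h
    fd = (dist_sq.subs(plus) - dist_sq.subs(minus)) / (2 * h)
    assert abs(float(grad[i].subs(vals)) - float(fd)) < 1e-6
```

The point-plane case works the same way with $((p - a) \cdot n)^2 / \lVert n \rVert^2$ and 12 DOFs, and a Hessian check follows from `sp.hessian(dist_sq, list(x))` plus the same finite-difference loop.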
-
If you do end up deriving these by hand, I would be curious to see what the performance differences are. I imagine a matrix calculus implementation might be able to leverage fast Fortran subroutines (e.g., BLAS), but the symbolically generated code uses common subexpression detection and collection to reduce the amount of duplicate work.
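As a rough illustration of that common-subexpression step (this is not the toolkit's generator; the setup and names are my own), SymPy's `cse` can factor out repeated terms before emitting C code:

```python
import sympy as sp

# Same symbolic point-line setup as in the sketch above.
p = sp.Matrix(sp.symbols("p0 p1 p2", real=True))
a = sp.Matrix(sp.symbols("a0 a1 a2", real=True))
b = sp.Matrix(sp.symbols("b0 b1 b2", real=True))
cross = (p - a).cross(p - b)
dist_sq = cross.dot(cross) / (a - b).dot(a - b)
grad = [sp.diff(dist_sq, xi) for xi in (*p, *a, *b)]

# cse() pulls out shared subexpressions so each is computed once in the
# generated code instead of being re-evaluated for every gradient entry.
subexprs, reduced = sp.cse(grad)
for name, sub in subexprs:
    print(f"const T {sp.ccode(name)} = {sp.ccode(sub)};")
for i, expr in enumerate(reduced):
    print(f"grad[{i}] = {sp.ccode(expr)};")
```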
-
Never mind, I don't need the original equations - just wanted to verify that they are correct.