I'm writing a new function for my polytope viewer that calculates angles between surtopes, and I'm starting to realize I have no idea how to do this correctly in all cases.

For facets ((n-1)-dimensional surtopes) it's easy: compute the angle between the normals of the respective hyperplanes.
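A minimal sketch of that facet case, assuming hypothetical NumPy arrays for the two outward unit normals (note that the interior dihedral angle is the supplement of the angle between outward-pointing normals):

```python
import numpy as np

def facet_angle(n1, n2):
    """Dihedral angle between two facets, from their outward normals.

    The angle between the outward normals is the supplement of the
    interior dihedral angle, hence the subtraction from pi.
    """
    n1 = np.asarray(n1, dtype=float)
    n2 = np.asarray(n2, dtype=float)
    cos_t = np.dot(n1, n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    # clip guards against round-off pushing cos_t slightly out of [-1, 1]
    return np.pi - np.arccos(np.clip(cos_t, -1.0, 1.0))

# Two faces of a cube have perpendicular outward normals and a
# 90-degree dihedral angle:
print(np.degrees(facet_angle([1, 0, 0], [0, 1, 0])))  # 90.0
```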

But for subdimensional surtopes it's less clear. Edges are also easy: just calculate the angle between vectors parallel to the respective edges. But what's the correct derivation for polygons in dimension > 3? There's no unique containing hyperplane to use for the computation, and it's not obvious how to derive the containing 2-planes of the two polygons, or how to compute an angle between them (since they wouldn't have unique normals in 4D or higher). If the two polygons lie in the same hyperplane, you could calculate the angle within that hyperplane as a subspace, reducing it to the 3D case. But what if they don't lie in the same hyperplane? (Haha, my ignorance is showing.)
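For completeness, the easy edge case above as a sketch (hypothetical helper; each edge is assumed to be given by its endpoint coordinates, with the two edges meeting at a shared vertex):

```python
import numpy as np

def edge_angle(shared, a, b):
    """Angle at the shared vertex between edges shared->a and shared->b.

    Works in any ambient dimension, since only the two direction
    vectors matter.
    """
    u = np.asarray(a, dtype=float) - np.asarray(shared, dtype=float)
    v = np.asarray(b, dtype=float) - np.asarray(shared, dtype=float)
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))

# Example in 4D: two orthogonal edges meet at 90 degrees.
print(np.degrees(edge_angle([0, 0, 0, 0], [1, 0, 0, 0], [0, 1, 0, 0])))  # 90.0
```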

More generally: how do I correctly calculate the angle between an i-dimensional surtope I and a j-dimensional surtope J, given their respective vertex coordinates embedded in n-dimensional space? (Assume both are incident on some shared k-dimensional surtope K.)

Currently, my implementation glosses over the problem: it takes the centroids of the two surtopes and the centroid of their shared vertices, and computes the angle between the vectors from the shared centroid to each surtope's centroid. This works for regular polytopes, but it's wrong almost everywhere else.
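For reference, that centroid heuristic can be sketched like this (hypothetical function and parameter names; `verts_I`, `verts_J`, `verts_shared` are vertex-coordinate arrays, one row per vertex):

```python
import numpy as np

def centroid_angle(verts_I, verts_J, verts_shared):
    """Approximate angle via centroids: the current heuristic.

    Vectors run from the centroid of the shared vertices to the
    centroids of the two surtopes; the angle between them is returned.
    Correct for regular polytopes, but generally wrong otherwise.
    """
    cI = np.mean(np.asarray(verts_I, dtype=float), axis=0)
    cJ = np.mean(np.asarray(verts_J, dtype=float), axis=0)
    cK = np.mean(np.asarray(verts_shared, dtype=float), axis=0)
    u, v = cI - cK, cJ - cK
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))

# Two faces of the unit cube meeting at the edge x=0, y=0: the heuristic
# recovers the 90-degree dihedral, as expected for a regular polytope.
face_y0 = [[0, 0, 0], [1, 0, 0], [1, 0, 1], [0, 0, 1]]
face_x0 = [[0, 0, 0], [0, 1, 0], [0, 1, 1], [0, 0, 1]]
shared_edge = [[0, 0, 0], [0, 0, 1]]
print(np.degrees(centroid_angle(face_y0, face_x0, shared_edge)))  # 90.0
```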