I added a category in my online pub list for hyperbolic geometry. Not a lot there yet: three research papers (including the new squarepants one), commentary on another paper, and a survey talk (with slides and streaming video). But it's convenient to have it all in one place, and it's an area I'd like to do more in — I don't think it's been very thoroughly explored by the computational geometry community.

To keep this from being content-free, here's a research problem I don't know the answer to: does there exist a polynomial-time \((1+\varepsilon)\)-approximation to the TSP for hyperbolic point sets, for any \(\varepsilon>0\)? Or to other related approximation problems such as the minimum Steiner tree? An obvious approach would be to try something similar to the approximation for a different problem in my squarepants paper: group the points into bounded-diameter subsets, apply known quadtree-based approximation methods to each group, and connect the groups somehow. It's the "connect the groups somehow" part that I don't see how to do...
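The outline above, with the unresolved step left explicit, might be sketched like this. All names here are my own illustration: `bounded_diameter_groups` is a naive greedy grouping (each group has diameter at most twice the given radius), and the quadtree-based per-group step and the open "connect the groups" step are only marked by comments.

```python
def bounded_diameter_groups(points, radius, dist):
    """Greedily partition points into groups of bounded diameter.

    A naive sketch: each point joins the first group whose seed point
    is within `radius`, so every group has diameter at most 2*radius.
    """
    groups = []
    for p in points:
        for g in groups:
            if dist(p, g[0]) <= radius:
                g.append(p)
                break
        else:
            groups.append([p])
    return groups


def hyperbolic_tsp_sketch(points, radius, dist):
    """Outline of the approach described in the post (incomplete)."""
    groups = bounded_diameter_groups(points, radius, dist)
    # 1. Apply a known quadtree-based (1+eps)-approximation within each
    #    bounded-diameter group (omitted here).
    # 2. Connect the group tours somehow -- the step the post asks about.
    raise NotImplementedError("connecting the groups is the open part")
```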



Without knowing too much about this area: is it in principle possible that some (real) problems can be efficiently solved in the case of Euclidean geometry and cannot be efficiently solved in the case of hyperbolic geometry?


In principle, yes. I could imagine the existence of problems with efficient exact solutions, or approximation schemes, in the Euclidean case, but for which there's some kind of hardness of approximation result in the hyperbolic case. I don't know of any examples, though.


The geometry of Gaussians is a hyperbolic space. Specifically, if you parametrize Gaussians by a correctly chosen function of mean and variance, you get a space whose curvature matches that of hyperbolic space.

Another reason to study hyperbolic spaces!


Dang. I should have read your webpage first, specifically the JASA note, before pointing this out...


Yes, that's what's happening in that paper. I think more generally than Gaussians it makes a lot of sense to view a pair (Euclidean point, scale factor) as a point in a halfspace model of hyperbolic geometry. Then if one extends Euclidean similarities in the obvious way to the scale parameter, one gets hyperbolic isometries. In the case of Gaussians, the variance is the scale factor.
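A concrete version of that correspondence can be sketched as follows. This is my own illustration, not from the paper: it treats \(N(\mu,\sigma^2)\) as the point \((\mu/\sqrt 2,\sigma)\) in the upper half-plane and applies the standard half-plane distance formula; the \(\sqrt 2\) scaling is one common normalization of the Fisher information metric, and the overall scale of the distance depends on convention.

```python
import math

def halfplane_distance(p, q):
    """Hyperbolic distance in the upper half-plane model.

    p and q are (x, y) pairs with y > 0; standard arcosh formula.
    """
    (x1, y1), (x2, y2) = p, q
    return math.acosh(1 + ((x2 - x1) ** 2 + (y2 - y1) ** 2) / (2 * y1 * y2))

def gaussian_distance(mu1, sigma1, mu2, sigma2):
    """Distance between N(mu1, sigma1^2) and N(mu2, sigma2^2), viewing
    (mu / sqrt(2), sigma) as a half-plane point.  The sqrt(2) scaling is
    an assumed normalization; conventions vary by an overall factor."""
    return halfplane_distance((mu1 / math.sqrt(2), sigma1),
                              (mu2 / math.sqrt(2), sigma2))
```

Note that scaling both means and standard deviations by the same factor (a Euclidean similarity acting on the scale parameter too) leaves the distance unchanged, matching the remark above about similarities becoming hyperbolic isometries.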

Do you have in mind some other references on this or were you thinking of Mizera's work when you posted it?


I had some other references. I've been thinking about the geometry of distributions for a while now, and this came up in passing.


For what it's worth, a few other places where hyperbolic geometry and computation have been mentioned, though I understand them only dimly:

This talk discusses CAT(0) spaces arising in motion planning. Here CAT(0) seems to mean that, when you look at a triangle abc in the CAT(0) space and a triangle ABC in Euclidean space with the same edge lengths, the distance between two edge midpoints in abc is less than the distance between endpoints in ABC (and similarly for other corresponding points on edges). I think maybe the points on the edges of a simple polygon form a CAT(0) space, under shortest-path distance inside the polygon: triangles in that space are pseudo-triangles, as in "pseudo-triangulations".

The papers here that refer to geometry, also talk a lot about hyperbolic, or nearly-hyperbolic, spaces that arise in networks. There seems to be the claim that, in such spaces, there is a constant k such that, if all the k-hop subpaths of a path are shortest, then the whole path is shortest.
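That claimed property is easy to state as a computational check. The sketch below is my own illustration, not from the cited papers: it uses a stdlib BFS on an unweighted adjacency-dict graph, and `all_k_hop_subpaths_shortest` simply verifies that every subpath of at most k edges is a shortest path.

```python
from collections import deque

def bfs_dist(graph, s, t):
    """Unweighted shortest-path length from s to t via BFS."""
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            return dist[u]
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return float("inf")

def all_k_hop_subpaths_shortest(graph, path, k):
    """True if every subpath of `path` with at most k edges is shortest."""
    return all(bfs_dist(graph, path[i], path[j]) == j - i
               for i in range(len(path))
               for j in range(i + 1, min(i + k, len(path) - 1) + 1))
```

On a 6-cycle, for instance, the path 0-1-2-3-4 has all of its 2-hop subpaths shortest yet is itself not shortest (0 to 4 takes only 2 hops the other way), so a cycle is a space where the claimed implication fails; the claim is that hyperbolic-like networks admit a constant k ruling this out.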

Finally, as a novelty at least, I've shown that the distance measure inside a disk, D'(x,y) := D(x,y)/[D(x,y) + min_{a in C} (D(x,a)+D(y,a))], where D(x,y) is Euclidean distance and C is the bounding circle, gives a metric space that "looks" hyperbolic, but is not the same as the Poincaré model.

-Ken Clarkson


Sorry, "distance between two edge midpoints in abc is less than the distance between endpoints in ABC" should be "distance between two edge midpoints in abc is less than the distance between the corresponding midpoints in ABC".



Interesting — thanks for the references.

D'(x,y) := D(x,y)/[D(x,y) + min_{a in C} (D(x,a)+D(y,a))], where D(x,y) is Euclidean distance and C is the bounding circle, gives a metric space that "looks" hyperbolic

It's not Riemannian, is it? But it looks very closely related to the Riemannian metric in which \(\langle\cdot,\cdot\rangle_x = \langle\cdot,\cdot\rangle/(\min_{a\in C} 2D(x,a))\). I'm not so familiar with the formula for hyperbolic distance as a Riemannian metric on the unit disk, but if I've worked it through correctly it's similar but with a lower-order correction term: \(\langle\cdot,\cdot\rangle/(\min_{a\in C}(2D(x,a)-D(x,a)^2))\).
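For a numerical comparison of the two distances (my own spot-check, not from the thread), one can implement D' on the unit disk, approximating the boundary minimum over a discretized circle, alongside the standard Poincaré disk distance. D' is bounded above by 1, while the Poincaré distance blows up near the boundary, which is one quick way to see they differ.

```python
import math

def euclid(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def d_prime(p, q, samples=2000):
    """Clarkson's D' on the unit disk; the min over the bounding circle C
    is approximated by sampling `samples` boundary points."""
    if p == q:
        return 0.0
    circle = [(math.cos(2 * math.pi * i / samples),
               math.sin(2 * math.pi * i / samples)) for i in range(samples)]
    m = min(euclid(p, a) + euclid(q, a) for a in circle)
    d = euclid(p, q)
    return d / (d + m)

def poincare(p, q):
    """Standard Poincare disk distance, for comparison."""
    num = 2 * euclid(p, q) ** 2
    den = (1 - p[0] ** 2 - p[1] ** 2) * (1 - q[0] ** 2 - q[1] ** 2)
    return math.acosh(1 + num / den)
```

For example, d_prime((0,0), (0.5,0)) is 0.25 (every boundary point is at distance 1 from the center, and the nearest boundary point to (0.5, 0) is (1, 0)), while the Poincaré distance between the same pair is ln 3.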


Did you see the FOCS paper list? There's a new paper by Krauthgamer and Lee on algorithms for negatively curved spaces. Sounds very interesting.

geomblog: moreover

The paper I mentioned above answers your TSP question in the affirmative, using what sounds like a similar technique to the one you outlined.

None: Re: moreover

Actually, we only resort to "quad-tree" based techniques very locally, when the space gets flat enough that the quad-trees have bounded degree. The more difficult part is for "large" distances, where the number of portals in a bounded-diameter decomposition grows exponentially (the number of eps*R-spaced portals on a diameter-R piece is ~ (1/eps)^R instead of (1/eps)^2 as in Euclidean space). This makes dynamic programming too inefficient. Instead, we use the exponential divergence of geodesics in negatively curved spaces to prove a stronger structure theorem for the "large-scale" structure of near-optimal tours.
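One way to see this exponential blow-up directly (my own illustration, not from the paper) is through circle circumference: a Euclidean circle of radius R has circumference 2*pi*R, so portals at spacing eps*R number about 2*pi/eps independently of R, while a hyperbolic circle has circumference 2*pi*sinh(R), so the portal count grows roughly like e^R.

```python
import math

def portals_euclidean(R, eps):
    """Portals at spacing eps*R around a Euclidean circle of radius R:
    2*pi*R / (eps*R) = 2*pi/eps, independent of R."""
    return math.ceil(2 * math.pi * R / (eps * R))

def portals_hyperbolic(R, eps):
    """Same spacing around a hyperbolic circle, whose circumference
    is 2*pi*sinh(R); this grows roughly like e^R."""
    return math.ceil(2 * math.pi * math.sinh(R) / (eps * R))

for R in (1, 5, 10, 20):
    print(R, portals_euclidean(R, 0.1), portals_hyperbolic(R, 0.1))
```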


11011110: Re: moreover

Which does in fact sound like "a similar technique to what I outlined", in very broad outline. My hierarchical clustering paper similarly uses quadtree techniques very locally and resorts to something else for large distances.