Report from SoCG
As I mentioned in my previous post, I just finished attending the Symposium on Computational Geometry in Portland. The conference proceedings are open access through LIPIcs.
The conference started on Tuesday (after some earlier social activities) with the best paper talk by Arnaud de Mesmay, “Almost tight lower bounds for hard cutting problems in embedded graphs” (with V. Cohen-Addad, É. Colin de Verdière, and D. Marx), proving that, assuming the exponential time hypothesis, the exponent of the running time of any algorithm that finds the shortest set of cuts slicing a genus-\(g\) surface into a disk must grow linearly with \(g\).
Several of the contributed talks from throughout the conference particularly caught my attention:
- Shay Moran’s paper with A. Yehudayoff, “On weak \(\varepsilon\)-nets and the Radon number”, concerned abstract convexity spaces where the convex sets are any set family closed under intersection. A space has Radon number \(r\) if every \(r\) points can be partitioned into two subsets whose convex hulls (smallest containing convex sets) intersect, and a point in the intersection is called a Radon point. Having bounded Radon number turns out to be equivalent to having weak \(\varepsilon\)-nets, subsets of points that (for a given measure on the space) intersect every large convex set.
- Mitchell Jones’ “Journey to the center of the point set” (with Sariel Har-Peled) improved an old paper of mine that used Radon points to find points of high Tukey depth in high-dimensional point sets, in time polynomial in the dimension; they improve both the polynomial and the depth of the resulting points. (A small sketch of how to compute a Euclidean Radon point appears after this list.)
- Sariel Har-Peled spoke on his work with Timothy Chan, “Smallest \(k\)-enclosing rectangle revisited”. As well as shaving logs from the time bounds for finding rectangles that enclose a given number of points and minimize their area or perimeter, they found a general reduction from problems on \(n\) points to \(O(n/k)\) problems on \(O(k)\) points, allowing one to turn factors of \(n\) in the time bounds into factors of \(k\). Previously such reductions were known only for \(k\)-point subset problems where the optimal solution lies among the \(O(k)\) nearest neighbors of some input point, which is true for rectangle perimeter but not for rectangle area.
- Timothy Chan’s “Computing Shapley values in the plane” concerns an interesting combination of game theory and computational geometry. The Shapley value is a method for assigning credit when multiple people collaborate to produce some value (given as input a function from subsets of people to the value of each subset). It can be defined by adding contributors one by one in a random order and setting each contributor’s Shapley value to the expected increase in value when that contributor is added. It sums to the total value, and it is the unique function with this property that obeys several other natural and desirable axioms for credit assignment. For arbitrary functions from subsets of \(n\) contributors to subset values, it takes time \(O(2^n)\) to compute, but Timothy described polynomial time algorithms for cases when the value function has some geometric meaning, such as when it measures the area of the convex hull of a subset of \(n\) points. There’s probably a lot more to be done in the same direction. (A brute-force Shapley computation for the hull-area value function is sketched after this list.)
- Patrick Schnider’s “Ham-Sandwich cuts and center transversals in subspaces” has some partial results towards a conjectured generalization of the ham sandwich theorem: given \(dk\) probability distributions in \(d\)-dimensional Euclidean space, it should be possible to find \(k\) hyperplanes whose checkerboard partition of space simultaneously bisects all of the distributions.
- Jie Xue spoke on “Near-optimal algorithms for shortest paths in weighted unit-disk graphs” (with Haitao Wang). Here “weighted” means that overlapping unit disks are connected by an edge whose length is the Euclidean distance between their centers. Previous methods obtained near-linear algorithms by using bichromatic closest pair data structures to simulate Dijkstra’s algorithm. Instead, Wang and Xue use a related relaxation algorithm that partitions the plane into a grid and at each step relaxes all edges between certain pairs of grid squares, using data structures for additively weighted nearest neighbors. By avoiding the overhead of closest pairs they shave several logs from the runtime. (A naive quadratic-time baseline, just to pin down the problem, is sketched after this list.)
- Arnaud de Mesmay spoke again towards the end of the conference on “The unbearable hardness of unknotting”, with Yo’av Rieck, Eric Sedgwick, and Martin Tancer. They used a reduction from 3-satisfiability, with Hopf links for variables and Borromean links for clauses (connected to each other by ribbons), to prove that it’s NP-complete to find an unlinked subset of a link with a maximum number of components. By a second level of replacement that doubles the strands of each link, they also showed that it’s NP-complete to find the minimum number of crossing changes or of Reidemeister moves to unlink a link or to unknot a knot.
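Since Radon points played a role in both of the first two talks above, here is a minimal sketch of how one can be computed in the Euclidean setting (my own illustrative code, not taken from either paper, and assuming points in general position): find a nontrivial affine dependence among \(d+2\) points in \(d\) dimensions and split the points by the signs of its coefficients.

```python
import numpy as np

def radon_point(points):
    """Given d+2 points in R^d (rows of `points`), return a Radon point: a point
    in the intersection of the convex hulls of the two sides of a Radon partition.
    Minimal sketch; assumes the points are in general position."""
    points = np.asarray(points, dtype=float)
    n, d = points.shape
    assert n == d + 2, "need exactly d+2 points in R^d"
    # Nontrivial affine dependence: sum a_i p_i = 0 with sum a_i = 0, found as a
    # null vector of the (d+1) x (d+2) matrix of coordinates plus a row of ones.
    A = np.vstack([points.T, np.ones(n)])
    a = np.linalg.svd(A)[2][-1]          # last right singular vector spans the null space
    pos = a > 0
    # The positively-weighted points and the rest have intersecting hulls; the
    # weighted average of the positive side is a point of that intersection.
    return (a[pos] @ points[pos]) / a[pos].sum()

# Example in the plane: for the three vertices of a triangle together with an
# interior point, the Radon point is the interior point itself.
print(radon_point([[0, 0], [4, 0], [0, 4], [1, 1]]))   # -> [1. 1.]
```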
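To make the Shapley value concrete, here is a brute-force computation using the subset formula, with the area of the convex hull as the value function, one of the geometric settings Timothy considered. This is the obvious exponential-time computation, not his polynomial algorithm, and the helper functions are my own.

```python
from itertools import combinations
from math import factorial

def hull_area(pts):
    """Area of the convex hull of a list of planar points (0 if fewer than 3)."""
    pts = sorted(set(pts))
    if len(pts) < 3:
        return 0.0
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    def chain(points):                  # one half of Andrew's monotone-chain hull
        h = []
        for p in points:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    hull = chain(pts)[:-1] + chain(pts[::-1])[:-1]
    return 0.5 * abs(sum(a[0] * b[1] - b[0] * a[1]
                         for a, b in zip(hull, hull[1:] + hull[:1])))

def shapley_values(points, value=hull_area):
    """Exact Shapley values via the subset formula: for each contributor i, a
    weighted average of its marginal contribution value(S + i) - value(S) over
    all subsets S not containing i.  Exponential time in len(points)."""
    n = len(points)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            weight = factorial(size) * factorial(n - size - 1) / factorial(n)
            for S in combinations(others, size):
                subset = [points[j] for j in S]
                phi[i] += weight * (value(subset + [points[i]]) - value(subset))
    return phi

pts = [(0, 0), (4, 0), (0, 4), (1, 1)]
phi = shapley_values(pts)
print(phi, sum(phi), hull_area(pts))    # the values sum to the total hull area (8.0)
```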
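And for the weighted unit-disk shortest path problem, here is the naive quadratic-time baseline that the faster algorithms improve on, included only to pin down the problem: build the graph explicitly and run Dijkstra. This is not Wang and Xue’s method, and I use the common normalization in which two centers are adjacent when their distance is at most 1.

```python
import heapq
from math import dist, inf

def unit_disk_shortest_paths(centers, source=0, radius=1.0):
    """Naive O(n^2) baseline: build the weighted unit-disk graph explicitly
    (vertices are disk centers, edges join centers within distance `radius`,
    edge weight = Euclidean distance) and run Dijkstra from `source`."""
    n = len(centers)
    adjacency = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d = dist(centers[i], centers[j])
            if d <= radius:
                adjacency[i].append((j, d))
                adjacency[j].append((i, d))
    distances = [inf] * n
    distances[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > distances[u]:
            continue                    # stale heap entry
        for v, w in adjacency[u]:
            if d + w < distances[v]:
                distances[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return distances

print(unit_disk_shortest_paths([(0, 0), (0.8, 0.3), (1.5, 0.3), (2.3, 0)]))
```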
On Tuesday afternoon I went to the workshop on open problems. After the obligatory open problem session, we debated whether we should collect problems on The Open Problems Project, The Open Problem Garden, or some new system that, since it doesn’t exist, can be more perfect than anything that does. Then I switched to the Young Researchers Forum (unfortunately missing the open problem implementation challenge); at the YRF, my student Daniel Frishberg spoke about our work on using nearest neighbor chains to speed up greedy algorithms.
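For readers who haven’t seen it, here is a sketch of the classical nearest-neighbor chain algorithm for agglomerative clustering, the technique whose range of applications our paper broadens. The complete-linkage distance and the other details are illustrative choices of mine, not the framework from the paper; the point is that following nearest neighbors until a mutually-nearest pair is found lets one perform the same merges as the naive greedy algorithm while doing far fewer nearest-neighbor searches.

```python
from math import dist

def complete_linkage(a, b, points):
    """Distance between clusters a, b (sets of point indices): the maximum
    pairwise distance.  Complete linkage is 'reducible', which is what makes
    the chain argument work."""
    return max(dist(points[i], points[j]) for i in a for j in b)

def nn_chain(points, linkage=complete_linkage):
    """Agglomerative clustering by nearest-neighbor chains: walk from cluster to
    nearest cluster until two clusters are mutual nearest neighbors, merge them,
    and back up one step on the chain.  Assumes distinct inter-cluster distances;
    recomputes linkages naively rather than caching them, for brevity."""
    clusters = [frozenset([i]) for i in range(len(points))]
    merges = []
    chain = []
    while len(clusters) > 1:
        if not chain:
            chain.append(clusters[0])
        top = chain[-1]
        nearest = min((c for c in clusters if c != top),
                      key=lambda c: linkage(top, c, points))
        if len(chain) >= 2 and nearest == chain[-2]:
            # Mutual nearest neighbors: this merge is one the greedy algorithm
            # would eventually make, so it is safe to do it now.
            del chain[-2:]
            clusters.remove(top)
            clusters.remove(nearest)
            clusters.append(top | nearest)
            merges.append((set(top), set(nearest)))
        else:
            chain.append(nearest)
    return merges

print(nn_chain([(0, 0), (0, 1), (5, 0), (5, 1.2), (2, 10)]))
```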
The invited talks on Wednesday and Friday were by Sanjoy Dasgupta and Bruce Donald, respectively. Dasgupta spoke on using concepts from geometric data structures to interpret the neural structure of the olfactory system in fruit flies (and hopefully, eventually, vice versa: to use our understanding of neural structures to develop new data structures). For instance, the first three layers of neurons in this system appear to implement an “expand-and-sparsify” data structure that represents low-dimensional normalized vectors by randomly mapping them to much higher-dimensional vectors and then recording, as a set, a small number of the highest-value coordinates. This appears to be closely analogous to known methods for locality-sensitive hashing (a rough sketch of the mapping appears below). Donald spoke on using geometric methods to represent and infer the shapes of proteins, and to design proteins with desired shapes and functions.
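Here is a rough numpy sketch of the expand-and-sparsify mapping as I understood it from the talk: a sparse random expansion to a much higher dimension followed by keeping only the indices of the top activations, so that similar inputs tend to share entries of the resulting sets, much as in locality-sensitive hashing. The dimensions, sparsity, and names are arbitrary illustrative choices, not the parameters of the actual fly circuit.

```python
import numpy as np

rng = np.random.default_rng(0)

def expand_and_sparsify(x, projection, k=32):
    """Map a low-dimensional vector to a sparse high-dimensional 'tag':
    normalize, expand by a random projection, and keep only the indices
    of the k largest activations."""
    x = np.asarray(x, dtype=float)
    x = x / np.linalg.norm(x)                   # normalize the input
    activations = projection @ x                # expand to a higher dimension
    return set(np.argsort(activations)[-k:])    # sparsify: top-k coordinates

# Sparse random 0/1 expansion: 2000 output coordinates, each reading a few
# of the 50 input coordinates.
low_dim, high_dim = 50, 2000
projection = (rng.random((high_dim, low_dim)) < 6 / low_dim).astype(float)

x = rng.standard_normal(low_dim)
y = x + 0.1 * rng.standard_normal(low_dim)      # a nearby input
z = rng.standard_normal(low_dim)                # an unrelated input

tag_x, tag_y, tag_z = (expand_and_sparsify(v, projection) for v in (x, y, z))
print(len(tag_x & tag_y), len(tag_x & tag_z))   # nearby inputs share many more entries
```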
Wednesday evening was the conference banquet, at the famed Crystal Ballroom, conveniently located near Powell’s Books. Matthew Dickerson showed up (after many years’ absence from SoCG) and brought some musical family members for a live concert.
On Thursday afternoon I stopped by the workshop on algebraic methods, where I had the pleasure of seeing Josh Zahl give his talk on a whiteboard instead of using prepared slides. It was on methods for proving bounds on the number of incidences among geometric objects by combining three ingredients: naive bounds based on forbidden complete bipartite subgraphs of the incidence graph, cuttings of space into smaller regions within which one applies these naive bounds, and cuttings of the objects themselves into pieces in order to make the forbidden subgraphs smaller.
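To see how the first two of those ingredients combine, in the most classical case of points and lines in the plane (a textbook example, not Josh’s new results): the incidence graph of \(m\) points and \(n\) lines contains no \(K_{2,2}\), because two points lie on at most one common line, so the Kővári–Sós–Turán bound for such graphs gives \(O(mn^{1/2}+n)\) incidences. Cutting the plane into \(O(r^2)\) cells, each crossed by \(O(n/r)\) lines, and applying the naive bound separately to the \(m_i\) points in each cell gives
\[ \sum_i O\left(m_i(n/r)^{1/2} + n/r\right) = O\left(m(n/r)^{1/2} + nr\right), \]
and choosing \(r \approx m^{2/3}/n^{1/3}\) to balance the two terms recovers the Szemerédi–Trotter bound \(O(m^{2/3}n^{2/3}+m+n)\).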
The conference business meeting was also Thursday, and ran very smoothly. Next year SoCG will be in Zurich; the year after that, in Buffalo. We now have an officially incorporated society, The Society for Computational Geometry, so that we can maintain buffer funds for the conference from one year to the next; it will be supported by an increase of $30-$35 in non-student registration fees. And the attendees voted overwhelmingly in favor both of the SafeTOC anti-harassment guidelines and of ensuring (and better advertising) the availability of childcare at future conferences.
The conference was dedicated to the memory of Ricky Pollack, but it also included a celebration on Friday of a living computational geometer: John Hershberger, who led the effort to bring SoCG to Oregon, and turned 60 just before the conference began. Happy birthday, John!