I'm now making my way back from Iceland, where I attended the 15th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2016). Rather than providing a run-down of the talks and results I found interesting (you can choose your own from the conference proceedings, this year provided as open access through LIPIcs), I thought it might be more fun to mention a few open problems that people mentioned along the way and that caught my attention.

The first invited talk, by Julia Chuzhoy, concerned the relation between grid minor size and treewidth. There exist graphs of treewidth \( t \) whose biggest grid minor has side length \( O(\sqrt{t/\log t}) \), and the results she spoke about give polynomial lower bounds on the side length, but with much smaller exponents (currently around \( 1/19 \)). As her abstract states, "an important open question is establishing tight bounds" for this problem.
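To put the gap in symbols, write \( g(t) \) for the largest side length of a grid minor that every graph of treewidth \( t \) is guaranteed to contain (the notation here is mine). Ignoring polylogarithmic factors in the lower bound, and using the exponent mentioned in the talk, the known bounds sandwich it as

\[ \tilde\Omega\left(t^{1/19}\right) \;\le\; g(t) \;\le\; O\!\left(\sqrt{\frac{t}{\log t}}\right). \]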

From Zachary Friggstad's talk on integrality gaps for directed Steiner trees, we have the following situation: for directed Steiner tree problems with \( t \) terminals, it is possible to approximate the solution to within a \( t^{\varepsilon} \) factor, for any \( \varepsilon > 0 \), in polynomial time, or to within a polylogarithmic factor in quasi-polynomial time. Is a polynomial-time polylogarithmic-factor approximation possible?

Daniël Paulusma spoke on finding square roots of \( k \)-apex graphs; that is, one is given a graph \( G \) that can be made planar by the deletion of \( k \) vertices, and the goal is to find another graph \( R \) on the same vertex set such that two vertices are adjacent in \( G \) if and only if their distance in \( R \) is at most two. Many other graph classes are known to make this problem polynomial or hard, but he listed as open finding square roots of split graphs or cographs, and finding graph square roots that are planar.
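Checking the square-root relation for a candidate \( R \) is easy (the hard part, of course, is finding \( R \)). Here's a small Python sketch of the check, using my own dict-of-neighbor-sets convention for graphs:

```python
from itertools import combinations

def is_square_root(G, R):
    """Test whether R is a square root of G: both graphs share a vertex set,
    and two vertices are adjacent in G iff their distance in R is at most 2.
    Graphs are dicts mapping each vertex to its set of neighbors."""
    if set(G) != set(R):
        return False
    for u, v in combinations(G, 2):
        # distance <= 2 in R: adjacent, or having a common neighbor
        close_in_R = v in R[u] or bool(R[u] & R[v])
        if close_in_R != (v in G[u]):
            return False
    return True

# The path a-b-c is a square root of the triangle on {a, b, c}.
R = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b'}}
G = {'a': {'b', 'c'}, 'b': {'a', 'c'}, 'c': {'a', 'b'}}
print(is_square_root(G, R))  # True
```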

Jens Schmidt's talk was motivated by some older work of Mikkel Thorup on recognizing map graphs, the intersection graphs of interior-disjoint simply-connected regions in the plane. These differ from planar graphs because of the Four Corners phenomenon, where Arizona, Colorado, New Mexico, and Utah all meet at a point (and are all adjacent in the map graph). Thorup gave a polynomial-time recognition algorithm, but its exponent is huge and it does not provide a combinatorial certificate of being a map graph (a planar bipartite graph whose half-square is the given graph). Schmidt gave a more satisfactory solution to a special case but would like a better algorithm for the general problem.
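To spell out that certificate, the half-square of a bipartite graph restricts its square to one side of the bipartition; here's a tiny Python sketch (my own formulation), with the Four Corners as the example:

```python
def half_square(B, side):
    """Half-square of a bipartite graph B (dict: vertex -> set of neighbors),
    restricted to the vertices in `side`: two of them are adjacent iff they
    share a neighbor on the other side (a common corner point of the map)."""
    return {u: {w for w in side if w != u and B[u] & B[w]} for u in side}

# Four regions meeting at a single corner point p become a clique.
B = {'AZ': {'p'}, 'CO': {'p'}, 'NM': {'p'}, 'UT': {'p'},
     'p': {'AZ', 'CO', 'NM', 'UT'}}
print(half_square(B, {'AZ', 'CO', 'NM', 'UT'}))
```

Testing planarity of the bipartite witness is easy; the difficulty that Thorup's algorithm faces is finding such a witness in the first place.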

My own talk concerned cuckoo filters, a more space-efficient replacement for Bloom filters. A data structure of Pagh et al. provides a theoretical but not-yet-practical solution to the same problem (for all false positive rates), but cuckoo filters work well only when the false positive rate is small enough. What if we want a practically efficient filter that allows a high false positive rate, say 20%? Can we get close to optimal space in this case?
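For readers who haven't run into them, here's a toy Python sketch of the basic cuckoo filter operations. The parameters and hash choices are mine, and a real implementation would pack the fingerprints into a flat bit array rather than Python lists:

```python
import random

class CuckooFilter:
    """Minimal cuckoo filter sketch: fixed-size buckets of short fingerprints,
    with each item hashed to two candidate buckets via the XOR trick."""

    def __init__(self, num_buckets=1024, bucket_size=4, fp_bits=8, max_kicks=500):
        self.n = num_buckets  # must be a power of two for the XOR trick
        self.b = bucket_size
        self.fp_mask = (1 << fp_bits) - 1
        self.max_kicks = max_kicks
        self.buckets = [[] for _ in range(self.n)]

    def _fingerprint(self, item):
        return hash(('fp', item)) & self.fp_mask

    def _index(self, item):
        return hash(('ix', item)) & (self.n - 1)

    def _alt_index(self, i, fp):
        # Applying this twice returns the original index, so either bucket
        # can be recovered from the other bucket plus the fingerprint alone.
        return i ^ (hash(('ix', fp)) & (self.n - 1))

    def insert(self, item):
        fp = self._fingerprint(item)
        i1 = self._index(item)
        i2 = self._alt_index(i1, fp)
        for i in (i1, i2):
            if len(self.buckets[i]) < self.b:
                self.buckets[i].append(fp)
                return True
        i = random.choice((i1, i2))
        for _ in range(self.max_kicks):
            # Evict a random resident fingerprint and relocate it.
            j = random.randrange(self.b)
            fp, self.buckets[i][j] = self.buckets[i][j], fp
            i = self._alt_index(i, fp)
            if len(self.buckets[i]) < self.b:
                self.buckets[i].append(fp)
                return True
        return False  # table too full

    def __contains__(self, item):
        fp = self._fingerprint(item)
        i1 = self._index(item)
        return fp in self.buckets[i1] or fp in self.buckets[self._alt_index(i1, fp)]
```

With 4-slot buckets and 8-bit fingerprints the false positive rate works out to roughly 3%, comfortably in the regime where the structure shines; the open question above is about the sloppier end of the tradeoff.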

Konrad Dabrowski made progress on classifying which sets of forbidden induced subgraphs lead to graph families with bounded clique-width by showing that (diamond, \( P_1+2P_2 \))-free graphs and four other similarly defined classes do have bounded clique-width. But as he reports, there are many remaining open cases.

Dana Richards spoke on a problem of comparison-sorting a set of items when only certain pairs of items are allowed to be compared; his paper has more questions on this problem than answers. Any set of comparisons that you make (a subset of the allowed ones) gives you a DAG with an edge for each comparison, oriented from smaller to larger, and you can infer the results of all comparisons in the transitive closure of this DAG. The goal is to either test or infer the results of all allowed comparisons. You can do this with a sublinear number of comparisons when either the set of allowed comparisons is small (by just doing them all) or its complement is small (by Richards' algorithm), but what he would like to prove is a subquadratic deterministic bound that holds in all cases.
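The inference step is just reachability. Here's a short Python sketch (my own formulation) computing which outcomes come for free from the comparisons already made:

```python
def inferred_results(vertices, comparisons):
    """Given comparison results as directed edges (smaller, larger), return
    the set of ordered pairs whose outcome can be inferred: the transitive
    closure of the comparison DAG, via a DFS from each vertex."""
    adj = {v: set() for v in vertices}
    for a, b in comparisons:
        adj[a].add(b)
    closure = set()
    for s in vertices:
        stack, seen = [s], set()
        while stack:
            for w in adj[stack.pop()]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        closure.update((s, t) for t in seen)
    return closure

# After comparing 1 < 2 and 2 < 3, the result 1 < 3 comes for free.
print((1, 3) in inferred_results([1, 2, 3], [(1, 2), (2, 3)]))  # True
```

The task is complete once every allowed pair appears, in one direction or the other, in this closure.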

Timothy Chan revisited one of the first kinetic data structures, for closest pairs (and/or nearest neighbors). The static version of the problem can be solved in \( O(n\log n) \) time in any constant dimension, but previous kinetic structures had a factor of \( \log^d \) (for dimension \( d \)) in their time bounds. Chan eliminated this, but at the expense of making the time bound depend on the geometry of the input (the range of values that the closest pair might span) instead of just on the number of points. He asked whether it is possible to eliminate both the dependence on dimension and the dependence on geometry.
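As a point of reference for the static bound, here's the textbook divide-and-conquer closest-pair computation in the plane, a sketch assuming distinct points (and of course not Chan's kinetic structure):

```python
from math import dist, inf

def closest_pair(points):
    """Classic O(n log n) divide and conquer for the closest pair of a set
    of distinct points in the plane; returns the smallest pairwise distance."""
    py = sorted(points, key=lambda p: p[1])  # sorted by y, reused throughout

    def rec(px, py):
        n = len(px)
        if n <= 3:  # brute force tiny instances
            return min((dist(p, q) for i, p in enumerate(px) for q in px[i+1:]),
                       default=inf)
        mid = n // 2
        x_mid = px[mid][0]
        left = set(px[:mid])
        ly = [p for p in py if p in left]
        ry = [p for p in py if p not in left]
        d = min(rec(px[:mid], ly), rec(px[mid:], ry))
        # Points within d of the dividing line: each needs comparison with
        # only a constant number of successors in y-order.
        strip = [p for p in py if abs(p[0] - x_mid) < d]
        for i, p in enumerate(strip):
            for q in strip[i+1:i+8]:
                d = min(d, dist(p, q))
        return d

    return rec(sorted(points), py)

print(closest_pair([(0, 0), (3, 4), (1, 1), (5, 5)]))  # 1.4142...
```

The kinetic versions must maintain this distance as the points move continuously, which is where the extra factors crept in.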

One of the disadvantages of the move from paper to electronic proceedings (at least in my own experience) is less browsing and more going directly to the papers you already knew you were looking for. And one of the continuing advantages of going to a symposium like this (especially a small and welcoming one like SWAT) is that by listening to talks on subjects you don't already know about, you can broaden your interests and find out about topics you might not have discovered on your own. But browsing is still possible, even if we are no longer forced to leaf through paper proceedings. So I'd encourage you to download the full proceedings (from the link at the top left corner of the proceedings page) rather than just the individual papers, and browse them to find many more problems that the papers there have listed as open.