I just returned from Montpellier, France, where I was invited to speak at WG 2009, the 35th International Workshop on Graph-Theoretic Concepts in Computer Science.

This was my first trip to France, and Montpellier was a very charming place to visit. It is in the south, and though not quite on the Mediterranean itself it has a very Mediterranean feel. The conference organizers put together an interesting and educational excursion for us in which we learned some of the local history by visiting three local landmarks (the opera house, an old pharmacy, and the top of the gateway arch) and drinking a different wine in each.

We learned that Montpellier is a new city for France, being only a little over 1000 years old, but it is home to one of the oldest medical schools in Europe. It is on the old pilgrim trail from Rome to Compostela, markers for which run through the town, and although the language is no longer spoken in the area, there are several inscriptions for tourists written in Occitan. The nearby mountains were a refuge for the Cathars when they were persecuted by the Catholics, and there are many open squares in the city due to the monasteries and other buildings that were torn down in the French Revolution.

My hotel was on one side of the old quarter, now mostly a shopping district, and the conference was in an old movie theater on the other side of the quarter, so I had a very pleasant walk to the conference every day (only getting lost twice). My bad high-school French was enough to get by with the people who didn't or wouldn't speak English (rather more of them than I've encountered in other parts of Europe).

The conference itself is about graph algorithms, both for problems on arbitrary graphs and (a larger fraction of the papers) for important special classes of graphs. There were too many interesting talks to describe them all in detail, so let me just mention a few.

Daniel Král started the conference with the other invited talk, on graphs of bounded expansion. This is a notion of sparsity in graphs based on shallow minors: graph minors (subgraphs of contractions) in which two vertices may only be contracted together if they started out within some constant distance of each other in the original graph. There is a trichotomy theorem for the density of shallow minors: among the densest shallow minors of a given family of graphs, the number of edges must grow roughly as an integer power of the number of vertices, and the exponent can only be 0 (graphs with a bounded number of edges), 1 (sparse graphs, with comparable numbers of edges and vertices), or 2 (dense graphs). Additionally, in the case where the exponent is 2, all graphs can be formed as shallow minors. Graphs of bounded expansion form the case of exponent 1, in which the shallow minors are all sparse; they generalize both minor-closed graph families (in which all minors are sparse) and bounded-degree graphs (since a shallow minor of a bounded-degree graph also has bounded degree). As Král described, graph properties describable as first-order logic formulae may be tested efficiently on these graphs, but there seems to be plenty of opportunity for more algorithmic work on them. The two main papers on this subject are in the European Journal of Combinatorics, by Nešetřil and Ossona de Mendez: one on structure theory and a second on algorithms. There also seems to be some connection with Král's work with Dvořák and Thomas at SODA 2009 on efficient 3-coloring of triangle-free surface-embedded graphs (via a data structure for finding paths of bounded length), but I'm not sure of the details of this part.
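To make the definition concrete, here is a small sketch (my own illustration, not from the talk) of forming a depth-$$r$$ shallow minor: each minor vertex comes from a disjoint "branch set" of radius at most $$r$$, and minor edges come from original edges between branch sets. The representation and helper names are my own.

```python
def shallow_minor(adj, branch_sets, r):
    """Contract each branch set (a connected set of radius <= r) to a
    single vertex; the result is a depth-r shallow minor of adj.
    adj: dict mapping vertex -> set of neighbors.
    branch_sets: list of disjoint vertex sets, one per minor vertex.
    """
    # check the radius condition: some center reaches the whole set in r steps
    for s in branch_sets:
        assert any(_ball(adj, c, r) >= s for c in s), "radius > r"
    owner = {v: i for i, s in enumerate(branch_sets) for v in s}
    minor = {i: set() for i in range(len(branch_sets))}
    for u in owner:
        for w in adj[u]:
            if w in owner and owner[w] != owner[u]:
                minor[owner[u]].add(owner[w])
                minor[owner[w]].add(owner[u])
    return minor

def _ball(adj, c, r):
    """Vertices within distance r of c, by breadth-first search."""
    ball, frontier = {c}, {c}
    for _ in range(r):
        frontier = {w for v in frontier for w in adj[v]} - ball
        ball |= frontier
    return ball
```

For example, contracting the three pairs {0,1}, {2,3}, {4,5} of a six-cycle (each pair has radius 1) yields a triangle, a depth-1 shallow minor.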

My favorite contributed talk of the first day was by Pim van 't Hof, a Ph.D. student of Broersma at Durham, presenting his work with Marcin Kamiński and Daniël Paulusma on finding induced paths of given parity in claw-free graphs. Since line graphs are claw-free, this generalizes the problem of finding arbitrary paths of given parity in arbitrary graphs. Non-induced paths of given parity are easy enough to find: the shortest path has one parity, and a path of the other parity exists if and only if at least one of the biconnected components on a path between the given pair of terminals is non-bipartite (one direction is easy, and the other can be proved by induction using open ear decomposition). The induced claw-free generalization, however, is not as easy. I don't know that it's a very important problem, but it was a well-presented method that used some deep results of Chudnovsky and Seymour on the three-in-a-tree problem to reduce the problem to one in which all vertices belong to induced paths between the given terminals, and then split the problem into a case analysis using the known structure of perfect claw-free graphs (in the case that the graph is perfect) or the existence of an odd hole (in the case that it is not).
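The parity bookkeeping behind the easy non-induced case can be sketched with a standard trick: breadth-first search on states (vertex, parity) finds the shortest walk of each parity between two terminals. (This is my own minimal sketch; it finds walks, not necessarily simple paths, so it is only the starting point for the characterization above via non-bipartite biconnected components.)

```python
from collections import deque

def shortest_parity_walks(adj, s, t):
    """Lengths of the shortest s-t walks of even and odd length, via BFS
    on states (vertex, parity mod 2).  Returns a dict {0: len, 1: len};
    a parity is absent when no walk of that parity exists (for example,
    in a bipartite component only one parity is reachable).
    adj: dict mapping vertex -> set of neighbors.
    """
    dist = {(s, 0): 0}
    queue = deque([(s, 0)])
    while queue:
        v, p = queue.popleft()
        for w in adj[v]:
            state = (w, 1 - p)       # each edge flips the parity
            if state not in dist:
                dist[state] = dist[(v, p)] + 1
                queue.append(state)
    return {p: dist[(t, p)] for p in (0, 1) if (t, p) in dist}
```

On a triangle both parities are achievable between any two vertices; on an even cycle (bipartite), only one parity ever is.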

The talks on the morning of the second day were all about parameterized complexity and exponential-time algorithmics. The most interesting to me (and the most closely related to my own recent research) was the last of the morning's talks, by Fedor Fomin, Daniel Lokshtanov, and Saket Saurabh, on algorithms for finding an embedding of a metric space into a line minimizing the total distortion. The metric spaces they consider are the ones generated by distances in an unweighted graph (hence the relevance to WG); the difficult part of the problem is determining the ordering of the vertices on the line, so the problem could naively be solved in factorial time, but they reduce this time bound to $$5^{n+o(n)}$$. The method involves partitioning the line into blocks whose length is the eventual distortion bound, so that each vertex has only a constant number of choices for the block it belongs to, and then performing some more complicated placement algorithms within the blocks. It's one of only a very few algorithms that I know of in which the optimal distortion of an embedding can be calculated exactly (another being my upcoming WADS paper); it would be of interest to extend the result from unweighted graphs to arbitrary metrics.
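To see why the ordering is the hard part, here is the factorial-time baseline made concrete (my own toy sketch, not the paper's algorithm, assuming the usual noncontractive definition of distortion): for a fixed left-to-right order, placing each vertex as far left as noncontraction allows minimizes all stretched distances at once, so trying every order gives the optimum.

```python
from itertools import permutations

def line_distortion(d, n):
    """Minimum distortion of a noncontractive embedding of an n-point
    metric into the line (d[u][v] = distance, n >= 2), by brute force
    over all n! vertex orders.  For a fixed order, the leftmost
    ("pushing") placement is pointwise minimal, hence optimal.
    """
    best = float("inf")
    for order in permutations(range(n)):
        pos = {}
        for v in order:
            # leftmost position keeping |pos[v] - pos[u]| >= d[u][v]
            pos[v] = max((pos[u] + d[u][v] for u in pos), default=0)
        stretch = max(abs(pos[u] - pos[v]) / d[u][v]
                      for u in range(n) for v in range(u + 1, n))
        best = min(best, stretch)
    return best
```

A path embeds with distortion 1, while a 4-cycle needs distortion 3 (more generally, embedding an n-cycle into the line costs distortion n−1).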

My own talk was in the afternoon. I surveyed problems (such as the partition of an orthogonal polygon into a minimum number of rectangles) whose solutions involve constructing a graph from the input (in this case, the bipartite intersection graph of line segments bounded by two reflex vertices of the polygon) and using the special properties of this graph to find an efficient algorithmic solution (maximum independent sets in bipartite graphs via König's theorem). See here for talk slides.
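The graph-theoretic kernel of that last step can be sketched (a minimal version of my own, not the talk's code): by König's theorem, in a bipartite graph a minimum vertex cover has the same size as a maximum matching, so a maximum independent set, its complement, can be extracted from a matching by alternating reachability.

```python
def max_independent_set(left, right, edges):
    """Maximum independent set in a bipartite graph via König's theorem.
    Find a maximum matching by augmenting paths; let Z be the vertices
    reachable from unmatched left vertices by alternating paths; then
    (left - Z) | (right & Z) is a minimum vertex cover, and its
    complement is a maximum independent set.
    edges: dict mapping each left vertex to an iterable of right vertices.
    """
    match = {}                      # right vertex -> matched left vertex
    def augment(u, seen):
        for v in edges.get(u, ()):
            if v not in seen:
                seen.add(v)
                if v not in match or augment(match[v], seen):
                    match[v] = u
                    return True
        return False
    for u in left:
        augment(u, set())
    # alternating reachability from unmatched left vertices
    Z_left = set(left) - set(match.values())
    Z_right, frontier = set(), set(Z_left)
    while frontier:
        new_right = {v for u in frontier for v in edges.get(u, ())} - Z_right
        Z_right |= new_right
        frontier = {match[v] for v in new_right if v in match} - Z_left
        Z_left |= frontier
    cover = (set(left) - Z_left) | Z_right
    return (set(left) | set(right)) - cover
```

The returned set always has size n minus the size of the maximum matching.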

I almost missed the first talk of the third day, due to my mistaken belief that I could find an alternative route from my hotel to the conference site without referring to a map, but I was very glad that I recovered from my disorientation in time for its start. Torben Hagerup spoke on the problem of determining whether a given tree is the minimum spanning tree of a weighted graph, or (almost equivalently) of finding the maximum-weight edge on each of a collection of paths in a given tree, a key step in the randomized linear-time minimum spanning tree algorithm of Karger, Klein, and Tarjan. This spanning tree verification problem had previously been solved in linear time in two papers, by Dixon, Rauch, and Tarjan and again by King, but Hagerup's solution is even simpler (though still not without some complication). By using least common ancestors and a technique from the previous papers, based on Borůvka's algorithm, of hierarchical clustering into subtrees, the problem becomes one of finding maximum-weight edges on ancestor-descendant paths in a tree in which all paths have logarithmic length, so that sets of edges in these paths (for instance, the set of suffix maxima according to the weight function) can be represented in single machine words. Hagerup then derives some clever bit manipulation tricks for solving the problem in a linear number of word operations, based around the algebraic properties of an interesting operation $$\operatorname{down}(x,y)$$ that returns the nonzero bits of $$y$$ for which the next larger bit of $$x$$ is smaller than the next larger bit of $$y$$. Unlike our new SIGACT chair, I'm a big fan of this sort of bit-parallel computation: I think the effort of algorithm design in this area is well spent, because these techniques really do speed up actual programs and yet can be quite nontrivial to discover.
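For contrast with all that linear-time machinery, the verification task itself has a simple quadratic baseline (my own sketch, not Hagerup's algorithm): a spanning tree is minimum if and only if every non-tree edge is at least as heavy as the heaviest edge on the tree path between its endpoints (the cycle property).

```python
def is_min_spanning_tree(n, edges, tree):
    """Check the cycle property: for each non-tree edge (u, v, w),
    w must be >= the maximum edge weight on the u-v path in the tree.
    edges, tree: lists of (u, v, w); tree must span vertices 0..n-1.
    """
    adj = {v: [] for v in range(n)}
    for u, v, w in tree:
        adj[u].append((v, w))
        adj[v].append((u, w))

    def max_on_path(s, t):
        # DFS from s, tracking the heaviest edge on the (unique) path
        stack = [(s, None, 0)]
        while stack:
            v, parent, best = stack.pop()
            if v == t:
                return best
            for u, w in adj[v]:
                if u != parent:
                    stack.append((u, v, max(best, w)))

    tree_set = {frozenset((u, v)) for u, v, _ in tree}
    return all(w >= max_on_path(u, v)
               for u, v, w in edges
               if frozenset((u, v)) not in tree_set)
```

This does a full tree traversal per non-tree edge; the point of the Dixon-Rauch-Tarjan, King, and Hagerup results is to answer all of those path-maximum queries in linear total time.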
Or, if you prefer an argument from authority, Knuth likes them too: on my way to France, I had coincidentally watched Knuth discuss similar manipulations in his "Platologic Computation" talk (available here for Windows or here for iTunes).

All in all, a good conference, and one I would attend much more regularly if only it weren't so far away.