I participated today in the successful doctoral defense of Evrim Ozel, one of Mike Goodrich’s students at UCI. Evrim is Turkish and came to UCI from Middle East Technical University in Ankara.

His dissertation concerns experimental algorithms, road networks, and noisy sorting, combining research from four of his papers:

  • Efficient exact learning algorithms for road networks and other graphs with bounded clustering degrees (SEA 2022, with Goodrich) uses unweighted shortest-path queries to infer an unknown graph on a known set of vertices, via a method based on randomized incremental graph Voronoi diagrams. They analyze their algorithm’s complexity as \(O(d_{\max}^2n\log n)\), where \(d_{\max}\) is the maximum number of neighbors of a Voronoi cell in their construction. It follows from some old work of mine that for random geometric points one can expect \(d_{\max}\) to be \(O\bigl(\tfrac{\log n}{\log\log n}\bigr)\), and they provide empirical evidence that road networks exhibit similarly logarithmic behavior for this parameter. (A toy sketch of the query model appears after this list.)

  • Modeling the small-world phenomenon with road networks (SIGSPATIAL 2022, best paper runner-up, with Goodrich) experiments with three different social network models that overlay random long-distance connections onto a road network, inspired by Kleinberg’s work on overlays of grid graphs (sketched in code after this list). One overlays a Barabási–Albert network, a second overlays random long-distance connections using the same model as Kleinberg, and a third combines both ideas by building a network incrementally, with each new vertex choosing a fixed number of neighbors with probabilities combining the neighbors’ degrees and distances. In a model of greedy routing incorporating a fixed drop-out rate per hop, the third model comes close to replicating Milgram’s finding that most successfully completed chains took only six or seven steps, far fewer than greedy routing needs in Kleinberg’s grid model. Evrim, Mike, and another of Mike’s students, Ofek Gila, have a follow-up paper (not in the thesis) that won the best paper award at COCOA 2023.

  • Noisy sorting without searching: data oblivious sorting with comparison errors (SEA 2023, with Afshar, Dillencourt, and Goodrich). Here, noisy sorting means that some comparisons can produce an incorrect result, and remain incorrect if you try them again. It was known through work of Geissmann, Leucci, Liu, and Penna that with \(O(n\log n)\) comparisons you can get all items to within distance \(O(\log n)\), and average distance \(O(1)\), of their correct sorted positions, both optimal for this model. Evrim’s paper comes close to matching these bounds in the more restrictive data-oblivious model of sorting, in which the sequence of comparisons cannot depend on the results of earlier comparisons (true, for instance, of sorting networks but not of many other sorting algorithms); a small simulation of this error model also follows the list.

  • External-memory sorting with comparison errors (WADS 2023, with Goodrich). This uses the same noisy sorting model, but considers it from the point of view of external-memory algorithms, where the goal is to minimize the number of blocks of memory transferred into and out of the CPU and its cache. Here, Evrim’s paper provides an algorithm whose total amount of memory I/O is optimal (a note on the underlying I/O bound follows the list), again with near-optimal bounds on how close each item ends up to its correct sorted position.
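
To make the query model of the first paper concrete, here is a minimal Python sketch (a toy baseline only, not the paper’s randomized incremental Voronoi method; the function names and landmark count are illustrative). An edge \(uv\) exists exactly when the queried distance is one, and distances to a few random landmarks, together with the triangle inequality, rule out most non-adjacent pairs without querying them directly.

```python
import random
from itertools import combinations

def infer_edges(vertices, dist, num_landmarks=10):
    """Toy baseline for inferring a connected graph on known vertices
    from an unweighted shortest-path oracle dist(u, v).  An edge uv
    exists exactly when dist(u, v) == 1, so n*(n-1)/2 queries always
    suffice; the point is to query far fewer pairs."""
    landmarks = random.sample(vertices, min(num_landmarks, len(vertices)))
    # Query each vertex's distance to every landmark: n * k queries.
    to_landmark = {v: [dist(v, c) for c in landmarks] for v in vertices}
    edges = set()
    for u, v in combinations(vertices, 2):
        # Triangle inequality: an edge uv forces the distances from u
        # and from v to each landmark to differ by at most one, so
        # pairs failing this test cannot be edges and need no query.
        if all(abs(a - b) <= 1 for a, b in
               zip(to_landmark[u], to_landmark[v])):
            if dist(u, v) == 1:  # confirm each surviving candidate
                edges.add((u, v))
    return edges
```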
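
The second paper’s starting point, Kleinberg’s grid model with greedy routing, can be sketched as follows (grid version only; the paper’s experiments replace the grid by a real road network, and the parameter values here are illustrative):

```python
import random

def kleinberg_overlay(n, alpha=2.0):
    """An n-by-n grid in which every node also gets one long-range
    contact, chosen with probability proportional to
    (Manhattan distance)**(-alpha), as in Kleinberg's model."""
    nodes = [(i, j) for i in range(n) for j in range(n)]
    def d(p, q):
        return abs(p[0] - q[0]) + abs(p[1] - q[1])
    long_link = {}
    for p in nodes:
        others = [q for q in nodes if q != p]
        weights = [d(p, q) ** -alpha for q in others]
        long_link[p] = random.choices(others, weights)[0]
    return nodes, d, long_link

def greedy_route(src, dst, d, long_link, drop_per_hop=0.0):
    """Route greedily toward dst, always moving to the neighbor (four
    grid neighbors plus the long-range contact) closest to dst; the
    message is dropped with fixed probability at each hop, as in the
    drop-out model mentioned above.  Returns the hop count, or None
    if a participant dropped out."""
    cur, hops = src, 0
    while cur != dst:
        if random.random() < drop_per_hop:
            return None
        x, y = cur
        candidates = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1),
                      long_link[cur]]
        candidates = [q for q in candidates if q in long_link]  # on grid
        cur = min(candidates, key=lambda q: d(q, dst))
        hops += 1
    return hops

# Example: nodes, d, links = kleinberg_overlay(20)
#          greedy_route((0, 0), (19, 19), d, links, drop_per_hop=0.05)
```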
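
For the third paper’s setting, the following small simulation shows both ingredients: a comparator whose errors are persistent, and a data-oblivious comparison sequence. Odd-even transposition sort is only a stand-in here, using \(O(n^2)\) comparisons rather than the near-optimal \(O(n\log n)\) of the actual results.

```python
import random

class NoisyComparator:
    """Each distinct pair's comparison errs independently with
    probability p, and the error is persistent: repeating the same
    comparison returns the same wrong answer."""
    def __init__(self, p=0.05, seed=None):
        self.p = p
        self.rng = random.Random(seed)
        self.flipped = {}
    def less(self, a, b):
        key = (min(a, b), max(a, b))
        if key not in self.flipped:
            self.flipped[key] = self.rng.random() < self.p
        return (a >= b) if self.flipped[key] else (a < b)

def oblivious_sort(items, cmp):
    """Odd-even transposition sort: the positions compared at each
    step are fixed in advance, independent of outcomes, which is
    exactly the data-oblivious property."""
    a = list(items)
    n = len(a)
    for r in range(n):
        for i in range(r % 2, n - 1, 2):
            if cmp.less(a[i + 1], a[i]):  # possibly erroneous compare
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

# With persistent errors the result is only approximately sorted; the
# maximum displacement measures how far items land from their true
# positions.
out = oblivious_sort(range(100), NoisyComparator(p=0.05, seed=1))
max_displacement = max(abs(pos - val) for pos, val in enumerate(out))
```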
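
As context for the fourth paper’s optimality claim (my gloss, not a statement from the thesis): in the standard external-memory model with internal memory \(M\) and block size \(B\), sorting \(n\) items is known, from the classical work of Aggarwal and Vitter, to require \(\Theta\bigl(\tfrac{n}{B}\log_{M/B}\tfrac{n}{B}\bigr)\) block transfers, so an I/O-optimal noisy sorting algorithm matches this bound despite its unreliable comparisons.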

At the defense, Evrim gave a very impressive talk in which he managed to cover all of these results smoothly, clearly, and in detail, within the one-hour time limit and despite being interrupted quite a few times by questions from the audience (ok, mostly me). His next step is to join LinkedIn as a data scientist, where he previously interned.

Congratulations, Evrim!

(Discuss on Mastodon)