Ok, I think my idea from last time of using winding numbers is enough to patch Kaufmann's broken proof that his system of states of a knot diagram (equivalently, spanning trees of a plane-embedded graph, rooted at a vertex of the outer face) forms a lattice.
Recall that a rotation changes one spanning tree into another, by replacing the parent of some vertex \(v\) by a new parent one step clockwise or counterclockwise among the neighbors of \(v\). The new edge, together with the path in the old spanning tree connecting its endpoints, forms a Jordan curve in the plane; the rotation is allowed if the wedge between the old and new edges at \(v\) lies in the interior of this Jordan curve, because under that condition the dual spanning tree also undergoes a rotation of the same type.
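As a concrete (if very partial) sketch, the combinatorial part of a rotation can be described when the embedding is given as a rotation system: for each vertex, the cyclic list of its neighbors in clockwise order. All names below are hypothetical, and the geometric legality test (the wedge lying inside the Jordan curve) and the check that the result remains a spanning tree are both omitted.

```python
def rotate_clockwise(parent, neighbors, v):
    """Return a new parent map in which v's parent has been replaced by
    the neighbor one step clockwise from the current parent.

    `parent` maps each non-root vertex to its parent in the spanning tree;
    `neighbors` maps each vertex to its clockwise-ordered neighbor list.
    Legality of the rotation is NOT checked here."""
    ring = neighbors[v]
    i = ring.index(parent[v])
    new_tree = dict(parent)          # leave the old tree untouched
    new_tree[v] = ring[(i + 1) % len(ring)]
    return new_tree
```

Repeating the rotation cycles \(v\)'s parent through its neighbors; in the actual system only the rotations passing the Jordan-curve test are allowed.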
We may assume without loss of generality that our planar graph is embedded in such a way that no three vertices are collinear. For technical reasons, we also assume that the chosen root vertex has only one incident edge (if not, add a new artificial vertex inside the outer face of the embedding, connect the old root to it by a single edge, and designate the new vertex as the root; this addition does not change the structure of trees and rotations). Given a path \(p\) in a spanning tree from a vertex \(v\) to the root \(r\) on the outer face, define the winding number of \(v\) to be the number of times path \(p\) crosses line segment \(vr\) from counterclockwise to clockwise, minus the number of times it crosses from clockwise to counterclockwise (where clockwise and counterclockwise refer to the angles of points on the path as viewed from \(v\)).
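In coordinates, this signed crossing count can be computed with cross products. The sketch below (helper names are hypothetical) relies on the general-position assumption above: no path edge has an endpoint exactly on segment \(vr\) except at the two ends of the path, and those terminal edges are skipped by the strict sign tests.

```python
def cross(o, a, b):
    # 2D cross product of the vectors o->a and o->b
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def winding_number(path, v, r):
    """Signed crossing count of the polyline `path` with segment vr:
    +1 each time the path crosses from the counterclockwise side of vr
    to the clockwise side, -1 for the reverse crossing."""
    w = 0
    for a, b in zip(path, path[1:]):
        sa, sb = cross(v, r, a), cross(v, r, b)
        # the edge properly crosses the line through v and r ...
        if sa > 0 > sb or sa < 0 < sb:
            # ... and the crossing lies on segment vr, not its extension
            if cross(a, b, v) * cross(a, b, r) < 0:
                w += 1 if sa > 0 else -1
    return w
```

For example, a path from \(v=(0,0)\) to \(r=(10,0)\) that dips across the segment once from above picks up a single \(+1\) crossing.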
The key property of these numbers is that the only rotations that can change the winding number of \(v\) are rotations at \(v\) itself. A rotation in which the parent of \(v\) was counterclockwise of \(r\) (as viewed from \(v\)) before the rotation, and is clockwise of \(r\) after the rotation, increases the winding number by one, and the reverse rotation decreases the winding number by one. Rotations cannot occur at \(r\), and rotations at any other vertex of \(p\) can't change the winding number: if we consider the Jordan curve formed by the new edge of the rotation and the old tree path connecting its endpoints, the path \(p\) runs along one side of this curve while the new path runs along the other, but these two sides are homotopic (can be continuously deformed into each other), and this continuous deformation adds and removes matching pairs of left-right and right-left crossings of line segment \(vr\), leaving the total winding number unchanged. (This is where we use the assumption that the root has only one incident edge, so that neither it nor \(v\) is on the boundary of this Jordan curve — if they were, it would be possible for a homotopy to change the winding number.)
So now, label a rotation at a vertex \(v\) that changes its parent to \(w\) and its winding number to \(i\) by the triple \((v,w,i)\). If we perform a sequence of clockwise rotations, starting from the counterclockwisemost tree, after which a rotation labeled \((v,w,i)\) is performable, then it will remain performable through the course of any additional clockwise rotations at other vertices, until it is actually performed. This can be seen via Kaufmann's markings on the medial graph \(M(G)\): a rotation at \(v\), setting its parent to \(w\), is possible exactly when there are two marks on the corresponding edge of \(M(G)\); other clockwise rotations can't change those marks, so a rotation \((v,w,j)\) for some \(j\) will remain performable until actually performed, and the winding number \(j\) cannot change either, as described above.
We may form a formal language from the possible sequences of clockwise rotations starting from the counterclockwisemost tree, using the labels of rotations as the symbols of the language's alphabet. Each maximal word in the language contains each possible rotation exactly once, because every maximal sequence ends at the clockwisemost tree, and because for each vertex \(v\) the rotations at \(v\) must progress cyclically through the possible values until they reach the parent and winding number that \(v\) has in that clockwisemost tree. A language of this type, in which each maximal word is a permutation of the same set of symbols, each prefix of a word in the language is also in the language, and a symbol, once available to be appended to a word, remains available until it actually is appended, is called an antimatroid.
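To make the three defining conditions concrete, here is a small checker for finite languages given as sets of words (symbols as characters). The example it is applied to is not the rotation language itself, which would require the full machinery above, but a standard toy instance: the prefixes of the linear extensions of the two-relation partial order \(a<c\), \(b<c\).

```python
def is_antimatroid_language(words):
    """Check the three antimatroid conditions from the text on a finite,
    nonempty set of words over distinct symbols."""
    words = set(words)
    # 1. every prefix of a word is also a word
    if any(w[:i] not in words for w in words for i in range(len(w))):
        return False
    # 2. all maximal words are permutations of one ground set
    maximal = [w for w in words
               if not any(u != w and u.startswith(w) for u in words)]
    ground = set(maximal[0])
    if any(set(w) != ground or len(w) != len(ground) for w in maximal):
        return False
    # 3. persistence: a symbol available after w stays available after wx
    for w in words:
        avail = {u[len(w)] for u in words
                 if len(u) == len(w) + 1 and u.startswith(w)}
        for x in avail:
            if any(w + x + s not in words for s in avail - {x}):
                return False
    return True

# prefixes of the linear extensions of the poset a<c, b<c
poset_words = {"", "a", "b", "ab", "ba", "abc", "bac"}
```

Dropping the word "ba" breaks prefix-closure, and the language of prefixes of \(\{abc, bca\}\) breaks persistence (after \(b\), the symbol \(a\) is no longer available), so both fail the check.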
Antimatroids can also be described as families of sets (the sets of symbols in their words) and define a lattice structure on those sets in which the lattice join operation is represented by set union. Each set in this set system corresponds to one of Kaufmann's states (the state reachable by performing that set of rotations, in any order in which they may be performed), so, as Kaufmann claimed, we have a lattice on the states.
But we can deduce a little more about the structure of this lattice, by looking at the same system of states and rotations in the other direction, starting from the clockwisemost tree and applying counterclockwise rotations. It is again an antimatroid, by a symmetric argument, but it is convenient to label its rotations slightly differently: in the counterclockwise antimatroid, label a counterclockwise rotation by the triple \((v,w,i)\) when \(w\) is the parent of \(v\) in the old spanning tree, prior to the rotation, and \(i\) is the winding number of \(v\) in the old spanning tree. Antimatroids can be described both in terms of families of sequences (as above) and as union-closed accessible set systems; viewed as set systems, with the labeling described above, the complement of a set in the clockwise antimatroid is a set in the counterclockwise antimatroid and vice versa. Join operations are represented as unions of these sets, because that is how joins behave in antimatroids; but because these two antimatroids are complementary, the meet operation in one antimatroid is the complement of the join of the complements in the other antimatroid, and is represented as the intersection of sets. Thus, we have a lattice in which joins and meets are both represented as set-theoretic unions and intersections; this is (by Birkhoff's theorem) the defining property of a distributive lattice, and the sets representing each state must be the lower sets of some partial order. Thus, Kaufmann's attempted proof, in which he claims to define a partial order on the pairs \((v,w)\), is on the right track, but needs the pairs to be augmented into triples by the winding numbers in order to work correctly.
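The appeal to Birkhoff's theorem can be checked in miniature. The sketch below uses a toy partial order (hypothetical, not Kaufmann's triples): it builds the lower sets of the order, verifies that they are closed under union (join) and intersection (meet), and verifies that complementation exchanges them with the lower sets of the reversed order, matching the clockwise/counterclockwise duality described above.

```python
from itertools import chain, combinations

# Example poset: a < c and b < c, on ground set {a, b, c}.
ground = {"a", "b", "c"}
below = {("a", "c"), ("b", "c")}          # the x < y pairs

def lower_sets(ground, below):
    """All downward-closed subsets of the partial order."""
    subsets = chain.from_iterable(
        combinations(sorted(ground), k) for k in range(len(ground) + 1))
    return {frozenset(s) for s in subsets
            if all(x in s for (x, y) in below if y in s)}

L = lower_sets(ground, below)
# closed under union (join) and intersection (meet) ...
assert all(a | b in L and a & b in L for a in L for b in L)
# ... and complementation exchanges L with the reversed order's lower sets
rev = lower_sets(ground, {(y, x) for (x, y) in below})
assert {frozenset(ground - s) for s in L} == rev
```

Here \(L\) consists of the five sets \(\emptyset, \{a\}, \{b\}, \{a,b\}, \{a,b,c\}\); a set like \(\{c\}\) is excluded because \(c\) requires both \(a\) and \(b\) below it.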
Conclusion: Kaufmann's states and rotations form a distributive lattice, much as Kaufmann claimed, but with a rather more complicated proof.