After a cursory read, I don't buy the guaranteed constant time claim.
"If there are so many competing adjacent deletes that you exceed this number of traversals, you just stop on the 10th deleted node and modify it, editing out the following 10 deleted nodes. The upshot is a list that steadily shrinks..."
...unless you're inserting and deleting more than 10 consecutive nodes all the time. Then the list steadily grows. Though I could be missing something.
1) The constant-time claim for deletion is actually unrelated to the size of the list. Even if the list grew without bound (i.e. only a fraction of all deletes were physically realised), deletion would still be constant time, because we explicitly bound the traversal. Iteration would pay the cost instead, reimposing the true structure as it went.
2) Regardless of the number of competing deletes, we always successfully remove some of the nodes we intended to. The only question is whether an older delete in the same interval selected the same terminal node in one direction as a newer delete, and so overrides the newer delete's pointer modification, reintroducing some nodes the newer delete had removed. Such an interleaving should be rare, and rarer still as the range grows, since the newer delete both started later and had more work to do. The main risk is a thread suspended in the middle of making its modification, which could conceivably reintroduce a large range of previously removed nodes when it resumes.
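To make the bounded-deletion scheme concrete, here is a minimal single-threaded sketch of the pointer surgery being discussed: mark the node logically deleted, traverse at most a fixed number of consecutive deleted successors, then swing the predecessor's pointer past the run. The names (`Node`, `delete`, `MAX_SKIP`) are my own, and the real scheme would do the final pointer swing with a CAS, which is exactly where a racing older delete could overwrite a newer one, as point 2 describes.

```python
MAX_SKIP = 10  # the traversal bound from the scheme under discussion


class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next
        self.deleted = False  # logical-deletion mark


def delete(pred, node):
    """Logically delete `node`, then physically edit out a bounded run of
    consecutive deleted successors.

    Single-threaded sketch only: concurrently, the assignment to
    `pred.next` would be a CAS, and an older competing delete that chose
    the same terminal node could override it, reintroducing nodes.
    """
    node.deleted = True
    # Walk forward past deleted nodes, traversing at most MAX_SKIP of them.
    succ = node.next
    hops = 1
    while succ is not None and succ.deleted and hops < MAX_SKIP:
        succ = succ.next
        hops += 1
    # If the bound was hit, `succ` may itself still be a deleted node:
    # we "stop on the 10th deleted node" and leave the remainder for a
    # later traversal (or iteration) to reclaim.
    pred.next = succ
```

Note that when the bound is exceeded, the list temporarily retains deleted nodes beyond the stopping point; whether it steadily shrinks or grows then depends on whether iteration (which reimposes the true structure) keeps pace with the rate of competing adjacent deletes.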