That is correct: if the first few records (roughly a virtual block's worth; I'm not sure how many off the top of my head) don't compress well, it assumes the file is not compressible and writes the data out as-is, skipping the compression step.
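A minimal sketch of that early-abort heuristic in Python. The probe size and savings threshold here are made-up placeholders, not the real implementation's values:

```python
import zlib

PROBE_SIZE = 128 * 1024  # hypothetical: bytes sampled from the file's start
MIN_SAVINGS = 0.125      # hypothetical: require ~12.5% size reduction

def write_record(data: bytes) -> bytes:
    """Compress data only if a probe of its prefix compresses well."""
    probe = data[:PROBE_SIZE]
    compressed_probe = zlib.compress(probe)
    # Early abort: if the prefix barely shrinks, assume the rest of
    # the file is incompressible and store it verbatim.
    if len(compressed_probe) > len(probe) * (1 - MIN_SAVINGS):
        return data  # stored uncompressed
    return zlib.compress(data)
```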
That quote doesn't contradict what mkl said. Towards the end of the article it does say this, though: "The cloak will be on display in the new museum developed in Perth, set to open in Spring 2024."
It isn't required (in the strict sense) to keep the dedup table in memory; the problem is that performance is dire when it isn't. When the table is not fully resident, the effect is pretty similar to virtual memory thrashing: lookups keep faulting out to disk.
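To illustrate the thrashing analogy, here's a toy Python model (not real ZFS code; the cache policy and the 5 ms penalty are invented for the example). Because checksums are uniformly distributed, once the table outgrows the in-memory cache most lookups miss it and each miss pays one random I/O:

```python
import hashlib
import time

class DedupTable:
    """Toy model of a dedup table that only partly fits in RAM."""

    def __init__(self, cache_limit: int):
        self.on_disk = {}       # stand-in for the full on-disk table
        self.cache = {}         # subset of entries resident in memory
        self.cache_limit = cache_limit

    def lookup(self, block: bytes):
        key = hashlib.sha256(block).digest()
        if key in self.cache:
            return self.cache[key]   # fast path: entry already in RAM
        time.sleep(0.005)            # simulate one random disk read
        entry = self.on_disk.get(key)
        if len(self.cache) >= self.cache_limit:
            # evict an arbitrary entry to make room (toy policy)
            self.cache.pop(next(iter(self.cache)))
        if entry is not None:
            self.cache[key] = entry
        return entry
```

With uniformly random keys, shrinking `cache_limit` below the table size makes nearly every `lookup` hit the slow path, which is why throughput collapses rather than degrading gracefully.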
https://en.wikipedia.org/wiki/Arthur_John_Priest