I hate UTF-16 and the systems that use it with a passion, but...
Windows and Java (and JavaScript) adopted Unicode at a time when it was thought that 64K code points would be enough for everyone. Then they prioritized backwards compatibility over everything else. Most of us have benefited from their insistence on backwards compatibility in some form or another, so I'm really not in a position to complain about it :-/
That said, IMHO any "length" property (as opposed to `codepoints` or `bytes`) on a UTF-16 string should definitely be deprecated.
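To make the mismatch concrete, here's a minimal JavaScript sketch (using only standard features: `length`, the string iterator, and `TextEncoder`) showing three different "lengths" for a single character outside the Basic Multilingual Plane:

```js
// U+1D11E MUSICAL SYMBOL G CLEF — outside the BMP, so UTF-16 stores it as a surrogate pair
const clef = "\u{1D11E}";

console.log(clef.length);                           // 2 — UTF-16 code units
console.log([...clef].length);                      // 1 — code points (the string iterator walks code points)
console.log(new TextEncoder().encode(clef).length); // 4 — UTF-8 bytes
```

None of these numbers is wrong, but `length` answering "2" for one visible character is the one callers are least likely to expect.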