No, "byte" simply meant the number of bits needed to represent a single character in a fixed-width encoding.
From Wikipedia: The byte (/ˈbaɪt/) is a unit of digital information that most commonly consists of eight bits. Historically, the byte was the number of bits used to encode a single character of text in a computer[1][2] and for this reason it is the smallest addressable unit of memory in many computer architectures. The size of the byte has historically been hardware dependent and no definitive standards existed that mandated the size.
PS: 16-bit "bytes" fit this definition with UCS-2, a fixed-width encoding that covers a large chunk of Unicode. UTF-16, on the other hand, is not a fixed-width encoding, so different rules apply. However, we stuck with 8-bit bytes for long enough that it has become a viable alternative definition, at least until we byte the bullet and go back to a fixed-width format, this time 32 bits wide.
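To make the fixed- vs. variable-width point concrete, here is a minimal sketch in Python (the post names no language, so that choice is an assumption): characters outside the Basic Multilingual Plane need two 16-bit code units in UTF-16 (a surrogate pair), which is exactly why UTF-16 is not fixed-width, while UCS-2 simply cannot represent them at all.

    # Count the 16-bit code units UTF-16 needs per character.
    for ch in ["A", "€", "𝄞"]:  # U+0041, U+20AC, U+1D11E
        units = len(ch.encode("utf-16-le")) // 2  # bytes / 2 = 16-bit code units
        print(f"U+{ord(ch):04X} -> {units} code unit(s)")
    # Expected output: 1, 1 and 2 code unit(s) respectively;
    # the last character needs a surrogate pair.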
None of this changes the byte as a unit of measure, or its value when converting to bits: 1 byte = 8 bits.
Extended ASCII uses the full 8 bits (unsigned), covering values 0 through 255, with values 0 through 31 reserved for control codes.
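As a quick illustration, again in Python and treating Latin-1 as the "extended ASCII" in question (an assumption, since the post doesn't name a specific code page): every one of the 256 byte values maps to exactly one character, and the first 32 are control codes.

    # Latin-1 maps each byte value 0-255 to exactly one character.
    chars = [bytes([b]).decode("latin-1") for b in range(256)]
    assert len(chars) == 256 and all(len(c) == 1 for c in chars)
    print(repr(chars[7]))          # '\x07' (BEL) -- values 0-31 are control codes
    print(chars[65], chars[233])   # 'A' and 'é' -- printable 8-bit values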