The question is simple: how many bits are in a byte? But the answer is not as straightforward as it seems; there is a “but” after the answer we all know.
How many bits in a byte?
One byte consists of 8 bits, which are adjacent. This is the de facto standard: it is widely accepted and used in practice. But the number of bits in a byte is not fixed; it can differ between systems. There exist systems whose bytes are wider than 8 bits. The number of bits in a byte is actually implementation-defined and is not necessarily 8 everywhere. It can be 8 bits, 9 bits, 16 bits, 32 bits, even 64 bits. Does that mean sizeof (char) may return more than 1? No, because char has a storage size of exactly one byte, and when the number of bits in a byte changes, the definition of the byte changes, not the definition of char, so sizeof (char) is always 1. The minimum number of bits in a char is defined to be 8 as per C99 Section 5.2.4.2.1 Paragraph 1, so the number of bits in a byte in any implementation must be greater than or equal to 8.
C99 Section 3.6 Paragraph 3 states:
NOTE 2 A byte is composed of a contiguous sequence of bits, the number of which is implementation-defined. The least significant bit is called the low-order bit; the most significant bit is called the high-order bit.
This clearly states that the number of bits in a byte is implementation-defined.
We might need to know the number of bits in a byte to perform bit operations or for some other reason. How can we find the number of bits in a byte for a particular implementation? The number of bits per byte is defined in limits.h as the CHAR_BIT macro. Therefore, if we want to know how many bits there are in a char or an int, we compute sizeof (char) * CHAR_BIT and sizeof (int) * CHAR_BIT respectively. We need to make sure that limits.h is included.