I have this array of bits:

int bits[8] = { 0, 1, 1, 0, 0, 1, 0, 1 };

This is 0x65 in hex, or 101 in decimal, and the corresponding ASCII character is 'e'. How do I go about reading my array into a char and an int (the decimal value)?
You could use bit shifting to build the char from the bit array, like so:
#include <iostream>

int main() {
    int bits[8] = { 0, 1, 1, 0, 0, 1, 0, 1 };
    char result = 0; // accumulates the character
    for (int i = 0; i < 8; i++) {
        result += bits[i] << (7 - i); // the first element is the most significant bit
    }
    std::cout << result; // prints 'e'
}
This loops through your array, shifts each bit left by the correct amount (the first array element is the most significant bit), and adds the shifted value into the aggregating result variable. The output should be "e".
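Since char is an integer type, the decimal int value the question also asks for falls out of the same result variable with a cast; a minimal sketch, placed after the loop above:

    int value = static_cast<int>(result); // 101, the decimal ASCII code of 'e'
    std::cout << value;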
Comments:

assert(bits[i] == 0 || bits[i] == 1), just because if it fails, the code will silently return a wrong result.

Or use std::bitset, because C != C++ (except if C is a floating-point value large enough).

for (int i = 0; i < 8; i++) { result = (result << 1) | bits[i]; } would work too and, IMHO, is slightly easier to understand. (Neither mine nor the answer proposed by Keveloper checks that the values of bits[] are either 0 or 1.)

Better to do bits[i] == 0 ? 0 : 1 instead of just bits[i].
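Putting these suggestions together, here is a minimal, self-contained sketch of the shift-or variant with the input validation the comments recommend (it assumes the same MSB-first layout as the question):

#include <cassert>
#include <iostream>

int main() {
    int bits[8] = { 0, 1, 1, 0, 0, 1, 0, 1 };
    char result = 0;
    for (int i = 0; i < 8; i++) {
        assert(bits[i] == 0 || bits[i] == 1); // fail loudly on bad input instead of computing garbage
        result = (result << 1) | bits[i];     // shift the accumulated bits left, append the next one
    }
    std::cout << result << ' ' << static_cast<int>(result) << '\n'; // prints: e 101
}

And a sketch of the std::bitset suggestion, again assuming the MSB-first layout from the question:

#include <bitset>
#include <iostream>

int main() {
    int bits[8] = { 0, 1, 1, 0, 0, 1, 0, 1 };
    std::bitset<8> b;
    for (int i = 0; i < 8; i++) {
        b[7 - i] = (bits[i] != 0); // bitset index 0 is the least significant bit
    }
    std::cout << static_cast<char>(b.to_ulong()) << ' ' << b.to_ulong() << '\n'; // prints: e 101
}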