
I have this array of bits

int bits[8] = { 0, 1, 1, 0, 0, 1, 0, 1 };

This is 0x65 in hex, or 101 in decimal; the corresponding ASCII character is 'e'. How do I go about reading my array into a char and into an int holding the decimal value?

3 Comments

  • I'd look at this: stackoverflow.com/questions/13667746/… Commented Jun 25, 2018 at 14:34
  • Second half of the accepted answer is wrong though, sadly... Commented Jun 25, 2018 at 14:39
  • Do prefer the second up-voted answer. Commented Jun 25, 2018 at 14:42

1 Answer


You could use bit shifting in order to get the char from the bit array like so:

#include <iostream>
using namespace std;

int main() {
    int bits[8] = { 0, 1, 1, 0, 0, 1, 0, 1 };
    char result = 0; // store the result

    for (int i = 0; i < 8; i++) {
        result += bits[i] << (7 - i); // add each bit, shifted into its position (MSB first)
    }

    cout << result; // prints 'e'
}

This loops through your array, shifts each bit into its position (most significant bit first), and adds the value to an aggregating "result" variable. The output is "e".
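Since the question also asks for the decimal value as an int, here is a minimal self-contained sketch (variable names are mine, not from the answer) that accumulates the same bit pattern into an int with a shift-and-or loop and then reinterprets it as a char:

#include <iostream>

int main() {
    int bits[8] = { 0, 1, 1, 0, 0, 1, 0, 1 };

    int value = 0; // decimal value of the bit pattern
    for (int i = 0; i < 8; i++) {
        value = (value << 1) | (bits[i] ? 1 : 0); // shift in one bit at a time, MSB first
    }

    char letter = static_cast<char>(value); // same bits read as an ASCII character

    std::cout << value << '\n';  // 101
    std::cout << letter << '\n'; // e
    return 0;
}

Accumulating into an int first avoids doing arithmetic directly in a char; the char is just the same value reinterpreted.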


3 Comments

I'd probably throw in an assert(bits[i] == 0 || bits[i] == 1), because otherwise a bad input will silently produce a wrong result.
And I'd use std::bitset because C != C++ (except if C is a floating-point value large enough); a sketch of that approach follows these comments.
for(int i = 0; i < 8; i++) { result = (result << 1) | bits[i]; } would work too and imho is slightly easier to understand. (Neither my version nor the answer proposed by Keveloper checks that the values in bits[] are either 0 or 1.) Better to do bits[i] == 0 ? 0 : 1 instead of just bits[i].
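For reference, a minimal sketch of the std::bitset alternative mentioned above, assuming the same most-significant-bit-first ordering as the question (names are mine):

#include <bitset>
#include <iostream>

int main() {
    int bits[8] = { 0, 1, 1, 0, 0, 1, 0, 1 };

    std::bitset<8> bs;
    for (int i = 0; i < 8; i++) {
        bs[7 - i] = (bits[i] != 0); // bitset index 0 is the least significant bit
    }

    unsigned long value = bs.to_ulong();    // 101
    char letter = static_cast<char>(value); // 'e'

    std::cout << value << ' ' << letter << '\n';
    return 0;
}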
