@Harry's answer explains why sizeof(c)/sizeof(char), where c is a pointer, returns the size of the pointer type rather than the length of the C string that the pointer may refer to.
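To illustrate the pitfall, here is a minimal sketch (the names buf and p are hypothetical): sizeof applied to an array yields the size of the whole array, while applied to a pointer it yields only the size of the pointer itself:

    #include <cstdio>

    int main()
    {
        char buf[] = "abcdefgh";   // array of 9 chars (8 letters + '\0')
        char* p = buf;             // pointer to the first element

        // sizeof(buf) is the size of the whole array: prints 9
        std::printf("%zu\n", sizeof(buf) / sizeof(char));
        // sizeof(p) is the size of the pointer itself,
        // e.g. 8 on a typical 64-bit platform
        std::printf("%zu\n", sizeof(p) / sizeof(char));
    }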
To avoid such errors entirely, you can use std::string instead of char* for I/O. For example, to get the bits of each byte of a string (as "01" strings, i.e. the binary representation of each byte):
    #include <bitset>
    #include <climits>
    #include <string>

    template<class UnaryFunction>
    void str_to_bits(const std::string& str, UnaryFunction yield)
    {
        for (unsigned char byte : str)
            yield(std::bitset<CHAR_BIT>(byte).to_string());
    }
Example call:
    #include <iostream>

    int main()
    {
        str_to_bits("abcdefgh", [](auto bits) {
            std::cout << bits << ' ';
        });
    }
Note that auto in a lambda parameter is supported only since C++14:
    $ g++ -std=c++14 yield_bits.cxx && ./a.out
    01100001 01100010 01100011 01100100 01100101 01100110 01100111 01101000
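If you are limited to C++11, the same call works by spelling out the parameter type explicitly instead of auto (a minimal sketch; std::string matches what str_to_bits above passes to the callback):

    #include <iostream>
    #include <string>

    int main()
    {
        // C++11-compatible: explicit parameter type instead of auto
        // (str_to_bits is the function defined above)
        str_to_bits("abcdefgh", [](const std::string& bits) {
            std::cout << bits << ' ';
        });
    }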
As expected, the codes of consecutive English letters form a run of consecutive numbers (their trailing bits count up: 1, 10, 11, 100, 101, etc. in binary).