I'm running some experiments with the Crypto++ library, and I don't understand why, before the reinterpret_cast below, the length is 20 bytes, but afterwards it is only 8 chars instead of 10. Thank you very much for your attention.
P.S. I'm sorry if the question isn't well posed according to the guidelines; I'm new here. Maybe it isn't very general, but I hope it can be useful to others nonetheless.
#include "cryptopp/cryptlib.h"
#include "cryptopp/sha.h"
#include "cryptopp/filters.h"
#include "cryptopp/hex.h"
#include "cryptopp/files.h"
#include <cstdlib>
#include <iostream>
#include <string>
using namespace CryptoPP;
int main (){
std::string msg = "Yoda said, Do or do not. There is no try.";
std::string digest;
SHA1 hash;
hash.Update((const byte*)msg.data(), msg.size());
digest.resize(hash.DigestSize());
hash.Final((byte*)&digest[0]);
std::cout << "Message: " << msg << std::endl;
std::cout << "Digest: "<<digest<<std::endl;
//StringSource(digest, true, new Redirector(encoder));
//std::cout << std::endl;
std::cout << typeid(digest).name() << std::endl;
std::cout<<digest.size()<<std::endl;
//this is the line......
const unsigned char* decoded = reinterpret_cast<const unsigned char *>( digest.c_str() );
std::cout << "Length of array = " << (sizeof(decoded)/sizeof(*decoded)) << std::endl;
//byte decoded[] = digest;
std::string encoded;
HexEncoder encoder;
encoder.Put(decoded, sizeof(decoded));
encoder.MessageEnd();
//word64 size = encoder.MaxRetrievable();
int size=encoder.MaxRetrievable(); std::cout<<size<<std::endl;
if(size)
{
encoded.resize(size);
encoder.Get((byte*)&encoded[0], encoded.size());
}
std::cout << encoded << std::endl;
}
This is the output:
Message: Yoda said, Do or do not. There is no try.
Digest: -y;{:r79R
NSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE
20
Length of array = 8
16
05C0042DF9A7793B
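To check whether this has anything to do with Crypto++ itself, here is a stripped-down sketch that reproduces the same two numbers for me; the 20-character string is just a hypothetical stand-in for the digest:

#include <iostream>
#include <string>

int main()
{
    // hypothetical stand-in for the 20-byte SHA-1 digest
    std::string digest(20, 'x');
    std::cout << digest.size() << std::endl;                        // prints 20

    // the same cast and "length" expression as in the full program above
    const unsigned char* decoded = reinterpret_cast<const unsigned char*>(digest.c_str());
    std::cout << (sizeof(decoded) / sizeof(*decoded)) << std::endl; // prints 8 here as well
}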
What I was really trying to do was simply convert the result of the hash into a hexadecimal string that I could manipulate. The example on https://www.cryptopp.com/wiki/SHA only redirects the result through a pipeline to the terminal, but I wanted to have access to the encoded string. After making that modification, however, I found that the encoded digest is many digits shorter than the original, hence my attempt to find out why.
This was the original code, taken more or less verbatim from the Crypto++ wiki:
#include "cryptopp/cryptlib.h"
#include "cryptopp/sha.h"
#include "cryptopp/filters.h"
#include "cryptopp/hex.h"
#include "cryptopp/files.h"
#include <cstdlib>
#include <iostream>
#include <string>
using namespace CryptoPP;
int main(){
HexEncoder encoder(new FileSink(std::cout));
std::string msg = "Yoda said, Do or do not. There is no try.";
std::string digest;
SHA1 hash;
hash.Update((const byte*)msg.data(), msg.size());
digest.resize(hash.DigestSize());
hash.Final((byte*)&digest[0]);
std::cout << "Message: " << msg << std::endl;
std::cout << "Digest: ";
StringSource(digest, true, new Redirector(encoder));
std::cout << std::endl;
return 0;
}
and this was the output, with the right length (160 bits, i.e. 40 hex characters):
05C0042DF9A7793B7BDE3AB9724C08CF37398652
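For completeness, what I would ideally like is to capture that same hex string in a std::string instead of sending it to std::cout. Something along these lines is what I had in mind (an untested sketch that just swaps the wiki example's FileSink for a StringSink):

std::string encoded;
StringSource(digest, true,
    new HexEncoder(
        new StringSink(encoded)));   // collect the hex output in `encoded` instead of printing it

std::cout << encoded << std::endl;

If that is the right way to capture the output, then my question reduces to why the Put/Get version above loses most of the digest.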