codeJack - 4 years ago
C++ Question

Valgrind : "Invalid write of size 1" after char[] to std::vector<char> migration

I have a very simple hexlify method in C++, modelled on Python's binascii implementation.

std::string Hexlify(const std::string& iData)
{
    // the same as python binascii.b2a_hex
    const size_t len = iData.size() << 1; // output will be twice as long
    char hex[len];

    const char* curdata = iData.data();
    char* curhex = hex;
    const char* end = curdata + iData.size();

    char c;
    // from python's implementation (2.7.1, if it matters)
    while(curdata <= end)
    {
        c = (*curdata >> 4) & 0xf;
        c = (c > 9) ? c + 'a' - 10 : c + '0';
        *(curhex++) = c;
        c = (*curdata) & 0xf;
        c = (c > 9) ? c + 'a' - 10 : c + '0';
        *(curhex++) = c;
        curdata++;
    }
    return std::string(hex, len);
}


This appears to work perfectly fine.

Now, the hex char[] is allocated on the stack, which can be a problem with huge buffers, so I wanted to migrate it to a std::vector to benefit from heap allocation.

std::string Hexlify(const std::string& iData)
{
    // the same as python binascii.b2a_hex
    const size_t len = iData.size() << 1; // output will be twice as long
    std::vector<char> hex(len);

    const char* curdata = iData.data();
    char* curhex = &hex[0];
    const char* end = curdata + iData.size();

    // SAME CODE AS BEFORE

    return std::string(&hex[0], len);
}


This std::vector implementation triggers an "Invalid write of size 1" error in Valgrind.

Any idea why?

If I make the hex vector two bytes bigger (one does not seem to be enough)

std::vector<char> hex(len + 2);

the error disappears from Valgrind's report.

Answer

Because you're off by one.

If iData is, say, "xy", then end points one past the "y". With your <= end condition the loop runs one extra iteration, so you attempt to encode 3 characters (6 hex digits) into a buffer only big enough for 4. That stray iteration writes two extra bytes, which is exactly why the vector has to be len + 2 before Valgrind stops complaining. The char[] version had the same bug; it just clobbered the stack silently instead of tripping Valgrind's heap checks.
