Konrad Kapp - 1 year ago 79

C++ Question

I have a very strange segmentation fault that occurs when I call `delete[]` on an array allocated with `new`. The `delete[] arr` is in `main`:

```cpp
int main(int argc, char * argv [])
{
    double * arr = new double [5];
    delete[] arr;
}
```

I get the following message:

```
*** Error in `./energy_out': free(): invalid next size (fast): 0x0000000001741470 ***
Aborted (core dumped)
```

Apart from the code in `main`, the only other code that runs before `main` is the initialization of a global vector by `cos_vector()`:

```cpp
vector<double> cos_vector()
{
    vector<double> cos_vec_temp = vector<double>(int(2*pi()/trig_incr));
    double curr_val = 0;
    int curr_idx = 0;
    while (curr_val < 2*pi())
    {
        cos_vec_temp[curr_idx] = cos(curr_val);
        curr_idx++;
        curr_val += trig_incr;
    }
    return cos_vec_temp;
}

const vector<double> cos_vec = cos_vector();
```

Note that the return value of `cos_vector`, `cos_vec_temp`, is copied into the global `cos_vec`.

The thing is, I know what causes the error: the size given to `cos_vec_temp` is one element too small, so the last `cos_vec_temp[curr_idx]` in the loop writes one element past the end of `cos_vec_temp`. When I allocate one extra element for `cos_vec_temp`, the error goes away. What I don't understand is why that out-of-bounds write makes the later `delete[]` of `arr` fail, since `arr` is a separate allocation made inside `main` and seems completely unrelated. Looking at `arr` in gdb:

```
(gdb) p &cos_vec[6283]
$11 = (__gnu_cxx::__alloc_traits<std::allocator<double> >::value_type *) 0x610468
(gdb) p arr
$12 = (double *) 0x610470
```

In the first gdb command, I show the memory location of the element just past the end of `cos_vec`: `0x610468`. In the second, I show the address of `arr`: `0x610470`. Since a `double` is 8 bytes, the out-of-bounds element occupies exactly the 8 bytes in front of `0x610470`, i.e. the memory immediately preceding `arr`. But `arr` is not allocated until `main` runs, well after the out-of-bounds write has already happened, so I don't see how that write can break the later `delete[]` of `arr`.

Any clarification would be appreciated. To be clear, I already know the fix: size `cos_vec_temp` correctly instead of as `int(2*pi()/trig_incr)`, or avoid raw `new`/`delete[]` altogether. What I am asking is why an out-of-bounds write to `cos_vec` ends up corrupting a separate `double *` allocation.

Before you downvote me for using a dynamic array, I am just curious as to why this occurs. I normally use STL containers and all their conveniences (I almost NEVER use dynamic arrays).


Answer

Many heap allocators store meta-data next to the memory they allocate for you, before or after (or both) the user-visible block. If you write out of bounds of some heap-allocated memory (and remember that `std::vector` allocates its storage dynamically from the heap), you might overwrite some of this meta-data, *corrupting* the heap.

None of this is actually specified by the C++ standard. All it says is that going out of bounds leads to undefined behavior. What the allocator does, what it stores, and where it possibly stores meta-data are up to the implementation.

As for a solution, most people will tell you to use `push_back` instead of direct indexing, and that *will* solve the problem. Unfortunately it also means the vector may need to be reallocated and copied a few times as it grows. That can be mitigated by reserving an approximate amount of memory beforehand, and then letting the one stray extra element cause at most a single reallocation and copy.

Or, of course, make a better prediction of the actual number of elements the vector will contain.
