Mark Benovsky - 1 year ago
C# Question

List<> looping optimization

I have a project that works with a large amount of objects of the same type.

For now I use a List<T>, but looping through this list seems slow when I have about 1,000,000 items.

In the loop, a method is called on each item; new items are randomly generated, and some items are deleted.

What can I do to optimize this loop?
Should I change the collection type, or move the items to a database?

This is what the loop looks like:

    while (_worldIsLiving)
    {
        int beginningPopulation = WorldPopulationNumber;

        for (int i = 0; i < beginningPopulation; i++)
        {
            // (method call on each person elided in the question)
        }

        // Remove dead persons
        for (int i = 0; i < WorldPopulationNumber;)
        {
            if (_worldPopulation[i].IsDead()) _worldPopulation.RemoveAt(i);
            else i++;
        }

        if (_hasStopCondition && (WorldPopulationNumber >= WorldMaximumPopulation || WorldPopulationNumber == 0))
        {
            // (stop-condition handling elided in the question)
        }
    }

In each iteration, new persons can be generated.


Answer Source

_worldPopulation.RemoveAt(i) on a list will be an expensive operation.

It involves shunting every subsequent item down by a position. You're calling this multiple times in every iteration of the outer loop, once for every "dead" instance.

This will multiply up to a very significant overhead for a long list, with O(n²) complexity (if my CS hat is being worn correctly today).
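To make that cost concrete, here is a small self-contained sketch (plain ints stand in for the question's Person objects, with even values playing the "dead" ones) that counts how many elements each RemoveAt has to shunt down a position:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class RemoveAtDemo
{
    // Prunes "dead" (even) entries the way the question's loop does,
    // counting how many elements RemoveAt shunts down along the way.
    public static int PruneCountingShifts(List<int> population)
    {
        int shifts = 0;
        for (int i = 0; i < population.Count; )
        {
            if (population[i] % 2 == 0)
            {
                shifts += population.Count - i - 1; // tail elements moved by RemoveAt
                population.RemoveAt(i);
            }
            else i++;
        }
        return shifts;
    }

    static void Main()
    {
        var population = Enumerable.Range(0, 10).ToList();
        int shifts = PruneCountingShifts(population);
        Console.WriteLine(string.Join(",", population)); // 1,3,5,7,9
        Console.WriteLine(shifts);                       // 25 element moves for 5 removals
    }
}
```

Even in this 10-item toy, 5 removals cause 25 element moves; at a million items the shunting dominates.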

It would likely be much faster to just write out a new list:

    _worldPopulation = _worldPopulation.Where(p => !p.IsDead()).ToList();
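As an aside not covered in the original answer: if you do want in-place pruning, List<T>.RemoveAll does the job in a single pass. A sketch, again with ints standing in for the question's Person type:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class RemoveAllDemo
{
    // One-pass, in-place prune: RemoveAll compacts the survivors downwards
    // and truncates the list once, instead of shifting the tail per removal.
    public static int PruneEvens(List<int> population)
        => population.RemoveAll(n => n % 2 == 0);

    static void Main()
    {
        var population = Enumerable.Range(0, 10).ToList();
        int removed = PruneEvens(population);
        Console.WriteLine(removed);                      // 5
        Console.WriteLine(string.Join(",", population)); // 1,3,5,7,9
    }
}
```

In the question's terms this would be `_worldPopulation.RemoveAll(p => p.IsDead());`, which keeps O(n) behaviour without allocating a new list.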

If this still seems expensive, is it important to prune the list at all? (can the list hold all members of the population both alive and dead, or will this strain the available memory?)

You could, for instance:

    var livingPopulace = _worldPopulation.Where(p => !p.IsDead());
    foreach (var pop in livingPopulace)
    {
        // (update each living person)
    }
    var newPopulationCount = _worldPopulation.Count(p => !p.IsDead());

Although this requires two sweeps of the collection, it's still going to be fewer passes over the collection than using RemoveAt, especially if the death-rate per cycle is high. You're back to O(n) complexity, and I suspect that with a large collection this will be significantly more efficient than using RemoveAt.
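The two sweeps are easy to see because LINQ's Where is deferred: the predicate runs each time the query is enumerated. A sketch with ints in place of Person, counting predicate calls:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class TwoSweepDemo
{
    // Returns (livingCount, predicateCalls) for a 10-item world where even
    // values stand in for dead persons; counts how often the filter runs.
    public static (int count, int calls) Run()
    {
        int calls = 0;
        var world = Enumerable.Range(0, 10).ToList();

        // Deferred LINQ query: the filter runs each time it is enumerated.
        var living = world.Where(n => { calls++; return n % 2 != 0; });

        foreach (var p in living) { /* update p here */ }  // sweep 1
        int count = living.Count();                        // sweep 2

        return (count, calls);
    }

    static void Main()
    {
        var (count, calls) = Run();
        Console.WriteLine(count); // 5 survivors
        Console.WriteLine(calls); // 20: the predicate ran twice per item
    }
}
```

If the double evaluation bothers you, materialize once with ToList() after the foreach has done its work; either way you stay at O(n).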

If neither of these is satisfactory, you might consider using a LinkedList<T> as your container which supports easy forward iteration and low cost removal (at the expense of random-access indexing), but there might be other constraints that make this impractical. Nothing in the code you supplied suggests that this wouldn't work. Just don't be tempted by Linq operators such as ElementAt to get round the random-access limitations or you'll be back to the same kinds of problem.
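For completeness, the LinkedList<T> pattern would look roughly like this (again with ints standing in for Person); the key point is saving node.Next before Remove unlinks the node:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class LinkedListDemo
{
    // Forward walk with O(1) removal: capture Next before unlinking the node.
    public static LinkedList<int> PruneEvens(LinkedList<int> world)
    {
        var node = world.First;
        while (node != null)
        {
            var next = node.Next;          // save before Remove detaches the node
            if (node.Value % 2 == 0)       // even values play the dead persons
                world.Remove(node);        // O(1): just relinks the neighbours
            node = next;
        }
        return world;
    }

    static void Main()
    {
        var world = new LinkedList<int>(Enumerable.Range(0, 10));
        Console.WriteLine(string.Join(",", PruneEvens(world))); // 1,3,5,7,9
    }
}
```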
