[cfe-dev] Proposed C++ optimization with big speed gains with big objects

Lars Viklund zao at acc.umu.se
Tue Sep 25 10:09:21 PDT 2012


On Tue, Sep 25, 2012 at 09:26:03AM -0700, Ladislav Nevery wrote:
> int a=1;
> a=2;
> 
> The moment I agreed to assign a new value, I agreed to the old value being
> lost. This is normal and expected behavior.
> 
> If you want to preserve the old value for whatever reason (construction
> failure), you keep the old value in a temp, like you do for everything else.
> 
> The advantage is you know what is going on.
> You know that a redundant copy is being kept, and you can use it as you wish.
> But preserving the state of objects makes a big performance difference, and
> 99% of the time we don't care about it in this particular case.
> 
> Skyscrapper city[1000], tmp=city[1]; // 1000 useless default constructors in
> this case, doing nothing the specialized one would not do anyway
> 
> city[1]=Skyscrapper("Empire"); // if an exception occurs we just try again,
> or restore from tmp in the usual exception handler
> 
> In this case, which covers pretty much most static-array creation loops,
> restoring from tmp doesn't make sense, so thank god we can skip it and get
> rid of the useless copy.
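
For context, here is a minimal sketch of what the standard already guarantees
in this exact case. The names come from the quoted example (keeping the
original "Skyscrapper" spelling); the throwing constructor is an assumption
added for illustration:

    #include <stdexcept>
    #include <string>

    struct Skyscrapper {
        std::string name;
        Skyscrapper() {}                         // the "useless" default ctor
        explicit Skyscrapper(const std::string& n) : name(n) {
            if (n.empty())                       // assumed failure mode
                throw std::runtime_error("no name");
        }
    };

    int main() {
        Skyscrapper city[1000], tmp = city[1];   // the copy the poster keeps
        try {
            // The temporary is fully constructed *before* operator= touches
            // city[1], so a throwing constructor leaves city[1] untouched;
            // tmp is not needed to recover from this particular failure.
            city[1] = Skyscrapper("Empire");
        } catch (...) {
            // city[1] still holds its old, valid value here.
        }
        (void)tmp;
    }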

The problem you face is that all the code in the world is written
against the standard, not against your scary optimization.

That means that if you have your "optimization" enabled in a TU, _all_
the code that is pulled into it must be aware of the standard violation
that your optimization performs.
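
Concretely, ordinary code everywhere assumes that even a throwing assignment
leaves the target object alive, because the target's destructor will still
run. A minimal sketch of that assumption (hypothetical code, not from any
real project):

    #include <new>
    #include <string>
    #include <vector>

    void user_code(std::vector<std::string>& v, const char* input) {
        if (v.empty()) return;                // keep the sketch well-defined
        try {
            // Standard semantics: the right-hand temporary is constructed
            // first, then assigned. If construction throws, v[0] is still
            // a live string, and ~vector later destroys it exactly once.
            v[0] = std::string(input);
        } catch (const std::bad_alloc&) {
            // v[0] remains a valid object here.
        }
    }

Destroy-then-construct assignment would leave v[0] already dead when the
construction throws, and the vector's destructor would then destroy it a
second time: undefined behavior, in code that never opted into the scheme.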

The core concern you have when implementing an optimization is not
optimizing the cases you care about. It's making sure it doesn't screw
over all the other cases.

It's completely irrelevant that it may make some explicitly desired cases
better if it breaks the rest of the world indiscriminately.

I'm willing to bet three pinecones that if you implemented this, large
chunks of code would start failing, including the standard library
implementations in use.
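
To make the bet concrete, here is one guess at where it would bite first,
sketched from the general shape of mutating algorithms rather than from any
specific library source:

    #include <algorithm>
    #include <string>
    #include <vector>

    // std::remove compacts the surviving elements with plain assignments
    // (morally: *out = *in), and the library assumes that even a throwing
    // assignment leaves *out a live, destructible object.
    void drop_empties(std::vector<std::string>& v) {
        v.erase(std::remove(v.begin(), v.end(), std::string()), v.end());
    }

If any of those internal assignments destroyed the target before constructing
its replacement, a throw mid-algorithm would leave the vector holding dead
elements, which its destructor would then destroy a second time.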

-- 
Lars Viklund | zao at acc.umu.se


