Bill Kelly wrote:
>> engine can do this for you.  After all, it is the job of the programming 
>> language to make programming easier for us, is it not?
> 
> Is it worth noting that the article at
> http://perlmonks.org/index.pl?node_id=502408
> indicates:
> 
>   - a performance penalty is incurred for keeping track of the
>     "have we been here before" state;
> 
>   - the solution doesn't cover all regexps, so some regexps
>     can still produce exponential backtracking.
> 
> ?
> 
> 
> The Perl guys seem to feel the trade-off was worth it.  Maybe
> so.  But it doesn't seem 100% clear cut to me.

The important point is that the performance penalty is linear: matches
that get slower only get slower by a constant factor. But this penalty
prevents an *exponential* slowdown in many typical cases. I did not wait
to see how long Ruby would have taken on the simple example I gave, but
the slowdown was already more than 2000-fold when I killed it -- far
more than the constant-factor penalty would ever cost.
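To make this concrete, here is a sketch of the kind of pathological
match I mean. The pattern and input sizes are my own illustration, not
the exact example from earlier in the thread, and it assumes a
reasonably recent Ruby (Regexp#match? needs 2.4+). Note that newer
Rubies (3.2+) include a memoization optimization similar in spirit to
Perl's, so the timings may stay flat there:

```ruby
# Classic pathological pattern: the nested quantifiers let a plain
# backtracking engine try ~2^n ways of splitting the run of 'a's
# before it can report failure.
PATHOLOGICAL = /\A(a+)+b\z/

def match_time(n)
  s = "a" * n + "c"   # trailing 'c' guarantees the match must fail
  t0 = Process.clock_gettime(Process::CLOCK_MONOTONIC)
  PATHOLOGICAL.match?(s)
  Process.clock_gettime(Process::CLOCK_MONOTONIC) - t0
end

# Without memoization the time roughly doubles per extra 'a';
# with it, the growth disappears.
[10, 14, 18].each { |n| printf("n=%2d  %.4fs\n", n, match_time(n)) }
```

On an engine without the memoization, the last line already takes
noticeably longer than the first; pushing n into the thirties is where
a match becomes practically unkillable.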

Regarding the example: yes, it is an extremely simple example, meant for
illustration only. However, with complex hand-crafted regular
expressions, or with automatically generated ones, the same effect can
arise in situations where it is much harder, or even impossible, to find
an equivalent regular expression that avoids the exponential slowdown.
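For a toy pattern like my illustration above, rewriting is still easy.
One general tool Ruby's regexp syntax offers for this is the atomic
group (?>...), which refuses to give back what it has matched, so the
engine cannot backtrack into it. (The pattern below is my illustrative
one, not one from the thread.)

```ruby
# Equivalent to /\A(a+)+b\z/ as a language, but the atomic group
# swallows the whole run of 'a's in one piece, so a failing match
# is linear instead of exponential.
SAFE = /\A(?>a+)b\z/

p SAFE.match?("aaab")           # => true
p SAFE.match?("a" * 40 + "c")   # => false, fails immediately
```

The catch, as said above, is that for genuinely complex or generated
expressions there may be no such mechanical rewrite, which is exactly
where an engine-level fix pays off.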

Exponential slowdown can be an extremely bad thing, rendering an
application practically unusable. So I would say the trade-off is
definitely worth it, even if it cannot prevent *all* pathological cases
(my feeling is that most pathological cases that occur in practice *can*
be prevented).

-- 
Posted via http://www.ruby-forum.com/.