Ara.T.Howard <Ara.T.Howard / noaa.gov> wrote:
> 
> here dark is the array having values like
> 
>    [ 1, 2, 3, 6789, 6790, 6791 ]
> 
> that i want to reduce to a list of ranges.
> 
> in reality the ranges are huge and there will, in almost all cases except
> crossing the poles, be only one.  also note that 'dark' is a sorted list.  my
> current optimization is
> 
>    min, max = dark[0], dark[dark.size - 1]
> 
>    ranges =
>      if((max - min + 1) != dark.size)
>        slow_search dark
>      else
>        [ min .. max ]
>      end
> 
>    ranges.each do |range|
>      #
>      # munge data based on ranges which are small and fast
>      #
>    end
> 
> 
> i can't think of anything better than brute force for the 'slow_search' but
> thought i'd throw it out there...
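
For reference, the brute-force slow_search can be a single pass that closes a
range whenever the next value is not consecutive. An untested sketch, assuming
dark is non-empty, sorted and has no duplicates:

  # linear scan: walk the sorted array once, closing the current range
  # whenever the next value is not prev + 1
  def slow_search dark
    ranges = []
    start = prev = dark[0]
    dark[1..-1].each do |x|
      unless x == prev + 1
        ranges << (start .. prev)
        start = x
      end
      prev = x
    end
    ranges << (start .. prev)
    ranges
  end

  slow_search [1, 2, 3, 6789, 6790, 6791]  # => [1..3, 6789..6791]

That is O(n) regardless of how few ranges there are.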

This might work better: binary search for the largest x with
(x - min) == (i_x - i_min), i.e. the end of the current consecutive run,
emit that range, then set i_min to i_x + 1 and repeat. For an added
optimisation, run two loops in parallel, searching from each end.
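
Roughly, in Ruby (untested sketch; fast_search is just a placeholder name,
and it assumes the same non-empty, sorted, duplicate-free dark). Each run's
end is found by binary search, so the cost is O(log n) per range instead of
a full scan:

  # binary-search version: for each run starting at lo, find its last index
  # by searching for the largest j with dark[j] - dark[lo] == j - lo
  # (every value between lo and j is then consecutive), emit that range
  # and restart at j + 1
  def fast_search dark
    ranges = []
    lo = 0
    while lo < dark.size
      left, right = lo, dark.size - 1
      while left < right
        mid = (left + right + 1) / 2
        if dark[mid] - dark[lo] == mid - lo
          left = mid        # run still unbroken up to mid
        else
          right = mid - 1   # run breaks at or before mid
        end
      end
      ranges << (dark[lo] .. dark[left])
      lo = left + 1
    end
    ranges
  end

  fast_search [1, 2, 3, 6789, 6790, 6791]  # => [1..3, 6789..6791]

With only one or two huge runs, as in your case, this does a handful of
O(log n) probes instead of touching every element.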

martin