"Just Another Victim of the Ambient Morality" <ihatespam / rogers.com> wrote 
in message news:NKaFg.144673$Em2.92508 / fe10.news.easynews.com...
>
> "Just Another Victim of the Ambient Morality" <ihatespam / rogers.com> wrote 
> in message news:XpaFg.542444$C62.257947 / fe12.news.easynews.com...
>>
>> "Just Another Victim of the Ambient Morality" <ihatespam / rogers.com> 
>> wrote in message news:xkaFg.542364$C62.447443 / fe12.news.easynews.com...
>>>
>>> "Just Another Victim of the Ambient Morality" <ihatespam / rogers.com> 
>>> wrote in message news:k2aFg.542068$C62.362943 / fe12.news.easynews.com...
>>>>
>>>> "Just Another Victim of the Ambient Morality" <ihatespam / rogers.com> 
>>>> wrote in message news:9D9Fg.541757$C62.90805 / fe12.news.easynews.com...
>>>>>    I placed a Ruby script in the site_ruby directory, expecting other 
>>>>> Ruby scripts to find it via the "require" keyword but, to my 
>>>>> surprise, it does nothing.
>>>>>    Do I have no idea how "require" works?  How does "require" find 
>>>>> files to load?
>>>>>    Thank you...
>>>>
>>>>    Okay, it's failing specifically for mechanize.  If I write another 
>>>> Ruby script, that one is found properly.  So, what's with mechanize?
>>>>    Thank you...
>>>
>>>    ...and "require 'rubygems'" doesn't help...
>>>    Thanks...
>>
>>    ...and I'm using Ruby 1.8.4...
>>    Thanks...
>
>    Okay, so if I go into mechanize.rb and make this change to a line of 
> code:
>
>
> # This is the original line of code...
> #require 'web/htmltools/xmltree'   # narf
>
> # This is my hack to get the file to parse
> require 'xmltree'
>
>
>    ...and go into mechanize\parsing.rb and comment out this block of code:
>
>
> # Aliasing functions to get rid of warnings.  Remove when support for
> # 1.8.2 is dropped.
> if RUBY_VERSION > "1.8.2"
>    alias :old_each_recursive       :each_recursive
>    alias :old_find_first_recursive :find_first_recursive
>    alias :old_index_in_parent      :index_in_parent
> end
>
>
>    ...it all parses.  Otherwise, I get an undefined method error for 
> 'each_recursive'.  What's up with that?  Do I have any reasonable 
> expectation of this working now?  Why the hell did I have to do all this? 
> Why can't this "just work"?
>    Thank you...
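
A side note on the alias block quoted above: the `alias` keyword needs its 
target method to already exist when that line executes, and aliasing a 
missing method raises NameError ("undefined method ..."), which matches the 
failure described.  A minimal sketch (Demo and Broken are made-up class 
names, not anything from mechanize):

```ruby
class Demo
  def each_recursive; end
  # fine: the target method exists by the time the alias line runs
  alias :old_each_recursive :each_recursive
end

aliasing_error = nil
begin
  class Broken
    alias :old_missing :missing   # target never defined -> NameError
  end
rescue NameError => e
  aliasing_error = e
end
```

So if mechanize's aliases run in a scope where each_recursive isn't defined 
(e.g. the wrong parser got required), the whole file blows up at load time.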

    Okay, after _way_ too many code modifications, I got it to almost 
work...
    One way it is defective is that it can't always parse the links on a 
page.  The strange thing is that it works fine for some pages (like 
slashdot.org) but completely fails for others (like rubyforge.org).  When 
it fails, it fails to collect the URL of a link, and the script then dies 
with a NoMethodError from calling a method on nil.  I looked at the source 
of the two pages and they seem the same to me.  Can anyone guess what's 
going on?
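
For illustration, this is the kind of nil guard that avoids dying on anchors 
with no URL.  This is a sketch with REXML on a tiny well-formed snippet, not 
mechanize's actual parsing code:

```ruby
require 'rexml/document'

# Hypothetical input: one normal link plus a named anchor with no href,
# the shape of tag that can leave a link's URL nil.
html = <<HTML
<root>
  <a href="http://slashdot.org/">Slashdot</a>
  <a name="top">no href here</a>
</root>
HTML

doc = REXML::Document.new(html)
links = []
doc.elements.each('//a') do |a|
  href = a.attributes['href']
  next if href.nil?           # guard: skip anchors that carry no href
  links << href
end
```

With the guard, `links` holds only the real URL instead of a nil that blows 
up later.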
    Also, why didn't "mechanize" "just work?"
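
On the earlier question of how "require" finds files: it resolves a bare 
name against the directories in $LOAD_PATH (also spelled $:), and site_ruby 
is normally on that list, which is why a script dropped there becomes 
requirable.  On 1.8, "require 'rubygems'" additionally teaches require to 
search installed gems.  A small sketch ("/tmp/mylibs" is just an 
illustrative path):

```ruby
# Show where require will look for bare names like 'mechanize'.
$LOAD_PATH.each { |dir| puts dir }

# Prepending a directory makes its scripts requirable by bare name.
$LOAD_PATH.unshift '/tmp/mylibs'
# require 'mylib'   # would now load /tmp/mylibs/mylib.rb if it existed
```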
    Thank you...