Lee Jarvis wrote:
> Ok guys, I need to build a script that connects to a website, grabs and
> opens a link, parses the contents of that link, and submits the findings
> in a form. I have most of it done, but the link for the file I need to
> parse is random and changes on every session, so I always get an error
> when I try to open it. It looks something like this:
> 
> site = Net::HTTP.post_form(url,
> {'start' => '1'})
> link = site.body.scan(/href="(.+?)">/)
> open(link) # it fails here, because it's not the same session
> 
> Any ideas? Sorry if i didn't explain myself very well
> 
> tia
Have you tried Google? There's a bunch of frameworks for web scraping
with Ruby, so why not use one of them? To save you some searching, try
WWW::Mechanize or scRUBYt.
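For what it's worth, two things are biting the original snippet: `scan`
returns an array of capture arrays (not a string), and plain
`Net::HTTP.post_form` doesn't carry the session cookie into the next
request, so the per-session link is dead by the time it's opened.
Mechanize handles cookies for you automatically; if you want to stay on
the standard library, a rough sketch like this shows the idea (the URL
and cookie handling here are simplified placeholders, not a drop-in
fix):

```ruby
require 'net/http'
require 'uri'

# scan returns nested capture arrays, e.g. [["a.html"], ["b.html"]],
# so the original open(link) was handed an Array, not a String.
# Pull a single href out instead:
def first_link(html)
  match = html.scan(/href="(.+?)"/).first
  match && match.first
end

# To stay in the same session with bare Net::HTTP, echo the server's
# cookie back on the follow-up request. This naive version keeps only
# the first name=value pair and drops attributes like path/expires.
def cookie_header(set_cookie)
  set_cookie.split(';').first
end

# Hypothetical flow (the URL and form field are placeholders):
#   url  = URI('http://example.com/start')
#   res  = Net::HTTP.post_form(url, 'start' => '1')
#   link = first_link(res.body)
#   req  = Net::HTTP::Get.new(link)
#   req['Cookie'] = cookie_header(res['Set-Cookie'].to_s)
#   page = Net::HTTP.start(url.host, url.port) { |h| h.request(req) }
```

With Mechanize the cookie bookkeeping disappears entirely: one agent
object does the POST, follows the link, and fills in the form, all in
the same session.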