Chad Layton wrote:
> I'm rather new to both web programming and ruby so forgive me if my 
> question is ill formed.
> 
> I'm trying to do some screen scraping on a website that requires a 
> login. What I would like to have happen is for the user to log in to 
> the website normally, then run my script, which uses the existing 
> login session to grab the page and do whatever to it.
> 
> To illustrate my problem: If I use 
> Net::HTTP.get_response(URI.parse("http://foo.bar/baz.php")).body, then 
> it serves up the index page asking for a login. How do I get the 
> contents of baz.php?

I suspect that the user agent (i.e., the code, as opposed to a browser) 
needs to include site cookies in the request headers.

After you sign in using a browser, you'll need to find the cookie left 
by the site, or inspect the session cookie in the browser itself if it 
is not being written to disk.  Most browsers have a way to show the 
cookies set by a site.
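Something along these lines might work (just a sketch; the URL is the one 
from your post, but the cookie name and value are placeholders -- copy the 
real ones from your browser after logging in):

```ruby
require 'net/http'
require 'uri'

uri = URI.parse("http://foo.bar/baz.php")

# Build the request and attach the session cookie by hand.
# 'PHPSESSID=abc123' is hypothetical; substitute the cookie your
# browser is actually holding for the site.
req = Net::HTTP::Get.new(uri.request_uri)
req['Cookie'] = 'PHPSESSID=abc123'

# Uncomment to actually fetch the page using the existing session:
# res = Net::HTTP.start(uri.host, uri.port) { |http| http.request(req) }
# puts res.body
```

As long as the cookie is still valid (i.e., the session hasn't expired 
server-side), the site should treat the script's request as the same 
logged-in session.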



James


-- 

http://www.ruby-doc.org       - Ruby Help & Documentation
http://www.artima.com/rubycs/ - Ruby Code & Style: Writers wanted
http://www.rubystuff.com      - The Ruby Store for Ruby Stuff
http://www.jamesbritt.com     - Playing with Better Toys
http://www.30secondrule.com   - Building Better Tools