Is there some reason not to use wget or curl? Both are already
written and handle files that size. What are you hoping to do with
the files you download?
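
If you just want the bits on disk, you could even shell out to curl
from the Ruby script. Untested sketch; it assumes curl is on your
PATH and that @opts[:out] and @url are the same objects as in the
code quoted below:

    # -L follows redirects, -C - resumes a partial download,
    # -o writes the response straight to the output file.
    ok = system("curl", "-L", "-C", "-", "-o", @opts[:out], @url.to_s)
    raise "download failed" unless ok

wget -c would give you the same resume behavior.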

-Bryan

On Dec 31, 2007, at 2:04 PM, thefed wrote:

> What is the best way to download files over HTTP that are larger
> than 1GB?
>
> Here's the whole story: I was trying to use Ruby's Net::HTTP to
> manage a download from Wikipedia, specifically all current versions
> of the English one. Partway through the download I ran out of RAM
> and got a memory error.
>
> My current code:
>       open(@opts[:out], "w") do |f|
>         http = Net::HTTP.new(@url.host, @url.port)
>         c = http.start do |http|
>           a = Net::HTTP::Get.new(@url.page)
>           http.request(a)
>         end
>         f.write(c.body)
>       end
>
> I was hoping there'd be some method I could attach a block to, so
> that it would call the block for each byte as it comes in.
>
> Is there some way to write the bytes to the file as they arrive,
> rather than all at once at the end?
>
> Thanks,
> ---------------------------------------------------------------|
> ~Ari
> "I don't suffer from insanity. I enjoy every minute of it"  
> --1337est man alive
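
To answer the streaming question: Net::HTTP can hand you the body in
chunks if you pass a block to request and then to read_body, so the
whole file never has to sit in RAM. Rough, untested sketch, reusing
the @opts[:out] and @url.page from your code above:

    require 'net/http'

    open(@opts[:out], "wb") do |f|            # "wb" keeps binary data intact
      Net::HTTP.start(@url.host, @url.port) do |http|
        req = Net::HTTP::Get.new(@url.page)   # same request path as before
        http.request(req) do |response|
          # The block form of request yields the response before the body
          # has been read, so read_body can stream it piece by piece.
          response.read_body do |chunk|
            f.write(chunk)                    # each chunk goes straight to disk
          end
        end
      end
    end

The difference from your version is that c.body buffers the entire
response in memory before you write it; read_body with a block hands
you each chunk as it comes off the socket.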