Make sure you're not slurping the CSV file's contents after uploading  
it.
Slurping is fine if you know the data will always be of a limited size you can handle, but even then, it's generally better to create a temp file and process things in parts (rough sketch after the steps below):
Load a part
Process it
Save the part
Repeat until done
Check data integrity (make sure it's not screwed up)
Then rename the temp file and delete the original (or delete both, if
the CSV data is going into a database)
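Here's a minimal sketch of that loop in Python, assuming a local file path and a per-row transform() you'd write yourself; the function and file names are just placeholders, not anything from your setup:

import csv
import os
import tempfile

def transform(row):
    # stand-in for whatever per-row processing you actually need
    return row

def process_csv_in_parts(in_path):
    out_dir = os.path.dirname(os.path.abspath(in_path))
    # keep the temp file in the same directory so the final rename is atomic
    with tempfile.NamedTemporaryFile("w", dir=out_dir, suffix=".tmp",
                                     newline="", delete=False) as tmp:
        tmp_path = tmp.name
        writer = csv.writer(tmp)
        with open(in_path, newline="") as src:
            for row in csv.reader(src):  # one row at a time, never the whole file
                writer.writerow(transform(row))
    # crude integrity check: row counts in and out should match
    with open(in_path, newline="") as src, open(tmp_path, newline="") as out:
        if sum(1 for _ in csv.reader(src)) != sum(1 for _ in csv.reader(out)):
            os.unlink(tmp_path)
            raise RuntimeError("row count mismatch, original left untouched")
    os.replace(tmp_path, in_path)  # swap the processed copy into place

The rename at the end is atomic on the same filesystem, so if the box dies mid-run you still have either the intact original or the finished copy, never a half-written file.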

This kind of thing pops up pretty regularly here lately.
It's just safer. It keeps you from running out of resources on the
system, and it also protects data integrity if processing gets
interrupted somehow (power outage, process killed, etc.).
It's a little more work, and may seem like overkill for small files,  
but it's a lot more reliable.

But remember to get some sample test data too; CSV doesn't always mean
short lines. One line could be ridiculously long, so you may need to
check the data first.
This kind of subroutine could get pretty involved and depends a lot  
on data sources.
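If you just want a rough idea of what you're dealing with before committing to an approach, something this simple is enough to spot monster lines (the function name and path are just placeholders):

def longest_line_length(path):
    longest = 0
    with open(path, "rb") as fh:  # bytes, so odd encodings can't trip it up
        for line in fh:
            longest = max(longest, len(line))
    return longest

It's only a quick pass, not a real validator, but it tells you whether slurping or line-at-a-time reading is even on the table.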
But if the data is important, you'll be glad you did all this rather
than end up with one instance of completely munged data that is not
recoverable.