any perl performance experts awake?

Andrew Gould andrewlylegould at gmail.com
Thu Apr 11 09:32:10 PDT 2013


On Thu, Apr 11, 2013 at 11:13 AM, Lonni J Friedman <netllama at gmail.com> wrote:

>
> >
> > Does the script read in an entire data source file and parse each line?
> > Or does it read one line at a time and parse/write it prior to reading
> > the next line?  If the entire source file is being read into memory,
> > could it be causing a bottleneck?
>
> The script reads in the entire data source file, parsing line by line
> and putting the data into a hash (%hash_values).  Once that is
> completed, the hash is passed to sqlInsert().  So everything is already
> in memory at the point when performance tanks.  I'd expect this to be
> the fast path, since it never needs to read from disk.  All of my
> systems have 2+ GB RAM, and the data in question is always less than
> 30 MB, so I can't imagine this is a swap issue, if that's what you
> mean?  Unless querying a key/value pair in a hash is not a good
> performance path in Perl?
> _______________________________________________
>
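On the hash question: lookups in a Perl hash are amortized O(1), so a hash holding under 30 MB of data shouldn't be the bottleneck by itself. A quick way to confirm this on your own box, using only the core Benchmark module (the key names and row count here are made up for illustration):

```perl
use strict;
use warnings;
use Benchmark qw(timethis);

# Build a hash roughly the size described in the thread (~150k rows).
my %hash_values = map { ( "key$_" => $_ ) } 1 .. 150_000;

# Time 100 full passes of 150k lookups each; if hash access were the
# problem, this would be visibly slow.  On typical hardware it isn't.
timethis( 100, sub {
    my $sum = 0;
    $sum += $hash_values{"key$_"} for 1 .. 150_000;
} );
```

If that runs in a few seconds, the slowdown is almost certainly in sqlInsert() rather than in the hash.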


The script is holding the input file (>150k rows?) and the hash in memory
while it reformats the data and runs sqlInsert().  I was wondering
whether the combination of processing and RAM utilization could be
causing the slowdown.
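For what it's worth, when a script parses fast and then "tanks" at the database step, the usual culprit is one INSERT per row with AutoCommit enabled, so every row pays a full commit. The thread never shows sqlInsert()'s internals, so this is only a sketch of what a batched version might look like with DBI (the table and column names are hypothetical):

```perl
use strict;
use warnings;
use DBI;

# Hypothetical batched insert: one prepared statement, one transaction
# for the whole hash, instead of a commit per row.
sub sql_insert_batched {
    my ( $dbh, $hash_ref ) = @_;

    $dbh->begin_work;                 # start a single transaction
    my $sth = $dbh->prepare(
        'INSERT INTO results (name, value) VALUES (?, ?)'
    );
    while ( my ( $key, $value ) = each %$hash_ref ) {
        $sth->execute( $key, $value );  # reuses the prepared statement
    }
    $dbh->commit;                     # one commit for all 150k rows
}
```

If sqlInsert() already batches like this, then profiling with Devel::NYTProf would be the next step to find where the time actually goes.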


More information about the Linux-users mailing list