Well, the reason is our 'archive' file is fairly large in size as well.

The end result I need: the most recent 6 months stay in the original file
(which is over 2GB right now), the archive file holds roughly the previous
year, and a third file I was planning to build will hold the older stuff.

Because of legal issues we have to retain all these notes - we can't
permanently delete anything until it's over 4 years old.
<div class="gmail_quote">On Thu, Feb 3, 2011 at 5:00 PM, Jeff Harrison <span dir="ltr"><<a href="mailto:jeffaharrison@yahoo.com">jeffaharrison@yahoo.com</a>></span> wrote:<br>
<blockquote style="BORDER-LEFT: #ccc 1px solid; MARGIN: 0px 0px 0px 0.8ex; PADDING-LEFT: 1ex" class="gmail_quote">
<div class="im">From: "<a href="mailto:scooter6@gmail.com">scooter6@gmail.com</a>" <<a href="mailto:scooter6@gmail.com">scooter6@gmail.com</a>><br>><br></div>>To: Jeff Harrison <<a href="mailto:jeffaharrison@yahoo.com">jeffaharrison@yahoo.com</a>><br>
>Cc: filePro Mailing List <<a href="mailto:filepro-list@lists.celestial.com">filepro-list@lists.celestial.com</a>><br>>Sent: Thu, February 3, 2011 4:45:39 PM<br>>Subject: Re: file too large<br>
<div class="im">><br>><br>> Well, there are over 8 million records in this file - so I was looking for a<br>>bit quicker of a solution to get this resolved<br>> I can't wait to get this system on a CentOS server - SCO is soooooo dang slow<br>
<br>>haha<br>> I'm currently looking to export the records to a few csv files, then delete<br>>key & data for this file and import only the most<br>> recent 6 months or so and then resolve the rest when I get back next week from<br>
><br>>vacation....<br>> Why do these things happen the day before I'm leaving town? haha<br>><br>> So, to clarify - is this a SCO OpenServer 5.0.5 issue, or is this a filePro<br>>error? Meaning, if I upgrade tonight to 5.6.10 would this<br>

This is an OS limitation - I believe there is a way to expand the limit in
the OS. Perhaps a Unix guru will speak up?
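
For example, from the shell - the 512-byte-block figures and the idtune step
are my assumptions, so check the OpenServer docs for your release:

    # Show the per-process file size limit; sh/ksh report it in
    # 512-byte blocks, so 4194304 blocks is the 2GB mark.
    ulimit -f

    # Try raising it for this shell and its children (8388608 blocks
    # = 4GB). The kernel's hard limit may still cap this.
    ulimit -f 8388608

    # To raise the system-wide default, tune the kernel ULIMIT
    # parameter (e.g. via idtune or scoadmin) and relink/reboot.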

Not sure why you would need to resort to CSV export/import. Just archive the
most recent 6 months that you need to a file with a duplicate map, rename
your original key and data to hldkey and hlddata or something like that, and
then at the OS level copy the key/data holding only the recent records back
to the original location. Then rebuild your indexes.

If you want it to go faster, you can remove the "delete" when you archive -
just remember that you will need to go back and remove those records later.
<div class="im"><br>Jeff Harrison<br><a href="mailto:jeffaharrison@yahoo.com">jeffaharrison@yahoo.com</a><br></div>Author of JHImport and JHExport<br><br><br><br></blockquote></div><br>