PHP has some nice and fast CSV parsing abilities; use them as the source
for your database.
You can then remove any need for regexps,
and it stays simple for the users.
Here is a snippet taken from one of my CSV readers, with the prints left in as
comments so you can see the data arrive in an array.
This approach also keeps the memory footprint down.
$row = 1;
$fp = fopen($fromdir . $file, "r");
while (($data = fgetcsv($fp, 1000, ",")) !== false) { // read the csv line by line
    $num = count($data);
    // print "<p>$num fields in line $row:<br>";
    $row++;
    // for ($c = 0; $c < $num; $c++) {
    //     print "'" . $data[$c] . "' ";
    // }
    // print "<br>";
}
fclose($fp); // release the file handle when done
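To actually feed the rows into a database, a minimal sketch of the insert step
might look like the following. The PDO connection details and the table/column
names (reports, col_a, col_b, col_c) are hypothetical placeholders, not part of
the original snippet:

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass'); // hypothetical connection details
$stmt = $pdo->prepare('INSERT INTO reports (col_a, col_b, col_c) VALUES (?, ?, ?)'); // hypothetical table and columns
$fp = fopen($fromdir . $file, "r");
while (($data = fgetcsv($fp, 1000, ",")) !== false) {
    $stmt->execute(array($data[0], $data[1], $data[2])); // load each csv row once, up front
}
fclose($fp);

Then every page view is just an indexed query against the table instead of a
fresh scan of the CSV file.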
Dave Caroline
On Tue, Dec 6, 2011 at 6:32 PM, Nate Hill <[log in to unmask]> wrote:
> CSV files are what I have. They are easy for the not technically inclined
> staff to create and then save to a folder. I was really just hoping to
> make this easy on the people who make the reports.
>
>
> On Tue, Dec 6, 2011 at 10:21 AM, Dave Caroline
> <[log in to unmask]>wrote:
>
>> I don't understand the thinking behind, and the time wasted in, scanning
>> entire CSV files when a database table with good indexing can be a lot
>> faster and use less server memory.
>>
>> Do the work once, up front, when the data becomes available, not on every
>> page draw.
>>
>> I subscribe to reading, sending and mangling as little as possible (server
>> and client) on a web page view.
>>
>> Dave Caroline
>>
>
>
>
> --
> Nate Hill
> [log in to unmask]
> http://www.natehill.net