MySQL + DBI problem

uttamhoode

hi all,
I have a Perl script which reads a CSV file and inserts the values
into a MySQL database.
The CSV file has around 75000 lines.

When I execute the script, only around 31000 to 37000 (varies each time)
lines are inserted... I am using a remote MySQL server.


Regards,
uttam hoode
 
usenet

When I execute the script, only around 31000 to 37000 (varies each time)
lines are inserted... I am using a remote MySQL server.

You have an error on line 42.

And you're not checking your SQL return messages on lines 92, 111, and
141 (hint, hint).
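
Something like this on every statement, for example (just a sketch; $qry
is whatever INSERT string you build):

# With RaiseError off, do() returns undef on failure and the reason
# is in $dbh->errstr -- check it every time, not just at the end.
my $rows = $dbh->do($qry);
if (!defined $rows) {
    warn "INSERT failed: " . $dbh->errstr . "\n";
}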
 
Paul Lalli

I have a Perl script which reads a CSV file and inserts the values
into a MySQL database.
The CSV file has around 75000 lines.

When I execute the script, only around 31000 to 37000 (varies each time)
lines are inserted... I am using a remote MySQL server.

What kind of answer are you expecting to this question? Obviously,
you've done something wrong. But you've shown us absolutely no code
and no data, so how can we possibly help you to find what you've done
wrong?

Please read the Posting Guidelines for this group. They are posted
here twice a week. They will give you information on how to ask a good
question in this group.

Paul Lalli
 
uttamhoode

ok ok :) sorry for not posting the code

The number of lines inserted varies on each execution.

#===============================
#!/usr/bin/perl


use Socket;
use DBI;
use CGI qw/:standard/;

$query=new CGI;

print $query->header;


# Connect to the database.
my $dbh = DBI->connect("DBI:mysql:database=databasename;host=hostname",
                       "username", "password",
                       {'RaiseError' => 1}) || print "ERROR CONNECTING";






open(FREADER,"csvfile.csv") or print "ERROR OPENING FILE \n";
if(!($error))
{
my $ln;
my $separator=$/;
undef $/;
$buff=<FREADER>;
$/=$separator;
undef $ln;
close FREADER;
}
my $ln;
my $i=1;


foreach $ln (split("\n",$buff))
{
$buff=~/REG EXP FOR SPLITTING EACH LINE/g;


my $qry="INSERT INTO IP(A,B, C, D, E, F, G, H )
VALUES('$i','$1','$2','$8','$9','$14','$17',".$dbh->quote($20).")";
$dbh->do($qry);


$i++;
}

print $i;
my $errstr=$dbh->errstr;
print $errstr;


$sth->finish();

# Disconnect from the database.
$dbh->disconnect();



close(MAILREADER);
 
Dr.Ruud

(e-mail address removed) wrote:
I have a Perl script which reads a CSV file and inserts the values
into a MySQL database.
The CSV file has around 75000 lines.

When I execute the script, only around 31000 to 37000 (varies each time)
lines are inserted... I am using a remote MySQL server.

This is not a Perl question.

Very probably, MySQL is dropping some of the inserts because of a
constraint, like a primary key value you are trying to reuse.
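
If so, checking each statement makes it visible. A rough sketch, reusing
the $dbh and $qry from the posted script and assuming RaiseError stays on:

# A duplicate primary key makes do() die under RaiseError, so trap it
# and inspect the native MySQL error code instead of losing the row silently.
eval { $dbh->do($qry) };
if ($@) {
    if ($dbh->err && $dbh->err == 1062) {    # ER_DUP_ENTRY
        warn "duplicate key, row skipped\n";
    }
    else {
        die $@;    # some other failure -- don't hide it
    }
}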
 
Paul Lalli

ok ok :) sorry for not posting the code

The number of lines inserted varies on each execution.

#===============================
#!/usr/bin/perl

you forgot:
use strict;
use warnings;

Those two lines should be in every script you write. They catch 95% of
errors you are likely to make - including a couple you made below.
use Socket;
use DBI;
use CGI qw/:standard/;

$query=new CGI;

print $query->header;


# Connect to the database.
my $dbh = DBI->connect("DBI:mysql:database=databasename;host=hostname",
                       "username", "password",
                       {'RaiseError' => 1}) || print "ERROR CONNECTING";

So if you can't connect, you print an error, and then keep on going
regardless?
open(FREADER,"csvfile.csv") or print "ERROR OPENING FILE \n";

So if you can't open the file, you print an error and keep on going,
regardless?

Also:
1) Use the three-argument form of open
2) Use lexical filehandles, not global barewords
3) State the *reason* the open failed

open my $FREADER, '<', 'csvfile.csv' or die "Error opening file: $!\n";
if(!($error))

$error does not exist. It is therefore undefined. Therefore, this if
statement is *always* true.
{
my $ln;

You never use this variable.
my $separator=$/;
undef $/;
$buff=<FREADER>;
$/=$separator;

Replace this entire mess with:
{
local $/;
$buff = <FREADER>;
}

The local() automatically restores $/ to its original value after the
block ends.
undef $ln;

Now you undefine that variable you never used, which of course was
already undefined - because you never used it.
close FREADER;
}
my $ln;

... and now you declare another variable with the same name as that
never-used variable above.
my $i=1;


foreach $ln (split("\n",$buff))

Waitaminute. You went through all that mess just to read the file into
one big scalar - and now you're processing that scalar on each "line"
within the file?! WHY?!

Get rid of that entire nonsense above, and change this loop to:

while (my $ln = <$FREADER>)
{
$buff=~/REG EXP FOR SPLITTING EACH LINE/g;

Ah, so here you want us to believe that your regular expression matched
just fine, but have decided it's not worth showing us the code. Have
you considered the possibility that you have an error here?

Do you also realize that you have the current "line" in $ln, but you
are matching against the entire $buff?

my $qry="INSERT INTO IP(A,B, C, D, E, F, G, H )
VALUES('$i','$1','$2','$8','$9','$14','$17',".$dbh->quote($20).")";

NEVER ever use the values $1, $2, etc without ensuring that the pattern
match succeeded. If it didn't, all those numbers are whatever they
were last time the pattern match did succeed.
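
For example (a sketch; the pattern is still whatever you really use, and
this assumes you are now reading line by line so $. holds the current
line number):

unless ($ln =~ /REG EXP FOR SPLITTING EACH LINE/) {
    warn "Line $. did not match the expected format, skipping\n";
    next;
}
# only here is it safe to use $1, $2, ... $20
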
$dbh->do($qry);

You should really *really* be preparing this insert once, using
placeholders, and then executing it multiple times.
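
Roughly like this -- a sketch only, keeping your column names and captures:

# prepare once, before the loop...
my $sth = $dbh->prepare(
    'INSERT INTO IP (A, B, C, D, E, F, G, H) VALUES (?, ?, ?, ?, ?, ?, ?, ?)'
);

# ...then inside the loop just bind the values.  Placeholders are
# quoted for you, so the $dbh->quote() call goes away as well.
$sth->execute($i, $1, $2, $8, $9, $14, $17, $20);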


$i++;

This counter variable is unnecessary, as you should now be reading the
file one line at a time, and the line number is stored in the $.
variable for you.
}

print $i;
my $errstr=$dbh->errstr;

You realize this errstr is only valid for the *last* time you tried to
insert, ignoring all the previous attempts, right?
print $errstr;


$sth->finish();

# Disconnect from the database.
$dbh->disconnect();

I strongly suggest you get rid of that massive regular expression that
apparently has at least 20 sub-captures, and use a module made for
parsing CSV files, like say Text::CSV. You can find it on the CPAN:
http://search.cpan.org
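
An untested sketch of what that could look like, assuming your captures
map one-to-one onto comma-separated columns and reusing the prepared
$sth from above:

use Text::CSV;

my $csv = Text::CSV->new({ binary => 1 })
    or die "Cannot use Text::CSV: " . Text::CSV->error_diag;

open my $fh, '<', 'csvfile.csv' or die "Error opening file: $!";
while (my $row = $csv->getline($fh)) {
    # $row is an array reference of this line's fields, with quoting
    # and embedded commas already handled for you
    $sth->execute($., @{$row}[0, 1, 7, 8, 13, 16, 19]);
}
close $fh;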

Paul Lalli
 
DJ Stunks

hi all,
I have a Perl script which reads a CSV file and inserts the values
into a MySQL database.
The CSV file has around 75000 lines.

By the way, you can load a CSV file directly into a MySQL database
with "LOAD DATA [LOCAL] INFILE", and it will be orders of magnitude
faster than inserting line by line...
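
From DBI that might look something like this (a sketch; it assumes
DBD::mysql and the server allow local infile, and that the file's column
order matches the table):

use DBI;

# local-infile support has to be switched on for the client connection
my $dbh = DBI->connect(
    'DBI:mysql:database=databasename;host=hostname;mysql_local_infile=1',
    'username', 'password', { RaiseError => 1 },
);

my $rows = $dbh->do(q{
    LOAD DATA LOCAL INFILE 'csvfile.csv'
    INTO TABLE IP
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
});
print "$rows rows loaded\n";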

-jp
 
