
I'm having a problem with a Perl script that uses the DBI module (version 1.601) to connect to Oracle 8. The OS is FreeBSD 7.0.

The script just opens a connection, loops through a CSV file inserting data one row/line at a time, then closes the connection and exits. A cron job executes this script every 30 minutes, processing any incoming CSV files. On occasion, the script stops processing midway through (no new data is inserted and it consumes no more CPU time), and it becomes necessary to kill it.

There appears to be a size threshold for the CSV file. With anything under 35,000 rows, the script executes and exits OK; otherwise it stalls and the process sits (seemingly) dormant, waiting.

I only have remote access to Oracle 8 through SQL*Plus. I checked the Oracle v$session view and see that the connections for these stalled processes don't get closed, so perhaps I'm hitting some resource limit? Running 'limits' on FreeBSD yields the following:

cputime          infinity secs
filesize         infinity kB
datasize           524288 kB
stacksize           65536 kB
coredumpsize     infinity kB
memoryuse        infinity kB
memorylocked     infinity kB
maxprocesses         5547
openfiles           11095
sbsize           infinity bytes
vmemoryuse       infinity kB

I'm unsure how to proceed. How or where can I look to narrow down the problem?

Any help is appreciated.


Here is the script with changes to some var names and input checking code omitted:

#!/usr/bin/perl -w
use DBI;
use strict;

my $exitstatus = 0;
my $infile = shift;
open (INFILE, "$infile");

my $dbh = DBI->connect(
  'dbi:Oracle:sid=mysid;host=myhostname',
  'myusername',
  'mypassword',
  {
    RaiseError => 1,
    AutoCommit => 1
  }
) or die $DBI::errstr;

while (<INFILE>) {
  chomp;
  next if (/^#.*$/ || /^$/);
  my @columns = split /\t/;
  my ($var1, $var2, $var3) = @columns;
  eval {
    my $oProce = qq{
      BEGIN
        InsertStoredProc(
        field1 => $var1,
        field2 => $var2,
        field3 => $var3
        );
      END;
    };
    $dbh->do( $oProce );
  };
  if ( $@ ) {
    $exitstatus=1;
    warn "LINE: @columns\n";
    warn "Execution of stored procedure failed: $DBI::errstr\n";
    warn "################################################################################\n";
  }
}
$dbh->disconnect;
close INFILE;
exit $exitstatus;
    Please include some portion of your script. Commented Nov 8, 2012 at 1:50
  • Appended code to original post. Let me know what you think. Commented Nov 8, 2012 at 5:28

2 Answers


Not that I think it's relevant, but that code is pretty inefficient. You are parsing your SQL for every row inserted. Why not do:

my $sth = $dbh->prepare('BEGIN InsertStoredProc(?,?,?); END;');
while (<INFILE>) {
  chomp;
  next if (/^#.*$/ || /^$/);
  my @columns = split /\t/;
  eval {
    $sth->execute(@columns);
  };
  if ( $@ ) {
    $exitstatus=1;
    warn "LINE: @columns\n";
    warn "Execution of stored procedure failed: $DBI::errstr\n";
    warn "################################################################################\n";
  }
}

The other suggestion, turning off AutoCommit, will also speed up your code, as you will only be committing every N rows instead of every row. Why using AutoCommit should cause a hang makes no sense to me, though.

As for the point where it appears to hang: if you can reproduce it at will, try running the script with DBI_TRACE=15=x.log and set ora_verbose => 6 in the connect attributes. What is at the end of the log file when it hangs?
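For reference, DBI_TRACE is an environment variable, so the run would look like `DBI_TRACE=15=x.log perl yourscript.pl file.csv`. The same tracing can also be turned on from inside the script with the DBI->trace class method; a minimal sketch, reusing the connect parameters from the question (the log file name x.log is just an example):

```perl
use strict;
use warnings;
use DBI;

# Equivalent to running with DBI_TRACE=15=x.log in the environment:
# trace level 15, written to x.log.
DBI->trace( 15, 'x.log' );

my $dbh = DBI->connect(
    'dbi:Oracle:sid=mysid;host=myhostname',
    'myusername',
    'mypassword',
    {
        RaiseError  => 1,
        AutoCommit  => 1,
        ora_verbose => 6,   # DBD::Oracle driver-level verbose logging
    }
) or die $DBI::errstr;
```

The tail of x.log after a hang should show the last DBI/DBD call that was in flight, which narrows the stall down to the client, the network, or the server side.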


2 Comments

There's another advantage to this in that it uses a parameterized query so that it's impossible for any value to hijack the query.
Thanks. I'll try all the suggestions posted. Just inherited these systems and much of it is suboptimal.

Did you try without AutoCommit? Maybe you create too many transactions and Oracle stops processing further requests.

Try committing every 100 rows, and once more when there is no more data.
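A minimal sketch of that batching pattern, combined with the prepared statement from the other answer; the DSN, credentials, stored-procedure name, and batch size of 100 are illustrative, taken from the question rather than tested against this database:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# AutoCommit off: rows are grouped into explicit transactions.
my $dbh = DBI->connect(
    'dbi:Oracle:sid=mysid;host=myhostname',   # DSN as in the question
    'myusername',
    'mypassword',
    { RaiseError => 1, AutoCommit => 0 }
) or die $DBI::errstr;

my $sth  = $dbh->prepare('BEGIN InsertStoredProc(?,?,?); END;');
my $rows = 0;

while (<>) {
    chomp;
    next if /^#/ || /^$/;          # skip comments and blank lines
    $sth->execute( split /\t/ );
    $dbh->commit if ++$rows % 100 == 0;   # commit every 100 rows
}
$dbh->commit;       # commit any remaining partial batch
$dbh->disconnect;
```

With RaiseError set, a failed execute dies before the commit, so a bad batch is rolled back as a unit when the script exits rather than leaving a half-inserted row committed.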

You could use truss -p <pid> (FreeBSD's equivalent of strace -p) to see what's going on and where your script stalls.

I hope this helps.

