DBI connect failed: FATAL: sorry, too many clients already (database, perl, postgresql, cron)


I am running the following crontab:

* 1 * * * /var/fdp/reportingscript/an_outgoing_tps_report.pl
* 1 * * * /var/fdp/reportingscript/an_processed_rule_report.pl
* 1 * * * /var/fdp/reportingscript/sdp_incoming_traffic_tps_report.pl
* 1 * * * /var/fdp/reportingscript/en_outgoing_tps_report.pl
* 1 * * * /var/fdp/reportingscript/en_processed_rule_report.pl
* 1 * * * /var/fdp/reportingscript/rs_incoming_traffic_report.pl
* 1 * * * /var/fdp/reportingscript/an_summary_report.pl
* 1 * * * /var/fdp/reportingscript/en_summary_report.pl
* 1 * * * /var/fdp/reportingscript/user_report.pl
and I get an error. The error is the same for all of the scripts:

DBI connect('dbname=scs;host=192.168.18.23;port=5432','postgres',...) failed: FATAL: sorry, too many clients already at /var/fdp/reportingscript/sdp_incoming_traffic_tps_report.pl line 38

Also, if I run the scripts manually one at a time, no error is shown.

For your reference, I am attaching one of the scripts that shows the above error:

#!/usr/bin/perl

use strict;
use FindBin;
use lib $FindBin::Bin;
use Time::Local;
use warnings;
use DBI;
use File::Basename;
use CONFIG;
use Getopt::Long;
use Data::Dumper;

my $channel;
my $circle;
my $daysbefore;
my $dbh;
my $processed;
my $discarded;
my $db_name     = "scs";
my $db_vip      = "192.168.18.23";
my $db_port     = "5432";
my $db_user     = "postgres";
my $db_password = "postgres";
#### code to redirect all console output in log file
my ( $seco_, $minu_, $hrr_, $moday_, $mont_, $years_ ) = localtime(time);
$years_ += 1900;
$mont_  += 1;
my $timestamp = sprintf( "%d%02d%02d", $years_, $mont_, $moday_ );
$timestamp .= "_" . $hrr_ . "_" . $minu_ . "_" . $seco_;
print "timestamp is $timestamp \n";
my $logfile = "/var/fdp/log/reportlog/sdp_incoming_report_$timestamp";
print "\n output files is " . $logfile . "\n";
open( STDOUT, ">", $logfile ) or die("$0:dup:$!");
open STDERR, ">&STDOUT" or die "$0: dup: $!";

my ( $sec_, $min_, $hr_, $mday_, $mon_, $year_ ) = localtime(time);

$dbh = DBI->connect( "DBI:Pg:dbname=$db_name;host=$db_vip;port=$db_port",
    "$db_user", "$db_password", { 'RaiseError' => 1 } );
print "\n Dumper is " . $dbh . "\n";
my $sthcircle = $dbh->prepare("select id,name from circle");
$sthcircle->execute();

while ( my $refcircle = $sthcircle->fetchrow_hashref() ) {
    print "\n dumper for circle is " . Dumper($refcircle);
    my $namecircle = uc( $refcircle->{'name'} );
    my $idcircle   = $refcircle->{'id'};
    $circle->{$namecircle} = $idcircle;
    print "\n circle name : " . $namecircle . "id is " . $idcircle;
}

sub getDate {
    my $daysago = shift;
    $daysago = 0 unless ($daysago);
    my @months = qw(Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec);
    my ( $sec, $min, $hour, $mday, $mon, $year, $wday, $yday, $isdst ) = localtime( time - ( 86400 * $daysago ) );
    # YYYYMMDD, e.g. 20060126
    $year_ = $year + 1900;
    $mday_ = $mday;
    $mon_  = $mon + 1;
    return sprintf( "%d-%02d-%02d", $year + 1900, $mon + 1, $mday );
}

GetOptions( "d=i" => \$daysbefore );

my $filedate = getDate($daysbefore);
print "\n filedate is $filedate \n";
my @basedir = CONFIG::getBASEDIR();
print "\n array has basedir" . Dumper(@basedir);
$mon_  = "0" . $mon_  if ( defined $mon_  && $mon_ <= 9 );
$mday_ = "0" . $mday_ if ( defined $mday_ && $mday_ <= 9 );

foreach (@basedir) {
    my $both = $_;
    print "\n dir is $both \n";
    for ( keys %{$circle} ) {
        my $path     = $both;
        my $circleid = $_;
        print "\n circle is $circleid \n";
        my $circleidvalue = $circle->{$_};
        my $file_csv_path = "/opt/offline/reports/$circleid";
        my %sdp_hash      = ();
        print "\n file is $file_csv_path csv file \n";
        if ( -d "$file_csv_path" ) {
        } else {
            mkdir( "$file_csv_path", 0755 );
        }

        my $csv_new_file
            = $file_csv_path
            . "\/FDP_"
            . $circleid
            . "_SDPINCOMINGTPSREPORT_"
            . $mday_ . "_"
            . $mon_ . "_"
            . $year_ . "\.csv";
        print "\n file is $csv_new_file \n";
        print "\n date:$year_-$mon_-$mday_ \n";

        open( DATA, ">>", $csv_new_file );
        $path = $path . $circleid . "/Reporting/EN/Sdp";
        print "\n *****path is $path \n";
        my @filess = glob("$path/*");

        foreach my $file (@filess) {
            print "\n Filedate ---------> $filedate file is $file \n";
            if ( $file =~ /.*_sdp.log.$filedate-*/ ) {
                print "\n found file for $circleid \n";
                my $x;
                my $log       = $file;
                my @a         = split( "-", $file );
                my $starttime = $a[3];
                my $endtime   = $starttime;
                my $sdpid;
                my $sdpid_value;
                $starttime = "$filedate $starttime:00:00";
                $endtime   = "$filedate $endtime:59:59";
                open( FH, "<", "$log" ) or die "cannot open < $log: $!";

                while (<FH>) {
                    my $line = $_;
                    print "\n line is $line \n";
                    chomp($line);
                    $line =~ s/\s+$//;
                    my @a = split( ";", $line );
                    $sdpid = $a[4];
                    my $stat = $a[3];
                    $x->{$sdpid}->{$stat}++;
                }
                close(FH);
                print "\n Dumper is x:" . Dumper($x) . "\n";
                foreach my $sdpidvalue ( keys %{$x} ) {
                    print "\n sdpvalue us: $sdpidvalue \n";
                    if ( exists( $x->{$sdpidvalue}->{processed} ) ) {
                        $processed = $x->{$sdpidvalue}->{processed};
                    } else {
                        $processed = 0;
                    }
                    if ( exists( $x->{$sdpidvalue}->{discarded} ) ) {
                        $discarded = $x->{$sdpidvalue}->{discarded};
                    } else {
                        $discarded = 0;
                    }
                    my $sth_new1 = $dbh->prepare("select id from sdp_details where sdp_name='$sdpid' ");
                    print "\n sth new is " . Dumper($sth_new1);
                    $sth_new1->execute();
                    while ( my $row1 = $sth_new1->fetchrow_hashref ) {
                        $sdpid_value = $row1->{'id'};
                        print "\n in hash rowref from sdp_details table " . Dumper($sdpid_value);
                    }
                    my $sth_check
                        = $dbh->prepare(
                        "select processed,discarded from sdp_incoming_tps where circle_id='$circleidvalue' and sdp_id='$sdpid_value' and start_time='$starttime' and end_time='$endtime'"
                        );
                    print "\n Dumper for bhdatabase statement is " . Dumper($sth_check);
                    $sth_check->execute();
                    my $duplicate_row = 0;
                    my ( $success_, $failure_ );
                    while ( my $row_dup = $sth_check->fetchrow_hashref ) {
                        print "\n row_dup is " . Dumper($row_dup);
                        $duplicate_row = 1;
                        $success_ += $row_dup->{'processed'};
                        $failure_ += $row_dup->{'discarded'};
                    }
                    if ( $duplicate_row == 0 ) {
                        my $sth
                            = $dbh->prepare(
                            "insert into sdp_incoming_tps (id,circle_id,start_time,end_time,processed,discarded,sdp_id) select nextval('sdp_incoming_tps_id'),'$circleidvalue','$starttime','$endtime','$processed','$discarded','$sdpid_value' "
                            );
                        $sth->execute();
                    } else {
                        $success_ += $processed;
                        $failure_ += $discarded;
                        my $sth
                            = $dbh->prepare(
                            "update sdp_incoming_tps set processed=$success_,discarded=$failure_ where circle_id='$circleidvalue' and sdp_id='$sdpid_value' and start_time='$starttime' and end_time='$endtime'"
                            );
                        $sth->execute();
                    }
#                    my $file_csv_path = "/opt/offline/reports/$circleid";
#                    my %sdp_hash      = ();
#                    if ( -d "$file_csv_path" ) {
#                    } else {
#                        mkdir( "$file_csv_path", 0755 );
#                    }
#                    my $csv_new_file = $file_csv_path . "\/FDP_" . $circleid . "_SDPINCOMINGTPSREPORT_". $mday_ . "_" . $mon_ . "_" . $year_ . "\.csv";
                    print "\n file is $csv_new_file \n";
                    print "\n date:$year_-$mon_-$mday_ \n";
                    close(DATA);
                    open( DATA, ">>", $csv_new_file ) or die("cant open file : $! \n");
                    print "\n csv new file is $csv_new_file \n";
                    my $sth_new2 = $dbh->prepare("select * from sdp_details");
                    $sth_new2->execute();

                    while ( my $row1 = $sth_new2->fetchrow_hashref ) {
                        my $sdpid = $row1->{'id'};
                        $sdp_hash{$sdpid} = $row1->{'sdp_name'};
                    }
                    #print "\n resultant sdp hash".Dumper(%sdp_hash);
                    #$mon_="0".$mon_;
                    print "\n timestamp being matched is $year_-$mon_-$mday_ \n";
                    print "\n circle id value is $circleidvalue \n";
                    my $sth_new
                        = $dbh->prepare(
                        "select * from sdp_incoming_tps where date_trunc('day',start_time)='$year_-$mon_-$mday_' and circle_id='$circleidvalue'"
                        );
                    $sth_new->execute();
                    print "\n final db line is " . Dumper($sth_new);
                    my $str     = $sth_new->{NAME};
                    my @str_arr = @$str;
                    shift(@str_arr);
                    shift(@str_arr);
                    my @upper = map { ucfirst($_) } @str_arr;
                    $upper[4] = "Sdp-Name";
                    my $st = join( ",", @upper );
                    $st = $st . "\n";
                    $st =~ s/\_/\-/g;
                    #print $fh "sep=,"; print $fh "\n";

                    print DATA $st;
                    while ( my $row = $sth_new->fetchrow_hashref ) {

                        print "\n found matching row \n";
                        my $row_line
                            = $row->{'start_time'} . ","
                            . $row->{'end_time'} . ","
                            . $row->{'processed'} . ","
                            . $row->{'discarded'} . ","
                            . $sdp_hash{ $row->{'sdp_id'} } . "\n";
                        print "\n row line matched is " . $row_line . "\n";
                        print DATA $row_line;
                    }
                    close(DATA);
                }
            } else {
                next;
            }
        }
    }
}

$dbh->disconnect;
Please help: how can I avoid this error?

Thanks for your help.

As the error message indicates, the immediate problem is that running all of these scripts at once requires more database connections than the server allows. If they run fine individually, then running them individually will take care of that.

The underlying problem is that your crontab is wrong. * 1 * * * runs all of the scripts every minute between 01:00 and 01:59, every day. If they take more than a minute to complete, a new set of runs starts before the previous set has finished, requiring an additional set of database connections, which will burn through the available connection pool fairly quickly.

I assume you only want your daily scripts to run once per day, not 60 times, so change that to 5 1 * * * so that they run only once, at 01:05.

If you still have problems after that, it is probably a good idea to run each script at a different time:

5 1 * * * /var/fdp/reportingscript/an_outgoing_tps_report.pl
10 1 * * * /var/fdp/reportingscript/an_processed_rule_report.pl
15 1 * * * /var/fdp/reportingscript/sdp_incoming_traffic_tps_report.pl
20 1 * * * /var/fdp/reportingscript/en_outgoing_tps_report.pl
25 1 * * * /var/fdp/reportingscript/en_processed_rule_report.pl
30 1 * * * /var/fdp/reportingscript/rs_incoming_traffic_report.pl
35 1 * * * /var/fdp/reportingscript/an_summary_report.pl
40 1 * * * /var/fdp/reportingscript/en_summary_report.pl
45 1 * * * /var/fdp/reportingscript/user_report.pl
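If staggering the start times is not enough (a long report could still overrun its five-minute slot), another common approach is to serialize each job with flock(1), so that a new run is skipped while the previous one still holds the lock. A sketch, assuming util-linux flock is installed and /var/lock is writable (the lock-file names are illustrative):

```
5 1 * * * flock -n /var/lock/an_outgoing_tps.lock /var/fdp/reportingscript/an_outgoing_tps_report.pl
5 1 * * * flock -n /var/lock/user_report.lock /var/fdp/reportingscript/user_report.pl
```

With -n, an overlapping run exits immediately instead of opening yet another database connection on top of the previous one.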


Comments:

- What value is assigned to max_connections in postgresql.conf? – Winged
- @Winged, hi, I don't know how to check that; could you tell me how? Thanks.
- Are you using pgAdmin or some other tool? In psql: SHOW max_connections; – Winged
- I use: psql -U postgres -h 192.168.18.23 -d scs
- Or run them sequentially: 1 1 * * * cd /var/fdp/reportingscript && ls ./*.pl | sh
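To act on the suggestion above, you can compare the configured connection cap with the sessions that are actually open; run these in psql while the scripts are running to see how close you are to the limit:

```sql
-- Configured connection limit (set via max_connections in postgresql.conf)
SHOW max_connections;

-- Sessions currently open, grouped by database and user
SELECT datname, usename, count(*)
FROM pg_stat_activity
GROUP BY datname, usename
ORDER BY count(*) DESC;
```

If the count from pg_stat_activity approaches max_connections while the cron jobs run, that confirms the overlapping runs are exhausting the pool.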