
Input file (i.e. happy.txt, generated from the happy command):

NAME     PEND RUN SUSP JLIM JLIMR   RATE   HAPPY
akawle   8    20  0    100  20:100  67980  71%
akumar6  16   0   0    100  100     0      0%
apatil2  2    4   0    100  10:100  20398  67%
ashetty  0    3   0    100  40:100  9725   100%
bdash    2    0   0    100  100     0      0%

Code:

gen_ls_data();

sub gen_ls_data {
    my ($lines) = @_;
    my $header_found = 0;
    my @headers   = ();
    my $row_count = 0;
    my %row_data  = ();

    # capture the output of the happy command, and also write it to a file
    $lines = `happy`;
    system("happy > happy.txt");

    my $filename = 'happy.txt';
    open(my $fh, '<', $filename) or die "Could not open file '$filename' $!";
    print $fh $lines;
    close $fh;

    foreach (split("\n", $lines)) {
        # the header line marks the start of the data block
        if (/NAME\s*PEND/) {
            $header_found = 1;
            @headers = split;
            next;
        }

        if ($header_found == 0) { }
        else {
            # a blank line marks the end of the data block
            if (/^\s*$/) {
                $header_found = 0;
                next;
            }

            $row_data{$row_count++} = $_;
        }
    }
}

How can I read happy.txt directly in the foreach loop, instead of going through the $lines variable that holds the Linux command's output?

  • What exactly are you trying to do with the lines once you've loaded them into a perl data structure? I'm having trouble following. Commented Jan 31, 2017 at 5:40
  • I used the happy command and stored its output in happy.txt. Then I used read operations to read the file. To split the contents inside the foreach loop I used the variable that stores the Linux command's output (i.e. $lines). Now I want to use the happy.txt file directly inside the foreach loop. Is it possible to modify my code? @GregoryNisbet Commented Jan 31, 2017 at 5:49
  • I'm confused - you open $filename for reading ('<'), and then you print to it? I'm not following your intentions here at all. Commented Jan 31, 2017 at 5:59
  • Me too, confused. (1) You receive $lines from arguments. (2) But you also get $lines as output of happy. (3) Then you also open 'happy.txt' to read those same lines, yet again ... (4) but then you write to that file. If you need $lines for foreach, either take them as a function argument (and don't worry about files), or get them by executing happy under backticks, or execute happy via system and redirect the output to a file, which you then read. One of these. Commented Jan 31, 2017 at 6:34

1 Answer


It seems that you simply need $lines for processing in the foreach loop.

Either get them as arguments passed to the function

my ($lines) = @_;
# ...
# no need to run "happy" in any way or to deal with files
foreach (split /\n/, $lines) {
    # ...
}
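
The caller then supplies the text, for example (a small sketch, using the question's gen_ls_data and the happy command):

my $lines = `happy`;   # or the text obtained in any other way
gen_ls_data($lines);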

or get them as output from happy

# no need for function arguments to contain $lines
my $lines = `happy`;
# no need for system() and reading the file
foreach (split /\n/, $lines) {
    # ...
}

or run system with redirection and read from the file

my $filename = 'happy.txt';
system ("happy > $filename");
open my $fh, '<', $filename or die "Can't open $filename: $!";

while (my $line = <$fh>) {
    # process each line from the file
}

In this case you can nicely read line by line, instead of having $lines and split-ing it.
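
For example, here is a minimal sketch (assuming the happy.txt layout shown in the question) that folds the question's header/row bookkeeping into that while loop:

my $filename = 'happy.txt';
system("happy > $filename") == 0 or die "Can't run happy: $?";

open my $fh, '<', $filename or die "Can't open $filename: $!";

my (@headers, %row_data);
my $header_found = 0;
my $row_count    = 0;

while (my $line = <$fh>) {
    chomp $line;

    # the header line starts the data block
    if ($line =~ /NAME\s+PEND/) {
        $header_found = 1;
        @headers = split ' ', $line;
        next;
    }
    next if not $header_found;

    # a blank line ends the data block
    if ($line =~ /^\s*$/) {
        $header_found = 0;
        next;
    }

    # keep the whole data row, keyed by a running count (as in the question)
    $row_data{$row_count++} = $line;
}
close $fh;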

You can do that with backticks (the qx operator) as well,

my @lines = `happy`;

since in list context it returns the output as a list of lines, not as one scalar holding all of them.
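
For instance, a small sketch of iterating those lines (chomp strips the newline that each element keeps):

my @lines = `happy`;
chomp @lines;                        # remove the trailing newline from each element

foreach my $line (@lines) {
    next if $line =~ /^\s*$/;        # skip blank lines
    my @fields = split ' ', $line;   # break the line into whitespace-separated fields
    # ... process @fields
}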


Note that you should add error checking for all calls that go out to the system (and not only for those!). system returns zero if it ran fine, so you normally do something like

system(...) == 0 or die "system failed: $?";

while qx on error returns undef in scalar context (or an empty list in list context), so these return values can serve as an initial check for problems.

They both also set the $? variable, which you can interrogate for details; see perlvar and the documentation for system and qx. Note that these return values mostly tell you whether the call could run the command at all, not necessarily how well the executed command itself did.
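
A sketch of such a check for backticks (whether to die or merely warn on a non-zero exit status is up to you):

my $lines = `happy`;

die "Can't run happy: $!" if not defined $lines;   # the command could not be started

if ($? != 0) {
    # happy ran, but exited with a non-zero status
    warn "happy exited with status ", ($? >> 8), "\n";
}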


1 Comment

Great explanation. It helps me, @zdim.
