I was trying to generate an output file on the fly with my code in Perl:

#!/usr/bin/perl 
use strict;
use Data::Dumper;
use warnings;  
use Bio::Seq;
use Bio::SeqIO; 
use Bio::DB::Fasta;
open my $OUT, "> ./temporal.gff";
            my $seq    = $db->get_Seq_by_id($key);    # sequence record for this ID
            my $length = $seq->length;
            my $all    = $coord{$key};
            foreach my $keys (@$all[0]) {
                    @sorted = sort { $a->[0] <=> $b->[0] } @$all;    # sort coordinate pairs by start
                    $num    = scalar @sorted;
                    my @new = &infer_gaps(\@sorted, $num, $length);
                    foreach my $ln (@new) {
                            my @tmp = split /\s{2,}|\t/, $ln;    # split on tabs or runs of 2+ spaces
                            print $OUT "$key\tFillSequencer\tinference\t$tmp[0]\t$tmp[1]\t\.\t$tmp[2]\t\.\t$specie $key:$tmp[0]\-$tmp[1]\n";
                    }
                    #print Dumper \@new;
            }
        }

But when I want to read the generated file in the usual way, with a while loop, it does not read the file:

open my $ABC, "<./temporal.gff" or die $!;
    my $mRNA_seq;
    while (my $l = <$ABC>){
            chomp $l;
            my @arr = split /\t|\s+/, $l;
            #my $coor = $arr[0] . " " . $arr[3] . " " . $arr[4];
            my $frame = $arr[6];    # column 7 (strand) of the generated GFF line
            my $final_seq = $db->seq( $arr[0], $arr[3], $arr[4] );
            my $output_nucleotide = Bio::Seq->new(
                    -seq => $final_seq,
                    -id  => $mRNA_name,
                    -display_id => $mRNA_name,
                    -alphabet => 'dna',
            );
            if ($frame eq '-') {
                    $output_nucleotide = $output_nucleotide->revcom();
            }
            $outfile_coor->write_seq($output_nucleotide);
    }

    close $ABC;

Do you have any ideas? Running my code with the -d flag let me confirm that the code skips the last while loop... I do not know if there is a problem with the definition of the filehandle variable, or some other problem that does not generate warnings. Thanks in advance!

2 Comments

  • You're forgetting to check some of your open calls. Have a look into use autodie. Comes standard now (see the sketch below). Commented Nov 6, 2016 at 19:59
  • Yeah, you are right! The solution is to close $OUT where the output file is created. Thanks a lot! Commented Nov 6, 2016 at 20:28
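
Regarding the first comment: a small sketch of what use autodie buys here (only an illustration applied to the open from the question; autodie makes open, close and most other built-ins throw an exception with a useful message when they fail, so no explicit "or die" is needed):

use strict;
use warnings;
use autodie;    # open, close and friends now die with a useful message on failure

# No "or die $!" needed any more; a failed open stops the program immediately:
open my $OUT, '>', './temporal.gff';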

1 Answer

I think you need to close the file before you start reading it.
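
Applied to the code in the question, that looks roughly like this (a sketch; the writing loop, $db and the file name are taken from the question):

# ... the loop that prints the GFF lines into $OUT ...

close $OUT or die "Cannot close temporal.gff: $!";    # close flushes the buffers to disk

# Only now open the file for reading:
open my $ABC, '<', './temporal.gff' or die $!;
while (my $l = <$ABC>) {
    chomp $l;
    # ... build and write the sequences as in the question ...
}
close $ABC;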

Alternatively, you can flush the output to make sure the file is completely written before you start reading it.

# Flush a filehandle immediately without leaving auto-flush turned on
sub flush {
    my $h  = select($_[0]);   # temporarily select the handle we want to flush
    my $af = $|;              # remember its current auto-flush setting
    $| = 1;                   # setting $| to a true value forces a flush right away
    $| = $af;                 # restore the previous setting
    select($h);               # restore the previously selected handle
}

# Flush immediately without turning auto-flush on:
flush($OUT);

See this answer for examples of how to use flush:

http://www.perlmonks.org/bare/?node_id=489962


6 Comments

  • Is closing and flushing redundant? Should it be close or flush? Also, flushing can be handled much more nicely with use IO::Handle; $OUT->flush;. Once IO::Handle is loaded you can use method calls on all filehandles.
  • File output is buffered by default, even on a close. It's done this way for performance reasons: the file writing is done at the operating system's leisure. A flush is the only way to ensure it is completely on the disk.
  • No, close also flushes. From perldoc: "Closes the file or pipe associated with the filehandle, flushes the IO buffers, and closes the system file descriptor."
  • OK, my apologies. I have used flush previously to ensure that the contents are written along the way, so that if the program dies the work done so far is saved.
  • You don't even need use IO::Handle; anymore to do $OUT->flush (or $OUT->autoflush).
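
For completeness, a sketch of the method-call style mentioned in the comments above (the handle name $OUT is the one from the question; on older Perls load IO::Handle explicitly, newer Perls load it on demand when a method is called on a filehandle):

use IO::Handle;        # harmless everywhere, only strictly needed on older Perls

$OUT->flush;           # flush buffered output now; auto-flush stays off
# or:
$OUT->autoflush(1);    # flush automatically after every print from here on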