I am trying to write a small Python script that reads data from a serial port and simply displays it on the screen. It seems to be working, but there is a delay before the data appears on the screen.
import serial
import sys

SerialGF = serial.Serial(port='/dev/ttyAMA0', baudrate=115200, parity='N',
                         stopbits=1, xonxoff=0, rtscts=0, timeout=0)

def main():
    # openSerial_GF()
    print SerialGF.isOpen()
    SerialGF.flush()
    SerialGF.flushInput()
    SerialGF.flushOutput()
    try:
        while True:
            readSerial_GF()
    except KeyboardInterrupt:
        pass
    SerialGF.close()
    print SerialGF.isOpen()

# def openSerial_GF():
#     global SerialGF
#     SerialGF = serial.Serial(port='/dev/ttyAMA0', baudrate=115200, parity='N',
#                              stopbits=1, xonxoff=0, rtscts=0, timeout=0)

def readSerial_GF():
    s = SerialGF.read(SerialGF.inWaiting())
    sys.stdout.write(s)

if __name__ == "__main__":
    main()
The data on the serial port is a stream of dash characters, '-', until some event occurs. What I am seeing is a delay before the data is displayed on the screen. What is odd is that 1024 characters are displayed at a time. I have set the timeout to zero (0), so the read should return straight away, but it isn't. Does anyone have any ideas why there is a delay?
Thanks, Mark
The delay is in sys.stdout, not in the serial read. Can you run in unbuffered mode (python -u) to see if the problem persists? Here's another question which digs into the buffering aspect.

Alternatively, call sys.stdout.flush() after the write. The buffering is explained in the open docs (not the specific details of how sys.stdout is opened, which obviously happens before you even get control). On many systems, the "system default" is to buffer until either a newline is printed or some maximum buffer length (like 1024 bytes) is reached, which seems like exactly what you're seeing.
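A minimal sketch of the flush fix, with the serial port replaced by an ordinary stream so it runs standalone (in the original script, the equivalent change is adding sys.stdout.flush() after sys.stdout.write(s) in readSerial_GF):

    import sys

    def write_unbuffered(stream, data):
        # Write the chunk and flush immediately, so output appears
        # right away instead of waiting for stdout's block buffer
        # (often ~1024 bytes when stdout is not a terminal) to fill.
        stream.write(data)
        stream.flush()

    if __name__ == "__main__":
        # Each dash shows up as soon as it is written.
        for _ in range(5):
            write_unbuffered(sys.stdout, "-")

The other option, python -u, achieves the same effect process-wide without touching the code, but the explicit flush keeps the behaviour the same no matter how the script is launched.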