I'm working on object detection from a live video stream using OpenCV in Python. The program currently runs on a single thread, and because of the delay in the detection step, the output shown on screen doesn't even look like a video. So I'm trying to re-implement it using multiple threads: one thread for reading frames, another for showing the detection results, and about 5 threads to run the detection algorithm on multiple frames at once. I have written the following code, but the result is no different from the single-threaded program. I'm new to Python, so any help is appreciated.
import threading, time
import cv2
import queue


def detect_object():
    while True:
        print("get")
        frame = input_buffer.get()
        if frame is not None:
            time.sleep(1)  # stands in for the actual detection work
            detection_buffer.put(frame)
        else:
            break
    return


def show():
    while True:
        print("show")
        frame = detection_buffer.get()
        if frame is not None:
            cv2.imshow("Video", frame)
        else:
            break
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
    return


if __name__ == "__main__":
    input_buffer = queue.Queue()
    detection_buffer = queue.Queue()

    cap = cv2.VideoCapture(0)

    for i in range(5):
        t = threading.Thread(target=detect_object)
        t.start()

    t1 = threading.Thread(target=show)
    t1.start()

    while True:
        ret, frame = cap.read()
        if ret:
            input_buffer.put(frame)
            time.sleep(0.025)
        else:
            break

    print("program ended")
All the detect_object threads do after they get a frame from the queue is sleep for 1 second... hence 5 of them won't do anything better than 5 frames per second.
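To make that arithmetic concrete, here is a minimal sketch (the `worker` and `throughput` names and the `work_time` parameter are illustrative stand-ins, not part of the asker's code) showing that N worker threads that each spend T seconds per item can never exceed N/T items per second, no matter how fast the producer fills the queue:

```python
import queue
import threading
import time


def worker(in_q, out_q, work_time):
    # Mimics detect_object: each item "costs" work_time seconds.
    while True:
        item = in_q.get()
        if item is None:  # sentinel: shut this worker down
            break
        time.sleep(work_time)
        out_q.put(item)


def throughput(num_workers, num_items, work_time):
    in_q, out_q = queue.Queue(), queue.Queue()
    threads = [threading.Thread(target=worker, args=(in_q, out_q, work_time))
               for _ in range(num_workers)]
    for t in threads:
        t.start()

    start = time.monotonic()
    for i in range(num_items):
        in_q.put(i)           # producer is effectively instantaneous here
    for _ in range(num_items):
        out_q.get()           # wait until every item has been processed
    elapsed = time.monotonic() - start

    for _ in threads:
        in_q.put(None)        # one sentinel per worker
    for t in threads:
        t.join()
    return num_items / elapsed


# 5 workers at 0.1 s per item top out around 5 / 0.1 = 50 items/s;
# with work_time = 1 s, as in the question, the ceiling is 5 frames/s.
```

The same sentinel-and-join shutdown would also fix the question's program never actually ending: its main loop never puts `None` into `input_buffer`, so the worker threads block on `get()` forever after "program ended" is printed.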