I have been scouring the Internet for a guide to setting up Django, Celery, and Redis so that Celery tasks run on a remote server, and I cannot find one.
I have three machines - tsunami (development), trout (production), and t-rex (Nvidia card). I am running some image manipulation algorithms using Celery and Redis, with the same Django/Celery code base on all three computers. When I run the algorithms on tsunami or trout without CUDA support, they take hours to complete. When I run the same algorithms on t-rex with CUDA support, they complete in minutes. What I would like to do is offload the Celery tasks from trout and tsunami to t-rex, and return the results to either trout or tsunami. By results, I mean a file to save on the local file system or data added to the local MySQL (8.0.32) database. Is this even possible?
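From what I have read so far, I am guessing the setup might look something like the sketch below: all three machines point their Celery app at the Redis instance on t-rex, and only t-rex runs a worker, so tasks queue and execute there. The hostname `t-rex`, the database numbers, and the task body are my assumptions, not a working configuration:

```python
# Sketch only - my guess at how this might be wired up, not verified.
# All three machines share the Redis on t-rex as broker and result
# backend; only t-rex runs `celery -A imagetasks worker`.
from celery import Celery

app = Celery(
    "imagetasks",
    broker="redis://t-rex:6379/0",   # hostname assumed resolvable from trout/tsunami
    backend="redis://t-rex:6379/1",  # results travel back through Redis
)

@app.task
def process_image(image_bytes):
    # Would run only on t-rex, where dlib has CUDA support.
    # Returning bytes (instead of writing to t-rex's disk) would let the
    # caller on trout/tsunami save the file or insert into MySQL locally.
    ...
    return image_bytes
```

The idea would be that trout or tsunami calls `process_image.delay(data)` and then `result.get()` to pull the bytes back locally. I do not know whether this is the right approach, which is why I am asking.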
Can someone point me to a guide on how to set this up? I am running Django 4.1.7, Celery 5.2.7, and Redis server 6.0.16. Trout and tsunami run Ubuntu 20.04; t-rex runs Ubuntu 22.04. All are running Python 3.8.
I am concerned that when I install dlib on trout and tsunami, it is compiled without CUDA support, since those machines have no Nvidia cards. All three run dlib 19.24.0, but only t-rex has a CUDA-enabled build. Is this a showstopper for my plan to use t-rex as a remote CUDA processor for the other machines?
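For reference, this is how I checked which build each machine has (my understanding is that `DLIB_USE_CUDA` reflects whether the wheel was compiled against CUDA):

```python
import dlib

# True on t-rex (compiled with CUDA), False on trout and tsunami,
# which have no Nvidia cards to compile against.
print(dlib.DLIB_USE_CUDA)
```

My hope is that this does not matter as long as the tasks only ever execute on t-rex, but I am not sure.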
Thanks!