Does anyone know how to call a single notebook in parallel with different parameters, so that each run shows up in the Spark UI and makes troubleshooting easier? I have one child notebook that I call from a master notebook with a list of object dicts. The child notebook has database connections and data frames; it reads from a database and reads different file types depending on the object dict.
Currently I'm using a ThreadPool, but troubleshooting is difficult: when a notebook run fails, its name and error don't appear in the Spark UI, because the call runs on the driver node.
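For context, here is a minimal sketch of the pattern I'm using. All names (`run_child`, `objects`, the keys in the dicts) are placeholders; in the real master notebook, `run_child` calls `dbutils.notebook.run` with the child notebook path and per-object arguments.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_child(obj):
    # Placeholder for the real call, which in Databricks would be roughly:
    #   dbutils.notebook.run("/path/to/child_notebook", 3600,
    #                        {"object_name": obj["name"]})
    # The real child reads from a database and from files based on the dict.
    return {"object": obj["name"], "status": "OK"}

objects = [{"name": "customers"}, {"name": "orders"}, {"name": "invoices"}]

results = []
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(run_child, obj): obj for obj in objects}
    for fut in as_completed(futures):
        obj = futures[fut]
        try:
            results.append(fut.result())
        except Exception as exc:
            # A failure surfaces here on the driver, but nothing about
            # the failed child run is visible in the Spark UI.
            results.append({"object": obj["name"], "status": f"FAILED: {exc}"})

print(results)
```

This works, but since everything runs in driver-side threads, the Spark UI gives no per-notebook visibility when one of the runs fails.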
Any simple example would help.
Thanks.