Is there a nice (and, ideally, elegant) way of extracting the list of tables involved in a transformation when using spark.sql(...)?
I need to dynamically identify the list of tables in a Spark SQL query (ideally without writing custom parsing logic) and apply downstream transformations depending on the table(s) involved.
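For concreteness, this is the behaviour I am after (`tablesIn` is a hypothetical helper, the function I am trying to find or build, not an existing API):

```scala
val sql = "SELECT s.amount, c.name FROM sales s JOIN customers c ON s.cid = c.id"
val tables: Seq[String] = tablesIn(sql)  // => Seq("sales", "customers")
```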
I was thinking of using the default SQL parser, available via
import spark.sessionState.sqlParser
but I am wondering whether there is an easier way of doing this (for example, via the explain function)? Could you give some examples of how the extraction of tables should work?
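To make the question concrete, here is a minimal sketch of the direction I was considering with `sqlParser`. It assumes a `SparkSession` named `spark` is in scope (e.g. in spark-shell); note that the exact accessor on `UnresolvedRelation` may vary by version (in Spark 2.x it would be `r.tableIdentifier.unquotedString` rather than `r.tableName`):

```scala
import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan

val sql = "SELECT a.x, b.y FROM db1.table_a a JOIN db2.table_b b ON a.id = b.id"

// Parse the SQL text into an unresolved logical plan; nothing is executed,
// and the referenced tables do not even have to exist yet.
val plan: LogicalPlan = spark.sessionState.sqlParser.parsePlan(sql)

// Each table reference in the query shows up as an UnresolvedRelation node
// in the parsed tree, so collecting those nodes yields the table names.
val tables: Seq[String] = plan.collect {
  case r: UnresolvedRelation => r.tableName
}.distinct

// tables: Seq("db1.table_a", "db2.table_b")
```

One caveat I am aware of: `collect` only walks the main plan tree, so table references inside expression-level subqueries (e.g. `WHERE x IN (SELECT ...)`) may be missed; if I understand correctly, newer Spark versions expose `collectWithSubqueries` on the plan for that case.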