I have to make multiple SQL queries of entire tables and concatenate the results into one big DataFrame. I have a dictionary where the key is a team name and the value is an acronym that is used as the prefix of the MySQL table names:
import pandas as pd
from sqlalchemy import create_engine

# mysql_user, mysql_password, mysql_host and mysql_dbname are defined elsewhere.
engine = create_engine(
    'mysql+mysqlconnector://%s:%s@%s/%s' % (mysql_user, mysql_password, mysql_host, mysql_dbname),
    echo=False, pool_recycle=1800)
mysql_conn = engine.connect()

# Each team's records live in a table named <abbr>_record.
nba_dict = {'New York Knicks': 'nyk',
            'Boston Celtics': 'bos',
            'Golden State Warriors': 'gsw'}

team_dfs = []
for name, abbr in nba_dict.items():
    query = f'''
        SELECT *
        FROM {abbr}_record
    '''
    # Pull the whole table and tag every row with its team name.
    df = pd.read_sql_query(query, mysql_conn)
    df['team_name'] = name
    team_dfs.append(df)

team_dfs = pd.concat(team_dfs)
Is there a better way to refactor this code and make it more efficient?
There's nothing wrong with select * here. Don't try to preemptively optimize your code around the data. Keep the data well structured in the database, and the performance and simple code will flow from that.
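If you do want the loop a bit tighter, a small refactor keeps the same approach while letting a context manager close the connection for you. This is a minimal sketch, assuming the nba_dict, the mysql_* credentials, and the <abbr>_record tables from the question:

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    'mysql+mysqlconnector://%s:%s@%s/%s' % (mysql_user, mysql_password, mysql_host, mysql_dbname),
    echo=False, pool_recycle=1800)

# One query per table, one concat at the end; the `with` block closes
# the connection even if a query fails.
with engine.connect() as conn:
    all_records = pd.concat(
        (pd.read_sql_query(f'SELECT * FROM {abbr}_record', conn)
           .assign(team_name=name)  # tag every row with its team name
         for name, abbr in nba_dict.items()),
        ignore_index=True)  # fresh 0..n-1 index instead of repeated per-table indices

If the per-team tables all share the same schema, you could also push the concatenation into the database with a single UNION ALL query, but for a handful of tables the version above is already simple and does only one round trip per table.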