I have a Python application, built with Flask, that imports large batches of data records (anywhere from 10k to 250k+ records at a time). Right now it writes to a Cassandra database one record at a time, like this:
for transaction in transactions:
    self.transaction_table.insert_record(transaction)
This process is incredibly slow. Is there a best-practice approach I could use to insert this bulk data more efficiently?
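For context, one direction I was considering is splitting the records into chunks and sending each chunk with many requests in flight, instead of one synchronous round trip per row. This is only a sketch: the `chunked` helper is my own, and the commented-out calls assume the DataStax `cassandra-driver` package (its `cassandra.concurrent.execute_concurrent_with_args`) and a hypothetical `transactions` table schema.

```python
from itertools import islice

def chunked(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Sketch of how I imagine wiring this up with the DataStax driver
# (names and schema here are placeholders, not my real code):
#
# from cassandra.concurrent import execute_concurrent_with_args
#
# insert = session.prepare(
#     "INSERT INTO transactions (id, amount) VALUES (?, ?)")
# for batch in chunked(transactions, 5000):
#     # Issues up to `concurrency` async inserts at once per chunk.
#     execute_concurrent_with_args(session, insert, batch, concurrency=100)
```

Would something along these lines be the right direction, or is there a better-established pattern for bulk loads?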