I have an SQL dump file of an entire MariaDB database that is multiple gigabytes in size. I don't have access to a local database installation due to company security restrictions. Can I execute the dump against SQLite via Python, or otherwise extract its data so I can analyze it?
I iterate over the dump looking for table names to get an overview of the database:
import re

table_list = []
with open('dmp.file', encoding='cp437') as f:
    for line in f:
        line = line.strip()
        if line.lower().startswith('create table'):
            # extract the table name between the backticks
            table_name = re.findall(r'create table `(\w+)`', line.lower())
            table_list.extend(table_name)

for x in table_list:
    print(x)
This works, but the SQL statements in my dump span multiple lines, so I wrote the following to get each statement onto one line:
currentLine = ""
with open(File, encoding='cp437') as f, open('NewFileOneLiner.txt', 'a', encoding='utf-8') as g:
    for line in f:
        line = line.strip()
        currentLine = currentLine + " " + line
        # a statement is complete once the stripped line ends with a semicolon
        if line.endswith(';'):
            g.write(currentLine.lstrip() + '\n')
            currentLine = ""
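As a possible refinement (a sketch, not part of my actual code): the standard library's sqlite3.complete_statement() could replace my manual semicolon check, since it only reports a statement as complete when the trailing semicolon is not inside an unclosed string literal. It reuses the File and NewFileOneLiner.txt names from above:

import sqlite3

buffer = ""
with open(File, encoding='cp437') as f, open('NewFileOneLiner.txt', 'a', encoding='utf-8') as g:
    for line in f:
        buffer += " " + line.strip()
        # complete_statement() only returns True once the buffered text ends in a
        # semicolon that does not sit inside an open string literal
        if sqlite3.complete_statement(buffer):
            g.write(buffer.lstrip() + '\n')
            buffer = ""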
What additional steps are needed (since both are SQL databases, transforming the SQL statements should be possible)? Is there any way to execute all of the statements in SQLite? What are the limitations and caveats of this approach (are there SQL features that SQLite does not support and that I need to be aware of)? Can I extract the tables and their data in some other form?
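For reference, this is roughly what I imagine the execution step could look like, a minimal sketch assuming the one-statement-per-line file from above; the list of MariaDB-specific statements to skip (SET, LOCK/UNLOCK TABLES, /*! ... */ conditional comments) is my guess, and the file name dump_as_sqlite.db is made up:

import sqlite3

con = sqlite3.connect('dump_as_sqlite.db')  # or ':memory:' for a throwaway database
with open('NewFileOneLiner.txt', encoding='utf-8') as f:
    for statement in f:
        statement = statement.strip()
        # skip MariaDB session settings, lock statements and conditional comments
        # that SQLite will not understand
        if not statement or statement.upper().startswith(('SET ', 'LOCK TABLES', 'UNLOCK TABLES')) or statement.startswith(('/*', '--')):
            continue
        try:
            con.execute(statement)
        except sqlite3.Error as e:
            print('skipped:', statement[:60], '->', e)
con.commit()
con.close()

I expect many CREATE TABLE statements to fail as-is because of MariaDB-specific table options and column attributes (ENGINE=, AUTO_INCREMENT, CHARSET=), which is part of what I am asking about.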
I can match CREATE TABLE statements with a regular expression, as the first snippet shows. Parsing the INSERT statements with regular expressions looks a lot harder, assuming it is possible at all.
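To illustrate why I think the INSERT statements are the hard part, this sketch only pulls out the table name and the raw VALUES payload of mysqldump-style lines (INSERT INTO `tbl` VALUES (...),(...);); actually splitting that payload into rows and values would still need a real parser, because commas, quotes, and semicolons can appear inside the string values:

import re

# very rough: capture the table name and the unparsed VALUES payload
insert_re = re.compile(r'insert into `(\w+)` values (.+);$', re.IGNORECASE)

with open('NewFileOneLiner.txt', encoding='utf-8') as f:
    for statement in f:
        m = insert_re.match(statement.strip())
        if m:
            table, raw_values = m.group(1), m.group(2)
            print(table, raw_values[:80])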