I recently deployed a new feature where the API takes a large JSON string and stores it in the database. This works fine in isolation, but under load it quickly cripples my server. I'm looking to speed it up any way I can.
Here is my table schema:
| Column | Type |
| --- | --- |
| id | int(11) |
| code | varchar(255) |
| uuid | varchar(255) |
| data | longtext |
| last_updated | datetime |
Here are the indexes on the table:

Finally, here's the update query:
```sql
UPDATE backups SET data = :data, last_updated = NOW() WHERE uuid = :uuid LIMIT 1
```
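To make sure the slowness isn't just a missing index, here is a minimal runnable sketch of the table and the parameterized update. It uses SQLite for illustration rather than MySQL, and the `uuid` index is an assumption (my real indexes are listed above); SQLite also has no `UPDATE ... LIMIT`, so that clause is dropped here:

```python
import sqlite3

# Illustrative sketch in SQLite; the real table is MySQL.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE backups (
        id           INTEGER PRIMARY KEY,
        code         TEXT,
        uuid         TEXT,
        data         TEXT,
        last_updated TEXT
    )
""")
# Hypothetical index: a lookup by uuid needs one to avoid a full scan.
conn.execute("CREATE INDEX idx_backups_uuid ON backups (uuid)")
conn.execute("INSERT INTO backups (code, uuid, data) VALUES ('c1', 'u-1', 'old')")

# The update from above, parameterized (no LIMIT 1 in SQLite).
conn.execute(
    "UPDATE backups SET data = :data, last_updated = datetime('now') "
    "WHERE uuid = :uuid",
    {"data": "new payload", "uuid": "u-1"},
)
print(conn.execute("SELECT data FROM backups WHERE uuid = 'u-1'").fetchone()[0])
# prints: new payload
```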
I can't imagine the query itself is that slow, but the way I'm storing the data may be fundamentally wrong. I've read online that storing the data in files on the filesystem is quicker, and also that decoupling the data from the table (having separate `backups` and `backups_data` tables, for example) is quicker. What would be the best approach?
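For concreteness, this is the decoupled layout I mean: metadata in `backups`, the large blob in a separate `backups_data` table keyed by the parent id. The table and column names are my own guesses at a sensible split, and the sketch again uses SQLite rather than MySQL:

```python
import sqlite3

# Sketch of the decoupled two-table layout (SQLite for illustration;
# table/column names are assumptions, not an existing schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE backups (
        id           INTEGER PRIMARY KEY,
        uuid         TEXT UNIQUE,
        last_updated TEXT
    );
    CREATE TABLE backups_data (
        backup_id INTEGER PRIMARY KEY REFERENCES backups(id),
        data      TEXT
    );
""")
conn.execute("INSERT INTO backups (uuid) VALUES ('u-1')")
conn.execute("INSERT INTO backups_data (backup_id, data) VALUES (1, 'old')")

# Updating the blob now touches only the narrow backups_data row, so
# queries against the metadata table never drag the longtext along.
conn.execute(
    "UPDATE backups_data SET data = :data "
    "WHERE backup_id = (SELECT id FROM backups WHERE uuid = :uuid)",
    {"data": "new payload", "uuid": "u-1"},
)
print(conn.execute("SELECT data FROM backups_data").fetchone()[0])
# prints: new payload
```

The idea is that scans and updates of `backups` stay cheap because the wide column lives elsewhere, at the cost of one extra join when the payload is actually needed.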