
I recently deployed a new feature to my server: the API takes a large JSON string and stores it in the database. This works fine in isolation, but under load it quickly cripples the server. I'm looking to speed it up any way I can.

Here is my table schema:

| Column       | Type         |
|--------------|--------------|
| id           | int(11)      |
| code         | varchar(255) |
| uuid         | varchar(255) |
| data         | longtext     |
| last_updated | datetime     |

Here are the indexes on the table:

[The indexes were posted as a screenshot; the definitions are not reproduced here.]

Finally, here's the update query:

update backups set data = :data, last_updated = NOW() where uuid = :uuid limit 1

I can't imagine that the query itself is that slow, but the way I'm storing the data may be fundamentally wrong. I've read online that storing the data in files on the filesystem is quicker, and also that decoupling the data from the table (having separate backups and backups_data tables, for example) is quicker. What would be the best approach?
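For reference, the decoupled layout described above might look like the following. This is only a sketch: the table and column names other than those in the question are invented, and the ENGINE choice is an assumption.

```sql
-- Sketch of a decoupled layout (illustrative names, InnoDB assumed).
-- The hot "backups" table stays small and cheap to scan/update;
-- the bulky JSON payload lives in its own table, joined on demand.
CREATE TABLE backups (
    id           INT AUTO_INCREMENT PRIMARY KEY,
    code         VARCHAR(255),
    uuid         VARCHAR(255),
    last_updated DATETIME,
    UNIQUE KEY uq_uuid (uuid)
) ENGINE=InnoDB;

CREATE TABLE backups_data (
    backup_id INT PRIMARY KEY,
    data      LONGTEXT,
    FOREIGN KEY (backup_id) REFERENCES backups(id)
) ENGINE=InnoDB;
```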

1 Answer

Please use SHOW CREATE TABLE to present the schema. I can't see what Engine is being used. What do the index(es) look like? That is very important for performance. How can code be unique, yet have a low cardinality?
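For example, using the table name from the query in the question:

```sql
-- Show the full DDL, including the storage engine and all indexes:
SHOW CREATE TABLE backups\G

-- Or list the indexes (and their cardinality) directly:
SHOW INDEX FROM backups;
```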

How big is the data for the row in question? If it is megabytes, then it could take seconds or minutes to shovel that much data around.
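A quick way to gauge that, assuming the schema shown in the question:

```sql
-- Find the largest payloads; LENGTH() returns the size of `data` in bytes.
SELECT uuid, LENGTH(data) AS bytes
FROM backups
ORDER BY bytes DESC
LIMIT 10;
```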

Was anything else going on with that table at the same time as the slow Update? (Here the Engine becomes quite important.)
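One way to check is to run diagnostics while an UPDATE is slow; for instance:

```sql
-- See every statement currently running against the server:
SHOW FULL PROCESSLIST;

-- If the table is InnoDB, this includes lock waits and recent deadlocks:
SHOW ENGINE INNODB STATUS\G
```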


