
Need ideas to improve MySQL's lock performance


We are currently building a goods management system similar to Amazon's.
Each product receives about 5,000 orders per day, which means about 5,000 transactions are processed per product.
Currently I am using MySQL's row-locking mechanism to ensure the correctness of the data.
MySQL does a great job, but the execution time is too long because the transactions are waiting on each other.
So I want to ask: is there any way to increase MySQL's processing capacity, or is there another approach to handling multiple transactions that update the same record (using a different database ...)?
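For context, the pessimistic pattern described above can be sketched as follows. This is a minimal illustration, not the asker's actual code: it uses SQLite's `BEGIN IMMEDIATE` to stand in for MySQL's `SELECT ... FOR UPDATE`, and the `product`/`stock` table and column names are hypothetical.

```python
import os
import sqlite3
import tempfile
import threading

# Hypothetical inventory table: one hot row that every order must update.
path = os.path.join(tempfile.mkdtemp(), "goods.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE product (id INTEGER PRIMARY KEY, stock INTEGER)")
conn.execute("INSERT INTO product VALUES (1, 500)")
conn.commit()
conn.close()

def place_orders(n):
    db = sqlite3.connect(path, timeout=30)
    db.isolation_level = None  # manage transactions explicitly
    for _ in range(n):
        # BEGIN IMMEDIATE takes the write lock up front, so concurrent
        # writers queue here -- the mutual waiting the question describes.
        db.execute("BEGIN IMMEDIATE")
        (stock,) = db.execute(
            "SELECT stock FROM product WHERE id = 1").fetchone()
        db.execute("UPDATE product SET stock = ? WHERE id = 1", (stock - 1,))
        db.execute("COMMIT")
    db.close()

threads = [threading.Thread(target=place_orders, args=(100,)) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

db = sqlite3.connect(path)
(remaining,) = db.execute("SELECT stock FROM product WHERE id = 1").fetchone()
print(remaining)  # 0 -- the result is correct, but every writer was serialized
```

The count comes out right, which is the point of the row lock; the cost is that all 500 updates to the hot row ran one at a time.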

DISCUSS (2)
 

An alternative approach would be to handle this with optimistic database concurrency. You add a row-version column (usually an int or timestamp type), and with it you ensure the correctness of the data.

For example, if two users read the same row and the first one updates it, the second one is now holding stale data. The first user's update increments the row version, which indicates that the other user must reload; usually a message is displayed to the user with the stale data saying the record is in use.
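The version-column idea can be sketched in Python against SQLite (the SQL itself carries over to MySQL almost unchanged; the table and column names are illustrative, not from the original post):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE product (id INTEGER PRIMARY KEY,"
    " stock INTEGER, version INTEGER)")
conn.execute("INSERT INTO product VALUES (1, 10, 0)")
conn.commit()

def read(conn):
    """Read the row together with its current version."""
    return conn.execute(
        "SELECT stock, version FROM product WHERE id = 1").fetchone()

def try_update(conn, new_stock, expected_version):
    # The WHERE clause only matches if nobody bumped the version since we
    # read it; rowcount tells us whether our write won or we must reload.
    cur = conn.execute(
        "UPDATE product SET stock = ?, version = version + 1 "
        "WHERE id = 1 AND version = ?",
        (new_stock, expected_version))
    conn.commit()
    return cur.rowcount == 1

# Two "users" read the same row and version...
stock_a, ver_a = read(conn)
stock_b, ver_b = read(conn)

first = try_update(conn, stock_a - 1, ver_a)
second = try_update(conn, stock_b - 1, ver_b)
print(first, second)  # True False -- the second writer must reload and retry
```

No row stays locked between the read and the write; the loser of the race simply detects the stale version and retries (or shows the "data is in use" message), which avoids the lock waits described in the question.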

 

Increase CPU, memory, and disk speed, and cache your queries. Row locking is not bulletproof. Are you sure you are doing it right?
