The Problem:
Deletes can get expensive when you have a lot of dependent destroy model hooks in your API, especially if you are deleting thousands of records.
This article is not about installing Sidekiq; if you are curious about setup, click here
The Context:
- We have a front-end React/Redux app sending a delete request and updating the front-end store accordingly
- Our backend accepts an array of objects to run batch deletes (there's a rough sketch of this near the end)
- Sidekiq is up and running on our server (there are many other options for delayed jobs; look into what works best for you)
Here we go:
So we ran into a situation where a user would log onto the app, perform a common action, and every time he did this it would crash our servers.
Embarrassing, right?
We found out later that he was performing a delete action, but we had no idea of the scale at which he was using our app. We definitely hadn't planned on a situation where someone would be using our app in the real world
with TONS of real data ...
Solution:
After some digging, it turns out it's very common to delay a job like deleting database records, because past a certain scale it becomes very time consuming and costly for the server.
This actually ended up working out perfectly, because our Redux front end can just assume the delete will succeed and splice the object out of the reducer by id.
Here we go for reals:
Tell your Rails app to use Sidekiq for jobs:
config/application.rb
#use sidekiq for delayed jobs:
config.active_job.queue_adapter = :sidekiq
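For reference, that adapter line goes inside your application class in config/application.rb. Here's a minimal sketch assuming a standard Rails app (MyApp is just a placeholder module name, not our actual app):
# config/application.rb (minimal sketch; MyApp is a placeholder)
require_relative "boot"
require "rails/all"

Bundler.require(*Rails.groups)

module MyApp
  class Application < Rails::Application
    # tell Active Job to enqueue jobs through Sidekiq
    config.active_job.queue_adapter = :sidekiq
  end
end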
Create a queue file for Sidekiq:
config/sidekiq.yml
:queues:
- [default, 1]
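Quick note on that file: the number next to the queue name is its weight, which controls how often Sidekiq checks that queue relative to any others. If you later add more queues it might look something like this (the extra queue name is just an example, not something we actually use):
# config/sidekiq.yml (hypothetical multi-queue setup)
:queues:
  - [critical, 2]
  - [default, 1]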
We have a Procfile that runs commands for our server.
We need to tell Sidekiq to use our new sidekiq.yml file (-C points it at the config file; -t sets the shutdown timeout in seconds).
Procfile
sidekiq: bundle exec sidekiq -t 25 -C config/sidekiq.yml
Now, this is all I needed to do to get this working in our environment; you might need to do more or less. Let me know if you need any help in the comments and I'd be more than happy to help you troubleshoot.
Great!
Now that we have our queues set up, we can implement a delayed job in our controller...
Let's use the job generator to create a job.
Run this in your project directory:
rails g job delete_some_model
This should generate a couple of test files for you, plus the file we will actually use (because who writes tests? Haha, good developers, that's who).
Navigate to our job (more on jobs):
What goes in a job class?
The job class is where you put the code that will be executed by the queue. It has a perform method, which is called with whatever arguments were passed when the job was first enqueued (i.e. when you called perform_later).
app/jobs/delete_some_model_job.rb
class DeleteSomeModelJob < ApplicationJob
  queue_as :default

  def perform(model_id)
    # look the record up by id and destroy it, running any dependent destroy hooks
    some_model = SomeModel.find(model_id)
    some_model.destroy
  end
end
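One optional tweak (my own addition, not part of the original setup): since the front end already treats the record as deleted, there's no point retrying this job if the record is gone by the time it runs. On Rails 5.1+ you can tell Active Job to discard it instead:
class DeleteSomeModelJob < ApplicationJob
  queue_as :default

  # if the record was already deleted, just drop the job instead of retrying
  discard_on ActiveRecord::RecordNotFound

  def perform(model_id)
    SomeModel.find(model_id).destroy
  end
end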
Okay, great! Now we have our delayed job that will delete a model; we just need to send it an ID.
NOTE: never pass an object in as a param. Passing the instance's id rather than the instance itself keeps the Redis payload small.
Sidekiq client API uses JSON.dump to send the data to Redis.
The Sidekiq server pulls that JSON data from Redis and uses JSON.load to convert the data back into Ruby types to pass to your perform method.
Don’t pass symbols, named parameters or complex Ruby objects (like Date or Time!) as those will not survive the dump/load round-trip correctly.
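To make that concrete, here's the difference in practice (just an illustration of the rule above):
# avoid: passing the record itself as the job argument
DeleteSomeModelJob.perform_later(@some_model)

# prefer: pass the plain integer id and let the job re-fetch a fresh record
DeleteSomeModelJob.perform_later(@some_model.id)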
Now that we have our job we can call it in our controller:
some_controller.rb
def destroy
  @some_model = SomeModel.find(params[:id])
  if DeleteSomeModelJob.perform_later(@some_model.id)
    render json: { status: :ok }
  else
    common_error(@some_model.errors.full_messages)
  end
end
Easy enough right?!
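And since our backend accepts an array of objects for batch deletes (from the context at the top), the same idea scales by enqueuing one job per id. A rough sketch, where the action name and params shape are assumptions rather than our exact code:
# hypothetical batch endpoint, expecting params like { records: [{ id: 1 }, { id: 2 }] }
def destroy_batch
  ids = params[:records].map { |record| record[:id] }
  ids.each { |id| DeleteSomeModelJob.perform_later(id) }
  render json: { status: :ok }
end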
Leave any questions or comments below!
I cited some very helpful articles as well!
Sources:
https://blog.botreetechnologies.com/migrate-from-delayed-job-to-sidekiq-772acc8e1cdd