Today I faced the task of saving documents to AWS S3 with Rails Paperclip, but I needed the upload to happen in the background so it wouldn't block the user. I didn't want to use delayed_paperclip, since it only works with delayed_job or Resque; I wanted something that would run on any ActiveJob adapter.
My idea was originally inspired by an existing solution; however, that solution has points that can be improved: you need to create another local attachment for each attachment you add to your model, you may forget to add the migration that creates it, you may forget to add it to the model, and finally, it adds more columns to your model than needed.
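For context, here is a rough sketch of what that avoided approach looks like; the `Document` model and migration names are hypothetical:

```ruby
# Hypothetical sketch of the approach I avoided: every S3 attachment needs a
# mirror attachment on local disk, each with its own migration and model line.
class AddLocalAttachmentToDocuments < ActiveRecord::Migration[5.0]
  def change
    add_attachment :documents, :local_attachment # four extra Paperclip columns
  end
end

class Document < ApplicationRecord
  has_attached_file :attachment                              # the real S3 attachment
  has_attached_file :local_attachment, storage: :filesystem # easy to forget
  do_not_validate_attachment_file_type :attachment
  do_not_validate_attachment_file_type :local_attachment
end
```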
The idea I used is as follows:
- Before saving the attachment, switch its storage to Filesystem so the document is written locally -not in the public directory- instead of to S3 (see the sketch after this list)
- After saving the attachment, enqueue a background job that pushes the document to AWS S3
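The storage swap itself is just two operations on the attachment object. A minimal sketch, assuming a `Document` model whose `attachment` is configured for S3:

```ruby
# Minimal sketch of the storage swap (assumes a Document model with an
# S3-backed `attachment`). After this, Paperclip's save writes to disk.
attachment = Document.new.attachment
attachment.options.merge!(path: ":rails_root/files/:class/:attachment/:id_partition/:style/:filename")
attachment.extend Paperclip::Storage::Filesystem
```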
The code is very basic and can certainly be enhanced, but this is the core idea, and it has been working fine so far. Here is the code:
```ruby
# In your model:
has_attached_file :attachment, process_in_background: true

# Use Paperclip normally, you don't have to do/change anything!
my_model.attachment = some_file
```

```ruby
# Put this code in the initializers directory
module Extensions
  module Paperclip
    module Attachment
      attr_accessor :uploading_to_s3

      def uploading_to_s3?
        uploading_to_s3
      end
    end

    module S3BackgroundHandler
      extend ActiveSupport::Concern

      included do
        class << self
          def has_attached_file(name, options = {})
            super(name, options)
            all_options = ::Paperclip::Attachment.default_options.merge(options)
            process_in_background(name) if all_options[:process_in_background]
          end

          def process_in_background(name)
            original_options = {}
            # Only divert to disk when the attachment actually changed and we
            # are not already inside the background upload itself.
            save_to_disk = Proc.new do |name, attachment|
              send("#{name}_updated_at_changed?") && !attachment.uploading_to_s3?
            end

            before_save do
              attachment = send(name)
              if instance_exec(name, attachment, &save_to_disk)
                # Swap the storage to Filesystem so the file is written locally
                original_options = attachment.options.dup
                attachment.options.merge!(path: LOCAL_PAPERCLIP_ATTACHMENT_PATH)
                attachment.extend ::Paperclip::Storage::Filesystem
              end
            end

            after_save do
              attachment = send(name)
              if instance_exec(name, attachment, &save_to_disk)
                # Hand the pending S3 deletions to the job, then restore the
                # original (S3) options on the attachment.
                queued_for_delete = attachment.instance_eval("@queued_for_delete")
                UploadDocumentToS3Job.perform_later(self.class.name, self.id, name.to_s, queued_for_delete)
                queued_for_delete.clear
                attachment.options.merge!(original_options)
              end
            end
          end
        end
      end

      LOCAL_PAPERCLIP_ATTACHMENT_PATH = ":rails_root/files/:class/:attachment/:id_partition/:style/:filename"
    end
  end
end

Paperclip::Attachment.include Extensions::Paperclip::Attachment
ApplicationRecord.include Extensions::Paperclip::S3BackgroundHandler
```

```ruby
# Put this code in the jobs directory
class UploadDocumentToS3Job < ApplicationJob
  queue_as :upload_document_to_s3

  def perform(class_name, id, attachment_name, queued_for_delete)
    record = class_name.constantize.find(id)
    attachment = record.send(attachment_name)

    # Deleting: remove any old versions that were queued for deletion
    s3_bucket = attachment.s3_bucket
    queued_for_delete.each do |path|
      begin
        attachment.send(:log, "deleting #{path}")
        s3_bucket.object(path.sub(%r{\A/}, "")).delete
      rescue Aws::Errors::ServiceError
        # Ignore this.
      end
    end

    # Uploading: resolve the local path the file was written to, then
    # reassign the file with the original (S3) options back in place.
    original_options = attachment.options.dup
    attachment.options.merge!(path: ApplicationRecord::LOCAL_PAPERCLIP_ATTACHMENT_PATH)
    file_path = attachment.path
    attachment.options.merge!(original_options)

    File.open(file_path) do |file|
      attachment.uploading_to_s3 = true
      attachment.assign(file)
      attachment.reprocess!
      attachment.uploading_to_s3 = false
    end
    File.delete(file_path) rescue nil
  end
end
```
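Because the upload job is a plain ActiveJob, the backend is just configuration. A minimal sketch, assuming Sidekiq as the adapter (any ActiveJob backend works):

```ruby
# config/application.rb -- Sidekiq is only an example; any adapter works
config.active_job.queue_adapter = :sidekiq
```

Note that the job declares `queue_as :upload_document_to_s3`, so make sure your worker actually processes that queue (with Sidekiq, for example, `bundle exec sidekiq -q upload_document_to_s3 -q default`).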