ActiveStorage does not provide any built-in way to authenticate its DirectUpload endpoint in Rails. When using DirectUpload as a JS wrapper in the frontend, be aware that its Rails endpoint is public by default, effectively allowing anyone to upload an unlimited number of files to your storage.
The DirectUploadController from @rails/activestorage bypasses your form controller because it uploads the file in a separate AJAX request that runs before any form roundtrip happens. This is a convenient way to allow in-form uploads with progress, error handling etc. Otherwise, the file would only be uploaded once the form is submitted, creating a potentially long-running request, which is not what you want. Instead, a blob is created on the server, the file is uploaded, and your form receives a generated hidden input field carrying the blob's signed ID, allowing your model to find and attach the uploaded file.
Now imagine an attacker sending 1,000,000 upload requests to your direct upload endpoint. You definitely want to prevent this.
Blocking public requests
To achieve this, we can override ActiveStorage::DirectUploadsController and implement authentication. We will do so using Consul (Show archive.org snapshot), but you may use any other mechanism.
class Power
  include Consul::Power

  def initialize(user)
    @user = user
  end

  def signed_in?
    user.present?
  end

  power :direct_uploads do
    signed_in?
  end

  private

  attr_reader :user
end
Your power may now also hold additional conditions in case you want to enable uploads only for specific users:
power :direct_uploads do
  signed_in? && user.purchased_license?
end
Now we define our custom controller using this power:
class CustomDirectUploadsController < ActiveStorage::DirectUploadsController
  include Consul::Controller

  current_power { Power.new(Current.user) }
  require_power_check
  power :direct_uploads
end
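Keep in mind that this controller inherits from ActiveStorage::DirectUploadsController, not from your ApplicationController, so error handling defined there does not apply here. A minimal sketch, assuming you want to answer a failed power check with a 403 instead of an unhandled exception:

class CustomDirectUploadsController < ActiveStorage::DirectUploadsController
  # ... (power check from above)

  # Consul raises Consul::Powerless when the power check fails. Responding with
  # 403 lets the @rails/activestorage frontend report a regular upload error.
  rescue_from Consul::Powerless do
    head :forbidden
  end
end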
Now map the direct_uploads route used by your frontend controller (via the data-direct-upload-url attribute) to this custom controller:
# routes.rb
Rails.application.routes.draw do
  scope ActiveStorage.routes_prefix do
    post "/direct_uploads" => "custom_direct_uploads#create", as: :rails_direct_uploads
  end
end
Note that by default you usually just set a direct_upload attribute on your file input; the Rails form helper then fills in the default DirectUpload endpoint URL (/rails/active_storage/direct_uploads) as data-direct-upload-url, which the @rails/activestorage UJS picks up.
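For the standard case your form can therefore keep using the regular helper. A minimal sketch, assuming a simple_form setup like in the example further below and an avatar attachment:

= simple_form_for @user do |f|
  = f.input :avatar, as: :file, input_html: { direct_upload: true }

Since the custom route above is mounted on the same path, these uploads now end up in CustomDirectUploadsController and are subject to the power check.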
Multiple/Scoped DirectUploadControllers
If you have multiple namespaces or different endpoints (e.g. for frontend and backend), you can define different, scoped controllers for them, but you need to set the data-direct-upload-url manually on the file input. You usually want to omit the direct_upload attribute in this case so that your custom URL is not overwritten with the default one.
For example, you can have:
= simple_form_for @backend_user do |f|
  = f.input :avatar, as: :file, input_html: { 'data-direct-upload-url' => Router.instance.backend_direct_uploads_path }
with
# app/controllers/backend/direct_uploads_controller.rb
module Backend
  class DirectUploadsController < ActiveStorage::DirectUploadsController
    ...
  end
end
and
# routes.rb
Rails.application.routes.draw do
  namespace :backend do
    resources :direct_uploads, only: [:create]
  end
end
Warning

Make sure to disable the default route for the DirectUploadsController in this case. It is automatically loaded when calling require 'active_storage/engine' in your application config. If you leave it in place, an attacker could still target it via the default URL, and that endpoint is not authenticated.

If you want to disable the ActiveStorage routes, you can set config.active_storage.draw_routes (Show archive.org snapshot) to false. However, this will also disable all other routes like update_rails_disk_service_url, which you will need. Another solution would be to call a power that returns false for this default route.
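That last suggestion could look roughly like this (just a sketch; the :nobody power and BlockedDirectUploadsController are made-up names):

# app/models/power.rb
power :nobody do
  false
end

# app/controllers/blocked_direct_uploads_controller.rb
class BlockedDirectUploadsController < ActiveStorage::DirectUploadsController
  include Consul::Controller

  current_power { Power.new(Current.user) }
  require_power_check
  power :nobody
end

# routes.rb: shadow the default URL so requests no longer reach the unauthenticated engine controller
scope ActiveStorage.routes_prefix do
  post "/direct_uploads" => "blocked_direct_uploads#create"
end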
Rate limiting
Now that public access is denied, a malicious user may still upload a humongous amount of files simply because they are allowed to. Since authentication gives you access to the current user, you can limit their uploads. By default, ActiveStorage is just a wrapper around your storage, and it is assumed that any rate limiting is handled there. For example, if you use S3 or another cloud provider, there is probably a configuration option for this.
However, if you use an arbitrary file server (like with GlusterFS), you should implement countermeasures in Rails itself.
A simple way is using Rails' built-in rate_limit macro (ActionController::RateLimiting, available since Rails 7.2) with a pre-defined limit, scoped by the user's id. (A Rack::Attack throttle would be an alternative on older Rails versions.)
class CustomDirectUploadsController < ActiveStorage::DirectUploadsController
  # ...

  rate_limit to: 5, within: 1.minute, by: -> { Current.user.id }
end
Of course, you may also introduce storage limits per user by checking whether there is still enough space available before creating the blob, overriding the create method:
class CustomDirectUploadsController < ActiveStorage::DirectUploadsController
  # ...

  def create
    if exceeds_storage_limit?
      render json: { error: "Storage limit exceeded" }, status: :forbidden
      return
    end

    blob = create_blob_and_deduct_storage!
    render json: direct_upload_json(blob)
  end

  private

  def exceeds_storage_limit?
    file_size = blob_args[:byte_size]
    file_size > Current.user.remaining_storage_size
  end

  def create_blob_and_deduct_storage!
    file_size = blob_args[:byte_size]

    ApplicationRecord.transaction do
      ActiveStorage::Blob.create_before_direct_upload!(**blob_args).tap do |_blob|
        Current.user.update!(remaining_storage_size: Current.user.remaining_storage_size - file_size)
      end
    end
  end
end
This works because the blob record is created before the actual upload happens, so you can reject oversized files without waiting for the upload to be processed.
Of course, these countermeasures depend on your application and how it scales. For applications that are only used by a small or trusted circle of people, closing the public route might be good enough.
Authorization/Protecting access
By default, ActiveStorage::Blobs are just records in your database that anyone with the related signed ID can access, allowing them to retrieve and view the file contents. For public files this might not be a problem, but imagine you store invoice documents for a specific user. If an attacker somehow gains access to a blob's signed ID, or you accidentally expose the wrong files, they may be able to retrieve sensitive information. You should additionally set an expiration date on your signed IDs for added security, but even then there are time frames during which this could be a problem.
We need some way to attach an owner to a blob so that we can filter them. We can do this by creating a new join model BlobOwnership that links a blob to its owner. Depending on whether an owner can be represented by multiple models or just one, the join model can either hold a concrete owner class (e.g. User), a polymorphic belongs_to, or multiple optional belongs_to associations.
class CreateBlobOwnerships < ActiveRecord::Migration[7.2]
  def change
    create_table :blob_ownerships do |t|
      t.references :active_storage_blob, null: false, foreign_key: true, index: true
      t.references :user, null: false, foreign_key: true, index: true

      t.timestamps
    end
  end
end
# app/models/blob_ownership.rb
class BlobOwnership < ApplicationRecord
  belongs_to :active_storage_blob, class_name: 'ActiveStorage::Blob'
  belongs_to :user
end
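If a blob can instead be owned by records of different model classes, the polymorphic variant mentioned above could look roughly like this (just a sketch; the rest of this card sticks to the concrete User association):

# app/models/blob_ownership.rb (polymorphic variant)
class BlobOwnership < ApplicationRecord
  belongs_to :active_storage_blob, class_name: 'ActiveStorage::Blob'
  belongs_to :owner, polymorphic: true # e.g. User, or any other owning model
end

The migration would then use t.references :owner, polymorphic: true, null: false instead of the user reference. In the controller, the ownership is then created together with the blob: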
class CustomDirectUploadsController < ActiveStorage::DirectUploadsController
  # ...

  def create
    blob = ApplicationRecord.transaction do
      ActiveStorage::Blob.create_before_direct_upload!(**blob_args).tap do |as_blob|
        BlobOwnership.create!(active_storage_blob: as_blob, user: Current.user)
      end
    end

    render json: direct_upload_json(blob)
  end
end
To be able to access the ownership/the associated record from the blob itself, we need to patch ActiveStorage. Add or adapt your config/initializers/active_storage.rb:
Rails.application.config.to_prepare do
  ActiveStorage::Blob.class_eval do
    has_one :blob_ownership, dependent: :destroy, foreign_key: :active_storage_blob_id
    has_one :user, through: :blob_ownership
  end
end
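With this patch in place, the owner can be read directly from a blob, e.g. as a quick sanity check in a Rails console:

blob = ActiveStorage::Blob.last
blob.blob_ownership # => the BlobOwnership created in the controller above
blob.user           # => the uploading user, or nil for blobs without an owner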
Rendering to the user
Now, we can decide whether a blob should be rendered to the user based on powers:
power :renderable_attachment? do |file|
  blob = file.blob
  blob.present? &&
    blob.user.present? &&
    blob.user == user
end
- if current_power.renderable_attachment?(@user.invoice)
  = link_to('Download invoice', @user.invoice.url)
If your downloads run through Rails (which is the better option for sensitive documents), check the power in your download controller action.
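Such a download action could look roughly like this (a sketch; it assumes @user is loaded beforehand and that your ApplicationController sets up current_power as usual):

# Some controller serving the invoice through Rails
def show
  invoice = @user.invoice # the attachment from the example above

  if current_power.renderable_attachment?(invoice)
    send_data invoice.download, filename: invoice.filename.to_s, type: invoice.content_type
  else
    head :forbidden
  end
end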
Accepting signed IDs from the user
You should also check the signed IDs received through form params so a user cannot attach another user's file in case they found out (or guessed) its signed ID.
power :allowed_blob_signed_id? do |signed_id|
  blob = ActiveStorage::Blob.find_signed(signed_id)
  blob.present? &&
    blob.user.present? &&
    blob.user == user
end
# Some UserController
def user_params
  permitted = params.require(:user).permit(invoice_signed_ids: [])

  # Filter after permitting
  permitted[:invoice_signed_ids] = permitted[:invoice_signed_ids].to_a.select do |signed_id|
    current_power.allowed_blob_signed_id?(signed_id)
  end

  permitted
end
This works because the ownership is created as soon as the blob is created in your DirectUploadsController
, before the file is even uploaded. When the form roundtrip happens, the signed ID will be valid and queryable.
Note

You do not need to extract the blob object from your attachment to get the signed ID; you can call it directly on the attachment because it delegates the method to the blob. See the corresponding line on GitHub (Show archive.org snapshot).
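For the invoice example above, both of these return the same signed ID:

@user.invoice.signed_id      # delegated to the blob
@user.invoice.blob.signed_id # equivalent, but more verbose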