Posted by: Pasindu Dissanayaka
Posted on: Jun 17, 2025

Web Application Security Basics - Encrypted Storage

What is encrypted storage?

Generally, securing a web app means database encryption or HTTPS to most devs. But what about the files your application stores? Many public-facing web applications are built to provide some type of service to clients. This means you store client information in your application; these may be minor details like names and email addresses, or even documents and profile pictures.

Too often, these get dumped in /uploads/ or a public_html subfolder with predictable paths. It’s easy to forget that these files can contain sensitive business or personal information, and if your access control fails or the server gets misconfigured, they’re publicly accessible.

The main focus of this article is those documents and profile pictures: how can we safeguard customer data while still being able to provide access to these pictures, files and documents?

Why is this important?

Would you store sensitive records in a table without encryption or access control? No. So why treat files any differently?

Files should be:

  • Encrypted before being saved
  • Stored outside the public directory
  • Served via access-controlled endpoints

Encrypting Files Before Storage

Using a strong symmetric encryption algorithm like AES-256 protects your files even if an attacker gains access to your storage. Here's my flowchart:

  • File is uploaded.
  • File contents are encrypted server-side.
  • Encrypted blob is saved to a secure location.
  • File metadata (name, type, original name) is stored in the DB, optionally encrypted.

🔥 Pro Tip: Don’t hardcode keys; always assume your attacker may even have access to your source code. Use a secure secrets manager (like AWS KMS) or uniquely generated per-user keys.

But Doesn’t This Make the Application Slower?

To a certain extent, yes. Encrypting and decrypting files always adds a little overhead, especially if you're working with large files or doing it frequently. But for most web applications, the impact is minimal and doesn’t noticeably affect user experience. Here are some of the things I personally consider:

  • Encrypt-on-upload is a one-time cost, not a per-user hit.
  • Decryption should only happen once at access time, and only for authorized users.
  • Use AES encryption and in-memory decryption for small to medium-sized files as this can be done in a few milliseconds.
  • Use proper caching strategies (e.g. encrypt once and cache the result temporarily), as this helps if you're dealing with repeated requests for the same file.
  • Don’t just encrypt everything; encrypt only sensitive data. It goes without saying, but encrypting publicly accessible files just wastes processing power.
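To get a feel for the numbers, here's a rough timing sketch; the 5 MB buffer size is just an example, not a recommendation:

```php
// Time AES-256-CBC encryption of an in-memory buffer.
$data = random_bytes(5 * 1024 * 1024);                      // 5 MB sample payload
$key  = random_bytes(32);                                   // 256-bit key
$iv   = random_bytes(openssl_cipher_iv_length('AES-256-CBC'));

$start = hrtime(true);
$encrypted = openssl_encrypt($data, 'AES-256-CBC', $key, OPENSSL_RAW_DATA, $iv);
$elapsedMs = (hrtime(true) - $start) / 1e6;

printf("Encrypted %d bytes in %.1f ms\n", strlen($data), $elapsedMs);
```

On typical server hardware this lands in the millisecond range, which is the order of overhead the list above is weighing.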

🔑 As I've said in one of my client meetings: “Security always has a trade-off. A few milliseconds of processing time is a small price for preventing even the smallest data breach.”

How can I implement this in an application?

Evidently I can't show you how to implement this in every programming language, so I will walk you through the logic and a few code snippets in PHP.

For encrypting files on upload, you could look at OpenSSL or similar modules in your stack:

// $key should be 32 random bytes (AES-256) from your secrets manager
$iv = random_bytes(openssl_cipher_iv_length('AES-256-CBC'));
$contents = file_get_contents($_FILES['file']['tmp_name']);
$encrypted = openssl_encrypt($contents, 'AES-256-CBC', $key, OPENSSL_RAW_DATA, $iv);
// Prepend the IV so it can be recovered for decryption later
file_put_contents('/secure/path/file.enc', $iv . $encrypted);

Prevent Direct Access

Make sure the storage directory is not accessible via public URL. Use:

  • A separate storage server or service (like S3 with private ACL)
  • Laravel's Storage::disk('private') or similar approach in your stack
  • Temporary signed URLs or backend endpoints to serve files
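If you roll your own temporary links instead of relying on S3's presigned URLs, the core idea can be sketched with an HMAC. The function names and URL shape here are hypothetical:

```php
// Generate a short-lived link for a file; $secret stays server-side.
function signedUrl(string $fileId, string $secret, int $ttl = 300): string {
    $expires = time() + $ttl;
    $sig = hash_hmac('sha256', $fileId . '|' . $expires, $secret);
    return "/api/download/{$fileId}?expires={$expires}&sig={$sig}";
}

// At download time, recompute and compare before serving anything.
function verifySignature(string $fileId, int $expires, string $sig, string $secret): bool {
    if ($expires < time()) {
        return false;                                        // link has expired
    }
    $expected = hash_hmac('sha256', $fileId . '|' . $expires, $secret);
    return hash_equals($expected, $sig);                     // timing-safe compare
}
```

Note that hash_equals() is used instead of === so an attacker can't learn the signature byte-by-byte from response timing.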

Isn’t S3’s Private Mode Enough?

It’s a great first step, but it doesn't prevent leaks. Consider this: S3’s private ACL blocks direct access, but if someone leaks a signed URL, that file is available until expiry. You still need backend-level permission checks before generating the URL.

So encrypting sensitive files even before they go to S3 is the smarter approach; that way, even if the bucket is compromised, the files are useless without the key.

Access Flow

Instead of letting the browser hit /uploads/invoice.pdf, you:

  • Receive request to download
  • Check user’s permission
  • Decrypt the file in memory
  • Stream or serve the file as a download
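The decryption step in that flow might look like this. decryptStoredFile is a hypothetical helper, and it assumes the IV was prepended to the ciphertext at encrypt time:

```php
// Read an encrypted blob from disk and return the plaintext in memory.
function decryptStoredFile(string $path, string $key): string {
    $raw = file_get_contents($path);
    $ivLen = openssl_cipher_iv_length('AES-256-CBC');
    $iv = substr($raw, 0, $ivLen);                           // IV stored before the ciphertext
    $plain = openssl_decrypt(substr($raw, $ivLen), 'AES-256-CBC', $key, OPENSSL_RAW_DATA, $iv);
    if ($plain === false) {
        throw new RuntimeException('Decryption failed');
    }
    return $plain;
}

// In the download endpoint, after the permission check:
//   $plain = decryptStoredFile($meta['path'], $key);
//   header('Content-Type: ' . $meta['mime']);
//   header('Content-Disposition: attachment; filename="' . $meta['original_name'] . '"');
//   echo $plain;
```

The browser only ever talks to the endpoint, so the stored path never has to be reachable by URL.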

Can I Use Public Storage If I Validate JWTs or Sessions?

Yes, but don't rely on this alone. If a file is public and sensitive, you lose control as soon as it's cached or shared. Serving files directly from a public path (like /uploads/invoice.pdf) assumes that:

  1. Users won’t share links
  2. Files are access-controlled somewhere else
  3. Your auth tokens can’t be bypassed

Use backend endpoints like /api/download/{id}, validate the session/JWT there, decrypt, and stream the file securely.

File Metadata Hygiene

Even filenames like johndoe-CTscan-2023.pdf can leak sensitive info. So use a UUID or hash for saved files:

$randomName = bin2hex(random_bytes(16)) . '.enc';

Then you can either generate a name dynamically during the streamed download, or map the original name and file type in your database.
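A minimal sketch of that mapping with PDO; the files table and its columns are assumptions for illustration:

```php
// Store the random on-disk name alongside the original metadata.
function storeFileMetadata(PDO $pdo, array $file, int $userId): string {
    $randomName = bin2hex(random_bytes(16)) . '.enc';        // opaque name used on disk
    $stmt = $pdo->prepare(
        'INSERT INTO files (stored_name, original_name, mime_type, owner_id)
         VALUES (?, ?, ?, ?)'
    );
    $stmt->execute([$randomName, $file['name'], $file['type'], $userId]);
    return $randomName;
}
```

At download time you look the row up by its id, never by a client-supplied filename, so the leaky original name only ever appears in the Content-Disposition header.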

Conclusion

So you see, it's not an overly complex process, and implementing this type of encryption not only keeps your end users safe but also proves your skills as a security-focused developer. If you liked this article, check out the other articles in this series.