File uploads from HTML forms without S3 keys
You have a form. Someone needs to upload a file - a resume, a screenshot, a portfolio PDF. You have a static site, no backend, and you don't want to put AWS credentials in front-end JavaScript. There are exactly four reasonable approaches, and three of them have pitfalls that catch teams a few months later.
Option 1: don't accept files, link to Dropbox
<form>
<input type="text" name="dropbox_url"
placeholder="Paste a Dropbox or Google Drive link">
</form>
This works for low-volume "send us your portfolio" flows. It punts the file storage problem to the visitor, but creates new problems:
- Half your visitors don't have Dropbox/Drive ready - friction kills conversion.
- Links rot - file moves, gets deleted, permission changes. Your inbox link breaks 6 months later.
- No control over file format, size, or scanning - visitors can paste a link to anything.
Use only when the file is genuinely optional and you're OK with the rot rate.
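If you do go this route, a little client-side validation cuts down on junk links. A minimal sketch (the function name and the host allowlist are illustrative, not from any library):

```typescript
// Sanity-check a pasted share link before accepting the form.
// Hosts here are an assumed allowlist - extend for your own needs.
const ALLOWED_HOSTS = ["www.dropbox.com", "dropbox.com", "drive.google.com"];

function isShareLink(value: string): boolean {
  try {
    const url = new URL(value);
    // Only accept https links to hosts we expect.
    return url.protocol === "https:" && ALLOWED_HOSTS.includes(url.hostname);
  } catch {
    return false; // not a parseable URL at all
  }
}
```

Wire it to the input's change event and show an inline error. This filters out typos and obviously wrong links; it does nothing about link rot.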
Option 2: presigned S3 URLs from a serverless function
The visitor's browser asks your serverless function for a presigned PUT URL, uploads directly to S3, then submits the form with the S3 object key.
// Serverless function (Vercel/Cloudflare/Lambda)
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3Client = new S3Client({});

export async function POST(req: Request) {
  const { filename, contentType } = await req.json();
  const key = `uploads/${crypto.randomUUID()}-${filename}`;
  const command = new PutObjectCommand({
    Bucket: process.env.S3_BUCKET,
    Key: key,
    ContentType: contentType,
  });
  const url = await getSignedUrl(s3Client, command, { expiresIn: 600 });
  // Return the key too - the browser echoes it back when it submits the form.
  return Response.json({ url, key });
}
// Browser
const presigned = await fetch('/api/upload-url', {
method: 'POST',
body: JSON.stringify({ filename: file.name, contentType: file.type }),
}).then(r => r.json());
await fetch(presigned.url, {
method: 'PUT',
headers: { 'Content-Type': file.type },
body: file,
});
await fetch('/api/contact', {
method: 'POST',
body: JSON.stringify({ message, fileKey: presigned.key }),
});
This works. The credentials never leave your function. But:
- Three round trips per submission instead of one.
- You own the S3 bucket - IAM policies, lifecycle rules, bucket-level encryption, CORS configuration, public-access blocks, audit logging, cost monitoring, ransomware-style abuse prevention.
- Files in the bucket aren't tied to form submissions yet - you need a janitor process that deletes orphaned uploads (where the form submission never happened).
- CORS configuration is fiddly - a typo in the bucket's CORS rules costs a half-day of debugging every six months.
Reasonable if you're already running a Node/serverless stack. Painful if a contact form is your only backend.
Option 3: post the file straight to the form-host's storage
This is the right answer for most static sites. Some form backends accept multipart uploads and handle storage themselves.
<form action="https://your-form-backend/f/abc123"
method="POST"
enctype="multipart/form-data">
<input type="text" name="name" required>
<input type="email" name="email" required>
<input type="file" name="resume" accept=".pdf,.docx">
<button>Submit</button>
</form>
The form-host receives the multipart body, stores the file in their S3 (private), and gives you a signed download link in the dashboard.
Pros:
- One round trip, one URL, no JavaScript required.
- No bucket to manage, no IAM policies to debug.
- File deletion is tied to the form submission lifecycle.
- Signed download links - visitors can't share a permanent public URL.
Cons (read carefully):
- File size is capped by the form-host. Common caps: Formspree 10MB, Basin 5MB, Getform 25MB, Formspring 25MB per file, Netlify Forms 8MB.
- The form-host's storage costs become your cost. At 50GB/month, you're paying maybe $5 instead of $1.50 - fine for most use cases, painful at petabyte scale.
- If you ever want to migrate, you have to re-host all the historical files yourself.
Use this for any file workflow under ~25MB per file.
Option 4: form backend with private S3 + signed URLs (the modern default)
Same shape as option 3, but with stronger guarantees:
- Files stored in a private S3 bucket (not the host's "public uploads" bucket).
- Downloaded via time-bounded signed URLs generated per request from the dashboard.
- No public file URLs ever - even if someone copies a download link from the dashboard, it expires in minutes.
- Metadata preserved - the original filename and MIME type survive the round trip, so downloads behave sanely.
- Per-form retention rules - auto-delete files after N days for GDPR / cost control.
Formspring is in this category. Files land in S3-compatible storage in the EU, are downloaded via 5-minute signed URLs from the dashboard, and obey per-form retention rules.
<form action="https://formspring.io/f/abc123"
method="POST"
enctype="multipart/form-data">
<input name="name" required>
<input type="email" name="email" required>
<input type="file" name="resume" accept=".pdf,.docx" required>
<button>Submit</button>
</form>
That's the entire integration. No JavaScript, no presigned URLs, no IAM, no CORS configuration.
What to put in your form's accept attribute
Always restrict file types client-side AND server-side. Client-side is cosmetic (visitors can bypass it); server-side is enforcement.
<input type="file" name="resume" accept=".pdf,application/pdf">
<input type="file" name="screenshot" accept="image/*">
<input type="file" name="data" accept=".csv,.txt,text/csv,text/plain">
Don't allow */* - that's an invitation for .exe, .dmg, .iso, and worse. If you genuinely need to accept "anything", document why and rely on server-side virus scanning.
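Server-side enforcement means inspecting the bytes, not trusting the declared MIME type. A minimal sketch for PDFs (the function name is illustrative; real validation should go further than a header check):

```typescript
// A PDF starts with the ASCII bytes "%PDF-". Checking magic bytes catches
// a renamed .exe that claims to be application/pdf. This is a floor, not
// a full validator.
const PDF_MAGIC = Buffer.from("%PDF-");

export function looksLikePdf(bytes: Buffer): boolean {
  return (
    bytes.length >= PDF_MAGIC.length &&
    bytes.subarray(0, PDF_MAGIC.length).equals(PDF_MAGIC)
  );
}
```

The same pattern works for PNG (`\x89PNG`), JPEG (`\xFF\xD8\xFF`), and ZIP-based formats like .docx (`PK`).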
File size: what to set the limit to
Most contact forms shouldn't accept files over 10MB. Most application forms shouldn't accept files over 25MB.
If your use case genuinely needs larger files (video submissions, dataset uploads), you're outside the form-backend sweet spot. Use option 2 (presigned S3) or a dedicated upload service like Bunny Stream.
Virus / malware scanning
Form-backends typically don't scan uploaded files for malware. If you accept arbitrary files from the public internet, assume some will be malicious. Best practice:
- Treat the dashboard download as a triage surface, not a trust surface.
- Don't auto-process uploaded files in your own infrastructure (no auto-extracting ZIPs, no auto-running PDFs through OCR pipelines).
- For high-risk flows, add ClamAV or VirusTotal in your processing pipeline.
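For the ClamAV route, a sketch that shells out to the clamscan CLI and maps its documented exit codes (0 = clean, 1 = virus found, anything else = error); the wrapper names are illustrative:

```typescript
import { execFile } from "node:child_process";

export type ScanVerdict = "clean" | "infected" | "error";

// clamscan's documented exit codes: 0 = no virus found, 1 = virus found,
// any other code = an error occurred during scanning.
export function interpretClamExit(code: number): ScanVerdict {
  if (code === 0) return "clean";
  if (code === 1) return "infected";
  return "error";
}

export function scanFile(path: string): Promise<ScanVerdict> {
  return new Promise((resolve) => {
    execFile("clamscan", ["--no-summary", path], (err) => {
      // execFile reports a non-zero exit status via err.code.
      const code =
        err && typeof (err as any).code === "number" ? (err as any).code : 0;
      resolve(interpretClamExit(code));
    });
  });
}
```

Treat "error" as a failed scan, not a clean file - fail closed on high-risk flows.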
What this looks like end-to-end
A complete static-site portfolio form:
<form action="https://formspring.io/f/abc123"
method="POST"
enctype="multipart/form-data">
<input name="name" required placeholder="Name">
<input type="email" name="email" required placeholder="Email">
<textarea name="message" required placeholder="Tell us about your work"></textarea>
<label>
Portfolio (PDF, max 25MB)
<input type="file" name="portfolio" accept=".pdf" required>
</label>
<!-- Honeypot -->
<div style="position:absolute;left:-9999px" aria-hidden="true">
<input type="text" name="website" tabindex="-1">
</div>
<button>Send</button>
</form>
That's the entire integration. The form-host handles storage, spam protection, dashboard, retention, and download URLs. You write zero JavaScript and own no infrastructure.
Try it free
Formspring's free plan disables file uploads (storage isn't included on the $0 tier). Pro at $19/mo includes 5GB of storage, 25MB per file, private S3, and per-form retention rules. Sign up - no credit card to start.
Florian Wartner
Founder of Formspring and Pixel & Process. Senior Laravel and Vue engineer based in Lübeck, Germany. Building developer-first SaaS with EU data residency and honest pricing.