
AWS S3 Basics: How Do I Actually Store Files in the Cloud?

April 19, 2026 · 5 min read

I was confused by “buckets”, “objects”, and “permissions” too. Here’s how I finally made sense of S3 — and how you can start using it today.

When I first heard about AWS S3, I thought it was something only big companies used. Turns out it’s one of the simplest and most useful AWS services you can learn as a student — and once you understand it, you’ll see it everywhere.

In this blog I’ll walk you through exactly what S3 is, how it works, and how to actually use it — from creating your first bucket to uploading files with code.

What even is S3?

S3 stands for Simple Storage Service. It’s basically a place to store files on the internet. Think of it like Google Drive — but for your applications instead of for you personally.

You can store images, videos, PDFs, JSON files, HTML files, basically anything. And once a file is in S3, you can access it from anywhere in the world with a URL.

Two terms you need to know before anything else:

Bucket — A container for your files. Like a folder at the top level. Bucket names are globally unique.

Object — Any file you store inside a bucket. It has a key (its name/path) and the actual file content.

Lesson: By default, everything in S3 is private. Nobody can access your files unless you explicitly allow it. This is a good thing — don’t panic when your file URL gives a 403 error at first.

How do I upload a file?

From the console, click into your bucket and hit “Upload”. Drag in a file — let’s say a photo — and click Upload. Done.

You’ll see it in your bucket with a key (filename) and a size. If you click on it, you’ll get a URL. But if you try to open it, you’ll get an “Access Denied” — because remember, it’s private by default.

To make a single file public, click the object → “Object actions” → “Make public”. Now that URL works.

But doing this manually every time is annoying. Let’s do it with code.

How do I upload files with Node.js?

This is where it gets interesting. AWS has an official SDK for JavaScript. Install it first:

npm install @aws-sdk/client-s3

Now create a simple upload script. You’ll need your AWS Access Key and Secret Key — get these from IAM in the AWS Console (create a user with S3 permissions).

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import fs from "fs";

// 1. Create the S3 client
const s3 = new S3Client({
  region: "ap-south-1",
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  },
});

// 2. Read the file from your machine
const fileContent = fs.readFileSync("./photo.jpg");

// 3. Upload it to S3
const command = new PutObjectCommand({
  Bucket: "akil-my-first-bucket-2026",
  Key: "uploads/photo.jpg", // path inside the bucket
  Body: fileContent,
  ContentType: "image/jpeg",
});

await s3.send(command);
console.log("Uploaded successfully!");

Run this and your file appears in your S3 bucket. The Key is the path — so uploads/photo.jpg means inside a folder called uploads. S3 doesn’t have real folders, but the slash in the key makes it look like one.

Lesson: Never hardcode your AWS keys in your code. Always use .env files and add them to .gitignore. Leaked AWS keys are one of the most common and expensive developer mistakes.
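To make that lesson concrete, here's a small guard I'd add to any upload script (an illustrative helper, not part of the SDK): it fails fast with a clear message if the keys aren't in the environment, instead of letting the SDK throw a confusing error mid-request. Load your `.env` file with `node --env-file=.env upload.js` (Node 20.6+) or the `dotenv` package.

```javascript
// Throw a clear error if a required environment variable is missing.
function requireEnv(name) {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing environment variable: ${name}`);
  }
  return value;
}

// Collect the AWS credentials the S3 client needs, failing fast if absent.
function loadAwsCredentials() {
  return {
    accessKeyId: requireEnv("AWS_ACCESS_KEY_ID"),
    secretAccessKey: requireEnv("AWS_SECRET_ACCESS_KEY"),
  };
}
```

Then pass `credentials: loadAwsCredentials()` when constructing the `S3Client`.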

How do I retrieve a file with code?

Reading a file back is just as simple. Use GetObjectCommand:

import { GetObjectCommand } from "@aws-sdk/client-s3";

const command = new GetObjectCommand({
  Bucket: "akil-my-first-bucket-2026",
  Key: "uploads/photo.jpg",
});

const response = await s3.send(command);

// response.Body is a readable stream.
// Convert it to a buffer and save it locally.
const chunks = [];
for await (const chunk of response.Body) {
  chunks.push(chunk);
}
const buffer = Buffer.concat(chunks);
fs.writeFileSync("./downloaded-photo.jpg", buffer);

Lesson: S3 returns files as streams, not direct buffers. This is actually efficient — for large files, you can start processing data before it’s fully downloaded instead of waiting for the whole thing.
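Since you'll collect streams into buffers often, it's worth pulling the pattern above into a reusable helper (my own convenience function, not an SDK API):

```javascript
import { Readable } from "node:stream";

// Collect any readable stream into a single Buffer.
// Same pattern as the download example above, packaged for reuse.
async function streamToBuffer(stream) {
  const chunks = [];
  for await (const chunk of stream) {
    chunks.push(Buffer.from(chunk));
  }
  return Buffer.concat(chunks);
}
```

With it, the download step becomes one line: `const buffer = await streamToBuffer(response.Body);`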

Making a whole bucket public (for static hosting)

One killer use case for S3 is hosting a static website — plain HTML, CSS, and JS files. To do this, you need to make the entire bucket public with a bucket policy.

Go to your bucket → Permissions → Bucket Policy and paste this:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::akil-my-first-bucket-2026/*"
    }
  ]
}

This says: allow anyone (“Principal”: “*”) to read (s3:GetObject) any file (/*) in this bucket. Now every file you upload is automatically publicly accessible via its URL.

Lesson: Bucket policies are written in JSON and follow a simple pattern — who can do what to which resources. Once you read a few of them, they start to make sense fast.
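Because the pattern is so regular, you can generate the policy in code rather than pasting JSON by hand. A sketch (the helper name is my own; the JSON shape matches the policy above):

```javascript
// Build a public-read bucket policy for any bucket name.
function buildPublicReadPolicy(bucketName) {
  return JSON.stringify({
    Version: "2012-10-17",
    Statement: [
      {
        Effect: "Allow",
        Principal: "*",
        Action: "s3:GetObject",
        Resource: `arn:aws:s3:::${bucketName}/*`,
      },
    ],
  });
}
```

You could then apply it from code with the SDK's `PutBucketPolicyCommand`, passing the returned string as `Policy` (note: the bucket's "Block public access" settings must also allow public policies, or the call is rejected).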

What’s the URL structure?

Every object in S3 has a predictable URL:

https://{bucket-name}.s3.{region}.amazonaws.com/{key} 
// Example: https://akil-my-first-bucket-2026.s3.ap-south-1.amazonaws.com/uploads/photo.jpg

If the object is public, that URL works directly in a browser. This is how apps serve profile pictures, product images, and uploaded documents — store in S3, use the URL wherever you need it.
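Since the URL is predictable, building it in code is a one-liner. A small helper (my own, for illustration) that also URL-encodes each key segment so spaces and special characters don't break the link:

```javascript
// Build the public URL for an object from its bucket, region, and key.
// Key segments are encoded individually so the slashes stay intact.
function s3ObjectUrl(bucket, region, key) {
  const encodedKey = key.split("/").map(encodeURIComponent).join("/");
  return `https://${bucket}.s3.${region}.amazonaws.com/${encodedKey}`;
}

console.log(s3ObjectUrl("akil-my-first-bucket-2026", "ap-south-1", "uploads/photo.jpg"));
// → https://akil-my-first-bucket-2026.s3.ap-south-1.amazonaws.com/uploads/photo.jpg
```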

What I’d do next

Now that you understand the basics, here’s a natural progression:

1. Connect S3 to an Express API — add a file upload endpoint to a Node.js app using Multer + the S3 SDK. This is exactly what I did in ElectroMart for product images.

2. Learn about presigned URLs — instead of making files public, you can generate temporary URLs that expire after a set time. Much safer for private user files.

3. Pair S3 with CloudFront — AWS’s CDN caches your files on edge servers around the world, so a user in Europe gets a file from your ap-south-1 bucket almost as fast as a local one. A big performance boost for very little extra effort.

Final thoughts

S3 felt intimidating before I actually used it. Once I did, I realised it’s just a file system with a URL. Upload a file, get a link, use it anywhere.

If you’ve already built a Node.js app, integrating S3 for file storage is the most natural next step. It’s how real production apps handle everything from profile pictures to invoice PDFs.

Start with the console, then move to the SDK. You’ll have it working in an afternoon.

The AWS free tier gives you 5GB of storage and 20,000 GET requests per month — more than enough to experiment without spending a cent.

Drop a comment if you get stuck, or connect with me on LinkedIn. I’m always happy to help a fellow student figure this stuff out.

Thanks for reading.

~Akil Dikshan~