Rahul Hathwar · March 19, 2026
I've always loved space as a theme. The scale of it, the loneliness, the imagination it invites. But I notice pretty often in conversations about game development that most people can't picture space as anything more than a visual style. "Sure it'd look cool, but what do you even do in it?" And honestly, for a long time that doubt stopped a lot of interesting conversations from going anywhere.
So I decided to actually test it. I gave myself a constraint: three calendar days, spare afternoon time only. Build something playable and cohesive, and if it isn't working or I'm not enjoying where it's headed, minimize losses and move on. This is a story about how I approached that, both as an engineer and as a designer.
And don't worry, that constraint doesn't mean the code is a mess. It's anything but; I'll get to how I structure a codebase for rapid development in a moment.
Deciding What to Prove
I think the most important decision in any prototype isn't what you build. It's figuring out where the doubt lives and going straight at it.
On flight controls, the thing I've noticed about most space and flight games is that they tend to be built by people who care about physical accuracy. Real aerodynamic constraints, cockpit dials, forces that behave like they would in a real aircraft. For a certain kind of player, that's the whole appeal. But for someone who has never piloted anything and just imagines what it feels like to fly a spaceship, that realism usually just means unintuitive, dry controls. The doubt I was addressing was: can you make space flight controls that feel expressive, snappy, and accessible to someone who isn't trying to study aeronautics? I felt like I could take a swing at that.
On the environment, space is, realistically, just a massive void. The version of space most people imagine comes from movies and games with a pretty heavy creative bias toward making it look dramatic. A lot of the space games I've seen on Roblox just slap on a starry skybox and drop in some floating objects. I wanted to do something I genuinely hadn't seen before -- dense asteroid fields, drifting smoke-based nebulae, debris clouds, sun flares. From close to a decade of 2D and 3D art experience, I had some instincts: particles are heavily underutilized, volume and density make environments feel less open and intimidating, color variation does a lot of the heavy lifting. But was I confident this would work in practice to the fidelity I had in mind? Honestly, no. That's kind of the whole point of giving myself three days. If I could pull it off, great. Proved it. If I couldn't, I kept my losses small right at the start.
My job as a creative is to identify gaps and fill them with my own ideas. If I hadn't seen it done, that was probably a signal worth following.
How I Structured the Codebase
I want to take a moment here because there is something of an ongoing debate in developer spaces about how much you should care about code quality, especially when prototyping. Some people act like it's a massive waste of time. Others treat following best practices like there's some prestige attached to it. I think both camps miss the point.
The reason design patterns and principles like Single Responsibility exist isn't so you can feel righteous for following them. They're called patterns for a reason: they're generic, repeatable solutions backed by a "why". And I think a lot of people learn the how without ever really internalizing that why. When you understand why these principles were established, you can apply them strategically, not religiously. You pick them up when they buy you something specific, and you move on when they don't.
For this project, what they bought me was speed. My project ended up laid out like this:
src/
  client/
    init.client.luau                  -- input, camera, HUD, SFX
    CargoAppearance.client.luau       -- cargo model animations
    GravityWellAppearance.client.luau -- gravity well ring animations
  server/
    init.server.luau                  -- ship state machine, physics, warp gates, combat
    EnvironmentGenerator.server.luau  -- all world generation
    GravityWellTracker.server.luau    -- proximity detection + client notification
  modules/
    GravityWellManager.luau           -- gravity well instance spawning
    TurretManager.luau                -- turret spawning and update loop
Each file owns its own responsibility and as little else as possible. The practical benefit during a sprint like this is that bugs don't bleed across system boundaries. When something breaks, you know exactly where to look. When something needs to be replaced or rewritten, you pull it out without dragging three other things with it. And if two files start developing tightly coupled dependencies, I move that shared behavior up into its own module rather than letting the web form.
State lives locally too. The ship's entire runtime state, its movement flags, shield, fuel, roll angle, boost timer, all of it, lives in a single table on the server:
local shipState = {
    started = false,
    movingForward = false,
    rollAngle = 0,
    boostActive = false,
    shield = 100,
    fuel = 100,
    externalVelocity = Vector3.new(0, 0, 0),
    -- ...
}
Clients fire RemoteEvents to request changes. The server validates and applies them. Nothing on the client reaches across to mutate server state directly. That's basically just Roblox's native client/server model working as intended, but the point is I'm not fighting it.
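As a sketch of that pattern (the event name and fuel cost here are hypothetical, not lifted from the project), the server-side validation looks roughly like:

```lua
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local boostRequest = ReplicatedStorage:WaitForChild("BoostRequest") -- hypothetical RemoteEvent

boostRequest.OnServerEvent:Connect(function(player)
    -- Validate on the server: never trust the client's claim about fuel.
    if shipState.started and not shipState.boostActive and shipState.fuel >= 10 then
        shipState.boostActive = true
        shipState.fuel -= 10
    end
end)
```

The client only ever expresses intent; the server decides whether the state actually changes.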
There's a casual analogy I always come back to on this topic: cleaning while you cook versus doing it at the end. The total work is roughly the same either way, but you get a wildly different outcome depending on when you schedule it. Thirty seconds of discipline upfront saves you a disproportionate amount of untangling later. Same idea here.
Now, this doesn't mean every line in this codebase is immaculate. There are TODOs and a few hardcoded values in places where I made a deliberate call to get a presentable result over perfecting a specific implementation detail. For a prototype, that's fine. The structure is there, the responsibilities are clear, and anyone coming in to take this further knows exactly where everything lives.
The Environment
Once I committed to making a dense, atmospheric space environment, the next question was how to build it efficiently and in a way that could grow. I went with full procedural generation on the server at load time, and the reason wasn't just saving time over manual placement. It was about scalability and the ability to layer systems on top of each other.
When the environment is an algorithm, it can become more expressive without being rewritten from scratch. Today it's a randomly scattered asteroid field. Tomorrow the generation could be weighted by noise functions tied to difficulty. Enemy factions could spawn in denser zones. Gravity wells could be placed to create natural chokepoints. Procedural generation is an investment in every future system that needs to talk to the environment.
The asteroids are generated from a pool of 20 unique icosphere meshes displaced with multi-octave fractal noise, which is what gives them that irregular, rocky look rather than smooth spheres. Each asteroid gets a randomized drift velocity and slow rotation, updated per heartbeat. A basic LOD system keeps things performant: asteroids within a certain distance have collision enabled, distant ones are visual only.
Pool of 20 unique fractal noise icospheres
-> 4,000 asteroid instances scattered across 6,000 x 6,000 studs
-> each gets velocity + rotVelocity
-> LOD: CanCollide only within 600 studs of ship
10,000 debris pieces (non-collidable, visual density)
144 particle-based nebulae (slow drift + rotation, wraps map bounds)
Comets on randomized intervals (exponential acceleration, trail + particles)
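The asteroid half of that pipeline can be sketched like this. This is illustrative, not the actual generation code: `meshPool` and `ship` are assumed to exist, and the vertical spawn band and drift speeds are placeholder values.

```lua
local RunService = game:GetService("RunService")
local rng = Random.new()

local asteroids = {}
for i = 1, 4000 do
    local part = meshPool[rng:NextInteger(1, #meshPool)]:Clone() -- meshPool: the 20 icospheres
    part.Position = Vector3.new(
        rng:NextNumber(-3000, 3000),
        rng:NextNumber(-400, 400), -- vertical band is an assumption
        rng:NextNumber(-3000, 3000)
    )
    part.Anchored = true
    part.Parent = workspace
    asteroids[i] = {
        part = part,
        velocity = rng:NextUnitVector() * rng:NextNumber(0.5, 2),
        rotVelocity = rng:NextUnitVector() * rng:NextNumber(0.05, 0.2),
    }
end

RunService.Heartbeat:Connect(function(dt)
    local shipPos = ship.PrimaryPart.Position
    for _, a in asteroids do
        -- drift + slow rotation, applied deterministically per frame
        a.part.CFrame = a.part.CFrame
            * CFrame.fromAxisAngle(a.rotVelocity.Unit, a.rotVelocity.Magnitude * dt)
            + a.velocity * dt
        -- LOD: only nearby asteroids get collision
        a.part.CanCollide = (a.part.Position - shipPos).Magnitude < 600
    end
end)
```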
The nebulae and comets weren't just decorative decisions either. I predicted that one of the things someone watching a space prototype would wonder about is whether the environment actually does anything for gameplay. Nebulae softly occlude turrets and gravity wells, which means you're navigating with incomplete information and have to pay attention. Comets add brief moments of visual activity without requiring any player interaction. I had a list of priorities to hit in three days, and I balanced toward the ones that answered real questions a viewer might have.
Flight Mechanics
This is the section I spent the most time on and the one I care most about, so let me actually walk through the thinking.
I decided pretty early not to let Roblox's physics engine drive the ship. The reason is predictability. Physics engines introduce variability that's genuinely hard to control when you're going for a specific stylized feel. I wanted movement that was reactive, snappy, and expressive, the kind of feeling that kids imagine when they think about flying a spaceship. That requires determinism.
The ship runs on a CFrame-based state machine. CFrames are just transformations, which means you can layer them deterministically and iterate on each in isolation. This is something I picked up from VFX work: good motion comes from layering. You start with a base linear movement, add mouse-driven orientation interpolation on top, then barrel roll on top of that, then idle bob and tilt while hovering, then the external velocity component from collisions. Each layer improves the overall feel in a controllable, consistent way.
base position
+ forward velocity (W key)
+ mouse-driven orientation lerp
+ barrel roll angle (A/D key, with snap-back recovery)
+ idle bob and tilt (sinusoidal, while hovering)
+ external velocity (collision impulse, decays over time)
+ gravity well pull
There are more layers I could add: debris clouds creating resistance, more granular vibration, surface-specific effects. But in prototyping, the goal isn't to implement all the layers. It's to implement enough to prove that the approach works and that you know where it goes next.
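A sketch of how those layers compose per frame (the names and constants are mine, not the project's; `currentCFrame` and `targetOrientation` are assumed inputs):

```lua
local function computeShipCFrame(dt: number): CFrame
    -- 1. base position advances along the current look vector
    local basePos = shipState.position + currentCFrame.LookVector * shipState.speed * dt

    -- 2. orientation eases toward the mouse-driven target
    local orientation = currentCFrame.Rotation:Lerp(targetOrientation, 0.15)

    -- 3. barrel roll layered as a rotation about the local Z axis
    local roll = CFrame.Angles(0, 0, shipState.rollAngle)

    -- 4. idle bob while hovering (sinusoidal vertical offset)
    local bob = Vector3.new(0, math.sin(os.clock() * 2) * 0.5, 0)

    -- 5. external velocity from collisions, decaying over time
    shipState.externalVelocity *= (1 - 3 * dt)

    local pos = basePos + bob + shipState.externalVelocity * dt
    return CFrame.new(pos) * orientation * roll
end
```

Because each layer is its own term, you can tune or disable one without touching the others.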
Emergent Gameplay and Layered Design
I think the most interesting gameplay doesn't come from any single mechanic in isolation. It comes from multiple systems that each have their own side effects, and those side effects start to interact with each other in ways that are varied but predictable enough that players can learn, adapt, and make real decisions around them. That's what I mean by emergent storytelling. You're not scripting the player's experience. You're setting up conditions where interesting situations arise naturally from how systems relate to each other. The key to doing this well is prioritizing which systems you build and in what order, because each new system you add should have something to interact with that's already there. The next few sections show exactly how this thinking played out in practice.
The camera threshold:
One of the more interesting decisions was the camera system. When the cursor is inside a center rectangle, the camera loosely follows without locking in. When the cursor exits that rectangle, it snaps into a tighter follow mode. The reason this matters is that it creates an inherent preference in the gameplay for forward linear movement. Making sharp turns becomes a slightly more demanding action, which means navigating a tight asteroid corridor or trying to dodge incoming laser fire actually feels like something you had to do intentionally. If the camera always followed tightly, you could spin freely and evade everything easily. The threshold gives turning a mechanical cost.
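The threshold itself reduces to a small function. The rectangle size and the two follow rates below are placeholder values, and `targetCameraCFrame()` is a hypothetical stand-in for however the desired camera pose is computed:

```lua
local UserInputService = game:GetService("UserInputService")
local RunService = game:GetService("RunService")
local camera = workspace.CurrentCamera

local function cameraFollowAlpha(): number
    local mouse = UserInputService:GetMouseLocation()
    local size = camera.ViewportSize
    -- center rectangle: middle 40% of the screen (assumption)
    local insideX = math.abs(mouse.X - size.X / 2) < size.X * 0.2
    local insideY = math.abs(mouse.Y - size.Y / 2) < size.Y * 0.2
    -- loose follow inside the rectangle, tight follow outside it
    return (insideX and insideY) and 0.05 or 0.25
end

RunService.RenderStepped:Connect(function()
    camera.CFrame = camera.CFrame:Lerp(targetCameraCFrame(), cameraFollowAlpha())
end)
```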
Asteroid grazing and collision:
This is a good example of what I focused on and why. Minor visual vibrations when grazing an asteroid would have been a nice touch, but they wouldn't have changed how you play the game. Deferred. Meanwhile, I knew that in a scenario where you're being chased through a dense field by turrets, the physics of how you graze an asteroid is a real gameplay decision. Catch it at a glancing angle and you get lightly displaced but retain directional control. Hit it head-on and you get bounced back, potentially right back toward a turret you were trying to escape.
That's emergent storytelling: multiple systems interacting in ways that are varied but predictable, that players learn, adapt to, and start making decisions around. The initial feel of the grazing wasn't right -- the restitution values were bouncing the ship too aggressively for certain angles. I spent more time tuning this than almost anything else because it directly affected how the game plays, not just how it looks.
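The angle-dependent response can be sketched with a dot product against the surface normal. The thresholds and restitution curve here are illustrative, not the tuned values from the prototype:

```lua
local function applyAsteroidHit(shipVelocity: Vector3, surfaceNormal: Vector3): Vector3
    -- how head-on the hit is: 0 = pure graze, 1 = direct impact
    local headOn = math.clamp(-shipVelocity.Unit:Dot(surfaceNormal), 0, 1)

    if headOn < 0.3 then
        -- glancing: slide along the surface, retain directional control
        local slide = shipVelocity - surfaceNormal * shipVelocity:Dot(surfaceNormal)
        return slide * 0.9
    else
        -- head-on: reflect, with restitution growing with impact angle
        local reflected = shipVelocity - 2 * shipVelocity:Dot(surfaceNormal) * surfaceNormal
        return reflected * (0.4 + 0.4 * headOn)
    end
end
```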
Combat and Game Systems
Turrets and the case for tuneable systems:
There are 100 turrets scattered throughout the map. Each one has an idle yaw oscillation, a clamped firing cone, and a predictive aim function that leads the ship's position rather than just shooting where it currently is. The predictive aim forces the player to actually vary their movement and trajectory, which is the kind of pressure that makes a game feel alive. Purely reactive AI that just shoots where you are has a trivially easy counter: go fast.
But I want to emphasize something broader here, because I think it speaks to how I build any gameplay system. Look at what the turret implementation exposes:
local DEFAULT_FIRE_RANGE = 300 -- studs
local DEFAULT_FIRE_RATE = 8 -- shots per second
local MAX_ANGLE_RAD = math.rad(75) -- clamped firing cone
local ROTATION_SPEED = 4 -- lerp speed multiplier
-- predictive aim lead factor: randf(0.3, 0.7)
-- vertical aim jitter: randf(-0.2, 0.2)
These aren't just implementation details. They're tuning knobs. I genuinely believe the best gameplay systems aren't built in a fixed, rigid manner. They're built to be found -- shaped and refined through actual playtesting and feedback. An opinion and a direction are necessary, but blind confidence in that direction isn't. Nothing replaces a tight feedback loop with real players. Building in configurability from the start is what enables that loop to be useful.
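A minimal sketch of the predictive lead itself, assuming a fixed projectile speed (the speed and jitter scaling below are placeholders; the lead factor range matches the knob listed above):

```lua
local rng = Random.new()
local PROJECTILE_SPEED = 400 -- studs/sec, an assumption

local function computeAimPoint(turretPos: Vector3, shipPos: Vector3, shipVel: Vector3): Vector3
    -- time for a projectile to cover the current distance
    local travelTime = (shipPos - turretPos).Magnitude / PROJECTILE_SPEED
    -- lead only part of the way, randomized so turrets stay beatable
    local leadFactor = rng:NextNumber(0.3, 0.7)
    -- vertical jitter keeps shots from feeling laser-precise
    local jitter = Vector3.new(0, rng:NextNumber(-0.2, 0.2) * 10, 0)
    return shipPos + shipVel * travelTime * leadFactor + jitter
end
```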
Gravity wells:
Gravity wells pull the ship toward them when you're within range, with a blinking on-screen warning when you enter one. They're a navigation hazard, a strategic layer, and a bit of visual interest. But the more important thing is the order in which I built them. I didn't start with gravity wells. I started with what everything else depended on: flight mechanics, the environment, the basic combat structure. Gravity wells came later as a layer that interacts with the things already there. That's how I'd always prioritize this kind of work. Build the foundation, then add systems that have side effects on each other. Emergent gameplay lives in those interactions.
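Because the flight model already has an external velocity layer, the pull slots in as one more term. This is a sketch with an assumed linear falloff and placeholder strength:

```lua
local function gravityPull(shipPos: Vector3, wellPos: Vector3, wellRange: number): Vector3
    local offset = wellPos - shipPos
    local dist = offset.Magnitude
    if dist > wellRange then
        return Vector3.zero
    end
    -- stronger pull the deeper you are inside the well (linear falloff)
    local strength = 60 * (1 - dist / wellRange)
    return offset.Unit * strength
end

-- Each frame, the result just joins the other layers:
-- shipState.externalVelocity += gravityPull(shipPos, well.Position, 250) * dt
```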
Warp gates:
Warp gates teleport the ship between paired portals and, importantly, they preserve your velocity through the transition. That detail was intentional. In a fast-paced scenario where you're being chased, a warp gate should feel like a getaway car, not a pause button. Stalling your momentum on exit would feel clunky and would undercut the snappy, fast style I was going for throughout the whole project. Every detail, including how physics behave at a portal exit, should serve the direction of the game you're building.
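Velocity preservation amounts to re-expressing the velocity in the exit gate's frame. The gate objects, field names, and exit offset here are hypothetical wiring, not the project's code:

```lua
local function warp(ship: Model, entryGate: BasePart, exitGate: BasePart, velocity: Vector3): Vector3
    -- velocity relative to the entry gate's orientation
    local localVel = entryGate.CFrame:VectorToObjectSpace(velocity)
    -- same relative velocity, rotated into the exit gate's orientation
    local newVel = exitGate.CFrame:VectorToWorldSpace(localVel)

    -- offset forward so the ship doesn't immediately re-trigger the gate
    ship:PivotTo(exitGate.CFrame + exitGate.CFrame.LookVector * 20)
    return newVel -- momentum carried through the transition
end
```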
The cargo loop:
One thing I'm firm on from the drawing board of any game project: if there's no objective and no defined loop, it's not a game. It's a tech demo. Flight, combat, and traversal on their own are cool, but they don't form a loop. I knew from day one I needed one, and I knew it didn't have to be complex to do its job. A basic cargo pickup and delivery mechanic does exactly what's needed. Fly to the cargo, fly through it, deliver it. Its job is to turn a demo into a game and to convey that there's a future here. Simple round-based delivery today suggests a layered, objective-driven system with progression and risk tomorrow. That's the idea I wanted to leave the viewer with.
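The loop itself needs very little code. A hedged sketch of the pickup-and-deliver wiring, where `cargoPart`, `deliveryZone`, and `ship` are assumed to exist:

```lua
local carrying = false

cargoPart.Touched:Connect(function(hit)
    if not carrying and hit:IsDescendantOf(ship) then
        carrying = true
        cargoPart:Destroy() -- in practice, attach a cargo visual to the ship instead
    end
end)

deliveryZone.Touched:Connect(function(hit)
    if carrying and hit:IsDescendantOf(ship) then
        carrying = false
        -- award the delivery, spawn the next cargo, repeat
    end
end)
```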
The Result
Three afternoons produced a 6,000 x 6,000 stud procedurally generated space environment with 4,000 asteroids, 10,000 debris pieces, 144 drifting nebulae, and periodic comets. Fully custom flight mechanics with boost, barrel rolls, collision response, and a threshold-based camera. 100 turrets with predictive AI and 50 gravity wells creating real navigation pressure. Warp gates with velocity-preserved teleportation. A shield, fuel, and cargo loop giving the whole thing a basic game structure. SFX and UI feedback throughout.
Is it production ready? By my usual standards for production, not exactly. Is it passable? Yes. But more importantly, it's a testament to something: sometimes the best thing you can do to make progress is get a little scrappy, put a time constraint on yourself, and see in practice what you're capable of. Would I design certain parts differently with a real timeline and a full team? For sure. But I don't think that's the right lens for evaluating this.
Instead, what I want to leave you with is a glimpse into my philosophy for prototyping and project management. The question isn't whether this result is perfect; it's whether the result is insightful enough, in a real studio environment, to decide whether to invest more into the idea or scrap it before it becomes destructive.