Getting scale right for Unreal Engine isn't just a technical step; it's the foundation for a functional scene. I've learned through countless projects that exporting a mesh at the wrong scale is the fastest way to break lighting, physics, and animation downstream. My workflow ensures that 1 Unreal Unit always equals 1 centimeter in my 3D scene, which I verify before a single polygon is exported. This guide is for 3D artists and technical artists who are tired of rescaling assets in-engine and want a bulletproof, consistent export pipeline from their DCC tool into Unreal.
Key takeaways:

- 1 Unreal Unit equals 1 centimeter; set your DCC's system unit to centimeters before modeling.
- Verify scale visually against a real-world reference object (such as a 180 cm capsule) before export.
- Apply or freeze transforms so a scale of exactly 1.0 is baked into the vertex data.
- Don't rely on the import dialog's "Import Uniform Scale" to fix a wrongly sized asset.
- Normalize AI-generated meshes in your DCC before they enter the pipeline.
In my experience, treating Unreal Units as generic "points in space" is a major mistake. The engine's physics, lighting, and many core systems are intrinsically tuned for a real-world metric scale where one unit equals one centimeter. When you import a mesh modeled in meters or at an arbitrary size, you're not just changing a display number. You're throwing off the scale of force calculations, light falloff, and even the perceived speed of movement. I configure all my projects with this 1:1 (cm to UE unit) relationship from the start; it's the only way to ensure predictability.
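The 1:1 relationship can be captured in two tiny conversion helpers. This is a conceptual sketch with hypothetical function names, not part of any engine API:

```python
# Minimal helpers illustrating the 1 Unreal Unit == 1 cm convention.
# Function names are my own, not from Unreal or any DCC.

def cm_to_uu(cm: float) -> float:
    """Centimeters map 1:1 onto Unreal Units."""
    return cm

def m_to_uu(meters: float) -> float:
    """A mesh modeled in meters is off by a factor of 100."""
    return meters * 100.0

print(m_to_uu(1.8))   # a 1.8 m person is 180 UU tall -> 180.0
```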
The most frequent issue I see is artists modeling in their preferred software units (often meters) and relying on the Unreal import dialog's "Import Uniform Scale" to fix it. This creates a fragile asset. If that scale factor is ever lost or overridden, the asset breaks. Another pitfall is importing assets from different sources with different base scales, creating a mismatched scene where nothing aligns. A "door" from one source might be 20 units tall, while from another it's 2000, creating a nightmare for kitbashing or modular environment design.
I never guess. In my scene, I always create or import a simple reference object. My go-to is a capsule primitive scaled to 180 units tall (representing a 180cm tall person) and 50 units in radius. I place my model next to it. Does the character mesh align with the capsule? Does a chair seat sit roughly 45 units high? This visual check is instantaneous and prevents 90% of scaling errors. I also check the bounding box dimensions in my 3D tool to ensure they make sense in centimeters.
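The same sanity check can be scripted in any DCC's Python console. The function name and thresholds below are my own assumptions, meant only to illustrate catching a meter-vs-centimeter mistake against the 180 cm reference:

```python
# Quick sanity check: compare a mesh's bounding box (in scene units,
# assumed to be cm) against a 180 cm human-scale reference.
# Hypothetical helper, not any tool's built-in API.

def plausible_prop_scale(bbox_dims_cm, reference_cm=180.0, tolerance=10.0):
    """Return True if the largest bounding-box dimension is within a sane
    factor of human scale; catches off-by-100 (meters vs cm) mistakes."""
    largest = max(bbox_dims_cm)
    return reference_cm / tolerance <= largest <= reference_cm * tolerance

# A chair modeled correctly (~90 cm tall) passes; one modeled in meters fails.
print(plausible_prop_scale((45.0, 50.0, 90.0)))   # True
print(plausible_prop_scale((0.45, 0.50, 0.90)))   # False
```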
My first action in any new scene is to open the unit settings. I set the system unit to centimeters. This means when I type 180, I know I'm creating a line 180cm long. All my modeling and alignment happens in this real-world context from the beginning. If I'm starting with a base mesh that's the wrong size, I scale it here, in the main scene, against my reference object, before any detailed work begins.
This is the crucial step most miss. Let's say my final, detailed model is correctly scaled to real-world cm in my scene. I then create a final export version. I select all geometry and apply all transformations: this zeros out rotation and location, and bakes the scale factor of 1.0 into the vertex data. In Blender, this is Ctrl+A -> All Transforms (or Apply Scale if only the scale needs baking). In Maya, it's Modify -> Freeze Transformations. The object's displayed scale should now read 1.0, but its visual size in centimeters is unchanged.
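Numerically, "apply scale" does something very simple: the object-level scale factor is multiplied into every vertex, then the object's scale is reset to 1.0. The world-space size never changes, but the exported mesh now carries true centimeter values. A conceptual sketch (not a DCC API):

```python
# What "apply scale" does under the hood: bake the object-level scale
# into vertex positions, then reset the object scale to 1.0.

def apply_scale(vertices, object_scale):
    """Bake a uniform object scale into vertex positions."""
    baked = [(x * object_scale, y * object_scale, z * object_scale)
             for x, y, z in vertices]
    return baked, 1.0  # new vertex data, new object scale

verts = [(0.0, 0.0, 0.0), (0.0, 0.0, 1.0)]   # a 1-unit-tall edge...
baked, scale = apply_scale(verts, 180.0)      # ...under a 180x object scale
print(baked[1], scale)                        # (0.0, 0.0, 180.0) 1.0
```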
With scale applied, I open the FBX exporter and double-check my non-negotiable settings: a global scale of 1.0 (the scale is already baked into the vertices) and centimeter units, so nothing needs rescaling on import.
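As an illustration, here is how those settings might look as arguments to Blender's FBX export operator. The keys match `bpy.ops.export_scene.fbx`; the specific values are commonly used choices for a 1 cm = 1 UU pipeline and are my assumptions, not the author's exact list:

```python
# Sketch of FBX export settings for Unreal, expressed as kwargs for
# Blender's bpy.ops.export_scene.fbx operator. Values are common
# community defaults, assumed for illustration.

fbx_settings = {
    "use_selection": True,             # export only the prepared export copy
    "global_scale": 1.0,               # scale already baked into the vertices
    "apply_unit_scale": True,          # respect the scene's cm unit setup
    "apply_scale_options": "FBX_SCALE_NONE",
    "object_types": {"MESH"},
    "use_mesh_modifiers": True,
    "mesh_smooth_type": "FACE",        # carry smoothing groups into Unreal
    "add_leaf_bones": False,           # avoids extra end bones on skeletons
}

# Inside Blender this would be invoked as:
# import bpy
# bpy.ops.export_scene.fbx(filepath="SM_Chair_Wood_Final.fbx", **fbx_settings)
```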
I maintain a master "UE_Scale_Reference.fbx" file. It contains my 180-unit capsule, a 100-unit cube (1m³), and a 16-unit grid plane (standard Unreal floor tile). I import this into every new Unreal project and into my 3D software as a template scene. Having this persistent reference is faster and more reliable than trying to remember or recreate dimensions every time.
Before I hit export, I run through this mental list:

- System units are set to centimeters.
- The model reads correctly against my reference capsule.
- All transforms are applied, with scale showing exactly 1.0.
- The export file has a clear, prefixed name (e.g., SM_Chair_Wood_Final.fbx).

If an imported mesh is still the wrong size in Unreal, I follow this debug path:

- Check the asset's import settings for a non-1.0 "Import Uniform Scale" value.
- Reopen the source file and confirm transforms were actually applied, with scale reading 1.0.
- Compare the bounding box against the reference capsule in both the DCC and Unreal to isolate where the mismatch enters.
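When debugging, the ratio of expected to measured height usually reveals the source unit outright. A hypothetical helper for that step (the unit table is mine, not from Unreal):

```python
# Map the expected/measured size ratio to a likely modeling unit.
# Hypothetical debug helper; factors are standard unit conversions.

def guess_source_unit(measured_uu: float, expected_cm: float) -> str:
    """Infer the unit a wrongly sized import was modeled in."""
    ratio = expected_cm / measured_uu
    known = {1.0: "centimeters (correct)",
             100.0: "meters",
             2.54: "inches",
             0.1: "millimeters"}
    for factor, unit in known.items():
        if abs(ratio - factor) / factor < 0.05:   # within 5%
            return unit
    return "unknown / arbitrary scale"

# A 180 cm character that imports 1.8 units tall was modeled in meters.
print(guess_source_unit(1.8, 180.0))   # meters
```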
When I generate a 3D model from text or an image using a platform like Tripo AI, the first thing I acknowledge is that the output scale is arbitrary. My next step is never direct export. I always import the generated mesh into my main 3D software. There, I scale it against my reference capsule, apply transforms, and often run a quick retopology or cleanup pass before it ever touches my FBX exporter. This normalization step integrates AI assets into a professional pipeline.
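The normalization step boils down to one uniform rescale: measure the generated mesh's height and scale every vertex so it matches a real-world target in centimeters. A conceptual sketch of what I do in the DCC, with hypothetical names:

```python
# Normalize an arbitrarily scaled AI-generated mesh: uniformly rescale
# vertex data so the model's Z extent equals a target height in cm.

def normalize_height(vertices, target_height_cm=180.0):
    """Uniformly scale (x, y, z) vertices so the Z extent hits the target."""
    zs = [v[2] for v in vertices]
    current_height = max(zs) - min(zs)
    factor = target_height_cm / current_height
    return [(x * factor, y * factor, z * factor) for x, y, z in vertices]

# A generated figure 2.31 "units" tall becomes human-scale.
mesh = [(0.0, 0.0, 0.0), (0.3, 0.2, 2.31)]
scaled = normalize_height(mesh)
print(max(v[2] for v in scaled) - min(v[2] for v in scaled))   # new height, ~180.0
```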
I look for AI generation tools that understand production needs. For instance, if I can specify a target bounding box dimension or scale to a reference in the generation parameters, it saves me the first manual step. Some platforms output models that are already scaled to a standard human proportion or allow for one-click scaling before download, which I find incredibly useful for blocking out scenes quickly.
Even with a perfect export, sometimes minor tweaks are needed in Unreal. If I must adjust scale in-engine, I do it on the Actor in the world, not via the mesh asset's import settings. For optimization, I use Unreal's built-in LOD tools after import. My workflow is: get the scale and asset correct first, then generate LODs in-engine for consistency across the project. I also immediately create and assign a simple material to verify surface normals and UVs survived the export/import process correctly.
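If I do have to correct scale in-engine, I compute one uniform factor and apply it on the placed Actor. The helper below is hypothetical; in Unreal's Python API the resulting factor would go to something like `actor.set_actor_scale3d(...)`, never into the mesh asset's import settings:

```python
# Uniform in-engine scale correction for a placed Actor.
# Hypothetical helper; the measured/expected values are example numbers.

def actor_scale_correction(measured_height_uu: float,
                           expected_height_cm: float) -> float:
    """Uniform scale factor that makes the placed actor read correctly."""
    return expected_height_cm / measured_height_uu

# A doorway importing 210.5 UU tall when it should be 200 cm:
factor = actor_scale_correction(210.5, 200.0)
print(round(factor, 4))   # 0.9501
```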