
I hope this doesn't sound negative, but I'm curious about the converging feature set of modeling software like Blender and game engines like Unreal. From my total amateur point of view it seems game engines are slowly getting all those nice modeling features and visual fidelity, while also being able to render it all in real time. How can modeling software compete with that? Vastly better tools for modeling?


The workflow difference starts to matter more than the actual features, and for me the workflow for modeling in Blender is miles ahead of what you get in something like Unreal Engine.

Secondly, and more technically: not everything has to (or can) be rendered in real time, and for offline rendering Blender is probably still king of the two, whereas Unreal Engine is optimized for real-time.

There are things you can only do in Blender, and vice-versa, so what the right tool is will as always depend on the job.


Yes, Blender is mainly a modeling tool. You can plug it into many different renderers (technically including an Unreal Engine game where you exported your models). Epic subsidising it should hint at it not being a competitor :)

There are other niche tools in a creative pipeline, like Substance Painter for texturing and applying surface shaders. Blender has image and material editing capabilities, but they are more entry-level than competitors like Substance and GIMP/Photoshop. I think there's a niche for a lot more tools, including "AI" stable-diffusion-like tools.

Also, Blender used to have its own "Blender Game Engine".


Game engines like Unreal are incredible, but Blender still has a capability they lack: unbiased rendering. Real-time engines work via a series of clever tricks and hacks. We've spent decades of effort on hiding the seams, but they are there if you know where to look.

As an example, take refraction. I can look up Newton's telescope dimensions, model the lenses in Blender, slap a glass material on there, and look down it to see an accurate view. (Minus diffraction and dispersion, and dispersion can be added manually.)
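The physics underlying that lens example is just Snell's law, which a path tracer like Cycles applies at every glass interface. A minimal sketch (the function name and the crown-glass index of ~1.52 are my own illustrative choices, not from any renderer's API):

```python
import math

def refract(theta_i_deg, n1=1.0, n2=1.52):
    """Snell's law: n1*sin(theta_i) = n2*sin(theta_t).
    Returns the transmitted angle in degrees, or None when the ray
    undergoes total internal reflection."""
    s = n1 * math.sin(math.radians(theta_i_deg)) / n2
    if abs(s) > 1.0:
        return None  # total internal reflection: no transmitted ray
    return math.degrees(math.asin(s))

# A ray hitting crown glass at 30 degrees bends toward the normal:
print(round(refract(30.0), 2))          # ~19.2 degrees
# Leaving glass at a steep angle reflects internally instead:
print(refract(80.0, n1=1.52, n2=1.0))   # None
```

A rasterizer approximates this with screen-space tricks; an unbiased renderer actually traces the bent ray, which is why the telescope view comes out correct.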

Real-time raytracing is an answer to this, but it's not quite there in its current state. It still takes an order of magnitude or two longer to raytrace a frame than to rasterize it, so it's currently only used sparingly to touch up details.

Personally this is one of the areas I'd love to see Blender continue to develop. A spectral rendering option in Cycles would be invaluable when you want to render the Dark Side of the Moon logo.
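What makes the prism logo hard for an RGB renderer is that the refractive index varies with wavelength, which is exactly what spectral rendering captures. A rough sketch using the Cauchy approximation (the coefficients below are approximate values for BK7 crown glass, and the function names are my own):

```python
import math

# Cauchy approximation: n(lam) = A + B / lam^2, wavelength in micrometres.
# Coefficients are approximate values for BK7 crown glass.
A_COEF, B_COEF = 1.5046, 0.00420

def refractive_index(lam_um):
    return A_COEF + B_COEF / lam_um**2

def min_deviation_deg(lam_um, apex_deg=60.0):
    """Minimum deviation angle of a prism for a given wavelength,
    using delta = 2*asin(n*sin(A/2)) - A."""
    n = refractive_index(lam_um)
    half = math.radians(apex_deg) / 2.0
    return math.degrees(2 * math.asin(n * math.sin(half))) - apex_deg

for name, lam in [("red", 0.70), ("green", 0.55), ("blue", 0.45)]:
    print(f"{name}: n={refractive_index(lam):.4f}, "
          f"deviation={min_deviation_deg(lam):.2f} deg")
```

Blue light sees a higher index than red and so deviates roughly a degree more through a 60-degree prism: that per-wavelength fan-out is the rainbow, and it's invisible to a renderer that only tracks three fixed RGB channels.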


Blender also renders in real time. It’s useful outside games, I like just drawing in it. It’s like a way more powerful Illustrator.


Unreal's modeling tools are fine for blocking out stuff, but you still want to do proper modeling in actual modeling software, because its modifiers and tools make the process much simpler and less cumbersome.



