Archilogic's platform let users configure and explore architectural models in real time, but real-time rendering has inherent visual limits. I implemented a pipeline that let any user request a photorealistic render of their model directly from the interface, without leaving the platform. Users could define their own camera bookmarks to capture multiple angles, then trigger a full ray-tracing job on a cloud Blender server with a single click, receiving a high-quality image by email within minutes.
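The request flow can be sketched roughly as follows. This is a minimal illustration, not Archilogic's actual code: the `CameraBookmark` shape, the payload field names, and the delivery details are all assumptions made for the example.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class CameraBookmark:
    # Hypothetical bookmark shape: world-space position and look-at target,
    # plus a vertical field of view in degrees.
    name: str
    position: tuple
    target: tuple
    fov: float

def build_render_job(model_id: str, bookmarks: list, email: str) -> str:
    """Serialize a render request the front end could POST to the render API."""
    payload = {
        "modelId": model_id,
        "cameras": [asdict(b) for b in bookmarks],  # one render per bookmark
        "notifyEmail": email,                       # result delivered by email
        "engine": "CYCLES",                         # Blender's ray-tracing engine
    }
    return json.dumps(payload)

# Example: two user-defined angles for one model
job = build_render_job(
    "model-123",
    [CameraBookmark("entrance", (0.0, -4.0, 1.6), (0.0, 0.0, 1.4), 50.0),
     CameraBookmark("living-room", (3.0, 2.0, 1.6), (0.0, 1.0, 1.2), 60.0)],
    "user@example.com",
)
```

The key design point is that the front end only ships lightweight camera metadata; the heavy lifting stays on the server, so the click-to-email round trip costs the user nothing locally.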
Drag to compare · same camera angle · same model — viewport vs. ray-tracing output
My contribution covered the automated rendering feature end to end: duplicating and adapting the light-baking pipeline logic, writing the Python render script, configuring the headless Blender server on an AWS Linux instance, and implementing the front-end trigger. The back-end team handled API routing, machine availability, and delivery infrastructure. The result was a production-deployed feature that let Archilogic users generate photorealistic renders of their interactive spaces on demand.
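On the server side, a headless render boils down to invoking Blender in background mode with a scene file and a Python script. A sketch of assembling that invocation; the file paths and the script-side flags after `--` are assumptions for illustration, while `--background` and `--python` are Blender's standard command-line options:

```python
def blender_command(scene_path: str, script_path: str,
                    camera: str, out_path: str) -> list:
    """Build an argv list for a headless (no-GUI) Blender render.

    Everything after the standalone '--' is ignored by Blender itself
    and forwarded to the Python script, which can read it from sys.argv.
    """
    return [
        "blender",
        "--background",           # run without opening a window
        scene_path,               # .blend scene to render
        "--python", script_path,  # render script executed inside Blender
        "--",                     # separator: remaining args go to the script
        "--camera", camera,       # which user bookmark to render from
        "--output", out_path,
    ]

cmd = blender_command("scene.blend", "render.py", "entrance", "/tmp/render.png")
# On the render machine this would be executed with subprocess.run(cmd, check=True).
```

Inside `render.py`, the script would select the scene camera matching the requested bookmark and call Blender's `bpy.ops.render.render(write_still=True)` to write the final image, which the delivery infrastructure then emails to the user.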