II. Designing the Matrix: When Sensors Met Unreal Engine
Phase II: 2023 - Digital Twin & Visualization
By 2023, I had a working sensor system. It could detect gas, but its output was a stream of dry, unfeeling numbers on a terminal.
“350 ppm CO₂.”
What does that feel like? Is it safe? Is it spreading? Where is it coming from?
I realized that as a designer and engineer, my job wasn’t just to capture data, but to translate it. I believe design is the programming of perception. If I could map this invisible data onto the physical world, I could change how people understand their environment.
Why I Chose Game Engines
I faced a choice: stick to standard industrial plotting tools (MATLAB/LabVIEW) or try something radical. I chose Unreal Engine 5.
People thought it was overkill. “Why use a AAA game engine for a gas sensor?”
Because reality is high-fidelity. If I wanted to create a true Digital Twin, I needed Lumen’s real-time global illumination and Nanite’s virtualized geometry. I wanted the gas visualization to look not like a scientific graph, but like a physical phenomenon: volumetric, dynamic, and immersive.
The Workflow: Scanning Reality
I started by digitizing my lab. Using the LiDAR scanner built into an iPad Pro, I captured a detailed point cloud of the testing environment.
- Scan: Walk around the room with an iPad Pro, painting the geometry.
- Clean: Process the mesh to remove noise and artifacts.
- Import: Bring the model into UE5 as the “stage.”
- Link: Connect the Photonic Nose’s data stream to the engine over MQTT (a minimal sketch of this step follows below).
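That last step deserves a concrete illustration. On the sensor side, each reading is pushed to an MQTT broker; the digital twin subscribes and reacts. The sketch below shows what the publishing side can look like in Python, assuming the paho-mqtt package; the broker address, topic name, and JSON fields are placeholders of my own, not the project’s actual schema.

```python
import json
import time

import paho.mqtt.publish as publish  # assumes the paho-mqtt package is installed

BROKER = "192.168.1.50"        # placeholder: MQTT broker on the lab network
TOPIC = "photonic-nose/co2"    # placeholder topic name

def publish_reading(ppm: float) -> None:
    """Send one timestamped CO2 reading to the broker as JSON."""
    payload = json.dumps({"gas": "CO2", "ppm": ppm, "t": time.time()})
    publish.single(TOPIC, payload=payload, hostname=BROKER, qos=1)

if __name__ == "__main__":
    # In the real system this value would come from the sensor driver;
    # here we just push one example reading.
    publish_reading(350.0)
```

On the engine side, an MQTT client (Unreal’s experimental MQTT plugin, or a small bridge process) listens on the same topic, decodes the JSON, and hands the ppm value to the visualization layer.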
The “Aha” Moment
The first time I turned it on was unforgettable. I released a controlled puff of tracer gas in the physical lab. On the screen, a volumetric fog erupted from the corresponding location in the digital twin. It swirled, expanded, and faded based on the real-time concentration data fed by my sensors.
It wasn’t just a visualization; it was a prediction. I could see the gas accumulation in the corners of the room before the alarm even went off.
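To make that behaviour concrete: the core of the visualization was a mapping from concentration to fog density. The numbers and the simple linear ramp below are illustrative assumptions, not the calibrated values from the project, but they capture the idea of normalizing a reading against an alarm threshold and clamping it before it drives the volumetric effect.

```python
# Illustrative mapping from a gas concentration (ppm) to a volumetric fog
# density. The baseline/alarm numbers are placeholders; the real system
# would use calibrated alarm levels for each gas.
BASELINE_PPM = 400.0     # assumed "clean air" reading
ALARM_PPM = 2000.0       # assumed concentration at which the fog is fully dense

def fog_density(ppm: float, max_density: float = 0.6) -> float:
    """Linearly map a concentration to a fog density value in [0, max_density]."""
    t = (ppm - BASELINE_PPM) / (ALARM_PPM - BASELINE_PPM)
    t = min(max(t, 0.0), 1.0)   # clamp to [0, 1]
    return t * max_density

# Example: a 1,200 ppm reading maps to half of the maximum density.
print(fog_density(1200.0))  # 0.3
```

In the engine, that clamped value can be written each tick to a material or Niagara parameter that controls the fog’s extinction; keeping the mapping monotonic and clamped means the visual never exaggerates or hides what the sensor actually reports.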
This system allowed us to simulate emergency scenarios—like a lithium battery venting in a closed storage room—without putting anyone in danger. We could “play” the disaster in the engine, see the invisible flow of hydrogen, and design better ventilation systems based on that data.
Bridging the Gap
This phase of the project was about translation.
- Input: Optical Physics (Wavelengths, Absorption)
- Processing: Engineering (Signal Amplification, IoT)
- Output: Human Experience (Visuals, Alerts, Understanding)
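For readers who want the "Input" row spelled out: the optical measurement rests on the Beer-Lambert law, which ties the light absorbed at a gas's characteristic wavelength to its concentration.

```latex
% Beer-Lambert law: absorbance A is the log-ratio of incident to transmitted
% intensity, and is proportional to concentration c over the optical path l.
A = \log_{10}\!\left(\frac{I_0}{I}\right) = \varepsilon \, l \, c
```

Measure the transmitted and incident intensities, know the absorptivity and path length, and solving for c gives the concentration that the rest of the pipeline turns into fog and alerts.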
I learned that the most powerful technology effectively disappears. When you look at the screen, you don’t think about the Beer-Lambert Law or the MQTT protocol. You just see the danger, and you know what to do.
But as 2023 ended, I felt a pull. Industrial safety is important, but it’s cold. I wanted to use this technology to touch something more personal, more fragile.
Next: How I pivoted from industrial safety to the most intimate interaction of all—a mother’s breath.