[ INITIATING PROJECT OVERVIEW SEQUENCE ]
GALLARIA: A MIXED REALITY APP
ROLE
DESIGN LEAD, AR DEV, UI & UX DESIGN
DURATION
4 WEEKS
PLATFORM
MOBILE (ANDROID)
TOOLS
UNITY, AR FOUNDATION, C#, BLENDER
ELEVATOR PITCH
An AR app that transforms any vertical surface into a dynamic planning canvas,
allowing users to plan art arrangements virtually with physical precision.
// THE OPPORTUNITY & THE CHALLENGE //
Creating a precise planning tool using AR Foundation's plane detection and custom interaction physics.
THE CONCEPT
The vision was to build an Augmented Reality tool that turns any wall into a live, responsive canvas for spatial experimentation. Anchored digital picture frames appear at true scale, allowing users to virtually summon, place, and choreograph arrangements with the presence and precision of physical pieces. By removing the physical layer, planning a gallery wall becomes not only faster, but lets users focus on composition rather than labor.
THE TECHNICAL HURDLE
I'm conceptually comfortable in spatial computing, but I had never built an app or bent a game engine to my will before, so this project is my debut in end-to-end app development.
Developing GALLARIA meant moving from the map to the territory: learning Unity, AR frameworks, asset pipelines, and spatial anchoring systems at speed. It wasn't just solving an interesting technical problem, but teaching myself the ecosystem as I built within it.
[ DESIGN & LEARNING ]
From zero experience to deployed prototype: A self-taught journey.
ART DIRECTION
I conceptualized the app's brand identity as a nod to the elegance of historic gallery spaces. The typefaces are Art Deco-inspired, like Josefin Sans; the UI colours are meant to evoke oil paintings; and a high-specular gold material on the virtual frames serves as a visual nod to the aesthetic of traditional art salons.
THE LEARNING CURVE
Without prior game engine experience, I had to learn the entire pipeline along the way. From importing assets to configuring Build Settings, Unity's tutorials and documentation were invaluable, but even with their help, this project represented a significant learning curve in a new, powerful development environment.
// THE INITIAL CORE AR DESIGN SYSTEM //
Foundational rules governing frame behavior to ensure real-world placement realism.
01. WALL ALIGNMENT PRINCIPLE
Frames are always parallel to the detected wall plane. They are not allowed to rotate away from the wall, to better maintain the illusion of a physical object. When spawned, frames automatically align their back face to the wall's normal vector, ensuring a flush fit.
02. PLANE LOCK (2D MOVEMENT)
Once spawned, a frame's movement is locked to the plane of the wall. Users can drag frames freely up/down (Y-axis) and left/right (X-axis), but movement into or away from the wall (Z-axis) is strictly disallowed during interaction.
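The plane-lock rule above can be sketched as a small projection step. This is an illustrative sketch, not the project's actual code: the class and field names (PlaneLockDrag, wallPoint, wallNormal) are assumptions, and in practice the wall's point and normal would come from the detected ARPlane.

```csharp
using UnityEngine;

// Illustrative sketch: constrains a dragged frame to the wall plane.
public class PlaneLockDrag : MonoBehaviour
{
    public Vector3 wallPoint;   // any point on the detected wall plane
    public Vector3 wallNormal;  // the wall plane's unit normal

    // Takes the position the user is dragging toward and returns it
    // projected back onto the wall plane, removing any movement along
    // the wall's normal (the disallowed Z-axis).
    public Vector3 ConstrainToWall(Vector3 desiredPosition)
    {
        Vector3 offset = desiredPosition - wallPoint;
        Vector3 alongNormal = Vector3.Dot(offset, wallNormal) * wallNormal;
        return desiredPosition - alongNormal;
    }
}
```

Because only the normal component is subtracted, dragging up/down and left/right on the wall passes through unchanged.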
03. INTERACTION COLLISION LOGIC
Frames maintain collision checks against walls and floors, but temporarily disable collision with other frames while being held (via Unity's Physics.IgnoreCollision() in a custom script). This prevents a held frame from snagging on or displacing other frames when organizing a dense gallery arrangement.
[ FURTHER TECHNICAL IMPLEMENTATION ]
Custom C# scripting to extend the XR Interaction Toolkit and AR Foundation.
01. LOGIC: PREEMPTIVE COLLISION CHECKS
I modified the default ObjectSpawner.cs to include a custom IsSpawnPositionClear() method. This performs a Physics.OverlapBox query against the "Frames" collision layer before instantiation. Because Unity's physics engine resolves collisions only after an object exists, newly spawned frames could previously intersect existing ones; checking first prevents the overlap entirely.
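A minimal sketch of that preemptive check, assuming a "Frames" layer and prefab-sized half-extents (the class, field, and method names here are illustrative, based on the description above):

```csharp
using UnityEngine;

// Illustrative sketch of a pre-spawn overlap check.
public class FrameSpawner : MonoBehaviour
{
    public GameObject framePrefab;
    // Assumed half-extents of a frame's collider, in meters.
    public Vector3 frameHalfExtents = new Vector3(0.25f, 0.35f, 0.03f);

    bool IsSpawnPositionClear(Vector3 position, Quaternion rotation)
    {
        int framesLayerMask = LayerMask.GetMask("Frames");
        // Query the physics world *before* instantiating, so an
        // overlapping spawn can be rejected rather than resolved late.
        Collider[] hits = Physics.OverlapBox(position, frameHalfExtents,
                                             rotation, framesLayerMask);
        return hits.Length == 0;
    }

    public void TrySpawn(Vector3 position, Quaternion rotation)
    {
        if (IsSpawnPositionClear(position, rotation))
            Instantiate(framePrefab, position, rotation);
    }
}
```

Rejecting the spawn outright avoids the visible "pop" of the physics solver pushing interpenetrating frames apart a frame later.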
02. MATH: WORLD SPACE ALIGNMENT
Aligning frames to vertical planes required understanding Local vs. World space. I utilized BurstMathUtility.ProjectOnPlane() to calculate the spawn rotation, ensuring the back of the frame sat perfectly flush against the wall's normal vector, rather than simply rotating to the camera's view.
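The same flush-alignment rotation can be sketched with the standard Vector3.ProjectOnPlane (the Burst variant used in the project computes the same projection). This is a sketch under one assumption: which local axis counts as the frame's "front" depends on how the model was authored, so the sign of the forward vector may need flipping.

```csharp
using UnityEngine;

// Illustrative sketch: a spawn rotation that hangs the frame flush
// and level on a wall, independent of the camera's tilt.
public static class FrameAlignment
{
    public static Quaternion FlushToWall(Vector3 wallNormal)
    {
        // Project world up onto the wall plane to get a stable "up",
        // so the frame hangs level rather than inheriting camera roll.
        Vector3 upOnWall = Vector3.ProjectOnPlane(Vector3.up, wallNormal).normalized;
        // Face the frame out along the wall's normal (back flush against
        // the wall). Negate wallNormal here if the model faces the other way.
        return Quaternion.LookRotation(wallNormal, upOnWall);
    }
}
```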
03. INTERACTION: CONSTRAINED GRABS
To allow repositioning without breaking the illusion of the wall, I extended RotationAxisLockGrabTransformer.cs. This locks the rotation axes so users can drag frames along the wall surface but cannot accidentally tilt them off the plane or rotate them into the wall.
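One way to express that rotation lock is to keep only the twist around the wall's normal and discard any tilt the grab introduces. A sketch under assumed names (the actual RotationAxisLockGrabTransformer works through the XR Interaction Toolkit's transformer pipeline rather than a free function like this):

```csharp
using UnityEngine;

// Illustrative sketch: whatever rotation a grab produces, keep only
// the in-plane twist and discard any tilt away from the wall.
public static class RotationLock
{
    public static Quaternion LockToWall(Quaternion grabRotation, Vector3 wallNormal)
    {
        // Project the grabbed "up" direction onto the wall plane, then
        // rebuild the rotation so the frame stays parallel to the wall.
        Vector3 grabbedUp = grabRotation * Vector3.up;
        Vector3 upOnWall = Vector3.ProjectOnPlane(grabbedUp, wallNormal);
        if (upOnWall.sqrMagnitude < 1e-6f) // degenerate: grab up == normal
            upOnWall = Vector3.ProjectOnPlane(Vector3.up, wallNormal);
        return Quaternion.LookRotation(wallNormal, upOnWall.normalized);
    }
}
```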
04. PHYSICS: DYNAMIC LAYER MASKING
To improve usability, I wrote FrameCollisionToggle.cs. This script uses Physics.IgnoreCollision() to temporarily disable collider interactions only while a specific frame is being grabbed. This allows users to slide a frame "through" others to organize them, with collisions reactivating immediately upon release.
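A sketch of that toggle, assuming frames carry a shared "Frame" tag and the method is wired to the XR Interaction Toolkit's selectEntered/selectExited events (both the tag and the wiring are assumptions, not the project's exact setup):

```csharp
using UnityEngine;

// Illustrative sketch of the grab-time collision toggle.
public class FrameCollisionToggle : MonoBehaviour
{
    Collider myCollider;

    void Awake() => myCollider = GetComponent<Collider>();

    // Call with true on grab, false on release.
    public void SetIgnoreOtherFrames(bool ignore)
    {
        foreach (GameObject frame in GameObject.FindGameObjectsWithTag("Frame"))
        {
            if (frame == gameObject) continue;
            Collider other = frame.GetComponent<Collider>();
            if (other != null)
                Physics.IgnoreCollision(myCollider, other, ignore);
        }
    }
}
```

Wall and floor colliders are untouched, so the held frame still respects room boundaries while sliding "through" its neighbours.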
/// SOLUTION LOGIC: PREEMPTIVE OVERLAP CHECK ///
// VISUAL DOCUMENTATION //
01. Plane Detection
ARPlaneManager visualizing vertical surfaces; early detection was imprecise.
02. Spawn Logic
Raycast hits accidentally triggering perpendicular instantiation.
03. Asset Pipeline
Learning to retexture imported .obj frames with Unity-native materials.
04. UI Integration
Creating a scrollable menu with transparent PNG thumbnails.
[ THE TECHNICAL IMPACT ]
Key milestones achieved during development.
ASSET OPTIMIZATION
Successfully integrated custom .obj frames from Blender, fixing scale and pivot issues. Replaced external textures with Unity-native materials (Gold/Glass) to resolve rendering compatibility issues.
COLLISION STABILITY
The implementation of preemptive OverlapBox checks completely eliminated the overlap issues caused by spawning objects inside one another, a common issue in physics-based AR apps.
PROJECT REFACTORING
Rebuilt the entire prefab frame system to turn a collection of individual objects into a proper, inheritable prefab system. This allows for rapid iteration and easy scalability for future assets.
// PROJECT STATUS //
Current Status: CONTINUING PRODUCTION.
Next Phase: Implementing 90-degree snap rotation logic.
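One possible shape for that planned snap logic (a sketch of the idea, not an implemented feature): round the frame's in-plane twist angle to the nearest 90-degree increment on release.

```csharp
using UnityEngine;

// Illustrative sketch of 90-degree snap rotation: quantize an
// in-plane angle to the nearest increment.
public static class SnapRotation
{
    public static float SnapAngle(float angleDegrees, float increment = 90f)
    {
        return Mathf.Round(angleDegrees / increment) * increment;
    }
}
```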