I am a highly driven XR Developer focused on leveraging immersive technologies to address real-world challenges. With a solid background in UI/UX design, I create intuitive, user-centered experiences that blend functionality with innovation. My work is guided by a commitment to meaningful problem-solving and a forward-thinking approach to shaping the future of virtual reality.
Ambitious & Dedicated – Stands out with unique ideas and attention to detail in XR development.
Technically Skilled – Expertise in Unity, Blender, rigging, animation, and 3D modeling.
High-Quality Work – Exceeds expectations in delivering immersive experiences with realistic textures and fabrics.
Innovative & Adaptable – Improves workflows, enhances interactivity, and stays updated with industry trends.
Great Team Player – Excellent communicator, shares knowledge, and fosters collaboration.
Passionate & Professional – Always goes the extra mile, ensuring top-tier results in VR/MR projects.
The idea for MARK emerged when I moved into a new house and struggled with tasks like hanging picture frames and installing shelves in perfect alignment. With just a tape measure, manually marking the walls at different heights felt tedious and inefficient. During the fourth week of my XR Bootcamp, I was tasked with choosing a solo project idea to prototype for my portfolio. Out of the five concepts I brainstormed, my mentor selected WallAlign, as MARK was originally called, recognizing its potential to solve a common real-world problem.
During the bootcamp, I developed the app's two key features, and after the program ended, I was motivated to expand its functionality further. Testing the app with friends—especially artists and carpenters—yielded overwhelmingly positive feedback, confirming its practical utility and inspiring further development.
Precise Alignment Assistance: Enables users to project and mark precise points for hanging items or creating level surfaces.
Line Drawing Tools: Users can draw, adjust, and mirror lines on walls, ensuring symmetry and alignment.
Dot Projection: Instantly place and adjust dots for marking positions on walls (see the sketch after this list).
User-Friendly Menu: The menu UX was designed by analyzing the natural resting position of the human hand, optimizing for minimal movement and comfortable access angles.
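To give a sense of the core mechanic behind dot projection, here is a minimal sketch of placing a marker where the controller's ray meets a surface. It is illustrative only, not the app's actual code: the rayOrigin and dotPrefab fields and the plain Physics.Raycast against room colliders are assumptions.

```csharp
using UnityEngine;

// Illustrative sketch (not MARK's actual implementation): spawn a dot
// marker where the controller's ray hits a wall collider.
public class DotProjector : MonoBehaviour
{
    public Transform rayOrigin;   // e.g., the right controller's pose (assumed)
    public GameObject dotPrefab;  // hypothetical small sphere or decal prefab

    public void PlaceDot()
    {
        // In MR, the wall is typically a collider generated from the
        // scanned room model; any collider works for this sketch.
        if (Physics.Raycast(rayOrigin.position, rayOrigin.forward, out RaycastHit hit))
        {
            // Place the marker flush against the surface, facing outward.
            Instantiate(dotPrefab, hit.point, Quaternion.LookRotation(hit.normal));
        }
    }
}
```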
Meta SDK Proficiency: Prior to this project, I had no experience with the Meta SDK or Mixed Reality development. I quickly adapted by completing XR Bootcamp's Udemy course, "Build Your MR Game & Publish it on Meta's Horizon Store," which enabled me to work confidently on this prototype.
Line Renderer Mastery: Overcame challenges with local and global coordinates in line rendering, gaining in-depth knowledge of Unity's LineRenderer (see the sketch after this list).
Menu UX Design: Designed a menu optimized for comfort by analyzing the natural resting position of the hand and minimizing movement and angle requirements.
Canvas Layering: Resolved complex issues with managing canvas layers and UI element hierarchies, further improving my Unity UI skills.
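As a concrete illustration of the coordinate pitfall mentioned above: Unity's LineRenderer treats its positions as local coordinates unless useWorldSpace is enabled, so world-space points must either be converted or the renderer switched to world space. A minimal sketch, with wallHitPoints as a hypothetical array of points sampled on a wall:

```csharp
using UnityEngine;

// Minimal sketch of the local-vs-global coordinate issue: feeding
// world-space points to a local-space LineRenderer draws the line in
// the wrong place whenever its transform is not at the origin.
public class WallLine : MonoBehaviour
{
    public LineRenderer line;

    public void DrawInWorldSpace(Vector3[] wallHitPoints)
    {
        // Option 1: let the renderer interpret positions as world coordinates.
        line.useWorldSpace = true;
        line.positionCount = wallHitPoints.Length;
        line.SetPositions(wallHitPoints);
    }

    public void DrawInLocalSpace(Vector3[] wallHitPoints)
    {
        // Option 2: stay in local space, but convert each world point first.
        line.useWorldSpace = false;
        line.positionCount = wallHitPoints.Length;
        for (int i = 0; i < wallHitPoints.Length; i++)
        {
            line.SetPosition(i, line.transform.InverseTransformPoint(wallHitPoints[i]));
        }
    }
}
```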
AI Image Search Integration: Allow users to project AI-generated images onto walls for creative tracing, enhancing artistic projects.
Extended Object Detection: Use the camera API to detect planes on other materials, such as wood and paper, expanding functionality beyond walls.
Tool Expansion Based on Feedback: Incorporate additional tools and features based on user feedback from artists and carpenters to better suit their needs.
Shared Collaboration: Implement shared experiences, enabling multiple users to collaborate in real-time on projects for enhanced interactivity and usability.
The XR Audio School is an innovative MVP designed to teach audio skills through engaging and imaginative Mixed Reality simulations. Developed during a 2-week sprint as part of XR Bootcamp, the app leverages Meta SDK v65 and the Mixed Reality Utility Kit (MRUK). Using the "Scene Understanding" and "Room Model" features, it dynamically builds the simulation around the user's actual room geometry and furniture, creating a personalized experience that adapts to each individual space. This prototype highlights the potential of Mixed Reality to redefine skill-building and education.
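For a sense of how an app hooks into those features, the sketch below shows the common MRUK pattern of waiting for the scene model to load and then iterating over the room's anchors (walls, furniture, and so on). This is a simplified illustration rather than the project's actual code, and exact signatures may differ between MRUK versions.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Simplified illustration of the MRUK pattern the app builds on:
// wait for Scene Understanding data, then walk the room's anchors
// to position content around the user's real space.
public class RoomSetup : MonoBehaviour
{
    void Start()
    {
        // MRUK invokes this callback once the room model has loaded.
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    void OnSceneLoaded()
    {
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();
        foreach (MRUKAnchor anchor in room.Anchors)
        {
            Debug.Log($"Scene anchor found: {anchor.name}");
        }
    }
}
```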
Immersive Mixed Reality Environment: The app anchors content to the user's room geometry and furniture using the Meta SDK's "Scene Understanding" and "Room Model" features, creating a personalized and unique learning environment for each user.
Interactive Audio Learning: Users can manipulate multitrack audio and experiment with real-time audio editing tools, gaining hands-on experience with professional techniques.
Rope Simulation for Realism: A rope simulation feature adds an element of tactile interaction, enhancing the learning experience.
Narration and Guided Learning: A built-in curriculum, narrated by Sky Deep, provides step-by-step guidance to teach users essential audio skills.
Dynamic Visuals and Effects: Engaging visuals, such as wall destruction effects and lighting animations, create an exciting and imaginative simulation that makes learning enjoyable.
The MVP was a collaborative effort made possible by the contributions of:
Me:
- Visualized the UX flow to ensure intuitive and seamless user interactions.
- Modified and optimized 3D assets to enhance the virtual environment.
- Designed the branding and a user-friendly UI aligned with the app's creative direction.
- Imported and implemented design assets from Figma, streamlining the development pipeline.
Sky Deep: Developer, Project Lead, Creative & Branding Concept, Curriculum Design, Narration Voice, Rope Simulation Configuration, and Audio Programming.
Ray Ng Jun Hui: Lead Developer, Module Programming, and GitHub Management.
Bernard Masika: 3D Asset Modification, Wall Destruction Effect, Lighting Design, NPC Import, and Animations.
Enhanced my knowledge of multitrack audio manipulation using Koreographer Pro to create immersive audio experiences (see the sketch after this list).
Gained a deeper understanding of the Mixed Reality Utility Kit (MRUK) and its functionalities for designing adaptive environments.
Overcame challenges in scheduling and communication, which provided valuable experience in remote team collaboration.
Improved proficiency in branching and merging workflows through extensive use of Git and GitHub.
Refined skills in UI design for Mixed Reality, focusing on intuitive and user-friendly interfaces.
Learned how to import a large number of Figma files directly into Unity, streamlining the design integration process.
Explored hand grab options using the Meta SDK, particularly for controller-driven hands and avatars, identifying areas for future improvement and iteration.
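To give a flavor of the Koreographer workflow mentioned in the first item of this list, here is a hedged sketch of the asset's standard event-registration pattern. The "Beat" event ID and the scale-pulse reaction are invented for illustration, and the API surface may differ slightly by version.

```csharp
using SonicBloom.Koreo;
using UnityEngine;

// Hedged sketch of Koreographer's typical usage: subscribe to events
// authored on an audio track and react in sync with playback.
public class BeatReactor : MonoBehaviour
{
    void Start()
    {
        // "Beat" is a hypothetical event ID authored in a Koreography asset.
        Koreographer.Instance.RegisterForEvents("Beat", OnBeat);
    }

    void OnBeat(KoreographyEvent evt)
    {
        // React to each authored beat event, e.g. pulse this object's scale.
        transform.localScale = Vector3.one * 1.2f;
    }

    void OnDestroy()
    {
        // Clean up the subscription when this object goes away.
        Koreographer.Instance.UnregisterForEvents("Beat", OnBeat);
    }
}
```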
Enhanced Audio Modules: Expand the app with additional audio curriculum modules, covering advanced mixing, mastering techniques, and specialized genres.
Advanced Interaction Capabilities: Introduce more immersive interactions, such as voice-controlled commands and gesture-based audio editing tools.
Collaborative Learning Spaces: Enable multi-user environments where students can collaborate on audio projects in real time.
AI-Powered Feedback: Implement AI-driven analysis to provide constructive feedback on user-created audio projects, helping users improve their skills effectively.
Haptic Feedback Enhancements: Integrate more detailed haptic responses to simulate realistic audio equipment interactions, such as turning knobs and adjusting sliders.
A Virtual Reality Educational Prototype

As part of the second 2-week MVP sprint during XR Bootcamp, my team and I developed Labify VR, an innovative VR app designed to make learning cell biology engaging and interactive. Labify VR immerses users in a fully virtual world where they can explore various types of cells, navigate their parts, and learn about their functions through interactive labeling, gamified quizzes, and a welcoming 3D guide. This experience leverages the spatial and immersive capabilities of VR to provide a deeper understanding of cell biology.
Teacher Talking Head: A 3D guide helps users understand the objectives and steps.
Interactive Learning:
- Explore different types of cells, including plant cells, animal cells, bacterial cells, and blood cells.
- Interact with organelles (e.g., plant cell organelles) by selecting different parts to learn their names and functions.
Video & Audio Learning: Educational videos with audio explain key concepts.
Gamified Quiz: Reinforces learning by quizzing users on organelles and their functions.
User-Friendly Design: Includes a main menu and tutorial for easy navigation.
The MVP was brought to life through the collaborative efforts of:
Me:
- Implemented grabbable organelles for an interactive experience.
- Developed socket interactions with haptics for tactile feedback (see the sketch after this list).
- Worked on the quiz logic to create engaging gamified assessments.
Jumana S: Lead Developer & Project Lead.
Iris W: Splash Screen Design and Scene Lighting Optimization.
Yongwoo P: App Introduction Tutorial and Menu UI Design.
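As an illustration of the socket-plus-haptics interaction listed under my contributions, here is a simplified sketch using Meta's OVRInput vibration call. The trigger-zone approach and all names are assumptions; the project itself used VRIF's socket components rather than this code.

```csharp
using System.Collections;
using UnityEngine;

// Simplified illustration (not the VRIF-based project code): fire a short
// haptic pulse on the right controller when an organelle snaps into a socket.
public class OrganelleSocket : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Organelle")) // hypothetical tag
        {
            // Snap the organelle into the socket's pose.
            other.transform.SetPositionAndRotation(transform.position, transform.rotation);
            StartCoroutine(HapticPulse(0.1f));
        }
    }

    IEnumerator HapticPulse(float duration)
    {
        // OVRInput vibration: frequency and amplitude in the [0, 1] range.
        OVRInput.SetControllerVibration(0.5f, 0.8f, OVRInput.Controller.RTouch);
        yield return new WaitForSeconds(duration);
        OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
    }
}
```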
Improved understanding of VR Interaction Framework (VRIF), particularly its scripts and interactions.
Gained experience in creating procedural UI panels for dynamic interfaces (see the sketch after this list).
Learned to optimize performance through lightmap baking for realistic yet efficient lighting.
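A minimal sketch of what "procedural UI panels" means in practice: building a quiz answer panel from data at runtime instead of hand-placing each button in the editor. All names here are illustrative, and the prefab/layout setup is assumed.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: populate an answer panel at runtime from data.
// Assumes a button prefab with a Text child and a parent that has a
// VerticalLayoutGroup to handle positioning.
public class AnswerPanelBuilder : MonoBehaviour
{
    public Button answerButtonPrefab; // hypothetical prefab
    public Transform panelRoot;       // parent with a layout group

    public void Build(string[] answers)
    {
        foreach (string answer in answers)
        {
            Button button = Instantiate(answerButtonPrefab, panelRoot);
            button.GetComponentInChildren<Text>().text = answer;
            button.onClick.AddListener(() => Debug.Log($"Selected: {answer}"));
        }
    }
}
```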
Expand features to include additional cells and organelles.
Create an advanced lab environment to explore the human body, organs, and systems.
Add more gamified quizzes with a student database and progress tracking.
Enable multiplayer functionality.
sahithnayudu.xr@gmail.com