GuruVR Metaversity

An XR-first immersive learning platform that turns abstract engineering concepts into hands-on, gamified experiences in VR.

GuruVR Metaversity Hero

Role

XR Design Intern

Team

Solo designer, 5 developers, 3D artists

Timeline

5 months (2024–25)

Tools

Unity, Prototyping, Design Systems

Team Role

What we owned

As the XR Design team at FireBirdVR, we ran the full design pipeline — from ground-up research to Unity prototyping — across 5 months of agile sprints.

User Research

Survey of 92 students, focus group discussions, competitor benchmarking, affinity mapping.

Platform Architecture

User-specific mind maps, information architecture across VR, desktop, and mobile.

Brand & Design System

Logo, color palette, typography, full component library.

Cross-Platform UX

Onboarding flows, dashboards, gamified interfaces.

XR Module Design

Narrative design, spatial UI, hand-tracked interactions, Unity scene prototyping.

Testing & Iteration

Usability sessions on Meta Quest 2, think-aloud protocol, heuristic evaluation.

Problem

Engineering education is broken in one specific way

Indian engineering education relies heavily on theoretical lectures and outdated teaching methods, leaving a significant gap in hands-on learning and industry-relevant skills. With limited access to practical sessions and insufficient guidance from under-trained faculty, students are often forced into a self-driven learning approach.

Key Statistics

  • 70% — Theory-heavy curriculum (only 30% practice)
  • 68% — Lack access to hands-on learning
  • 63% — Faculty unprepared for XR
  • 71% — Cite high implementation costs

Process

Timeline & deliverables

Five months, one sprint board, zero handoff errors.

Timeline

Jan–May, in order:

  • Secondary Research
  • Primary Research (Survey)
  • Persona + Affinity Mapping
  • User Mind Map
  • Information Architecture (IA)
  • Branding & Website Design
  • Dashboard & Gamification UX
  • Onboarding UX (XR/Web/Mobile)
  • XR Module Design
  • Testing & Iteration
  • Grad Book

Key Deliverables

Brand identity

Complete design system & logo

Website UX

All multi-platform onboarding flows

4 Dashboards

Role-based views for 4 segments

Gamification

Reward & engagement blueprint

Unity Prototype

Logic Gates Mystery Island module

Research Docs

Personas, survey & usability reports

Framework

The CLPE-A learning model

Before designing a single screen, we built the instructional skeleton. Every XR scene maps to this five-stage loop — it's what keeps learning intentional, not accidental.

C

Concept

Holographic voiceover introduces the topic with zero pressure to act

L

Learning

Tooltips and VO tell you exactly how to interact

P

Practice

Learner performs the task — pull the lever, insert the orb

E

Experiment

Interactive quiz challenges test understanding in-scene

A

Assessment

Final ritual — apply all knowledge independently to escape

Learning theories it's built on

  • Constructivism: Build knowledge through doing. Learners construct understanding through real interactions — activating a circuit, not watching one.
  • Cognitive Load: One temple. One gate. One idea. Gamified metaphors carry the abstract weight so working memory stays clear.
  • Kolb's Cycle: Experience → Reflect → Apply. Every temple follows Kolb: hands-on interaction → quiz feedback → UI guidance.
  • Bloom's Taxonomy: From Remember to Create. The module scaffolds from remembering facts to creating knowledge.
Research

Research & discovery

We ran both primary and secondary research before touching any design tool. The goal was to validate assumptions with real data, not gut feel.

Primary Research

Survey (92 Engineering Students): Multi-institution quantitative survey. Covered XR exposure, learning preferences.

  • 85% — Never used XR in academics
  • 91% — Believe a hands-on focus improves retention
  • 83% — Believe XR will define the future of education

User Interview

Qualitative Research

Interviews & Focus Groups: One-on-one and group sessions with students and faculty.

Students lost in abstract theory — need visual anchors

Faculty not opposed to XR — just anxious about the curve

Navigation in VR was a consistent friction point

Focus Group Study

Affinity Mapping

Competitor Analysis

Evaluated Labster, iXR Labs, Coursera, and others on dimensions like gamification, XR narrative, Indian curriculum alignment, and cross-platform capability. GuruVR addresses clear white spaces in immersive story-driven onboarding and AI integrations.

How might we build a scalable XR platform that bridges the gap between theoretical engineering education and hands-on industrial expertise, while ensuring the UX is intuitive for both first-time VR students and non-technical faculty?

Platform Design

Platform architecture

GuruVR Metaversity is designed as a cross-platform XR ecosystem that operates seamlessly across VR headsets, desktop, and mobile. The XR-first approach ensures an immersive core with responsive layers for wider accessibility.

Creators

Educators, 3D artists, and module developers building the immersive curriculum.

Users

Students · Faculty · Corporate · Government

Architecture Layers

VR Core

Meta Quest 2 · Spatial UI · Hand-tracking. The primary experience.

Desktop

Dashboard UX · Course Management. Full feature access without a headset.

Mobile

Progress Tracking · Notifications. Adaptive touch UI for on-the-go.

The Spatial Campus

Rather than a menu system, GuruVR uses a 3D campus metaphor. You navigate between zones — not pages.

Campus views: Social View · Main Gate · Lab View · Classroom · UI View · Campus View · Module View · Detailed View

Welcome Atrium

Onboarding + avatar

Dept. Labs

EEE / Mech / CS

XR Classrooms

Live + async sessions

Peer Zones

Avatar interaction

Corp. Pavilion

Sponsors + recruitment

Onboarding & Storyboards

Onboarding Flow
Module Storyboard
UI/UX Design & Strategy

Design system

One design system to rule VR, desktop, and mobile. Lora brings the academic weight. Mulish keeps the UI sharp. The palette works in pitch-black VR environments as well as full-light web.

Role-based dashboards

Each role gets a dashboard designed around their actual job:

Student

Progress tracking, XR Launchpad, AI Tutor

Faculty

Engagement analytics, classroom management

Creator

Creator Studio, concept tagging

Corporate

XR safety training, HR sync

Reward & gamification mechanics

Gamification without overuse. Every element earns its place by driving a specific behaviour.

XP Points

Earned through quizzes and tasks

GuruCoins

In-platform currency for avatar perks

Badges

Milestone markers for module completion

Streaks

Engagement chains

Leaderboards

Rankings driving social motivation

Reward System
Gamification Ecosystem

AI Integration — Gyaanix

Contextual AI assistant embedded across VR, desktop, and mobile. Summonable at any moment. Voice and text responses, concept explanations, and adaptive recommendations.

XR Learning Module

Logic Gates Mystery Island

The module that proves everything. Seven interconnected VR scenes. Each temple teaches one logic gate — through puzzles, not lectures. The final ritual forces you to use all five gates to escape.

Scene by scene

01. Crash Site

Orientation — no pressure to perform yet.

02. Temple of Unity (AND Gate)

Pull both levers simultaneously to activate the gate.

03. Temple of Acceptance (OR Gate)

Step on pressure pads in different combinations.

04. Chamber of Inversion (NOT Gate)

Insert an orb into the gate — the inverse shoots out as a beam.

05. Temple of Divergence (XOR Gate)

Select mismatched input pairs to activate the gate.

06. Tower of Equality (XNOR Gate)

Align both inputs to match — the platform rises.

07. Final Ritual

Build one mega-circuit from gate tokens. Unlock the escape portal.

Interactions & spatial UI

Implemented hand-tracked lever pulls (Gesture), input selections at circuit nodes (Drag & Drop), accessible quiz confirmation (Gaze & Hold), and truth tables with circuit feedback (Floating UI).
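The gaze-and-hold confirmation works as a dwell timer: gaze accumulates toward a threshold and resets when it leaves the target. A hedged Python sketch of that logic — the real implementation uses Unity's XR Interaction Toolkit, and the class, threshold, and per-frame API here are illustrative assumptions:

```python
class DwellTimer:
    """Toy model of gaze-and-hold selection: gaze time accumulates toward
    a dwell threshold and resets the moment gaze leaves the target."""

    def __init__(self, dwell_seconds=1.5):
        self.dwell_seconds = dwell_seconds  # hold duration before confirming
        self.elapsed = 0.0

    def update(self, gazing: bool, dt: float) -> bool:
        """Call once per frame with gaze state and frame time.
        Returns True once accumulated gaze reaches the dwell threshold."""
        if not gazing:
            self.elapsed = 0.0  # looking away cancels the selection
            return False
        self.elapsed += dt
        return self.elapsed >= self.dwell_seconds
```

Dwell-based confirmation was the accessibility fallback: it needs no controller button and no precise hand pose, only sustained attention.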

Prototyping in Unity

All 7 scenes were prototyped and tested in Unity and deployed on Meta Quest 2 via SideQuest. We collaborated directly with the dev team to translate design intent into real XR behaviour using Unity (C#), the Meta XR SDK, the XR Interaction Toolkit, and Photon Fusion.

Unity Prototype
Testing & Feedback

We tested it. Then fixed it.

Three rounds of UX evaluation with first-year engineering students — in VR, on actual Meta Quest 2 hardware. What we found shaped everything about the final experience.

8.5 min

Average completion time (down from 11 min)

21%

First-attempt error rate (down from 38%)

What we found & fixed

Floating Truth Tables

Repositioned height & angle of floating UI panels to prevent them from being missed.

NOT Gate Chamber

Added Gyaanix voice prompt at entry point to reduce confusion around mirror mechanics.

Lever Interactions

Improved haptic pulse and added visual confirmation glow upon successful activation.

Quiz Failure

Implemented an adaptive logic hint system when users fail multiple quiz attempts.
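The adaptive hints escalate with repeated failure rather than firing all at once. The source only names the feature, so this Python sketch is an assumed escalation ladder — tiers, thresholds, and hint wording are all illustrative:

```python
# Hypothetical escalation ladder for quiz hints. The tiers and thresholds
# below are illustrative assumptions, not taken from the shipped module.
HINT_TIERS = [
    (1, "Highlight the relevant truth table row"),       # after 1st failure
    (2, "Gyaanix voice prompt explaining the gate rule"),  # after 2nd failure
    (3, "Animate the correct input combination once"),   # after 3rd failure
]

def hint_for(failed_attempts: int):
    """Return the strongest hint unlocked by the failure count, or None."""
    unlocked = [hint for threshold, hint in HINT_TIERS
                if failed_attempts >= threshold]
    return unlocked[-1] if unlocked else None
```

Escalating in steps keeps the first failure cheap — learners get a nudge, not the answer — while guaranteeing nobody stays stuck in a temple indefinitely.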

"Felt like a puzzle adventure, not just a class."

"Voiceover tips from Gyaanix made me feel guided but not forced."

Final Prototype

The Final Experience

The final developed prototype is an immersive, multi-user VR experience built for the Meta Quest 2. It integrates the spatial logic gate module and AI-driven guidance via Gyaanix.

Hardware

Optimized for Meta Quest 2 / Quest 3 with hand-tracking and controller support.

Tech Stack

Built with Unity, Meta XR SDK, and Photon Fusion for real-time collaboration.

Reflection

What we actually learned

Not a list of achievements. Genuine things that changed how we design — especially in spatial contexts.

Design

Immersive narrative unlocks engagement that UI alone can't. Gamification only works when it's earned.

Technical

VR development demands constant trade-offs between visual quality and frame rate. Gesture-based affordances need to feel like natural instinct.

User-Centric

Real users never follow the ideal path. UI placement in 3D space must adapt to user height, facing direction, and where attention actually goes.

Final thought

"This project wasn't just about designing an XR module — it was a deep exploration into how technology, storytelling, and education converge. GuruVR Metaversity has potential beyond logic gates. It offers a new blueprint for how India's technical education system can evolve — more experiential, inclusive, and future-ready."

— Shubhanshu Sahu

Previous Project

FlytBase

Next Project

RGZP Zoo Systems