Case Studies · 8 Scenarios

REAL PROBLEMS.
REAL SOLUTIONS.

Eight representative scenarios across technical animation, AR/XR, exploded renders, and training visuals — how Vidhaath solves real engineering communication problems.

8 Case Scenarios
Across technical animation, AR/XR, exploded renders, and training visuals

CAD-First Approach
Every scenario starts from actual engineering data — not creative approximations

Hybrid Workflow Notes
Each case study documents exactly where AI was used and where it was not
CS 01 · Technical Animation · CAD-Driven · Industrial Automation & Robotics

WHEN THE SALES TEAM
COULD NOT EXPLAIN THE ROBOT

A patented wrist assembly that reduced cycle time by 23% — invisible inside every brochure.

The Problem

A robotics manufacturer had developed a six-axis articulated arm for automotive assembly lines. The mechanism was genuinely innovative — a patented wrist assembly that reduced cycle time by 23%. But every sales presentation stalled at the same point: buyers could not visualise how the wrist moved through tight tolerances in a live cell. The product brochure showed the arm at rest. It could not show the thing that made it worth buying.

What We Built
  • 90-second mechanism animation showing the wrist assembly in full articulation, built directly from CAD data
  • Assembly cell context — robot operating within a realistic conveyor and fixture environment
  • Speed variant renders showing normal-speed operation and a slow-motion breakdown of the wrist joint path
  • Cutaway layer revealing internal joint geometry at the 47-second mark
Outcomes
  • 3× longer engagement during sales demos
  • 12 days from CAD file to final delivery
  • Zero engineering corrections required after first review
Hybrid Workflow Note

CAD geometry was prepared and cleaned in Blender. AI-assisted texture synthesis accelerated the creation of realistic surface materials for the fixture environment — cutting asset production time by roughly half. All mechanical motion paths were hand-keyed by a mechanical engineer against the actual CAD joint limits. AI was used on the environment; engineering accuracy was never delegated to it.

CS 02 · WebXR Product Catalog · No App Required · Consumer Appliances · India Market
Capability Demo

A PRODUCT CATALOG THAT FITS IN
A SALES REP'S POCKET

Distributor network across Tier-2 cities — customers making Rs 40,000 decisions based on a photograph.

The Challenge

A premium kitchen appliance brand was expanding its distributor network across Tier-2 cities. Physical showrooms were expensive to set up and inconsistent in presentation quality. Distributors were selling off printed catalogues — and customers were making Rs 40,000 purchase decisions based on a photograph. The brand needed its products in front of customers at full scale, in context, wherever the distributor happened to be.

What We Built
  • Web-based AR catalog — 4 flagship products viewable at full scale on any Android or iOS device, no app install required
  • Each product explorable in AR: users could walk around, inspect details, and switch colour variants in real space
  • Hosted on a single URL the distributor could share as a WhatsApp link before any customer meeting
  • Deployed on Vidhaath's own AR catalog infrastructure
Outcomes
  • Zero app installs required — runs in Android Chrome and iOS Safari
  • One shareable URL — distributed as a WhatsApp link by any distributor
  • Full-scale product visible in the customer's own kitchen before purchase
Hybrid Workflow Note

Product 3D models were built from manufacturer CAD files. AI-assisted PBR material generation produced realistic surface finishes for appliance-grade stainless and tempered glass — materials that traditionally require hours of manual shader work. WebXR integration, lighting rigs, and interaction logic were all coded by hand. The result: photorealistic material quality at a fraction of the production time, with full technical control retained over the AR behaviour.

CS 03 · Exploded View Renders · Technical Illustration · Medical Devices & Equipment

MAKING A 47-PART SURGICAL
INSTRUMENT TEACHABLE

Technicians were getting the assembly sequence wrong in the field. The training material was from 2011.

The Problem

A surgical instrument manufacturer needed to train distributor service technicians on a complex sterilisation-compatible biopsy forceps — 47 components, three sub-assemblies, and a spring mechanism that failed if reassembled in the wrong sequence. Their existing training material was a 14-page PDF with hand-drawn diagrams from 2011. Technicians were getting the sequence wrong in the field.

What We Built
  • Fully exploded photorealistic render — every component floating at correct axial distance, numbered and annotated
  • Three sub-assembly groupings shown in separate layered renders
  • Critical spring mechanism highlighted with a cutaway view showing correct and incorrect seating positions
  • Print-ready at A2 and web-optimised versions for digital training manuals
Outcomes
  • All 47 components visible, accurately positioned, in a single render
  • Three sub-assembly views eliminating ambiguity about grouping
  • Print and digital formats delivered from one production pass
Hybrid Workflow Note

Each of the 47 components was modelled from engineering drawings. AI-assisted rendering (denoising and lighting estimation) was used to achieve final render quality in a fraction of the compute time that traditional path-tracing requires. Component annotation layouts were designed manually — automated placement tools consistently placed labels over critical geometry, defeating the purpose. Hybrid approach: AI accelerated render quality; human judgment controlled what the viewer needed to see.

CS 04 · Mechanism Animation · CAD-Driven · Manufacturing Machinery

EXPLAINING A 400 RPM TURRET
INDEXING SYSTEM TO PLANT MANAGERS

The innovation was inside the housing. The housing was closed during operation.

The Problem

A CNC machine tool manufacturer had developed a new turret indexing system that was quieter, faster, and 40% more accurate than the industry standard. Every technical evaluation cleared — but plant managers signing off on capital expenditure kept asking the same question: "Can you show me exactly how it works?" The innovation was inside the housing. The housing was closed during operation.

What We Built
  • 60-second mechanism animation — housing fades to reveal the indexing cam, Hirth coupling, and servo drive in motion
  • Speed-controlled playback section: full-speed operation followed by 8× slow motion showing coupling engagement
  • Comparison panel showing the competitor mechanism alongside, highlighting the additional contact points
  • Versions cut for 30-second trade show loop and full-length sales presentation
Outcomes
  • Full mechanism revealed in 60 seconds without removing the physical housing
  • 8× slow-motion breakdown of the Hirth coupling engagement
  • Two cut versions: 30-second trade show loop + full sales presentation edit
Hybrid Workflow Note

Mechanism motion paths were defined entirely from CAD data — the cam profile, coupling geometry, and servo rotation were not approximated or artistically interpreted. AI tools were used in post-production: automated scene colour grading and motion blur enhancement to make the slow-motion segment feel cinematic without additional render passes. No AI was involved in defining the mechanical motion itself — that required engineering judgement about what a plant manager needed to understand, not what looked impressive.

CS 05 · Industrial Training Visuals · Step-by-Step Renders · Defence & Aerospace · MRO

REPLACING A 200-PAGE MAINTENANCE MANUAL
WITH VISUALS TECHNICIANS ACTUALLY USE

New technicians were taking 4–6 weeks to become proficient. Errors during the 14-step procedure were the leading cause of rework.

The Problem

An aerospace MRO facility was onboarding technicians onto a complex hydraulic actuator overhaul procedure. The existing process documentation was a 200-page PDF written in 1998, with 2D engineering drawings that took a trained eye to interpret. New technicians were taking 4–6 weeks to become proficient. Errors during the 14-step seal replacement procedure were the leading cause of rework.

What We Built
  • 14 step-by-step 3D renders — each step a photorealistic scene showing correct tool position, component orientation, and torque direction
  • Critical failure points visualised: correct seal seating vs. the two most common incorrect positions
  • Isometric overview render showing the full actuator with numbered call-outs linking to each step
  • Formatted for tablet display in the MRO bay — no paper, no scrolling through PDFs
Outcomes
  • 14 procedure steps, each with a dedicated render
  • Correct vs. incorrect seal seating shown side by side
  • 200-page PDF replaced with a 14-screen tablet sequence
Hybrid Workflow Note

Step sequencing was defined in collaboration with the MRO facility's lead technician — the person who knew exactly where new recruits went wrong. AI-assisted depth-of-field rendering and environment lighting were used to draw visual attention to the active component in each step, replacing the need for manual highlight overlays in post-production. The "what to look at" decisions — the engineering and pedagogical judgement — stayed human throughout. AI accelerated how we communicated those decisions visually.

CS 06 · AR on Any Phone · WebXR · No App · Manufacturing Machinery · Trade Exhibition

A 1,200KG COMPRESSOR AT FULL SCALE —
ON ANY PHONE AT ANY STAND

Their flagship compressor weighed 1,200kg and cost Rs 18 lakhs to transport. There had to be a better way.

The Problem

A compressed air systems manufacturer was exhibiting at an industrial trade fair. Their flagship compressor unit weighed 1,200kg and cost Rs 18 lakhs to transport and install at a booth. Their competitors were bringing smaller demo units. The manufacturer needed a way to show the full-scale, full-specification machine — without the logistics bill and without asking buyers to imagine it from a brochure.

What We Built
  • WebXR AR experience — the 1,200kg unit placed at full scale on any floor surface via phone browser, no app download
  • Interactive component labels: tap any major component to reveal its specification and function
  • Internal cutaway mode — single tap removes the outer casing to reveal the compression chamber and drive train
  • Distributed to every sales rep as a single URL — usable at any customer site, not just the trade stand
Outcomes
  • 1,200kg machine at full scale — on any trade show floor, any customer site
  • Zero app installs — browser-based, shareable as a URL
  • Cutaway mode showing internal architecture without a physical demonstration unit
Hybrid Workflow Note

The 3D model was built from the manufacturer's CAD assembly. AI-assisted PBR material generation produced the industrial paint, cast iron, and rubber hose surface finishes needed for photorealism at trade show scale. The AR session management, hit-test surface detection, and interaction logic were hand-coded using the WebXR Device API. IBL lighting was loaded from a custom HDR map of a typical exhibition floor. AI handled materials; engineering and code handled everything the customer interacted with.
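The "no app, any phone" flow above rests on the browser's WebXR Device API. As an illustrative sketch only — the production code is Vidhaath's, and the `startAR` and `placeModelAt` names are invented here — the core session and hit-test loop looks roughly like this. One practical detail worth noting: CAD assemblies are commonly exported in millimetres, while WebXR space is in metres, so a unit conversion is needed before the 1,200kg unit appears at true scale.

```javascript
// Illustrative sketch of a WebXR "place at full scale" flow.
// startAR must be called from a user gesture (e.g. a "View in AR" button).

// CAD exports are commonly in millimetres; WebXR space is in metres.
function mmToMetres(mm) {
  return mm / 1000;
}

async function startAR(placeModelAt) {
  // placeModelAt(position) is the renderer-side callback (e.g. Three.js scene update)
  const session = await navigator.xr.requestSession('immersive-ar', {
    requiredFeatures: ['hit-test'],
  });
  const refSpace = await session.requestReferenceSpace('local');
  const viewerSpace = await session.requestReferenceSpace('viewer');
  const hitTestSource = await session.requestHitTestSource({ space: viewerSpace });

  session.requestAnimationFrame(function onFrame(time, frame) {
    const hits = frame.getHitTestResults(hitTestSource);
    if (hits.length > 0) {
      // The first hit is the nearest detected floor surface under the reticle
      const pose = hits[0].getPose(refSpace);
      placeModelAt(pose.transform.position);
    }
    session.requestAnimationFrame(onFrame);
  });
}
```

The hit-test loop runs every frame, which is what lets the placement reticle track the floor as the customer moves the phone before tapping to place.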

CS 07 · Real-Time 3D Configurator · Browser-Based · Manufacturing Machinery · Made-to-Order Products
Capability Demo

LETTING THE CUSTOMER BUILD THE CONVEYOR
BEFORE THE ENGINEER DOES

Configuration approval was the single biggest bottleneck in their sales cycle — 2–3 days per quote.

The Challenge

A modular conveyor manufacturer sold systems that were configured differently for every customer — length, width, drive unit position, incline angle, surface type, and side-guard configuration all varied. Their sales engineers were spending 2–3 days per quote producing CAD renders for customer approval. Customers would review the render, request a change, and the cycle would restart. Configuration approval was the single biggest bottleneck in their sales cycle.

What We Built
  • Browser-based 3D configurator — customer selects length, width, drive position, and surface type via a live panel
  • 3D model updates in real time in the browser viewport — no page reload, no new render request
  • Configuration summary auto-generated as a PDF spec sheet the customer can send for approval
  • Built with Three.js — no Unity, no plugin, runs on any laptop browser without install
Outcomes
  • Real-time model updates — configuration changes reflected instantly in the 3D viewport
  • No plugin or app — runs in any desktop browser
  • 2–3 days saved per quote cycle on configuration approval alone
Hybrid Workflow Note

The parametric model logic — how segment length changes affect belt tension geometry, how drive unit position constrains frame design — was defined by a mechanical engineer before any code was written. AI-assisted code generation accelerated the Three.js configuration UI development. The engineering constraints that govern what configurations are physically valid were hard-coded by hand. Customers can only configure what the machine can actually be built as — the AI-assisted UI serves valid engineering, not the other way around.
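To make "customers can only configure what the machine can actually be built as" concrete, that constraint layer reduces to a plain validation function that runs before the 3D viewport updates. The specific limits and the centre-drive rule below are invented for illustration — the real constraint set came from the manufacturer's engineering rules:

```javascript
// Illustrative engineering constraints — placeholder values, not the real rule set.
const LIMITS = {
  lengthMm: { min: 1000, max: 12000, step: 500 },
  widthMm: [300, 450, 600],          // available belt widths
  inclineDeg: { max: 15 },           // drive unit capability
};

// Returns { valid, errors } so the UI can explain *why* a selection is blocked.
function validateConfig(cfg) {
  const errors = [];
  const { lengthMm, widthMm, inclineDeg, drivePosition } = cfg;
  if (lengthMm < LIMITS.lengthMm.min || lengthMm > LIMITS.lengthMm.max) {
    errors.push('length out of range');
  }
  if (lengthMm % LIMITS.lengthMm.step !== 0) {
    errors.push('length must be in 500 mm increments');
  }
  if (!LIMITS.widthMm.includes(widthMm)) {
    errors.push('unsupported belt width');
  }
  if (inclineDeg > LIMITS.inclineDeg.max) {
    errors.push('incline exceeds drive capability');
  }
  // Illustrative cross-parameter rule: centre drive needs a long enough frame
  if (drivePosition === 'centre' && lengthMm < 3000) {
    errors.push('centre drive requires length >= 3000 mm');
  }
  return { valid: errors.length === 0, errors };
}
```

Only configurations that pass this gate reach the Three.js viewport, which is what keeps the real-time preview and the buildable machine in lockstep.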

CS 08 · Digital Twin Overlay · WebXR · Live Sensor Data · Manufacturing Machinery · Industrial Facilities
Capability Demo

THE CHILLER PLANT THAT TELLS YOU
WHAT IT'S DOING — IN AR

The lag between a sensor reading and a technician understanding its implication — in context, on the machine — was costing hours per incident.

The Challenge

A facility management company overseeing a large commercial chiller plant was experiencing recurring failures in one compressor bank. Maintenance engineers were diagnosing problems by walking the plant floor, reading gauges, cross-referencing a 2D P&ID drawing, and making judgement calls. The lag between a sensor reading and a technician understanding its implication — in context, on the machine itself — was costing hours of diagnostic time per incident.

What We Built
  • 3D model of the chiller plant built from engineering drawings — accurate geometry, correct spatial relationships
  • WebXR overlay: technician holds up a phone and sees the 3D model registered over the physical plant with live sensor readings
  • Threshold-based colour states: green for normal, amber for approaching limit, red for fault — visible on the 3D overlay
  • Tap any component in AR to open its full sensor history for the past 24 hours
  • No proprietary hardware — runs on standard Android mobile browser via WebXR Device API
Outcomes
  • Live sensor data overlaid on the physical plant via phone browser
  • No proprietary AR hardware — standard Android device
  • In-situ diagnosis: the data appears on the component, not in a separate dashboard
Hybrid Workflow Note

The 3D plant model was built from P&ID drawings and site survey data — traditional engineering modelling work. AI-assisted anomaly detection was explored as a layer beneath the visualisation: a pattern recognition model trained on the plant's historical sensor data flags components showing early-warning signatures before they cross alert thresholds. The AR interface then surfaces these flags visually — not as a dashboard alert, but as a colour state on the physical component the technician is standing next to. Human engineering defined the model and the thresholds; AI found patterns in the data that no manual review would have caught in time.
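The threshold-based colour states described above reduce to a small pure mapping from a live reading to a display state. A minimal sketch — the `warn`/`fault` limits shown are placeholders; the real thresholds were defined by the facility's engineers per component:

```javascript
// Maps a live sensor reading to the AR overlay colour state.
// Limits are per-component and engineer-defined; these values are illustrative.
function stateFor(reading, limits) {
  if (reading >= limits.fault) return 'red';   // fault threshold crossed
  if (reading >= limits.warn) return 'amber';  // approaching the limit
  return 'green';                              // normal operation
}

// Example: discharge temperature on one compressor, limits in °C (placeholder values)
const dischargeTempLimits = { warn: 95, fault: 110 };
```

Because the mapping is stateless and per-component, the overlay can re-evaluate it on every sensor update without any server round trip — the phone only needs the latest reading and the component's limits.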

Your Product Next

HAVE A PROBLEM LIKE
ONE OF THESE?

Tell us about your product and the communication challenge you're facing. We'll recommend the right approach.