HyperSkill enables users to author, publish, and evaluate VR/AR content without coding.

Author, Publish, and Evaluate

HyperSkill is a no-code 3D simulation authoring platform for both Virtual and Augmented Reality. It was created so that instructional designers can build immersive training content without having to learn programming. With HyperSkill, non-programmers can author VR/AR content, publish it to a wide range of devices and audiences, and collect and visualize experience data. Users can unlock the value of existing CAD and 3D assets by converting them into interactive virtual objects, securely stored in the cloud, that enrich training and assessment.

  • Author: Drag and drop interactive 3D assets from our repository, add step-by-step instructions, apply highlights and other effects, and author dialogue to design conversations.
  • Publish: Author once, experience anywhere. HyperSkill content runs on a wide range of devices, from mobile to high-end AR (HoloLens, Magic Leap, etc.) and VR (HTC Vive, Oculus Quest, Rift, etc.).
  • Evaluate: Capture rich multimodal datasets as users experience the content (head and hand motion, point-of-view video, audio, action streams, etc.) and quickly create visualizations. HyperSkill supports xAPI for logging and reporting data and fine-grained metrics on content and on learner proficiency in targeted knowledge and skills.
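To illustrate the kind of record that xAPI logging produces, here is a minimal sketch of an xAPI statement built in Python. The learner, activity ID, and scores below are hypothetical placeholders for illustration, not actual HyperSkill identifiers or output.

```python
import json

# An xAPI statement is a JSON document with actor, verb, and object
# fields, optionally extended with a result. The values here are
# illustrative only.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",                     # hypothetical learner
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/activities/tensile-test",  # hypothetical activity
        "definition": {"name": {"en-US": "Tensile Test Procedure"}},
    },
    "result": {
        "success": True,
        "score": {"scaled": 0.92},   # scaled scores range from -1.0 to 1.0 in xAPI
        "duration": "PT8M30S",       # ISO 8601 duration: 8 minutes 30 seconds
    },
}

# A Learning Record Store (LRS) would receive this statement as JSON
# over HTTP; here we just serialize it.
print(json.dumps(statement, indent=2))
```

Statements like this accumulate in a Learning Record Store, from which fine-grained proficiency metrics and visualizations can be derived.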

Sample Simulations

Below are some sample simulations created with HyperSkill and used at different organizations to address a variety of use cases.

VR Enabled Personalized Manufacturing Training

VLab is a training simulation for materials testing labs in engineering courses. These labs are often conducted on Instron equipment. VLab is an example of how HyperSkill can support the creation of equipment-focused procedural (hard skills) content and capture data to evaluate learner engagement and efficacy. VLab was pilot tested at Texas A&M University in 2017-18 and has since been used in three departments by hundreds of students. It has also been adopted at Michigan Technological University and the University of California.

Revitalizing Apprenticeships for Small and Medium-Sized Industries

There is a growing need to educate students about their available career choices in a more engaging way. For robotics and automation careers, Clemson and SimInsights developed a simulation in which students explore a factory populated with specialists from different fields, see the equipment and robots those specialists work with, and ask them questions about their career paths. The simulation runs on desktop and in VR. Students are also presented with a series of concrete next steps they can take to further pursue a career in their chosen field. This project was funded by the ARM Institute.

Virtual Reality Based Healthcare Facility Mock-up Evaluation Guidelines: Optimizing Return on Investment for Quality and Patient Safety

Clinician input at the blueprint stage of medical facility design is valuable from a human factors standpoint. Traditionally, gathering such input required investment in building physical facility mockups and extensive time from human factors experts to code participants' behaviors. Working with the Health Quality Council of Alberta (HQCA, Canada) and Alberta Health Services (AHS), SimInsights used HyperSkill to develop VR mockups that enabled rapid facility design evaluation and automated behavioral analytics, yielding substantial savings in time and cost. This work has been integrated into updated guidelines released by HQCA.

If you would like access to HyperSkill, please contact us.