Friday, January 11, 2019

Augmented reality: the future of field service

By David Harris & Billy Hackett on December 12, 2018
Phillips Corporation is now harnessing augmented reality technology to improve training for its worldwide team of field service engineers, from the initial training of new engineers to ongoing support for seasoned ones, while also directly assisting customers who work on their own equipment.

The current challenge of training and supporting field service engineers is a perfect storm of increasing equipment complexity and a younger, less experienced workforce. It was with this in mind, and after much research, that we were introduced to SimInsights by Greg Jones, AMT’s Vice President - Smartforce Development.


Phillips Corporation was founded by Albert Phillips in 1961 and was one of the original distributors for Haas Automation. Phillips now has more than 400 employees covering a territory that runs from southern New Jersey south to the Florida border and west to Oklahoma, as well as the entire country of India. More than 100 of those employees are field service engineers who service equipment on site at customers’ facilities. More recently, Phillips has established two training facilities for field service engineers, one in Knoxville, Tenn., and another in Pune, India, and plans to meet the challenge of providing an ever-expanding competency development platform for all of its field service engineers.
The challenge: Provide the most effective service in the least amount of time
Phillips’ training and support centers are hubs of knowledge that face a constant challenge: seeing what is happening in the field today, across a variety of machines and customers, and then advising engineers and customers in real time on how a machine can be serviced. Phillips understood that augmented reality technology had the potential to reduce machine downtime by reducing the need for on-site service visits and making the necessary visits more effective. AR lets Phillips’ technicians see a machine in need of service directly and instantly provide live guidance (through text and visual aids), resolving issues in far less time, even with less experienced engineers or customers.
Augmented reality as a key part of the solution
After studying the AR options available on the market, Phillips concluded that no existing AR/VR program offered all the functionality it required, until it was introduced to SimInsights, an award-winning software and services company based in Lake Forest, Calif. SimInsights develops education and business software by drawing on unique skills in software, math, simulation, visualization, and design. The company transforms training, sales, service, production, and design by leveraging virtual and augmented reality, simulation, sensing, artificial intelligence, and machine learning across the totality of employee, customer, and product life cycles.
One of the innovative functionalities in Phillips Vision by SimInsights is the ability to define scripted procedures with text and images that can be downloaded to devices and wearables, such as smart glasses, and used offline when there is no cellular or Wi-Fi connectivity.
The concept was introduced and field tested at the Smartforce Student Summit at IMTS 2018, where it proved effective: using a Haas simulator, students downloaded a program, edited it, set the work coordinates and tool length offsets, and verified the program in graphics mode.
With operations at opposite ends of the earth, Phillips Corporation is excited to continue exploring what augmented and virtual reality technology has to offer in educating field service engineers and providing real-time, on-site service applications that deliver a positive ROI. Phillips Corporation will display this new technology at IMTEX in Bangalore, India, in January 2019.


Wednesday, January 11, 2017

Augmented Reality & Virtual Reality






SimInsights is developing 3D, photorealistic models of both mobile carts. In addition, physics-based models will be developed for the carts so that their motions and interactions with users, as well as with other objects in the environment, can be accurately captured and represented. For example, users will be able to push or pull a cart, and the cart will bump into other users and objects.

Specifically for the design of user interaction with the wi-med cart, we will also build a model of the computer screen so that users can touch or tap the virtual screen at close range to activate it, review the information, and make decisions based on it. This will realistically replicate the real-world interaction.


The VR environments must allow for scenario enactments with up to four people who will interact with each other and the VR environment simultaneously.


We are developing the VR environments in Unity in a manner that allows for scenario enactments with up to four people. Users will be able to interact with each other (e.g., tapping someone on the shoulder to interrupt) as well as with objects in the VR environment at the same time. Unity provides a comprehensive scripting API, which we will use to control the networked state of all players. This interface allows communication between connected players and ensures that the transforms (positions and rotations) of the players’ 3D models, as well as of objects manipulated by the players, are synchronized across all views.
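To make this concrete, here is a minimal sketch of how such synchronization might look using Unity's high-level networking API (UNet). The class and member names (PlayerHeadSync, headsetTransform) are our own illustrative choices, not code from the project itself, and a production version would throttle and interpolate updates rather than sending one every frame.

using UnityEngine;
using UnityEngine.Networking; // Unity's high-level networking API (UNet)

// Illustrative sketch: replicate one player's headset pose to all clients.
// Attach to the player prefab spawned by a NetworkManager.
public class PlayerHeadSync : NetworkBehaviour
{
    [SyncVar] private Vector3 headPosition;    // replicated to every client
    [SyncVar] private Quaternion headRotation;

    public Transform headsetTransform; // the locally tracked Vive headset

    void Update()
    {
        if (isLocalPlayer)
        {
            // Owning client: read the tracked pose and send it to the server.
            CmdUpdateHead(headsetTransform.position, headsetTransform.rotation);
        }
        else
        {
            // Remote copies: apply the replicated pose to this avatar.
            transform.position = headPosition;
            transform.rotation = headRotation;
        }
    }

    [Command] // runs on the server, invoked by the owning client
    private void CmdUpdateHead(Vector3 pos, Quaternion rot)
    {
        headPosition = pos;
        headRotation = rot;
    }
}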


Figure 2: Users on different computers and Vive headsets will inhabit the same virtual world.

SimInsights’ experience with industry professionals and leading academic researchers indicates that the HTC Vive offers the best VR experience. SimInsights therefore recommends the HTC Vive, although we support many other VR and AR devices in our projects. We will assume the HTC Vive for the technical description in this proposal.

The HTC Vive tracks interactions with the VR environment using a headset, two handheld controllers, and two wireless infrared-emitting base stations. The headset and controllers have 70 sensors in total that allow for accurate tracking of position in space (Prasuethsut, 2016), with a tracking accuracy of about 2 mm (Kreylos, 2016). The headset also has a microphone to capture the wearer’s voice.

Signals from the Vive headset and controllers will be used to render the 3D avatar of the user in the VR environment. These signals will allow us to accurately render the position and orientation of the user’s head and both hands.
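As a simple illustration, the avatar rendering can be sketched as copying the tracked headset and controller poses onto the avatar's head and hand bones each frame. All names below (AvatarRig, headSource, and so on) are hypothetical, not taken from the actual implementation.

using UnityEngine;

// Illustrative sketch: drive a user's avatar from the tracked Vive poses.
public class AvatarRig : MonoBehaviour
{
    // Tracked poses reported by the headset and the two controllers.
    public Transform headSource, leftHandSource, rightHandSource;

    // Matching bones on the avatar's 3D model.
    public Transform headBone, leftHandBone, rightHandBone;

    void LateUpdate()
    {
        // Copy each tracked pose onto its avatar bone after all other
        // updates, so the avatar follows the user's actual movements.
        Copy(headSource, headBone);
        Copy(leftHandSource, leftHandBone);
        Copy(rightHandSource, rightHandBone);
    }

    private static void Copy(Transform source, Transform target)
    {
        target.position = source.position;
        target.rotation = source.rotation;
    }
}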

Thursday, October 25, 2012

MyClass Feature in SimPhysics





Students = Data. Scores = Data.

Taken apart, these are just names and numbers. Putting the two together is an entirely different matter.

SimInsights has taken its simulations to another level. The new feature, MyClass, puts students' names and scores together, helping both teachers and students as they use a simulation. First, the teacher logs in and creates a "class". A teacher can run several classes against a single simulation at the same time; they just need to give each class a different name. These class names are then handed out to the respective students. As students log in and work through the simulation, their score on each level is logged, and the teacher can monitor how each student is doing. By watching how students progress in their understanding of the topic (or of the simulation itself), teachers can zero in on the areas the students found most difficult to comprehend.
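For readers curious how such a feature might be structured, here is a rough, hypothetical sketch of the data model in C#. SimInsights' actual implementation is not public, so every name here is illustrative.

using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch: one teacher-created "class" mapping students to
// their per-level scores.
public class SimClass
{
    public string ClassName; // the name the teacher hands out to students

    // student name -> (level number -> score logged for that level)
    public Dictionary<string, Dictionary<int, int>> Scores =
        new Dictionary<string, Dictionary<int, int>>();

    public void LogScore(string student, int level, int score)
    {
        Dictionary<int, int> levels;
        if (!Scores.TryGetValue(student, out levels))
            Scores[student] = levels = new Dictionary<int, int>();
        levels[level] = score; // latest score the student posted for the level
    }

    // The level with the lowest average score, i.e., where the class
    // struggled most (the kind of area a teacher would zero in on).
    public int HardestLevel()
    {
        return Scores.Values
            .SelectMany(perStudent => perStudent)
            .GroupBy(kv => kv.Key, kv => kv.Value)
            .OrderBy(g => g.Average())
            .First().Key;
    }
}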

At the end of the day, both teachers and students are beneficiaries of the MyClass feature.


Thursday, September 27, 2012

Together We are Smarter

At SimInsights, we believe in the power of collaborative learning. As an undergraduate at the California Institute of Technology (Caltech), I have experienced firsthand the effectiveness of learning through collaboration with others, as well as interaction with the material. You see, the Institute operates on an Honor Code, and though each student must complete and hand in his or her own work, we are given virtually infinite freedom to collaborate with other students on most assignments. By feeding off our friends' academic strengths and supporting each other's weaknesses, we create a synergy whereby we gain a much deeper understanding of even the most complex topics than we could ever hope to on our own. We as students quickly realize that, especially in the STEM fields, not only are two heads better than one, but three heads are better than two, and so forth. This freedom of collaboration led me to choose Caltech as the starting point of my higher education in the first place.

This same spirit of collaboration, that same notion that ideas grow faster and with more potency the more freely they flow between individuals, drove me to pursue an internship at SimInsights. But our organization brings another key mechanism to the table, a tool which will allow the full efficacy of collaborative learning to come to fruition: that of interactive simulation.

At Caltech we frequently take advantage of whiteboards strewn across every residence hall to visually represent the abstract concepts we strive to grasp, but this often comes up short. Even the most effective whiteboard illustration generated by a student will eventually have to be erased, eliminating the possibility of anyone sampling or otherwise gaining inspiration from that fleeting educational tool en route to producing even more powerful illustrations. And, of course, figures on a whiteboard cannot move. Games and simulations created by SimInsights and those in the SimInsights community take us beyond these frustrating limitations, and we're just getting started. With SimInsights' platform for developing interactive, dynamic, and above all fun and engaging educational games and simulations, the future of collaborative learning, and indeed knowledge dissemination on the whole, is approaching faster than most can imagine.

These are exciting times.

So come, learn more about what we do here at SimInsights. Let us grow together in knowledge.

- Brian