Metahumans in Dental Education
Dr. Alessandro Devigus graduated in 1987 from the University of Zurich Dental School. He has been in private practice since 1990 and is a CEREC instructor at the Zurich Dental School. Dr. Devigus lectures internationally on CAD/CAM and digital technologies and was the first Editor-in-Chief of the "International Journal of Esthetic Dentistry" from 2005-2021. He is an active member of the European Academy of Esthetic Dentistry (EAED), a Neue Gruppe member, an ITI Fellow and Speaker, and a Mentor of the Bio-Emulation™ Group.
Introduction – What are MetaHumans?
MetaHumans are photorealistic 3D avatars that can be programmed to look and behave as we desire. The technology was developed and released by Epic Games, the makers of Fortnite. It is a truly groundbreaking release whose guiding idea is the "democratization of the development process". In layman's terms, this means we can access technology traditionally reserved for Hollywood studios. With it, we can create photorealistic environments, products, and avatars. The applications are tangible and easy to understand, and they all revolve around improving the experience of the user, or in this case, the patient.
What are the benefits of using MetaHumans in Dentistry and Dental Photography?
We all know the adage: a picture is worth a thousand words. Patients appreciate this, and when explaining how an intervention will look, what better way than actually showing it? We can convey precise information, such as how much a particular treatment will change a patient's appearance. For example, we can show what a patient would look like with different treatment options, all in real time, allowing the patient to understand and decide.
Legal advantages in using MetaHumans
We can avoid many potential legal problems by using MetaHumans because we are not showing "before" and "after" images of our actual patients. One of the main problems with real patient images is that a patient can claim at any time that the "after" is not 100% how they actually look. We avoid this issue by using photorealistic 3D representations of people rather than the patient themselves. The patient can select one of many avatars to see the change, but none of them is a true lifelike copy of them. As a result, the doctor can freely show as many procedures as required without the risk of a lawsuit.
How do MetaHumans improve the patient's experience?
In today's world of digital content and events, most people are comfortable moving information between their computers, smartphones, and tablets. MetaHumans offer a new context for patient care. We are developing applications that contain a module allowing a MetaHuman to contact someone through Zoom or Skype. A main avatar can present information on a website or other media and then hold a one-to-one video conferencing session with a patient. This familiarity is key when giving information to the patient. An idea of where we are headed can be found at Apple, which is reportedly developing a form of augmented reality based on an ordinary-looking pair of glasses that display information on the lens itself. Now imagine a MetaHuman delivering the same information, but in a more personal way, showing our patients how procedures will change the appearance of their teeth. I am working on an application that controls parameters related to the teeth and shows the changes in real time.
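To make the idea of real-time tooth parameters concrete, here is a minimal sketch of how such controls are commonly wired up: UI sliders expose clinical parameters, which are normalized into 0-1 morph-target (blend-shape) weights that the rendering engine applies each frame. All names, parameter ranges, and targets below are illustrative assumptions, not details of the actual application.

```python
# Hypothetical sketch: mapping clinical sliders to morph-target weights.
# The parameter names, ranges, and morph targets are invented for
# illustration; a real Unreal Engine app would drive MetaHuman
# blend shapes with values like these.
from dataclasses import dataclass


@dataclass
class SmileParameters:
    incisor_length_mm: float = 0.0  # added incisor length, 0..2 mm
    overjet_mm: float = 3.0         # horizontal overlap; ~2 mm is typical


def clamp01(x: float) -> float:
    """Keep a weight inside the 0..1 range the engine expects."""
    return max(0.0, min(x, 1.0))


def to_morph_weights(p: SmileParameters) -> dict:
    """Normalize clinical parameters to 0..1 morph-target weights."""
    return {
        "LengthenIncisors": clamp01(p.incisor_length_mm / 2.0),
        "ReduceOverjet": clamp01((p.overjet_mm - 2.0) / 4.0),
    }


# Moving a slider re-computes the weights; the engine re-renders the
# avatar with the new shape on the next frame.
weights = to_morph_weights(SmileParameters(incisor_length_mm=1.0, overjet_mm=4.0))
print(weights)  # {'LengthenIncisors': 0.5, 'ReduceOverjet': 0.5}
```

The point of the sketch is the pipeline, not the numbers: a thin normalization layer between clinical units and engine weights is what makes "adjust and see it instantly" possible.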
Consider a condition like overjet (displayed above), where the upper teeth protrude outward and sit over the bottom teeth. An overjet does not only affect a patient's appearance; it can also cause difficulty chewing, drinking, and biting, and may even cause jaw pain. We can show the expected result of correcting it so the patient can decide what is best. As before, the real value here is the ability to inform the patient and have them choose the best option.
The ability to show changes in a photorealistic 3D application is very useful when presenting options. The layout of the user interface can be customized to best suit each clinic, which optimizes the use of our content and makes it easy for the doctor to work with. The key is to refine the user interface into something that is clear and to the point.
Have you ever seen a dental photograph and said, "What an amazing picture!", then asked yourself how it was taken but could not find the answer?
You assumed it was taken with the best camera in the world, the most powerful and precise lens, and a truly professional lighting studio, and that the black background must be a fabric. The downside of learning this way is the heavy investment in photographic equipment before actually getting a feel for what is involved. With VR, we can simulate dental photography in detail. The ability to take lifelike pictures of people and environments in a realistic 3D application is what makes Unreal Engine the ideal tool for developing such an application.
Case Study: Dental Photography – Photoshoot VR
What is Photoshoot VR?
Photoshoot VR is a learning application built with Unreal Engine. We use this platform because the level of graphics we can achieve is unparalleled, and for a photography simulator to make sense, it needs the highest level of graphics, including post-processing, reflection captures, time of day, and much more. The user interface is built around learning principles such as the exposure triangle, and it lets the user adjust the image while observing changes, such as focus, in real time.
Our main task has been to make the camera settings and behavior as realistic as possible, and this has been done with great success. The secondary task has been to create the right environments in which to practice a particular type of photography. This has been a huge challenge because we have had to make a large quantity of 3D assets and organize them in the right way and, more importantly, with the right lighting.
We have divided the user interface into four sections which are:
Choice of lenses
We take pictures then upload them to our team of experts who review the shots and select a weekly winner. They also act as guides and can schedule workshops.
Unreal Engine has become an industry standard in places like Disney, where the world-renowned series "The Mandalorian" was produced using the latest visualization technologies from Unreal. The level of realism allows backgrounds and close-ups to look so convincing that the need to shoot against green screens is now a thing of the past. Actors can actually be immersed in an environment created by Unreal. This is groundbreaking and enables many visual offshoots, such as our photography simulator.
Can we take lifelike shots of subjects simulating patients?
Absolutely. We waited to release Photoshoot VR because we wanted to incorporate MetaHumans, a new technology that provides photorealistic, lifelike avatars which can be pre-programmed to talk, express emotions, and move around, all controlled by AI. What does this mean in simple terms? We have created a library of patients with "before" and "after" conditions which the user must evaluate and then photograph. This is very practical because it puts the subject of interest front and center: a doctor will use this simulator to learn how to take pictures of their patients, and the more realistic our subjects, the easier it is to learn.
How can we learn DSLR photography with Photoshoot VR?
Photoshoot VR consists of two modules: a practical module for using the camera and a theoretical module covering everything about DSLR photography. In the latter, we have recreated Oskar Barnack as the guide. He teaches the principles of DSLR photography in a building that consists of four large halls.
As the user completes each hall, the corresponding part of the user interface is unlocked. The halls are interactive learning levels; in the Exposure Hall, for example, the user explores concepts like Shutter Speed, ISO, and Aperture.
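The relationship the Exposure Hall teaches, the exposure triangle, is also easy to state numerically: aperture, shutter speed, and ISO trade off against each other at a fixed exposure value (EV). A minimal sketch of that arithmetic, using the standard photographic EV formula (the function name and example settings are mine, not from the application):

```python
import math


def exposure_value(aperture: float, shutter_s: float, iso: int = 100) -> float:
    """Exposure value relative to ISO 100.

    aperture : f-number, e.g. 4.0 for f/4
    shutter_s: shutter time in seconds, e.g. 1/60
    iso      : sensor sensitivity
    """
    return math.log2(aperture ** 2 / shutter_s) - math.log2(iso / 100)


# Opening the aperture one full stop (dividing the f-number by sqrt(2))
# while halving the shutter time leaves the exposure value unchanged:
ev_a = exposure_value(aperture=4.0, shutter_s=1 / 60)
ev_b = exposure_value(aperture=4.0 / math.sqrt(2), shutter_s=1 / 120)
print(round(ev_a, 2), round(ev_b, 2))   # 9.91 9.91

# Doubling the ISO buys exactly one stop at the same aperture and shutter:
ev_iso = exposure_value(aperture=4.0, shutter_s=1 / 60, iso=200)
print(round(ev_a - ev_iso, 2))          # 1.0
```

This one-stop-for-one-stop bookkeeping is exactly what the simulator lets learners discover interactively instead of on paper.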
What are our plans for the future?
We will release the Photoshoot VR alpha version at the end of the summer. The first module will be portrait photography. Over the coming months, we will release modules for other areas of photography such as fashion, photojournalism, wildlife, and much more.
Summary: Over the last two years, I have been heavily involved in developing gamified applications for the dental industry. This work culminated in a project for Ivoclar Vivadent in which we created a series of web-based 3D training modules showing dentists how to operate complex machinery like the PrograMill One milling unit. We use web-based 3D, augmented reality, and even virtual reality to help dentists understand complex tasks like changing water and filters. Another feature of our work for Ivoclar Vivadent is the ability to connect the smartphone application to the 3D environment and literally simulate each task from your smartphone as you would with the actual product. This level of simulation is possible thanks to technology like Epic Games' Unreal Engine.
Over the next five years, we will see a huge increase in visual technologies, and what seems far-fetched today will seem quite ordinary in the future. Developments like the rumored release of Apple's augmented-reality glasses will only accelerate adoption by the general public. Join me in our monthly broadcast, where we showcase new technologies that improve the quality of patient care and make our lives as doctors and dentists easier. In our first show, we will feature Photoshoot VR and how it can help the dentist understand dental photography.