Abstract

This case study presents the basic pipeline for creating a 3D virtual character to be used in any Digital Cultural Heritage application. The main idea is to obtain a 3D model of a virtual character, which can be either a premade character mesh or one created through 3D scanning, for instance; then to change its appearance and customize it based on the content of the application; to animate it (e.g. by adding certain movements and animations to it); and finally to deploy it to the environment where the application is being developed (e.g. a 3D game engine). Once inside the main development environment of the application, further enhancements can be made so that the character matches the rest of the application environment, which is an essential part of True Augmented Reality. This is a very useful way of creating and importing 3D characters into a Digital Cultural Heritage application, because once it is learned, it can speed up the creation of such applications.

Keywords

True Augmented Reality, 3D Virtual Character, Digital Cultural Heritage Application, Animation

In this work, we present the basic pipeline for creating, customizing, animating and importing a 3D virtual character into a True Augmented Reality Digital Cultural Heritage application. In a True Augmented Reality application, all 3D models have to blend perfectly with the conditions of the real environment, so that users cannot immediately distinguish the augmented models from the real objects. The longer it takes them to do so, the better the simulation. In the case of virtual characters, the characters have to look as realistic as possible, their appearance has to match the concept of the application, and their movements and animations have to be executed with respect to the real environment. After the basic pipeline is presented, some main concepts of the adaptation of the character to the real environment are described, which is part of our future work in this area.

Obtaining the 3D model

The 3D model of the virtual character is the most essential part of its structure: it constitutes the character's skeleton, body and skin. There are many ways to obtain a 3D model of the character, and the choice depends on many variables, such as how much time the developers have available or how creative they want to be. In general, creating a 3D model of a virtual character from scratch is a very time-consuming procedure. Two of the most popular ways of obtaining a 3D model are 3D scanning/reconstruction of a real character [1] and downloading a premade virtual character from an online library such as Mixamo [2]. The latter is considerably faster but limited to the types of characters the library contains (one may not be able to find exactly the type of mesh they are looking for), while the former is somewhat more time consuming but offers many more options, since the 3D mesh is essentially reconstructed from scratch. Both approaches ultimately provide the developers with a 3D model of a virtual character.
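
As a small illustration of working with the obtained model programmatically, the sketch below loads an exported character mesh and prints a few basic statistics before customization and rigging. It is only a minimal sketch: it assumes the Python library trimesh is available and uses a hypothetical file name (worker.obj); a scanned or downloaded character would typically be exported to OBJ or FBX first.

```python
# A minimal sketch, assuming the trimesh library and a hypothetical "worker.obj".
import trimesh

mesh = trimesh.load("worker.obj", force="mesh")

# Basic sanity checks before moving on to customization and rigging.
print("vertices:", len(mesh.vertices))
print("faces:", len(mesh.faces))
print("watertight:", mesh.is_watertight)  # scanned meshes often contain holes
```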

Character customization

Once a model of a 3D virtual character is obtained, the next step is to customize or change its appearance in order to make it relevant to the content of the application. For instance, in our Digital Cultural Heritage application, which features a museum housed in a former machinery facility, the virtual character should look like a worker. The appropriate appearance can be created manually with the help of 3D painting software such as Substance Painter [3]. Alternatively, one can create 3D meshes of clothes for the virtual character with the help of computer graphics software such as Autodesk Maya [4], in order to achieve an even more realistic and detailed result; this option is very time consuming, however, and requires advanced knowledge of 3D modelling. With a 3D painting tool, the procedure is much easier, since the user is given many options regarding the material of the clothes, the colour and all the basic elements in general, in order to create a realistic set of clothes for the character.

Figure 1. Customizing the appearance of the character (worker) using Substance Painter [3].
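
For developers who prefer to script the look development step, the texture maps exported from Substance Painter can also be assigned to the character inside Maya through its Python API. The following is a minimal sketch under stated assumptions: it must be run inside Maya (maya.cmds is only available there), and the mesh name (worker_mesh) and texture path are hypothetical.

```python
# A minimal sketch, to be run inside Maya's Script Editor.
# The mesh name and texture path are assumptions for illustration only.
import maya.cmds as cmds

# Create a simple shader and its shading group.
shader = cmds.shadingNode("blinn", asShader=True, name="workerShader")
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name="workerShaderSG")
cmds.connectAttr(shader + ".outColor", sg + ".surfaceShader", force=True)

# Hook up the base colour map exported from Substance Painter.
tex = cmds.shadingNode("file", asTexture=True, name="workerBaseColor")
cmds.setAttr(tex + ".fileTextureName", "textures/worker_basecolor.png", type="string")
cmds.connectAttr(tex + ".outColor", shader + ".color", force=True)

# Assign the shader to the character mesh.
cmds.sets("worker_mesh", edit=True, forceElement=sg)
```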

Character Animation

Once the character is customized, the next step is to make it able to move and to give it “life” through animation, since a character standing still in an application is not realistic at all. Again, there are many different options and approaches for animating a virtual character. A very fast and efficient way is to download the desired animation(s) from Mixamo [2]. The advantage of this approach is that Mixamo contains a large selection of different animations and that applying them to the virtual character is easy and quick for the developers. The disadvantage is that these animations work for characters downloaded from Mixamo and may not work well with virtual characters that do not originate from that platform, unless those characters are rigged on Mixamo. In its simplest form, 3D rigging is the process of creating a skeleton for a 3D model so it can move [5]. Moreover, although Mixamo contains a large number of animations, there is still a possibility that one may not find the exact animation they are looking for. To overcome this, another option is to use 3D modelling software such as Autodesk Maya [4], which offers many ways to rig the character, either automatically or manually for better results, and to record whichever animation the user wants, since the joints of the character can be moved freely and these movements recorded. In the end, no matter which options are chosen, there will be animations available to use in the last step.

Figure 2. An example of the virtual character (worker) after rigging it and rendering it with Autodesk Maya [4].
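
To give an idea of how rigging and recording a custom animation in Maya can also be scripted, the sketch below creates a small two-joint chain, binds it to the character mesh and keys a simple arm rotation. It is a minimal sketch only: it must be run inside Maya, and the joint positions, joint names and mesh name (worker_mesh) are illustrative assumptions, not a full production rig.

```python
# A minimal sketch, to be run inside Maya. Names and positions are assumptions.
import maya.cmds as cmds

# Create a tiny two-joint chain (a real rig would have a full skeleton).
cmds.select(clear=True)
shoulder = cmds.joint(name="shoulder_L", position=(10, 150, 0))
elbow = cmds.joint(name="elbow_L", position=(35, 150, 0))

# Bind the character mesh to the joints with smooth skinning.
cmds.skinCluster(shoulder, "worker_mesh", toSelectedBones=True)

# Record a simple "raise the arm" movement over one second at 24 fps.
cmds.playbackOptions(minTime=1, maxTime=24)
cmds.setKeyframe(shoulder, attribute="rotateZ", time=1, value=0)
cmds.setKeyframe(shoulder, attribute="rotateZ", time=24, value=45)
```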

Importing the Character to the Application Environment

Once all the above steps are completed, the final step is to import the character into the application environment. The majority of Digital Cultural Heritage applications are made with a 3D game engine (e.g. Unity3D [6]). The virtual character can be imported into the application together with its animations, and through basic scripts the animations can be assigned to the character and triggered at the appropriate time.
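
As a small illustration of this step, the sketch below simply copies the exported character and animation files into a Unity project's Assets folder, from which the engine imports them automatically; assigning and triggering the animations is then done with short scripts on Unity's Animator component, which is not shown here. The folder and file names are assumptions.

```python
# A minimal sketch: copy exported FBX files into the Unity project's Assets
# folder so the engine imports them. All paths below are assumptions.
import shutil
from pathlib import Path

exports = Path("exports")                                 # FBX files from Maya/Mixamo
unity_assets = Path("MyDCHApp/Assets/Characters/Worker")  # hypothetical Unity project
unity_assets.mkdir(parents=True, exist_ok=True)

for fbx in exports.glob("*.fbx"):                         # worker.fbx, idle.fbx, ...
    shutil.copy2(fbx, unity_assets / fbx.name)
    print("copied", fbx.name)
```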

Conclusions

In this case study, we presented the basic pipeline for creating a 3D virtual character to be used in any Digital Cultural Heritage application. The basic steps are obtaining a 3D model, customizing it, creating animations for it and finally importing it into the application. There are many ways and different options for each of these steps, depending on variables that have to do with the developer/user. All these options are effective (each with its advantages and disadvantages), and in the end the result is a virtual character in a Digital Cultural Heritage application, which, in the case of an Augmented Reality application, can be further improved towards a realistic True Augmented Reality application. This pipeline can speed up the procedure of creating Digital Cultural Heritage applications and thus contribute to the preservation of Cultural Heritage.

Future Work

Since True Augmented Reality has recently been defined as a modification of the user’s perception of their surroundings that cannot be detected by the user [7], the virtual character must adapt to the conditions of its surrounding environment and become a part of it, achieving in this way the “suspension of disbelief”. The most important environmental condition that the virtual character must adhere to is the lighting: the virtual character should receive lighting equal to that received by the real objects surrounding it. This is part of our future work, which will use basic mathematical algorithms from the glGA framework [8] based on the Precomputed Radiance Transfer algorithm and Conformal Geometric Algebra. By combining these elements, a very realistic global illumination can be achieved, lighting each object in the scene according to the environment that surrounds it.
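
To make the lighting idea concrete: in diffuse Precomputed Radiance Transfer, the per-vertex shading reduces to a dot product between the spherical-harmonic coefficients of the environment light and each vertex’s precomputed transfer vector. The NumPy sketch below shows only this reconstruction step under illustrative assumptions (9 coefficients, i.e. 3 SH bands, and random data in place of a real capture); the glGA-specific Conformal Geometric Algebra formulation is not covered here.

```python
# A minimal sketch of the diffuse PRT reconstruction step, using NumPy.
# Shapes are illustrative: 9 coefficients correspond to 3 spherical-harmonic bands.
import numpy as np

def shade_prt_diffuse(light_sh, transfer):
    """light_sh: (9, 3) RGB SH coefficients of the captured environment light.
    transfer: (n_vertices, 9) precomputed per-vertex transfer vectors.
    Returns per-vertex RGB colours with shape (n_vertices, 3)."""
    return np.clip(transfer @ light_sh, 0.0, None)

# Random data stands in for a real light capture and precomputed transfer.
light_sh = np.random.rand(9, 3)
transfer = np.random.rand(1000, 9)
print(shade_prt_diffuse(light_sh, transfer).shape)  # (1000, 3)
```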

Figure 3. The virtual character (worker) with an idle animation, inside the cross-reality virtual museum of our Digital Cultural Heritage application.

References

  1. Papaefthymiou, M., Kanakis, M., Geronikolakis, E., Nohos, A., Papagiannakis, G., “Rapid reconstruction and simulation of virtual characters in Mixed Reality environments”, ITN-DCH Final Conference, Olimje, Slovenia, May 2017.
  2. Mixamo. (https://www.mixamo.com)
  3. Substance Painter. (https://www.allegorithmic.com/products/substance-painter)
  4. Autodesk Maya. (https://www.autodesk.com/products/maya/overview)
  5. Key 3D Rigging Terms to Get You Moving. (https://www.pluralsight.com/blog/film-games/key-rigging-terms-get-moving)
  6. Unity3D. (https://unity3d.com)
  7. C. Sandor, M. Fuchs, A. Cassinelli, H. Li, R. Newcombe, G. Yamamoto, and S. Feiner, “Breaking the Barriers to True Augmented Reality”, arXiv preprint, cs.HC, Dec. 2015.
  8. glGA Framework. (http://george.papagiannakis.org/?page_id=513)


Author

Geronikolakis Efstratios, Papagiannakis George

Foundation for Research and Technology – Hellas, 100 N. Plastira Str., 70013 Heraklion, Greece; University of Crete, Computer Science Department, Voutes Campus, 70013 Heraklion, Greece
– Thematic Area 4