FMX Tuesday 22nd 2014
"Ryse - Son of Rome": Creating Emotional Cinematic Experiences in a Real-Time Environment by Cevat Yerli (Crytek)
The last talk of the day was given by Cevat Yerli, CEO and President of Crytek. I thought this would be a good example to use in my Game Art and Machinima project, since both are all about real time, and it was really interesting to see how they produce cinematics at an industrial level as well. They used CryEngine for the whole game and for all of the production, and they wanted to tell an immersive story in which the character is on a mission to save Rome. Because they use their own geometric VFX pipeline, they can integrate assets faster, focus on the art, and produce a visually higher quality game.
Immersive Lighting
One of the key elements of telling any story is the use of lighting. The lighting has to be designed to underline the narrative of each mission, and it has been thoroughly improved over the years, especially with the help of filmmaking knowledge. Colour design is also important: it is taken from photographic references gathered during research, and a colour script is influential to the player's journey, helping to guide them through the level. During production they also used a custom facial lighting rig to give a more realistic look and help with the illumination of the facial features.
Characters
For each of the characters they used the same rig throughout the entire process, unlike before and unlike other games, where one rig is used for the gameplay character and another for the cut-scene character. Building the rig this way gives a much better outcome because you see the same character in every part of the gameplay and it is never swapped out for a higher quality one, whereas in older games you can tell the difference between gameplay and cut scenes because the cut scenes used a higher quality render.
Virtual Camera
The cut scenes are filmed with their own editor, Cinebox, which they created themselves. It is a real-time, GPU-based SDI with real-time physical lens parameter controls. This means they can choose which camera they want to record from and can easily start filming in game.
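To make "physical lens parameter controls" a bit more concrete, here is a minimal sketch of my own (not Crytek's Cinebox code, and all the names are invented) showing how a focal length and sensor width can be turned into a field of view for a real-time camera, with aperture and focus distance left as inputs that a depth-of-field effect would read from:

```cpp
// Illustrative only: map physical lens parameters onto a virtual camera.
#include <cmath>
#include <cstdio>

struct PhysicalLens {
    float focalLengthMm;   // e.g. a 35 mm prime
    float sensorWidthMm;   // e.g. 36 mm full-frame sensor
    float fStop;           // aperture, would drive depth of field
    float focusDistanceM;  // focus plane in metres

    // Horizontal field of view in degrees, derived from focal length
    // and sensor width: fov = 2 * atan(sensorWidth / (2 * focalLength)).
    float horizontalFovDeg() const {
        float fovRad = 2.0f * std::atan(sensorWidthMm / (2.0f * focalLengthMm));
        return fovRad * 180.0f / 3.14159265f;
    }
};

int main() {
    PhysicalLens lens{35.0f, 36.0f, 2.8f, 4.0f};
    std::printf("35mm lens on a full-frame sensor -> %.1f deg horizontal FOV\n",
                lens.horizontalFovDeg());
}
```

The appeal of controls like these is that a director of photography can think in familiar film terms (lenses, stops, focus pulls) while the engine translates them into rendering parameters in real time.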
Performance Capture
The performances were originally captured by Andy Serkis and Jonathan Cavendish to be used as reference when animating. Performance capture on this scale is something no one had done in gaming before, and it was hugely based on choreography. What they wanted to do was make Ryse an emotional combat game, so that it wasn't just a fighting game but actually had an element of film production in it as well.
Virtual Cinematic Pipeline Cinebox
Throughout production they followed this pipeline to be able to complete production accurately and as quickly as possible:
- Collaborative editing
- Asset Creation
- Director's new possibilities
- Real-time VFX
- Post processing
- Lighting adjustments
- Animation Editing
- Stakeholders or site feedback
- Shot finalisation
- Data Capture
This shift in technology is only possible if it is done in real time, and real time is the new way of creating games. They use an 'open camera', meaning unified camera data formatting so that streaming is easier and has many more capabilities. They also want the virtual cameras to feel heavy and weighty like real ones, with a slight camera shake so that shots look handheld, as in the sketch below.
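To illustrate the kind of handheld feel they described, here is a small sketch of my own (not the engine's actual code; the values and names are made up) that layers a low-frequency wobble onto a camera's yaw and pitch each frame:

```cpp
// Illustrative only: subtle per-frame rotation offsets to make a virtual
// camera read as handheld rather than perfectly locked off.
#include <cmath>
#include <cstdio>

struct ShakeSettings {
    float amplitudeDeg = 0.3f;  // peak offset in degrees, kept subtle
    float frequencyHz  = 1.7f;  // slow drift, like an operator breathing
};

// Cheap band-limited wobble: two detuned sines per axis so the motion
// never repeats too obviously. Perlin/simplex noise is the more common
// production choice; sines keep this sketch dependency-free.
static float wobble(float t, float freq, float phase) {
    return 0.6f * std::sin(2.0f * 3.14159265f * freq * t + phase)
         + 0.4f * std::sin(2.0f * 3.14159265f * freq * 1.37f * t + 2.0f * phase);
}

int main() {
    ShakeSettings s;
    // Simulate a few frames at 30 fps and print the yaw/pitch offsets
    // that would be added on top of the animated camera each frame.
    for (int frame = 0; frame < 5; ++frame) {
        float t = frame / 30.0f;
        float yaw   = s.amplitudeDeg * wobble(t, s.frequencyHz, 0.0f);
        float pitch = s.amplitudeDeg * wobble(t, s.frequencyHz * 0.8f, 1.3f);
        std::printf("frame %d: yaw %+0.3f deg, pitch %+0.3f deg\n", frame, yaw, pitch);
    }
}
```

Kept this small, the offsets never distract from the action but stop the camera from feeling weightless and computer-perfect.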
"We are now able to apply film practices in games and real time technologies in feature films."
Overall the talk was very practical and I learnt a lot about the process of creating cut-scene cinematics, and about the process Crytek in particular use to create them well. It was very informative and I feel it could help me in projects to come, as it would make me more organised and give me an effective workflow to follow.