Tuesday, November 25, 2014

Concept for 3D Model (Futuristic Helicopter)

When a 3D model is made, the concept stage is very important for establishing the idea. Concept art is drawn and refined to settle on an overall design for the model. Using side, top and front profiles, the 3D model is then built up and extruded from all sides to match the silhouette of each angle.


In this case I am creating a 3D model of a futuristic helicopter. Only the side profile has been created so far, but I have settled on the overall design and how I would like it to look.

The target polygon count is 5000, so fairly generous, and the texture target is 2048x2048; this budget also includes a 2048x2048 specular map and a bump/normal map.
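As a rough sanity check on what that texture target means in memory, here is my own back-of-the-envelope arithmetic, assuming uncompressed 8-bit RGBA for every map (an engine would normally compress these):

```python
# Raw footprint of the three planned 2048x2048 maps, assuming 4 bytes per
# texel (uncompressed 8-bit RGBA) and no mipmaps.
maps = {"diffuse": (2048, 2048), "specular": (2048, 2048), "normal": (2048, 2048)}
total = sum(w * h * 4 for w, h in maps.values())
print(f"{total / 2**20:.0f} MB of raw texture data")  # -> 48 MB
```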

In the next post I will be going through the steps I used to create the model, along with problems I ran into along the way. Brace yourself.



Thursday, October 23, 2014

Pioneers of 3D Graphics in Video Games


This week I will be discussing people I believe have greatly contributed to the advancement of 3D graphics and technology in the gaming industry.

John Carmack and id Software

John Carmack has always pushed the graphics capabilities of video games on a technical level, from the early days of pitching a PC port of Super Mario Bros. 3 to Nintendo (which Nintendo turned down), to creating amazing game engines that push the boundaries of technology.

Wolfenstein 3D was one of the earliest examples of John's work going beyond what people expected at the time. In the early 90s, games like Crystal Caves, Space Quest and 8-bit console games like Super Mario dominated the market. John Carmack created a game engine capable of rendering 3D-esque environments with textured walls and 2D sprite characters that gave the illusion of 3D depth.

Bang bang! Wolfenstein 3D by id Software

Following the success of Wolfenstein, Carmack built upon it with DOOM. DOOM took the same formula and added lighting effects, more detailed textures and larger levels than Wolfenstein. DOOM was downloaded so widely that it caught the attention of Bill Gates, and Microsoft used it to promote Windows 95 with "DOOM 95".

DOOM, a huge success for id Software

While neither DOOM nor Wolfenstein 3D was true 3D, John Carmack went on to a series that would take full advantage of polygons: Quake. Using 3D models with skeletal rigs, true polygonal levels and an efficient game engine that scaled well across PCs, Quake 1-3 brought 3D games into a new light. The Quake 2 and Quake 3 engines were so advanced and powerful that other developers wanted to take advantage of them; id Software allowed developers to buy licenses for id Tech, which helped Valve, now one of the biggest game companies in the world, show its creative expertise with the highly acclaimed Half-Life series.

Quake 3 Arena: amazing graphics that took advantage of the hardware of the time like no game before it

Quake engine games can still be seen today, albeit running heavily modified versions of the engine, such as the Call of Duty franchise.

The latest iteration of id Tech allows for unlimited texture layers in its games using a virtual texturing method that compresses a massive texture file into something more manageable. The engine also runs at a stable 60fps, even on older game consoles.
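To make that idea concrete, here is a minimal sketch of how a virtual texture resolves a sample, loosely based on the van Waveren talk referenced below; the page size, cache layout and function names are my own assumptions, not id's actual code:

```python
# Sketch: a huge "virtual" texture is split into fixed-size pages, and only
# the pages needed right now are kept resident in a small physical cache.
PAGE_SIZE = 128            # texels per page side (assumed)
VIRTUAL_SIZE = 128 * 1024  # virtual texture side in texels (assumed)
MAX_MIP = 10               # coarsest mip level worth trying (assumed)

resident_pages = {}        # (page_x, page_y, mip) -> slot in the physical cache

def sample_virtual(u, v, mip=0):
    """Translate a UV sample into a page lookup, falling back to a coarser
    mip level whenever the ideal page is not resident in the cache."""
    while mip <= MAX_MIP:
        size = VIRTUAL_SIZE >> mip   # virtual texture size at this mip level
        page = (int(u * size) // PAGE_SIZE, int(v * size) // PAGE_SIZE, mip)
        if page in resident_pages:
            return resident_pages[page]   # sample from the cached page
        mip += 1                          # page not streamed in yet: go coarser
    return None                           # nothing resident: use a flat fallback
```

In the real engine a feedback pass also records which pages were wanted, so the streamer can load them in for later frames.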




Epic Games and Unreal Engine

Epic Games started out as a very simple garage gaming company that made simple 2D games such as Epic Pinball, Jazz Jackrabbit and Jill of the Jungle. When Epic Games dived into 3D, however, much like John Carmack they made waves by creating an amazing game engine known as Unreal Engine.

Unreal Engine has been competing with the Quake engines in the middleware world for well over a decade now, and thanks to its powerful tools it has become a favourite among developers around the world who do not want to spend the money making their own engine.

Jazz Jackrabbit, a 2D platformer by Epic Games

Unreal Engine 4: Creating close to CG graphics in real-time game environments


Throughout the life cycle of the Unreal Engine, graphical effects such as normal mapping, parallax occlusion mapping, real-time reflections and dynamic lighting have all made Unreal Engine stand out from the competition, while still running extremely well on reasonable mid-tier computer specifications.

Much of the technology we see today we take for granted, but moving from pixels to fully rendered, realistic 3D visuals has taken time. I believe id Software and Epic Games are prime examples of pioneers pushing 3D in ways we never thought possible.


The Democratisation of 3D Graphics

It was not too long ago that 3D software was a mystery to most people. When people saw 3D on TV, they could only imagine how such images came about. Movies like Star Wars and Tron demonstrated the possibilities, but never outright showed people how it was done.

Technology and 3D software used to be expensive; so expensive that only companies could afford computers powerful enough to create 3D images and film, and the software tools that created them.

Thankfully, technology has caught up with mainstream society, and anyone can now afford a computer capable of running 3D software. Software developers are also giving away software licenses, game engines and other tools for free, or at much more affordable prices than before. This allows more people to experience and use 3D software, and brings more creativity to the industry where it may not have been possible before.

With technology so intertwined in everyday life, we also have access to more people and potential clients or businesses who need work done in the 3D field. Not only that, but smaller companies or groups can now start projects themselves for much less than before.

This democratisation of 3D is a welcome and wonderful thing in this day and age. Anyone can pick up tools that were once out of reach, try them, and show their work online to a broad audience.

References:

Challenges of id Tech 5, J.M.P. van Waveren:
http://s09.idav.ucdavis.edu/talks/05-JP_id_Tech_5_Challenges.pdf









Monday, October 13, 2014

Video Game Shaders

The topic for today is shaders. What are shaders?


Shaders are programs applied to a 3D object or scene that drastically change the dynamics and look of the object. A model without a shader will simply look bare and bland, and a model with only a texture will seem strange to look at. This is where shaders come in.

Imagine a flat metal sheet that doesn't reflect, shine or respond correctly when light is applied to it. This sheet will only be a coloured object, with no real dynamics reflecting the environment it is in. Thanks to pixel shaders, the metal object can reflect, bounce light and show more depth in its texture (a bumpy surface), or something more complex like liquid running down it.

It was not too long ago (roughly the year 2000) that shading was very basic: cheap graphical tricks such as vertex lighting and stencil effects were used (Quake 2 and 3 are good examples of this), and surfaces could only be told apart by Blinn and Phong shading, giving them a flat and basic look.


Quake 2 by id Software: basic shading in the early years of 3D graphics.

Generally, a game engine at the time would look flat and lifeless, so tricks like coloured lighting would often make their way in, but there was still a lack of depth.



Eventually technology caught up with Hollywood, and per-pixel shaders were created. Using techniques that were common in films and 3D animation, games could start using higher quality models and shaders that drastically changed the way games looked. Bump mapping, per-pixel lighting, dynamic shadows and more material layers with individual shaders brought a whole new life to the dull look we were so used to.
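To illustrate the Blinn-Phong model mentioned above, here is a minimal per-pixel lighting sketch. Real engines run this on the GPU in a shading language; plain Python is used here purely to show the maths, and the constants are assumptions:

```python
# Per-pixel Blinn-Phong: a diffuse term plus a specular highlight.
def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def blinn_phong(normal, to_light, to_eye, base_colour,
                spec_strength=0.5, shininess=32):
    """Return the lit colour of one pixel; all vectors are unit length."""
    # Diffuse term: surfaces facing the light are brighter.
    diffuse = max(dot(normal, to_light), 0.0)
    # Specular term: a highlight where the half-vector aligns with the normal.
    half_vec = normalize(tuple(l + e for l, e in zip(to_light, to_eye)))
    specular = spec_strength * max(dot(normal, half_vec), 0.0) ** shininess
    return tuple(min(c * diffuse + specular, 1.0) for c in base_colour)

# Example: a surface facing up, lit from above and slightly to the side.
print(blinn_phong(normal=(0, 1, 0),
                  to_light=normalize((0.3, 1, 0)),
                  to_eye=(0, 1, 0),
                  base_colour=(0.8, 0.2, 0.2)))
```

The diffuse term on its own gives the flat look of early engines; the specular highlight is what makes a metal sheet actually read as shiny.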



Doom 3 by id Software: the introduction of normal maps and pixel shaders made games look far ahead of anything that came before.


Another major innovation is Ptex. Ptex was created by Walt Disney Animation Studios as an alternative to regular texture mapping.

Ptex was designed to make texturing much simpler: instead of a hand-made UV layout, every face of the model gets its own texture, and a large number of these per-face textures can be stored in a single file. It also fixes seam issues with models, where an object appears to have a very small, thin cut or space between polygons.
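A minimal sketch of the per-face idea, to show how it differs from a single unwrapped sheet; this is my own toy code, not Disney's actual Ptex API:

```python
# Each face of the mesh gets its own small texel grid, and a sample is
# addressed by (face id, local u, local v) instead of one global UV layout.
class PtexLikeTexture:
    def __init__(self, face_resolutions):
        # One square grid per face, e.g. face_resolutions = [4, 4, 8].
        self.faces = [[[(0, 0, 0)] * res for _ in range(res)]
                      for res in face_resolutions]

    def write(self, face_id, u, v, colour):
        grid = self.faces[face_id]
        res = len(grid)
        grid[int(v * (res - 1))][int(u * (res - 1))] = colour

    def sample(self, face_id, u, v):
        """Nearest-neighbour sample within one face; there is no global UV
        unwrap, so there are no seams between islands to paint across."""
        grid = self.faces[face_id]
        res = len(grid)
        return grid[int(v * (res - 1))][int(u * (res - 1))]

tex = PtexLikeTexture([4, 4, 8])   # three faces, one more detailed
tex.write(2, 0.5, 0.5, (255, 0, 0))
print(tex.sample(2, 0.5, 0.5))     # -> (255, 0, 0)
```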



References:

https://developer.nvidia.com/sites/default/files/akamai/gamedev/docs/ShaderIntegration_China.pdf

http://ptex.us/documentation.html










Monday, October 6, 2014

The production pipeline for the modern videogame


In my previous blog I gave a brief overview of a general 3D production pipeline. In this blog I will cover the modern video game pipeline in more depth.

It was not too long ago that video games were just pixels: simple primitives that showed a very basic representation of what the game was trying to depict.

    This is tennis, as represented by the Atari 2600.
Of course, technology got better. More pixels could be shown on screen and the clarity of the objects increased.

Street Fighter 2. A massive jump from tiny sprites

With advancements in technology we saw textures and polygons, although, much like the first pixelated sprites, they were very primitive as well.

Wipeout - now we see 3D; the future looks bright.


With larger budgets, more work going into 3D and technology allowing it all to happen, we now have visuals like this:
Crysis 3: Modern 3D, what a drastic change from pixels

As far as technology has come, we still find that there are limitations. Normal mapping, bump mapping, shaders, filters and other techniques have been borrowed from the movie industry and brought into video games. While we still have a long way to go before games are completely believable, a great deal of work goes into creating a modern 3D game.

As in the previous blog, there are many stages that take a game from the initial idea to the final production.

*Pre-Production

Once again, we start off with ideas, designs and an overall goal to accomplish for the game. The story is written, the gameplay is discussed and the characters and world are designed.

Concept art for a Dark Souls character

Concept world design for Dark Souls 2




Now that we have something to work with, we get into the 3D work.

*Modelling

We have come a long way from the basic models seen in the 90s; now we can create characters with richer detail and more life-like realism and believability.

The characters are given to the 3D modeller as 2D sheets showing the side, front, back and 3/4 views. These are then brought into a 3D programme like 3D Studio Max, Maya, Blender or Lightwave and turned into 3D models. The 3D modeller has to take into account the limitations of the game engine they are using and the polygon budget they have been given, as a game has much less overall freedom than, say, a blockbuster movie, which can have a near limitless polygon count. Modern CPUs and GPUs are definitely more powerful than they used to be, but can still only render so much on screen efficiently.

To counter these limitations, tricks have been developed to overcome the low polygon count.

A low polygon mesh is created that sticks to the overall polygon budget requested; at this stage the model looks noticeably more polygonal than the finished asset will. Next, a higher polygon version of the character is created, and its detail is captured in a normal map and applied to the low polygon model, giving the illusion that the low poly model has the same detail as its higher poly version. If the 3D artist is skilled, they can create a character or object that, as a game asset, looks almost indistinguishable from its high poly version.
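To show what a normal map actually stores, here is a toy encode/decode pair of my own. It is also why tangent-space normal maps look mostly light blue: the "straight up" normal (0, 0, 1) encodes to the colour (128, 128, 255).

```python
# A normal map packs a surface direction (x, y, z, each in -1..1) into an
# RGB texel (0..255); the shader decodes it back when lighting the pixel.
def encode_normal(nx, ny, nz):
    """Map a unit normal from [-1, 1] per axis into an RGB texel."""
    to_byte = lambda c: int(round((c * 0.5 + 0.5) * 255))
    return (to_byte(nx), to_byte(ny), to_byte(nz))

def decode_normal(r, g, b):
    """Recover the approximate normal a shader reads back from the texel."""
    to_unit = lambda c: c / 255 * 2 - 1
    return (to_unit(r), to_unit(g), to_unit(b))

print(encode_normal(0, 0, 1))        # -> (128, 128, 255), the flat blue
print(decode_normal(128, 128, 255))  # -> roughly (0.004, 0.004, 1.0)
```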
Example of a normal map

CD Projekt's Geralt model for The Witcher. From the left: low poly; low poly with smoothed polygons; high poly model; final render of the low poly model with the difference applied as a normal map, plus textures.





*Texturing

Texturing gives the flat, colourless model its colouring, materials and more believability. The model is unwrapped onto a flat sheet and painted on.

Wall texture



Character texture (Unreal 2003)
Much like polygons, games and game engines have a texture budget that has to be stuck to. RAM and video RAM only allow so much space for textures to be cached, streamed and used. There are some game engines that allow very large or effectively unlimited texture budgets, with everything compressed once in game. id Software's RAGE is an example of this: id Tech 5 allows massive textures, or "MegaTextures", to be shown, letting developers apply textures with no layer limits.
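A minimal sketch of the streaming side of such a budget, assuming a simple least-recently-used policy; the budget figure and class are my own illustration, not any engine's real code:

```python
from collections import OrderedDict

class TextureCache:
    """Keep textures resident within a fixed byte budget, evicting the
    least recently used texture when a new one has to stream in."""
    def __init__(self, budget=256 * 2**20):   # assume a 256 MB budget
        self.budget = budget
        self.used = 0
        self.cache = OrderedDict()            # name -> size, oldest first

    def request(self, name, size):
        if name in self.cache:
            self.cache.move_to_end(name)      # mark as recently used
            return name
        while self.used + size > self.budget and self.cache:
            _, old_size = self.cache.popitem(last=False)  # evict LRU
            self.used -= old_size
        self.cache[name] = size               # "stream in" the texture
        self.used += size
        return name

cache = TextureCache()
cache.request("rock_diffuse", 16 * 2**20)     # hypothetical 16 MB texture
```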

*Specular

Specular maps are a technique used to show reflection and material highlights on a model. Generally a black and white sheet: the white represents highlights, while the black marks areas with no highlights at all.

Example of a specular map
*LOD

LOD or "Level of detail" allows a model to lower its overall detail over a distance. Models that are closer to the viewers eyes have full detail, while a model that is very far away may only be a sprite sheet or 5-20 polygons. This allows game engines to swap between models and lower polygon and texture size of objects that do not need it.

LOD example. The polygon count of the front model is much higher than the one in the background.
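A minimal sketch of distance-based LOD selection; the distance thresholds and model names below are made-up examples:

```python
# (max distance in metres, model) pairs, nearest and most detailed first.
LOD_TABLE = [
    (10.0,  "soldier_lod0_8000_tris"),
    (40.0,  "soldier_lod1_2000_tris"),
    (120.0, "soldier_lod2_300_tris"),
]
IMPOSTOR = "soldier_billboard_sprite"   # flat sprite for extreme distances

def pick_lod(distance):
    """Return the cheapest model that still looks right at this distance."""
    for max_dist, model in LOD_TABLE:
        if distance <= max_dist:
            return model
    return IMPOSTOR

print(pick_lod(5.0))     # -> soldier_lod0_8000_tris
print(pick_lod(300.0))   # -> soldier_billboard_sprite
```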
*Animation

Much like in film, an animated model uses bones and skeleton rigs to make the character or object move.

A character model and skeleton rig.


Since animation in games is much less predictable than in a linear film, more animations are required to make sure the character's or object's movements blend together. If a player goes from a run to a kick, or a run to a jump, the animations have to blend based on the player's input. Once again, animations are stored in RAM, so there are limits to how many can be used.
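A toy sketch of the blending idea: real engines interpolate joint rotations (usually as quaternions), but reducing each pose to one assumed angle per joint is enough to show the principle:

```python
# Two poses, each as joint angles in degrees (made-up numbers).
RUN_POSE  = {"hip": 20.0, "knee": 45.0, "ankle": 10.0}
JUMP_POSE = {"hip": 60.0, "knee": 90.0, "ankle": 30.0}

def blend(pose_a, pose_b, weight):
    """Linearly mix two poses; weight 0 is pure A, weight 1 is pure B."""
    return {joint: (1 - weight) * pose_a[joint] + weight * pose_b[joint]
            for joint in pose_a}

# Over a short transition window the weight ramps from 0 to 1, so the run
# flows into the jump instead of snapping between animations.
for frame in range(5):
    w = frame / 4
    print(f"weight {w:.2f}: {blend(RUN_POSE, JUMP_POSE, w)}")
```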

Mocap: Ryse Son of Rome
Mocap, or motion capture, is also a possibility. This allows actors to be set up in special suits and act as puppets for the game characters. The actions and motions are recorded and used in cut-scenes or for character movement and actions.

With technology like PhysX, objects like cloth, hair, fur and other dynamic elements can be calculated in real time by the game engine.



*Exporting and compiling
Exporting and compiling converts the 3D models and assets into formats the game engine can read, so the engine recognises the objects as something that can be rendered.

*Level Editing
Once objects are in the engine, they can be placed by the level designer into the game world, or scripted to animate within it.

*Optimisation
Not everything goes to plan, and some objects may need to be optimised after going over the polygon, texture or animation budget. Games are often given a target frame rate (30-60fps) that needs to stay consistent, so sacrifices such as compression have to be made to reach or maintain these goals.
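To make the frame-rate target concrete, the arithmetic is simply how many milliseconds everything (rendering, animation, physics) may take per frame:

```python
for fps in (30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
```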

Once again, this is a general overview of what goes into a modern 3D video game and the pipeline used to create one.

Thursday, October 2, 2014

The 3D Production Pipeline.

The 3D Pipeline: Working from start to finish in 3D

This is a brief overview of a regular 3D production pipeline. What is a 3D production pipeline?

A 3D production pipeline is the set of steps used by creative minds in the industry to create a 3D project. The pipeline ensures that all the steps necessary to reach the goal are carried out, whether that goal is a short animation, a rendering for an advertisement or a video game. It is the co-ordination of many people, using a variety of different techniques and disciplines, to realise a single vision, and it ensures efficiency through a step-by-step process from start to finish.

An example of a 3D production pipeline would be:

Pre-production
- Creating and designing worlds/characters as sketches
- Creating a story or script
- Creating dialogue
- Creating storyboards for the script

Modelling and Texturing
- 3D modellers receive the 2D storyboards and designs, then create 3D assets from them as faithfully as possible
- Textures, bump-maps, normal maps, shaders, effects and materials are all created to give the model colour and depth.

Rigging
- The animators rig the model with a skeleton or bones used to make the character or object move
- The animators use the assets on hand to create a 3D representation of the 2D storyboards from pre-production.

Rendering and Lighting
- The 3D scenes are lit with appropriate lighting, and the entire scene and its assets are rendered frame by frame, either by a single computer or by a render farm (a series of computers designed to work together) to speed up the overall rendering time, as the quick arithmetic below shows
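Frames are independent of one another, which is why farms scale so well; with made-up but plausible numbers:

```python
# How long a two-minute short takes to render as machines are added
# (both figures below are assumptions, and overhead is ignored).
FRAMES = 24 * 60 * 2          # two minutes of animation at 24 fps
MINUTES_PER_FRAME = 30        # assumed render cost per frame

for machines in (1, 10, 100):
    hours = FRAMES * MINUTES_PER_FRAME / machines / 60
    print(f"{machines:>3} machine(s): ~{hours:,.0f} hours")
# 1 machine: ~1,440 hours; 100 machines: ~14 hours
```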

Not all pipelines need to match up like this, as some companies may not have the time, budget or staff to make a big blockbuster, but tailoring a pipeline to suit their needs will ensure they follow a step-by-step process to complete their projects.

Daniel Tarr

References:

Digital Tutors: 3D Production Pipeline
https://www.youtube.com/watch?v=UlQVPbC5iJ0

The Art of Titanfall, by Andy McVittie, published 25th February 2014