Unity Manual Printable
Unity Manual
Welcome to Unity.
Unity is made to empower you to create the best interactive entertainment or multimedia experience that you can. This manual is designed to
help you learn how to use Unity, from basic to advanced techniques. It can be read from start to finish or used as a reference.
The manual is divided into different sections. The first section, User Guide, is an introduction to Unity's interface, asset workflow, and the basics
of building a game. If you are new to Unity, you should start by reading the Unity Basics subsection.
The iOS Guide addresses iOS specific topics such as iOS-specific scripting API, optimizations, and general platform development questions.
The Android Guide addresses Android specific topics such as setting up the Android SDK and general development questions.
The next section, FAQ, is a collection of frequently asked questions about performing common tasks that require a few steps.
The last section, Advanced, addresses topics such as game optimization, shaders, file sizes, and deployment.
When you've finished reading, take a look at the Reference Manual and the Scripting Reference for further details about the different
possibilities of constructing your games with Unity.
If you find that any question you have is not answered in this manual please ask on Unity Answers or Unity Forums. You will be able to find
your answer there.
Happy reading,
The Unity team
The Unity Manual Guide contains some sections that apply only to certain platforms. Please select which platforms you want to see. Platform-
specific information can always be seen by clicking on the disclosure triangles on each page.
Table of Contents
User Guide
Unity Basics
Learning the Interface
Project Browser
Hierarchy
Toolbar
Scene View
Game View
Inspector
Other Views
Customizing Your Workspace
Asset Workflow
Creating Scenes
Publishing Builds
Tutorials
Unity Hotkeys
Preferences
Building Scenes
GameObjects
The GameObject-Component Relationship
Using Components
The Component-Script Relationship
Deactivating GameObjects
Using the Inspector
Editing Value Properties
Assigning References
Multi-Object Editing
Inspector Options
Using the Scene View
Scene View Navigation
Positioning GameObjects
View Modes
Gizmo and Icon Display Controls
Searching
Prefabs
Lights
Cameras
Terrain Engine Guide
Asset Import and Creation
Importing Assets
Models
3D formats
Legacy animation system
Materials and Shaders
Texture 2D
Procedural Materials
Movie Texture
Audio Files
Tracker Modules
Using Scripts
Asset Store
Asset Store Publisher Administration
Asset Server (Team License Only)
Setting up the Asset Server
Cache Server (Team License Only)
Cache Server (Team license only)
Cache Server FAQ
Behind the Scenes
Creating Gameplay
Instantiating Prefabs at runtime
Input
Transforms
Physics
Adding Random Gameplay Elements
Particle Systems
Particle System Curve Editor
Colors and Gradients in the Particle System (Shuriken)
Gradient Editor
Particle System Inspector
Introduction to Particle System Modules (Shuriken)
Particle System Modules (Shuriken)
Particle Effects (Shuriken)
Mecanim Animation System
A Glossary of Animation and Mecanim terms
Asset Preparation and Import
Analytics
Check For Updates
Installing Multiple Versions of Unity
Troubleshooting
Troubleshooting Editor
Troubleshooting Webplayer
Shadows in Unity
Directional Shadow Details
Troubleshooting Shadows
Shadow Size Computation
IME in Unity
Optimizing for integrated graphics cards
Web Player Deployment
HTML code to load Unity content
Working with UnityObject2
Customizing the Unity Web Player loading screen
Customizing the Unity Web Player's Behavior
Unity Web Player and browser communication
Using web player templates
Web Player Streaming
Webplayer Release Channels
Using the Chain of Trust system in the Web Player
Page last updated: 2012-11-14
User Guide
This section of the Manual is focused on the features and functions of Unity. It discusses the interface, core Unity building blocks, asset
workflow, and basic gameplay creation. By the time you are done reading the user guide, you will have a solid understanding of how to use
Unity to put together an interactive scene and publish it.
We recommend that new users begin by reading the Unity Basics section.
Unity Basics
Learning the Interface
Project Browser
Hierarchy
Toolbar
Scene View
Game View
Inspector
Other Views
Customizing Your Workspace
Asset Workflow
Creating Scenes
Publishing Builds
Tutorials
Unity Hotkeys
Preferences
Building Scenes
GameObjects
The GameObject-Component Relationship
Using Components
The Component-Script Relationship
Deactivating GameObjects
Unity Basics
This section is your key to getting started with Unity. It explains the Unity interface and menu items, and shows you how to use assets, create scenes, and
publish builds.
When you are finished reading this section, you will understand how Unity works, how to use it effectively, and the steps to put a basic game
together.
Project Browser
Hierarchy
Toolbar
Scene View
Game View
Inspector
Other Views
Page last updated: 2012-10-17
Project Browser
In this view, you can access and manage the assets that belong to your project.
The left panel of the browser shows the folder structure of the project as a hierarchical list. When a folder is selected from the list by clicking, its
contents will be shown in the panel to the right. The individual assets are shown as icons that indicate their type (script, material, sub-folder,
etc). The icons can be resized using the slider at the bottom of the panel; they will be replaced by a hierarchical list view if the slider is moved to
the extreme left. The space to the left of the slider shows the currently selected item, including a full path to the item if a search is being
performed.
Above the project structure list is a Favorites section where you can keep frequently-used items for easy access. You can drag items from the
project structure list to the Favorites and also save search queries there (see Searching below).
Just above the panel is a "breadcrumb trail" that shows the path to the folder currently being viewed. The separate elements of the trail can be
clicked for easy navigation around the folder hierarchy. When searching, this bar changes to show the area being searched (the root Assets
folder, the selected folder or the Asset Store) along with a count of free and paid assets available in the store, separated by a slash. There is an
option in the General section of Unity's Preferences window to disable the display of Asset Store hit counts if they are not required.
Located at the left side of the toolbar, the Create menu lets you add new assets and sub-folders to the current folder. To its right are a set of
tools to allow you to search the assets in your project.
The Window menu provides the option of switching to a one-column version of the project view, essentially just the hierarchical structure list
without the icon view. The lock icon next to the menu enables you to "freeze" the current contents of the view (ie, stop them being changed by
events elsewhere) in a similar manner to the inspector lock.
Searching
The browser has a powerful search facility that is especially useful for locating assets in large or unfamiliar projects. The basic search will filter
assets according to the text typed in the search box.
If you type more than one search term then the search is narrowed, so if you type coastal scene it will only find assets with both "coastal" and
"scene" in the name (ie, terms are ANDed together).
To the right of the search bar are three buttons. The first allows you to further filter the assets found by the search according to their type.
Continuing to the right, the next button filters assets according to their Label (labels can be set for an asset in the Inspector). Since the number
of labels can potentially be very large, the label menu has its own mini-search filter box.
Note that the filters work by adding an extra term in the search text. A term beginning with "t:" filters by the specified asset type, while "l:" filters
by label. You can type these terms directly into the search box rather than use the menu if you know what you are looking for. You can search
for more than one type or label at once. Adding several types will expand the search to include all specified types (ie, types will be ORed
together). Adding multiple labels will narrow the search to items that have all the specified labels (ie, labels are ANDed).
The rightmost button saves the search by adding an item to the Favorites section of the asset list.
If you select an item from the list, its details will be displayed in the inspector along with the option to purchase and/or download it. Some asset
types have previews available in this section so you can, for example, play an audio clip or rotate a 3D model before buying. The inspector also
gives the option of viewing the asset in the usual Asset Store window to see additional details.
Shortcuts
The following keyboard shortcuts are available when the browser view has focus. Note that some of them only work when the view is using the
two-column layout (you can switch between the one- and two-column layouts using the panel menu in the very top right corner).
F Frame selection
Tab Shift focus between first column and second column (Two columns)
Ctrl/Cmd + F Focus search field
Ctrl/Cmd + A Select all visible items in list
Ctrl/Cmd + D Duplicate selected assets
Delete Delete with dialog
Delete + Shift Delete without dialog
Backspace + Cmd Delete without dialogs (OSX)
Enter Begin rename selected (OSX)
Cmd + down arrow Open selected assets (OSX)
Cmd + up arrow Jump to parent folder (OSX, Two columns)
F2 Begin rename selected (Win)
Enter Open selected assets (Win)
Backspace Jump to parent folder (Win, Two columns)
Right arrow Expand selected item (tree views and search results). If the item is already expanded, this will select its first child item.
Left arrow Collapse selected item (tree views and search results). If the item is already collapsed, this will select its parent item.
Hierarchy
The Hierarchy contains every GameObject in the current Scene. Some of these are direct instances of asset files like 3D models, and others
are instances of Prefabs, custom objects that will make up much of your game. You can select objects in the Hierarchy and drag one object
onto another to make use of Parenting (see below). As objects are added and removed in the scene, they will appear and disappear from the
Hierarchy as well.
Parenting
Unity uses a concept called Parenting. To make any GameObject the child of another, drag the desired child onto the desired parent in the
Hierarchy. A child will inherit the movement and rotation of its parent. You can use a parent object's foldout arrow to show or hide its children as
necessary.
To learn more about Parenting, please review the Parenting section of the Transform Component page.
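Parenting can also be set up from a script by assigning a Transform's parent property. Below is a minimal C# sketch; the class and field names are invented for the example, and the parent is assumed to be assigned in the Inspector.

// C#
using UnityEngine;

public class AttachToParent : MonoBehaviour
{
    // Assign the intended parent Transform in the Inspector.
    public Transform newParent;

    void Start()
    {
        // This GameObject now inherits the movement and rotation of newParent.
        transform.parent = newParent;
    }
}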
Toolbar
The Toolbar consists of five basic controls. Each relates to a different part of the Editor.
Scene View
The Scene View is your interactive sandbox. You will use the Scene View to select and position environments, the player, the camera,
enemies, and all other GameObjects. Maneuvering and manipulating objects within the Scene View are some of the most important functions
in Unity, so it's important to be able to do them quickly. To this end, Unity provides keystrokes for the most common operations.
Hold the right mouse button to enter Flythrough mode. This turns your mouse and WASD keys (plus Q and E for up and down) into quick
first-person view navigation.
Select any GameObject and press the F key. This will center the Scene View and pivot point on the selection.
Use the arrow keys to move around on the X/Z plane.
Hold Alt and click-drag to orbit the camera around the current pivot point.
Hold Alt and middle click-drag to drag the Scene View camera around.
Hold Alt and right click-drag to zoom the Scene View. This is the same as scrolling with your mouse wheel.
You might also find use in the Hand Tool (shortcut: Q), especially if you are using a one-button mouse. With the Hand tool selected:
Hold Alt and click-drag to orbit the camera around the current pivot point.
In the upper-right corner of the Scene View is the Scene Gizmo. This displays the Scene Camera's current orientation, and allows you to
quickly modify the viewing angle.
Each of the coloured "arms" of the gizmo represents a geometric axis. You can click on any of the arms to set the camera to an orthographic
(i.e., perspective-free) view looking along the corresponding axis. You can click on the text underneath the gizmo to switch between the normal
perspective view and an isometric view. While in isometric mode, you can right-click drag to orbit, and Alt-click drag to pan.
Positioning GameObjects
See Positioning GameObjects for full details on positioning GameObjects in the scene. Here's a brief overview of the essentials:
When building your games, you'll place lots of different objects in your game world. To do this use the Transform Tools in the Toolbar to
Translate, Rotate, and Scale individual GameObjects. Each has a corresponding Gizmo that appears around the selected GameObject in the
Scene View. You can use the mouse and manipulate any Gizmo axis to alter the Transform Component of the GameObject, or you can type
values directly into the number fields of the Transform Component in the Inspector.
The Scene View control bar lets you see the scene in various view modes - Textured, Wireframe, RGB, Overdraw, and many others. It will also
enable you to see (and hear) in-game lighting, game elements, and sound in the Scene View. See View Modes for all the details.
Game View
The Game View is rendered from the Camera(s) in your game. It is representative of your final, published game. You will need to use one or
more Cameras to control what the player actually sees when they are playing your game. For more information about Cameras, please view
the Camera Component page.
Play Mode
Use the buttons in the Toolbar to control the Editor Play Mode and see how your published game will play. While in Play mode, any changes
you make are temporary, and will be reset when you exit Play mode. The Editor UI will darken to remind you of this.
The first drop-down on the Game View control bar is the Aspect Drop-down. Here, you can force the aspect ratio of the Game View window to
different values. It can be used to test how your game will look on monitors with different aspect ratios.
Further to the right is the Maximize on Play toggle. While enabled, the Game View will maximize itself to 100% of your Editor Window for a
nice full-screen preview when you enter Play mode.
Continuing to the right is the Stats button. This shows the Rendering Statistics window, which is very useful for monitoring the graphics performance
of your game (see Optimizing Graphics Performance for further details).
The last button is the Gizmos toggle. While enabled, all Gizmos that appear in Scene View will also be drawn in Game View. This includes
Gizmos drawn using any of the Gizmos class functions. The Gizmos button also has a popup menu showing the various different types of
Components used in the game.
Next to each Component's name are the settings for the icon and gizmos associated with it. The Icon setting reveals another popup menu
which lets you choose from a selection of preset icons or a custom icon defined by a texture.
The Gizmo setting enables you to selectively disable Gizmo drawing for specific components.
The 3D Gizmos setting at the top of the menu refers to the Gizmo icons. With the setting enabled, the icons will show the perspective of the
camera (ie, icons for nearby objects will be larger than those for distant objects), otherwise they will be the same size regardless of distance.
The slider next to the checkbox allows you to vary the size of the icons, which can be useful for reducing clutter when there are a lot of gizmos
visible.
Inspector
Games in Unity are made up of multiple GameObjects that contain meshes, scripts, sounds, or other graphical elements like Lights. The
Inspector displays detailed information about your currently selected GameObject, including all attached Components and their properties.
Here, you modify the functionality of GameObjects in your scene. You can read more about the GameObject-Component relationship, as it is
very important to understand.
Any property that is displayed in the Inspector can be directly modified. Even script variables can be changed without modifying the script itself.
You can use the Inspector to change variables at runtime to experiment and find the magic gameplay for your game. In a script, if you define a
public variable of an object type (like GameObject or Transform), you can drag and drop a GameObject or Prefab into the Inspector to make
the assignment.
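As a brief sketch (the class and member names here are invented for illustration), a script like the following exposes both a reference property and a value property in the Inspector:

// C#
using UnityEngine;

public class FollowTarget : MonoBehaviour
{
    // Reference property: drag a GameObject or Prefab onto this field in the Inspector.
    public Transform target;

    // Value property: can be tweaked in the Inspector, even at runtime.
    public float speed = 2.0f;

    void Update()
    {
        if (target != null)
            transform.position = Vector3.MoveTowards(transform.position, target.position, speed * Time.deltaTime);
    }
}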
Click the question mark beside any Component name in the Inspector to load its Component Reference page. Please view the Component
Reference for a complete and detailed guide to all of Unity's Components.
You can click the tiny gear icon (or right-click the Component name) to bring up a context menu for the specific Component.
The Inspector will also show any Import Settings for a selected asset file.
Use the Layer drop-down to assign a rendering Layer to the GameObject. Use the Tag drop-down to assign a Tag to this GameObject.
Prefabs
If you have a Prefab selected, some additional buttons will be available in the Inspector. For more information about Prefabs, please view the
Prefab manual page.
Labels
Unity allows assets to be marked with Labels to make them easier to locate and categorise. The bottom item on the inspector is the Asset
Labels panel.
At the bottom right of this panel is a button titled with an ellipsis ("...") character. Clicking this button will bring up a menu of available labels.
You can select one or more items from the labels menu to mark the asset with those labels (they will also appear in the Labels panel). If you
click a second time on one of the active labels, it will be removed from the asset.
The menu also has a text box that you can use to specify a search filter for the labels in the menu. If you type a label name that does not yet
exist and press return/enter, the new label will be added to the list and applied to the selected asset. If you remove a custom label from all
assets in the project, it will disappear from the list.
Once you have applied labels to your assets, you can use them to refine searches in the Project Browser (see this page for further details).
You can also access an asset's labels from an editor script using the AssetDatabase class.
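For example, a small editor script might read and then replace an asset's labels as sketched below; the menu item name and the label values are arbitrary examples chosen for illustration.

// C# (editor script, placed in an Editor folder)
using UnityEditor;
using UnityEngine;

public static class LabelExample
{
    [MenuItem("Assets/Print And Set Labels")]
    static void PrintAndSetLabels()
    {
        Object asset = Selection.activeObject;
        if (asset == null)
            return;

        // Read the labels currently applied to the selected asset.
        foreach (string label in AssetDatabase.GetLabels(asset))
            Debug.Log(label);

        // Replace them with a new set of labels.
        AssetDatabase.SetLabels(asset, new string[] { "Environment", "Prop" });
    }
}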
Other Views
The Views described on this page cover the basics of the interface in Unity. The other Views in Unity are described elsewhere on separate pages:
Customizing Your Workspace
Tabs can also be detached from the Main Editor Window and arranged into their own floating Editor Windows. Floating Windows can contain
arrangements of Views and Tabs just like the Main Editor Window.
Floating Editor Windows are the same as the Main Editor Window, except there is no Toolbar
When you've created a Layout of Editor Windows, you can Save the layout and restore it any time. You do this by expanding the Layout drop-
down (found on the Toolbar) and choosing Save Layout.... Name your new layout and save it, then restore it by simply choosing it from the
Layout drop-down.
At any time, you can right-click the tab of any view to view additional options like Maximize or add a new tab to the same window.
Asset Workflow
Here we'll explain the steps to use a single asset with Unity. These steps are general and are meant only as an overview for basic actions. For
the example, we'll talk about using a 3D mesh.
Import
When you save your asset initially, you should save it normally to the Assets folder in your Project folder. When you open the Unity project, the
asset will be detected and imported into the project. When you look in the Project View, you'll see the asset located there, right where you
saved it. Please note that Unity uses the FBX exporter provided by your modeling package to convert your models to the FBX file format. You
will need to have the FBX exporter of your modeling package available for Unity to use. Alternatively, you can export to FBX directly from your
application and save the file in your project's Assets folder. For a list of applications that are supported by Unity, please see this page.
Import Settings
If you select the asset in the Project View the import settings for this asset will appear in the Inspector. The options that are displayed will
change based on the type of asset that is selected.
Creating a Prefab
Prefabs are a collection of GameObjects & Components that can be re-used in your scenes. Several identical objects can be created from a
single Prefab, a process called instancing. Take trees for example. Creating a tree Prefab will allow you to instance several identical trees and place them
in your scene. Because the trees are all linked to the Prefab, any changes that are made to the Prefab will automatically be applied to all tree
instances. So if you want to change the mesh, material, or anything else, you just make the change once in the Prefab and all the other trees
inherit the change. You can also make changes to an instance, and choose GameObject->Apply Changes to Prefab from the main menu.
This can save you lots of time during setup and updating of assets.
When you have a GameObject that contains multiple Components and a hierarchy of child GameObjects, you can make a Prefab of the top-
level GameObject (or root), and re-use the entire collection of GameObjects.
Think of a Prefab as a blueprint for a structure of GameObjects. All the Prefab clones are identical to the blueprint. Therefore, if the blueprint is
updated, so are all the clones. There are different ways you can update the Prefab itself by changing one of its clones and applying those
changes to the blueprint. To read more about using and updating Prefabs, please view the Prefabs page.
To actually create a Prefab from a GameObject in your scene, simply drag the GameObject from the scene into the project, and you should see
the Game Object's name text turn blue. Name the new Prefab whatever you like. You have now created a re-usable prefab.
Updating Assets
You have imported, instantiated, and linked your asset to a Prefab. Now when you want to edit your source asset, just double-click it from the
Project View. The appropriate application will launch, and you can make any changes you want. When you're done updating it, just Save it.
Then, when you switch back to Unity, the update will be detected, and the asset will be re-imported. The asset's link to the Prefab will also be
maintained. So the effect you will see is that your Prefab will update. That's all you have to know to update assets. Just open it and save!
To add a label to an asset:
Select the asset you want to add the label to in the Project View.
In the Inspector, click the "Add Label" icon if the asset doesn't have any Labels associated with it yet.
If the asset already has a label, just click where the labels are shown.
Notes:
You can have more than one label for any asset.
To separate/create labels, just press space or enter when writing asset label names.
Page last updated: 2012-09-14
Creating Scenes
Scenes contain the objects of your game. They can be used to create a main menu, individual levels, and anything else. Think of each unique
Scene file as a unique level. In each Scene, you will place your environments, obstacles, and decorations, essentially designing and building
your game in pieces.
Instancing Prefabs
Use the method described in the last section to create a Prefab. You can also read more details about Prefabs here. Once you've created a
Prefab, you can quickly and easily make copies of the Prefab, called an Instance. To create an instance of any Prefab, drag the Prefab from
the Project View to the Hierarchy or Scene View. Now you have a unique instance of your Prefab to position and tweak as you like.
If adding a Component breaks the GameObject's connection to its Prefab, you can always use GameObject->Apply Changes to Prefab from
the menu to re-establish the link.
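Prefabs can also be instanced from a script at runtime (see the Instantiating Prefabs at runtime page for details). A minimal C# sketch follows; the class and field names were chosen just for the example, and the Prefab is assumed to be assigned in the Inspector.

// C#
using UnityEngine;

public class TreeSpawner : MonoBehaviour
{
    // Assign a Prefab to this field in the Inspector.
    public GameObject treePrefab;

    void Start()
    {
        // Create an instance of the Prefab at the origin with no rotation.
        Instantiate(treePrefab, Vector3.zero, Quaternion.identity);
    }
}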
Placing GameObjects
Once your GameObject is in the scene, you can use the Transform Tools to position it wherever you like. Additionally, you can use the
Transform values in the Inspector to fine-tune placement and rotation. Please view the Transform Component page for more information about
positioning and rotating GameObjects.
Lights
Except for some very few cases, you will always need to add Lights to your scene. There are three different types of lights, and all of them
behave a little differently from each other. The important thing is that they add atmosphere and ambience to your game. Different lighting can
completely change the mood of your game, and using lights effectively will be an important subject to learn. To read about the different lights,
please view the Light component page.
Publishing Builds
At any time while you are creating your game, you might want to see how it looks when you build and run it outside of the editor as a
standalone or web player. This section will explain how to access the Build Settings and how to create different builds of your games.
File->Build Settings... is the menu item to access the Build Settings window. It pops up an editable list of the scenes that will be included
when you build your game.
The first time you view this window in a project, it will appear blank. If you build your game while this list is blank, only the currently open scene
will be included in the build. If you want to quickly build a test player with only one scene file, just build a player with a blank scene list.
It is easy to add scene files to the list for multi-scene builds. There are two ways to add them. The first way is to click the Add Current button.
You will see the currently open scene appear in the list. The second way to add scene files is to drag them from the Project View to the list.
At this point, notice that each of your scenes has a different index value. Scene 0 is the first scene that will be loaded when you build the
game. When you want to load a new scene, use Application.LoadLevel() inside your scripts.
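For example (the scene index and name below are placeholders for whatever appears in your own scene list), a script can switch scenes by index or by name:

// C#
using UnityEngine;

public class LevelSwitch : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // Load the scene at index 1 in the Build Settings list...
        Application.LoadLevel(1);

        // ...or, equivalently, load a scene by its name:
        // Application.LoadLevel("Level2");
    }
}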
If you've added more than one scene file and want to rearrange them, simply click and drag the scenes on the list above or below others until
you have them in the desired order.
If you want to remove a scene from the list, click to highlight the scene and press Command-Delete. The scene will disappear from the list and
will not be included in the build.
When you are ready to publish your build, select a Platform and make sure that the Unity logo is next to the platform; if it is not, click the
Switch Platform button to let Unity know which platform you want to build for. Finally, press the Build button. You will be able to select a name
and location for the game using a standard Save dialog. When you click Save, Unity builds your game pronto. It's that simple. If you are unsure
where to save your built game, consider saving it into the project's root folder. You cannot save the build into the Assets folder.
Enabling the Development Build checkbox on a player will enable Profiler functionality and also make the Autoconnect Profiler and Script
Debugging options available.
Desktop
Put simply, Streaming Web Players will get players playing your game faster than ever.
The only thing you need to worry about is checking to make sure that the next level you want to load is finished streaming before you load it.
Normally, in a non-streamed player, you use the following code to load a level:
Application.LoadLevel("levelName");
In a Streaming Web Player, you must first check that the level is finished streaming. This is done through the CanStreamedLevelBeLoaded()
function. This is how it works:
var levelToLoad = 1;

function LoadNewLevel () {
    if (Application.CanStreamedLevelBeLoaded (levelToLoad)) {
        Application.LoadLevel (levelToLoad);
    }
}
If you would like to display the level streaming progress to the player, for a loading bar or other representation, you can read the progress by
accessing GetStreamProgressForLevel().
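A minimal sketch of such a readout, using Unity's immediate-mode GUI purely for illustration (the class name and label layout are arbitrary):

// C#
using UnityEngine;

public class StreamProgress : MonoBehaviour
{
    public int levelToLoad = 1;

    void OnGUI()
    {
        // GetStreamProgressForLevel returns a value between 0 and 1.
        float progress = Application.GetStreamProgressForLevel(levelToLoad);
        GUI.Label(new Rect(10, 10, 300, 20), "Streaming: " + Mathf.RoundToInt(progress * 100) + "%");
    }
}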
To distribute your standalone build on Mac, you only need to provide the app bundle (everything is packed inside it). On Windows you need to provide both the
.exe file and the Data folder for others to run it. Think of it like this: other people must have the same files on their computer as the resulting
files that Unity builds for you, in order to run your game.
The building process will place a blank copy of the built game application wherever you specify. Then it will work through the scene list in the
build settings, open them in the editor one at a time, optimize them, and integrate them into the application package. It will also calculate all the
assets that are required by the included scenes and store that data in a separate file within the application package.
Any GameObject in a scene that is tagged with 'EditorOnly' will not be included in the published build. This is useful for debugging
scripts that don't need to be included in the final game.
When a new level loads, all the objects in the previous level are destroyed. To prevent this, use DontDestroyOnLoad() on any objects you
don't want destroyed. This is most commonly used for keeping music playing while loading a level, or for game controller scripts which keep
game state and progress.
After a new level has finished loading, the OnLevelWasLoaded() message will be sent to all active game objects.
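A brief sketch of both points (the class name is just an example):

// C#
using UnityEngine;

public class MusicPlayer : MonoBehaviour
{
    void Awake()
    {
        // Keep this GameObject (e.g. one with an AudioSource playing music) alive across level loads.
        DontDestroyOnLoad(gameObject);
    }

    // Sent to all active GameObjects after a new level has finished loading.
    void OnLevelWasLoaded(int level)
    {
        Debug.Log("Finished loading level " + level);
    }
}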
For more information on how to best create a game with multiple scenes, for instance a main menu, a high-score screen, and actual game
levels, see the Scripting Tutorial.pdf
iOS
The iOS application build process is performed in two steps:
1. XCode project is generated with all the required libraries, precompiled .NET code and serialized assets.
2. XCode project is built and deployed on the actual device.
When "Build" is hit on "Build settings" dialog only the first step is accomplished. Hitting "Build and Run" performs both steps. If in the project
save dialog the user selects an already existing folder an alert is displayed. Currently there are two XCode project generation modes to select:
replace - all the files from the target folder are removed and the new content is generated
append - the "Data", "Libraries" and project root folder are cleaned and filled with newly generated content. The XCode project file is
updated according to the latest Unity project changes. The XCode project's "Classes" subfolder can be considered a safe place for
custom native code, but making regular backups is recommended. Append mode is supported only for existing XCode projects
generated with the same Unity iOS version.
If Cmd+B is pressed, the automatic build and run process is invoked and the most recently used folder is assumed as the build target. In this case
append mode is used by default.
Android
The Android application build process is performed in two steps:
1. Application package (.apk file) is generated with all the required libraries and serialized assets.
2. Application package is deployed on the actual device.
When "Build" is hit on "Build settings" dialog only the first step is accomplished. Hitting "Build and Run" performs both steps. If Cmd+B is hit
then the automatic build and run process is invoked and the latest used file is assumed as the build target.
Upon the first attempt to build an Android project, Unity will ask you to locate the Android SDK, which is required to build and install your
Android application on the device. You can change this setting later in Preferences.
When building the app for an Android device, make sure that the device has the "USB Debugging" and "Allow mock locations" checkboxes enabled in
the device settings.
You can ensure that the operating system sees your device by running the adb devices command, found in your Android SDK's platform-
tools folder. This should work on both Mac and Windows.
Unity builds an application archive (.apk file) for you and installs it on the connected device. In some cases your application cannot start automatically
as it does on iPhone, so you need to unlock the screen and, in some rare cases, find the newly installed application in the menu.
Texture Compression
Under Build Settings you'll also find the Texture Compression option. By default, Unity uses the ETC1/RGBA16 texture formats for textures that
don't have individual texture format overrides (see Texture 2D / Per-Platform Overrides).
If you want to build an application archive (.apk file) targeting a specific hardware architecture, you can use the Texture Compression option to
override this default behavior. Any texture that is set to not be compressed will be left alone; only textures using a compressed texture format
will use the format selected in the Texture Compression option.
To make sure the application is only deployed on devices which support the selected texture compression, Unity will edit the AndroidManifest to
include tags matching the particular format selected. This will enable the Android Market filtering mechanism to only serve the application to
devices with the appropriate graphics hardware.
Preloading
Published builds automatically preload all assets in a scene when the scene loads. The exception to this rule is scene 0. This is because the
first scene is usually a splashscreen, which you want to display as quickly as possible.
To make sure all your content is preloaded, you can create an empty scene which calls Application.LoadLevel(1). In the build settings make
this empty scene's index 0. All subsequent levels will be preloaded.
To learn more details about using Unity itself, you can continue reading the manual or follow the Tutorials.
To learn more about Components, the nuts & bolts of game behaviors, please read the Component Reference.
To learn more about creating Art assets, please read the Assets section of the manual.
To interact with the community of Unity users and developers, visit the Unity Forums. You can ask questions, share projects, build a team,
anything you want to do. Definitely visit the forums at least once, because we want to see the amazing games that you make.
Tutorials
These tutorials will let you work with Unity while you follow along. They will give you hands-on experience with building real projects. For new
users, it is recommended that you follow the GUI Essentials and Scripting Essentials tutorials first. After that, you can follow any of them. They
are all in PDF format, so you can print them out and follow along or read them alongside Unity.
Note: These Tutorials are intended for use with the Desktop version of Unity; they will not work with Android or iOS devices (iPhone/iPad).
Also, if you are searching for other resources such as presentations, articles, assets or extensions for Unity, you can find them here.
You can also check the latest tutorial additions on our Unity3D Tutorials Home Page.
Unity Hotkeys
This page gives an overview of the default Unity Hotkeys. You can also download a PDF of the table for Windows and MacOSX. Where a
command has CTRL/CMD as part of the keystroke, this indicates that the Control key should be used on Windows and the Command key on
MacOSX.
Tools
Keystroke Command
Q Pan
W Move
E Rotate
R Scale
Z Pivot Mode toggle
X Pivot Rotation toggle
V Vertex Snap
CTRL/CMD+LMB Snap
GameObject
CTRL/CMD+SHIFT+N New game object
CTRL/CMD+ALT+F Move to view
CTRL/CMD+SHIFT+F Align with view
Window
CTRL/CMD+1 Scene
CTRL/CMD+2 Game
CTRL/CMD+3 Inspector
CTRL/CMD+4 Hierarchy
CTRL/CMD+5 Project
CTRL/CMD+6 Animation
CTRL/CMD+7 Profiler
CTRL/CMD+9 Asset store
CTRL/CMD+0 Animation
CTRL/CMD+SHIFT+C Console
Edit
CTRL/CMD+Z Undo
CTRL+Y (Windows only) Redo
CMD+SHIFT+Z (Mac only) Redo
CTRL/CMD+X Cut
CTRL/CMD+C Copy
CTRL/CMD+V Paste
CTRL/CMD+D Duplicate
SHIFT+Del Delete
F Frame (centre) selection
CTRL/CMD+F Find
CTRL/CMD+A Select All
Selection
CTRL/CMD+SHIFT+1 Load Selection 1
CTRL/CMD+SHIFT+2 Load Selection 2
CTRL/CMD+SHIFT+3 Load Selection 3
CTRL/CMD+SHIFT+4 Load Selection 4
CTRL/CMD+SHIFT+5 Load Selection 5
CTRL/CMD+SHIFT+6 Load Selection 6
CTRL/CMD+SHIFT+7 Load Selection 7
CTRL/CMD+SHIFT+8 Load Selection 8
CTRL/CMD+SHIFT+9 Load Selection 9
CTRL/CMD+ALT+1 Save Selection 1
CTRL/CMD+ALT+2 Save Selection 2
CTRL/CMD+ALT+3 Save Selection 3
CTRL/CMD+ALT+4 Save Selection 4
CTRL/CMD+ALT+5 Save Selection 5
CTRL/CMD+ALT+6 Save Selection 6
CTRL/CMD+ALT+7 Save Selection 7
CTRL/CMD+ALT+8 Save Selection 8
CTRL/CMD+ALT+9 Save Selection 9
Assets
CTRL/CMD+R Refresh
Page last updated: 2012-09-12
Preferences
Unity provides a number of preference panels to allow you to customise the behaviour of the editor.
General
Auto Refresh Should the editor update assets automatically as they change?
Always Show Project Wizard Should the project wizard be shown at startup? (By default, it is shown only when the alt key is held down during launch.)
Compress Assets On Import Should assets be compressed automatically during import?
OSX Color Picker Should the native OSX color picker be used instead of Unity's own?
Editor Analytics Can the editor send information back to Unity automatically?
Show Asset Store search hits Should the number of free/paid assets from the store be shown in the Project Browser?
Verify Saving Assets Should Unity verify which assets to save individually on quitting?
Skin (Pro Only) Which color scheme should Unity use for the editor? Pro users have the option of dark grey in addition to the default light grey.
External Tools
External Script Editor Which application should Unity use to open script files?
Editor Attaching Should Unity allow debugging to be controlled from the external script editor?
Image Application Which application should Unity use to open image files?
Asset Server Diff Tool Which application should Unity use to resolve file differences with the asset server?
Android SDK Location Where in the filesystem is the Android SDK folder located?
iOS Xcode 4.x support Should support for Xcode 4.x be enabled for iOS build targets?
Colors
This panel allows you to choose the colors that Unity uses when displaying various user interface elements.
Keys
This panel allows you to set the keystrokes that activate the various commands in Unity.
Cache Server
Building Scenes
This section will explain the core elements you will work with to build scenes for complete games.
GameObjects
The GameObject-Component Relationship
Using Components
The Component-Script Relationship
Deactivating GameObjects
Using the Inspector
Editing Value Properties
Assigning References
Multi-Object Editing
Inspector Options
Using the Scene View
Scene View Navigation
Positioning GameObjects
View Modes
Gizmo and Icon Display Controls
Searching
Prefabs
Lights
Cameras
Terrain Engine Guide
GameObjects
GameObjects are the most important objects in Unity. It is very important to understand what a GameObject is, and how it can be used. This
page will explain all that for you.
So what is a GameObject? The answer is that GameObjects are containers. They are empty boxes which can hold the different pieces that make up a
lightmapped island or a physics-driven car. So to really understand GameObjects, you have to understand these pieces; they are called
Components. Depending on what kind of object you want to create, you will add different combinations of Components to the GameObject.
Think of a GameObject as an empty cooking pot, and Components as different ingredients that make up your recipe of gameplay. You can also
make your own Components using Scripts.
You can read more about GameObjects, Components, and Script Components on the pages in this section:
Notice that an empty GameObject still contains a Name, a Tag, and a Layer. Every GameObject also contains a Transform Component.
The Transform Component also enables a concept called Parenting, which is utilized through the Unity Editor and is a critical part of working
with GameObjects. To learn more about the Transform Component and Parenting, read the Transform Component Reference page.
Other Components
The Transform Component is critical to all GameObjects, so each GameObject has one. But GameObjects can contain other Components as
well.
Looking at the Main Camera GameObject, you can see that it contains a different collection of Components. Specifically, a Camera
Component, a GUILayer, a Flare Layer, and an Audio Listener. All of these Components provide additional functionality to the GameObject.
Without them, there would be nothing rendering the graphics of the game for the person playing! Rigidbodies, Colliders, Particles, and Audio
are all different Components (or combinations of Components) that can be added to any given GameObject.
Using Components
Components are the nuts & bolts of objects and behaviors in a game. They are the functional pieces of every GameObject. If you don't yet
understand the relationship between Components and GameObjects, read the GameObjects page before going any further.
A GameObject is a container for many different Components. By default, all GameObjects automatically have a Transform Component. This is
because the Transform dictates where the GameObject is located, and how it is rotated and scaled. Without a Transform Component, the
GameObject wouldn't have a location in the world. Try creating an empty GameObject now as an example. Click the GameObject->Create
Empty menu item. Select the new GameObject, and look at the Inspector.
Remember that you can always use the Inspector to see which Components are attached to the selected GameObject. As Components are
added and removed, the Inspector will always show you which ones are currently attached. You will use the Inspector to change all the
properties of any Component (including scripts).
Adding Components
You can add Components to the selected GameObject through the Components menu. We'll try this now by adding a Rigidbody to the empty
GameObject we just created. Select it and choose Component->Physics->Rigidbody from the menu. When you do, you will see the
Rigidbody's properties appear in the Inspector. If you press Play while the empty GameObject is still selected, you might get a little surprise. Try
it and notice how the Rigidbody has added functionality to the otherwise empty GameObject. (The y-component of the GameObject starts to
decrease. This is because the physics engine in Unity is causing the GameObject to fall under gravity.)
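Components can also be added from a script with AddComponent; a short C# sketch (the class name and mass value are made up for the example):

// C#
using UnityEngine;

public class AddRigidbodyAtRuntime : MonoBehaviour
{
    void Start()
    {
        // Attach a Rigidbody so the physics engine starts simulating this GameObject.
        Rigidbody body = gameObject.AddComponent<Rigidbody>();
        body.mass = 2.0f;
    }
}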
Another option is to use the Component Browser, which can be activated with the Add Component button in the object's inspector.
The browser lets you navigate the components conveniently by category and also has a search box that you can use to locate components by
name.
You can attach any number or combination of Components to a single GameObject. Some Components work best in combination with others.
For example, the Rigidbody works with any Collider. The Rigidbody controls the Transform through the NVIDIA PhysX physics engine, and the
Collider allows the Rigidbody to collide and interact with other Colliders.
If you want to know more about using a particular Component, you can read about any of them in the Component Reference. You can also
access the reference page for a Component from Unity by clicking on the small ? on the Component's header in the Inspector.
Editing Components
One of the great aspects of Components is flexibility. When you attach a Component to a GameObject, there are different values or Properties
in the Component that can be adjusted in the editor while building a game, or by scripts when running the game. There are two main types of
Properties: Values and References.
Look at the image below. It is an empty GameObject with an Audio Source Component. All the values of the Audio Source in the Inspector
are the default values.
This Component contains a single Reference property, and seven Value properties. Audio Clip is the Reference property. When this Audio
Source begins playing, it will attempt to play the audio file that is referenced in the Audio Clip property. If no reference is made, an error will
occur because there is no audio to be played. You must reference the file within the Inspector. This is as easy as dragging an audio file from the
Project View onto the Reference Property or using the Object Selector.
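References can also be assigned and used from code. A brief sketch (the class name is invented, and an Audio Source Component is assumed to be attached to the same GameObject):

// C#
using UnityEngine;

public class PlayClip : MonoBehaviour
{
    // Reference property exposed in the Inspector: drag an audio file onto it.
    public AudioClip clip;

    void Start()
    {
        // Assign the same reference to the Audio Source Component and play it.
        AudioSource source = GetComponent<AudioSource>();
        source.clip = clip;
        source.Play();
    }
}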
Components can include references to any other type of Component, GameObjects, or Assets. You can read more about assigning references
on the Assigning References page.
The remaining properties on the Audio Source are all Value properties. These can be adjusted directly in the Inspector. The Value properties shown here
are all toggles, numeric values, and drop-down fields, but Value properties can also be text strings, colors, curves, and other types. You
can read more about these and about editing value properties on the Editing Value Properties page.
You can use the Copy Component command from a Component's context menu to copy its current property values. The copied values can then be
pasted to an existing component using the Paste Component Values menu item. Alternatively, you can use Paste Component As New to create a new Component with those values.
Removing Components
If you want to remove a Component, option- or right-click on its header in the Inspector, and choose Remove Component. Or you can left-click
the options icon next to the ? on the Component header. All the property values will be lost and this cannot be undone, so be completely sure
you want to remove the Component before you do.
Read more about creating and using scripts on the Scripting page.
Deactivating GameObjects
A GameObject can be temporarily removed from the scene by marking it as inactive. This can be done by calling SetActive on it from a script
or with the activation checkbox in the Inspector. Deactivating a parent GameObject also overrides the activeSelf setting of all its children, so
the whole hierarchy becomes inactive; each child's own setting takes effect again when the parent is reactivated.
This overriding behaviour was introduced in Unity 4.0. In earlier versions, there was a function called SetActiveRecursively which could be
used to activate or deactivate the children of a given parent object. However, this function worked differently in that the activation setting of
each child object was changed - the whole hierarchy could be switched off and on but the child objects had no way to "remember" the state
they were originally in. To avoid breaking legacy code, SetActiveRecursively has been kept in the API for 4.0 but its use is not recommended
and it may be removed in the future. In the unusual case where you actually want the children's activeSelf settings to be changed, you can use
code like the following:-
// JavaScript
function DeactivateChildren(g : GameObject, a : boolean) {
    g.SetActive(a);
    for (var child : Transform in g.transform)
        DeactivateChildren(child.gameObject, a);
}
// C#
void DeactivateChildren(GameObject g, bool a) {
    g.SetActive(a);
    foreach (Transform child in g.transform)
        DeactivateChildren(child.gameObject, a);
}
Using the Inspector
The Inspector is used to view and edit Properties of many different types.
Games in Unity are made up of multiple GameObjects that contain meshes, scripts, sounds, or other graphical elements like Lights. When you
select a GameObject in the Hierarchy or Scene View, the Inspector will show and let you modify the Properties of that GameObject and all
the Components and Materials on it. The same will happen if you select a Prefab in the Project View. This way you modify the functionality of
GameObjects in your game. You can read more about the GameObject-Component relationship, as it is very important to understand.
Inspector shows the properties of a GameObject and the Components and Materials on it.
When you create a script yourself, which works as a custom Component type, the member variables of that script are also exposed as
Properties that can be edited directly in the Inspector when that script component has been added to a GameObject. This way script variables
can be changed without modifying the script itself.
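As an illustrative C# sketch (the class and field names are made up for this example, not taken from the manual), a script like the following exposes its public fields as editable Properties:

using UnityEngine;

public class EnemySettings : MonoBehaviour {
    // Value properties, editable directly in the Inspector.
    public int health = 100;
    public float moveSpeed = 3.5f;
    public Color tint = Color.white;

    // Reference property: drag a compatible object here in the Inspector.
    public Transform target;
}

Changing these values in the Inspector tweaks each instance individually without touching the script.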
Furthermore, the Inspector is used for showing import options of assets such as textures, 3D models, and fonts when selected. Some scene
and project-wide settings are also viewed in the Inspector, such as all the Settings Managers.
Any property that is displayed in the Inspector can be directly modified. There are two main types of Properties: Values and References.
Values might be the color of a light, or a vector. References are links to other objects such as textures or other game objects.
Editing Value Properties
Many value properties have a text field and can be adjusted simply by clicking on them, entering a value using the keyboard, and pressing Enter to save the value.
You can also position your mouse next to a numeric property, then left-click and drag to scroll through values quickly.
Some numeric properties also have a slider that can be used to visually tweak the value.
Some Value Properties open up a small popup dialog that can be used to edit the value.
Color Picker
Properties of the Color type will open up the Color Picker. (On Mac OS X this color picker can be changed to the native OS color picker by
enabling Use OS X Color Picker under Unity->Preferences.)
Use the Eyedropper Tool when you want to find a value just by putting your mouse over the color you want to grab.
RGB / HSV Selector lets you switch your values from Red, Green, Blue to Hue, Saturation and Value (Strength) of your color.
Finally, the transparency of the Color selected can be controlled by the Alpha Channel value.
Curve Editor
Properties of the AnimationCurve type will open up the Curve Editor. The Curve Editor lets you edit a curve or choose from one of the
presets. For more information on editing curves, see the guide on Editing Curves.
The type is called AnimationCurve for legacy reasons, but it can be used to define any custom curve function. The function can then be
evaluated at runtime from a script.
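For instance, a curve edited in the Inspector can be sampled with AnimationCurve.Evaluate. The following is a minimal C# sketch (the class and field names are illustrative):

using UnityEngine;

public class CurveFollower : MonoBehaviour {
    // Edit this curve in the Inspector using the Curve Editor.
    public AnimationCurve heightCurve = AnimationCurve.Linear(0, 0, 1, 1);

    void Update() {
        // Evaluate the curve over a repeating two-second interval
        // and apply the result to the Y position.
        float t = Mathf.Repeat(Time.time, 2f) / 2f;
        Vector3 pos = transform.position;
        pos.y = heightCurve.Evaluate(t);
        transform.position = pos;
    }
}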
Wrapping Mode lets you select between Ping Pong, Clamp and Loop behaviour for the Control Keys in your curve.
The Presets let you set your curve to one of several default shapes.
Gradient editor
In graphics and animation, it is often useful to be able to blend one colour gradually into another, over space or time. A gradient is a visual
representation of a colour progression, which simply shows the main colours (which are called stops) and all the intermediate shades between
them. In Unity, gradients have their own special value editor, shown below.
The upward-pointing arrows along the bottom of the gradient bar denote the stops. You can select a stop by clicking on it; its value will be
shown in the Color box which will open the standard colour picker when clicked. A new stop can be created by clicking just underneath the
gradient bar. The position of any of the stops can be changed simply by clicking and dragging and a stop can be removed with ctrl/cmd +
delete.
The downward-pointing arrows above the gradient bar are also stops but they correspond to the alpha (transparency) of the gradient at that
point. By default, there are two stops set to 100% alpha (ie, fully opaque) but any number of stops can be added and edited in much the same
way as the colour stops.
Arrays
Scripts that you write can expose native .Net arrays to the Inspector. When the Inspector encounters an array it will allow you to edit the length
of the array. The length defaults to zero. When the size is set above zero the Inspector creates slots where you can enter values for the
elements of the array. If your array stores data of a type that Unity knows, it will insert the appropriate value editor. For example, an array of Colors results in a color picker editor for each element, as in the sketch below.
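A minimal C# sketch of such a script (the class name is illustrative; the original listing is not reproduced here):

using UnityEngine;

public class ColorList : MonoBehaviour {
    // The Inspector lets you set the length of this array and shows
    // a color picker for each element.
    public Color[] colors;
}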
Assigning References
The Audio Clip property slot shows that it accepts references to objects of type Audio Clip.
This type of referencing is very quick and powerful, especially when using scripting. To learn more about using scripts and properties, see the
Scripting Tutorial on the Tutorials page.
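As a hedged sketch of how such a reference is typically used from code (the class and field names are illustrative), a script can declare a public Audio Clip field, have an asset assigned to it in the Inspector, and play it at runtime:

using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class FootstepPlayer : MonoBehaviour {
    // Reference property: assign an Audio Clip asset in the Inspector.
    public AudioClip footstep;

    void Start() {
        // Play the referenced clip through this GameObject's Audio Source.
        GetComponent<AudioSource>().PlayOneShot(footstep);
    }
}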
Object references can be assigned to a reference property either by drag and drop or by using the Object Picker.
If a reference property accepts a specific Component type (for example a Transform) then dragging a GameObject or a Prefab onto the
reference property will work fine provided that the GameObject or Prefab contains a component of the correct type. The property will then
reference the component in question, even though it was a GameObject or Prefab you dragged onto it.
If you drag an object onto a reference property, and the object is not of the correct type, or does not contain the right component, then you won't be able to assign the object to the reference property.
The Object Picker is a simple window that lets you preview and search the available objects and assign one of them in the Inspector.
Although the Object Picker is really easy to use, there are a few things you should be aware of. These are described below.
1. Search: When there are lots of objects in the picker, you can use the Search field to filter them. This search field can also search
objects using their Labels.
2. View Selector: Switches the base of the search between objects in the scene and assets.
3. Preview Size: This horizontal scroll bar lets you increase/decrease the size of your preview objects in the preview window. With this
you can see more or fewer objects in the preview window at any moment.
4. Preview Window: Here are all the objects that reside in your Scene/Assets folder filtered by the Search field.
5. Object Info: Displays information about the currently selected object. The content of this field depends on the type of object being
viewed, so if for example you pick a mesh, it will tell you the number of vertices and triangles, and whether or not it has UVs and is
skinned. However, if you pick an audio file it will give you information such as the bit rate of the audio, the length, etc.
6. Object Preview: This also depends on the type of object you are viewing. If you select a mesh, it will show you how the mesh looks,
but if you select a script file, it will just display an icon of the file.
The Object Picker works on any asset you have in your project, which can be a video, a song, a terrain, a GUI skin, a scripting file, or a mesh; it
is a tool you will use often.
Hints
Use Labels on your Assets and you will be able to find them more easily by searching for them using the search field of the Object Picker.
If you don't want to see the descriptions of the objects you can move the slider in the bottom middle of the preview window downward.
If you want to see a detailed preview of the object, you can enlarge the object preview by dragging the slider in the bottom middle of the
preview window.
Page last updated: 2012-08-13
Multi-Object Editing
Starting in Unity 3.5 you can select multiple objects of the same type and edit them simultaneously in the Inspector. Any changed properties
will be applied to all of the selected objects. This is a big time saver if you want to make the same change to many objects.
When selecting multiple objects, a component is only shown in the Inspector if that component exists on all the selected objects. If it only exists
on some of them, a small note will appear at the bottom of the Inspector saying that components that are only on some of the selected objects
cannot be multi-edited.
Property Values
When multiple objects are selected, each property shown in the Inspector represents that property on each of the selected objects. If the value
of the property is the same for all the objects, the value will be shown as normal, just like when editing a single object. If the value of the
property is not the same for all the selected objects, no value is shown and a dash or similar is shown instead, indicating that the values are
different.
Regardless of whether a value is shown or a dash, the property value can be edited as usual and the changed value is applied to all the
selected objects. If the values are different and a dash is thus shown, it's also possible to right-click on the label of the property. This brings up
a menu that lets you choose from which of the objects to inherit the value.
Non-Supported Objects
A few object types do not support multi-object editing. When you select multiple objects simultaneously, these objects will show a small note
saying "Multi-object editing not supported".
If you have made a custom editor for one of your own scripts, it will also show this message if it doesn't support multi-object editing. See the
script reference for the Editor class to learn how to implement support for multi-object editing for your own custom editors.
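As a rough sketch of that pattern (assuming a hypothetical script named MyScript with a public speed field; neither is part of the manual's example), adding the CanEditMultipleObjects attribute and going through SerializedObject is the usual approach:

using UnityEngine;
using UnityEditor;

[CustomEditor(typeof(MyScript))]
[CanEditMultipleObjects]
public class MyScriptEditor : Editor {
    SerializedProperty speedProp;

    void OnEnable() {
        // Serialized properties handle multi-object editing and undo for you.
        speedProp = serializedObject.FindProperty("speed");
    }

    public override void OnInspectorGUI() {
        serializedObject.Update();
        EditorGUILayout.PropertyField(speedProp);
        serializedObject.ApplyModifiedProperties();
    }
}

Editor scripts like this one need to live in a folder named Editor inside your Assets folder.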
Inspector Options
The Inspector Lock and the Inspector Debug Mode are two useful options that can help you in your workflow.
Lock
The Lock lets you maintain focus on a specific GameObject in the Inspector while selecting other GameObjects. To toggle the lock of an
Inspector click on the lock/unlock ( ) icon above the Inspector or open the tab menu and select Lock.
Note that you can have more than one Inspector open, and that you can for example lock one of them to a specific GameObject while keeping
the other one unlocked to show whichever GameObject is selected.
Debug
The Debug Mode lets you inspect private variables of components in the Inspector, which are normally not shown. To change to Debug Mode,
open the tab menu and select Debug.
In Debug Mode, all components are shown using a default interface, rather than the custom interfaces that some components use in the
Normal Mode. For example, the Transform component will in Debug Mode show the raw Quaternion values of the rotation rather than the Euler
angles shown in the Normal Mode. You can also use the Debug Mode to inspect the values of private variables in your own script components.
Debug Mode in the Inspector lets you inspect private variables in your scripts and in other components.
The Debug mode is per Inspector and you can have one Inspector in Debug Mode while another one is not.
Scene View Navigation
Arrow Movement
You can use the Arrow Keys to move around the scene as though "walking" through it. The up and down arrows move the camera forward and
backward in the direction it is facing. The left and right arrows pan the view sideways. Hold down the Shift key with an arrow to move faster.
Focusing
If you select a GameObject in the hierarchy, then move the mouse over the scene view and press the F key, the view will move so as to center
on the object. This feature is referred to as frame selection.
Orbit: Hold Alt and click-drag to orbit the camera around the current pivot point.
Zoom: Hold Control (Command on Mac) and click-drag to zoom the camera.
Holding down Shift will increase the rate of movement and zooming.
Action   3-button mouse                                        2-button mouse or trackpad          Mac with only one mouse button or trackpad
Move     Hold Alt and middle click-drag.                       Hold Alt-Control and click-drag.    Hold Alt-Command and click-drag.
Orbit    Hold Alt and click-drag.                              Hold Alt and click-drag.            Hold Alt and click-drag.
Zoom     Hold Alt and right click-drag or use scroll-wheel.    Hold Alt and right click-drag.      Hold Alt-Control and click-drag or use two-finger swipe.
Flythrough Mode
The Flythrough mode lets you navigate the Scene View by flying around in first person similar to how you would navigate in many games.
Flythrough mode is designed for Perspective Mode. In Isometric Mode, holding down the right mouse button and moving the mouse will orbit
the camera instead.
Scene Gizmo
In the upper-right corner of the Scene View is the Scene Gizmo. This displays the Scene View Camera's current orientation, and allows you to
quickly modify the viewing angle.
You can click on any of the arms to snap the Scene View Camera to that direction. Click the middle of the Scene Gizmo, or the text below it, to
toggle between Isometric Mode and Perspective Mode. You can also always shift-click the middle of the Scene Gizmo to get a "nice"
perspective view with an angle that is looking at the scene from the side and slightly from above.
Perspective mode.
You can also use three fingers to simulate the effect of clicking the arms of the Scene Gizmo: drag up, left, right or down to snap the Scene
View Camera to the corresponding direction. In OS X 10.7 "Lion" you may have to change your trackpad settings in order to enable this
feature:
Open System Preferences and then Trackpad (or type trackpad into Spotlight).
Click into the "More Gestures" option.
Click the first option labelled "Swipe between pages" and then either set it to "Swipe left or right with three fingers" or "Swipe with two or
three fingers".
Page last updated: 2012-11-16
Positioning GameObjects
When building your games, you'll place lots of different objects in your game world.
Focusing
It can be useful to focus the Scene View Camera on an object before manipulating it. Select any GameObject and press the F key. This will
center the Scene View and pivot point on the selection. This is also known as Frame Selection.
Click and drag in the center of the Gizmo to manipulate the object on all axes at once.
At the center of the Translate gizmo, there are three small squares that can be used to drag the object within a single plane (ie, two axes
can be moved at once while the third is kept still).
If you have a three button mouse, you can click the middle button to adjust the last-adjusted axis (which turns yellow) without clicking
directly on it.
Be careful when using the scaling tool, as non-uniform scales (e.g. 1,2,1) can cause unusual scaling of child objects.
For more information on transforming GameObjects, please view the Transform Component page.
Position:
Center will position the Gizmo at the center of the object's rendered bounds.
Pivot will position the Gizmo at the actual pivot point of a Mesh.
Rotation:
Local will keep the Gizmo's rotation relative to the object's.
Global will clamp the Gizmo to world space orientation.
Unit Snapping
While dragging any Gizmo Axis using the Translate Tool, you can hold the Control key (Command on Mac) to snap to increments defined in
the Snap Settings.
You can change the unit distance that is used for the unit snapping using the menu Edit->Snap Settings...
Surface Snapping
While dragging in the center using the Translate Tool, you can hold Shift and Control (Command on Mac) to snap the object to the
intersection of any Collider. This makes precise positioning of objects incredibly fast.
Look-At Rotation
While using the Rotate Tool, you can hold Shift and Control (Command on Mac) to rotate the object towards a point on the surface of any Collider.
Vertex Snapping
You can assemble your worlds more easily with a feature called Vertex Snapping. This feature is a really simple but powerful tool in Unity. It
lets you take any vertex from a given mesh and with your mouse place that vertex in the same position as any vertex from any other mesh you
choose.
With this you can assemble your worlds really fast. For example, you can place roads in a racing game with high precision and add power up
items on the vertices of a mesh.
Select the mesh you want to manipulate and make sure the Transform Tool is active.
Press and hold the V key to activate the vertex snapping mode.
Move your cursor over the vertex on your mesh that you want to use as the pivot point.
Hold down the left button once your cursor is over the desired vertex and drag your mesh next to any other vertex on another mesh.
Release your mouse button and the V key when you are happy with the results.
Shift-V acts as a toggle of this functionality.
You can snap vertex to vertex, vertex to surface and pivot to vertex.
Page last updated: 2013-02-05
View Modes
The Scene View control bar lets you choose various options for viewing the scene and also control whether lighting and audio are enabled.
These controls only affect the scene view during development and have no effect on the built game.
Draw Mode
The first drop-down menu selects which Draw Mode will be used to depict the scene.
Render Mode
The next drop-down along selects which of four Render Modes will be used to render the scene.
The first button determines whether the view will be lit using a default scheme or with the lights that have actually been added to the scene. The
default scheme is used initially but this will change automatically when the first light is added. The second button controls whether skyboxes
and GUI elements will be rendered in the scene view and also shows and hides the placement grid. The third button switches audio sources in
the scene on and off.
Gizmo and Icon Display Controls
Icons can be assigned to GameObjects and to scripts from the Inspector. To change the icon for a GameObject, simply click on its icon in the Inspector. The icons of script assets can be changed in a similar way. In the Icon Selector is a special kind of icon called a Label Icon. This type of icon will show up in the Scene View as a text label using the name of the GameObject. Icons for built-in Components cannot be changed.
Note: When an asset's icon is changed, the asset will be marked as modified and therefore picked up by Revision Control Systems.
To show the state of the current gizmo and icon, click on Gizmos in the control bar of the Scene or Game View. The toggles here are used to
set which icons and gizmos are visible.
Note that the scripts that show up in the Scripts section are those that either have a custom icon or have an OnDrawGizmos () or
OnDrawGizmosSelected () function implemented.
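For reference, here is a minimal C# sketch of a script that draws its own gizmos and therefore appears in the Scripts section of the dropdown (the class name is illustrative):

using UnityEngine;

public class SpawnPoint : MonoBehaviour {
    // Drawn for all objects with this script, selected or not.
    void OnDrawGizmos() {
        Gizmos.color = Color.yellow;
        Gizmos.DrawWireSphere(transform.position, 0.5f);
    }

    // Drawn only while the object is selected.
    void OnDrawGizmosSelected() {
        Gizmos.color = Color.red;
        Gizmos.DrawLine(transform.position, transform.position + transform.forward);
    }
}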
The Gizmos dropdown, displaying the visibility state of icons and gizmos
The Icon Scaling slider can be used to adjust the size used for icon display in the scene. If the slider is placed at the extreme right, the icon will
always be drawn at its natural size. Otherwise, the icon will be scaled according to its distance from the scene view camera (although there is
an upper limit on the display size in order that screen clutter be avoided).
Searching
When working with large complex scenes it can be useful to search for specific objects. By using the Search feature in Unity, you can filter out
only the object or group of objects that you want to see. You can search assets by their name, by Component type, and in some cases by
asset Labels. You can specify the search mode by choosing from the Search drop-down menu.
Scene Search
When a scene is loaded in the Editor, you can see the objects in both the Scene View and the Hierarchy. The specific assets are shared in both places, so if you type in a search term (eg, "elevator"), you'll see the filter applied both visually in the Scene View and in a more typical manner in the Hierarchy. There is also no difference between typing the search term into the search field in the Scene View or the Hierarchy -- the filter takes effect in both views in either case.
When a search term filter is active, the Hierarchy doesn't show hierarchical relationships between GameObjects, but you can select any GameObject, and its hierarchical path in the scene will be shown at the bottom of the Hierarchy.
When you want to clear the search filter, just click the small cross in the search field.
In the Scene search you can search either by Name or by Type. Click on the small magnifying glass in the search field to open the search
drop-down menu and choose the search mode.
Project Search
The same fundamentals apply to searching of assets in the Project View -- just type in your search term and you'll see all the relevant assets
appear in the filter.
In the Project search you can search by Name or by Type as in the Scene search, and additionally you can search by Label. Click on the small
magnifying glass in the search field to open the search drop-down menu and choose the search mode.
Prefabs
A Prefab is a type of asset -- a reusable GameObject stored in Project View. Prefabs can be inserted into any number of scenes, multiple
times per scene. When you add a Prefab to a scene, you create an instance of it. All Prefab instances are linked to the original Prefab and are
essentially clones of it. No matter how many instances exist in your project, when you make any changes to the Prefab you will see the change
applied to all instances.
Creating Prefabs
In order to create a Prefab, simply drag a GameObject that you've created in the scene into the Project View. The GameObject's name will turn
blue to show that it is a Prefab. You can rename your new Prefab.
After you have performed these steps, the GameObject and all its children have been copied into the Prefab data. The Prefab can now be re-
used in multiple instances. The original GameObject in the Hierarchy has now become an instance of the Prefab.
Prefab Instances
To create a Prefab instance in the current scene, drag the Prefab from the Project View into the Scene or Hierarchy View. The instance is linked to the Prefab, as shown by the blue text used for its name in the Hierarchy View.
If you have selected a Prefab instance, and want to make a change that affects all instances, you can click the Select button in the
Inspector to select the source Prefab.
Information about instantiating prefabs from scripts is in the Instantiating Prefabs page.
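As a brief hedged sketch of what that looks like (the class and field names are illustrative), a script with a public Prefab reference can instantiate it at runtime:

using UnityEngine;

public class EnemySpawner : MonoBehaviour {
    // Assign the Prefab in the Inspector by dragging it from the Project View.
    public GameObject enemyPrefab;

    void Start() {
        // Create an instance of the Prefab at this spawner's position and rotation.
        Instantiate(enemyPrefab, transform.position, transform.rotation);
    }
}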
Inheritance
Inheritance means that whenever the source Prefab changes, those changes are applied to all linked GameObjects. For example, if you add a
new script to a Prefab, all of the linked GameObjects will instantly contain the script as well. However, it is possible to change the properties of
a single instance while keeping the link intact. Simply change any property of a prefab instance, and watch as the variable name becomes
bold. The variable is now overridden. Overridden properties are not affected by changes in the source Prefab.
This allows you to modify Prefab instances to make them unique from their source Prefabs without breaking the Prefab link.
If you want to update the source Prefab and all instances with the new overridden values, you can click the Apply button in the Inspector.
Note that the root's position and rotation will not be applied, as that would affect the instances' absolute positions and would put all instances in the same place. However, position and rotation of any children or other descendants of the root will be applied, as they are computed relative to the root's transform.
If you want to discard all overrides on a particular instance, you can click the Revert button.
Imported Prefabs
When you place a mesh asset into your Assets folder, Unity automatically imports the file and generates something that looks similar to a
Prefab out of the mesh. This is not actually a Prefab, it is simply the asset file itself. Instancing and working with assets introduces some
limitations that are not present when working with normal Prefabs.
Notice the asset icon is a bit different from the Prefab icons
The asset is instantiated in the scene as a GameObject, linked to the source asset instead of a normal Prefab. Components can be added and
removed from this GameObject as normal. However, you cannot apply any changes back to the asset, since that would add data to the source asset file itself. If you're creating something you want to re-use, you should make the asset instance into a Prefab following the steps listed above under
"Creating Prefabs".
When you have selected an instance of an asset, the Apply button in the Inspector is replaced with an Edit button. Clicking this button will
launch the editing application for your asset (e.g. Maya or Max).
Page last updated: 2012-09-14
Lights
Lights are an essential part of every scene. While meshes and textures define the shape and look of a scene, lights define the color and mood
of your 3D environment. You'll likely work with more than one light in each scene. Making them work together requires a little practice but the
results can be quite amazing.
Lights can be added to your scene from the GameObject->Create Other menu. Once a light has been added, you can manipulate it like any
other GameObject. Additionally, you can add a Light Component to any selected GameObject by using Component->Rendering->Light.
There are many different options within the Light Component in the Inspector.
By simply changing the Color of a light, you can give a whole different mood to the scene.
The lights you create this way are realtime lights - their lighting is calculated each frame while the game is running. If you know the light will not
change, you can make your game faster and look much better by using Lightmapping.
Rendering paths
Unity supports different Rendering Paths. These paths mainly affect Lights and Shadows, so choosing the right rendering path for your game's requirements can improve your project's performance. For more info about rendering paths you can visit the Rendering Paths section.
More information
For more information about using Lights, check the Lights page in the Reference Manual.
Cameras
Just as cameras are used in films to display the story to the audience, Cameras in Unity are used to display the game world to the player. You
will always have at least one camera in a scene, but you can have more than one. Multiple cameras can give you a two-player splitscreen or
create advanced custom effects. You can animate cameras, or control them with physics. Practically anything you can imagine is possible with
cameras, and you can use typical or unique cameras to fit your game's style.
Camera
Cameras are the devices that capture and display the world to the player. By customizing and manipulating cameras, you can make the
presentation of your game truly unique. You can have an unlimited number of cameras in a scene. They can be set to render in any order, at
any place on the screen, or only certain parts of the screen.
Properties
Clear Flags              Determines which parts of the screen will be cleared. This is handy when using multiple Cameras to draw different game elements.
Background               Color applied to the remaining screen after all elements in view have been drawn and there is no skybox.
Culling Mask             Include or omit layers of objects to be rendered by the Camera. Assign layers to your objects in the Inspector.
Projection               Toggles the camera's capability to simulate perspective.
    Perspective          Camera will render objects with perspective intact.
    Orthographic         Camera will render objects uniformly, with no sense of perspective.
Size (when Orthographic is selected)           The viewport size of the Camera when set to Orthographic.
Field of view (when Perspective is selected)   Width of the Camera's view angle, measured in degrees along the local Y axis.
Clipping Planes          Distances from the camera to start and stop rendering.
    Near                 The closest point relative to the camera that drawing will occur.
    Far                  The furthest point relative to the camera that drawing will occur.
Normalized View Port Rect    Four values that indicate where on the screen this camera view will be drawn, in Screen Coordinates (values 0-1).
    X                    The beginning horizontal position that the camera view will be drawn.
    Y                    The beginning vertical position that the camera view will be drawn.
    W (Width)            Width of the camera output on the screen.
    H (Height)           Height of the camera output on the screen.
Depth                    The camera's position in the draw order. Cameras with a larger value will be drawn on top of cameras with a smaller value.
Rendering Path           Options for defining what rendering methods will be used by the camera.
    Use Player Settings  This camera will use whichever Rendering Path is set in the Player Settings.
    Vertex Lit           All objects rendered by this camera will be rendered as Vertex-Lit objects.
    Forward              All objects will be rendered with one pass per material.
    Deferred Lighting (Unity Pro only)    All objects will be drawn once without lighting, then lighting of all objects will be rendered together at the end of the render queue.
Target Texture (Unity Pro only)    Reference to a Render Texture that will contain the output of the Camera view. Making this reference will disable this Camera's capability to render to the screen.
HDR                      Enables High Dynamic Range rendering for this camera.
Details
Cameras are essential for displaying your game to the player. They can be customized, scripted, or parented to achieve just about any kind of
effect imaginable. For a puzzle game, you might keep the Camera static for a full view of the puzzle. For a first-person shooter, you would
parent the Camera to the player character, and place it at the character's eye level. For a racing game, you'd likely want to have the Camera
follow your player's vehicle.
You can create multiple Cameras and assign each one to a different Depth. Cameras are drawn from low Depth to high Depth. In other words,
a Camera with a Depth of 2 will be drawn on top of a Camera with a depth of 1. You can adjust the values of the Normalized View Port
Rectangle property to resize and position the Camera's view onscreen. This can create multiple mini-views like missile cams, map views, rear-
view mirrors, etc.
Render Path
Unity supports different Rendering Paths. You should choose which one you use depending on your game content and target platform /
hardware. Different rendering paths have different features and performance characteristics that mostly affect Lights and Shadows.
The rendering Path used by your project is chosen in Player Settings. Additionally, you can override it for each Camera.
For more info on rendering paths, check the rendering paths page.
Clear Flags
Each Camera stores color and depth information when it renders its view. The portions of the screen that are not drawn in are empty, and will
display the skybox by default. When you are using multiple Cameras, each one stores its own color and depth information in buffers,
accumulating more data as each Camera renders. As any particular Camera in your scene renders its view, you can set the Clear Flags to
clear different collections of the buffer information. This is done by choosing one of the four options:
Skybox
This is the default setting. Any empty portions of the screen will display the current Camera's skybox. If the current Camera has no skybox set, it
will default to the skybox chosen in the Render Settings (found in Edit->Render Settings). It will then fall back to the Background Color.
Alternatively a Skybox component can be added to the camera. If you want to create a new Skybox, you can use this guide.
Solid Color
Any empty portions of the screen will display the current Camera's Background Color.
Depth Only
If you wanted to draw a player's gun without letting it get clipped inside the environment, you would set one Camera at Depth 0 to draw the
environment, and another Camera at Depth 1 to draw the weapon alone. The weapon Camera's Clear Flags should be set to Depth only.
This will keep the graphical display of the environment on the screen, but discard all information about where each object exists in 3-D space.
When the gun is drawn, the opaque parts will completely cover anything drawn, regardless of how close the gun is to the wall.
The gun is drawn last, after clearing the depth buffer of the cameras before it
Don't Clear
This mode does not clear either the color or the depth buffer. The result is that each frame is drawn over the last, resulting in a smear-looking
effect. This isn't typically used in games, and would likely be best used with a custom shader.
Clip Planes
The Near and Far Clip Plane properties determine where the Camera's view begins and ends. The planes are laid out perpendicular to the
Camera's direction and are measured from the its position. The Near plane is the closest location that will be rendered, and the Far plane is the
furthest.
The clipping planes also determine how depth buffer precision is distributed over the scene. In general, to get better precision you should move
the Near plane as far as possible.
Note that the near and far clip planes together with the planes defined by the field of view of the camera describe what is popularly known as
the camera frustum. Unity ensures that when rendering your objects those which are completely outside of this frustum are not displayed. This
is called Frustum Culling. Frustum Culling happens irrespective of whether you use Occlusion Culling in your game.
For performance reasons, you might want to cull small objects earlier. For example, small rocks and debris could be made invisible at a much smaller distance than large buildings. To do that, put small objects into a separate layer and set up per-layer cull distances using the Camera.layerCullDistances script function.
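A rough C# sketch of that setup (the layer index and field names are assumptions; use whichever layer actually holds your small objects):

using UnityEngine;

[RequireComponent(typeof(Camera))]
public class SmallObjectCulling : MonoBehaviour {
    // Layer that small rocks, debris, etc. have been assigned to.
    public int smallObjectLayer = 8;
    public float cullDistance = 50f;

    void Start() {
        // One entry per layer; a value of 0 means "use the far clip plane".
        float[] distances = new float[32];
        distances[smallObjectLayer] = cullDistance;
        GetComponent<Camera>().layerCullDistances = distances;
    }
}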
Culling Mask
The Culling Mask is used for selectively rendering groups of objects using Layers. More information on using layers can be found here.
It is common practice to put your User Interface on a separate layer, then render it with a separate Camera set to render only the UI layer.
In order for the UI to display on top of the other Camera views, you'll also need to set the Clear Flags to Depth only and make sure that the UI
Camera's Depth is higher than the other Cameras.
It's easy to create a two-player split screen effect using the Normalized Viewport Rectangle. After you have created your two cameras, change both cameras' H values to 0.5, then set player one's Y value to 0.5 and player two's Y value to 0. This will make player one's camera display from halfway up the screen to the top, and player two's camera will start at the bottom and stop halfway up the screen.
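In script form, the same setup might look like this (a sketch; the camera references and class name are assumptions, with the cameras assigned in the Inspector):

using UnityEngine;

public class SplitScreenSetup : MonoBehaviour {
    public Camera playerOneCamera;
    public Camera playerTwoCamera;

    void Start() {
        // Rect(x, y, width, height) in normalized (0-1) screen coordinates.
        playerOneCamera.rect = new Rect(0f, 0.5f, 1f, 0.5f); // top half
        playerTwoCamera.rect = new Rect(0f, 0f, 1f, 0.5f);   // bottom half
    }
}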
Orthographic
Marking a Camera as Orthographic removes all perspective from the Camera's view. This is mostly useful for making isometric or 2D games.
Note that fog is rendered uniformly in orthographic camera mode and may therefore not appear as expected. Read more about why in the
component reference on Render Settings.
Perspective camera.
Render Texture
This feature is only available for Unity Pro licenses. It will place the camera's view onto a Texture that can then be applied to another object. This makes it easy to create sports arena video monitors, surveillance cameras, reflections, etc.
Hints
Cameras can be instantiated, parented, and scripted just like any other GameObject.
To increase the sense of speed in a racing game, use a high Field of View.
Cameras can be used in physics simulation if you add a Rigidbody Component.
There is no limit to the number of Cameras you can have in your scenes.
Orthographic cameras are great for making 3D user interfaces.
If you are experiencing depth artifacts (surfaces close to each other flickering), try setting the Near Plane as large as possible.
Cameras cannot render to the Game Screen and a Render Texture at the same time, only one or the other.
Pro license holders have the option of rendering a Camera's view to a texture, called Render-to-Texture, for even more unique effects.
Unity comes with pre-installed Camera scripts, found in Components->Camera Control. Experiment with them to get a taste of what's
possible.
Page last updated: 2007-11-16
Terrains
This section will explain how to use the Terrain Engine. It will cover creation, technical details, and other considerations. It is broken into the
following sections:
Using Terrains
This section covers the most basic information about using Terrains. This includes creating Terrains and how to use the new Terrain tools &
brushes.
Height
This section explains how to use the different tools and brushes that alter the Height of the Terrain.
Terrain Textures
This section explains how to add, paint and blend Terrain Textures using different brushes.
Trees
This section contains important information for creating your own tree assets. It also covers adding and painting trees on your Terrain.
Grass
This section explains how grass works, and how to use it.
Detail Meshes
This section explains practical usage for detail meshes like rocks, haystacks, vegetation, etc.
Lightmaps
You can lightmap terrains just like any other objects using Unity's built-in lightmapper. See Lightmapping Quickstart for help.
Other Settings
This section covers all the other settings associated with Terrains.
The Project View displays all source files and created Prefabs
This view shows the organization of files in your project's Assets folder. Whenever you update one of your asset files, the changes are
immediately reflected in your game!
To import an asset file into your project, move the file into (your Project folder)->Assets in the Finder, and it will automatically be imported into
Unity. To apply your assets, simply drag the asset file from the Project View window into the Hierarchy or Scene View. If the asset is meant to
be applied to another object, drag the asset over the object.
Hints
It is always a good idea to add labels to your assets when you are working with big projects or when you want to keep all your assets organized; you can then search for the labels associated with each asset using the search field in the Project View.
When backing up a project folder always back up the Assets, ProjectSettings and Library folders. The Library folder contains all meta data and all the connections between objects, so if the Library folder gets lost you will lose references from scenes to assets. The easiest approach is simply to back up the whole project folder containing the Assets, ProjectSettings and Library folders.
Rename and move files to your heart's content inside Project View; nothing will break.
Never rename or move anything from the Finder or another program; everything will break. In short, Unity stores lots of metadata for each
asset (things like import settings, cached versions of compressed textures, etc.) and if you move a file externally, Unity can no longer
associate metadata with the moved file.
Importing Assets
Models
3D formats
Legacy animation system
Materials and Shaders
Texture 2D
Procedural Materials
Movie Texture
Audio Files
Tracker Modules
Using Scripts
Asset Store
Asset Store Publisher Administration
Asset Server (Team License Only)
Setting up the Asset Server
Cache Server (Team License Only)
Cache Server (Team license only)
Cache Server FAQ
Behind the Scenes
Page last updated: 2012-01-08
Importing Assets
Unity will automatically detect files as they are added to your Project folder's Assets folder. When you put any asset into your Assets folder,
you will see the asset appear in your Project View.
The Project View is your window into the Assets folder, normally accessible from the file manager
When you are organizing your Project View, there is one very important thing to remember:
Never move any assets or organize this folder from the Explorer (Windows) or Finder (OS X). Always use the Project View!
There is a lot of meta data stored about relationships between asset files within Unity. This data is all dependent on where Unity expects to find
these assets. If you move an asset from within the Project View, these relationships are maintained. If you move them outside of Unity, these
relationships are broken. You'll then have to manually re-link lots of dependencies, which is something you probably don't want to do.
So just remember to only save assets to the Assets folder from other applications, and never rename or move files outside of Unity. Always use
Project View. You can safely open files for editing from anywhere, of course.
Additionally, as you update and save your assets, the changes will be detected and the asset will be re-imported in Unity. This allows you to
focus on refining your assets without struggling to make them compatible with Unity. Updating and saving your assets normally from its native
application provides optimum, hassle-free workflow that feels natural.
Asset Types
There are a handful of basic asset types that will go into your game: mesh files (optionally with animations), texture files, and sound files.
We'll discuss the details of importing each of these file types and how they are used.
Your mesh file does not need to have an animation to be imported. If you do use animations, you have your choice of importing all animations
from a single file, or importing separate files, each with one animation. For more information about importing animations, please see the Legacy
animation system page.
Once your mesh is imported into Unity, you can drag it to the Scene or Hierarchy to create an instance of it. You can also add Components to the instance, which will not be attached to the mesh file itself.
Meshes will be imported with UVs and a number of default Materials (one material per UV). You can then assign the appropriate texture files to
the materials and complete the look of your mesh in Unity's game engine.
Textures
Unity supports all image formats. Even when working with layered Photoshop files, they are imported without disturbing the Photoshop format.
This allows you to work with a single texture file for a very care-free and streamlined experience.
You should make your textures in dimensions that are powers of two (e.g. 32x32, 64x64, 128x128, 256x256, etc.). Simply placing them in your project's Assets folder is sufficient, and they will appear in the Project View.
Once your texture has been imported, you should assign it to a Material. The material can then be applied to a mesh, Particle System, or GUI
Texture. Using the Import Settings, it can also be converted to a Cubemap or Normalmap for different types of applications in the game. For
more information about importing textures, please read the Texture Component page.
Sounds
Desktop
Unity features support for two types of audio: Uncompressed Audio or Ogg Vorbis. Any type of audio file you import into your project will be
converted to one of these formats.
Import Settings
If you are importing a file that is not already compressed as Ogg Vorbis, you have a number of options in the Import Settings of the Audio Clip.
Select the Audio Clip in the Project View and edit the options in the Audio Importer section of the Inspector. Here, you can compress the Clip
into Ogg Vorbis format, force it into Mono or Stereo playback, and tweak other options. There are positives and negatives for both Ogg Vorbis
and uncompressed audio. Each has its own ideal usage scenarios, and you generally should not use either one exclusively.
Read more about using Ogg Vorbis or Uncompressed audio on the Audio Clip Component Reference page.
iOS
Unity features support for two types of audio: Uncompressed Audio or MP3 Compressed audio. Any type of audio file you import into your
project will be converted to one of these formats.
Import Settings
When you are importing an audio file, you can select its final format and choose to force it to stereo or mono channels. To access the Import Settings, select the Audio Clip in the Project View and find the Audio Importer in the Inspector. Here, you can compress the Clip into MP3 format, force it into Mono or Stereo playback, and tweak other options, such as the very important Decompress On Load setting.
Read more about using MP3 Compressed or Uncompressed audio on the Audio Clip Component Reference page.
Android
Unity features support for two types of audio: Uncompressed Audio or MP3 Compressed audio. Any type of audio file you import into your
project will be converted to one of these formats.
Import Settings
When you are importing an audio file, you can select its final format and choose to force it to stereo or mono channels. To access the Import Settings, select the Audio Clip in the Project View and find the Audio Importer in the Inspector. Here, you can compress the Clip into MP3 format, force it into Mono or Stereo playback, and tweak other options, such as the very important Decompress On Load setting.
Read more about using MP3 Compressed or Uncompressed audio on the Audio Clip Component Reference page.
Once sound files are imported, they can be attached to any GameObject. The Audio file will create an Audio Source Component automatically
when you drag it onto a GameObject.
Meshes
When a 3D model is imported, Unity represents it as many different objects, including a hierarchy of GameObjects, Meshes (can be skinned
depending on import options), AnimationClips, etc. In the Project folder the main imported object is a Model Prefab.
A Mesh must be attached to a GameObject using a Mesh Filter component. For the mesh to be visible, the GameObject must also have a
Mesh Renderer or other suitable rendering component attached. With these components in place, the mesh will be visible at the GameObject's
position with its exact appearance dependent on the Material used by the renderer.
A Mesh Filter together with Mesh Renderer makes the model appear on screen.
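As a hedged illustration of this relationship (the class name and the mesh and material fields are assumptions, to be assigned in the Inspector), the same setup can also be built from script:

using UnityEngine;

public class MeshSpawner : MonoBehaviour {
    public Mesh mesh;          // e.g. a mesh from an imported model
    public Material material;  // material used by the renderer

    void Start() {
        GameObject go = new GameObject("Generated Mesh");
        // The Mesh Filter holds the mesh data...
        go.AddComponent<MeshFilter>().sharedMesh = mesh;
        // ...and the Mesh Renderer makes it visible using the material.
        go.AddComponent<MeshRenderer>().sharedMaterial = material;
    }
}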
Unity's mesh importer provides many options for controlling the generation of the mesh and associating it with its textures and materials. These
options are covered by the following pages:
3D formats
Page last updated: 2012-01-19
3D-formats
Importing meshes into Unity can be achieved from two main types of files:
Either should enable you to get your meshes into Unity, but there are considerations as to which type you choose:
Exported 3D files
Unity can read .FBX, .dae (Collada), .3DS, .dxf and .obj files. FBX exporters can be found here, and obj or Collada exporters can also be found for many applications.
Advantages:
Disadvantages:
Proprietary 3D application files
Advantages:
Quick iteration process (save the source file and Unity reimports)
Simple initially
Disadvantages:
A licensed copy of that software must be installed on all machines using the Unity project
Files can become bloated with unnecessary data
Big files can slow Unity updates
Less validation - harder to troubleshoot problems
Page last updated: 2012-10-24
Animations (Legacy)
Prior to the introduction of Mecanim, Unity used its own animation system and, for backward compatibility, this system is still available. The main reason for using legacy animation is to continue working with an old project without the work of updating it for Mecanim. However, it is not recommended that you use the legacy system for new projects.
The Animation tab on the importer will then look something like this:
Below the properties in the inspector is a list of animation clips. When you click on a clip in the list, an additional panel will appear below it in the
inspector:-
The Start and End values can be changed to allow you to use just a part of the original clip (see the page on splitting animations for further
details). The Add Loop Frame option adds an extra keyframe to the end of the animation that is exactly the same as the keyframe at the start.
This enables the animation to loop smoothly even when the last frame doesn't exactly match up with the first. The Wrap Mode setting is
identical to the master setting in the main animation properties but applies only to that specific clip.
Materials
There is a close relationship between Materials and Shaders in Unity. Shaders contain code that defines what kind of properties and assets to
use. Materials allow you to adjust properties and assign assets.
To create a new Material, use Assets->Create->Material from the main menu or the Project View context menu. Once the Material has been
created, you can apply it to an object and tweak all of its properties in the Inspector. To apply it to an object, just drag it from the Project View
to any object in the Scene or Hierarchy.
Built-in Shaders
A set of built-in Shaders is installed with the Unity editor. Over eighty shaders are available - the main ones used for texturing game objects
fall into the following categories:-
In each group, built-in shaders range in complexity, from the simple VertexLit to the complex Parallax Bumped with Specular. For more
information about the performance of Shaders, please read the built-in Shader performance page.
In addition to the main game object shaders, there are a number of other categories for specialised purposes:-
Also, some of these shaders have special versions for use with mobile devices.
A Shader basically defines a formula for how the in-game shading should look. Within any given Shader is a number of properties (typically
textures). Shaders are implemented through Materials, which are attached directly to individual GameObjects. Within a Material, you will
choose a Shader, then define the properties (usually textures and colors, but properties can vary) that are used by the Shader.
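The same choice can also be made from a script. The sketch below is only illustrative: it assumes the built-in Diffuse shader, and the carTexture variable name is a placeholder to be assigned in the Inspector.

var carTexture : Texture2D;   // placeholder name; assign any texture in the Inspector

function Start () {
    renderer.material.shader = Shader.Find("Diffuse");    // choose which Shader the Material uses
    renderer.material.mainTexture = carTexture;           // assign the Shader's main texture property
    renderer.material.SetColor("_Color", Color.white);    // set a color property exposed by the Shader
}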
On the left side of the graph is the Carbody Shader. 2 different Materials are created from this: Blue car Material and Red car Material. Each
of these Materials has 2 textures assigned: the Car Texture, which defines the main texture of the car, and a Color FX texture. These properties are
used by the shader to make the car finish look like 2-tone paint. This can be seen on the front of the red car: it is yellow where it faces the
camera and then fades towards purple as the angle increases. The car materials are attached to the 2 cars. The car wheels, lights and
windows don't have the color change effect, and must hence use a different Material. At the bottom of the graph there is a Simple Metal
Shader. The Wheel Material is using this Shader. Note that even though the same Car Texture is reused here, the end result is quite different
from the car body, as the Shader used in the Material is different.
A Shader defines:
The method to render an object. This includes using different methods depending on the graphics card of the end user.
Any vertex and fragment programs used to render.
Some texture properties that are assignable within Materials.
Color and number settings that are assignable within Materials.
A Material defines:
Which Shader to use, and which textures, colors and numbers to assign to that Shader's properties.
Shaders are meant to be written by graphics programmers. They are created using the ShaderLab language, which is quite simple. However,
getting a shader to work well on a variety of graphics cards is an involved job and requires a fairly comprehensive knowledge of how graphics
cards work.
A number of shaders are built into Unity directly, and some more come in the Standard Assets Library. For further information about shaders,
see the Built-in Shader Guide.
Textures
Textures bring your Meshes, Particles, and interfaces to life! They are image or movie files that you lay over or wrap around your objects. As
they are so important, they have a lot of properties. If you are reading this for the first time, jump down to Details, and return to the actual
settings when you need a reference.
The shaders you use for your objects put specific requirements on which textures you need, but the basic principle is that you can put any
image file inside your project. If it meets the size requirements (specified below), it will get imported and optimized for game use. This extends
to multi-layer Photoshop or TIFF files - they are flattened on import, so there is no size penalty for your game. Note that this flattening happens
internally to Unity, and is optional, so you can continue to save and import your PSD files with layers intact. The PSD file is not flattened, in
other words.
Properties
The Texture Inspector looks a bit different from most others:
The inspector is split into two sections, the Texture Importer and the texture preview.
Texture Importer
Textures all come from image files in your Project Folder. How they are imported is specified by the Texture Importer. You change these
settings by selecting the texture file in the Project View and modifying the Texture Importer in the Inspector.
The topmost item in the inspector is the Texture Type menu that allows you to select the type of texture you want to create from the source
image file.
Texture Type Select this to set basic parameters depending on the purpose of your texture.
Texture This is the most common setting used for all the textures in general.
Normal Map Select this to turn the color channels into a format suitable for real-time normal mapping. For more info, see
Normal Maps
GUI Use this if your texture is going to be used on any HUD/GUI Controls.
Reflection Also known as Cube Maps; used to create reflections on textures. Check Cubemap Textures for more info.
Cookie This sets up your texture with the basic parameters used for the Cookies of your lights.
Advanced Select this when you want to have specific parameters on your texture and you want to have total control over
your texture.
Alpha From Grayscale If enabled, an alpha transparency channel will be generated by the image's existing values of light & dark.
Wrap Mode Selects how the Texture behaves when tiled:
Repeat The Texture repeats (tiles) itself
Clamp The Texture's edges get stretched
Filter Mode Selects how the Texture is filtered when it gets stretched by 3D transformations:
Point The Texture becomes blocky up close
Bilinear The Texture becomes blurry up close
Trilinear Like Bilinear, but the Texture also blurs between the different mip levels
Aniso Level Increases texture quality when viewing the texture at a steep angle. Good for floor and ground textures, see below.
Create from Greyscale If this is enabled then Bumpiness and Filtering options will be shown.
Filter Mode Selects how the Texture is filtered when it gets stretched by 3D transformations:
Point The Texture becomes blocky up close
Bilinear The Texture becomes blurry up close
Trilinear Like Bilinear, but the Texture also blurs between the different mip levels
An interesting way to add a lot of visual detail to your scenes is to use Cookies - greyscale textures you use to control the precise look of in-
game lighting. This is fantastic for making moving clouds and giving an impression of dense foliage. The Light page has more info on all this,
but the main thing is that for textures to be usable for cookies you just need to set the Texture Type to Cookie.
Light Type Type of light that the texture will be applied to (this can be Spotlight, Point or Directional lights). For Directional Lights this texture will tile, so in the texture inspector you must set the Edge Mode to Repeat. For Spotlights you should keep the edges of your cookie texture solid black in order to get the proper effect; in the Texture Inspector, set the Edge Mode to Clamp.
Mapping (Point light only) Options for mapping the texture onto the spherical cast of the point light.
Sphere Mapped Maps the texture to a "sphere like" cubemap.
Cylindrical Maps the texture to a cylinder; use this when you want to use reflections on objects that are like cylinders.
Simple Sphere Maps the texture to a simple sphere, deforming the reflection when you rotate it.
Nice Sphere Maps the texture to a sphere, deforming it when you rotate, but you can still see the texture's wrap.
6 Frames Layout The texture contains six images arranged in one of the standard cubemap layouts, cross or sequence (+x -x +y -y +z -z), and the images can be in either horizontal or vertical orientation.
Fixup edge seams (Point light only) Removes visual artifacts at the joined edges of the map image(s).
Alpha from Greyscale If enabled, an alpha transparency channel will be generated from the image's existing values of light & dark.
Filter Mode Selects how the Texture is filtered when it gets stretched by 3D transformations:
Point The Texture becomes blocky up close
Bilinear The Texture becomes blurry up close
Trilinear Like Bilinear, but the Texture also blurs between the different mip levels
Aniso Level Increases texture quality when viewing the texture at a steep angle. Good for floor and ground textures, see below.
Non Power of 2 If texture has non-power-of-two size, this will define a scaling behavior at import time (for more info see the Texture
Sizes section below):
None Texture size will be kept as-is.
To nearest Texture will be scaled to the nearest power-of-two size at import time. For instance 257x511 texture will become
256x512. Note that PVRTC formats require textures to be square (width equal to height), therefore final size will be
upscaled to 512x512.
To larger Texture will be scaled to the next larger power-of-two size at import time. For instance 257x511 texture will become
512x512.
To smaller Texture will be scaled to the next smaller power-of-two size at import time. For instance 257x511 texture will become
256x256.
Generate Cube Map Generates a cubemap from the texture using different generation methods.
Spheremap Maps the texture to a "sphere like" cubemap.
Cylindrical Maps the texture to a cylinder; use this when you want to use reflections on objects that are like cylinders.
SimpleSpheremap Maps the texture to a simple sphere, deforming the reflection when you rotate it.
NiceSpheremap Maps the texture to a sphere, deforming it when you rotate, but you can still see the texture's wrap.
FacesVertical The texture contains the six faces of the cube arranged in a vertical strip in the order +x -x +y -y +z -z.
FacesHorizontal The texture contains the six faces of the cube arranged in a horizontal strip in the order +x -x +y -y +z -z.
CrossVertical The texture contains the six faces of the cube arranged in a vertically oriented cross.
CrossHorizontal The texture contains the six faces of the cube arranged in a horizontally oriented cross.
Read/Write Enabled Select this to enable access to the texture data from scripts (GetPixels, SetPixels and other Texture2D functions). Note however that a copy of the texture data will be made, doubling the amount of memory required for the texture asset. Use only if absolutely necessary. This is only valid for uncompressed and DXT compressed textures; other types of compressed textures cannot be read from. Disabled by default. (A short script sketch of this kind of access appears after this table.)
Import Type The way the image data is interpreted.
Default Standard texture.
Normal Map Texture is treated as a normal map (enables other options)
Lightmap Texture is treated as a lightmap (disables other options)
Alpha from grayscale (Default mode only) Generates the alpha channel from the luminance information in the image.
Create from grayscale (Normal map mode only) Creates the map from the luminance information in the image.
Bypass sRGB sampling (Default mode only) Use the exact colour values from the image rather than compensating for gamma (useful when the texture is for GUI or used as a way to encode non-image data).
Generate Mip Maps Select this to enable mip-map generation. Mip maps are smaller versions of the texture that get used when the texture is
very small on screen. For more info, see Mip Maps below.
In Linear Space Generate mipmaps in linear colour space.
Border Mip Maps Select this to avoid colors seeping out to the edge of the lower Mip levels. Used for light cookies (see below).
Mip Map Filtering Two ways of mip map filtering are available to optimize image quality:
Box The simplest way to fade out the mipmaps - the mip levels become smoother and smoother as they go down in size.
Kaiser A sharpening Kaiser algorithm is run on the mip maps as they go down in size. If your textures are too blurry in the
distance, try this option.
Fade Out Mipmaps Enable this to make the mipmaps fade to gray as the mip levels progress. This is used for detail maps. The leftmost scroll is the first mip level to begin fading out at. The rightmost scroll defines the mip level where the texture is completely grayed out.
Wrap Mode Selects how the Texture behaves when tiled:
Repeat The Texture repeats (tiles) itself
Clamp The Texture's edges get stretched
Filter Mode Selects how the Texture is filtered when it gets stretched by 3D transformations:
Point The Texture becomes blocky up close
Bilinear The Texture becomes blurry up close
Trilinear Like Bilinear, but the Texture also blurs between the different mip levels
Aniso Level Increases texture quality when viewing the texture at a steep angle. Good for floor and ground textures, see below.
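As a rough sketch of the script access mentioned for Read/Write Enabled above (the sourceTexture variable name is only an example and is not part of the importer):

var sourceTexture : Texture2D;   // must have Read/Write Enabled ticked in its import settings

function Start () {
    var pixels : Color[] = sourceTexture.GetPixels();   // errors at runtime if the texture is not readable
    Debug.Log("Read " + pixels.Length + " pixels from " + sourceTexture.name);
}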
Per-Platform Overrides
When you are building for different platforms, you have to think about the resolution, size and quality of your textures for the target platform.
You can set default options and then override the defaults for a specific platform.
If you have set the Texture Type to Advanced then the Texture Format has different values.
Desktop
Texture Format What internal representation is used for the texture. This is a tradeoff between size and quality. In the examples below we show the final size of an in-game texture of 256 by 256 pixels:
RGB Compressed DXT1 Compressed RGB texture. This is the most common format for diffuse textures. 4 bits per pixel (32 KB for a 256x256 texture).
RGBA Compressed DXT5 Compressed RGBA texture. This is the main format used for diffuse & specular control textures. 1 byte/pixel (64 KB for a 256x256 texture).
RGB 16 bit 65 thousand colors with no alpha. Compressed DXT formats use less memory and usually look better. 128 KB for a 256x256 texture.
RGB 24 bit Truecolor but without alpha. 192 KB for a 256x256 texture.
Alpha 8 bit High quality alpha channel but without any color. 64 KB for a 256x256 texture.
RGBA 16 bit Low-quality truecolor. Has 16 levels of red, green, blue and alpha. Compressed DXT5 format uses less memory and usually looks better. 128 KB for a 256x256 texture.
RGBA 32 bit Truecolor with alpha - this is the highest quality. At 256 KB for a 256x256 texture, this one is expensive. Most of the time, DXT5 offers sufficient quality at a much smaller size. The main way this is used is for normal maps, as DXT compression there often carries a visible quality loss.
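The sizes quoted above follow directly from width x height x bits per pixel: a 256x256 texture has 65,536 pixels, so at 4 bits per pixel it occupies 32 KB, at 1 byte per pixel 64 KB, at 24 bits per pixel 192 KB and at 32 bits per pixel 256 KB.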
iOS
Texture Format What internal representation is used for the texture. This is a tradeoff between size and quality. In the examples below we show the final size of an in-game texture of 256 by 256 pixels:
RGB Compressed PVRTC 4 bits Compressed RGB texture. This is the most common format for diffuse textures. 4 bits per pixel (32 KB for a 256x256 texture).
RGBA Compressed PVRTC 4 bits Compressed RGBA texture. This is the main format used for diffuse & specular control textures or diffuse textures with transparency. 4 bits per pixel (32 KB for a 256x256 texture).
RGB Compressed PVRTC 2 bits Compressed RGB texture. Lower quality format suitable for diffuse textures. 2 bits per pixel (16 KB for a 256x256 texture).
RGBA Compressed PVRTC 2 bits Compressed RGBA texture. Lower quality format suitable for diffuse & specular control textures. 2 bits per pixel (16 KB for a 256x256 texture).
RGB Compressed DXT1 Compressed RGB texture. This format is not supported on iOS, but kept for backwards compatibility with desktop projects.
RGBA Compressed DXT5 Compressed RGBA texture. This format is not supported on iOS, but kept for backwards compatibility with desktop projects.
RGB 16 bit 65 thousand colors with no alpha. Uses more memory than PVRTC formats, but could be more suitable for UI or crisp textures without gradients. 128 KB for a 256x256 texture.
RGB 24 bit Truecolor but without alpha. 192 KB for a 256x256 texture.
Alpha 8 bit High quality alpha channel but without any color. 64 KB for a 256x256 texture.
RGBA 16 bit Low-quality truecolor. Has 16 levels of red, green, blue and alpha. Uses more memory than PVRTC formats, but can be handy if you need an exact alpha channel. 128 KB for a 256x256 texture.
RGBA 32 bit Truecolor with alpha - this is the highest quality. At 256 KB for a 256x256 texture, this one is expensive. Most of the time, PVRTC formats offer sufficient quality at a much smaller size.
Compression quality Choose Fast for quickest performance, Best for the best image quality and Normal for a balance between the two.
Android
Texture Format What internal representation is used for the texture. This is a tradeoff between size and quality. In the examples below we show the final size of an in-game texture of 256 by 256 pixels:
RGB Compressed DXT1 Compressed RGB texture. Supported by Nvidia Tegra. 4 bits per pixel (32 KB for a 256x256 texture).
RGBA Compressed DXT5 Compressed RGBA texture. Supported by Nvidia Tegra. 8 bits per pixel (64 KB for a 256x256 texture).
RGB Compressed ETC 4 bits Compressed RGB texture. This is the default texture format for Android projects. ETC1 is part of OpenGL ES 2.0 and is supported by all OpenGL ES 2.0 GPUs. It does not support alpha. 4 bits per pixel (32 KB for a 256x256 texture).
RGB Compressed PVRTC 2 bits Compressed RGB texture. Supported by Imagination PowerVR GPUs. 2 bits per pixel (16 KB for a 256x256 texture).
RGBA Compressed PVRTC 2 bits Compressed RGBA texture. Supported by Imagination PowerVR GPUs. 2 bits per pixel (16 KB for a 256x256 texture).
RGB Compressed PVRTC 4 bits Compressed RGB texture. Supported by Imagination PowerVR GPUs. 4 bits per pixel (32 KB for a 256x256 texture).
RGBA Compressed PVRTC 4 bits Compressed RGBA texture. Supported by Imagination PowerVR GPUs. 4 bits per pixel (32 KB for a 256x256 texture).
RGB Compressed ATC 4 bits Compressed RGB texture. Supported by Qualcomm Snapdragon. 4 bits per pixel (32 KB for a 256x256 texture).
RGBA Compressed ATC 8 bits Compressed RGBA texture. Supported by Qualcomm Snapdragon. 8 bits per pixel (64 KB for a 256x256 texture).
RGB 16 bit 65 thousand colors with no alpha. Uses more memory than the compressed formats, but could be more suitable for UI or crisp textures without gradients. 128 KB for a 256x256 texture.
RGB 24 bit Truecolor but without alpha. 192 KB for a 256x256 texture.
Alpha 8 bit High quality alpha channel but without any color. 64 KB for a 256x256 texture.
RGBA 16 bit Low-quality truecolor. The default compression for textures with an alpha channel. 128 KB for a 256x256 texture.
RGBA 32 bit Truecolor with alpha - this is the highest quality compression for textures with alpha. 256 KB for a 256x256 texture.
Compression quality Choose Fast for quickest performance, Best for the best image quality and Normal for a balance between the two.
Unless you're targeting specific hardware, like Tegra, we'd recommend using ETC1 compression. If needed, you could store an external alpha
channel and still benefit from the lower texture footprint. If you absolutely want to store an alpha channel in a texture, RGBA 16 bit is the
format supported by all hardware vendors.
Textures can be imported from DDS files but only DXT or uncompressed pixel formats are currently supported.
If your app utilizes an unsupported texture compression, the textures will be uncompressed to RGBA 32 and stored in memory along with the
compressed ones. So in this case you lose time decompressing textures and lose memory storing them twice. It may also have a very negative
impact on rendering performance.
Flash
Format Image format
RGB JPG Compressed RGB image data compressed in JPG format
RGBA JPG Compressed RGBA image data (ie, with alpha) compressed in JPG format
RGB 24-bit Uncompressed RGB image data, 8 bits per channel
RGBA 32-bit Uncompressed RGBA image data, 8 bits per channel
Details
Supported Formats
Unity can read the following file formats: PSD, TIFF, JPG, TGA, PNG, GIF, BMP, IFF, PICT. It should be noted that Unity can import multi-layer
PSD & TIFF files just fine. They are flattened automatically on import but the layers are maintained in the assets themselves, so you don't lose
any of your work when using these file types natively. This is important as it allows you to just have one copy of your textures that you can use
from Photoshop, through your 3D modelling app and into Unity.
Texture Sizes
Ideally texture sizes should be powers of two on the sides. These sizes are as follows: 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048 etc. pixels.
The textures do not have to be square, i.e. width can be different from height. Note that each platform may impose maximum texture sizes.
It is possible to use other (non power of two - "NPOT") texture sizes with Unity. Non power of two texture sizes generally take slightly more
memory and might be slower to read by the GPU, so for performance it's best to use power of two sizes whenever you can. If the platform or
GPU does not support NPOT texture sizes, then Unity will scale and pad the texture up to next power of two size, which will use even more
memory and make loading slower (in practice, this always happens on Flash and some older Android devices). In general you'd want to use
non power of two sizes only for GUI purposes.
Non power of two texture assets can be scaled up at import time using the Non Power of 2 option in the advanced texture type in the import
settings.
UV Mapping
When mapping a 2D texture onto a 3D model, some sort of wrapping is done. This is called UV mapping and is done in your 3D modelling app.
Inside Unity, you can scale and move the texture using Materials. Scaling normal & detail maps is especially useful.
Mip Maps
Mip Maps are a list of progressively smaller versions of an image, used to optimise performance on real-time 3D engines. Objects that are far
away from the camera use the smaller texture versions. Using mip maps uses 33% more memory, but not using them can be a huge
performance loss. You should always use mipmaps for in-game textures; the only exceptions are textures that will never be minified (e.g. GUI
textures).
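The 33% figure comes from the mip chain being a geometric series: each level has a quarter of the pixels of the level above it, so the extra memory is 1/4 + 1/16 + 1/64 + ... = 1/3 of the base texture.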
Normal Maps
Normal maps are used by normal map shaders to make low-polygon models look as if they contain more detail. Unity uses normal maps
encoded as RGB images. You also have the option to generate a normal map from a grayscale height map image.
Detail Maps
If you want to make a terrain, you normally use your main texture to show areas of grass, rocks, sand, etc. If your terrain is of a decent size, it
will end up very blurry. Detail textures hide this fact by fading in small details as your main texture gets up close.
When drawing detail textures, a neutral gray is invisible, white makes the main texture twice as bright and black makes the main texture
completely black.
Anisotropic filtering
Anisotropic filtering increases texture quality when viewed from a grazing angle, at some expense of rendering cost (the cost is entirely on the
graphics card). Increasing anisotropy level is usually a good idea for ground and floor textures. In Quality Settings anisotropic filtering can be
forced for all textures or disabled completely.
Procedural Materials
Unity incorporates a new asset type known as Procedural Materials. These are essentially the same as standard Materials except that the
textures they use can be generated at runtime rather than being predefined and stored.
The script code that generates a texture procedurally will typically take up much less space in storage and transmission than a bitmap image
and so Procedural Materials can help reduce download times. Additionally, the generation script can be equipped with parameters that can be
changed in order to vary the visual properties of the material at runtime. These properties can be anything from color variations to the size of
bricks in a wall. Not only does this mean that many variations can be generated from a single Procedural Material but also that the material can
be animated on a frame-by-frame basis. Many interesting visual effects are possible - imagine a character gradually turning to stone or acid
damaging a surface as it touches.
Unity's Procedural Material system is based around an industry standard product called Substance, developed by Allegorithmic.
Supported Platforms
In Unity, Procedural Materials are fully supported for standalone and webplayer build targets only (Windows and Mac OS X). For all other
platforms, Unity will pre-render or bake them into ordinary Materials during the build. Although this clearly negates the runtime benefits of
procedural generation, it is still useful to be able to create variations on a basic material in the editor.
Although they are implemented differently, Unity handles a Procedural Material just like any other Material. To assign a Procedural Material to a
mesh, for example, you just drag and drop it onto the mesh exactly as you would with any other Material.
Procedural Properties
Each Procedural Material is a custom script which generates a particular type of material. These scripts are similar to Unity scripts in that they
can have variables exposed for assignment in the inspector. For example, a "Brick Wall" Procedural Material could expose properties that let
you set the number of courses of bricks, the colors of the bricks and the color of the mortar. This potentially offers infinite material variations
from a single asset. These properties can also be set from a script at runtime in much the same way as the public variables of a
MonoBehaviour script.
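As a minimal sketch of setting such a property from script (the "Brick_Count" property name is hypothetical and depends entirely on the Substance in question):

function Start () {
    // The Procedural Material currently assigned to this object's renderer.
    var substance : ProceduralMaterial = renderer.material as ProceduralMaterial;
    if (substance != null) {
        substance.SetProceduralFloat("Brick_Count", 12.0);   // change an exposed property
        substance.RebuildTextures();                         // regenerate the textures with the new value
    }
}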
Procedural Materials can also incorporate complex texture animation. For example, you could animate the hands of a clock or cockroaches
running across a floor.
images can be filtered and modified before use. Unlike a standard Material, a Procedural Material can use vector images in the form of SVG
files which allows for resolution-independent textures.
The design tools available for creating Procedural Materials from scratch use visual, node-based editing similar to the kind found in artistic tools.
This makes creation accessible to artists who may have little or no coding experience. As an example, here is a screenshot from Allegorithmic's
Substance Designer which shows a "brick wall" Procedural Material under construction:
Procedural Materials support a form of caching whereby the material is only updated if its parameters have changed since it was last generated.
Further to this, some materials may have many properties that could theoretically be changed and yet only a few will ever need to change at
runtime. In such cases, you can inform Unity about the variables that will not change to help it cache as much data as possible from the
previous generation of the material. This will often improve performance significantly.
Procedural Materials can refer to hidden, system-wide variables, such as elapsed time or the number of Procedural Material instances (this data
can be useful for animations). Changes in the values of these variables can still force a Procedural Material to update even if none of the
explicitly defined parameters change.
Procedural Materials can also be used purely as a convenience in the editor (ie, you can generate a standard Material by setting the parameters
of a Procedural Material and then "baking" it). This will remove the runtime overhead of material generation but naturally, the baked materials
can't be changed or animated during gameplay.
Substance Player uses the same optimized rendering engine as the one integrated into Unity, so its rendering measurement is more
representative of performance in Unity than that of Substance Designer.
Video Files
Note: This is a Pro/Advanced feature only.
Desktop
Movie Textures are animated Textures that are created from a video file. By placing a video file in your project's Assets Folder, you can
import the video to be used exactly as you would use a regular Texture.
Video files are imported via Apple QuickTime. Supported file types are what your QuickTime installation can play (usually .mov, .mpg, .mpeg,
.mp4, .avi, .asf). On Windows movie importing requires Quicktime to be installed (download here).
Properties
The Movie Texture Inspector is very similar to the regular Texture Inspector.
Aniso Level Increases Texture quality when viewing the texture at a steep angle. Good for floor and ground textures
Filtering Mode Selects how the Texture is filtered when it gets stretched by 3D transformations
Loop If enabled, the movie will loop when it finishes playing
Quality Compression of the Ogg Theora video file. A higher value means higher quality, but larger file size
Details
When a video file is added to your Project, it will automatically be imported and converted to Ogg Theora format. Once your Movie Texture has
been imported, you can attach it to any GameObject or Material, just like a regular Texture.
// this line of code will make the Movie Texture begin playing
renderer.material.mainTexture.Play();
Attach the following script to toggle Movie playback when the space bar is pressed:
function Update () {
    if (Input.GetButtonDown ("Jump")) {
        if (renderer.material.mainTexture.isPlaying) {
            renderer.material.mainTexture.Pause();
        }
        else {
            renderer.material.mainTexture.Play();
        }
    }
}
For more information about playing Movie Textures, see the Movie Texture Script Reference page
Movie Audio
When a Movie Texture is imported, the audio track accompanying the visuals is imported as well. This audio appears as an AudioClip child of
the Movie Texture.
The video's audio track appears as a child of the Movie Texture in the Project View
To play this audio, the Audio Clip must be attached to a GameObject, like any other Audio Clip. Drag the Audio Clip from the Project View onto
any GameObject in the Scene or Hierarchy View. Usually, this will be the same GameObject that is showing the Movie. Then use audio.Play()
to make the movie's audio track play along with its video.
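For example, a minimal sketch (assuming the movie is on this object's material and the movie's audio clip is assigned to this object's Audio Source):

function Start () {
    renderer.material.mainTexture.Play();   // start the Movie Texture
    audio.Play();                           // start the Audio Source that holds the movie's audio clip
}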
iOS
Movie Textures are not supported on iOS. Instead, full-screen streaming playback is provided using Handheld.PlayFullScreenMovie.
You need to keep your videos inside the StreamingAssets folder located in your Project directory.
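A minimal sketch of a full-screen playback call is shown below; the file name intro.mp4 is only an example and must match a file you have placed in StreamingAssets:

function Start () {
    // Plays until the movie ends or the user touches the screen.
    Handheld.PlayFullScreenMovie("intro.mp4", Color.black, FullScreenMovieControlMode.CancelOnInput);
}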
Unity iOS supports any movie file type that plays correctly on an iOS device, meaning files with the extensions .mov, .mp4, .mpv, and .3gp
that use one of the following compression standards:
For more information about supported compression standards, consult the iPhone SDK MPMoviePlayerController Class Reference.
As soon as you call iPhoneUtils.PlayMovie or iPhoneUtils.PlayMovieURL, the screen will fade from your current content to the designated
background color. It might take some time before the movie is ready to play but in the meantime, the player will continue displaying the
background color and may also display a progress indicator to let the user know the movie is loading. When playback finishes, the screen will
fade back to your content.
The video player does not respect switching to mute while playing videos
As written above, video files are played using Apple's embedded player (as of SDK 3.2 and iPhone OS 3.1.2 and earlier). This contains a bug
that prevents Unity switching to mute.
Android
Movie Textures are not supported on Android. Instead, full-screen streaming playback is provided using Handheld.PlayFullScreenMovie.
You need to keep your videos inside of the StreamingAssets folder located in your Project directory.
Unity Android supports any movie file type supported by Android (i.e. files with the extensions .mp4 and .3gp) that uses one of the following
compression standards:
H.263
H.264 AVC
MPEG-4 SP
However, device vendors are keen on expanding this list, so some Android devices are able to play formats other than those listed, such as HD
videos.
For more information about the supported compression standards, consult the Android SDK Core Media Formats documentation.
As soon as you call iPhoneUtils.PlayMovie or iPhoneUtils.PlayMovieURL, the screen will fade from your current content to the designated
background color. It might take some time before the movie is ready to play. In the meantime, the player will continue displaying the background
color and may also display a progress indicator to let the user know the movie is loading. When playback finishes, the screen will fade back to
your content.
Audio Files
As with Meshes or Textures, the workflow for Audio File assets is designed to be smooth and trouble free. Unity can import almost every
common file format but there are a few details that are useful to be aware of when working with Audio Files.
Audio in Unity is either Native or Compressed. Unity supports most common formats (see the list below) and will import an audio file when it is
added to the project. The default mode is Native, where the audio data from the original file is imported unchanged. However, Unity can also
compress the audio data on import, simply by enabling the Compressed option in the importer. (iOS projects can make use of the hardware
decoder - see the iOS documentation for further details). The differences between Native and Compressed modes are as follows:-
Native: Use Native (WAV, AIFF) audio for short sound effects. The audio data will be larger but sounds won't need to be decoded at
runtime.
Compressed: The audio data will be small but will need to be decompressed at runtime, which entails a processing overhead. Depending
on the target, Unity will encode the audio to either Ogg Vorbis(Mac/PC/Consoles) or MP3 (Mobile platforms). For the best sound quality,
supply the audio in an uncompressed format such as WAV or AIFF (containing PCM data) and let Unity do the encoding. If you are targeting
Mac and PC platforms only (including both standalones and webplayers) then importing an Ogg Vorbis file will not degrade the quality.
However, on mobile platforms, Ogg Vorbis and MP3 files will be re-encoded to MP3 on import, which will introduce a slight quality
degradation.
Any Audio File imported into Unity is available from scripts as an Audio Clip instance, which is effectively just a container for the audio data.
The clips must be used in conjunction with Audio Sources and an Audio Listener in order to actually generate sound. When you attach your
clip to an object in the game, it adds an Audio Source component to the object, which has Volume, Pitch and numerous other properties.
While a Source is playing, an Audio Listener can "hear" all sources within range, and the combination of those sources gives the sound that will
actually be heard through the speakers. There can be only one Audio Listener in your scene, and this is usually attached to the Main Camera.
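Putting that together, a minimal sketch (the impactClip variable name is just an example; the GameObject needs an Audio Source and the scene needs one Audio Listener):

var impactClip : AudioClip;

function Start () {
    audio.clip = impactClip;   // the Audio Source attached to this GameObject
    audio.Play();              // heard through the scene's Audio Listener
}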
Supported Formats
Format Compressed as (Mac/PC) Compressed as (Mobile)
MPEG(1/2/3) Ogg Vorbis MP3
Ogg Vorbis Ogg Vorbis MP3
WAV Ogg Vorbis MP3
AIFF Ogg Vorbis MP3
MOD - -
IT - -
S3M - -
XM - -
See the Sound chapter in the Creating Gameplay section of this manual for more information on using sound in Unity.
Audio Clip
Audio Clips contain the audio data used by Audio Sources. Unity supports mono, stereo and multichannel audio assets (up to eight channels).
The audio file formats that Unity can import are .aif, .wav, .mp3, and .ogg. Unity can also import tracker modules in the .xm, .mod, .it, and
.s3m formats. The tracker module assets behave the same way as any other audio assets in Unity although no waveform preview is available
in the asset import inspector.
Properties
Audio Format The specific format that will be used for the sound at runtime.
Native This option offers higher quality at the expense of larger file size and is best for very short sound effects.
Compressed The compression results in smaller files but with somewhat lower quality compared to native audio. This format is
best for medium length sound effects and music.
3D Sound If enabled, the sound will play back in 3D space. Both Mono and Stereo sounds can be played in 3D.
Force to mono If enabled, the audio clip will be down-mixed to a single channel sound.
Load Type The method Unity uses to load audio assets at runtime.
Decompress on load Audio files will be decompressed as soon as they are loaded. Use this option for smaller compressed sounds to avoid the performance overhead of decompressing on the fly. Be aware that decompressing sounds on load will use about ten times more memory than keeping them compressed, so don't use this option for large files.
Compressed in memory Keep sounds compressed in memory and decompress while playing. This option has a slight performance overhead (especially for Ogg/Vorbis compressed files) so only use it for bigger files where decompression on load would use a prohibitive amount of memory. Note that, due to technical limitations, this option will silently switch to Stream From Disc (see below) for Ogg Vorbis assets on platforms that use FMOD audio.
Stream from disc Stream audio data directly from disc. The memory used by this option is typically a small fraction of the file size, so it is very useful for music or other very long tracks. For performance reasons, it is usually advisable to stream only one or two files from disc at a time, but the number of streams that can comfortably be handled depends on the hardware.
Compression Amount of compression to be applied to a Compressed clip. Statistics about the file size can be seen under the slider. A good approach to tuning this value is to drag the slider to a place that leaves the playback "good enough" while keeping the file small enough for your distribution requirements.
Hardware Decoding (iOS only) On iOS devices, Apple's hardware decoder can be used, resulting in lower CPU overhead during decompression. Check out platform specific details for more info.
Gapless looping (Android/iOS only) Use this when compressing a seamless looping audio source file (in a non-compressed PCM format) to ensure perfect continuity is preserved at the seam. Standard MPEG encoders introduce a short silence at the loop point, which will be audible as a brief "click" or "pop".
As a general rule of thumb, Compressed audio (or modules) is best for long files like background music or dialog, while Native is better for
short sound effects. You should tweak the amount of Compression using the compression slider. Start with high compression and gradually
reduce the setting to the point where the loss of sound quality is perceptible. Then, increase it again slightly until the perceived loss of quality
disappears.
Using 3D Audio
If an audio clip is marked as a 3D Sound then it will be played back so as to simulate its position in the game world's 3D space. 3D sounds
emulate the distance and location of sounds by attenuating volume and panning across speakers. Both mono and multiple channel sounds can
be positioned in 3D. For multiple channel audio, use the spread option on the Audio Source to spread and split out the discrete channels in
speaker space. Unity offers a variety of options to control and fine-tune the audio behavior in 3D space - see the Audio Source component
reference for further details.
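A small illustrative sketch of tuning those 3D settings on the Audio Source from script (the values here are arbitrary examples):

function Start () {
    audio.spread = 120.0;                               // spread a multichannel source across speakers
    audio.minDistance = 2.0;                            // full volume inside this distance
    audio.maxDistance = 40.0;                           // attenuation levels off beyond this distance
    audio.rolloffMode = AudioRolloffMode.Logarithmic;   // how volume falls off with distance
}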
iOS
On mobile platforms compressed audio is encoded as MP3 to take advantage of hardware decompression.
To improve performance, audio clips can be played back using the Apple hardware codec. To enable this option, check the "Hardware
Decoding" checkbox in the Audio Importer. Note that only one hardware audio stream can be decompressed at a time, including the
background iPod audio.
If the hardware decoder is not available, the decompression will fall back on the software decoder (on iPhone 3GS or later, Apple's software
decoder is used in preference to Unity's own decoder (FMOD)).
Android
On mobile platforms compressed audio is encoded as MP3 to take advantage of hardware decompression.
Tracker Modules
Tracker Modules are essentially just packages of audio samples that have been modeled, arranged and sequenced programmatically. The
concept was introduced in the 1980s (mainly in conjunction with the Amiga computer) and has been popular since the early days of game development.
Tracker Module files are similar to MIDI files in many ways. The tracks are scores that contain information about when to play the instruments,
and at what pitch and volume and from this, the melody and rhythm of the original tune can be recreated. However, MIDI has a disadvantage in
that the sounds are dependent on the sound bank available in the audio hardware, so MIDI music can sound different on different computers. In
contrast, tracker modules include high quality PCM samples that ensure a similar experience regardless of the audio hardware in use.
Supported formats
Unity supports the four most common module file formats, namely Impulse Tracker (.it), Scream Tracker (.s3m), Extended Module File Format
(.xm), and the original Module File Format (.mod).
Scripting
This brief introduction explains how to create and use scripts in a project. For detailed information about the Scripting API, please view the
Scripting Reference. For detailed information about creating game play through scripting, please view the Creating Gameplay page of this
manual.
Behaviour scripts in Unity can be written in JavaScript, C#, or Boo. It is possible to use any combination of the three languages in a single
project, although there are certain restrictions in cases where one script incorporates classes defined in another script.
You can edit the script by double-clicking on it in the Project View. This will launch your default text editor as specified in Unity's preferences. To
set the default script editor, change the drop-down item in Unity->Preferences->External Script editor.
function Update () {
}
A new, empty script does not do a lot on its own, so let's add some functionality. Change the script to read the following:
function Update () {
    print("Hello World");
}
When executed, this code will print "Hello World" to the console. But there is nothing that causes the code to be executed yet. We have to
attach the script to an active GameObject in the Scene before it will be executed.
Now drag the script from the Project View to the Cube (in the Scene or Hierarchy View, it doesn't matter). You can also select the Cube and
choose Component->Scripts->New Behaviour Script. Either of these methods will attach the script to the Cube. Every script you create will
appear in the Component->Scripts menu.
If you select the Cube and look at the Inspector, you will see that the script is now visible. This means it has been attached.
Press Play to test your creation. You should see the text "Hello World" appear beside the Play/Pause/Step buttons. Exit play mode when you
see it.
function Update () {
    transform.Rotate(0, 5*Time.deltaTime, 0);
}
If you're new to scripting, it's okay if this looks confusing. These are the important concepts to understand:
1. function Update () {} is a container for code that Unity executes multiple times per second (once per frame).
2. transform is a reference to the GameObject's Transform Component.
3. Rotate() is a function contained in the Transform Component.
4. The numbers in-between the commas represent the degrees of rotation around each axis of 3D space: X, Y, and Z.
5. Time.deltaTime is a member of the Time class that evens out movement over one second, so the cube will rotate at the same speed
no matter how many frames per second your machine is rendering. Therefore, 5 * Time.deltaTime means 5 degrees per second.
With all this in mind, we can read this code as "every frame, rotate this GameObject's Transform component a small amount so that it will equal
five degrees around the Y axis each second."
You can access lots of different Components the same way as we accessed transform already. You have to add Components to the
GameObject using the Component menu. All the Components you can access directly are listed under Variables on the GameObject Scripting
Reference Page.
For more information about the relationship between GameObjects, Scripts, and Components, please jump ahead to the GameObjects page or
Using Components page of this manual.
Instead of typing 5 into the Rotate() function, we will declare a speed variable and use that in the function. Change the script to the following
code and save it:
var speed = 5.0;

function Update () {
    transform.Rotate(0, speed*Time.deltaTime, 0);
}
Now, select the Cube and look at the Inspector. Notice how our speed variable appears.
This variable can now be modified directly in the Inspector. Select it, press Return and change the value. You can also right- or option-click on
the value and drag the mouse up or down. You can change the variable at any time, even while the game is running.
Hit Play and try modifying the speed value. The Cube's rotation speed will change instantly. When you exit Play mode, you'll see that your
changes are reverted back to their value before entering Play mode. This way you can play, adjust, and experiment to find the best value, then
apply that value permanently.
The technique of changing a variable's value in the Inspector makes it easy to reuse one script on many objects, each with a different variable
value. If you attach the script to multiple Cubes, and change the speed of each cube, they will all rotate at different speeds even though they
use the same script.
Typing this will be accessing the script Component that you are writing. Typing this.gameObject is referring to the GameObject that the script
is attached to. You can access the same GameObject by simply typing gameObject. Logically, typing this.transform is the same as typing
transform. If you want to access a Component that is not included as a GameObject member, you have to use gameObject.GetComponent()
which is explained on the next page.
There are many Components that can be directly accessed in any script. For example, if you want to access the Translate function of the
Transform component, you can just write transform.Translate() or gameObject.transform.Translate(). This works because all scripts are
attached to a GameObject. So when you write transform you are implicitly accessing the Transform Component of the GameObject that is
being scripted. To be explicit, you write gameObject.transform. There is no advantage in one method over the other, it's all a matter of
preference for the scripter.
To see a list of all the Components you can access implicitly, take a look at the GameObject page in the Scripting Reference.
Using GetComponent()
There are many Components which are not referenced directly as members of the GameObject class, so you cannot access them implicitly;
you have to access them explicitly. You do this by calling GetComponent("component name") and storing a reference to the result. This
is most common when you want to make a reference to another script attached to the GameObject.
Pretend you are writing Script B and you want to make a reference to Script A, which is attached to the same GameObject. You would have to
use GetComponent() to make this reference. In Script B, you would simply write:
scriptA = GetComponent("ScriptA");
For more help with using GetComponent(), take a look at the GetComponent() Script Reference page.
function Start () {
    // Print the position of the Transform component of the GameObject this script is attached to
    Debug.Log(gameObject.GetComponent.<Transform>().position);
}
In the previous example the generic GetComponent.<T>() function is used to access the position property of the Transform component. The same
technique can be used to access a variable in a custom script Component:
(MyClass.js)
public var speed : float = 3.14159;

(MyOtherClass.js)
function Start () {
    // Print the speed variable from the MyClass script Component attached to the gameObject
    Debug.Log(gameObject.GetComponent.<MyClass>().speed);
}
In general the code inside the "Standard Assets", "Pro Standard Assets" or "Plugins" folders, regardless of the language (C#, Javascript or
Boo), will be compiled first and available to scripts in subsequent compilation steps.
(MyClass.js)
public var speed : float = 3.14159;

(MyOtherClass.js)
private var myClass : MyClass;

function Start () {
    // Get a reference to the MyClass script Component attached to the gameObject
    myClass = gameObject.GetComponent.<MyClass>();
}

function Update () {
    // Verify that the reference is still valid and print the speed variable
    if (myClass != null)
        Debug.Log(myClass.speed);
}
Static Variables
It is also possible to declare variables in your classes as static. There will exist one and only one instance of a static variable for a given
class, and it can be modified without needing an instance of that class:
(MyClass.js)
static public var speed : float = 3.14159;
(MyOtherClass.js)
function Start () {
Debug.Log (MyClass.speed);
}
It is recommended not to use static variables for object references, so that unused objects can be removed from memory.
Asset Store
Unity's Asset Store is home to a growing library of free and commercial assets created both by Unity Technologies and also members of the
community. A wide variety of assets is available, covering everything from textures, models and animations to whole project examples, tutorials
and Editor extensions. The assets are accessed from a simple interface built into the Unity Editor and are downloaded and imported directly into
your project.
The Store provides a browser-like interface which allows you to navigate either by free text search or by browsing packages and categories. To
the left of the main tool bar are the familiar browsing buttons for navigating through the history of viewed items:-
To the right of these are buttons for viewing the Download Manager and for viewing the current contents of your shopping cart.
The Download Manager allows you to view the packages you have already bought and also to find and install any updates. Additionally, the
standard packages supplied with Unity can be viewed and added to your project with the same interface.
The downloaded asset packages are stored locally at:
~/Library/Unity/Asset Store
...on Mac OS X, and at:
C:\Users\accountName\AppData\Roaming\Unity\Asset Store
...on Windows. These folders contain subfolders that correspond to particular Asset Store vendors - the actual asset files are contained in the
appropriate subfolders.
1. Open up the Asset Store and download the latest version of "Asset Store Tools" from the Asset Store (you'll need to sign in, if you haven't done so already).
2. Once it is downloaded and imported into your project, you should see the "Asset Store Tools" menu appear in your Toolbar. Scroll down and click the Package Manager button.
3. Now that you have the Package Manager open, you can click the link in the top right-hand corner that reads "Publisher account".
4. This will bring up a window that prompts you to create your Publisher Account. You'll need to fill out your Publisher name, Publisher URL, Publisher description (including an email Support address for your packages) and Key images.
Publisher Administration
Once you have your Publisher Account for the Asset Store set up, you'll be able to log into the Publisher Administration portal, here:
https://publisher.assetstore.unity3d.com/
The "Sales" tab allows you to track all purchases made by customers, organised by Package name. Quantity of Sales and Gross Revenue will be shown here, as well as any refunds (if applicable).
The "Free Downloads" tab allows you to track all packages you have published on the Asset Store for free, much the same as the Sales tab.
The "Revenue" tab shows your revenue for any given month since you started selling on the Asset Store. Credits, debits, your balance and recent payouts will be shown here.
The "Pending" tab will show any outstanding packages you have submitted that are pending approval from our Vetting Team before being accepted for sale on the Asset Store.
If and when a customer needs Support for your package, you can verify they have indeed purchased your package using the "Verify Invoice" tab.
You can add a number of administrative users (teammates, employees, colleagues) to your master account via the "Users" tab.
In the "Payout" tab, you will specify how you would like to receive your earnings. You can amend your payout details for the Asset Store at any point. There are three options when receiving payouts from the Asset Store.
Q&A
Q: What date will I receive my monthly or quarterly transfer?
A: All payouts are scheduled for the 15th of each month.
Q: My package has shown as Pending for a while now, what should I do?
A: Our Vetting Team receives a huge number of submissions per week, so please be patient while waiting for acceptance. If you feel there may be an issue with your submission, please contact [email protected], stating your Publisher and Package details.
Asset Server
The Asset Server is available only for Unity Pro, and requires an additional license per client. To purchase an Asset Server Client License, please visit the Unity store
at http://unity3d.com/store
In a way, the Asset Server functions as a backup of your Project Folder. You do not directly manipulate the contents of the Asset Server while
you are developing. You make changes to your Project locally, then when you are done, you Commit Changes to the Project on the Server.
This makes your local Project and the Asset Server Project identical.
Now, when your fellow developers make a change, the Asset Server is identical to their Project, but not yours. To synchronize your local
Project, you request to Update from Server. Now, whatever changes your team members have made will be downloaded from the server to
your local Project.
This is the basic workflow for using the Asset Server. In addition to this basic functionality, the Asset Server allows for rollback to previous
versions of assets, detailed file comparison, merging two different scripts, resolving conflicts, and recovering deleted assets.
The rest of this guide explains how to deploy, administrate, and regularly use the Asset Server.
Getting Started
If you are joining a team that has a lot of work stored on the Asset Server already, this is the quickest way to get up and running correctly. If you
are starting your own project from scratch, you can skip down to the Workflow Fundamentals section.
Continue reading for detailed information on how to use the Asset Server effectively every day.
Workflow Fundamentals
When using the Asset Server with a multi-person team, it is generally good practice to Update all changed assets from the server when you
begin working, and Commit your changes at the end of the day, or whenever you're done working. You should also commit changes when you
have made significant progress on something, even if it is in the middle of the day. Committing your changes regularly and frequently is
recommended.
The Server View is broken into three tabs: Overview, Update, and Commit. Overview will show you any differences between your local project and
the latest version on the server with options to quickly commit local changes or download the latest updates. Update will show you the latest
remote changes on the server and allow you to download them to your local project. Commit allows you to create a Changeset and commit it
to the server for others to download.
1. Server address
2. Username
3. Password
By clicking Show projects you can now see the available projects on the asset server, and choose which one to connect to by clicking
Connect. Note that the username and password you use can be obtained from your system administrator; your system administrator created the accounts when they installed the Asset Server.
Now you will be able to see all the local changes made to the project since your last update, and will be able to select which changes you wish
to upload to the server. You can add changes to the changeset either by manually dragging them into the changeset field, or by using the
buttons placed below the commit message field. Remember to type in a commit message which will help you when you compare versions or
revert to an earlier version later on, both of which are discussed below.
Resolving conflicts
With multiple people working on the same collection of data, conflicts will inevitably arise. Remember, there is no need to panic! If a conflict
exists, you will be presented with the Conflict Resolution dialog when updating your project.
Here, you will be informed of each individual conflict, and be presented with different options to resolve each individual conflict. For any single
conflict, you can select Skip Asset (which will not download that asset from the server), Discard My Changes (which will completely overwrite
your local version of the asset) or Ignore Server Changes (which will ignore the changes others made to the asset and after this update you
will be able to commit your local changes over server ones) for each individual conflict. Additionally, you can select Merge for text assets like
scripts to merge the server version with the local version.
Note: If you choose to discard your changes, the asset will be updated to the latest version from the server (i.e., it will incorporate other users'
changes that have been made while you were working). If you want to get the asset back as it was when you started working, you should revert
to the specific version that you checked out. (See Browsing revision history and reverting assets below.)
If you run into a conflict while you are committing your local changes, Unity will refuse to commit your changes and inform you that a conflict
exists. To resolve the conflicts, select Update. Your local changes will not automatically be overwritten. At this point you will see the Conflict
Resolution dialog, and can follow the instructions in the above paragraph.
Here, you can see the version number and added comments with each version of the asset or project. This is one reason why descriptive
comments are helpful. Select any asset to see its history, or Entire Project for all changes made in the project. Find the revision you need; you can select either a whole revision or a particular asset within a revision. Then click Download Selected File to replace your local asset with a copy of the selected revision. Revert All Project will revert the entire project to the selected revision.
Prior to reverting, if there are any differences between your local version and the selected server version, those changes will be lost when the
local version is reverted.
If you only want to abandon the changes made to the local copy, you don't have to revert. You can discard those local modifications by
selecting Discard Changes in the main asset server window. This will immediately download the current version of the project from the server
to your local Project.
Note: this feature requires that you have one of the supported file diff/merge tools installed. Supported tools are:
On Windows:
TortoiseMerge: part of TortoiseSVN or a separate download from the project site.
WinMerge.
SourceGear Diff/Merge.
Perforce Merge (p4merge): part of Perforce's visual client suite (P4V).
TkDiff.
On Mac OS X:
SourceGear Diff/Merge.
FileMerge: part of Apple's XCode development tools.
TkDiff.
Perforce Merge (p4merge): part of Perforce's visual client suite (P4V).
Expand the Deleted Assets item, find and select the assets from the list, and hit Recover; the selected assets will be downloaded and re-added to the
local project. If the folder that the asset was located in before the deletion still exists, the asset will be restored to the original location, otherwise
it will be added to the root of the Assets folder in the local project.
You should now be equipped with the knowledge you need to start using the Asset Server effectively. Get to it, and don't forget the good
workflow fundamentals. Commit changes often, and don't be afraid of losing anything.
Server-side Installation
The Asset Server is designed to be a simple one-time installation on a server machine. Interacting with the Asset Server is done through Unity.
Unity can be installed on the server machine, but it does not need to be. It must be administered from a Client machine, where Projects and
Users can be added. Each additional client must be configured to synchronize with a Project, using a specific User credential.
You can install the Asset Server on Mac OS X 10.4 or later, Windows XP, Windows Vista and various Linux distributions including CentOS,
Ubuntu and Suse Linux. Download Unity Asset Server from here.
The installer will install all necessary files, setup a database and launch the Asset Server. At the end of the process you will be asked to create
an Admin password. This password is required to administer the Asset Server from within Unity. You must connect to the Asset Server as the
administrator before you can create any projects or users.
To access the Administrator controls, launch Unity and select Window->Asset Server, then click the Administration button.
In the Server Address field, enter either the ip address or host name of the computer running the Asset Server that you want to administer. If
the Asset Server is installed on your local machine, you can use "localhost" as the Server Address. Next, provide the administrator name and
password. The administrator name is always "admin", and the password is what was entered when installing the Asset Server. Finally, hit the
Connect button. You're now connected to the Asset Server, and can perform the initial setup.
New Projects can be created by clicking on the Create button in the Server Administration tab.
New users can be created by first selecting an existing project and then clicking on the New User button.
After a user has been created in one Project, the user can be added to another project by enabling the checkbox on the left of the user name in
the users list.
You can enable or disable user access for individual projects. To completely remove a project or user from the server use the Delete Project
and Delete User buttons.
Firewall settings
The Unity Asset Server uses TCP port 10733. You might need to enable connections to this port in your firewall and/or router.
Advanced
The Asset Server is built using a modified version of PostgreSQL. Accessing the SQL database directly requires a bit of technical knowledge
about SQL and Unix/Linux command lines. User discretion is advised.
Backing up
We have provided a command line tool to back up an asset server. The tool should be run from an administrator account on the machine
running the asset server. Replace BACKUP_LOCATION with the path name you want the backup tool to place the backups:
Mac OS X
sudo /Library/UnityAssetServer/bin/as_backup BACKUP_LOCATION
Linux
sudo /opt/unity_asset_server/bin/as_backup BACKUP_LOCATION
Windows
"\Unity\AssetServer\bin\as_backup.cmd" BACKUP_LOCATION
as_backup will create a directory at BACKUP_LOCATION containing one or more files per project, plus files containing information about each project.
Restoring a Backup
To restore an Asset Server backup produced with as_backup, first perform a clean installation of the Asset Server without any projects created.
(The restore procedure will refuse to overwrite already existing projects with the same name.)
Then run the provided backup restoration tool, as_restore pointing it to the location of a backup created with as_backup:
Mac OS X
sudo /Library/UnityAssetServer/bin/as_restore BACKUP_LOCATION
Linux
sudo /opt/unity_asset_server/bin/as_restore BACKUP_LOCATION
Windows
"\Unity\AssetServer\bin\as_restore.cmd" BACKUP_LOCATION
Note that you can also use as_backup and as_restore to move an asset server installation from one machine to another by performing the
backup on the source machine, moving the backup directory to the destination machine (or mounting it through a network file share), and then
running as_restore to insert the data into the newly installed Asset Server instance. This will even work when the source and destination Asset
Servers have different versions or are running on different operating systems.
Mac OS X
/Library/UnityAssetServer/bin/psql -U admin -h localhost -d postgres -c 'select * from
all_databases__view'
Linux
/opt/unity_asset_server/bin/psql -U admin -h localhost -d postgres -c 'select * from all_databases__view'
Windows
"\Unity\AssetServer\bin\psql.exe" -U admin -h localhost -d postgres -c "select * from
all_databases__view"
This and other commands will prompt you for a password. Every time this happens, enter the admin password for the database, which was set
during the installation. The result will be a table that follows this basic layout:
Now you need to identify the "databasename" of the Project you want to back up. When a database is created, the default "databasename" is the same as the "projectname" shown inside Unity, but in lowercase and with spaces replaced by underscores.
Note that if your server hosts multiple PostgreSQL databases on different ports, you may need to explicitly provide the port used to connect to the Asset Server database. In this case add -p 10733 to the commands given (assuming you have used the default port of 10733 for your instance). For example:
Linux
/opt/unity_asset_server/bin/psql -U admin -p 10733 -h localhost -d postgres -c 'select * from all_databases__view'
Cache Server
Why should I be using the Cache Server?
The time it takes to import assets can be drastically reduced by caching the imported asset data on the Cache Server.
If any of the above change, the asset gets reimported, otherwise it gets downloaded from the Cache Server.
When you enable the cache server in the preferences, you can even share asset imports across multiple projects.
Note that once the cache server is set up, this process is completely automatic, which means there are no additional workflow requirements. It
will simply reduce the time it takes to import projects without getting in your way.
If you are hosting the Cache Server on your local machine, specify localhost for the server address. However, due to hard drive size limitations,
it is recommended that you host the Cache Server on a separate machine.
Purchase Cache Server (as part of the Team License) in the Online Store.
Download the Cache Server. Go to the Unity Team License page and click on the button to Download the Cache Server.
Unzip the file, after which you should see something like this:
The Cache Server needs to be on a reliable machine with very large storage (much larger than the size of the project itself, as there will be
multiple versions of imported resources stored). If the hard disk becomes full the Cache Server could perform slowly.
--path lets you specify a cache location, and --size lets you specify the maximum cache size in bytes.
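As a sketch, assuming the Cache Server is started with the launcher script shipped in the download (the RunOSX.command name below is an assumption; the script name differs per platform), a custom cache location and a 50 GB limit might be passed like this:
./RunOSX.command --path ~/AssetCache --size 53687091200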
One of the main distinctions between the Cache Server and version control is that its cached data can always be rebuilt locally. It is simply a
tool for improving performance. For this reason it doesn't make sense to use a Cache Server over the Internet. If you have a distributed team,
you should place a separate cache server in each location.
The cache server should run on a Linux or Mac OS X machine. The Windows file system is not particularly well optimized for how the Asset
Cache Server stores data and problems with file locking on Windows can cause issues that don't occur on Linux or Mac OS X.
Will the size of my Cache Server database grow indefinitely as more and more resources get imported and stored?
The Cache Server removes assets that have not been used for a period of time automatically (of course if those assets are needed again, they
will be re-created during next usage).
Does the cache server work only with the asset server?
The cache server is designed to be transparent to source/version control systems and so you are not restricted to using Unity's asset server.
When Unity is about to import an asset, it generates an MD5 hash of all source data.
If that hash is different from what is stored on the Cache Server, the asset will be reimported, otherwise the cached version will be downloaded.
The client Unity editor will only pull assets from the server as they are needed - assets don't get pushed to each project as they change.
It is also easy to use AssetPostprocessors to introduce dependencies. For example you might use data from a text file next to the asset to
add additional components to the imported game objects. This is not supported by the Cache Server. If you want to use the Cache Server, you will have to remove such dependencies on other assets in the project folder. Since the Cache Server doesn't know anything about the dependencies in your postprocessor, it will not know that anything has changed and will therefore use an old cached version of the asset.
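For illustration, a hypothetical postprocessor like the sketch below (placed in an Editor folder) reads a sidecar text file next to the model; the sidecar file and its use are assumptions for this example, and because the Cache Server only hashes the asset's own source data, a change to that text file alone would not trigger a reimport:
using UnityEngine;
using UnityEditor;
using System.IO;

public class SidecarPostprocessor : AssetPostprocessor {
    void OnPostprocessModel (GameObject root) {
        // Hypothetical hidden dependency: a .txt file stored next to the imported model
        string sidecar = Path.ChangeExtension(assetPath, ".txt");
        if (File.Exists(sidecar)) {
            // Configure the imported object from data the Cache Server knows nothing about
            root.name = File.ReadAllText(sidecar).Trim();
        }
    }
}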
In practice there are plenty of ways to structure asset postprocessing so that it works well with the Cache Server.
Are there any asset types which will not be cached by the server?
There are a few kinds of asset data which the server doesn't cache. There isn't really anything to be gained by caching script files and so the
server will ignore them. Also, native files used by 3D modelling software (Maya, 3D Max, etc) are converted to FBX using the application itself.
Currently, the Cache Server caches neither the native file nor the intermediate FBX file generated in the import process. However, it is possible
to benefit from the server by exporting files as FBX from the modelling software and adding those to the Unity project.
When you place an Asset such as a texture in the Assets folder, Unity will first detect that a new file has been added (the editor frequently
checks the contents of the Assets folder against the list of assets it already knows about). Once a unique ID value has been assigned to the
asset to enable it to be accessed internally, it will be imported and processed. The asset that you actually see in the Project panel is the result
of that processing and its data contents will typically be different to those of the original asset. For example, a texture may be present in the
Assets folder as a PNG file but will be converted to an internal format after import and processing.
Using an internal format for assets allows Unity to keep additional data known as metadata which enables the asset data to be handled in a
much more flexible way. For example, the Photoshop file format is convenient to work with, but you wouldn't expect it to support game engine
features such as mip maps. Unity's internal format, however, can add extra functionality like this to any asset type. All metadata for assets is
stored in the Library folder. As a user, you should never have to alter the Library folder manually, and attempting to do so may corrupt the
project.
Unity allows you to create folders in the Project view to help you organize assets, and those folders will be mirrored in the actual filesystem.
However, you must move the files within Unity by dragging and dropping in the Project view. If you attempt to use the filesystem/desktop to
move the files then Unity will misinterpret the change (it will appear that the old asset has been deleted and a new one created in its place).
This will lose information, such as links between assets and scripts in the project.
When backing up a project, you should always back up the main Unity project folder, containing both the Assets and Library folders. All the
information in the subfolders is crucial to the way Unity works.
Creating Gameplay
Unity empowers game designers to make games. What's really special about Unity is that you don't need years of experience with code or a
degree in art to make fun games. There are only a handful of basic workflow concepts to learn in Unity; once you understand them, you will find yourself making games in no time. With the time you save getting your games up and running, you will have that much more time to refine, balance,
and tweak your game to perfection.
This section will explain the core concepts you need to know for creating unique, amazing, and fun gameplay. The majority of these concepts
require you to write Scripts. For an overview of creating and working with Scripts, please read the Scripting page.
Instantiating Prefabs
By this point you should understand the concept of Prefabs at a fundamental level. They are a collection of predefined GameObjects &
Components that are re-usable throughout your game. If you don't know what a Prefab is, we recommend you read the Prefabs page for a
more basic introduction.
Prefabs come in very handy when you want to instantiate complicated GameObjects at runtime. The alternative to instantiating Prefabs is to
create GameObjects from scratch using code. Instantiating Prefabs has many advantages over the alternative approach:
You can instantiate a Prefab from one line of code, with complete functionality. Creating equivalent GameObjects from code takes an
average of five lines of code, and likely more.
You can set up, test, and modify the Prefab quickly and easily in the Scene and Inspector.
You can change the Prefab being instanced without changing the code that instantiates it. A simple rocket might be altered into a super-
charged rocket, and no code changes are required.
Common Scenarios
To illustrate the strength of Prefabs, let's consider some basic situations where they would come in handy:
1. Building a wall out of a single "brick" Prefab by creating it several times in different positions.
2. A rocket launcher instantiates a flying rocket Prefab when fired. The Prefab contains a Mesh, Rigidbody, Collider, and a child
GameObject with its own trail Particle System.
3. A robot exploding to many pieces. The complete, operational robot is destroyed and replaced with a wrecked robot Prefab. This Prefab
would consist of the robot split into many parts, all set up with Rigidbodies and Particle Systems of their own. This technique allows you
to blow up a robot into many pieces, with just one line of code, replacing one object with a Prefab.
Building a wall
This explanation will illustrate the advantages of using a Prefab vs creating objects from code.
// JavaScript
function Start () {
for (var y = 0; y < 5; y++) {
for (var x = 0; x < 5; x++) {
var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
cube.AddComponent(Rigidbody);
cube.transform.position = Vector3 (x, y, 0);
}
}
}
// C#
public class Instantiation : MonoBehaviour {
void Start() {
for (int y = 0; y < 5; y++) {
for (int x = 0; x < 5; x++) {
GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
cube.AddComponent<Rigidbody>();
cube.transform.position = new Vector3(x, y, 0);
}
}
}
}
To use the above script we simply save the script and drag it onto an empty GameObject.
Create an empty GameObject with GameObject->Create Empty.
If you execute that code, you will see an entire brick wall is created when you enter Play Mode. There are two lines relevant to the functionality
of each individual brick: the CreatePrimitive() line, and the AddComponent() line. Not so bad right now, but each of our bricks is un-textured.
Every additional action you want to perform on the brick, like changing the texture, the friction, or the Rigidbody mass, is an extra line.
If you create a Prefab and perform all your setup before-hand, you use one line of code to perform the creation and setup of each brick. This
relieves you from maintaining and changing a lot of code when you decide you want to make changes. With a Prefab, you just make your
changes and Play. No code alterations required.
If you're using a Prefab for each individual brick, this is the code you need to create the wall.
// JavaScript
var brick : Transform;
function Start () {
    for (var y = 0; y < 5; y++) {
        for (var x = 0; x < 5; x++) {
            Instantiate(brick, Vector3 (x, y, 0), Quaternion.identity);
        }
    }
}
// C#
public Transform brick;
void Start() {
for (int y = 0; y < 5; y++) {
for (int x = 0; x < 5; x++) {
Instantiate(brick, new Vector3(x, y, 0), Quaternion.identity);
}
}
}
This is not only very clean but also very reusable. There is nothing saying we are instantiating a cube or that it must contain a rigidbody. All of
this is defined in the Prefab and can be quickly created in the Editor.
Now we only need to create the Prefab, which we do in the Editor. Here's how:
We've created our Brick Prefab, so now we have to attach it to the brick variable in our script. Select the empty GameObject that contains the
script. Notice that a new variable has appeared in the Inspector, called "brick".
Now drag the "Brick" Prefab from the Project View onto the brick variable in the Inspector. Press Play and you'll see the wall built using the
Prefab.
This is a workflow pattern that can be used over and over again in Unity. In the beginning you might wonder why this is so much better,
because the script creating the cube from code is only 2 lines longer.
But because you are using a Prefab now, you can adjust the Prefab in seconds. Want to change the mass of all those instances? Adjust the
Rigidbody in the Prefab only once. Want to use a different Material for all the instances? Drag the Material onto the Prefab only once. Want to
change friction? Use a different Physic Material in the Prefab's collider. Want to add a Particle System to all those boxes? Add a child to the
Prefab only once.
1. A rocket launcher instantiates a rocket Prefab when the user presses fire. The Prefab contains a mesh, Rigidbody, Collider, and a child
GameObject that contains a trail particle system.
2. The rocket impacts and instantiates an explosion Prefab. The explosion Prefab contains a Particle System, a light that fades out over
time, and a script that applies damage to surrounding GameObjects.
While it would be possible to build a rocket GameObject completely from code, adding Components manually and setting properties, it is far
easier to instantiate a Prefab. You can instantiate the rocket in just one line of code, no matter how complex the rocket's Prefab is. After
instantiating the Prefab you can also modify any properties of the instantiated object (e.g. you can set the velocity of the rocket's Rigidbody).
Aside from being easier to use, you can update the prefab later on. So if you are building a rocket, you don't immediately have to add a Particle
trail to it. You can do that later. As soon as you add the trail as a child GameObject to the Prefab, all your instantiated rockets will have particle
trails. And lastly, you can quickly tweak the properties of the rocket Prefab in the Inspector, making it far easier to fine-tune your game.
This script shows how to launch a rocket using the Instantiate() function.
// JavaScript
function FireRocket () {
var rocketClone : Rigidbody = Instantiate(rocket, transform.position, transform.rotation);
rocketClone.velocity = transform.forward * speed;
// You can also access other components / scripts of the clone
rocketClone.GetComponent(MyRocketScript).DoSomething();
}
// C#
void FireRocket () {
Rigidbody rocketClone = (Rigidbody) Instantiate(rocket, transform.position, transform.rotation);
rocketClone.velocity = transform.forward * speed;
// You can also access other components / scripts of the clone
rocketClone.GetComponent<MyRocketScript>().DoSomething();
}
A far better approach is to immediately delete the entire character and replace it with an instantiated wrecked prefab. This gives you a lot of
flexibility. You could use a different material for the dead character, attach completely different scripts, spawn a Prefab containing the object
broken into many pieces to simulate a shattered enemy, or simply instantiate a Prefab containing a version of the character.
Any of these options can be achieved with a single call to Instantiate(), you just have to hook it up to the right prefab and you're set!
The important part to remember is that the wreck which you Instantiate() can be made of completely different objects than the original. For
example, if you have an airplane, you would model two versions: one where the plane consists of a single GameObject with a Mesh Renderer and scripts for airplane physics, and a wrecked version broken into separate parts. By keeping the intact model in just one GameObject, your game will run faster, since you can build the model with fewer triangles, and because it consists of fewer objects it will render faster than using many small parts. Also, while your plane is happily flying around there is no reason to have it in separate parts.
1. Model your airplane with lots of different parts in your favorite modeler
2. Create an empty Scene
3. Drag the model into the empty Scene
4. Add Rigidbodies to all parts, by selecting all the parts and choosing Component->Physics->Rigidbody
5. Add Box Colliders to all parts by selecting all the parts and choosing Component->Physics->Box Collider
6. For an extra special effect, add a smoke-like Particle System as a child GameObject to each of the parts
7. Now you have an airplane with multiple exploded parts, they fall to the ground by physics and will create a Particle trail due to the
attached particle system. Hit Play to preview how your model reacts and do any necessary tweaks.
8. Choose Assets->Create Prefab
9. Drag the root GameObject containing all the airplane parts into the Prefab
// JavaScript
// As an example, we turn the game object into a wreck after 3 seconds automatically
function Start () {
    yield WaitForSeconds(3);
    KillSelf();
}

function KillSelf () {
    // Kill ourselves
    Destroy(gameObject);
}

// C#
// As an example, we turn the game object into a wreck after 3 seconds automatically
IEnumerator Start() {
    yield return new WaitForSeconds(3);
    KillSelf();
}

void KillSelf () {
    // Kill ourselves
    Destroy(gameObject);
}
The First Person Shooter tutorial explains how to replace a character with a ragdoll version and also synchronize limbs with the last state of the
animation. You can find that tutorial on the Tutorials page.
1. Building an object completely from code. This is tedious! Entering values from a script is slow, unintuitive, and not worth the hassle.
2. Make the fully rigged object, duplicate it and place it multiple times in the scene. This is tedious, and placing objects accurately in a grid
is hard.
So use Instantiate() with a Prefab instead! We think you get the idea of why Prefabs are so useful in these scenarios. Here's the code
necessary for these scenarios:
// JavaScript
function Start () {
for (var i = 0; i < numberOfObjects; i++) {
var angle = i * Mathf.PI * 2 / numberOfObjects;
var pos = Vector3 (Mathf.Cos(angle), 0, Mathf.Sin(angle)) * radius;
Instantiate(prefab, pos, Quaternion.identity);
}
}
// C#
// Instantiates a prefab in a circle
void Start() {
for (int i = 0; i < numberOfObjects; i++) {
float angle = i * Mathf.PI * 2 / numberOfObjects;
Vector3 pos = new Vector3(Mathf.Cos(angle), 0, Mathf.Sin(angle)) * radius;
Instantiate(prefab, pos, Quaternion.identity);
}
}
// JavaScript
function Start () {
for (var y = 0; y < gridY; y++) {
for (var x = 0; x < gridX; x++) {
var pos = Vector3 (x, 0, y) * spacing;
Instantiate(prefab, pos, Quaternion.identity);
}
}
}
// C#
void Start() {
for (int y = 0; y < gridY; y++) {
for (int x = 0; x < gridX; x++) {
Vector3 pos = new Vector3(x, 0, y) * spacing;
Instantiate(prefab, pos, Quaternion.identity);
}
}
}
Input
Desktop
Virtual axes and buttons can be created in the Input Manager, and end users can configure Keyboard input in a nice screen configuration
dialog.
You can setup joysticks, gamepads, keyboard, and mouse, then access them all through one simple scripting interface.
Every project has the following default input axes when it's created:
Fire1, Fire2, Fire3 are mapped to Control, Option (Alt), and Command, respectively.
Mouse X and Mouse Y are mapped to the delta of mouse movement.
Window Shake X and Window Shake Y are mapped to the movement of the window.
You can map each axis to two buttons on a joystick, mouse, or keyboard.
Name The name of the string used to check this axis from a script.
Descriptive Name Positive value name displayed in the input tab of the Configuration dialog for standalone builds.
Descriptive Negative Name Negative value name displayed in the Input tab of the Configuration dialog for standalone builds.
Negative Button The button used to push the axis in the negative direction.
Positive Button The button used to push the axis in the positive direction.
Alt Negative Button Alternative button used to push the axis in the negative direction.
Alt Positive Button Alternative button used to push the axis in the positive direction.
Gravity Speed in units per second that the axis falls toward neutral when no buttons are pressed.
Dead Size of the analog dead zone. All analog device values within this range map to neutral.
Sensitivity Speed in units per second that the axis will move toward the target value. This is for digital devices only.
Snap If enabled, the axis value will reset to zero when pressing a button of the opposite direction.
Invert If enabled, the Negative Buttons provide a positive value, and vice-versa.
Type The type of inputs that will control this axis.
Axis The axis of a connected device that will control this axis.
Joy Num The connected Joystick that will control this axis.
Use these settings to fine tune the look and feel of input. They are all documented with tooltips in the Editor as well.
An axis has a value between -1 and 1. The neutral position is 0. This is the case for joystick input and keyboard input.
However, Mouse Delta and Window Shake Delta are how much the mouse or window moved during the last frame. This means it can be larger
than 1 or smaller than -1 when the user moves the mouse quickly.
It is possible to create multiple axes with the same name. When getting the input axis, the axis with the largest absolute value will be returned.
This makes it possible to assign more than one input device to one axis name. For example, create one axis for keyboard input and one axis
for joystick input with the same name. If the user is using the joystick, input will come from the joystick, otherwise input will come from the
keyboard. This way you don't have to consider where the input comes from when writing scripts.
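As a minimal C# sketch of reading such an axis by name (the default "Horizontal" axis is assumed here), the value returned already reflects whichever device currently has the largest absolute value:
using UnityEngine;

public class AxisReader : MonoBehaviour {
    public float speed = 10.0f;

    void Update () {
        // Returns a value between -1 and 1 from whichever device is driving the "Horizontal" axis
        float h = Input.GetAxis("Horizontal");
        // Move the object along its local X axis, frame rate independently
        transform.Translate(Vector3.right * h * speed * Time.deltaTime);
    }
}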
Button Names
To map a key to an axis, you have to enter the key's name in the Positive Button or Negative Button property in the Inspector.
The names used to identify the keys are the same in the scripting interface and the Inspector.
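The same names can be queried directly from a script; a small sketch, using the key name "space" and the default "Fire1" virtual button as examples:
using UnityEngine;

public class ButtonReader : MonoBehaviour {
    void Update () {
        // Query a key by the same name you would type into the Positive Button field
        if (Input.GetKeyDown("space"))
            Debug.Log("space key was pressed");

        // Virtual buttons defined in the Input Manager are read by their axis name
        if (Input.GetButtonDown("Fire1"))
            Debug.Log("Fire1 was pressed");
    }
}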
Mobile Input
On iOS and Android, the Input class offers access to touchscreen, accelerometer and geographical/location input.
Multi-Touch Screen
The iPhone and iPod Touch devices are capable of tracking up to five fingers touching the screen simultaneously. You can retrieve the status
of each finger touching the screen during the last frame by accessing the Input.touches property array.
Android devices don't have a unified limit on how many fingers they track. Instead, it varies from device to device and can be anything from two-
touch on older devices to five fingers on some newer devices.
Following is an example script which will shoot a ray whenever the user taps on the screen:
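A minimal C# sketch of that idea, casting a ray from the main camera through every touch that has just begun, might look like this:
using UnityEngine;

public class TapRaycaster : MonoBehaviour {
    void Update () {
        // Examine every finger touching the screen during the last frame
        foreach (Touch touch in Input.touches) {
            if (touch.phase == TouchPhase.Began) {
                // Construct a ray from the current touch coordinates
                Ray ray = Camera.main.ScreenPointToRay(touch.position);
                if (Physics.Raycast(ray))
                    Debug.Log("Hit something under the tap");
            }
        }
    }
}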
Mouse Simulation
On top of native touch support Unity iOS/Android provides a mouse simulation. You can use mouse functionality from the standard Input class.
Accelerometer
As the mobile device moves, a built-in accelerometer reports linear acceleration changes along the three primary axes in three-dimensional
space. Acceleration along each axis is reported directly by the hardware as G-force values. A value of 1.0 represents a load of about +1g along
a given axis while a value of -1.0 represents -1g. If you hold the device upright (with the home button at the bottom) in front of you, the X axis is
positive along the right, the Y axis is positive directly up, and the Z axis is positive pointing toward you.
You can retrieve the accelerometer value by accessing the Input.acceleration property.
The following is an example script which will move an object using the accelerometer:
// Example: move the object using the accelerometer (device held parallel to the ground)
var speed = 10.0;

function Update () {
    var dir : Vector3 = Vector3.zero;

    // Remap the device's acceleration axes into game coordinates:
    // the device's XY plane is mapped onto the game's XZ plane
    dir.x = -Input.acceleration.y;
    dir.z = Input.acceleration.x;

    // Clamp the acceleration vector to the unit sphere
    if (dir.sqrMagnitude > 1)
        dir.Normalize();

    // Make the movement frame rate independent
    dir *= Time.deltaTime;

    // Move object
    transform.Translate (dir * speed);
}
Low-Pass Filter
Accelerometer readings can be jerky and noisy. Applying low-pass filtering on the signal allows you to smooth it and get rid of high frequency
noise.
The following script shows you how to apply low-pass filtering to accelerometer readings:
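A minimal C# sketch of such a filter, assuming an accelerometer sampling interval of 1/60 second and using the LowPassKernelWidthInSeconds value discussed below:
using UnityEngine;

public class AccelerationFilter : MonoBehaviour {
    // Assumed accelerometer sampling interval
    public float accelerometerUpdateInterval = 1.0f / 60.0f;
    // The greater this value, the slower the filtered value converges on the live reading
    public float LowPassKernelWidthInSeconds = 1.0f;

    private float lowPassFilterFactor;
    private Vector3 lowPassValue = Vector3.zero;

    void Start () {
        lowPassFilterFactor = accelerometerUpdateInterval / LowPassKernelWidthInSeconds;
        lowPassValue = Input.acceleration;
    }

    // Returns a smoothed accelerometer reading
    public Vector3 LowPassFilterAccelerometer () {
        lowPassValue = Vector3.Lerp(lowPassValue, Input.acceleration, lowPassFilterFactor);
        return lowPassValue;
    }
}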
The greater the value of LowPassKernelWidthInSeconds, the slower the filtered value will converge towards the current input sample (and
vice versa).
I'd like as much precision as possible when reading the accelerometer. What should I do?
Reading the Input.acceleration variable does not equal sampling the hardware. Put simply, Unity samples the hardware at a frequency of 60Hz
and stores the result into the variable. In reality, things are a little bit more complicated -- accelerometer sampling doesn't occur at consistent time intervals if the CPU is under significant load. As a result, the system might report two samples during one frame, then one sample during the next frame.
You can access all measurements executed by the accelerometer during the frame. The following code illustrates a simple average of all the
accelerometer events that were collected within the last frame:
// Example helper: averages the accelerometer events reported during the last frame, weighted by their duration
function GetAccelerometerValue () : Vector3 {
    var period : float = 0.0;
    var acc : Vector3 = Vector3.zero;

    for (var evnt : AccelerationEvent in Input.accelerationEvents) {
        acc += evnt.acceleration * evnt.deltaTime;
        period += evnt.deltaTime;
    }

    if (period > 0)
        acc *= 1.0/period;
    return acc;
}
Further Reading
The Unity mobile input API is originally based on Apple's API. It may help to learn more about the native API to better understand Unity's Input
API. You can find the Apple input API documentation here:
Note: The above links reference your locally installed iPhone SDK Reference Documentation and will contain native ObjectiveC code. It is not
necessary to understand these documents for using Unity on mobile devices, but may be helpful to some!
Transforms
Transforms are a key Component in every GameObject. They dictate where the GameObject is positioned, how it is rotated, and its scale. It
is impossible to have a GameObject without a Transform. You can adjust the Transform of any GameObject from the Scene View, the
Inspector, or through Scripting.
The remainder of this page's text is from the Transform Component Reference page.
Transform
The Transform Component determines the Position, Rotation, and Scale of each object in the scene. Every object has a Transform.
The Transform Component is editable in the Scene View and in the Inspector
Properties
Position Position of the Transform in X, Y, and Z coordinates.
Rotation Rotation of the Transform around the X, Y, and Z axes, measured in degrees.
Scale Scale of the Transform along X, Y, and Z axes. Value "1" is the original size (size at which the object was
imported).
All properties of a Transform are measured relative to the Transform's parent (see below for further details). If the Transform has no parent, the properties are measured in world space.
Using Transforms
Transforms are always manipulated in 3D space in the X, Y, and Z axes. In Unity, these axes are represented by the colors red, green, and
blue respectively. Remember: XYZ = RGB.
Transforms can be directly manipulated in the Scene View or by editing properties in the Inspector. In the scene, you can modify Transforms
using the Move, Rotate and Scale tools. These tools are located in the upper left-hand corner of the Unity Editor.
The tools can be used on any object in the scene. When you click on an object, you will see the tool gizmo appear within it. The appearance of
the gizmo depends on which tool is selected.
When you click and drag on one of the three gizmo axes, you will notice that its color changes. As you drag the mouse, you will see the object
translate, rotate, or scale along the selected axis. When you release the mouse button, the axis remains selected. You can click the middle
mouse button and drag the mouse to manipulate the Transform along the selected axis.
Around the centre of the Transform gizmo are three coloured squares. These allow you to drag the Transform in a single plane (ie, the object
will move in two axes but be held still in the third axis).
Parenting
Parenting is one of the most important concepts to understand when using Unity. When a GameObject is a Parent of another GameObject, the
Child GameObject will move, rotate, and scale exactly as its Parent does. Just like your arms are attached to your body, when you turn your
body, your arms move because they're attached. Any object can have multiple children, but only one parent.
You can create a Parent by dragging any GameObject in the Hierarchy View onto another. This will create a Parent-Child relationship between
the two GameObjects.
Example of a Parent-Child hierarchy. GameObjects with foldout arrows to the left of their names are parents.
In the above example, we say that the arms are parented to the body and the hands are parented to the arms. The scenes you make in Unity
will contain collections of these Transform hierarchies. The topmost parent object is called the Root object. When you move, scale or rotate
a parent, all the changes in its Transform are applied to its children as well.
It is worth pointing out that the Transform values in the Inspector of any Child GameObject are displayed relative to the Parent's Transform
values. These are also called the Local Coordinates. Through scripting, you can access the Global Coordinates as well as the local
coordinates.
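For example, a short C# sketch contrasting the two sets of coordinates:
using UnityEngine;

public class CoordinateExample : MonoBehaviour {
    void Start () {
        // Global (world space) position of this GameObject
        Debug.Log("World position: " + transform.position);
        // Position relative to the parent Transform -- the value the Inspector displays
        Debug.Log("Local position: " + transform.localPosition);
    }
}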
You can build compound objects by parenting several separate objects together, for example, the skeletal structure of a human ragdoll. You
can also achieve useful effects with simple hierarchies. For example, if you have a horror game that takes place at night, you can create an
effective atmosphere with a flashlight. To create this object, you would parent a spotlight Transform to the flashlight Transform. Then, any
alteration of the flashlight Transform will affect the spotlight, creating a convincing flashlight effect.
Non-uniform scaling has a negative impact on rendering performance. In order to transform vertex normals correctly, we transform the mesh on
the CPU and create an extra copy of the data. Normally we can keep the mesh shared between instances in graphics memory, but in this case
you pay both a CPU and memory cost per instance.
There are also certain limitations in how Unity handles non-uniform scaling:
Certain components do not fully support non-uniform scaling. For example, for components with a radius property or similar, such as a
Sphere Collider, Capsule Collider, Light, Audio Source etc., the shape will never become elliptical but remain circular/spherical
regardless of non-uniform scaling.
A child object that has a non-uniformly scaled parent and is rotated relative to that parent may have a non-orthogonal matrix, meaning that
it may appear skewed. Some components that do support simple non-uniform scaling still do not support non-orthogonal matrices. For
example, a Box Collider cannot be skewed so if its transform is non-orthogonal, the Box Collider will not match the shape of the rendered
mesh accurately.
For performance reasons, a child object that has a non-uniformly scaled parent will not have its scale/matrix automatically updated while
rotating. This may result in popping of the scale once the scale is updated, for example if the object is detached from its parent.
Importance of Scale
The scale of the Transform determines the difference between the size of your mesh in your modeling application and the size of your mesh in
Unity. The mesh's size in Unity (and therefore the Transform's scale) is very important, especially during physics simulation. There are three
factors that can affect the scale of your object: the size of your mesh in your 3D modeling application, the scale set in the Import Settings for your mesh, and the Scale value of your Transform Component.
Ideally, you should not adjust the Scale of your object in the Transform Component. The best option is to create your models at real-life scale
so you won't have to change your Transform's scale. The next best option is to adjust the scale at which your mesh is imported in the Import
Settings for your individual mesh. Certain optimizations occur based on the import size, and instantiating an object that has an adjusted scale
value can decrease performance. For more information, see the section about optimizing scale on the Rigidbody component reference page.
Hints
When parenting Transforms, set the parent's location to <0,0,0> before adding the child. This will save you many headaches later.
Particle Systems are not affected by the Transform's Scale. In order to scale a Particle System, you need to modify the properties in the
System's Particle Emitter, Animator and Renderer.
If you are using Rigidbodies for physics simulation, there is some important information about the Scale property on the Rigidbody
component reference page.
You can change the colors of the Transform axes (and other UI elements) from the preferences (Menu: Unity > Preferences and then
select the Colors & keys panel).
It is best to avoid scaling within Unity if possible. Try to have the scales of your object finalized in your 3D modeling application, or in the
Import Settings of your mesh.
Page last updated: 2007-11-16
Physics
Unity has the NVIDIA PhysX physics engine built in. This allows for unique emergent behaviour and has many useful features.
Basics
To put an object under physics control, simply add a Rigidbody to it. When you do this, the object will be affected by gravity, and can collide
with other objects in the world.
Rigidbodies
Rigidbodies are physically simulated objects. You use Rigidbodies for things that the player can push around, for example crates or loose
objects, or you can move Rigidbodies around directly by adding forces to them via scripting.
If you move the Transform of a non-Kinematic Rigidbody directly it may not collide correctly with other objects. Instead you should move a
Rigidbody by applying forces and torque to it. You can also add Joints to rigidbodies to make the behavior more complex. For example, you
could make a physical door or a crane with a swinging chain.
You also use Rigidbodies to bring vehicles to life, for example you can make cars using a Rigidbody, 4 Wheel Colliders and a script applying
wheel forces based on the user's Input.
You can make airplanes by applying forces to the Rigidbody from a script. Or you can create special vehicles or robots by adding various Joints
and applying forces via scripting.
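As a C# sketch, applying a continuous forward force from a script (the thrust value is an arbitrary example) looks like this; forces are normally applied in FixedUpdate so they stay in step with the physics simulation:
using UnityEngine;

public class ThrustExample : MonoBehaviour {
    public float thrust = 10.0f;
    private Rigidbody body;

    void Start () {
        body = GetComponent<Rigidbody>();
    }

    void FixedUpdate () {
        // Push the Rigidbody forward each physics step instead of moving its Transform directly
        body.AddForce(transform.forward * thrust);
    }
}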
Tips:
Kinematic Rigidbodies
A Kinematic Rigidbody is a Rigidbody that has the isKinematic option enabled. Kinematic Rigidbodies are not affected by forces, gravity or
collisions. They are driven explicitly by setting the position and rotation of the Transform or animating them, yet they can interact with other non-
Kinematic Rigidbodies.
Kinematic Rigidbodies correctly wake up other Rigidbodies when they collide with them, and they apply friction to Rigidbodies placed on top of
them.
1. Sometimes you want an object to be under physics control but in another situation to be controlled explicitly from a script or animation.
For example you could make an animated character whose bones have Rigidbodies attached that are connected with joints for use as a
Ragdoll. Most of the time the character is under animation control, thus you make the Rigidbody Kinematic. But when he gets hit you
want him to turn into a Ragdoll and be affected by physics. To accomplish this, you simply disable the isKinematic property.
2. Sometimes you want a moving object that can push other objects yet not be pushed itself. For example if you have an animated
platform and you want to place some Rigidbody boxes on top, you should make the platform a Kinematic Rigidbody instead of just a
Collider without a Rigidbody.
3. You might want to have a Kinematic Rigidbody that is animated and have a real Rigidbody follow it using one of the available Joints.
Static Colliders
A Static Collider is a GameObject that has a Collider but not a Rigidbody. Static Colliders are used for level geometry which always stays at
the same place and never moves around. You can add a Mesh Collider to your already existing graphical meshes (even better, use the Generate
Colliders check box in the Import Settings), or you can use one of the other Collider types.
You should never move a Static Collider on a frame by frame basis. Moving Static Colliders will cause an internal recomputation in PhysX that
is quite expensive and which will result in a big drop in performance. On top of that the behaviour of waking up other Rigidbodies based on a
Static Collider is undefined, and moving Static Colliders will not apply friction to Rigidbodies that touch them. Instead, Colliders that move should
always be Kinematic Rigidbodies.
Character Controllers
You use Character Controllers if you want to make a humanoid character. This could be the main character in a third person platformer, an FPS,
or any enemy character.
These Controllers don't follow the rules of physics, since strictly realistic physics would not feel right (in Doom you run at 90 miles per hour, come
to a halt in one frame and turn on a dime). Instead, a Character Controller performs collision detection to make sure your characters can slide
along walls, walk up and down stairs, and so on.
Character Controllers are not affected by forces but they can push Rigidbodies by applying forces to them from a script. Usually, all humanoid
characters are implemented using Character Controllers.
Character Controllers are inherently unphysical, so if you want to apply real physics to your character (swinging on ropes, being pushed by big
rocks) you have to use a Rigidbody; this will let you use joints and forces on your character. Character Controllers are always aligned along
the Y axis, so you also need to use a Rigidbody if your character needs to be able to change orientation in space (for example under a
changing gravity). However, be aware that tuning a Rigidbody to feel right for a character is hard due to the unphysical way in which game
characters are expected to behave. Another difference is that Character Controllers can slide smoothly over steps of a specified height, while
Rigidbodies will not.
If you parent a Character Controller with a Rigidbody you will get a "Joint" like behavior.
Rigidbody
Rigidbodies enable your GameObjects to act under the control of physics. The Rigidbody can receive forces and torque to make your objects
move in a realistic way. Any GameObject must contain a Rigidbody to be influenced by gravity, act under added forces via scripting, or interact
with other objects through the NVIDIA PhysX physics engine.
Properties
Mass The mass of the object (arbitrary units). As a rule of thumb, the masses of interacting Rigidbodies should not differ by more than a
factor of about 100.
Drag How much air resistance affects the object when moving from forces. 0 means no air resistance, and infinity
makes the object stop moving immediately.
Angular Drag How much air resistance affects the object when rotating from torque. 0 means no air resistance. Note that setting
it to infinity will not make the object stop rotating immediately.
Use Gravity If enabled, the object is affected by gravity.
Is Kinematic If enabled, the object will not be driven by the physics engine, and can only be manipulated by its Transform. This
is useful for moving platforms or if you want to animate a Rigidbody that has a HingeJoint attached.
Interpolate Try one of the options only if you are seeing jerkiness in your Rigidbody's movement.
None No Interpolation is applied.
Interpolate Transform is smoothed based on the Transform of the previous frame.
Extrapolate Transform is smoothed based on the estimated Transform of the next frame.
Collision Detection Used to prevent fast moving objects from passing through other objects without detecting collisions.
Discrete Use Discrete collision detection against all other colliders in the scene. Other colliders will use Discrete collision
detection when testing for collision against it. Used for normal collisions (this is the default value).
Continuous Use Discrete collision detection against dynamic colliders (with a Rigidbody) and continuous collision detection
against static MeshColliders (without a Rigidbody). Rigidbodies set to Continuous Dynamic will use continuous
collision detection when testing for collision against this Rigidbody. Other Rigidbodies will use Discrete collision
detection. Used for objects which the Continuous Dynamic detection needs to collide with. (This has a big impact
on physics performance; leave it set to Discrete unless you have problems with collisions of fast-moving objects.)
Continuous Dynamic Use continuous collision detection against objects set to Continuous and Continuous Dynamic collision detection. It will also
use continuous collision detection against static MeshColliders (without a Rigidbody). For all other colliders it uses
Discrete collision detection. Used for fast moving objects.
Constraints Restrictions on the Rigidbody's motion:-
Freeze Position Stops the Rigidbody moving in the world X, Y and Z axes selectively.
Freeze Rotation Stops the Rigidbody rotating around the world X, Y and Z axes selectively.
Details
Rigidbodies allow your GameObjects to act under control of the physics engine. This opens the gateway to realistic collisions, varied types of
joints, and other very cool behaviors. Manipulating your GameObjects by adding forces to a Rigidbody creates a very different feel and look
than adjusting the Transform Component directly. Generally, you shouldn't manipulate the Rigidbody and the Transform of the same
GameObject - only one or the other.
The biggest difference between manipulating the Transform versus the Rigidbody is the use of forces. Rigidbodies can receive forces and
torque, but Transforms cannot. Transforms can be translated and rotated, but this is not the same as using physics. You'll notice the distinct
difference when you try it for yourself. Adding forces/torque to the Rigidbody will actually change the position and rotation of the object's
Transform component. This is why you should only be using one or the other. Changing the Transform while using physics could cause
problems with collisions and other calculations.
Rigidbodies must be explicitly added to your GameObject before they will be affected by the physics engine. You can add a Rigidbody to your
selected object from Components->Physics->Rigidbody in the menubar. Now your object is physics-ready; it will fall under gravity and can
receive forces via scripting, but you may need to add a Collider or a Joint to get it to behave exactly how you want.
Parenting
When an object is under physics control, it moves semi-independently of the way its transform parents move. If you move any parents, they will
pull the Rigidbody child along with them. However, the Rigidbodies will still fall down due to gravity and react to collision events.
Scripting
To control your Rigidbodies, you will primarily use scripts to add forces or torque. You do this by calling AddForce() and AddTorque() on the
object's Rigidbody. Remember that you shouldn't be directly altering the object's Transform when you are using physics.
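For illustration, a minimal C# sketch of pushing a Rigidbody forward from a script (the force value and script name are only examples):

    using UnityEngine;

    public class PushForward : MonoBehaviour
    {
        public float force = 10.0f;   // illustrative value

        void FixedUpdate()
        {
            // Apply forces during the physics update so they stay in step with the simulation.
            Rigidbody rb = GetComponent<Rigidbody>();
            rb.AddForce(transform.forward * force);
            // rb.AddTorque(Vector3.up * force);   // would spin the object around its Y axis instead
        }
    }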
Animation
For some situations, mainly creating ragdoll effects, it is necessary to switch control of the object between animations and physics. For this
purpose Rigidbodies can be marked isKinematic. While the Rigidbody is marked isKinematic, it will not be affected by collisions, forces, or
any other part of PhysX. This means that you will have to control the object by manipulating the Transform component directly. Kinematic
Rigidbodies will affect other objects, but they themselves will not be affected by physics. For example, Joints which are attached to Kinematic
objects will constrain any other Rigidbodies attached to them and Kinematic Rigidbodies will affect other Rigidbodies through collisions.
Colliders
Colliders are another kind of component that must be added alongside the Rigidbody in order to allow collisions to occur. If two Rigidbodies
bump into each other, the physics engine will not calculate a collision unless both objects also have a Collider attached. Collider-less
Rigidbodies will simply pass through each other during physics simulation.
Add a Collider with the Component->Physics menu. View the Component Reference page of any individual Collider for more specific
information:
Compound Colliders
Compound Colliders are combinations of primitive Colliders, collectively acting as a single Collider. They come in handy when you have a
complex mesh to use in collisions but cannot use a Mesh Collider. To create a Compound Collider, create child objects of your colliding object,
then add a primitive Collider to each child object. This allows you to position, rotate, and scale each Collider easily and independently of one
another.
In the above picture, the Gun Model GameObject has a Rigidbody attached, and multiple primitive Colliders as child GameObjects. When the
Rigidbody parent is moved around by forces, the child Colliders move along with it. The primitive Colliders will collide with the environment's
Mesh Collider, and the parent Rigidbody will alter the way it moves based on forces being applied to it and how its child Colliders interact with
other Colliders in the Scene.
Mesh Colliders can't normally collide with each other. If a Mesh Collider is marked as Convex, then it can collide with another Mesh Collider.
The typical solution is to use primitive Colliders for any objects that move, and Mesh Colliders for static background objects.
Continuous Collision Detection
Continuous collision detection is intended as a safety net to catch collisions in cases where objects would otherwise pass through each other, but it
will not deliver physically accurate collision results, so you might still consider decreasing the fixed Time step value in the TimeManager inspector
to make the simulation more precise, if you run into problems with fast moving objects.
If you are modeling a human, make sure the model is around 2 meters tall in Unity. To check if your object has the right size, compare it to the default
cube. You can create a cube using GameObject->Create Other->Cube. The cube's height will be exactly 1 meter, so your human should be
twice as tall.
If you aren't able to adjust the mesh itself, you can change the uniform scale of a particular mesh asset by selecting it in Project View and
choosing Assets->Import Settings... from the menubar. Here, you can change the scale and re-import your mesh.
If your game requires that your GameObject needs to be instantiated at different scales, it is okay to adjust the values of your Transform's scale
axes. The downside is that the physics simulation must do more work at the time the object is instantiated, and could cause a performance drop
in your game. This isn't a terrible loss, but it is not as efficient as finalizing your scale with the other two options. Also keep in mind that non-
uniform scales can create undesirable behaviors when Parenting is used. For these reasons it is always optimal to create your object at the
correct scale in your modeling application.
Hints
The relative Mass of two Rigidbodies determines how they react when they collide with each other.
Making one Rigidbody have greater Mass than another does not make it fall faster in free fall. Use Drag for that.
A low Drag value makes an object seem heavy. A high one makes it seem light. Typical values for Drag are between .001 (solid block of
metal) and 10 (feather).
If you are directly manipulating the Transform component of your object but still want physics, attach a Rigidbody and make it Kinematic.
If you are moving a GameObject through its Transform component but you want to receive Collision/Trigger messages, you must attach a
Rigidbody to the object that is moving.
Constant Force
Constant Force is a quick utility for adding constant forces to a Rigidbody. This works great for one-shot objects like rockets, if you don't want
them to start with a large velocity but instead accelerate.
Properties
Force The vector of a force to be applied in world space.
Relative Force The vector of a force to be applied in the object's local space.
Torque The vector of a torque, applied in world space. The object will begin spinning around this vector. The longer the
vector is, the faster the rotation.
Relative Torque The vector of a torque, applied in local space. The object will begin spinning around this vector. The longer the
vector is, the faster the rotation.
Details
To make a rocket that accelerates forward set the Relative Force to be along the positive z-axis. Then use the Rigidbody's Drag property to
make it not exceed some maximum velocity (the higher the drag the lower the maximum velocity will be). In the Rigidbody, also make sure to
turn off gravity so that the rocket will always stay on its path.
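As a rough C# sketch of setting this up from a script (the thrust and drag values are only illustrative):

    using UnityEngine;

    [RequireComponent(typeof(Rigidbody), typeof(ConstantForce))]
    public class RocketSetup : MonoBehaviour
    {
        public float thrust = 20.0f;   // illustrative value

        void Start()
        {
            // Accelerate along the rocket's local forward (positive Z) axis.
            GetComponent<ConstantForce>().relativeForce = Vector3.forward * thrust;

            Rigidbody rb = GetComponent<Rigidbody>();
            rb.useGravity = false;   // keep the rocket on its path
            rb.drag = 0.5f;          // drag caps the maximum velocity; illustrative value
        }
    }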
Hints
To make an object float upwards, add a Constant Force with the Force property having a positive Y value.
To make an object fly forwards, add a Constant Force with the Relative Force property having a positive Z value.
Sphere Collider
The Sphere Collider is a basic sphere-shaped collision primitive.
Properties
Is Trigger If enabled, this Collider is used for triggering events, and is ignored by the physics engine.
Material Reference to the Physics Material that determines how this Collider interacts with others.
Radius The size of the Collider.
Center The position of the Collider in the object's local space.
Details
The Sphere Collider can be resized to uniform scale, but not along individual axes. It works great for falling boulders, ping pong balls, marbles,
etc.
Colliders work with Rigidbodies to bring physics in Unity to life. Whereas Rigidbodies allow objects to be controlled by physics, Colliders allow
objects to collide with each other. Colliders must be added to objects independently of Rigidbodies. A Collider does not necessarily need a
Rigidbody attached, but a Rigidbody must be attached in order for the object to move as a result of collisions.
When a collision between two Colliders occurs and if at least one of them has a Rigidbody attached, three collision messages are sent out to
the objects attached to them. These events can be handled in scripting, and allow you to create unique behaviors with or without making use of
the built-in NVIDIA PhysX engine.
Triggers
An alternative way of using Colliders is to mark them as a Trigger, just check the IsTrigger property checkbox in the Inspector. Triggers are
effectively ignored by the physics engine, and have a unique set of three trigger messages that are sent out when a collision with a Trigger
occurs. Triggers are useful for triggering other events in your game, like cutscenes, automatic door opening, displaying tutorial messages, etc.
Use your imagination!
Be aware that in order for two Triggers to send out trigger events when they collide, one of them must include a Rigidbody as well. For a
Trigger to collide with a normal Collider, one of them must have a Rigidbody attached. For a detailed chart of different types of collisions, see
the collision action matrix in the Advanced section below.
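For illustration, a minimal C# sketch of handling these messages (the log text is just an example):

    using UnityEngine;

    public class CollisionMessages : MonoBehaviour
    {
        // Called when another Collider starts touching this one (at least one Rigidbody must be involved).
        void OnCollisionEnter(Collision collision)
        {
            Debug.Log("Hit by " + collision.gameObject.name);
        }

        // Called when another Collider enters this Collider marked as a Trigger.
        void OnTriggerEnter(Collider other)
        {
            Debug.Log(other.name + " entered the trigger");
        }
    }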
Hints
To add multiple Colliders for an object, create child GameObjects and attach a Collider to each one. This allows each Collider to be
manipulated independently.
You can look at the gizmos in the Scene View to see how the Collider is being calculated on your object.
Colliders do their best to match the scale of an object. If you have a non-uniform scale (a scale which is different in each direction), only the
Mesh Collider can match completely.
If you are moving an object through its Transform component but you want to receive Collision/Trigger messages, you must attach a
Rigidbody to the object that is moving.
If you make an explosion, it can be very effective to add a rigidbody with lots of drag and a sphere collider to it in order to push it out a bit
from the wall it hits.
Advanced
Collider combinations
There are numerous different combinations of collisions that can happen in Unity. Each game is unique, and different combinations may work
better for different types of games. If you're using physics in your game, it will be very helpful to understand the different basic Collider types,
their common uses, and how they interact with other types of objects.
Static Collider
These are GameObjects that do not have a Rigidbody attached, but do have a Collider attached. These objects should remain still, or move
very little. These work great for your environment geometry. They will not move if a Rigidbody collides with them.
Rigidbody Collider
These GameObjects contain both a Rigidbody and a Collider. They are completely affected by the physics engine through scripted forces and
collisions. They might collide with a GameObject that only contains a Collider. These will likely be your primary type of Collider in games that
use physics.
Kinematic Rigidbody Collider
This type of object can be used for circumstances in which you would normally want a Static Collider to send a trigger event. Since a Trigger must
have a Rigidbody attached, you should add a Rigidbody, then enable IsKinematic. This will prevent your object from moving from physics
influence, and allow you to receive trigger events when you want to.
Kinematic Rigidbodies can easily be turned on and off. This is great for creating ragdolls, when you normally want a character to follow an
animation, then turn into a ragdoll when a collision occurs, prompted by an explosion or anything else you choose. When this happens, simply
turn all your Kinematic Rigidbodies into normal Rigidbodies through scripting.
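A minimal C# sketch of that switch might look like this (the method name and where you call it from are up to you):

    using UnityEngine;

    public class RagdollSwitch : MonoBehaviour
    {
        // Call this (for example from a damage script) to hand the bones over to physics.
        public void ActivateRagdoll()
        {
            foreach (Rigidbody rb in GetComponentsInChildren<Rigidbody>())
            {
                rb.isKinematic = false;   // the bones are now driven by the physics engine
            }
        }
    }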
If you have Rigidbodies come to rest so they are not moving for some time, they will "fall asleep". That is, they will not be calculated during the
physics update since they are not going anywhere. If you move a Kinematic Rigidbody out from underneath normal Rigidbodies that are at rest
on top of it, the sleeping Rigidbodies will "wake up" and be correctly calculated again in the physics update. So if you have a lot of Static
Colliders that you want to move around and have other objects fall on them correctly, use Kinematic Rigidbody Colliders.
Box Collider
The Box Collider is a basic cube-shaped collision primitive.
Properties
Is Trigger If enabled, this Collider is used for triggering events, and is ignored by the physics
engine.
Material Reference to the Physics Material that determines how this Collider interacts with others.
Details
The Box Collider can be resized into different shapes of rectangular prisms. It works great for doors, walls, platforms, etc. It is also effective as a
human torso in a ragdoll or as a car hull in a vehicle. Of course, it works perfectly for just boxes and crates as well!
Colliders work with Rigidbodies to bring physics in Unity to life. Whereas Rigidbodies allow objects to be controlled by physics, Colliders allow
objects to collide with each other. Colliders must be added to objects independently of Rigidbodies. A Collider does not necessarily need a
Rigidbody attached, but a Rigidbody must be attached in order for the object to move as a result of collisions.
When a collision between two Colliders occurs and if at least one of them has a Rigidbody attached, three collision messages are sent out to
the objects attached to them. These events can be handled in scripting, and allow you to create unique behaviors with or without making use of
the built-in NVIDIA PhysX engine.
Triggers
An alternative way of using Colliders is to mark them as a Trigger, just check the IsTrigger property checkbox in the Inspector. Triggers are
effectively ignored by the physics engine, and have a unique set of three trigger messages that are sent out when a collision with a Trigger
occurs. Triggers are useful for triggering other events in your game, like cutscenes, automatic door opening, displaying tutorial messages, etc.
Use your imagination!
Be aware that in order for two Triggers to send out trigger events when they collide, one of them must include a Rigidbody as well. For a
Trigger to collide with a normal Collider, one of them must have a Rigidbody attached. For a detailed chart of different types of collisions, see
the collision action matrix in the Advanced section below.
Mesh Collider
The Mesh Collider takes a Mesh Asset and builds its Collider based on that mesh. It is far more accurate for collision detection than using
primitives for complicated meshes. Mesh Colliders that are marked as Convex can collide with other Mesh Colliders.
Properties
Is Trigger If enabled, this Collider is used for triggering events, and is ignored by the physics engine.
Material Reference to the Physics Material that determines how this Collider interacts with others.
Mesh Reference to the Mesh to use for collisions.
Smooth Sphere Collisions When this is enabled, collision mesh normals are smoothed. You should enable this on smooth surfaces, e.g. rolling
terrain without hard edges, to make sphere rolling smoother.
Convex If enabled, this Mesh Collider will collide with other Mesh Colliders. Convex Mesh Colliders are limited to 255
triangles.
Details
The Mesh Collider builds its collision representation from the Mesh attached to the GameObject, and reads the properties of the attached
Transform to set its position and scale correctly.
Collision meshes use backface culling. If an object collides with a mesh that will be backface culled graphically it will also not collide with it
physically.
There are some limitations when using the Mesh Collider. Usually, two Mesh Colliders cannot collide with each other. All Mesh Colliders can
collide with any primitive Collider. If your mesh is marked as Convex, then it can collide with other Mesh Colliders.
Colliders work with Rigidbodies to bring physics in Unity to life. Whereas Rigidbodies allow objects to be controlled by physics, Colliders allow
objects to collide with each other. Colliders must be added to objects independently of Rigidbodies. A Collider does not necessarily need a
Rigidbody attached, but a Rigidbody must be attached in order for the object to move as a result of collisions.
When a collision between two Colliders occurs and if at least one of them has a Rigidbody attached, three collision messages are sent out to
the objects attached to them. These events can be handled in scripting, and allow you to create unique behaviors with or without making use of
the built-in NVIDIA PhysX engine.
Triggers
An alternative way of using Colliders is to mark them as a Trigger, just check the IsTrigger property checkbox in the Inspector. Triggers are
effectively ignored by the physics engine, and have a unique set of three trigger messages that are sent out when a collision with a Trigger
occurs. Triggers are useful for triggering other events in your game, like cutscenes, automatic door opening, displaying tutorial messages, etc.
Use your imagination!
Be aware that in order for two Triggers to send out trigger events when they collide, one of them must include a Rigidbody as well. For a
Trigger to collide with a normal Collider, one of them must have a Rigidbody attached. For a detailed chart of different types of collisions, see
the collision action matrix in the Advanced section below.
Hints
Mesh Colliders cannot collide with each other unless they are marked as Convex. Therefore, they are most useful for background objects
like environment geometry.
Convex Mesh Colliders must be fewer than 255 triangles.
Primitive Colliders are less costly for objects under physics control.
When you attach a Mesh Collider to a GameObject, its Mesh property will default to the mesh being rendered. You can change that by
assigning a different Mesh.
To add multiple Colliders for an object, create child GameObjects and attach a Collider to each one. This allows each Collider to be
manipulated independently.
You can look at the gizmos in the Scene View to see how the Collider is being calculated on your object.
Colliders do their best to match the scale of an object. If you have a non-uniform scale (a scale which is different in each direction), only the
Mesh Collider can match completely.
If you are moving an object through its Transform component but you want to receive Collision/Trigger messages, you must attach a
Rigidbody to the object that is moving.
Physics Material
The Physics Material is used to adjust friction and bouncing effects of colliding objects.
To create a Physics Material select Assets->Create->Physics Material from the menu bar. Then drag the Physics Material from the Project
View onto a Collider in the scene.
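A Physics Material can also be assigned from a script; a minimal C# sketch, assuming a material has been assigned to the field in the Inspector:

    using UnityEngine;

    public class AssignMaterial : MonoBehaviour
    {
        public PhysicMaterial bouncyMaterial;   // drag your Physics Material asset here

        void Start()
        {
            // Replace the material used by this object's Collider.
            GetComponent<Collider>().material = bouncyMaterial;
        }
    }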
Properties
Dynamic Friction The friction used when already moving. Usually a value from 0 to 1. A value of zero feels like ice, a value of 1 will
make it come to rest very quickly unless a lot of force or gravity pushes the object.
Static Friction The friction used when an object is lying still on a surface. Usually a value from 0 to 1. A value of zero feels like
ice, a value of 1 will make it very hard to get the object moving.
Bounciness How bouncy is the surface? A value of 0 will not bounce. A value of 1 will bounce without any loss of energy.
Friction Combine Mode How the friction of two colliding objects is combined.
Average The two friction values are averaged.
Min The smallest of the two values is used.
Max The largest of the two values is used.
Multiply The friction values are multiplied with each other.
Bounce Combine How the bounciness of two colliding objects is combined. It has the same modes as Friction Combine Mode
Friction Direction 2 The direction of anisotropy. Anisotropic friction is enabled if this direction is not zero. Dynamic Friction 2 and Static
Friction 2 will be applied along Friction Direction 2.
Dynamic Friction 2 If anisotropic friction is enabled, DynamicFriction2 will be applied along Friction Direction 2.
Static Friction 2 If anisotropic friction is enabled, StaticFriction2 will be applied along Friction Direction 2.
Details
Friction is the quantity which prevents surfaces from sliding off each other. This value is critical when trying to stack objects. Friction comes in
two forms, dynamic and static. Static friction is used when the object is lying still. It will prevent the object from starting to move. If a large
enough force is applied to the object it will start moving. At this point Dynamic Friction will come into play. Dynamic Friction will now attempt
to slow the object down as it slides over the surface.
Hints
Don't try to use a standard physics material for the main character. Make a customized one and get it perfect.
Hinge Joint
The Hinge Joint groups together two Rigidbodies, constraining them to move like they are connected by a hinge. It is perfect for doors, but can
also be used to model chains, pendulums, etc.
Properties
Connected Body Optional reference to the Rigidbody that the joint is dependent upon. If not set, the joint connects to the world.
Anchor The position of the axis around which the body swings. The position is defined in local space.
Axis The direction of the axis around which the body swings. The direction is defined in local space.
Use Spring Spring makes the Rigidbody reach for a specific angle compared to its connected body.
Spring Properties of the Spring that are used if Use Spring is enabled.
Spring The force the object exerts to move into the position.
Damper The higher this value, the more the object will slow down.
Target Position Target angle of the spring. The spring pulls towards this angle measured in degrees.
Use Motor The motor makes the object spin around.
Motor Properties of the Motor that are used if Use Motor is enabled.
Details
A single Hinge Joint should be applied to a GameObject. The hinge will rotate at the point specified by the Anchor property, moving around the
specified Axis property. You do not need to assign a GameObject to the joint's Connected Body property. You should only assign a
GameObject to the Connected Body property if you want the joint's Transform to be dependent on the attached object's Transform.
Think about how the hinge of a door works. The Axis in this case is up, positive along the Y axis. The Anchor is placed somewhere at the
intersection between door and wall. You would not need to assign the wall to the Connected Body, because the joint will be connected to the
world by default.
Now think about a doggy door hinge. The doggy door's Axis would be sideways, positive along the relative X axis. The main door should be
assigned as the Connected Body, so the doggy door's hinge is dependent on the main door's Rigidbody.
Chains
Multiple Hinge Joints can also be strung together to create a chain. Add a joint to each link in the chain, and attach the next link as the
Connected Body.
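As a rough C# sketch of wiring up such a chain at runtime (the links array and the chosen swing axis are assumptions):

    using UnityEngine;

    public class ChainBuilder : MonoBehaviour
    {
        // Links should be ordered from the anchored end to the free end.
        public Rigidbody[] links;

        void Start()
        {
            for (int i = 1; i < links.Length; i++)
            {
                // Each link swings around a hinge connected to the previous link.
                HingeJoint joint = links[i].gameObject.AddComponent<HingeJoint>();
                joint.connectedBody = links[i - 1];
                joint.axis = Vector3.right;   // illustrative swing axis
            }
        }
    }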
Hints
You do not need to assign a Connected Body to your joint for it to work.
Use Break Force in order to make dynamic damage systems. This is really cool as it allows the player to break a door off its hinge by
blasting it with a rocket launcher or running into it with a car.
The Spring, Motor, and Limits properties allow you to fine-tune your joint's behaviors.
Spring Joint
The Spring Joint groups together two Rigidbodies, constraining them to move like they are connected by a spring.
Properties
Connected Body Optional reference to the Rigidbody that the joint is dependent upon.
Anchor Position in the object's local space (at rest) that defines the center of the joint. This is not the point that the object
will be drawn toward.
X Position of the joint's local center along the X axis.
Y Position of the joint's local center along the Y axis.
Z Position of the joint's local center along the Z axis.
Spring Strength of the spring.
Damper Amount that the spring is reduced when active.
Min Distance Distances greater than this will not cause the Spring to activate.
Max Distance Distances less than this will not cause the Spring to activate.
Break Force The force that needs to be applied for this joint to break.
Break Torque The torque that needs to be applied for this joint to break.
Details
The Spring Joint allows a Rigidbodied GameObject to be pulled toward a particular "target" position. This position will either be another
Rigidbodied GameObject or the world. As the GameObject travels further away from this "target" position, the Spring Joint applies forces that
will pull it back to its original "target" position. This creates an effect very similar to a rubber band or a slingshot.
The "target" position of the Spring is determined by the relative position from the Anchor to the Connected Body (or the world) when the Spring
Joint is created, or when Play mode is entered. This makes the Spring Joint very effective at setting up jointed characters or objects in the
Editor, but makes it harder to create push/pull spring behaviors at runtime through scripting. If you want to primarily control a GameObject's position
using a Spring Joint, it is best to create an empty GameObject with a Rigidbody, and set that to be the Connected Rigidbody of the jointed
object. Then in scripting you can change the position of the Connected Rigidbody and see your Spring move in the ways you expect.
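A minimal C# sketch of that approach, placed on the empty GameObject used as the Connected Rigidbody (the goal transform and speed are assumptions):

    using UnityEngine;

    [RequireComponent(typeof(Rigidbody))]
    public class SpringTarget : MonoBehaviour
    {
        public Transform goal;       // where the jointed object should be pulled toward
        public float speed = 2.0f;   // illustrative value

        void Start()
        {
            // The target body is moved explicitly, so it should not be simulated itself.
            GetComponent<Rigidbody>().isKinematic = true;
        }

        void FixedUpdate()
        {
            // Move the connected body; the Spring Joint then pulls the jointed object after it.
            transform.position = Vector3.MoveTowards(transform.position, goal.position, speed * Time.deltaTime);
        }
    }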
Connected Rigidbody
You do not need to use a Connected Rigidbody for your joint to work. Generally, you should only use one if your object's position and/or
rotation is dependent on it. If there is no Connected Rigidbody, your Spring will connect to the world.
Damper is the resistance encountered by the Spring force. The lower this is, the springier the object will be. As the Damper is increased, the
amount of bounciness caused by the Joint will be reduced.
Hints
You do not need to assign a Connected Body to your Joint for it to work.
Set the ideal positions of your Jointed objects in the Editor prior to entering Play mode.
Spring Joints require your object to have a Rigidbody attached.
iOS
iOS physics optimization hints can be found here.
Random Numbers
Randomly chosen items or values are important in many games. This section shows how you can use Unity's built-in random functions to
implement some common game mechanics.
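For example, choosing a random item from an array is just a matter of picking a random index; a minimal C# sketch (myArray is only an illustrative name):

    using UnityEngine;

    public class RandomItem : MonoBehaviour
    {
        public GameObject[] myArray;   // illustrative array of items

        GameObject ChooseRandom()
        {
            // Pick a random index into the array.
            return myArray[Random.Range(0, myArray.Length)];
        }
    }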
Note that Random.Range returns a value from a range that includes the first parameter but excludes the second, so using myArray.Length here
gives the correct result.
You can visualise these different outcomes as a paper strip divided into sections each of which occupies a fraction of the strip's total length.
The fraction occupied is equal to the probability of that outcome being chosen. Making the choice is equivalent to picking a random point along
the strip's length (say by throwing a dart) and then seeing which section it is in.
In the script, the paper strip is actually an array of floats that contain the different probabilities for the items in order. The random point is
obtained by multiplying Random.value by the total of all the floats in the array (they need not add up to 1; the significant thing is the relative size
of the different values). To find which array element the point is "in", firstly check to see if it is less than the value in the first element. If so, then
the first element is the one selected. Otherwise, subtract the first element's value from the point value and compare that to the second element
and so on until the correct element is found. In code, this would look something like the following (a C# sketch of the search just described,
where probs is the array of relative probabilities):-

    int Choose(float[] probs) {
        float total = 0;
        foreach (float elem in probs) {
            total += elem;
        }

        float randomPoint = Random.value * total;

        for (int i = 0; i < probs.Length; i++) {
            if (randomPoint < probs[i]) {
                return i;
            } else {
                randomPoint -= probs[i];
            }
        }

        return probs.Length - 1;
    }
Note that the final return statement is necessary because Random.value can return a result of 1. In this case, the search will not find the
random point anywhere. Changing the line "if (randomPoint < probs[i])" to a less-than-or-equal test would avoid the extra return statement but
would also allow an item to be chosen occasionally even when its probability is zero.
probability is zero.
Shuffling a List
A common game mechanic is to choose from a known set of items but have them arrive in random order. For example, a deck of cards is
typically shuffled so they are not drawn in a predictable sequence. You can shuffle the items in an array by visiting each element and swapping
it with another element at a random index in the array:-
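A minimal C# sketch of such a shuffle (the int element type is only illustrative):

    using UnityEngine;

    public class Shuffler : MonoBehaviour
    {
        // Visit each element and swap it with an element at a random index.
        void Shuffle(int[] deck)
        {
            for (int i = 0; i < deck.Length; i++)
            {
                int temp = deck[i];
                int randomIndex = Random.Range(0, deck.Length);
                deck[i] = deck[randomIndex];
                deck[randomIndex] = temp;
            }
        }
    }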
Choosing Items Without Repetition
A common task is to pick a number of items randomly from a set without picking the same one more than once. As an example, suppose that ten
spawn points are available but only five must be chosen. The probability of the first item being chosen will be
5 / 10 or 0.5. If it is chosen then the probability for the second item will be 4 / 9 or 0.44 (ie, four items still needed, nine left to choose from).
However, if the first was not chosen then the probability for the second will be 5 / 9 or 0.56 (ie, five still needed, nine left to choose from). This
continues until the set contains the five items required. You could accomplish this in code as follows (a C# sketch; spawnPoints is assumed to
be the array of available items):-

    Transform[] ChooseSet(int numRequired) {
        Transform[] result = new Transform[numRequired];
        int numToChoose = numRequired;

        for (int numLeft = spawnPoints.Length; numLeft > 0; numLeft--) {
            // Probability of choosing the current item, as described above.
            float prob = (float)numToChoose / (float)numLeft;

            if (Random.value <= prob) {
                numToChoose--;
                result[numToChoose] = spawnPoints[numLeft - 1];

                if (numToChoose == 0) {
                    break;
                }
            }
        }

        return result;
    }
Note that although the selection is random, items in the chosen set will be in the same order they had in the original array. If the items are to be
used one at a time in sequence then the ordering can make them partly predictable, so it may be necessary to shuffle the array before use.
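For example, a random point can be picked by choosing each coordinate independently (a C# sketch):

    // Each component is an independent random value between 0 and 1.
    Vector3 randomPoint = new Vector3(Random.value, Random.value, Random.value);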
This gives a point inside a cube with sides one unit long. The cube can be scaled simply by multiplying the X, Y and Z components of the
vector by the desired side lengths. If one of the axes is set to zero, the point will always lie within a single plane. For example, picking a random
point on the "ground" is usually a matter of setting the X and Z components randomly and setting the Y component to zero.
When the volume is a sphere (ie, when you want a random point within a given radius from a point of origin), you can use
Random.insideUnitSphere multiplied by the desired radius:-
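A one-line C# sketch (radius is an illustrative variable):

    // A random point within the given radius of the origin.
    Vector3 randWithinRadius = Random.insideUnitSphere * radius;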
Note that if you set one of the resulting vector's components to zero, you will *not* get a correct random point within a circle. Although the point
is indeed random and lies within the right radius, the probability is heavily biased toward the edge of the circle and so points will be spread very
unevenly. You should use Random.insideUnitCircle for this task instead:-
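A one-line C# sketch of the two-dimensional case (again, radius is illustrative):

    // A random point within a circle of the given radius, evenly distributed.
    Vector2 randWithinCircle = Random.insideUnitCircle * radius;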
Particle Systems
Note: This is the documentation for the new particle system (Shuriken). For documentation on the legacy particle system go to Legacy Particle
System.
You can create a new particle system by creating a Particle System GameObject (menu GameObject -> Create Other -> Particle System) or
by creating an empty GameObject and adding the ParticleSystem component to it (in Component->Effects).
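The same can be done from a script; a minimal C# sketch:

    using UnityEngine;

    public class CreateParticles : MonoBehaviour
    {
        void Start()
        {
            // Create an empty GameObject and add a ParticleSystem component to it.
            GameObject go = new GameObject("My Particle System");
            go.AddComponent<ParticleSystem>();
        }
    }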
Individual particle systems can take on various complex behaviors by using Modules.
They can also be extended by being grouped together into Particle Effects.
If you press the button Open Editor ..., this will open up the Extended Particle Editor, that shows all of the particle systems under the same
root in the scene tree. For more information on particle system grouping, see the section on Particle Effects.
Scrubbing play back time can be performed by dragging on the label Playback Time. All Playback controls have shortcut keys which can be
customized in the Preferences window.
MinMax curves
Many of the properties in the particle system modules describe a change of a value with time. That change is described via MinMax Curves.
These time-animated properties (for example size and speed), will have a pull down menu on the right hand side, where you can choose
between:
Constant: The value of the property will not change with time, and will not be displayed in the Curve Editor
Curve: The value of the property will change with time based on the curve specified in the Curve Editor
Random between constants: The value of the property will be selected at random between the two constants
Random between curves: A curve will be generated at random between the min and the max curve, and the value of the property will change
in time based on the generated curve
In the Curve Editor, the x-axis spans time between 0 and the value specified by the Duration property, and the y-axis represents the value of
the animated property at each point in time. The range of the y-axis can be adjusted in the number field in the upper right corner of the Curve
Editor. The Curve Editor currently displays all of the curves for a particle system in the same window.
Note that the "-" in the bottom-right corner will remove the currently selected curve, while the "+" will optimize it (that is, make it into a
parametrized curve with at most 3 keys).
For animating properties that describe vectors in 3D space, we use the TripleMinMax Curves, which are simply curves for the x-,y-, and z-
dimensions side by side, and it looks like this:
A detached Curve Editor that can be docked like any other window
For more information on working with curves, take a look at the Curve Editor documentation
For properties that deal with color, the Particle System makes use of the Color and Gradient Editor. It works in a similar way to the Curve
Editor.
The color-based properties will have a pull down menu on the right hand side, where you can choose between:
Gradient: The gradient (RGBA) will vary throughout time, edited in the Gradient Editor
Random Between Two Gradients: The gradient (RGBA) varies with time and is chosen at random between two gradients specified in the Gradient
Editor
Gradient editor
The Gradient Editor is used for describing change of gradient with time. It animates the color (RGB-space, described by the markers at the
bottom), and Alpha (described by the markers at the top).
You can add new markers for Alpha values by clicking near the top of the rectangle, and new ticks for Color by clicking near the bottom. The
markers can be intuitively dragged along the timeline.
If an Alpha tick is selected, you can edit the value for that tick by dragging the alpha value.
If a Color tick is selected, the color can be modified by double clicking on the tick or clicking on the color bar.
Individual particle systems can take on various complex behaviors by using Modules.
They can also be extended by being grouped together into Particle Effects.
If you press the button Open Editor ..., this will open up the Extended Particle Editor, that shows all of the particle systems under the same
root in the scene tree. For more information on particle system grouping, see the section on Particle Effects.
Initially only a few modules are enabled. Adding or removing modules changes the behavior of the particle system. You can add new modules
by pressing the (+) sign in the top-right corner of the Particle System Inspector. This pops up a selection menu, where you can choose the
module you want to enable.
An alternative way to work with modules is to select "Show All Modules", at which point all of the modules will show up in the inspector.
Then you can enable / disable modules directly from the inspector by clicking the checkbox to the left.
Most of the properties are controllable by curves (see Curve Editor). Color properties are controlled via gradients which define an animation for
color (see Color Editor).
For details on individual modules and their properties, see Particle System Modules
Initial Module
Emission Module
Controls the rate of particles being emitted and allows spawning large groups of particles at certain moments (over Particle System duration
time). Useful for explosions when a bunch of particles need to be created at once.
Rate Amount of particles emitted over Time (per second) or Distance (per meter) (see MinMaxCurve).
Bursts (Time option only) Add bursts of particles that occur within the duration of the Particle System.
Time and Number of Particles Specify the time (in seconds, within the duration) at which a specified number of particles should be emitted.
Use the + and - buttons to adjust the number of bursts.
Shape Module
Defines the shape of the emitter: Sphere, Hemisphere, Cone, Box and Mesh. The emitter can apply the initial force along the surface normal or in a
random direction.
Sphere
Radius Radius of the sphere. (Can also be manipulated by handles in the Scene View).
Emit from Shell Emit from shell of the sphere. If disabled, particles will be emitted from the volume of the sphere.
Random Direction Should particles have a random direction when emitted or a direction along the surface normal of the sphere?
Hemisphere
Radius Radius of the hemisphere. (Can also be manipulated by handles in the Scene View).
Emit from Shell Emit from shell of the hemisphere. If disabled particles will be emitted from the volume of the hemisphere.
Random Direction Should particles have a random direction when emitted or a direction along the surface normal of the hemisphere?
Cone
Angle Angle of the cone. If angle is 0 then particles will be emitted in one direction. (Can also be manipulated by handles
in the Scene View).
Radius The radius at the point of emission. If the value is near zero emission will be from a point. A larger value basically
creates a capped cone, emission coming from a disc rather than a point. (Can also be manipulated by handles in
the Scene View).
Length Length of the emission volume. Only available when emitting from a Volume or Volume Shell. (Can also be
manipulated by handles in the Scene View).
Emit From Determines where emission originates from. Possible values are Base, Base Shell, Volume and Volume Shell.
Random Direction Should particles have a random direction when emitted or a direction along the cone?
Box
Box X Scale of box in X. (Can also be manipulated by handles in the Scene View).
Box Y Scale of box in Y. (Can also be manipulated by handles in the Scene View).
Box Z Scale of box in Z. (Can also be manipulated by handles in the Scene View).
Random Direction Should particles have a random direction when emitted or a direction along the Z-axis of the box?
Mesh
Type Particles can be emitted from either Vertex, Edge or Triangle.
Mesh Select Mesh that should be used as emission shape.
Random Direction Should particles have a random direction when emitted or a direction along the surface of the mesh?
Velocity Over Lifetime Module
Directly animates the velocity of each particle. Mostly useful for particles which have complex physical but simple visual behavior (like smoke with
turbulence and temperature loss) and have little interaction with the physical world.
XYZ Use either constant values, curves or random between two curves for controlling the movement of the particles. See
MinMaxCurve.
Space Local / World: Are the velocity values in local space or world space?
Limit Velocity Over Lifetime Module
Basically can be used to simulate drag. Dampens or clamps velocity if it is over a certain threshold. Can be configured per axis or per vector
length.
Force Over Lifetime Module
XYZ Use either constant values, curves or random between two curves for controlling the force applied to the particles.
See MinMaxCurve.
Space Local / World: Are the force values in local space or world space?
Randomize Randomize the force applied to the particles every frame.
Color Over Lifetime Module
Color Controls the color of each particle during its lifetime. If some particles have a shorter lifetime than others, they will
animate faster. Use a constant color, random between two colors, animate it using a gradient or specify a random color
using two gradients (see Gradient). Note that this color will be multiplied by the value in the Start Color property -
if the Start Color is black then Color Over Lifetime will not affect the particle.
Color By Speed Module
Animates particle color based on its speed. Remaps speed in the defined range to a color.
Color Color used for remapping of speed. Use gradients for varying colors. See MinMaxGradient.
Speed Range The min and max values for defining the speed range which is used for remapping a speed to a color.
Size Over Lifetime Module
Size Controls the size of each particle during its lifetime. Use a constant size, animate it using a curve or specify a random
size using two curves. See MinMaxCurve.
Size By Speed Module
Size Size used for remapping of speed. Use curves for varying sizes. See MinMaxCurve.
Speed Range The min and max values for defining the speed range which is used for remapping a speed to a size.
Rotation Over Lifetime Module
Angular Velocity Controls the rotational speed of each particle during its lifetime. Use a constant rotational speed, animate it using a
curve or specify a random rotational speed using two curves. See MinMaxCurve.
Rotation By Speed Module
Angular Velocity Rotational speed used for remapping of a particle's speed. Use curves for varying rotational speeds. See
MinMaxCurve.
Speed Range The min and max values for defining the speed range which is used for remapping a speed to a rotational speed.
External Forces Module
Multiplier Scale factor that determines how much the particles are affected by wind zones (i.e., the wind force vector is
multiplied by this value).
Collision Module
Set up collisions for the particles of this Particle System. World and planar collisions are supported. Planar collision is very efficient for simple
collision detection. Planes are set up by referencing an existing transform in the scene or by creating a new empty GameObject for this
purpose. Another benefit of planar collision is that particle systems with collision planes can be set up as prefabs. World collision uses raycasts
so must be used with care in order to ensure good performance. However, for cases where approximate collisions are acceptable world
collision in Low or Medium quality can be very efficient.
Sub Emitter Module
This is a powerful module that enables spawning of other Particle Systems at the following particle events: birth, death or collision of a particle.
Birth Spawn another Particle System at birth of each particle in this Particle System.
Death Spawn another Particle System at death of each particle in this Particle System.
Collision Spawn another Particle System at collision of each particle in this Particle System. IMPORTANT: Collision needs
to be set up using the Collision Module. See Collision Module.
Texture Sheet Animation Module
Animates the UV coordinates of the particle over its lifetime. Animation frames can be presented in the form of a grid, or every row in the sheet can
be a separate animation. The frames are animated with curves, or a random frame between two curves can be used. The speed of the animation is
defined by "Cycles".
IMPORTANT: The texture used for animation is the one used by the material found in the Renderer module.
Tiles Define the tiling of the texture.
Animation Specify the animation type: Whole Sheet or Single Row.
Whole Sheet Uses the whole sheet for uv animation.
- Frame over Time Controls the uv animation frame of each particle during its lifetime over the whole sheet. Use constant, animate it
using a curve or specify a random frame using two curves. See MinMaxCurve.
Single Row Uses a single row of the texture sheet for uv animation.
- Random Row If checked, the start row will be random, and if unchecked, the row index can be specified (first row is 0).
- Frame over Time Controls the uv animation frame of each particle during its lifetime within the specified row. Use constant, animate it
using a curve or specify a random frame using two curves. See MinMaxCurve.
- Cycles Specify speed of animation.
Renderer Module
The renderer module exposes the ParticleSystemRenderer component's properties. Note that even though a GameObject has a
ParticleSystemRenderer component, its properties are only exposed here. When this module is removed/added, it is actually the
ParticleSystemRenderer component that is added or removed.
For managing complex particle effects, Unity provides a Particle Editor, which can be accessed from the Inspector by pressing Open Editor.
You can toggle between Show: All and Show: Selected in this Editor. Show: All will render the entire particle effect. Show: Selected will only
render the selected particle systems. What is selected will be highlighted with a blue frame in the Particle Editor and also shown in blue in the
Hierarchy view. You can also change the selection both from the Hierarchy View and the Particle Editor, by clicking the icon in the top-left
corner of the Particle System. To do a multiselect, use Ctrl+click on Windows and Command+click on the Mac.
You can explicitly control the rendering order of grouped particles (or otherwise spatially close particle emitters) by tweaking the Sorting Fudge
property in the Renderer module.
Particle Systems in the same hierarchy are considered as part of the same Particle Effect. This hierarchy shows the setup of the effect shown
above.
Typical setup in the Visual Programming Tool and the Animation Preview window
Mecanim workflow
Workflow in Mecanim can be split into three major stages.
1. Asset preparation and import. This is done by artists or animators, with 3rd party tools, such as Max or Maya. This step is
independent of Mecanim features.
2. Character setup for Mecanim, which can be done in 2 ways:
Humanoid character setup. Mecanim has a special workflow for humanoid models, with extended GUI support and retargeting.
The setup involves creating and setting up an Avatar and tweaking Muscle definitions.
Generic character setup. This is for anything like creatures, animated props, four-legged animals, etc. Retargeting is not possible
here, but you can still take advantage of the rich feature set of Mecanim, including everything described below.
3. Bringing characters to life. This involves setting up animation clips, as well as interactions between them, and involves setup of State
Machines and Blend Trees, exposing Animation Parameters, and controlling animations from code.
Mecanim comes with a lot of new concepts and terminology. If at any point, you need to find out what something means, go to our Animation
Glossary.
Unity intends to phase out the Legacy animation system over time for all cases by merging the workflows into Mecanim.
Inverse Kinematics (IK) The ability to control the character's body parts based on various objects in the world.
Non-Mecanim animation terms
Animation Component The component needed for non-Mecanim animations.
Page last updated: 2012-11-07
Humanoid meshes
In order to take full advantage of Mecanim's humanoid animation system and retargeting, you need to have a rigged and skinned humanoid
type mesh.
1. A character model is generally made up of polygons in a 3D package, or converted to a polygon or triangulated mesh from a more
complex mesh type before export.
2. A joint hierarchy or skeleton which defines the bones inside the mesh and their movement in relation to one another, must be created
to control the movement of the character. The process for creating the joint hierarchy is known as rigging.
3. The mesh or skin must then be connected to the joint hierarchy in order to define which parts of the character mesh move when a given
joint is animated. The process of connecting the skeleton to the mesh is known as skinning.
1. Use a procedural character system or character generator such as Poser, Makehuman or Mixamo. Some of these systems will rig and
skin your mesh (eg, Mixamo) while others will not. Furthermore, these methods may require that you reduce the number of polygons in
your original mesh to make it suitable for use in Unity.
2. Purchase demo examples and character content from the Unity Asset Store.
3. Also, you can of course prepare your own character from scratch.
Export the mesh with the skeleton hierarchy, normals, textures and animation
Re-import into your 3D package to verify your animated model has exported as you expected
Export animations without meshes
Further details
The following pages cover the stages of preparing and importing animation assets in greater depth
Modelling
This is the process of creating your own humanoid mesh in a 3D modelling package - 3DSMax, Maya, Blender, etc. Although this is a whole
subject in its own right, there are a few guidelines you can follow to ensure a model works well with animation in a Unity project.
Observe a sensible topology. The exact nature of a "sensible" structure for your mesh is rather subtle but generally, you should bear in
mind how the vertices and triangles of the model will be distorted as it is animated. A poor topology will not allow the model to move without
unsightly distortion of the mesh. A lot can be learned by studying existing 3D character meshes to see how the topology is arranged and
why.
Be mindful of the scale of your mesh. Do a test import and compare the size of your imported model with a "meter cube" (the standard Unity
cube primitive has a side length of one unit, so it can be taken as a 1m cube for most purposes). Check the units your 3D package is using
and adjust the export settings so that the size of the model is in correct proportion to the cube. Unless you are careful, it is easy to create
models without any notion of their scale and consequently end up with a set of objects that are disproportionate in size when they are
imported into Unity.
Arrange the mesh so that the character's feet are standing on the local origin or "anchor point" of the model. Since a character typically
walks upright on a floor, it is much easier to handle if its anchor point (ie, its transform position) is directly on that floor.
Model in a T-pose if you can. This will help allow space to refine polygon detail where you need it (e.g. underarms). This will also make it
easier to position your rig inside the mesh.
Clean up your model. Where possible, cap holes, weld verts and remove hidden faces; this will help with skinning, especially automated
skinning processes.
Rigging
This is the process of creating a skeleton of joints to control the movements of your model.
3D packages provide a number of ways to create joints for your humanoid rig. These range from ready-made biped skeletons that you can
scale to fit your mesh, right through to tools for individual bone creation and parenting to create your own bone structure. Although the details
are outside the scope of Unity, here are some general guidelines:
Study existing humanoid skeleton hierarchies (eg, bipeds) and where possible use or mimic the bone structure.
Make sure the hips are the parent bone for your skeleton hierarchy.
A minimum of fifteen bones is required in the skeleton.
The joint/bone hierarchy should follow a natural structure for the character you are creating. Given that arms and legs come in pairs, you
should use a consistent convention for naming them (eg, "arm_L" for the left arm, "arm_R" for the right arm, etc). Possible hierarchies
include:
HIPS - spine - chest - shoulders - arm - forearm - hand
HIPS - spine - chest - neck - head
HIPS - UpLeg - Leg - foot - toe - toe_end
Skinning
This is the process of attaching the mesh to the skeleton.
Skinning involves binding vertices in your mesh to bones, either directly (rigid bind) or with blended influence to a number of bones (soft bind).
Different software packages use different methods, eg, assigning individual vertices and painting the weighting of influence per bone onto the
mesh. The initial setup is typically automated, say by finding the nearest influence or using "heatmaps". Skinning usually requires a fair amount
of work and testing with animations in order to ensure satisfactory results for the skin deformation. Some general guidelines for this process
include:
Using an automated process initially to set up some of the skinning (see relevant tutorials on 3DMax, Maya, etc.)
Creating a simple animation for your rig or importing some animation data to act as a test for the skinning. This should give you a quick way
to evaluate whether or not the skinning looks good in motion.
Incrementally editing and refining your skinning solution.
Sticking to a maximum of four influences when using a soft bind, since this is the maximum number that Unity will handle. If more than four
influences affect part of the mesh then at least some information will be lost when playing the animation in Unity.
(back to AssetPreparationandImport)
Importing Animations
Before a character model can be used, it must first be imported into your project. Unity can import native Maya (.mb or .ma) and Cinema 4D
(.c4d) files, and also generic FBX files which can be exported from most animation packages (see this page for further details on exporting). To
import an animation, simply drag the model file to the Assets folder of your project. When you select the file in the Project View you can edit
the Import Settings in the Inspector:
See the FBX importer page for a full description of the available import options.
Splitting animations
An animated character typically has a number of different movements that are activated in the game in different circumstances. These
movements are called Animation Clips. For example, we might have separate animation clips for walking, running, jumping, throwing, dying,
etc. Depending on the way the model was animated, these separate movements might be imported as distinct animation clips or as one single
clip where each movement simply follows on from the previous one. In cases where there is only a single clip, the clip must be split into its
component animation clips within Unity, which will involve some extra steps in your workflow.
You will see a list of available clips which you can preview by pressing Play in the Preview Window (lower down in the inspector). The frame
ranges of the clips can be edited, if needed.
In cases like this, you can define the frame ranges that correspond to each of the separate animation sequences (walking, jumping, etc). You
can create a new animation clip by pressing (+) and selecting the range of frames that are included in it.
For example:
In the Import Settings, the Split Animations table is where you tell Unity which frames in your asset file make up which Animation Clip. The
names you specify here are used to activate them in your game.
For further information about the animation inspector, see the Animation Clip component reference page.
For models that have muscle definitions (Mecanim), the process is different:
Only the animation data from these files will be used, even if the original files are exported with mesh data.
An example of four animation files for an animated character (note that the .fbx suffix is not shown within Unity)
Unity automatically imports all four files and collects all animations into the file without the @ sign. In the example above, the goober.mb file
will be set up to reference idle, jump, walk and wallJump automatically.
For FBX files, simply export a model file with no animation ticked (eg, goober.fbx) and the four clips as goober@animation.fbx files
(eg, [email protected], [email protected] and so on) by exporting the desired keyframes for each (enable animation in the FBX dialog).
Because of the similarity in bone structure, it is possible to map animations from one humanoid skeleton to another, allowing retargeting and
inverse kinematics.
With rare exceptions, humanoid models can be expected to have the same basic structure, representing the major articulate parts of the body,
head and limbs. The Mecanim system makes good use of this idea to simplify the rigging and control of animations. A fundamental step in
creating an animation is to set up a mapping between the simplified humanoid bone structure understood by Mecanim and the actual bones
present in the skeleton; in Mecanim terminology, this mapping is called an Avatar. The pages in this section explain how to create an Avatar
for your model.
Humanoid animations
For a Humanoid rig, select Humanoid and click Apply. Mecanim will attempt to match up your existing bone structure to the Avatar bone
structure. In many cases, it can do this automatically by analysing the connections between bones in the rig.
If the match has succeeded, you will see a check mark next to the Configure... menu
Also, in the case of a successful match, an Avatar sub-asset is added to the model asset, which you will be able to see in the project view
hierarchy.
If Mecanim was unable to create the Avatar, you will see a cross next to the Configure ... button, and no Avatar sub-asset will be added. When
this happens, you need to configure the avatar manually.
Non-humanoid animations
Two options for non-humanoid animation are provided: Generic and Legacy. Generic animations are imported using the Mecanim system but
don't take advantage of the extra features available for humanoid animations. Legacy animations use the animation system that was
provided by Unity before Mecanim. There are some cases where it is still useful to work with legacy animations (most notably with legacy
projects that you don't want to update fully) but they are seldom needed for new projects. See this section of the manual for further details on
legacy animations.
If the automatic Avatar creation fails, you will see a cross next to the Configure button.
Here, success simply means all of the required bones have been matched but for better results, you might want to match the optional bones as
well and get the model into a proper T-pose.
When you go to the Configure ... menu, the editor will ask you to save your scene. The reason for this is that in Configure mode, the Scene
View is used to display bone, muscle and animation information for the selected model alone, without displaying the rest of the scene.
Once you have saved the scene, you will see a new Avatar Configuration inspector, with a bone mapping.
The inspector shows which of the bones are required and which are optional - the optional ones can have their movements interpolated
automatically. For Mecanim to produce a valid match, your skeleton needs to have at least the required bones in place. In order to improve
your chances for finding a match to the Avatar, name your bones in a way that reflects the body parts they represent (names like "LeftArm",
"RightForearm" are suitable here).
If the model does NOT yield a valid match, you can manually follow a similar process to the one used internally by Mecanim:-
1. Sample Bind-pose (try to get the model closer to the pose with which it was modelled, a sensible initial pose)
2. Automap (create a bone-mapping from an initial pose)
3. Enforce T-pose (force the model closer to T-pose, which is the default pose used by Mecanim animations)
If the auto-mapping (Mapping->Automap) fails completely or partially, you can assign bones by either dragging them from the Scene or from the
Hierarchy. If Mecanim thinks a bone fits, it will show up as green in the Avatar Inspector, otherwise it shows up in red.
Finally, if the bone assignment is correct, but the character is not in the correct pose, you will see the message "Character not in T-Pose". You
can try to fix that with Enforce T-Pose or rotate the remaining bones into T-pose.
Muscle Definitions
Mecanim allows you to control the range of motion of different bones using Muscles.
Once the Avatar has been properly configured, Mecanim will "understand" the bone structure and allow you to start working in the Muscles tab
of the Avatar Inspector. Here, it is very easy to tweak the character's range of motion and ensure the character deforms in a convincing way,
free from visual artifacts or self-overlaps.
You can either adjust individual bones in the body (lower part of the view) or manipulate the character using predefined deformations which
operate on several bones at once (upper part of the view).
Muscle Clips
In the Animation tab, you can set up Muscle Clips, which are animations for specific muscles and muscle groups.
You can also define which body parts these muscle clips apply to.
Specific body parts can be selectively enabled or disabled in an animation using a so-called Body Mask. Body masks are used in the
Animation tab of the mesh import inspector and Animation Layers. Body masks enable you to tailor an animation to fit the specific requirements
of your character more closely. For example, you may have a standard walking animation that includes both arm and leg motion, but if a
character is carrying a large object with both hands then you wouldn't want his arms to swing by his sides as he walks. However, you could still
use the standard walking animation by switching off the arm movements in the body mask.
The body parts included are: Head, Left Arm, Right Arm, Left Hand, Right Hand, Left Leg, Right Leg and Root (which is denoted by the
"shadow" under the feet). In the body mask, you can also toggle inverse kinematics (IK) for hands and feet, which will determine whether or
not IK curves will be included in animation blending.
In the Animation tab of the mesh import inspector, you will see a list entitled Clips that contains all the object's animation clips. When you select
an item from this list, options for the clip will be shown, including the body mask editor.
You can also create Body Mask Assets (Assets->Create->Avatar Body Mask), which show up as .mask files on disk.
The Body Mask assets can be reused in Animator Controllers when specifying Animation Layers.
A benefit of using body masks is that they tend to reduce memory overheads since body parts that are not active do not need their associated
animation curves. Also, the unused curves need not be calculated during playback which will tend to reduce the CPU overhead of the
animation.
Retargeting
One of the most powerful features of Mecanim is retargeting of humanoid animations. This means that with relative ease, you can apply the
same set of animations to various character models. Retargeting is only possible for humanoid models, where an Avatar has been configured,
because this gives us a correspondence between the models' bone structures.
Your project should also contain another character model with a valid Avatar.
Put the model as a child of the GameObject, together with the Animator component
Make sure scripts referencing the Animator are looking for the animator in the children instead of the root (use
GetComponentInChildren<Animator>() instead of GetComponent<Animator>())
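As a rough illustration of that point, a script on the top-level GameObject might cache the Animator found on the child model like this (the class, field and parameter names are purely illustrative):
using UnityEngine;

// Illustrative sketch: the Animator lives on the child model, not on the root GameObject.
public class CharacterRoot : MonoBehaviour
{
    private Animator animator;

    void Awake()
    {
        // GetComponent<Animator>() on the root would return null here,
        // so look for the Animator in the children instead.
        animator = GetComponentInChildren<Animator>();
    }

    void Update()
    {
        if (animator != null)
            animator.SetFloat("Speed", Input.GetAxis("Vertical"));
    }
}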
Then in order to reuse the same animations on another model, you need to:
Make sure the Animator Controller property for the new model is referencing the same controller asset
Tweak the character controller, the transform, and other properties on the top-level GameObject, to make sure that the animations work
smoothly with the new model.
You're done!
Inverse Kinematics
Most animation is produced by rotating the angles of joints in a skeleton to predetermined values. The position of a child joint changes
according to the rotation of its parent and so the end point of a chain of joints can be determined from the angles and relative positions of the
individual joints it contains. This method of posing a skeleton is known as forward kinematics.
However, it is often useful to look at the task of posing joints from the opposite point of view - given a chosen position in space, work backwards
and find a valid way of orienting the joints so that the end point lands at that position. This can be useful when you want a character to touch an
object at a point selected by the user or plant its feet convincingly on an uneven surface. This approach is known as Inverse Kinematics (IK)
and is supported in Mecanim for any humanoid character with a correctly configured Avatar.
To set up IK for a character, you typically have objects around the scene that the character interacts with, and then set up the IK through script,
in particular using Animator functions like SetIKPositionWeight, SetIKRotationWeight, SetIKPosition, SetIKRotation, SetLookAtPosition, bodyPosition
and bodyRotation.
In the illustration above, we show a character grabbing a cylindrical object. How do we make this happen?
We start out with a character that has a valid Avatar, and attach to it a script that actually takes care of the IK, let's call it IKCtrl:
using UnityEngine;
using System;
using System.Collections;

[RequireComponent(typeof(Animator))]
public class IKCtrl : MonoBehaviour {

    protected Animator animator;

    public bool ikActive = false;
    public Transform rightHandObj = null;

    void Start ()
    {
        animator = GetComponent<Animator>();
    }

    //a callback for calculating IK
    void OnAnimatorIK()
    {
        if(animator) {

            //if the IK is active, set the position and rotation directly to the goal.
            if(ikActive) {

                //weight = 1.0 for the right hand means position and rotation will be at the IK goal (the place the character wants to grab)
                animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 1.0f);
                animator.SetIKRotationWeight(AvatarIKGoal.RightHand, 1.0f);

                //set the position and the rotation of the right hand where the external object is
                if(rightHandObj != null) {
                    animator.SetIKPosition(AvatarIKGoal.RightHand, rightHandObj.position);
                    animator.SetIKRotation(AvatarIKGoal.RightHand, rightHandObj.rotation);
                }
            }

            //if the IK is not active, set the position and rotation of the hand back to the original position
            else {
                animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 0);
                animator.SetIKRotationWeight(AvatarIKGoal.RightHand, 0);
            }
        }
    }
}
As we do not intend for the character to grab the entire object with his hand, we position a sphere where the hand should be on the cylinder,
and rotate it accordingly.
This sphere should then be assigned to the "Right Hand Obj" property of the IKCtrl script.
Observe the character grabbing and releasing the object as you click the IK Active checkbox.
Generic Animations
The full power of Mecanim is most evident when you are working with humanoid animations. However, non-humanoid animations are also
supported although without the avatar system and other features. In Mecanim terminology, non-humanoid animations are referred to as
Generic Animations.
To start working with a generic skeleton, go to the Rig tab in the FBX importer and choose Generic from the Animation Type menu.
Cyclic motions such as walking or running need their animation clips to loop cleanly, and Mecanim provides convenient tools for this. Animation clips can loop based on pose, rotation, and position.
If you drag the Start or End points of the animation clip, you will see the Looping fitness curves for all of the parameters on which looping can
be based. If you place the Start / End marker in a place where the curve for the property is green, it is more likely that the clip can loop
properly. The loop match indicator will show how good the looping is for the selected ranges.
Once the loop match indicator is green, enabling Loop Pose (for example) will make sure the looping of the pose is artifact-free.
For more details on animation clip options, see Animation Clip reference
(back to Mecanim introduction)
Animator Component
Any GameObject that has an avatar will also have an Animator component, which is the link between the character and its behavior.
The Animator component references an Animator Controller which is used for setting up behavior on the character. This includes setup for
State Machines, Blend Trees, and events to be controlled from script.
Properties
Controller The animator controller attached to this character
Avatar The Avatar for this character.
Apply Root Motion Should the character's position be controlled from the animation itself or from script?
Animate Physics Should the animation interact with physics?
Culling Mode Culling mode for animations
Always animate Always animate, don't do culling
Based on Renderers When the renderers are invisible, only root motion is animated. All other body parts will remain static while the
character is invisible.
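Most of these properties can also be set from script through the Animator class. The sketch below assumes the controller and Avatar assets have been assigned to public fields in the Inspector (all names used here are illustrative, not part of the manual's example):
using UnityEngine;

// Illustrative sketch: configuring the main Animator properties from script.
public class AnimatorSetup : MonoBehaviour
{
    public RuntimeAnimatorController controller; // a .controller asset assigned in the Inspector
    public Avatar avatar;                        // the model's Avatar sub-asset

    void Start()
    {
        Animator animator = GetComponent<Animator>();
        animator.runtimeAnimatorController = controller;              // Controller
        animator.avatar = avatar;                                     // Avatar
        animator.applyRootMotion = true;                              // Apply Root Motion
        animator.cullingMode = AnimatorCullingMode.BasedOnRenderers;  // Culling Mode
    }
}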
Animator Controller
You can view and set up character behavior from the Animator Controller view (Menu: Window > Animator Controller).
An Animator Controller can be created from the Project View (Menu: Create > Animator Controller). This creates a .controller asset on
disk, which looks like this in the Project Browser
After the state machine setup has been made, you can drop the controller onto the Animator component of any character with an Avatar in the
Hierarchy View.
Note that the Animator Controller Window will always display the state machine from the most recently selected .controller asset,
regardless of what scene is currently loaded.
The states and transitions of a state machine can be represented using a graph diagram, where the nodes represent the states and the arcs
(arrows between nodes) represent the transitions. You can think of the current state as being a marker or highlight that is placed on one of the
nodes and can then only jump to another node along one of the arrows.
The importance of state machines for animation is that they can be designed and updated quite easily with relatively little coding. Each state
has a Motion associated with it that will play whenever the machine is in that state. This enables an animator or designer to define the possible
sequences of character actions and animations without being concerned about how the code will work.
Animation State Machines can be set up from the Animator Controller Window, and they look something like this:
State Machines consist of States, Transitions and Events and smaller Sub-State Machines can be used as components in larger machines.
Animation States
Animation Transitions
Animation Parameters
Animation States
Animation States are the basic building blocks of an Animation State Machine. Each state contains an individual animation sequence (or
blend tree) which will play while the character is in that state. When an event in the game triggers a state transition, the character will be left in
a new state whose animation sequence will then take over.
When you select a state in the Animator Controller, you will see the properties for that state in the Inspector:
The default state, displayed in brown, is the state that the machine will be in when it is first activated. You can change the default state, if
necessary, by right-clicking on another state and selecting Set As Default from the context menu. The solo and mute checkboxes on each
transition are used to control the behaviour of animation previews - see this page for further details.
A new state can be added by right-clicking on an empty space in the Animator Controller Window and selecting Create State->Empty from the
context menu. Alternatively, you can drag an animation into the Animator Controller Window to create a state containing that animation. (Note
that you can only drag Mecanim animations into the Controller - non-Mecanim animations will be rejected.) States can also contain Blend
Trees.
Any State
Any State is a special state which is always present. It exists for the situation where you want to go to a specific state regardless of which state
you are currently in. This is a shorthand way of adding the same outward transition to all states in your machine. Note that the special meaning
of Any State implies that it cannot be the end point of a transition (ie, jumping to "any state" cannot be used as a way to pick a random state to
enter next).
Animation Transitions
Animation Transitions define what happens when you switch from one Animation State to another. There can be only one transition active at
any given time.
Each transition is triggered by a condition, which consists of:
An event parameter.
Instead of a parameter, you can also use Exit Time, and specify a number which represents the normalized time of the source state
(e.g. 0.95 means the transition will trigger when we've played the source clip 95% through).
A conditional predicate, if needed (for example Less/Greater for floats).
A parameter value (if needed).
You can adjust the transition between the two animation clips by dragging the start and end values of the overlap.
Animation Parameters
Animation Parameters are variables that are defined within the animation system but can also be accessed and assigned values from scripts.
For example, the value of a parameter can be updated by an animation curve and then accessed from a script so that, say, the pitch of a sound
effect can be varied as if it were a piece of animation. Likewise, a script can set parameter values to be picked up by Mecanim. For example, a
script can set a parameter to control a Blend Tree.
Default parameter values can be set up using the Parameters widget in the bottom left corner of the Animator window. They can be of four
basic types: Vector, Float, Int and Bool.
Parameters can be assigned values from a script using functions in the Animator class: SetVector, SetFloat, SetInt, and SetBool.
using UnityEngine;
using System.Collections;

// Example script (the class name and the Speed/Direction parameter names are illustrative)
// that reads the current state and sets Animator parameters from user input.
public class ParameterCtrl : MonoBehaviour {

    private Animator animator;

    void Start ()
    {
        animator = GetComponent<Animator>();
    }

    void Update ()
    {
        if(animator)
        {
            //get the current state
            AnimatorStateInfo stateInfo = animator.GetCurrentAnimatorStateInfo(0);

            //if we're in "Run" mode, respond to input for jump, and set the Jump parameter accordingly.
            if(stateInfo.nameHash == Animator.StringToHash("Base Layer.RunBT"))
            {
                if(Input.GetButton("Fire1"))
                    animator.SetBool("Jump", true);
            }
            else
            {
                animator.SetBool("Jump", false);
            }

            float h = Input.GetAxis("Horizontal");
            float v = Input.GetAxis("Vertical");

            //feed the input into float parameters
            animator.SetFloat("Speed", h * h + v * v);
            animator.SetFloat("Direction", h);
        }
    }
}
It is important to distinguish between Transitions and Blend Trees. While both are used for creating smooth animation, they are used for
different kinds of situations.
Transitions are used for transitioning smoothly from one Animation State to another over a given amount of time. Transitions are specified
as part of an Animation State Machine. A transition from one motion to a completely different motion is usually fine if the transition is quick.
Blend Trees are used for allowing multiple animations to be blended smoothly by incorporating parts of them all to varying degrees. The
amount that each of the motions contributes to the final effect is controlled using a blending parameter, which is just one of the numeric
animation parameters associated with the Animator Controller. In order for the blended motion to make sense, the motions that are blended
must be of similar nature and timing. Blend Trees are a special type of state in an Animation State Machine.
Examples of similar motions could be various walk and run animations. In order for the blend to work well, the movements in the clips must
take place at the same points in normalized time. For example, walking and running animations can be aligned so that the moments of contact
of foot to the floor take place at the same points in normalized time (e.g. the left foot hits at 0.0 and the right foot at 0.5). Since normalized time
is used, it doesn't matter if the clips are of different length.
The Animator Window now shows a graph of the entire Blend Tree while the Inspector shows the currently selected node and its immediate
children.
The Animator Window shows a graph of the entire Blend Tree. To the left is a Blend Tree with only the root Blend Node. To the right is a Blend
Tree with a root Blend Node and three Animation Clips as child nodes.
This gives a graphical visualization of how the animations are combined as the parameter value changes (as you drag the slider, the arrows
from the tree root change their shading to show the dominant animation clip).
You can select any of the nodes in the Blend Tree graph to inspect it in the Inspector. If the selected node is an Animation Clip the Inspector for
that Animation Clip will be shown. The settings will be read-only if the animation is imported from a model. If the node is a Blend Node, the
Inspector for Blend Nodes will be shown.
A Blend Node shown in the Inspector before any motions have been added.
The Blend Type drop-down is used to select one of the different blend types that can blend according to one or two parameters. You can read
more about the different blend types and other Blend Tree options on the following pages.
1D Blending
2D Blending
Additional Blend Tree Options
1D Blending
The first option in the Inspector of a Blend Node is the Blend Type. This drop-down is used to select one of the different blend types that
can blend according to one or two parameters. 1D Blending blends the child motions according to a single parameter.
After setting the Blend Type, the first thing you need is to select the Animation Parameter that will control this Blend Tree. In this example, the
parameter is direction which varies between -1.0 (left) and +1.0 (right), with 0.0 denoting a straight run without leaning.
Then you can add individual animations by clicking + -> Add Motion Field to add an Animation Clip to the blend tree. When you're done, it
should look something like this:
The diagram at the top of the Inspector shows the influence of each of the child motions as the parameter varies between its minimum and
maximum values. Each motion is shown as a little blue pyramid (the first and last are only shown in half), and if you click and hold down the left
mouse button on one of them, the corresponding motion is highlighted in the motion list below. The peak of each pyramid defines the parameter
value where the motion has full influence, meaning that its animation weight is 1 and the other animations have a weight of 0. This is also called
the threshold of the motion.
The diagram at the top of the Blend Node Inspector visualizes the weights of the child motions over the range of the parameter values.
The red vertical bar indicates the value of the Parameter. If you press Play in the Preview at the bottom of the Inspector and drag the red bar in
the diagram left and right, you can see how the value of the parameter is controlling the blending of the different motions.
Parameter Range
The range of the parameter used by the Blend Node is shown below the diagram as two numbers to the left and right. Either one of them can
be changed by clicking on the number and dragging left or right with the mouse. Note that the values correspond to the threshold of the first
and last motion in the motion list.
Thresholds
You can change the threshold value of a motion by clicking on its corresponding blue pyramid in the diagram and dragging it left or right. If the
"Automate Thresholds" toggle is not enabled, you can also edit the threshold value of a motion in the motion list by typing in a number in the
number field in the Threshold column.
Below the motion list is the checkbox Automate Thresholds. Enabling it will distribute the thresholds of the motions evenly across the parameter
range. For example, if there are five clips and the parameter ranges from -90 to +90, the thresholds will be set to -90, -45, 0, +45 and +90 in
order.
The Compute Thresholds drop-down will set the thresholds from data of your choice obtained from the root motions in the Animation Clips.
The data that is available to choose from is speed, velocity x, y, or z, and angular speed in degrees or radians. If your parameter corresponds
to one of these properties, you can compute the thresholds using the Compute Thresholds drop-down.
Speed Sets the threshold of each motion according to its speed (the magnitude of the velocity).
Velocity X Sets the threshold of each motion according to its velocity.x.
Velocity Y Sets the threshold of each motion according to its velocity.y.
Velocity Z Sets the threshold of each motion according to its velocity.z.
Angular Speed (Rad) Sets the threshold of each motion according to its angular speed in radians per second.
Angular Speed (Deg) Sets the threshold of each motion according to its angular speed in degrees per second.
Say, for example, you had a walk animation that covered 1.5 units per second, a jog at 2.3 units per second, and a run at 4 units per second,
choosing the Speed option from the drop-down would set the parameter range and thresholds for the three animations based on these values.
So, if you set the speed parameter to 3.0, it would blend the jog and run with a slight bias toward the jog.
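As a sanity check, the weights in that situation can be worked out by hand, assuming the usual linear interpolation between the two thresholds that bracket the parameter (the helper below is an illustration, not Unity's internal code):
using UnityEngine;

// With thresholds walk = 1.5, jog = 2.3 and run = 4.0, a Speed parameter of 3.0 falls
// between jog and run: run gets (3.0 - 2.3) / (4.0 - 2.3) ≈ 0.41 and jog gets ≈ 0.59,
// i.e. a slight bias toward the jog.
public static class BlendMath
{
    public static Vector2 Weights(float lowerThreshold, float upperThreshold, float parameter)
    {
        float t = Mathf.InverseLerp(lowerThreshold, upperThreshold, parameter);
        return new Vector2(1f - t, t); // (weight of lower motion, weight of upper motion)
    }
}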
2D Blending
The first option in the Inspector of a Blend Node is the Blend Type. This drop-down is used to select one of the different blend types that
can blend according to one or two parameters. The 2D blending types blend the child motions according to two parameters.
The different 2D Blend Types have different uses that they are suitable for. They differ in how the influence of each motion is calculated.
2D Simple Directional
Best used when your motions represent different directions, such as "walk forward", "walk backward", "walk left", and "walk right", or
"aim up", "aim down", "aim left", and "aim right". Optionally a single motion at position (0, 0) can be included, such as "idle" or "aim
straight". In the Simple Directional type there should not be multiple motions in the same direction, such as "walk forward" and "run
forward".
2D Freeform Directional
This blend type is also used when your motions represent different directions, however you can have multiple motions in the same
direction, for example "walk forward" and "run forward". In the Freeform Directional type the set of motions should always include a
single motion at position (0, 0), such as "idle".
2D Freeform Cartesian
Best used when your motions do not represent different directions. With Freeform Cartesian your X parameter and Y parameter can
represent different concepts, such as angular speed and linear speed. An example would be motions such as "walk forward no turn",
"run forward no turn", "walk forward turn right", "run forward turn right" etc.
After setting the Blend Type, the first thing you need is to select the two Animation Parameters that will control this Blend Tree. In this example,
the parameters are velocityX (strafing) and velocityZ (forward speed).
Then you can add individual animations by clicking + -> Add Motion Field to add an Animation Clip to the blend tree. When you're done, it
should look something like this:
The positions in 2D blending are like the thresholds in 1D blending, except that there are two values instead of one, corresponding to each of
the two parameters. Their positions along the horizontal X axis correspond to the first parameter, and their positions along the vertical Y axis
correspond to the second parameter. A walking forward animation might have a velocityX of 0 and a velocityZ of 1.5, so those values should
be typed into the Pos X and Pos Y number fields for the motion.
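At runtime those two parameters would typically be fed from a movement script. A minimal sketch using the velocityX/velocityZ parameter names from this example (everything else, including the input mapping, is an assumption):
using UnityEngine;

// Minimal sketch: driving the two parameters used by the 2D Blend Tree.
[RequireComponent(typeof(Animator))]
public class LocomotionInput : MonoBehaviour
{
    public float maxSpeed = 1.5f; // matches the walk clip's forward velocity in this example

    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Horizontal input drives strafing, vertical input drives forward speed.
        animator.SetFloat("velocityX", Input.GetAxis("Horizontal") * maxSpeed);
        animator.SetFloat("velocityZ", Input.GetAxis("Vertical") * maxSpeed);
    }
}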
The diagram at the top of the Blend Node Inspector visualizes the weights of the child motions over the extent of the parameter values.
The red dot indicates the values of the two Parameters. If you press Play in the Preview at the bottom of the Inspector and drag the red dot in
the diagram around, you can see how the values of the parameters are controlling the blending of the different motions. In the diagram you can
also see the influence of each motion represented as circles around each motion. You will see that if you move the red dot on top of one of the
blue dots representing a motion, the circle for that motion gains its maximum radius and the circles for all other motions shrink down to nothing.
At positions that are in between several motions, multiple of the nearby motions will have an influence on the blend. If you select one of the
motions in order to see the blue influence field of that motion, you can see that as you move the red dot around, the circle size of the motion
corresponds exactly with how strong the influence field is at various positions.
When no motion is selected, the diagram shows a mix of all the influence fields that is more blue where a single motion dominates and less
blue where many motions contribute to the blend.
Positions
You can change the positions of a motion by clicking on its corresponding blue dot in the diagram and dragging it around. You can also edit
position coordinates of a motion in the motion list by typing in numbers in the number fields in the Pos X and Pos Y columns.
The Compute Positions drop-down will set the positions from data of your choice obtained from the root motions in the Animation Clips. The
data that is available to choose from is speed, velocity x, y, or z, and angular speed in degrees or radians. If one or both of your parameters
correspond to one of these properties, you can compute the Pos X and/or Pos Y using the Compute Positions drop-down.
Velocity XZ Sets the Pos X of each motion according to its velocity.x and the Pos Y according to its velocity.z.
Speed And Angular Speed Sets the Pos X of each motion according to its angular speed (in radians per second) and the Pos Y according to
its speed.
Furthermore you can mix and match by choosing Compute Position -> X Position From and/or Compute Position -> Y Position From to
only auto-compute one of them at a time, leaving the other unchanged.
Speed Sets the Pos X or Pos Y of each motion according to its speed (the magnitude of the velocity).
Velocity X Sets the Pos X or Pos Y of each motion according to its velocity.x.
Velocity Y Sets the Pos X or Pos Y of each motion according to its velocity.y.
Velocity Z Sets the Pos X or Pos Y of each motion according to its velocity.z.
Angular Speed (Rad) Sets the Pos X or Pos Y of each motion according to its angular speed in radians per second.
Angular Speed (Deg) Sets the Pos X or Pos Y of each motion according to its angular speed in degrees per second.
Say, for example, that your parameters correspond to sideways velocity and forward velocity, and that you have an idle animation with an
average velocity (0, 0, 0), a walk animation with (0, 0, 1.5), and two strafe animations with velocities of (-1.5, 0, 0) and (1.5, 0, 0) respectively.
Choosing the Velocity XZ option from the drop-down would set the positions of the motions according to the X and Z coordinates of those
velocities.
Time Scale
You can alter the "natural" speed of the animation clips using the animation speed number fields (the columns with a clock icon at the top), so
you could make the walk twice as fast by using a value of 2.0 as its speed. The Adjust Time Scale > Homogeneous Speed button rescales
the speeds of the clips so that they correspond with the chosen minimum and maximum values of the parameter but keep the same relative
speeds they initially had.
Note that the Adjust Time Scale drop-down is only available if all the motions are Animation Clips and not child Blend Trees.
Mirroring
You can mirror any humanoid Animation Clip in the motions list by enabling the mirror toggle at the far right. This feature enables you to use the
same animation in its original form and in a mirrored version without needing twice the memory and space.
Advanced topics
The following section covers the features Mecanim provides for controlling and managing complex sets of animations.
The curve's X-axis represents normalized time and always ranges between 0.0 and 1.0 (corresponding to the beginning and the end of the
animation clip respectively, regardless of its duration).
Double-clicking an animation curve will bring up the standard Unity curve editor (see Editing Value Properties for further details) which you can
use to add keys to the curve. Keys are points along the curve's timeline where it has a value explicitly set by the animator rather than just using
an interpolated value. Keys are very useful for marking important points along the timeline of the animation. For example, with a walking
animation, you might use keys to mark the points where the left foot is on the ground, then both feet on the ground, right foot on the ground,
etc. Once the keys are set up, you can move conveniently between key frames by pressing the Previous/Next Key Frame buttons. This will
move the vertical red line and show the normalized time at the keyframe; the value you enter in the text box will then set the value of the curve
at that time.
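As noted in the section on Animation Parameters, a curve whose name matches a parameter in the Animator Controller can be read back from script every frame. A hedged sketch of the sound-pitch idea mentioned there (the curve/parameter name and the AudioSource setup are assumptions):
using UnityEngine;

// Sketch: reading a curve-driven parameter each frame to vary a sound's pitch.
[RequireComponent(typeof(Animator), typeof(AudioSource))]
public class CurveDrivenPitch : MonoBehaviour
{
    private Animator animator;
    private AudioSource source;

    void Start()
    {
        animator = GetComponent<Animator>();
        source = GetComponent<AudioSource>();
    }

    void Update()
    {
        // "PitchCurve" is assumed to be both a curve on the animation clip and a float
        // parameter of the same name in the Animator Controller.
        source.pitch = animator.GetFloat("PitchCurve");
    }
}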
Although this is useful for control purposes, the downside is that the state machine will become large and unwieldy as more of these complex
actions are added. You can simplify things somewhat just by separating the groups of states visually with empty space in the editor. However,
Mecanim goes a step further than this by allowing you to collapse a group of states into a single named item in the state machine diagram.
These collapsed groups of states are called Sub-state machines.
You can create a sub-state machine by right clicking on an empty space within the Animator Controller window and selecting Create Sub-State
Machine from the context menu. A sub-state machine is represented in the editor by an elongated hexagon to distinguish it from normal states.
A sub-state machine
When you double-click the hexagon, the editor is cleared to let you edit the sub-state machine as though it were a completely separate state
machine in its own right. The bar at the top of the window shows a "breadcrumb trail" to show which sub-state machine is currently being edited
(and note that you can create sub-state machines within other sub-state machines, and so on). Clicking an item in the trail will focus the editor
on that particular sub-state machine.
External transitions
As noted above, a sub-state machine is just a way of visually collapsing a group of states in the editor, so when you make a transition to a sub-
state machine, you have to choose which of its states you want to connect to.
You will notice an extra state in the sub-state machine whose name begins with Up.
The Up state represents the "outside world", the state machine that encloses the sub-state machine in the view. If you add a transition from a
state in sub-state machine to the Up state, you will be prompted to choose one of the states of the enclosing machine to connect to.
Animation Layers
Unity uses Animation Layers for managing complex state machines for different body parts. An example of this is if you have a lower-body
layer for walking-jumping, and an upper-body layer for throwing objects / shooting.
You can manage animation layers from the Layers Widget in the top-left corner of the Animator Controller.
You can add a new layer by pressing the + on the widget. On each layer, you can specify the body mask (the part of the body on which the
animation would be applied), and the Blending type. Override means information from other layers will be ignored, while Additive means that
the animation will be added on top of previous layers.
The Mask property is there to specify the body mask used on this layer. For example, if you want to use upper body throwing animations while
having your character walk or run, you would use an upper body mask, like this:
For more on Avatar Body Masks, you can read this section
Soloed transitions are shown in green, while muted transitions are shown in red, like this:
In the example above, if you are in State 0, only transitions to State A and State B will be available.
The basic rule of thumb is that if one Solo is ticked, the rest of the transitions from that state will be muted.
If both Solo and Mute are ticked, then Mute takes precedence.
Known issues:
The controller graph currently doesn't always reflect the internal mute states of the engine.
Target Matching
Often in games, a situation arises where a character must move in such a way that a hand or foot lands at a certain place at a certain time. For
example, the character may need to jump across stepping stones or jump and grab an overhead beam.
You can use the Animator.MatchTarget function to handle this kind of situation. Say, for example, you want to arrange a situation where the
character jumps onto a platform and you already have an animation clip for it called Jump Up. To do this, follow the steps below.
Find the place in the animation clip at which the character is beginning to get off the ground; note that in this case it is 14.1% or 0.141 into the animation clip.
Find the place in the animation clip at which the character is about to land on his feet; note that in this case the value is 78.0% or 0.78.
using UnityEngine;
using System;

[RequireComponent(typeof(Animator))]
public class TargetCtrl : MonoBehaviour {

    protected Animator animator;

    // The transform the left foot should land on (e.g. a point on the platform).
    public Transform jumpTarget = null;

    void Start () {
        animator = GetComponent<Animator>();
    }

    void Update () {
        if(animator) {
            if(Input.GetButton("Fire1"))
                animator.MatchTarget(jumpTarget.position, jumpTarget.rotation, AvatarTarget.LeftFoot,
                                     new MatchTargetWeightMask(Vector3.one, 1f), 0.141f, 0.78f);
        }
    }
}
The script will move the character so that it jumps from its current position and lands with its left foot at the target. Bear in mind that the result of
using MatchTarget will generally only make sense if it is called at the right point in gameplay.
Root Motion
Body Transform
The Body Transform is the mass center of the character. It is used in Mecanim's retargeting engine and provides the most stable displacement
model. The Body Orientation is an average of the lower and upper body orientation relative to the Avatar T-Pose.
The Body Transform and Orientation are stored in the Animation Clip (using the Muscle definitions set up in the Avatar). They are the only
world-space curves stored in the Animation Clip. Everything else (muscle curves and IK goals for the Hands and Feet) is stored relative to the
Body Transform.
Root Transform
The Root Transform is a projection on the Y plane of the Body Transform and is computed at runtime. At every frame, a change in the Root
Transform is computed. This change in transform is then applied to the Game Object to make it move.
Only AnimationClips that have similar start and stop Root Orientation should use this option. You will have a Green Light in the UI telling you
that an AnimationClip is a good candidate. A suitable candidate would be a straight walk or a run.
Based Upon: This lets you set the orientation of the clip. Using Body Orientation, the clip will be oriented to follow the forward vector of the body.
This default setting works well for most Motion Capture (Mocap) data such as walks, runs, and jumps, but it will fail with motion such as strafing,
where the motion is perpendicular to the body's forward vector. In those cases you can manually adjust the orientation using the Offset setting.
Finally, there is Original, which will automatically add the authored offset found in the imported clip. It is usually used with Keyframed data to
respect the orientation that was set by the artist.
Offset: used to enter the offset when that option is chosen for Based Upon.
Bake Into Pose: The Y component of the motion will stay on the Body Transform (Pose). The Y component of the Root Transform will be
constant and Delta Root Position Y will be 0. This means that this clip won't change the Game Object's height. Again, you have a Green Light
telling you that a clip is a good candidate for baking Y motion into pose.
Most AnimationClips will enable this setting; only clips that change the GameObject's height, such as jumping up or down, should have it turned off.
Note: Animator.gravityWeight is driven by the Bake Into Pose setting for position Y. When it is enabled, gravityWeight = 1; when disabled, gravityWeight = 0.
gravityWeight is blended between clips when transitioning between states.
Based Upon: In a similar way to Root Transform Rotation you can choose from Original or Mass Center (Body). There is also a Feet option
that is very convenient for AnimationClips that change height (Bake Into Pose disabled). When using Feet the Root Transform Position Y will
match the lowest foot Y for all frames. Thus the blending point always remains around the feet, which prevents floating problems when blending
or transitioning.
Offset: In a similar way to Root Transform Rotation, you can manually adjust the AnimationClip height using the Offset setting.
Bake Into Pose will usually be used for Idles where you want to force the delta Position (XZ) to be 0. It will stop the accumulation of small
deltas drifting after many evaluations. It can also be used for a Keyframed clip with Based Upon Original to force an authored position that
was set by the artist.
Loop Pose
Loop Pose (like Pose Blending in Blend Trees or Transitions) happens in the referential of Root Transform. Once the Root Transform is
computed, the Pose becomes relative to it. The relative Pose difference between Start and Stop frame is computed and distributed over the
range of the clip from 0-100%.
Open the inspector for the FBX file that contains the in-place animation, and go to the Animation tab
Make sure the Muscle Definition is set to the Avatar you intend to control (let's say this avatar is called Dude, and he has already been
added to the Hierarchy View).
Select the animation clip from the available clips
Make sure Loop Pose is properly aligned (the light next to it is green), and that the checkbox for Loop Pose is clicked
Preview the animation in the animation viewer to make sure the beginning and the end of the animation align smoothly, and that the
character is moving "in-place"
On the animation clip create a curve that will control the speed of the character (you can add a curve from the Animation Import inspector
Curves-> +)
Name that curve something meaningful, like "Runspeed"
Select the character Dude in the Hierarchy, whose inspector should already have an Animator component.
Drag RootMotionController onto the Controller property of the Animator
If you press play now, you should see the "Dude" running in place
Finally, to control the motion, we will need to create a script (RootMotionScript.cs) that implements the OnAnimatorMove callback.
using UnityEngine;
using System.Collections;

[RequireComponent(typeof(Animator))]
public class RootMotionScript : MonoBehaviour {

    void OnAnimatorMove()
    {
        Animator animator = GetComponent<Animator>();

        if (animator)
        {
            // Advance the character along its z axis at the speed read from
            // the "Runspeed" curve authored on the animation clip.
            Vector3 newPosition = transform.position;
            newPosition.z += animator.GetFloat("Runspeed") * Time.deltaTime;
            transform.position = newPosition;
        }
    }
}
Now you should see that the character is moving at the speed specified.
Character Setup
Number of Bones
In some cases you will need to create characters with a large number of bones, for example when you want a lot of customizable
attachments. These extra bones will increase the size of the build, and there is a processing cost for each additional bone. For example, 15
additional bones on a rig that already has 30 bones will take 50% longer to solve in Generic mode.
Note that you can have additional bones in both Generic and Humanoid mode. When no animations are playing that use the additional
bones, the processing cost should be negligible. This cost will be even lower if their attachments are non-existent or hidden.
Animation System
Controllers
The Animator doesn't spend time processing if no Controller is attached to it.
Simple Animation
Playing a single Animation Clip with no blending can make Mecanim slower than the legacy animation system. The old system is very
direct, sampling the curve and directly writing into the transform. Mecanim has temporary buffers it uses for blending, and there is
additional copying of the sampled curve and other data. The Mecanim layout is optimized for animation blending and more complex
setups.
Scale Curves
Make sure that there is not a single scale curve on any animation clip. You can write an asset post-processor to remove or warn
about them. See the Asset Bundles section for more information.
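As a rough illustration of this check, the editor utility below lists scale curves in selected clips. It is a sketch only: the class name, menu path, and log text are our own, and it is written as a menu command rather than a full asset post-processor, so exact editor API details may vary with your Unity version.

using UnityEngine;
using UnityEditor;

// Hypothetical editor utility (place it in an Editor folder): select one or more
// AnimationClips in the Project view and run the menu item to list any scale
// curves they contain.
public static class ScaleCurveChecker
{
    [MenuItem("Assets/Mecanim/Check For Scale Curves")]
    static void CheckSelectedClips()
    {
        foreach (Object obj in Selection.objects)
        {
            AnimationClip clip = obj as AnimationClip;
            if (clip == null)
                continue;

            foreach (EditorCurveBinding binding in AnimationUtility.GetCurveBindings(clip))
            {
                // Scale curves animate a transform's localScale property.
                if (binding.propertyName.Contains("m_LocalScale"))
                    Debug.LogWarning("Scale curve in clip '" + clip.name + "' on path '" + binding.path + "'", clip);
            }
        }
    }
}

Select the clips in the Project view and run the menu command; any warnings appear in the Console.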
Layers
Most of the time Mecanim is evaluating animations, and the overhead for AnimationLayers and AnimationStateMachines is kept to the
minimum. The cost of adding another layer to the animator, synchronized or not, depends on what animations and blend trees are
played by the layer. When the weight of the layer is zero, the layer update will be skipped.
Mecanim Scene
There are many optimizations that can be made, some useful tips include:
Use hashes instead of strings to query the Animator.
Implement a small AI Layer to control the Animator. You can make it provide simple callbacks for OnStateChange,
OnTransitionBegin, etc.
Use State Tags to easily match your AI StateMachine to the Mecanim StateMachine (see the sketch after this list).
Use additional curves to simulate Events.
Use additional curves to markup your animations, for example in conjunction with target matching.
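As a sketch of the State Tag tip above (the tag name "Attack" and the callback are purely illustrative, not part of the Unity API), a thin AI-side watcher might look like this:

using UnityEngine;

// Hypothetical "AI layer" sketch: it watches whether the current state on the
// base layer carries the "Attack" tag and raises a simple callback when that
// changes, so game code does not need to know the concrete state names.
[RequireComponent(typeof(Animator))]
public class AnimatorStateWatcher : MonoBehaviour
{
    Animator animator;
    bool wasAttacking;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        bool isAttacking = animator.GetCurrentAnimatorStateInfo(0).IsTag("Attack");
        if (isAttacking != wasAttacking)
        {
            wasAttacking = isAttacking;
            OnStateChange(isAttacking); // hypothetical callback into your AI code
        }
    }

    void OnStateChange(bool attacking)
    {
        Debug.Log(attacking ? "Entered a state tagged Attack" : "Left the Attack states");
    }
}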
Runtime Optimizations
Visibility and Updates
Always optimize animations by setting the Animator's Culling Mode to Based on Renderers, and disable the skinned mesh renderer's
Update When Offscreen property. This way animations won't be updated when the character is not visible. See the skinned mesh
renderer for further information.
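If you prefer to apply these settings from code rather than in the inspector, a minimal sketch could look like this (the component name is ours, and newer Unity versions expose slightly different culling mode names):

using UnityEngine;

// Hypothetical setup helper: applies the culling settings described above to
// the Animator and the SkinnedMeshRenderer found on this character.
public class AnimationCullingSetup : MonoBehaviour
{
    void Start()
    {
        Animator animator = GetComponent<Animator>();
        if (animator != null)
            animator.cullingMode = AnimatorCullingMode.BasedOnRenderers;

        SkinnedMeshRenderer smr = GetComponentInChildren<SkinnedMeshRenderer>();
        if (smr != null)
            smr.updateWhenOffscreen = false;
    }
}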
Page last updated: 2013-03-22
Mecanim FAQ
General questions
We are using the Legacy animation system for the player animations; do you advise us to use Mecanim instead?
Mecanim is the current animation technology that we are developing, and it will improve continuously, while the legacy system will stay as it is.
To check out the features and cool stuff you can do with Mecanim, see http://unity3d.com/unity/mecanim/
Most functions in Mecanim can already be controlled by script, and we are exposing the entire API in increments over the 4.x development cycle.
You can animate in any of the 3D animation packages listed at http://docs.unity3d.com/Documentation/Manual/HOWTO-importObject.html
and import them into Unity.
You can write a controller script (or use one from our demos) to interact with the State Machine which plays the animations. Here are
some links:
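As a minimal sketch of such a controller script (the parameter names "Speed" and "Jump" are placeholders for whatever your own Animator Controller defines):

using UnityEngine;

// Hypothetical player controller: it only reads input and drives Animator
// parameters; the State Machine in the Animator Controller then decides which
// clips actually play and how they blend.
[RequireComponent(typeof(Animator))]
public class SimpleAnimatorController : MonoBehaviour
{
    Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        animator.SetFloat("Speed", Input.GetAxis("Vertical"));

        if (Input.GetButtonDown("Jump"))
            animator.SetTrigger("Jump");
    }
}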
Import
Why is an animator added automatically to every mesh that's imported?
Currently there is no way to set import defaults, but if you set the Rig to None in the import settings, the Animator component will not be
added. You can do this with multiple files at once.
Layers
Does it matter in what order the layers are in?
Yes. Layers are evaluated from top to bottom. Layers set to Override will always override the previous layers (based on their mask, if they
have one).
There is no automatic mode for layer weights, so using your own fade in/out can be a good approach.
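A minimal sketch of such a manual fade (the layer index, duration, and component name are hypothetical):

using UnityEngine;
using System.Collections;

// Hypothetical helper that fades an animation layer's weight towards a target
// over time, since Mecanim has no automatic layer-weight fading.
[RequireComponent(typeof(Animator))]
public class LayerFader : MonoBehaviour
{
    public int layerIndex = 1;         // the layer whose weight is faded
    public float fadeDuration = 0.25f;

    Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    public IEnumerator FadeLayer(float targetWeight)
    {
        float startWeight = animator.GetLayerWeight(layerIndex);
        for (float t = 0f; t < fadeDuration; t += Time.deltaTime)
        {
            animator.SetLayerWeight(layerIndex, Mathf.Lerp(startWeight, targetWeight, t / fadeDuration));
            yield return null;
        }
        animator.SetLayerWeight(layerIndex, targetWeight);
    }
}

Game code would start the fade with StartCoroutine(fader.FadeLayer(1f)) when the upper-body action begins, and FadeLayer(0f) when it ends.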
Is the base layer weight always supposed to be one or should the weight be set to zero when another synced layer is run?
The base layer weight is always 1; however, setting layers to Override will completely override the base layer.
What happens if a synced layer has a different length to the corresponding state in the base layer?
If the layers have different lengths, the synced layers will become unsynced.
Is there any way to get a variable value from the controller without the name in text format?
You can use integers to identify the states and parameters. Use Animator.StringToHash to get the integer identifiers. For example:
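(The parameter name "Speed" and state path "Base Layer.Run" below are illustrative; use whatever your own controller defines.)

using UnityEngine;

// Sketch: compute the hashes once, then set parameters and compare states by
// integer instead of by string every frame.
[RequireComponent(typeof(Animator))]
public class HashedAnimatorQueries : MonoBehaviour
{
    static readonly int speedHash = Animator.StringToHash("Speed");
    static readonly int runStateHash = Animator.StringToHash("Base Layer.Run");

    Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        animator.SetFloat(speedHash, 1.0f); // illustrative value

        if (animator.GetCurrentAnimatorStateInfo(0).nameHash == runStateHash)
        {
            // the Run state on the base layer is currently playing
        }
    }
}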
Body masks are tightly bound to the humanoid retargeting solver, so currently this is not possible.
Is there a way to create more AvatarIKGoals than LeftFoot, RightFoot, LeftHand, RightHand?
This is on our high-priority roadmap. We suggest using additional animation curves on the animation to "simulate" events. It's not exactly the
same, but many of our users have had success with this!
When you have an animation with a curve and another animation without a curve, Unity will use the default value of the parameter connected to
the curve to do the blending. You can set default values for your Parameters, so when blending occurs between a State that has a Curve
parameter and one that does not, it will blend between the curve value and the default parameter value. To set a default value for a Parameter,
simply set its value in the Animator Tool window while not in LiveLink.
The main reason for using legacy animation is to continue working with an old project without the work of updating it for Mecanim. However,
it is not recommended that you use the legacy system for new projects.
The Animation tab on the importer will then look something like this:
Below the properties in the inspector is a list of animation clips. When you click on a clip in the list, an additional panel will appear below it in the
inspector:-
The Start and End values can be changed to allow you to use just a part of the original clip (see the page on splitting animations for further
details). The Add Loop Frame option adds an extra keyframe to the end of the animation that is exactly the same as the keyframe at the start.
This enables the animation to loop smoothly even when the last frame doesn't exactly match up with the first. The Wrap Mode setting is
identical to the master setting in the main animation properties but applies only to that specific clip.
See the pages about Animation import and Animation Scripting for further information about these subjects.
The Animation View Guide is broken up into several pages that each focus on different areas of the View:-
This section covers the basic operations of the Animation View, such as creating and editing Animation Clips.
This section explains how to create Animation Curves, add and move keyframes and set WrapModes. It also offers tips for using Animation
Curves to their full advantage.
Editing Curves
This section explains how to navigate efficiently in the editor, create and move keys, and edit tangents and tangent types.
This section explains how to animate Game Objects with multiple moving parts and how to handle cases where there is more than one
Animation Component that can control the selected Game Object.
This section explains how to add Animation Events to an Animation Clip. Animation Events allow you to call a script function at specified points
in the animation's timeline.
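For instance, a public function like the one below (the class and function names are arbitrary) can be targeted by an Animation Event; events can call public functions that take zero or one parameter:

using UnityEngine;

// Attach this to the animated GameObject; an Animation Event placed on the
// clip's timeline can then call PrintEvent, optionally passing a string.
public class AnimationEventReceiver : MonoBehaviour
{
    public void PrintEvent(string message)
    {
        Debug.Log("Animation Event fired: " + message + " at time " + Time.time);
    }
}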
Making an animated character involves two things: moving it through the world and animating it accordingly. If you want to learn more about
moving characters around, take a look at the Character Controller page. This page focuses on the animation. The actual animating of
characters is done through Unity's scripting interface.
You can download example demos showing pre-setup animated characters. Once you have learned the basics on this page you can also see
the animation script interface.
Animation Blending
Animation Layers
Animation Mixing
Additive Animation
Procedural Animation
Animation Playback and Sampling
Animation Blending
In today's games, animation blending is an essential feature to ensure that characters have smooth animations. Animators create separate
animations, for example, a walk cycle, run cycle, idle animation or shoot animation. At any point in time during your game you need to be able
to transition from the idle animation into the walk cycle and vice versa. Naturally, you want the transition to be smooth and avoid sudden jerks in
the motion.
This is where animation blending comes in. In Unity you can have any number of animations playing on the same character. All animations are
blended or added together to generate the final animation.
Our first step will be to make a character blend smoothly between the idle and walk animations. In order to make the scripter's job easier, we
will first set the Wrap Mode of the animation to Loop. Then we will turn off Play Automatically to make sure our script is the only one playing
animations.
Our first script for animating the character is quite simple; we only need some way to detect how fast our character is moving, and then fade
between the walk and idle animations. For this simple test, we will use the standard input axes:-
function Update () {
    // Fade towards the walk cycle while the vertical axis (e.g. the up arrow)
    // is held, otherwise fade back to the idle animation.
    if (Input.GetAxis("Vertical") > 0.2)
        animation.CrossFade ("walk");
    else
        animation.CrossFade ("idle");
}
When you hit the Play button, the character will start walking in place when you hold the up arrow key and return to the idle pose when you
release it.
Animation Layers
Layers are an incredibly useful concept that allow you to group animations and prioritize weighting.
Unity's animation system can blend between as many animation clips as you want. You can assign blend weights manually or simply use
animation.CrossFade(), which will animate the weight automatically.
However, you will generally want to prioritize which animation receives most weight when there are two animations playing. It is certainly
possible to ensure that the weight sums up to 100% manually, but it is easier just to use layers for this purpose.
Layering Example
As an example, you might have a shoot animation, an idle and a walk cycle. The walk and idle animat