For a successful hire, you first need a job ad that will attract Unity developers. You also need to know how to shortlist candidates, what to ask them during the interview, and how to prepare for the challenges that come up when you hire a Unity developer.
We discussed the ins and outs of Unity development with Daniz Aliyev, a Game and Simulation Developer at Proxify, and he gave us multiple valuable insights below.
Let's begin.
About Unity
Unity is a 2D and 3D game engine and real-time development platform. Developers use it to create simulations and virtual reality (VR), augmented reality (AR), and mixed reality (MR) experiences. It was developed by Unity Technologies and provides a broad range of robust tools for game development.
Unity is suitable for developers of any expertise level, making game development approachable and enjoyable, and it integrates with IDEs such as Visual Studio for scripting. Teams can collaborate to develop and deploy apps and games across 25+ platforms, such as consoles, PCs, TVs, mobile devices, and the web.
As prerequisites, developers need the .NET framework and the C# programming language to work with Unity. Unity is available for both macOS and Windows and is free to start with; once the trial period ends, the subscription converts to a paid one automatically unless you cancel it.
Sourcing and interviewing a Unity developer
This is the part where having a planned roadmap is crucial. After you post a job ad, it's time to focus on the recruitment technicalities before moving forward.
Technical skills checklist
The more you can tick off from this list, the better. The must-know prerequisite is Unity 3D itself; the rest of the technical skills are:

- .NET framework – A Unity developer needs to work with .NET because Unity's scripting environment is based on .NET (or .NET Core, depending on the Unity version used).
- C# programming language – C# is Unity's primary scripting language, commonly used for logic, interactions, and gameplay mechanics (see the short script after this list). With C#, developers can create and manage:
  - Well-designed UI, menus, buttons, and game character behavior
  - HUD elements (a corner mini-map, health items, tips for the player, etc.)
  - AI behaviors for many entities, main characters, NPCs (non-player characters), and more
- 3D modelling and animation – A Unity developer doesn't strictly need to be a 3D artist, but they do need a basic understanding of texturing, animation principles, 3D modeling, and similar topics. Familiarity with Maya or Blender is also beneficial.
- Mathematics and vector calculations – The developer should understand trigonometry, algebra, matrix transformations, and vector operations to ensure optimized and accurate game-world mechanics and dynamics.
- Significant experience in Unity game development – A minimum of three years is a good starting point for an aspiring mid-level developer; for a senior, look for more than five years of experience in the field.
- Knowledge of rendering performance – Including GPU (Graphics Processing Unit) and CPU (Central Processing Unit) behavior, game profiling, and optimization of RAM (Random Access Memory) usage, build size, and FPS (Frames Per Second).
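As a quick illustration of the C# skill above, here is a minimal sketch (the component and its names are hypothetical, not from any specific project) of the kind of script a Unity developer writes daily – a component attached to a game object that drives its behavior every frame:

```csharp
using UnityEngine;

// Hypothetical example component: moves its GameObject forward continuously
// and logs a message when the player presses the space bar.
public class ForwardMover : MonoBehaviour
{
    public float speed = 2.0f; // units per second, tunable in the Inspector

    void Update()
    {
        // Scale by Time.deltaTime so movement is frame rate-independent
        transform.Translate(Vector3.forward * speed * Time.deltaTime);

        if (Input.GetKeyDown(KeyCode.Space))
            Debug.Log("Space pressed");
    }
}
```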
Optional technical skills
- Physics and AI programming.
- Shader programming (writing and understanding Unity shaders for improved visual quality).
- Familiarity with VR/AR SDKs (such as Oculus SDK, ARKit, or ARCore).
- Experience with performance optimization (knowing how to optimize apps and games for different platforms).
- Version control (such as Git, to track and manage code efficiently).
- SOLID principles, OOP (object-oriented programming), and design patterns – These concepts provide an efficient and well-structured approach to game design.
- Knowledge of Unity's networking features or third-party tools (such as Photon for multiplayer settings).
- Knowledge of cross-platform development for deployment to numerous platforms (understanding the nuances of console, PC, Android, and iOS is beneficial).
- Good knowledge of graphics APIs, because Unity is compatible with multiple graphics APIs (such as DirectX 11, DirectX 12, OpenGL, and Vulkan).
- UX (user experience) design process – UX elements make a game an intuitive, memorable, and enjoyable experience. Developers need a good grasp of UX for:
  - Player-centric design that caters to the player's behavior.
  - Intuitive interaction through easy-to-understand controls and gameplay mechanics.
  - Clear, concise tutorials and onboarding for players.
  - Accessibility and inclusivity for a broad audience (including people with disabilities).
  - Collecting user feedback to improve the UX over time.
Checklist of preferred requirements
Some of the following are good to have but not mandatory:
- A degree in Computer Science or Game Development (as a foundation for principles, algorithms, and similar)
- Experience with a source code control system (SCCS)
- Knowledge of Game Design Principles
- Experience with other game engines (e.g., Unreal Engine, Godot, or CryEngine)
- Graphics programming experience (graphics libraries and APIs: OpenGL, Vulkan, DirectX 11, or DirectX 12)
- Knowledge of continuous integration and deployment (familiarity with CI/CD tools)
Interview questions and answers
Use the following questions to assess a Unity developer's knowledge:
1. Can you explain the primary functions of scenes, game objects, components, and scripts in Unity, and how they interact with one another?
Example answer:
- Scenes are distinct areas of the game: each one holds all the game objects, cameras, and environments for that specific area.
- Game objects are the scenery, props, and characters. They have no behavior of their own; they serve as containers that group components.
- Components define the specific properties and behaviors of a particular game object; they can be colliders, renderers, or custom scripts that shape the object's behavior.
- Scripts are custom components written in C#. We attach them to a game object to control its behavior or to interact with other components and game objects within a scene.
Scenes set the stage, game objects populate the stage, components define properties and behaviors for game objects, and scripts enable developers to make custom interactions and behaviors.
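To make the relationship concrete, here is a brief sketch (purely illustrative; the names are hypothetical) showing how these pieces meet in code:

```csharp
using UnityEngine;

// Illustrative sketch: a game object is created inside the active scene,
// built-in components are attached to it, and this script itself is
// just another component on whatever object it is placed on.
public class SceneSetupExample : MonoBehaviour
{
    void Start()
    {
        GameObject prop = new GameObject("Crate"); // empty container in the active scene
        prop.AddComponent<BoxCollider>();          // component: physical shape
        prop.AddComponent<MeshRenderer>();         // component: rendering behavior

        Debug.Log(prop.scene.name);                // the scene that owns the object
    }
}
```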
2. Can you describe the Unity Prefab system?
Example answer: Unity Prefabs are reusable game objects that can be pre-configured and saved for repeated use across many scenes. This makes prefabs perfect for objects that appear multiple times. Each instance also keeps a link to the original prefab asset, which allows batch updates across all instances.
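A strong candidate can back this up with a line or two of code. A minimal sketch (the 'enemyPrefab' field and class name are hypothetical) of spawning prefab instances at runtime:

```csharp
using UnityEngine;

// Hypothetical spawner: "enemyPrefab" is a prefab asset assigned in the Inspector.
public class EnemySpawner : MonoBehaviour
{
    public GameObject enemyPrefab;

    void Start()
    {
        // Each instance still links back to the prefab asset,
        // so edits to the asset propagate to every copy.
        for (int i = 0; i < 3; i++)
        {
            Instantiate(enemyPrefab, new Vector3(i * 2f, 0f, 0f), Quaternion.identity);
        }
    }
}
```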
3. How would you handle UI scaling for different resolutions in Unity?
Example answer: Unity provides the Canvas Scaler as a component on the Canvas object, and we use it to ensure UI elements scale proportionally across different aspect ratios and screen resolutions. We do this by setting the UI Scale Mode to "Scale With Screen Size".
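These settings are usually configured in the Inspector, but they can also be set from code. A minimal sketch (class name hypothetical, reference resolution chosen arbitrarily):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical setup script placed on the Canvas object.
[RequireComponent(typeof(CanvasScaler))]
public class UIScalingSetup : MonoBehaviour
{
    void Awake()
    {
        var scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1920, 1080);
        scaler.matchWidthOrHeight = 0.5f; // blend: 0 = match width, 1 = match height
    }
}
```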
4. Can you explain what Unity Coroutines are?
Example answer: Coroutines make it easy to spread sequential actions across multiple frames. They are functions that can pause execution and return control to Unity, then resume from where they left off on a following frame.
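A tiny sketch of the idea (hypothetical example, using the real 'WaitForSeconds' yield instruction):

```csharp
using System.Collections;
using UnityEngine;

public class CountdownExample : MonoBehaviour
{
    void Start()
    {
        StartCoroutine(Countdown());
    }

    IEnumerator Countdown()
    {
        for (int i = 3; i > 0; i--)
        {
            Debug.Log(i);
            yield return new WaitForSeconds(1f); // pause here, resume a second later
        }
        Debug.Log("Go!");
    }
}
```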
5. Imagine you use Unity's Coroutine system to write a script and make a game object fade out (reduce its opacity) over a specific duration before it fades back in. The object should have a 'SpriteRenderer' component. What is your solution for this?
Example answer: To make a game object with a 'SpriteRenderer' fade in and out, we can adjust its alpha value over time. The Coroutine system is perfect for this kind of time-dependent operation. This is the solution:
- First, we use the 'Color' property on the 'SpriteRenderer' to adjust the alpha value.
- Then, the 'StartFadeCoroutine' function initiates the fade-out and fade-in.
- Within the coroutine, we gradually reduce alpha to 0 (fade out) and then increase it back to 1 (fade in).
```csharp
using UnityEngine;
using System.Collections;

[RequireComponent(typeof(SpriteRenderer))]
public class FadeEffect : MonoBehaviour
{
    private SpriteRenderer spriteRenderer;

    private void Start()
    {
        spriteRenderer = GetComponent<SpriteRenderer>();
        StartCoroutine(StartFadeCoroutine(2.0f)); // fades over 2 seconds as an example
    }

    private IEnumerator StartFadeCoroutine(float duration)
    {
        // Fade out
        for (float t = 0; t < duration; t += Time.deltaTime)
        {
            float alpha = Mathf.Lerp(1, 0, t / duration);
            spriteRenderer.color = new Color(1, 1, 1, alpha);
            yield return null;
        }
        spriteRenderer.color = new Color(1, 1, 1, 0); // Ensure alpha is set to 0 after loop

        // Fade in
        for (float t = 0; t < duration; t += Time.deltaTime)
        {
            float alpha = Mathf.Lerp(0, 1, t / duration);
            spriteRenderer.color = new Color(1, 1, 1, alpha);
            yield return null;
        }
        spriteRenderer.color = new Color(1, 1, 1, 1); // Ensure alpha is set to 1 after loop
    }
}
```
6. Can you elaborate on the functions Start(), Update(), and FixedUpdate() in Unity and differentiate between them?
Example answer: Start(), Update(), and FixedUpdate() are three of the many MonoBehaviour callback methods used to structure game object behavior.
- Start() is executed once in a script's lifespan, before the first frame in which the script is active. We use it to initialize variables, establish connections to other objects, or set initial object states.
- Update() is called once per frame, so its frequency varies with the game's frame rate (e.g., if the game runs at 60 fps, Update() is invoked 60 times per second). We use it for regular updates such as moving non-physics objects, checking input, and updating game logic.
- FixedUpdate(), unlike Update(), runs at a consistent, fixed interval: by default every 0.02 seconds, or 50 times per second, regardless of frame rate (the interval is customizable via Unity's time settings). We use it for physics-based objects, since Unity's physics system (PhysX) updates at a fixed rate. Placing physics-related code in FixedUpdate() keeps simulations consistent and reliable; when working with a Rigidbody (and other physics components), manipulate it in FixedUpdate() rather than Update() to avoid erratic behavior.
Update() and FixedUpdate() both run periodically but differ in execution frequency: Update() is frame rate-dependent and can vary, while FixedUpdate() is consistent and preferable for physics-related operations. Start(), on the other hand, is used only for initial setup before any updates begin.
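A compact sketch showing all three callbacks side by side (hypothetical component; the applied force is arbitrary):

```csharp
using UnityEngine;

[RequireComponent(typeof(Rigidbody))]
public class LifecycleExample : MonoBehaviour
{
    private Rigidbody rb;

    void Start()        // runs once, before the first frame this script is active
    {
        rb = GetComponent<Rigidbody>();
    }

    void Update()       // runs once per rendered frame (frame rate-dependent)
    {
        if (Input.GetKeyDown(KeyCode.Space))
            Debug.Log("Jump pressed");
    }

    void FixedUpdate()  // runs on the fixed timestep (0.02 s by default) – physics goes here
    {
        rb.AddForce(Vector3.forward);
    }
}
```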
7. Imagine you need to move an object on the scene in Unity. How would you do that? If there's more than one way, can you explain each? And what would your code look like if you moved an object located at (0,0,0) to (1,1,1)?
Example answer: In Unity, we can move an object smoothly in several ways.
- Transform.Translate – This method lets us move an object by specifying a direction and magnitude. It is quick and straightforward for simple movements but might not be as smooth as other methods, especially for continuous movement or when the object needs to interact with other objects.
```csharp
void Update() {
    float moveSpeed = 5.0f;
    Vector3 targetPosition = new Vector3(1, 1, 1);
    if (transform.position != targetPosition) {
        Vector3 moveDirection = (targetPosition - transform.position).normalized;
        // Space.World keeps the direction in world space regardless of the object's
        // rotation; note this simple version can overshoot the target on the final frame.
        transform.Translate(moveDirection * moveSpeed * Time.deltaTime, Space.World);
    }
}
```
- Vector3.Lerp – Lerp stands for "linear interpolation," and with this method, we transition an object's position from one point to another. We use this for gradual, smooth movements.
```csharp
private Vector3 startPoint = new Vector3(0, 0, 0);
private Vector3 endPoint = new Vector3(1, 1, 1);
private float lerpTime = 0;
private float duration = 2.0f; // time taken to move from start to end

void Update() {
    lerpTime += Time.deltaTime / duration;
    transform.position = Vector3.Lerp(startPoint, endPoint, lerpTime);
}
```
- Rigidbody – We use this when dealing with physics-based movement or when we need collision detection. The Rigidbody component, together with 'MovePosition' or applied forces, is the go-to choice here; this way, Unity's physics engine handles the movement and keeps it realistic.
```csharp
private Rigidbody rb;
public float moveSpeed = 5.0f;
private Vector3 targetPosition = new Vector3(1, 1, 1);

void Start() {
    rb = GetComponent<Rigidbody>();
}

void FixedUpdate() {
    if (rb.position != targetPosition) {
        Vector3 moveDirection = (targetPosition - rb.position).normalized;
        rb.MovePosition(rb.position + moveDirection * moveSpeed * Time.fixedDeltaTime);
    }
}
```
- CharacterController.Move – This is for character movement; the method handles collisions for you (gravity still has to be applied manually), which makes it ideal for NPCs and player characters.
```csharp
private CharacterController controller;
public float speed = 5.0f;
private Vector3 targetPosition = new Vector3(1, 1, 1);

void Start() {
    controller = GetComponent<CharacterController>();
}

void Update() {
    if (transform.position != targetPosition) {
        Vector3 moveDirection = (targetPosition - transform.position).normalized;
        controller.Move(moveDirection * speed * Time.deltaTime);
    }
}
```
- Animations and Tweens – We can set up an animation or use tweening libraries such as DOTween for predefined movements or paths. This is used for NPCs or in-game events that need choreographed, specific movements.
```csharp
using DG.Tweening;

private Vector3 targetPosition = new Vector3(1, 1, 1);

void Start() {
    transform.DOMove(targetPosition, 2.0f); // moves to target in 2 seconds
}
```
8. Imagine you made an app in Unity that works well on a PC. But, after switching the target platform to Android, the app crashed upon launch on a mobile device, yet it kept working well on a PC. How would you debug this?
Example answer: Debugging cross-platform issues when transitioning from PC to a mobile platform like Android is tricky. But with a systematic approach, it's possible to resolve it quickly.
- Unity console & Logcat – Check the console for errors or warnings first. If the issue is still unclear, use Android's Logcat tool, which captures real-time log output from the device, to get more information about the crash (see the on-device logging sketch after this list).
- Build settings – Check that the Unity project is correctly set up for Android development, including permissions, API levels, and other Android-specific settings.
- Platform-specific APIs – Make sure no platform-specific functionality or API is used without a platform check; some features available on PC are not available on Android.
- Memory & performance – Mobile devices usually have less processing power and memory than PCs. Check whether the app consumes too much memory or CPU, which can lead to crashes.
- Shader & graphics issues – Not all graphics settings and shaders built for PC work on Android devices. Check that shaders are compatible with mobile GPUs and consider mobile-specific variants if needed.
- Third-party assets or plugins – These must be compatible with Android; some assets are designed exclusively for PC.
- Dependencies & SDKs – Ensure all SDKs and libraries are compatible with Android and correctly set up, and that the JDK and Android SDK versions are up to date.
- Testing on multiple devices – Issues can be specific to one device due to variations in hardware and software, so test the app on a range of Android devices.
- Profiling in Unity – Use the built-in profiler to check for performance spikes or issues while the app runs on Android.
- Iterative debugging – As a last resort, disable parts of the game one by one to isolate the core problem.
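As a lightweight companion to the first step, runtime logs (including uncaught exceptions) can also be surfaced on the device itself. This is a minimal sketch, not part of the original answer, and the class name is hypothetical:

```csharp
using UnityEngine;

// Hypothetical helper: subscribes to Unity's log callback and draws the
// most recent error or exception on screen, so a crash on a test device
// can be inspected without a cable and Logcat.
public class MobileLogOverlay : MonoBehaviour
{
    private string lastError = "";

    void OnEnable()  { Application.logMessageReceived += HandleLog; }
    void OnDisable() { Application.logMessageReceived -= HandleLog; }

    void HandleLog(string condition, string stackTrace, LogType type)
    {
        if (type == LogType.Exception || type == LogType.Error)
            lastError = condition + "\n" + stackTrace;
    }

    void OnGUI()
    {
        // Simple immediate-mode overlay; suitable for debug builds only
        GUI.Label(new Rect(10, 10, Screen.width - 20, Screen.height - 20), lastError);
    }
}
```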
9. Can you describe the Unity animation states and how to transition between them?
Example answer: Animation states are the individual motions or animations that an object or character performs; for example, a character can have "walk", "idle", "jump", "run", or "attack" states. We manage these states through the Animator Controller in Unity, a tool for setting up, previewing, and controlling animations. Transitions define how an animation progresses from one state to another, and in the Animator Controller we draw arrows between the states. We control these transitions with parameters (variables) that the system evaluates to decide where to move next. A state change can be driven by parameters (float, int, bool, trigger), by direct scripting (SetBool(), SetFloat(), SetInteger(), and SetTrigger()), or by Blend Trees (which blend multiple animations based on the value of one or more parameters).
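A short sketch of driving transitions from script (hypothetical component; it assumes the Animator Controller defines a 'Speed' float and a 'Jump' trigger parameter):

```csharp
using UnityEngine;

[RequireComponent(typeof(Animator))]
public class PlayerAnimationDriver : MonoBehaviour
{
    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Drives e.g. an idle -> walk -> run Blend Tree via the "Speed" parameter
        animator.SetFloat("Speed", Mathf.Abs(Input.GetAxis("Vertical")));

        // Fires the transition guarded by the "Jump" trigger
        if (Input.GetKeyDown(KeyCode.Space))
            animator.SetTrigger("Jump");
    }
}
```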
10. How would you implement a camera movement controller suitable for a third-person game?
Example answer: For a third-person game, the camera follows the player and allows for camera rotation around the character with a 360-degree view. The considerations for such a system include:
- Distance from the player – Keep an adjustable or fixed distance from the player character to maintain a clear view.
- Vertical and horizontal rotation – Allow the player to rotate the camera horizontally and vertically with the thumbstick or mouse.
- Collision detection – The camera shouldn't intersect with other objects in the game.
- Smooth movement – Ensure the camera adjustments and movement are smooth to provide the best player experience.
A basic implementation would look like this:
```csharp
using UnityEngine;

public class ThirdPersonCameraController : MonoBehaviour
{
    public Transform playerTarget;
    public float distanceFromTarget = 5.0f;
    public Vector2 pitchMinMax = new Vector2(-40, 85);
    public float rotationSpeed = 10;

    private float yaw;
    private float pitch;

    void Update()
    {
        // Get mouse input
        yaw += Input.GetAxis("Mouse X") * rotationSpeed;
        pitch -= Input.GetAxis("Mouse Y") * rotationSpeed;

        // Clamp the vertical rotation
        pitch = Mathf.Clamp(pitch, pitchMinMax.x, pitchMinMax.y);

        // Apply the rotation to the camera
        transform.eulerAngles = new Vector3(pitch, yaw);

        // Collision detection (simple approach): start from the full follow
        // distance each frame and pull the camera in only if something blocks it
        float currentDistance = distanceFromTarget;
        Vector3 desiredPosition = playerTarget.position - transform.forward * distanceFromTarget;
        RaycastHit hit;
        if (Physics.Linecast(playerTarget.position, desiredPosition, out hit))
        {
            currentDistance = Mathf.Clamp(hit.distance, 0.5f, distanceFromTarget);
        }

        // Set the camera position
        transform.position = playerTarget.position - transform.forward * currentDistance;
    }
}
```
Recognizing and selecting the best Unity developer
There will be some differences between a good and great Unity developer, but also many similarities. One way to know you are dealing with a great developer is their excellent technical test performance. Daniz advises paying attention to the following:
"A great Unity developer has stellar technical proficiency, especially with advanced scripting, optimization, APIs, and shader programming. They will have more years of experience and the versatility of past engagements. Their problem-solving should be superb, and they pay great attention to even the tiniest details. They are also renowned specialists in certain areas, such as AR/VR, AI, or graphic optimization."
As important as technical skills are, evaluating a developer goes further. Daniz reiterates that a great Unity developer is never satisfied with the status quo of their game development knowledge; they upskill continuously, are receptive to feedback, and adapt readily to varied workflow requests.
Possible challenges during the hiring of a Unity developer
As with any hiring process, these are the common challenges that you could expect and prepare for ahead of time:
- No roadmap – It's almost impossible to conduct a successful hiring process without a plan or roadmap. Organize everything in each hiring stage ahead of time, and have interview questions (and answers) ready.
- Budget restrictions – Ensure you have enough budget for recruiters, hiring managers, the new developer's compensation, and the overall process. Keep in mind that the process can drag on, which costs more money.
- Hard-to-find qualified candidates – A shortage of skilled developers is not uncommon; the best tech experts are often already hired or expect higher compensation than you currently offer for the role. A quick and practical solution is to rely on services that find and hire vetted developers for you in a matter of days.
Industries and uses of Unity
Unity is mainly used for creating games, but it also has many creative and versatile uses in other industries, says Daniz.
“Unity is used for film and animation, automotive, transportation and manufacturing, and education. Unity is heavily used in architecture, engineering, construction, VR and AR, marketing, e-commerce, and retail. Not surprisingly, the military, defense, and aerospace industries rely on Unity for various training simulations in risk-free environments.”
What can businesses make with Unity?
Generate digital replicas for testing and research of IoT products
A digital twin is a virtual version of a product or object that mimics the actual properties of that object. Digital twins are typically used for product monitoring and diagnostics to optimize performance. Creating one gives you a safe testing environment in which the real object cannot be damaged.
Consumers also appreciate personalized items, especially when they can try them before purchasing. Businesses that rely on customized products can use Unity to create a memorable customer experience: it supports real-time iteration, fast prototyping, and building web, VR, and AR configurators for multiple platforms (macOS and Windows, for example).
Make a 3D virtual solution for marketing and sales presentations
You can advertise any product, no matter how complex, through practical, interactive presentations that encourage customers to browse your products and designs.
Produce artificial datasets for training machine learning algorithms
Data can be hard to procure and manage. Machine learning, predictive modeling, and statistical analysis can reveal the patterns business leaders need, and synthetic datasets can reinvent data solutions by reducing the need for expensive data procurement. With artificial datasets, you can test products more efficiently and train machine learning models more accurately, simplifying the connection between algorithms and data.
Make a VR training for onboarding purposes
With work environments now predominantly remote, onboarding is conducted remotely too. Including VR training in onboarding improves the team member experience and helps new hires retain what they see much better.
Create an HMI (Human-Machine Interface) for products and machines
An HMI is hardware or software with a visual interface that we use to control and communicate with a system, device, or machine. A well-built HMI makes many operations more transparent, giving real-time access to machine and product data through highly intuitive consoles.
Streamline space arrangements for events and manufacturing
Many space-planning tools can be built with Unity, allowing for 3D renderings, space design, and similar tasks. A virtual showroom, cocktail room, or gala dinner layout, for example, is helpful for presenting your products without customers being physically present.
Utilize AR for construction
Imagine bringing a building or other structure to life before physical work begins. This is entirely possible today, especially with Unity's cloud-based VisualLive software, used for stunning 3D visualizations on site. In this context, it also helps maintain critical data (such as materials, textures, and similar architecture-related concerns).
Create various games
The great thing about Unity is that it can be used for numerous kinds of games of any genre or complexity and is suitable for mobile and PC.
These are almost all the leading gaming genres out there, and Unity covers them all:
- Card games
- Action-adventure
- First-person shooter
- Arcade
- Puzzles
- Quests
- RPG (role-playing game)
- Sports simulator
- RTS (real-time strategy)
- City building simulation
- Action roguelike
- Sandbox
Business benefits of using Unity
"The best part about Unity is the unparalleled accessibility and versatility. It is straightforward to use, whether you're a novice or a seasoned developer."
- AR/VR ready – Unity's native support for AR and VR is unparalleled, especially in marketing, training simulations, and gaming, and it puts businesses at the forefront of the shift toward these technologies.
- Scalability – Whether you create a simple 2D game, a unique VR experience, or an interactive architectural visualization, Unity scales to meet your needs.
- Cross-platform development – One standout feature is deployment across platforms, from desktop to mobile, game consoles, and VR/AR headsets. With just one codebase, you can reach a varied audience.
- Rich ecosystem – The Unity Asset Store offers a wide range of tools, resources, and assets, backed by a vast online community of Unity developers.
- Cost-effective – Unity offers competitive pricing models and a free version, which is attractive to startups and established businesses alike.
- Regular updates & cutting-edge features – Unity is constantly evolving and innovating, ensuring businesses and developers have access to the latest advancements in game tech.
- Adaptable for non-gaming solutions – Beyond games, Unity serves many entirely different industries.