In-Game Controls
LMB: Fire Weapon
1: Select Weapon One
2: Select Weapon Two
ESC: Open System Menu (you can change graphics settings, and tint your robot)
Gameplay Notes: When you lose limbs, you lose the corresponding functionality; losing a leg, for example, makes you move more slowly. Remember that the game runs on a single server in US East and is graphically intense. It requires far better hardware than the HeroEngine minimum spec, and if you are geographically distant from Washington, DC, your connection may be poor.
As a proof-of-concept, the FPS Demo illustrates several key features of first-person shooters and offers simple example implementations of each. Iconic in FPS games are a first-person camera, an on-screen HUD, the concept of a leaderboard, and the skill associated both with managing resources (ammo, health, etc.) and with lithely slaughtering opponents using high-precision weaponry. We designed around these features while adding artistic and creative touches like a dynamic day/night cycle, dynamic character part damage, several custom FX, and a power-up/pick-up system.
- Smooth first-person camera that accurately tracks to a camera bone on the character's rig while facing in the player's chosen direction
- Fast-paced, client-driven gameplay that leverages replication to keep players synchronized with one another
- Multiple weapons, each with its own specified range, damage and ammo/clip capacity
- Dynamic characters whose parts can be damaged/destroyed (from a game-play perspective) and tinted/customized (from an art perspective)
- A robust match and event system to manage entering/leaving matches and connecting producers of events with their consumers
Advanced Customizable Character Controller
A key feature of any FPS is how the character moves around in the world. The default HeroEngine ACCC is set up well for an MMORPG character but not for an FPS character. The first step toward that goal was developing a new character model and animation set. Then we overrode certain server scripts to make sure characters start with an FPS character controller rather than the default one.
The character controller then needed to turn inputs from the user into specific animations and behaviors for the new character model. The basic ACCC was gutted of most of its features to streamline it for our needs. In some cases this just meant deleting large sections of code that were not relevant to an FPS; for example, the swimming logic was taken out. There is no water in our game design, so there is no need for the character controller to know how to swim. In other cases, code needed to be added for new behaviors. One addition was tracking which limbs the character is missing. The major hook of our game is that missing limbs make your character control differently, so the ACCC needed to keep track of which ones you had or did not.
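The limb-tracking idea can be sketched as follows. This is a minimal Python illustration, not the demo's actual HSL scripts; the limb names, base speed, and penalty values are all invented for the example.

```python
# Illustrative sketch of a controller tracking missing limbs and
# deriving a movement-speed modifier from them. All names and numbers
# here are hypothetical, not taken from the FPS Demo scripts.

BASE_SPEED = 6.0  # arbitrary base speed for the example

# Fraction of speed lost while the named limb is missing
LIMB_PENALTIES = {
    "left_leg": 0.5,
    "right_leg": 0.5,
    "left_arm": 0.0,   # arms affect attacks, not movement speed
    "right_arm": 0.0,
}

class LimbTracker:
    def __init__(self):
        self.missing = set()

    def lose_limb(self, limb):
        if limb in LIMB_PENALTIES:
            self.missing.add(limb)

    def restore_all(self):
        self.missing.clear()

    def move_speed(self):
        # Multiply penalties so losing both legs is worse than one
        speed = BASE_SPEED
        for limb in self.missing:
            speed *= (1.0 - LIMB_PENALTIES[limb])
        return speed
```

The controller queries `move_speed()` each tick instead of baking the penalty into game logic, keeping the "which limbs are missing" state in one place.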
Another major concept the character controller needed to handle was that character rotation would no longer be dictated by the direction the character was moving. With the HeroEngine default controller, depending on where your camera is facing, your character will animate and turn toward that direction. In an FPS, movement almost never controls where you are aiming. This took handling of the character rotation out of the controller's hands.
In most cases you do not want the ACCC to be involved in game logic. The ACCC should sit between your game logic and animating the character: it should react to commands from your game logic and interpret all the different signals it receives into one final action. There are cases where the FPS ACCC does not do this. While it technically works, it means there is tight coupling between different game systems and the character controller.
Server - Client Communication
In order to maximize responsiveness to player input, each client is the authoritative source for its own data. A client is responsible for damaging itself, for updating its own ammo/clip-count and for determining whether it has hit (or not hit) other players or bots in the game. Note that - while the traditional server-authoritative design has been replaced with a client-authoritative model - we still have several options to enforce security and correctness of data. We can use third-party software to ensure the integrity of the client-side game executable, and we can verify the integrity of events generated from the client by running them through a verification step on the server prior to delivering them to other players.
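The server-side verification step mentioned above might look something like the following sketch. This is plain Python with invented field names and thresholds, not the demo's actual code; the idea is simply that the server sanity-checks a client-authoritative event before relaying it to other players.

```python
import time

# Hypothetical plausibility checks for a client-authoritative attack
# event. Field names ("player_id", "distance") and limits are invented.

MAX_FIRE_RATE = 10.0        # shots/second no weapon spec exceeds
MAX_WEAPON_RANGE = 100.0    # longest range of any weapon spec

_last_shot = {}             # player id -> timestamp of last accepted shot

def verify_attack_event(event, now=None):
    """Return True if the event looks plausible, False to drop it."""
    now = now if now is not None else time.time()
    # Reject hits beyond any weapon's possible range
    if event["distance"] > MAX_WEAPON_RANGE:
        return False
    # Reject fire rates no weapon can achieve
    last = _last_shot.get(event["player_id"])
    if last is not None and (now - last) < 1.0 / MAX_FIRE_RATE:
        return False
    _last_shot[event["player_id"]] = now
    return True
```

Events that fail verification are simply dropped rather than delivered, so a tampered client can hurt only itself.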
In order to keep players synchronized and well-informed about the state of the game, we leverage both replication and reverse replication to propagate data. Reverse replication is used to deliver client-authoritative data from the source client to the server, while replication is used to deliver bot data to clients as well as to deliver reverse-replicated data from the server to all other players. In the end, each client in a match will have replicated to it a set of player nodes and player event nodes which can accurately convey all state and action data required to simulate the match. Responses to events delivered this way occur on an individual client and can range from 'taking damage' to 'updating a GUI'.
Also worth noting is that replication - unlike remote call traffic - can leverage a more robust set of features that can, for example, prioritize certain types of data ahead of (or behind) others (positional data over chat messages, for instance). Bandwidth shaping is also supported, and responding to replication events is conceptually far easier for a developer to wrap their mind around than manually managing remote calls.
Finally, there are several explicit remote calls made from the server to the client (and vice-versa). Registering with the match/game is initiated by a remote call from the client, while delivering certain high-priority data out-of-band from the server is also delivered via remote call.
Weapons and Attacking
Weapons in FPS games can have many different behaviors and properties, which is why weapon objects are created from the Spec System: it allows the fundamental behavior of a weapon to be varied and handles the creation of these complex objects. Weapons may contain many properties, but most of them are immutable, which fits well with the Spec System.
Weapons are created as non-persistent nodes and added to a player's Weapon Inventory, which is replicated to the client. The fields on a Weapon are set for reverse replication which makes the client authoritative over the behavior of the Weapon.
There are two types of Weapons: Ranged and Melee. The way they attack is what separates the two. Both types use client-side raycasting to determine which nodes were hit by an attack.
All raycasting used for weapon attacks uses the external function Raycast3D. Raycast3D is used because the weapon may not always raycast in a direction perpendicular to the viewport, and because it can return the shape name of the dynamic character node that was hit. The start position of the raycast is the active camera node's position, the direction of the ray is determined by the camera node's rotation, and the length of the ray is determined by the Weapon Spec.
The Ranged weapon does a raycast from the camera node position in a direction perpendicular to the viewport. If neither a Player nor a Non-Player Character is hit, the weapon performs a few more raycasts. These start from the camera node position and are randomly rotated slightly away from the viewport direction. If none of the additional raycasts hits a Player or Non-Player Character, the result of the first raycast is returned. This gives the weapon a spread behavior to help in hitting the target.
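The retry-with-spread logic can be sketched like this. It is a plain Python stand-in, not HSL: `raycast` here is a placeholder for HeroEngine's Raycast3D, and the cone angle, ray count, and hit-record shape are all assumptions.

```python
import math, random

# Spread behavior: a primary ray straight out of the camera, then a few
# slightly perturbed retries if no character was hit. 2D math for
# brevity; the real system perturbs a 3D direction.

SPREAD_RADIANS = math.radians(2.0)  # hypothetical cone half-angle
EXTRA_RAYS = 4

def fire_ranged(origin, direction, weapon_range, raycast):
    """raycast(origin, direction, length) -> hit record or None."""
    first_hit = raycast(origin, direction, weapon_range)
    if is_character(first_hit):
        return first_hit
    for _ in range(EXTRA_RAYS):
        jittered = jitter(direction, SPREAD_RADIANS)
        hit = raycast(origin, jittered, weapon_range)
        if is_character(hit):
            return hit
    # No character hit by any ray: report the original result
    return first_hit

def is_character(hit):
    return hit is not None and hit.get("kind") in ("player", "npc")

def jitter(direction, max_angle):
    # Rotate a 2D unit direction by a small random angle
    angle = random.uniform(-max_angle, max_angle)
    x, y = direction
    return (x * math.cos(angle) - y * math.sin(angle),
            x * math.sin(angle) + y * math.cos(angle))
```

Returning the first ray's result when every retry misses keeps terrain impact effects anchored where the player actually aimed.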
The Melee weapon does a raycast from the camera node position in a direction perpendicular to the viewport. If neither a Player nor a Non-Player Character is hit, the weapon performs a few more raycasts in an arc from left to right, rotating the initial raycast direction about the camera's y-axis by a specific amount for each raycast. This behavior simulates the physical swing of the melee weapon.
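Generating the melee sweep directions is a small deterministic calculation, sketched below in yaw-only 2D (a Python illustration; the arc width and ray count are invented, and the real system rotates a 3D direction about the camera's y-axis).

```python
import math

# Melee sweep: rotate the initial view direction about the vertical
# axis, stepping evenly from left to right across the arc.

def arc_directions(forward, arc_degrees=60.0, rays=5):
    """Yield `rays` unit directions sweeping left-to-right across the arc."""
    half = math.radians(arc_degrees) / 2.0
    step = math.radians(arc_degrees) / (rays - 1)
    x, y = forward
    for i in range(rays):
        a = -half + i * step
        yield (x * math.cos(a) - y * math.sin(a),
               x * math.sin(a) + y * math.cos(a))
```

Each yielded direction would feed one Raycast3D call; the middle ray is exactly the view direction, so a centered target is always covered.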
In order to provide a single-player demo experience consistent with that of a live multi-player game, we decided to implement a bare-bones 'AI bots' system to simulate real players. These bots use state-based AI to flip in-and-out of 'wander' and 'opportunity fire' behaviors and grossly (but not intelligently) approximate the actions of a real player.
These AI entities leverage the same event and match systems that human players make use of during the course of a match. Match join events, attack events, got hit events, respawn events, etc -- all are generated and responded to in the same way as human players', with event objects replicating down to human players' clients (to be interpreted and responded to on a per-player basis) and outside events being listened-for and interpreted on the server.
The 'wander' state makes use of the server-side pathfinding capabilities of HeroEngine and - by querying for paths along the area's navmesh - wanders along complex paths toward an arbitrary point within some radius R of the bot. Once the bot arrives at the destination, the 'opportunity fire' state is pushed onto the stack, temporarily disabling the wander state.
The 'opportunity fire' state operates by selecting a player at random in the arena and firing at them. If there's no line-of-sight (LoS) between the bot and its target, its attacks will harmlessly ping the terrain or an object in the arena; however, in the case of valid LoS, the attack event generated by the bot will be interpreted by each client as a 'hit' and will deal damage to the target. Then, after firing (and either hitting or not hitting a target), the bot will pop its 'opportunity fire' state off the stack and the 'wander' state will resume. This process repeats infinitely until the bot is either disabled or destroyed.
When a bot is destroyed, all AI states are popped (rendering the AI immobile) and the bot's death animation plays. After several seconds, the bot 'respawns' in a new location and acquires new instances of the wander and opportunity fire AI states.
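The push/pop behavior described above amounts to a stack-based state machine, sketched here in Python (a simplified stand-in for the server scripts; the step counts and state names are illustrative only).

```python
# Stack-based bot AI: the top state is active. 'Wander' runs until it
# reaches its destination, then pushes 'OpportunityFire', which acts
# once and pops itself so wandering resumes.

class Bot:
    def __init__(self):
        self.stack = []          # top of stack = active state
        self.log = []            # actions taken, for illustration

    def push(self, state):
        self.stack.append(state)

    def pop(self):
        return self.stack.pop()

    def tick(self):
        if self.stack:
            self.stack[-1].update(self)

class Wander:
    def __init__(self, steps_to_destination):
        self.steps_left = steps_to_destination

    def update(self, bot):
        if self.steps_left > 0:
            self.steps_left -= 1
            bot.log.append("move")
        else:
            # Arrived: opportunity fire temporarily takes over
            bot.push(OpportunityFire())
            self.steps_left = 2  # "pick a new destination" (hypothetical)

class OpportunityFire:
    def update(self, bot):
        bot.log.append("fire")
        bot.pop()               # one shot, then resume wandering
```

Destroying a bot maps to clearing the stack (leaving it with no active state), and respawning maps to pushing fresh state instances.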
The same character controller used for player (client-controlled) characters is used for bot (server-controlled) characters.
Server Scripts Involved:
- FPS_EventNPC... suite of class methods scripts
Client Scripts Involved:
- FPS_EventNPC... suite of class methods scripts
Modern FPS games track stats for many different actions, ranging from bullets fired and accuracy to steps taken and headshots - the list goes on. For simplicity, in this implementation stats are not persisted and we only track kills and deaths. We centralized stat tracking into a system node that receives updates on the server for players' various actions. The stats system could easily be extended to include more stats via the observer/listener pattern and additional fields.
The way it works right now: when a player joins the match, they also register themselves with the stats system. Once registered, their account ID is added to a lookup list on a stat node on the server. The stat node is tracked by the stats system and replicated to all clients registered with it. When updates to the node happen, the stats system is notified and sends out messages to any systems that care about the update. Most notably, the leaderboard on the client tracks stats and displays the proper information in real time.
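The observer/listener shape of this system can be sketched as follows. This is an illustrative Python model, not the demo's scripts; the class and callback names are invented.

```python
# A central stats node with registered listeners: every recorded stat
# change notifies all listeners, of which the leaderboard is just one.

class StatsSystem:
    def __init__(self):
        self.stats = {}        # account_id -> {"kills": n, "deaths": n}
        self.listeners = []

    def register_player(self, account_id):
        self.stats[account_id] = {"kills": 0, "deaths": 0}

    def add_listener(self, callback):
        self.listeners.append(callback)

    def record(self, account_id, stat):
        self.stats[account_id][stat] += 1
        for callback in self.listeners:
            callback(account_id, stat, self.stats[account_id][stat])

# The leaderboard as one listener among potentially many
leaderboard = []
def on_stat_change(account_id, stat, value):
    leaderboard.append((account_id, stat, value))
```

Extending the system to new stats means adding a field and letting interested systems register their own callbacks, with no changes to existing consumers.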
The main drawback of the current implementation is that stats are not persisted. Stats are tracked for the duration of a player's time in a match and discarded if the player ever leaves the game. To keep stats permanently, the server would need to write files to the repository or create an arbitrary root stat node associated with the account root node. Since a system like that was beyond the scope of our demo, we kept it simple.
The camera in any FPS needs to feel smooth and responsive to allow for good accuracy. The rotation of the camera was kept very basic: as the player moves the mouse, the camera rotates based on the user's mouse sensitivity settings. This makes the camera feel very responsive, but it means small corrections in aim can be hard to make. Some form of mouse acceleration would be needed to combat lower-DPI mice and inaccuracies in mouse movement.
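The basic sensitivity-scaled rotation can be sketched in a few lines. This is a generic mouse-look formula in Python, not the demo's HSL; the sensitivity value and pitch limit are arbitrary.

```python
# Minimal mouse-look: scale raw mouse deltas by a sensitivity
# multiplier, wrap yaw, and clamp pitch so the camera can't flip over.

PITCH_LIMIT = 89.0  # degrees, hypothetical clamp

def apply_mouse_delta(yaw, pitch, dx, dy, sensitivity=0.1):
    yaw = (yaw + dx * sensitivity) % 360.0
    pitch = max(-PITCH_LIMIT, min(PITCH_LIMIT, pitch - dy * sensitivity))
    return yaw, pitch
```

Because the mapping is purely linear, tiny corrections require tiny physical movements, which is exactly why some form of acceleration curve helps on low-DPI mice.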
New command layers were added to the gamekeybindings.ini file in the repository to handle mouse movement. This allowed us to turn mouse camera control on and off as needed. Also, the external function SetIgnoreCursor( ignore as Boolean ) allowed us to lock the mouse to the center of the screen. Without that external function, another method of moving and locking the cursor would have been needed. In most cases SetIgnoreCursor will be what you want to use to keep the mouse in the viewport for an FPS.
Lastly, we added a death cam for when a player dies. This leverages the camera-related external functions to provide a cinematic feel. Again, we kept it simple just to show how a death camera could be done. In its current implementation, the death camera positions itself exactly where the FPS camera is; once positioned, it pans down and moves up to show the player their exploding robot. Without a death cam, the player's view would be locked in the same position as the robot, producing a strange view.
We also made a First_Person_Camera_Tutorial.
Pick-ups allow the player to regain health and ammo while playing. We opted for a simple pick-up system. On the server, it queries for all trigger volumes in the area, then checks every 30 seconds whether a pick-up class is glommed onto each one. If a pick-up class is already glommed on, it does nothing; otherwise it gloms on a new pick-up class. Triggers each have a preset polling period to check whether a player is "inside" them. The polling period on our triggers is set low so that triggers are responsive; based on our characters' movement speed and the triggers' polling rate, it should be impossible to pass through a trigger without it detecting you. Depending on what type of pick-up class is glommed onto the trigger, the callback method for that class is called and performs pick-up-specific behavior.
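The respawn-sweep-plus-trigger-callback flow can be modeled like this. It is a Python sketch of the behavior described above; the `Trigger` class, pick-up names, and consumption logic are all stand-ins for the glom-based server implementation.

```python
import random

# Pick-up spawner sketch: a periodic sweep attaches a pick-up to every
# trigger volume that has none, and a trigger's enter callback grants
# the pick-up to the player and consumes it.

PICKUP_TYPES = ["health", "ammo"]

class Trigger:
    def __init__(self):
        self.pickup = None     # stands in for the glommed pick-up class

    def on_player_enter(self, player):
        if self.pickup is not None:
            player[self.pickup] = player.get(self.pickup, 0) + 1
            self.pickup = None  # consumed; restored on a later sweep

def respawn_sweep(triggers, choose=random.choice):
    # The demo runs this every 30 seconds
    for trigger in triggers:
        if trigger.pickup is None:
            trigger.pickup = choose(PICKUP_TYPES)
```

Injecting `choose` keeps the sweep testable; the real system would pick a pick-up class per trigger and glom it on.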
This is not a robust example of how to do pick-ups because it uses all trigger volumes in a level to generate pick-up locations. A more robust pick-up system would leverage the spatial awareness system in conjunction with the prop system, which would make adding new pick-up spawners easy for world builders. Using trigger volumes instead limits how useful trigger volumes are in a larger game.
We reused old, existing art from our HJ Reference world. We didn't set out to impress anyone with our model making, but the world looks nice and relies on a great deal of graphical features rather than high-resolution textures.
The character itself, the robot, is a dynamic character with different parts and tinting options. We used this to hide parts of the character as limbs are destroyed through gameplay. With tinting, we let the player choose the color of their robot from predefined color schemes selectable in the game's options menu.
Particle Effects and FX Specs
We tried to use one of every "type" of particle effect used in most games, just to provide variety. All of the particle effects exist in the ParticleEffect Area, so you can travel to it and take them apart on your own. The basics of particles are to make an emitter and a particle effect: the emitter puts the particle in the world, and then the particle does its thing. When you start combining emitters that emit other emitters, plus motion and size changes, trails, color effects, meshes, and textures, you can create some pretty amazing effects fairly easily. We then instantiate most of the particles in the world by creating FX entries for them. The FX System also allows you to combine a number of things into one "event": lighting, particles, beams, meshes, etc.
On Firing the Weapons: The HeroEngine Particle System allows you to set up particles to "track" toward a target. Because the bullets move so fast, we decided to have tracer rounds fire relative to the gun in the Z direction and then track to a static point in space. This gives every tracer round an identical feel. You could also pass the "TARGET" location into the FX and have the emitter track to that dynamic point to make things perfectly accurate. We also play an explosion of energy particles at the target location. There's also an expended shell-casing particle, a light that illuminates the area around the muzzle, and two muzzle-flare particles.
Getting Hit and Damage State: We added some simple randomized fire effects, turned into explosions by fading the color from orange to black, the size from small to large, and the alpha from full to none. The FX spec for robots getting hit also includes a separate particle that emits randomly sized modeled meshes of gears and bits of scrap metal. Missing a limb also triggers a "static charge steam" effect that plays from the robot's empty socket.
Power-Ups: The power ups themselves are an easy combination of a png file with text in it, and a rotating mesh particle of a gear. There's a light shining on the ground as well. When the repair event is triggered we also attach a particle effect to the camera that animates a simple blue and pink texture to make it look like electrical static is covering the viewport.
Ambient: You'll notice a ship in the air with a fire trail of particles that don't look exactly right. This is to show what particles can do when they are inheriting the motion of their parents, as well as exhibiting their own motion.
World Building and Level Design
- Dynamic Lighting: We are using a very fast speed of time to illustrate the power of dynamic sky and shadows.
- Dynamic Grass: We kept a simple dry-grass motif using Dynamic Detail in the terrain panel.
- Vortex Nodes: We used vortex nodes, mainly just for fun, to show how versatile they can be in the game world and that they don't have to be used only as vortexes.
Otherwise, we kept the arena simple: a few texture layers and bump maps to prevent texture tiling, and some black vertex color for highlights and object-to-terrain seams. Just for show, we used a mirrored layout so you could imagine a red-versus-blue gameplay experience.