Monday, December 21, 2015

Unreal Engine DevLog #22: Reaction Shots & Prototype Cover System

I've been meaning to write this dev log for quite some time. It's the last update on the XCOM: EU clone prototype that I was working on a while back. I would love to return to it sometime later, but for now, I'm concentrating on making toolkits for the Unreal Engine Marketplace.

The last update on the project was about an LoS based visibility system, which is covered here: http://unrealpossibilities.blogspot.in/2015/07/unreal-engine-devlog-19-line-of-sight.html

Since then, I have implemented Reaction Shots for the player-controlled units, as well as a prototype cover system that displays the cover value of each corner of every tile.

Reaction Shots Preview Video:



Prototype Cover System Preview Video:



So that's going to be the last update on the project for quite some time. It's been a great learning experience, especially as my first project in Unreal Engine 4. Also, special thanks to Knut Overbye for creating one of the best products on the Unreal Engine Marketplace, without which the project wouldn't have made it this far. I've provided a link to his toolkit below. Feel free to check it out: https://www.unrealengine.com/marketplace/advanced-turn-based-tile-toolkit

Saturday, December 19, 2015

Unreal Engine Diaries #10

  • To display the AI Perception range in the editor, go to Editor Preferences >> General >> Gameplay Debugger & tick the 'Perception' parameter. Also in post processing, set AA to 'FXAA' or none to display the debug lines better.

  • In the widget blueprint, select multiple widgets from the Hierarchy panel & then Right click >> 'Wrap with' to wrap the selected widgets within another widget like Canvas Panel, Border, etc.

  • Add a 'Rotating Movement' component to actors to have them rotate automatically. The rotation rate for each axis can be set on the component. This could be used to create interactive objects like weapon pickups in a shooter, or effects like rotating coins in a side-scrolling game.

  • Wrapping widgets in 'Invalidation Boxes' can increase performance, as their contents are cached at creation time and not redrawn every frame like other widgets. This is especially useful when there are lots of static UI elements that do not get updated at run time.

  • The 'Random unit vector in cone' node can be used to get random line trace target locations for creating shotgun spread patterns.
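The cone-sampling idea behind that node can be sketched outside the engine. The following Python is my own reconstruction of the math, not UE's implementation: sample a direction uniformly within a cone around +Z, then rotate it onto the aim direction, and use each sampled vector as a line trace target for one pellet.

```python
import math
import random

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    m = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
    return (v[0] / m, v[1] / m, v[2] / m)

def random_unit_vector_in_cone(direction, half_angle_deg):
    """Sample a random unit vector within a cone around 'direction' (a unit vector)."""
    # Sample around +Z first: cos(angle) uniform in [cos(half_angle), 1]
    cos_t = random.uniform(math.cos(math.radians(half_angle_deg)), 1.0)
    sin_t = math.sqrt(1.0 - cos_t * cos_t)
    phi = random.uniform(0.0, 2.0 * math.pi)
    local = (sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t)
    # Build an orthonormal basis whose third axis is 'direction', then rotate into it
    if abs(direction[2]) < 0.999:
        tangent = normalize(cross((0.0, 0.0, 1.0), direction))
    else:
        tangent = (1.0, 0.0, 0.0)
    bitangent = cross(direction, tangent)
    return tuple(local[0] * tangent[i] + local[1] * bitangent[i] + local[2] * direction[i]
                 for i in range(3))

# Each pellet of a shotgun blast then traces toward muzzle + spread_vector * range
muzzle, trace_range = (0.0, 0.0, 0.0), 10000.0
pellets = [random_unit_vector_in_cone((1.0, 0.0, 0.0), 5.0) for _ in range(8)]
targets = [tuple(muzzle[i] + p[i] * trace_range for i in range(3)) for p in pellets]
```

The half-angle directly controls how tight the spread pattern is, which makes it easy to give each weapon its own value.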

Tuesday, December 15, 2015

Unreal Engine Diaries #9


  • 'Shift + F1' can be used to gain mouse control & jump between the different game instance windows during multiplayer testing in the editor.

  • While working on VR, try to match the size of in game objects to their real life counterparts, as not doing so could make them stand out and reduce the immersion.

  • In the Material Editor details panel, turn on 'Fully Rough' [prevents reflection rendering pipelines from executing] & turn off 'Light Map Directionality' [both under the 'Mobile' category] to make materials that are less expensive to render. This is a pretty good option when dealing with faraway objects in the level that do not require a lot of detail. Setting the Shading Model to 'Unlit' can also increase performance in instances where the additional detail is not required.

  • In PIE mode, press 'Ctrl + Shift + ,' to bring up the GPU Profiler. It would be a good idea to start by looking for elements that cost more than a millisecond.

  • 'Switch has Authority' can be used to determine who is executing the script: the server or the client.

Tuesday, December 8, 2015

Unreal Engine Diaries #8


  • When adding new input parameters to a function that's already called in multiple places throughout the project, it's always better to immediately check every instance of the function call to make sure that the new input parameter is connected as required.

  • Drag & drop a variable from the variables list onto a get/set node of another variable to automatically replace the second variable with the first.

  • When attaching moving physics actors to the player character without physics handles, disable its gravity & set the linear/angular velocities of all of its components to zero in order to have it simulate physics & collision on the move.

  • Under default conditions, when a character changes its direction of movement, it instantaneously turns to face the new direction. To change this behavior and enable smooth rotation based on direction changes, first go to the AI character blueprint >> Character Movement Component >> enable "Orient Rotation to Movement" & set the "Yaw" of the "Rotation Rate" based on how smooth the bot's turning movement has to be. Then, under the default attributes of the blueprint, disable "Use Controller Rotation Yaw", and it should now have smoother turning movements.

  • If you're experiencing automatic brightness changes in your game, you can disable this effect by going to your viewing camera component >> Post process settings >> Auto Exposure >> Set min and max brightness to the same value.
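The smooth-turning behavior from the "Orient Rotation to Movement" tip above amounts to clamping each tick's yaw change by the rotation rate. Here's a minimal stand-alone sketch of that math in Python (my own reconstruction, not engine code):

```python
import math

def turn_toward(current_yaw, target_yaw, yaw_rate_deg_per_sec, dt):
    """Rotate current_yaw toward target_yaw along the shortest arc,
    moving at most yaw_rate_deg_per_sec * dt degrees this tick."""
    # Wrap the difference into the [-180, 180) range so the character
    # never turns the long way around
    delta = (target_yaw - current_yaw + 180.0) % 360.0 - 180.0
    max_step = yaw_rate_deg_per_sec * dt
    if abs(delta) <= max_step:
        return target_yaw
    return current_yaw + math.copysign(max_step, delta)
```

With a Rotation Rate yaw of 180, a 90-degree turn takes half a second; lower values give slower, smoother-looking turns.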

Friday, December 4, 2015

Unreal Engine Diaries #7

  • While working on the Top Down Stealth Toolkit, I noticed that sometimes character animations that worked in PIE mode did not work in Standalone mode. One of the solutions that worked for me was to connect the 'Event Blueprint Update Animation' in all the child anim BPs to their parent update animation events.
  • To find the angle between two rotator variables, it is better not to use normal subtraction to get the difference, as this can give odd results in certain scenarios owing to the fact that rotator values for actor rotation and world rotation follow the (0,180) & (-180,0) range. For example, two rotations just either side of the ±180 boundary are only a few degrees apart, yet naive subtraction reports a difference of nearly 360. To work around this, the 'Delta (Rotator)' node can be used to get the absolute angular difference between the two rotators.
  • When working on Top Down games, the 'Orient rotation to movement' parameter in the character movement component of the player character can be unticked to have it face the direction of mouse cursor instead of the movement direction.
  • The following method can be used to get the dimensions of a landscape actor:
    1. First create a reference to the landscape actor either through the level blueprint or using the 'Get All Actors of Class' function.
    2. Get the landscape actor reference and then use 'Get Actor Bounds' function to get the box extent.
    3. Break the box extent vector into its float values representing the magnitude on each axis, and then multiply each by 2 in order to get the length, width and height of the landscape actor.
  • In the default First Person template, if we do a line trace toward the world space equivalent of the center of the screen, it can be seen that the impact location of the trace and the crosshair location on the screen are offset by a certain amount. This is because the logic used to draw the crosshair on the screen from the HUD class does not take the texture size of the crosshair into account during calculations. To rectify this issue and display the crosshair at the true center, we can subtract half the corresponding texture dimension from both the x and y locations before plugging them into the Draw Texture function. In the default case, that would mean subtracting 8 units from each. Doing so should make sure that the trace hit locations match the crosshair location.
    [ExtendedFirstPersonTemplate_PreciseAim Project Files link: http://unrealpossibilities.blogspot.in/2015/10/extended-first-person-template-improved.html]
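The wrap-around problem with rotator subtraction mentioned above can be illustrated with a tiny sketch of the per-axis math that a delta-rotator style node performs (my own reconstruction, not engine source):

```python
def delta_angle_deg(a, b):
    """Absolute angular difference between two angles expressed in the
    (0,180) / (-180,0) style range that rotators use."""
    return abs((b - a + 180.0) % 360.0 - 180.0)

# Naive subtraction reports 340 degrees here; the real separation is only 20
naive = abs(170.0 - (-170.0))       # 340.0
wrapped = delta_angle_deg(-170.0, 170.0)  # 20.0
```

The `+ 180, mod 360, - 180` trick maps any difference back into a single turn before taking the absolute value, which is why it survives the sign flip at the ±180 boundary.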
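The crosshair-centering fix above is simple arithmetic, sketched here in Python with a 16x16 texture assumed (consistent with the 8-unit offset mentioned in the tip; the actual template texture size may differ):

```python
def centered_crosshair_position(viewport_w, viewport_h, tex_w, tex_h):
    """Top-left draw position that puts the texture's center at the screen center."""
    # Draw calls take a top-left corner, so shift by half the texture size
    return (viewport_w / 2.0 - tex_w / 2.0, viewport_h / 2.0 - tex_h / 2.0)

# With an assumed 16x16 crosshair texture on a 1920x1080 viewport:
x, y = centered_crosshair_position(1920, 1080, 16, 16)
```

Without the half-texture shift, the texture's top-left corner sits at the screen center, and the visible crosshair ends up 8 units down and to the right of where traces actually hit.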

Tuesday, October 6, 2015

Unreal Engine Diaries #6

  • In the editor, you can select actors from the 'World Outliner' panel, right click, and select 'Move To' >> 'Create New Folder' to group your actors into folders.
  • The 'Project World to Screen' function can be used to check whether a point in world space lies within the screen space of a player's camera view. Just pass in the world location and the player controller reference as input parameters, and you get the corresponding screen position data as the output. Break this struct into its x and y values, then use the 'Get Viewport Size' node to get the actual screen bounds, and check whether the screen position values lie between 0 and the viewport size on each axis. If both the x and y values lie within this range, the point is within the visible screen space; otherwise, it's outside the camera view.
  • When adding a vector to an actor's world space location to get a point near the actor, do not simply add the distances you want to move along the x, y and z directions; that only works for relative location calculations. What you see as the forward direction in the actor blueprint viewport may not be the same as the forward direction in the world. Instead, get the actor's forward, right and up vectors, multiply each by the required distance along that direction, and add/subtract these vectors from the actor's world space location.
  • The console commands 'stat startfile' and 'stat stopfile' can be used to record the performance stats of all gameplay logic that happens between the commands. On completion, it saves the data to a new file on the HDD. To retrieve this data, go to the 'Window' tab in the editor >> Developer Tools >> Session Frontend >> Profiler tab and click the 'Load' button. It'll take you to the folder where the file was saved. Open the most recent file in the folder to see a visual representation of the CPU performance stats [Game & Rendering threads] as a graph in the editor. Select any part of the graph where it's spiking to see all the game logic and rendering processes called within that timeframe, and get an idea of what's causing the performance drops in your project.
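The screen-space check from the 'Project World to Screen' tip above boils down to a pair of range comparisons once you have the projected position and the viewport size. A minimal Python sketch (engine-agnostic, not the actual Blueprint nodes):

```python
def is_in_viewport(screen_pos, viewport_size):
    """True when a projected screen-space point lies inside the viewport bounds."""
    x, y = screen_pos
    width, height = viewport_size
    # Both axes must fall between 0 and the viewport extent
    return 0.0 <= x <= width and 0.0 <= y <= height
```

Note that projection can also return positions behind the camera in some APIs, so in practice you'd combine this with whatever validity flag the projection function provides.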
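The local-axes offset tip above can be made concrete with a small sketch. This is my own reconstruction of the math, assuming UE's convention of +X forward and +Y right at zero yaw, with rotation only about the Z axis:

```python
import math

def yaw_axes(yaw_deg):
    """Forward and right unit vectors for an actor yawed about the world Z axis."""
    r = math.radians(yaw_deg)
    forward = (math.cos(r), math.sin(r), 0.0)
    right = (-math.sin(r), math.cos(r), 0.0)
    return forward, right

def offset_from_actor(location, yaw_deg, d_forward, d_right, d_up=0.0):
    """World-space point at the given distances along the actor's local axes."""
    forward, right = yaw_axes(yaw_deg)
    up = (0.0, 0.0, 1.0)
    return tuple(location[i] + d_forward * forward[i] + d_right * right[i] + d_up * up[i]
                 for i in range(3))
```

For an actor yawed 90 degrees, a point "100 units in front" is offset along world +Y, not +X, which is exactly why adding raw axis values to the world location gives the wrong result.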

Sunday, September 27, 2015

Unreal Engine Diaries #5: GPU Profiling & Rendering Optimizations

  • The 'profilegpu' console command can be used to profile the GPU for a single frame.
  • The 'r.xxxxx ?' command can be used to get the tool tip for the particular rendering command that is being passed on as parameter.
  • Shaders can get really expensive when using World Position Offset and Tessellation. And speaking of World Position Offset, it can be used to manipulate the vertices of a mesh from its material.
  • If there are lots of skeletal meshes in a game, the 'SkinnedMeshComp Tick' can get expensive. Reducing the number of bones in the skeletons or the complexity of the anim blueprints can help improve performance in these scenarios. Also, if you do not need the animation to update when the skeletal mesh isn't visible, consider setting the 'Mesh Component Update Flag' in the details panel of the skeletal mesh component to 'Only Tick Pose when Rendered'.
  • The 'Smoothed Frame Rate' option in the Project Settings [under the General Settings category] caps the frame rate without using VSync. While doing GPU profiling, it's good practice to test with Smoothed Frame Rate disabled.

Friday, September 25, 2015

Unreal Engine Diaries #4


Material Editor: In the material editor, we can add the 'Noise' node to display noisy patterns on the surface of a mesh. However it's very expensive and hence best used for prototyping or in rare scenarios where using an actual texture isn't justified.

Material Editor: The 'Particle Position' node can be used to get the location of a particle in space.

AI: The 'Bind Event to ReceiveMoveCompleted' node in the AI controller can be used to automatically receive the status of a bot once it has completed its movement order. It has different result states like Success, Aborted, Blocked, Invalid, etc., and these can be used to have the AI respond to different situations with different behaviors. But if we have multiple events bound to it from different places, say from different tasks in a Behavior Tree, all of these bound events will start executing, which might not be a desirable outcome. To work around such a scenario, it's a good idea to check the current state of the bot in each of these events and proceed based on that result. This helps ensure that even though multiple events may be fired, only the one suitable for the AI's current state sees through to completion.
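That guard pattern is easy to sketch outside the engine. In this plain-Python illustration (not UE API; the bot is modeled as a simple dict, and the state names are hypothetical), each bound callback bails out unless the bot is still in the state that created it:

```python
def make_move_completed_handler(bot, required_state, on_success):
    """Build a MoveCompleted-style callback that only acts when the bot is
    still in the state that bound it; stale callbacks bail out early."""
    def handler(result):
        if bot["state"] != required_state:
            return  # bound from a task that is no longer relevant; ignore
        if result == "Success":
            on_success(bot)
    return handler
```

So even if a Patrol task and a Chase task both bound a handler, only the one matching the bot's current state actually does anything when the move finishes.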

AI Perception: When using the 'OnPerceptionUpdated' event to detect potential targets, you may have noticed that it does not give any details regarding the location of the source of the stimuli. But there is actually a method to retrieve this data. Just loop through the array of updated actors, and for each actor, get its perception info ['Get Actors Perception' node], then break the output 'Info' struct and loop through the 'Last Sensed Stimuli' to get all the necessary details like stimulus location, age, tag, etc.

Editor: Press 'Alt+C' in the map editor to see the collision data for all the meshes.

General: In Unreal Engine, most of the physics calculations are done by the CPU.

Wednesday, September 23, 2015

Unreal Engine Diaries #3

  • If we're displaying the game over screen as a widget that's added to the viewport while the game is running, make sure that the game is paused using the 'Set Game Paused' command. Not doing this would mean that the actors in the level are continuously being updated in the background. Sometimes it's fine to have the enemy actors move around the screen in the background, but even in those scenarios, it'd be good practice to make sure that any constantly updating game element that's part of the player character/controller is turned off. An easy example would be an actor in the level that responds to the mouse cursor: it might keep moving around the screen even while we're trying to click the restart button.
  • When creating a widget from within a task in a Behavior Tree, it's a good idea to make sure that it's not being called continuously. It's generally better to create widgets outside the task flow, within normal blueprints, but certain situations might demand widget construction from within Behavior Trees in order to use some of their functionalities that are not natively available in blueprints. In such situations, there is a way to handle UI changes from Behavior Tree tasks: just add a 'Do Once' node inside the task before calling the widget construction logic. This makes sure that subsequent iterations of the task don't create the widget unless explicitly specified. In my project, I had used this only in an end game scenario, as one of its conditions was handled from within a Behavior Tree task. It has since been replaced with a game pause call, which makes sure that the Behavior Tree stops executing altogether, but the 'Do Once' node might be useful in other situations where you can't pause the game.
  • The world rotation of actors in a level, when broken down into x, y, z components, lies within the range of (0,180) & (-180,0). When doing calculations based on actor rotation, this change in values from positive to negative has to be taken into account. If we treat it like a normal repeating cycle of (0,360), it might yield strange results in certain types of calculations.
  • When aborting a subtree in Behavior Trees, any task within that subtree that is actively running at that moment will not be aborted midway. It will see through to its completion, and if the task has delay conditions or code that changes some important data, this could lead to unexpected results if not taken care of. However, it is possible to shut down these tasks at runtime through conditional checks that gauge the current state of the game from some external variable or blackboard value. Once we have determined that the task is to be aborted, we just call the 'Finish Execute' node to stop and get out of the task.
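The 'Do Once' gate described above is a tiny latch, and sketching it outside Blueprints makes the behavior obvious. A minimal Python stand-in (my own illustration, not engine code):

```python
class DoOnce:
    """Minimal stand-in for Blueprint's 'Do Once' node: the wrapped action
    runs the first time through and is skipped until reset() is called."""
    def __init__(self):
        self._done = False

    def __call__(self, action):
        if not self._done:
            self._done = True  # latch closes before the action runs
            action()

    def reset(self):
        self._done = False
```

In the Behavior Tree case, the task can tick any number of times, but the widget-construction action behind the latch fires only once per reset.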

Monday, September 21, 2015

Unreal Engine Diaries #2

  • Useful Material Editor Hotkeys [Press these keys & click anywhere on the mat editor]: B = Bump Offset; E = Power; I = If condition; O = OneMinus; P = Panner; S = Scalar Parameter; U = Texture Coordinates; V = Vector Parameter
  • If you change any of the member names of a Struct in the Content Browser, the connections from the aforementioned members in all blueprints will get cut. As a result, you'll have to go around each of these blueprints and manually connect them again. So it helps to actually note down where you need to reconnect before making changes to a Struct.

    Also note that in v4.8 of Unreal Engine, these renamed members will have their input pins invisible the next time you check out those structs in your blueprints. In order to make them visible again, click on the struct >> go to the details panel >> tick the checkbox next to the changed element to see it again. You'll however have to do this for every instance of the said struct.
  • An Unlit material means that we are only dealing with the Emissive and Opacity inputs.
  • Useful Material Nodes:

    The 'Radial Gradient Exponential' node can be used to create a circular gradient.

    The 'Particle Color' node provides data about the current color of a particle.

    The 'Depth Fade' node can be used in materials of particle emitters like Fire to have them smoothly fade away when they come in contact with any meshes. This node when connected to the opacity helps remove the hard edges which would be otherwise evident when a mesh obstructs the emitter.

Wednesday, September 16, 2015

Unreal Engine Diaries #1

  • When you're calling the function 'Get Random Point in Navigable Radius' [see link below], make sure that you store the vector return value inside a variable if you intend to use it in multiple places later down the line. Otherwise, it ends up creating a different randomized vector each time it's used, even though you're taking the same return value. It's kind of obvious, as those are different function calls, but sometimes in the heat of battle, it's easy to overlook small things like these.
  • Blackboards can be cleared from within blueprints using the 'Clear Value' function [see link below]. This is especially useful in scenarios where you're dealing with vector blackboards. Setting it to zero in this case is not ideal as it actually represents a location in the game space. 
  • Basic workflow for creating splines: Create a spline component in your blueprint >> Set its world points by passing in the vector data [see link below] >> Add a spline mesh component >> Set static mesh >> Set material for the mesh >> Get the location (and tangents, if necessary) at the stored spline points [see link below] and use this data to set start/end points as well as tangent vectors for the spline mesh. [Do a for loop over the spline points data if you have more than two points]
  • Select a group of blueprint nodes, right click and select 'Collapse to function' to have the editor automatically create a function encompassing those nodes.
  • It is possible to increase or decrease the rate at which an animation is run, by adjusting the Rate Scale in the details panel of the anim sequence.
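The store-the-random-result pitfall from the first tip can be shown in a few lines. Here's a hypothetical stand-in for the navmesh function (just a uniform point on a disc, no navmesh involved) to illustrate why re-calling it differs from caching it:

```python
import math
import random

def get_random_point_in_radius(origin, radius):
    """Hypothetical stand-in for 'Get Random Point in Navigable Radius':
    a uniformly distributed point on a disc around origin."""
    angle = random.uniform(0.0, 2.0 * math.pi)
    dist = radius * math.sqrt(random.random())  # sqrt keeps the disc uniform
    return (origin[0] + dist * math.cos(angle),
            origin[1] + dist * math.sin(angle))

# Wrong: each call is a fresh roll, so these are two different destinations
move_target = get_random_point_in_radius((0.0, 0.0), 500.0)
look_target = get_random_point_in_radius((0.0, 0.0), 500.0)

# Right: roll once, store the result, and reuse the stored variable everywhere
destination = get_random_point_in_radius((0.0, 0.0), 500.0)
move_target = look_target = destination
```

The Blueprint version of the bug looks even more innocent, because dragging two wires off the same pure node's output pin silently re-executes it.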

Documentation Links: 

Monday, July 27, 2015

Unreal Engine DevLog #19: Line of Sight based Visibility

Hi, it's been quite a while since the last dev log on Project Horizon (yes, it got changed to a less generic name). I had been working on my FPS Tower Defense Toolkit and some other experimental stuff for a few months. The toolkit has been submitted to Epic Games for Marketplace review, and we're done with all the agreements. There has been some delay, as I had to convert my project to v4.8 midway, and there was an engine bug that I had to find a workaround for. With the new code submitted, I'm awaiting their response regarding further details.

Meanwhile, there had also been some major updates to Knut's Advanced Turn Based Tile Toolkit, which made it much more optimized and well structured than before. So rather than integrate all those changes into my project, it was much easier to transfer my code into his updated toolkit. That took a while to implement, and finally it was time to add some exciting new features to the project. First in line was Line of Sight based visibility. An earlier unannounced version of this was already partially working in my old project, so I ended up using that, did some optimizations, and finally we have something that works pretty well. It uses a mix of grid based calculations and collision meshes. The grid calculations are mainly used in places where the LoS will have a direct impact on player actions. For scenarios that are mainly important from an aesthetic viewpoint, I've used a collision mesh to reduce calculation time. It's quite accurate, but I need to add a small line trace calculation at some point in the future. So here's a video demonstrating the new feature:


That's all for this update. Next time, we're gonna see if we can add the Overwatch ability from XCOM: EU. Until then, goodbye and stay classy.


PS: For anyone interested in Knut's turn based tile toolkit, you can check it out here: https://www.unrealengine.com/content/a22867b136314748af7437c635b9ddba

Thursday, June 25, 2015

The Precursors

I've been going through my Dropbox files to delete old unnecessary files and came across some of my first projects. They were all created using Unreal Development Kit and Blender and most were abandoned at some point. Just thought I'd put up some of the final screenshots here:

Helm's Deep and the Fortress of Hornburg




Temple Ruins




Castle on the Mountain


Sunday, April 19, 2015

Unreal Engine Experiments: Prototype Menu System v1.1 Update


About a month back, I released the source code for my prototype UMG menu on GitHub. The menu was basically intended to be a prototype that I could use on all my projects from the get-go. So once I got the menu system up and running, I had not worked on it apart from a couple of minor bug fixes. Since I got some free time this week, I decided to improve the menu system and fix some issues that had been bugging me for the past few weeks. The idea of this post is to provide some documentation for the changes included in this first major update to the menu.

For those of you who haven't seen the previous update, here's a video demonstrating the menu system in action:
        



Over the past few weeks, I've noticed from the Unreal Engine forum thread that some people have actually started using it in their projects. Assuming that those people have already made changes to meet their requirements, I'm documenting the changes made for 'Update 1' in the form of screenshots wherever the code has been modified or added to.

First of all, here are the major changes in brief:

1. Added support for manual default visual settings.

2. Fixed a bug with scroll bar not collapsing when interacting with other buttons.

3. Visual settings now shown as literal text instead of numerical values in the default version.

4. Improved the code for maintaining persistent visual settings throughout the active instance of the game.

5. Fixed the disappearing loading screen bug in standalone and packaged versions.

And here's the list of known bugs, limitations and plans to fix them:

1. Widget spacing works properly only for 16:9 resolutions. This will be rectified in future updates to accommodate 16:10 and 4:3 resolutions as well.

2. Default visual settings can only be set manually at the moment as the 'Scalability Auto' function in UE4 doesn't provide any means to retrieve the changed setting. I've been assured by Epic that a feature request has been added for the same.

3. Support for changing Audio and Gameplay settings isn't included out of the box. It will be added in future updates.

4. Soundtrack starts playing before the main menu loads. Will be fixed when audio settings are added.

Below listed are the changes made to different classes in detail. Textual cues are used in screenshots to describe the changes. 
  • Game Instance Blueprint 




  • Main Menu Actor




  • Video Options Screen Widget






  • Shadow Quality Widget [Similar changes in Texture Quality Widget, AA Level Widget and Effects Quality Widget]



Alright, I think that represents all the changes made in this update. Here's the link to the source code in GitHub:

https://github.com/Stormrage256/UMGMenuSystem

If you have any doubts related to the menu system, feel free to contact me here or on the forums. If you notice any bug apart from the ones I've specified above, please do let me know, as I haven't had much time to test it.

With that we come to the end of this update. Meanwhile feel free to check out my Youtube channel for more of my UE4 experiments.

https://www.youtube.com/channel/UCzvzhcyARRwQZ3YDnrujnEA

Sunday, March 15, 2015

Unreal Engine Experiments: Prototype Menu System

[Note: Read about the v2.0 edition of Prototype Menu System here:  https://unrealpossibilities.blogspot.in/2018/05/unreal-engine-experiments-prototype.html]

Hi, welcome back for another update. I've been working on a prototype menu system for my game using Unreal Motion Graphics (UMG). My aim was to create a basic menu interface with the following features:

1) Main Menu
2) Options Menu

3) Video Options Menu
4) Loading Screen
5) Game Screen
6) In-Game Menu

As I said, this is a prototype menu system. My main intention was to get an idea of how to put together a menu system using UMG. And to that end, I found this tutorial series very helpful for understanding UMG and its features:


Youtube: Horus Heretic's UMG tutorials


Once I had a basic menu system, Zoombapup's Menu Flow tutorials helped a lot in creating a standardized workflow:


Youtube: Zoombapup's Menu Flow Tutorials


So moving on to the updates, I'll briefly explain the main features that I've implemented:


Main Menu




Well the main menu includes the following working functionalities at the moment:


1) New Game Button 

2) An options menu
3) Exit to Desktop button

I'm using the loading screen from Epic's Shooter template as the background.


Options Menu




The options menu has the following functionalities:

1) Video Options Button

2) Back to Main Menu Button

Video Options Menu




The video options allow the player to edit the video settings. At the moment it only supports 4 hard-coded 16:9 resolutions, but all the other settings work fine. And then there is a 'Back to Options Menu' button as well.

Loading Screen


The loading screen includes a static progress bar and a throbber.

Game Screen


The game screen is the basic screen that you see when you play the game. Nothing fancy here at the moment, but I intend to move my Command UI from the HUD Blueprint into this widget. 


In-Game Menu




The In-Game menu includes the following working functionalities:

1) Resume Game Button
2) Return to Main Menu Button
3) Exit to Desktop Button

Well, that sums up all the features in my menu system. For now, I'm going to stick with this. Once I'm almost done with the core gameplay code, I'll get back to working on the menu. Here's a video demonstrating the menu system in its current form:


With that we come to the end of this update. Feel free to check out my YouTube channel at: https://www.youtube.com/user/Stormrage256/videos

And one last thing: I was planning to release a free example project when the blog crossed the 10,000 views mark about a month ago. However, I got caught up with work back then. So now that I have a menu interface, I thought I might as well release the source code as part of the celebration. Better late than never. So here it is:

https://github.com/Stormrage256/UMGMenuSystem

Monday, February 23, 2015

Unreal Engine 4 Dev Update #14: Real Time Grid Generation & Unit Attribute Effects

Hi, I'm back with another update. The project went through a major design change with this iteration. As I had mentioned many times before, the design of my game has been heavily inspired by XCOM: Enemy Unknown since the beginning. Maybe it was because I was playing it on my iPad right around the time I started messing with Unreal Engine 4. The fact that it was one of the few mobile games that I really liked might have contributed as well. Anyway, anyone who has been following my blog would have noticed that my design and gameplay decisions were mainly based on the iOS version. Recently, I tried out the PC version of XCOM. I really like it, some parts better than the iOS version, while some not so much. Playing the iOS version first was probably the culprit, but I saw certain design elements that just felt right in that version, like how there was a command prompt before issuing movement commands, instead of the real time movement commands in the PC version. That thought inspired me to align my design direction in a way that I felt would incorporate the best of both versions. It finally resulted in me creating some backups, making some logic changes, stumbling upon some issues, making some more changes, and finally writing this blog post. So without further ado, I present to you the next update:

Real Time Grid Generation

Before getting into the details, I'll just show you a screenshot of what it looks like in-game. It'll make it a lot easier to understand for those who haven't played XCOM.


As you can see here, I have now replaced the localized grid system (active grid with all the adjacent valid grids) with a singular grid system. This obviously provides less detail if I were to show the cover data. But the current design, being real time, offers faster data interpretation compared to the earlier model: you can just move the grid around instead of clicking on a particular location to get the data. I now understand why the developers of XCOM decided to use the localized touch event based grid generation for mobiles. To make up for the lack of real time grid generation, they decided to present more data to the player in one go. I still have the old system ready, just in case I want to try it out on mobile devices later.

Anyway, getting back to the update, command issuing works similar to how it did earlier. As in, you need to click somewhere, then issue the move command in order to move the unit. I decided to keep it because a real time movement command sometimes feels too quick compared to the pace of the rest of the game. When playing the PC version of XCOM, I came across situations where I wanted to assess the situation before confirming my command. Having the grid anchored to a particular space gave me the freedom to see the whole game space from multiple angles before making my decision. I've also added the option to cancel the order so that the player can check out other options before proceeding any further. My final inference is that the game feels a lot smoother now. Here's a video showcasing the new grid generation system:


Fire Action Points

In an earlier update, I talked about my implementation of movement action points. I've now extended that to include Fire commands as well. Unlike movement commands, issuing a fire command does not subtract a certain value from the action points pool. It directly reduces it to zero, meaning that's going to be the last thing that particular unit does in the said turn. I have not explicitly stated it in the video, but you can see it in action in the video above.

Unit Attribute Effects

Again, as mentioned in a previous post, I had added three basic sets of unit attributes: Aim, Health & Movement Points/Dodge Chance. Back then, only Health actually mattered as far as the gameplay was concerned. Now I factor all of these attributes in when making gameplay calculations. The 'Chance to Hit' an enemy unit is now based on probability: a unit with higher Aim has an advantage in that matter. A unit with higher HP can stay longer in combat. A unit with more Movement Points can tread longer distances in any one turn. And finally, a unit with increased Dodge Chance has a reduced chance of being hit when fired upon. It's all mainly theoretical stuff, hence not much to show in terms of gameplay at this point.
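The post doesn't spell out the exact formula, so purely as an illustration, here is one plausible shape such an Aim-versus-Dodge roll could take (hypothetical numbers and clamping, not the project's actual implementation):

```python
def chance_to_hit(attacker_aim, defender_dodge, floor=5, ceiling=95):
    """One plausible to-hit formula: aim minus dodge, clamped so a shot is
    never a guaranteed hit or a guaranteed miss. Illustrative only."""
    return max(floor, min(ceiling, attacker_aim - defender_dodge))

def resolve_shot(attacker_aim, defender_dodge, roll_1_to_100):
    """The shot lands when a 1-100 roll falls within the hit chance."""
    return roll_1_to_100 <= chance_to_hit(attacker_aim, defender_dodge)
```

With numbers like these, an 80-Aim unit firing at a 20-Dodge target hits 60% of the time, and raising either attribute shifts the odds linearly until the clamps kick in.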

And with that, another update comes to an end. The next update will be about the menu that I created with Unreal Motion Graphics (UMG). Until then, you can check out my Youtube channel (link below) for more updates. Feel free to like/subscribe if you're interested in the development of my game. Thanks for your time, and have a nice day.

Stormrage256's YouTube Channel

Saturday, February 21, 2015

Unreal Engine Experiment #1: Gravity Gun

A few weeks back, on one fine Saturday, I decided to take some time off from my main project and do a small side project. The idea was to build something with Blueprints that I could wrap up within a few hours, then put the source on GitHub so that anyone else interested in it could build on top of it. Since I'd been watching some of the 'Spoiler Warning' Half-Life 2 episodes around that time, I was all excited about the game once again. Probably because of that, there was only one thing I wanted to make: a Gravity Gun. But I knew it needed physics, and I hadn't done shit with physics in UE4 yet. So I did some R&D on the basics and came upon a tutorial by Epic's T.J. Ballard on the Unreal Wiki for lifting objects and such. Here's the link for the tutorial:

Unreal Wiki: Pick Up Physics Object Tutorial


Most of the building blocks that I needed for the project were already laid down in this tutorial. I went through it and studied what did what; luckily, he had explained everything quite well. I just made some changes to it to reflect the primary and secondary fire of the Gravity Gun from Half-Life 2.



Within a few hours, it was up and working, apart from excessive momentum and some rotation calculation issues with the physics objects. Increasing the mass of the meshes fixed the momentum issue, but the rotation issue would require more work. Still, I got the basic functionality working. Here's a video of it running in the editor:


It's still not complete. As I mentioned, it still has some issues with the physics calculations. About a week back, I uploaded the source code to GitHub. It's available as a free download for anyone interested, so anyone wanting to check it out, or fix it up and use it in their own projects, can get it from here:

GitHub: Unreal Engine 4 Gravity Gun Experiment

Alright, that's all for now. Feel free to ask in the comments if you've got any doubts regarding its implementation.

Tuesday, February 17, 2015

Unreal Engine 4 Dev Update #13: Added Movement Action Points & Automated Turn End Logic

In my last update, I had talked about my implementation of path distance based grid mapping. Following in its wake, I started working on implementing action points for unit movement. I'm working on a system similar to the one in XCOM: Enemy Unknown. For those of you who haven't played that game, it means that the nearby grids have a movement cost of 1, while the long distance accessible grids cost 2 movement points. All units get 2 movement points by default at the start of a new turn.

So getting back to my implementation, I started off by adding an action point attribute for all units. Every unit has 2 movement points at the start of the game, refilled at the start of every new turn. The fire command is also supposed to draw from the same pool of points for its execution; however, at the moment, fire commands are free of cost, and only movement is restricted by the availability of action points. Based on my grid mapping logic and the player input, I get the cost of movement for any targeted location. If it's in the blue grid space, I subtract 1 action point and move the unit to the location. Since the unit will have 1 action point left, it can move once more, but it will be restricted to only the nearby grids this time. On the other hand, if the target location lies in the yellow grid space during the first movement command, the unit loses both its action points on command execution. As soon as any single unit has depleted its action points for the turn, the AI controller checks if any other units have action points remaining and possesses the first unit that satisfies the condition. The camera then moves to focus on the newly possessed unit.

However, if no unit has action points left during the check, the end turn logic is automatically called. The enemy AI logic is turned off at the moment, so it just prints out a string. Following that, control reverts back to the player as all of his/her units receive a full stack of movement points. The AI controller then possesses one of the player units and the camera moves to focus on the said unit. There's nothing new to be shown with screenshots here, so I'm going to leave a video to demonstrate the functionality:
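The turn-rotation logic described above can be sketched roughly like this (a hedged Python sketch; the names are illustrative, and the real implementation lives in the AI controller Blueprints, with the camera follow omitted here):

```python
def units_with_action_points(units):
    # Filter for units that can still act this turn.
    return [u for u in units if u.action_points > 0]

def on_command_finished(player_units, max_points=2):
    """Called after any unit finishes executing a command."""
    remaining = units_with_action_points(player_units)
    if remaining:
        # Possess the first unit that still has points left;
        # the camera would then focus on it.
        return ("possess", remaining[0])
    # No points left anywhere: refill every unit and end the turn.
    for u in player_units:
        u.action_points = max_points
    return ("end_turn", player_units[0])
```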


Alright, that's all for this update. I'm kind of exhausted from trying to get some new features working, so I'm going to save the rest of the updates for another post, after I get some time to make the next dev video. Feel free to check out my YouTube channel for more videos. See you at the next post.

Sunday, February 15, 2015

Unreal Engine 4 Dev Update #12: Path Distance Based Grid Mapping, Inheritance & Unit Attributes

Welcome back for another update in a steady barrage of dev logs, where I'm trying to catch up with the actual development process of my game. We've got one major update for grid mapping and a couple of minor ones, so let's get to the important one first:

Path Distance Based Grid Mapping

Well, tackling this one has been in my peripheral vision since the beginning. Now that I've got some of the other systems implemented, it made sense to jump right into it. As you may have seen in some of my videos, pathing had only a single rule until now: grid based movement. It didn't differentiate between short distance and long distance movement. Now the grids change color according to the distance. At an aesthetic level, it's very similar to the mobile implementation of the pathing display in XCOM: Enemy Unknown. When the player clicks anywhere on the navigable space, the system calculates the path distance and spawns grids in that region based on it. Grids closer to the player are depicted in blue, while the more distant accessible ones are displayed in yellow. Go even further, and no grids are spawned at all. One thing to note is that the grids accompanying the center blue grid will all be blue; only those in the same category as the center grid are spawned. If any of the surrounding grids fall into the long distance category, they will only spawn if you click on that side of the fence. To show this more clearly, I've added a couple of screenshots.

Short Distance Grids (Blue Color)

Long Distance Grids (Yellow Color)
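The color classification above comes down to a simple threshold check on path distance; here's a hedged Python sketch (the thresholds are made up for illustration — in the game they would derive from the unit's movement attribute):

```python
def classify_tile(path_distance, short_range=5, long_range=10):
    """Map a tile's path distance to a grid color category."""
    if path_distance <= short_range:
        return "blue"    # near: costs 1 movement point
    if path_distance <= long_range:
        return "yellow"  # far: costs 2 movement points
    return None          # out of range: no grid spawned
```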

A working implementation of this feature is shown in the video at the end of the post. You may notice that I'm using the default models in the video; this is because I had uploaded the video quite a while back, so some of the aesthetic changes aren't reflected in it. Now let's move on to the second update.

Inheritance

I had been using separate character blueprints for my player controlled units and AI units. Since I was planning to add unit attributes that function pretty much the same way for both parties, I decided to reparent both of them to a single custom base character. Similarly, I based both my AI Controllers off of a new base AI Controller. Same goes for the weapons, even though I have only one weapon at the moment. This is already making it easier for me to make changes that are reflected throughout the game. I also restructured the entire folder system, which was a royal pain. If it had just been moving things to new folders, it would have been fine, but a lot of the code got messed up with references going here and there. Since I had a backup, I used it to recreate most of the Blueprints from scratch, mapped them to folders and renamed them according to functionality. I'm just happy that I didn't delay this any further, because I really do not want to do this shit again. It is a totally boring process with no fun element or reward, except at a very abstract level. I highly recommend that anyone planning to make a game in Blueprints get this sorted out as soon as you have a base idea, and add new stuff with it in mind, even at a mid tier prototype phase. Always.
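Conceptually, the reparenting boils down to a standard class hierarchy. A minimal sketch of what the shared base buys you (hypothetical names; Blueprint reparenting works the same way in spirit):

```python
class BaseUnit:
    """Shared attributes and behaviour for all units."""
    def __init__(self, aim, hp, movement_points):
        self.aim = aim
        self.hp = hp
        self.movement_points = movement_points

    def take_damage(self, amount):
        # Damage handling lives in one place; both player and AI
        # units inherit it, so a fix here fixes both.
        self.hp = max(0, self.hp - amount)

class PlayerUnit(BaseUnit):
    pass  # player-specific overrides go here

class EnemyUnit(BaseUnit):
    pass  # AI-specific overrides go here
```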

Unit Attributes

Well, this has been exciting. Being a fan of RPGs, it felt good to actually design custom attributes for my game characters myself. Even the thought process that goes into it is very interesting. I wanted to restrict it to 4 sets of attributes from the beginning, leaving aside stuff like critical chance; I'm only talking about base attributes like health here. It was an internal debate over which attributes made more sense and which of them needed to be chucked out. Finally, I settled on the following: Aim, HP and Movement Range/Dodge Chance. Movement Range and Dodge Chance kind of fall under the same set for my game, since I've made them a single package feature. The fourth attribute, which I've already decided on, hasn't been implemented yet. It's a higher level attribute that I want to test with EQS first, so I'm gonna wait until v4.7 hits the marketplace. These attributes do not impact the gameplay as of now, except for health, which was very briefly shown in the video from my last update.

Oh and here's a video showing my implementation of Path Distance based Grid Mapping:
  
  
With that, we come to the end of another update. As always, feel free to check out my YouTube channel (link below) for more dev videos, and do like/subscribe if you find them interesting. Thanks for your time, and goodbye.



Saturday, February 14, 2015

Unreal Engine 4 Dev Update #11: New Character Models, Unit Highlights, Mouse Based Unit Selection & Context Based HUD System

Finally, for once, I've managed to put out a development update within the deadline mentioned in my previous post. Alright, we've got a cluster of small changes for this one, so let's get right down to it.

New Character Models

   
I had started this project from the Top Down Template, so I've been using the default Unreal Engine character until now. A while back, I bought the 'Prototype Characters' pack by Ying Pei Games from the marketplace. Since they fit the theme of my game better than the blue guy, I decided to use them for my units: blue for the good guys, and red for the bad guys. If you've been following my posts, you might have come across the fact that I'm already using Epic's Animation Starter Pack in my project. I had some of the basic animations running already, so adding the new models after re-targeting the skeletons wasn't much work. Here's an in-game screenshot:
   

And here's one up-close:




Unit Highlights and Health Display

Up until now, there was no cue when the cursor was focused on an interactive object, say, when hovering over an enemy unit. So I decided to add unit highlights for enemy units. The idea was to have them highlighted when the mouse cursor is over them. I'm using 'Hit Result under Cursor' and Custom Depth with a highlight material to achieve this. I actually got the idea from Tom Looman's Switch development blog; he's got a tutorial for it as well. And btw, Switch is probably the most awesome UE4 project that I've seen so far. It's hard to miss for a UE4 developer, but if you haven't seen it already, I'd highly recommend it. Here's a link:

Tom Looman - Switch Development Blog

Anyway, once I got the highlight working, I decided to add a health bar over the highlighted unit. So I got into UMG, made a health bar widget and added it to the character blueprint. At the moment, it's a progress bar that changes according to the damage received. You can see it in action in the video linked towards the end of the post. I've also made the floating health bars always face the player, no matter which angle he/she is looking from. For player units, these health bars are visible all the time.
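The binding that drives such a progress bar is just a clamped fraction (UMG progress bars expect a percent value in the 0.0-1.0 range); a small Python sketch of the idea, with illustrative names:

```python
def health_bar_percent(current_hp, max_hp):
    # Clamp to [0, 1] so overheals or overkill damage
    # never push the bar out of range.
    if max_hp <= 0:
        return 0.0
    return max(0.0, min(1.0, current_hp / max_hp))
```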
  
  
Mouse Cursor based Unit Selection

As mentioned in one of my previous posts, I've been using the number keys to select the different units under the player's control. Being able to select units with the mouse had been part of the plan for a while, so I just added that as well. It feels a lot better than using the keyboard. Again, this is demonstrated in the video.

Sound Effects for Weapon Fire

As the title suggests, I've added the weapon fire sound effects from the Shooter template. I just play them along with the muzzle flash emitter when a unit starts firing upon another unit.

Context based Command HUD System

Lastly, I added a context based HUD display system. What that means is that interaction with the Command HUD happens based on what the player is trying to do. Right clicking on a location in the grid space allows the player to interact with only the 'Move' and 'End Turn' buttons. Similarly, right clicking on an enemy unit allows the player to issue only 'Fire' and 'End Turn' commands.
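The context mapping itself is a small lookup; a hedged Python sketch of the idea (the target names here are illustrative, not the actual Blueprint enums):

```python
def allowed_commands(click_target):
    """Return the Command HUD buttons enabled for a given click."""
    if click_target == "grid":     # right click on navigable tile
        return {"Move", "End Turn"}
    if click_target == "enemy":    # right click on an enemy unit
        return {"Fire", "End Turn"}
    return {"End Turn"}            # fallback: ending the turn is always allowed
```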

Alright, so that's all for this update. Here's the video demonstrating the features that have been added, as per this update:


You can check out my other videos from my YouTube channel (link given below). Feel free to subscribe to my channel, if you're interested in the development of my project. And have a happy weekend.

Thursday, February 12, 2015

Unreal Engine Dev Update #10: Meet my new Player Character - The Camera

As usual, it's been a while since I put up a development update. In my previous update, I had mentioned that I made some major changes to my project. Oh, and btw, since this is basically a learning project for stretching my understanding of game design, I'm just calling it Project Horizon. The current plan is to release it for free. This is kind of an experimental game project for me, so I'm just gonna throw some stuff at it and see what works. Alright, let's get back to the changes. First of all, I'll note down the two major ones here:
1. I've set the RTS Camera in my game as the default player character.
2. A new AI Controller to control the player units.
So I'll get to the first one in detail now. For starters, I had been using my player units as the default characters in the game, and that would seem like the sensible thing to do; accessing the active player units is also a lot easier that way. And as mentioned in my previous posts, I've been using a custom RTS Camera (not the one that comes with the default character in the Top Down Template) to move around the map. I noticed that when I'm playing the game, I'm just moving the camera around most of the time while issuing commands through the UI. It's almost as if the camera is the unit that the player is controlling; it's the only thing in the game over which I have almost constant control. The units that I command move on their own and fire on their own once I issue commands to them. And that kind of leads to the reasoning behind the second change. But getting back to the camera implementation: I'm still using a character blueprint for the camera. As a result, I can easily access the camera from anywhere, and I'm doing that a lot since I need it to automatically follow the units when they are executing their commands. Apart from this change, the core functionality of the camera remains the same as before. I had written a brief account of its implementation in one of my earlier posts. Feel free to check it out here:

Unreal Engine 4 Dev Update #7: RTS Camera System for Top Down Turn-Based Games

Now let's get down to the second major change. As mentioned above, I'm currently using AI Controllers to control all the units on the map: one version for the AI, and another for the player units. And why would I do that? Two reasons: utility and efficiency. The utility comes in the form of Behavior Trees. The AI Controller puts the power of Behavior Trees at my disposal, a luxury I did not have when I was using a player controller. The Behavior Tree lets me, as a developer, avoid spending unnecessary resources on checking player state, since it can take care of that much more efficiently. The code that I had implemented prior to Behavior Trees for handling player state involved checking lots of conditions every tick. Basically, the logic was not simple. It was not elegant. It was the type of code that you want to change right from the moment you implement it. I was happy to push all of it into Behavior Trees. So in hindsight, it was the transfer of player unit control to the AI Controllers that propelled me to use the camera as the default player character. It took some time and confusion to get through these changes, but things look a lot cleaner now. Plus, the game is working fine. And I'm actually happy that I got rid of that code. Before I end this post, I hope this will do justice to the title:
                   
  
With that, I'm rounding up this update. I don't have any videos directly related to this, as both these changes were mainly under the hood; the gameplay basically remains the same as before. There have been a lot more changes since, but I think I'll save them for another post. Meanwhile, feel free to browse through my YouTube channel (link below), as I've already uploaded some of the new gameplay videos following this update.


Alright, that's it for now. I'll hopefully have another update before the end of this week. See you soon.