The path is laid out to be followed

SCRUM

Well… the path. Yes, there was a Sprint, AND it ended already. We met and discussed the User Stories. None of them were reached:
From the programmers' point of view, Daniel and I implemented the least amount of quality needed to reach each goal, so we got through every story quickly. That was not really doable for the artists, though: they cannot produce a shitty model, start rigging and animating, and expect to be able to refine it later. Our designers also had some problems, but this is clearly not the right place to explain that in detail. We agreed to drop some of the old stories and refined others that made it into the second sprint, still not really sure what we are actually doing. Quiet voices appeared that did not believe in the process…

Cleaning up

After the exhausting presentation phase, there were still some bugs left that needed my attention. These originally belonged to the just-past week:
1) The Sentry Drones had some minor Behavior Tree issues. Sometimes they were stuck in a loop, sometimes they executed some behavior too often, etc. Most of my changes were value balancing to improve the overall performance and look of their behavior.
2) The AI performed weirdly in close proximity to the hero. Sometimes the drones came very close and just stared at him without shooting. It turned out that the hero character's blueprint was considered a collision while generating the 3D NavMesh. Excluding the player's collision sphere from the generation process enabled the drones to move freely near the player again. I also allowed the drones to consider a navigation point as reached at a longer distance than before, increasing the value from 15 Unreal Units to 120. That way, a whole drone or the player itself can stand on the navigation point without preventing the AI agent from moving on with its task.
3) Engine crashes appeared here and there, all related to the 3D Pathfinding Plugin. I made some minor changes, rearranged some nodes, and that seems to have fixed the problems.
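The acceptance-radius fix from point #2 boils down to a plain distance check. Here is a minimal, engine-free sketch; the `Vec3` struct is only a stand-in for Unreal's `FVector`, and just the 15 → 120 values come from the text:

```cpp
#include <cmath>

// Minimal stand-in for Unreal's FVector, for illustration only.
struct Vec3 {
    float X, Y, Z;
};

static float Dist(const Vec3& A, const Vec3& B) {
    const float dx = A.X - B.X, dy = A.Y - B.Y, dz = A.Z - B.Z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// A navigation point counts as reached once the drone is within
// AcceptanceRadius (in Unreal Units). Raising it from 15 to 120 means
// a drone (or the player) standing on the point no longer blocks the task.
bool HasReachedNavPoint(const Vec3& Drone, const Vec3& NavPoint,
                        float AcceptanceRadius = 120.0f) {
    return Dist(Drone, NavPoint) <= AcceptanceRadius;
}
```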

General Drone Improvements

Each drone got a new widget implemented over its head, showing its current amount of health points. The information is based on the implemented health component. I only needed to add an update routine to the health component that refreshes the drone's widget whenever the value changes. For a short time, I felt I should look into Interfaces, since I am going to add every single HUD element that needs updating into the health component. Sounds like THE textbook example for interfaces, but I didn't… for whatever reason.
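As a rough, engine-free sketch of that update routine: widgets subscribe a callback, and every health change notifies them. The class and callback names are mine, not the project's; in Unreal this would be a delegate on the component:

```cpp
#include <algorithm>
#include <functional>
#include <vector>

// Hypothetical sketch of a health component with an update routine:
// the over-head widget subscribes once, and every change to the value
// triggers a refresh of all subscribed listeners.
class HealthComponent {
public:
    using OnChanged = std::function<void(float /*NewHealth*/)>;

    explicit HealthComponent(float MaxHealth)
        : Max(MaxHealth), Current(MaxHealth) {}

    void Subscribe(OnChanged Callback) {
        Listeners.push_back(std::move(Callback));
    }

    void ApplyDamage(float Amount) {
        Current = std::clamp(Current - Amount, 0.0f, Max);
        for (auto& Listener : Listeners) Listener(Current);  // refresh widgets
    }

    float GetHealth() const { return Current; }

private:
    float Max;
    float Current;
    std::vector<OnChanged> Listeners;
};
```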

Carrier Drone Update

The original Delivery Drone got a new name: the Carrier Drone. With that change comes a new mechanic:
Rebecca asked me to change the behavior so that the drones spawn at the mainframe automatically, fly to the nearest energy resource field, gather it up, bring it back home, and, in case they get shot down, respawn automatically after a delay. These changes were no big challenge.
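That loop can be sketched as a tiny state machine. This is an illustration only; in the project this logic lives in a Behavior Tree, and all names here are my own:

```cpp
// Hedged sketch of the Carrier Drone loop: spawn at the mainframe, fly to
// the nearest resource, gather, return home, repeat; being shot down sends
// the drone into a respawn delay.
enum class CarrierState {
    SpawnAtMainframe, FlyToResource, Gather, ReturnHome, WaitRespawn
};

CarrierState NextState(CarrierState S, bool ShotDown) {
    if (ShotDown) return CarrierState::WaitRespawn;  // respawn after a delay
    switch (S) {
        case CarrierState::SpawnAtMainframe: return CarrierState::FlyToResource;
        case CarrierState::FlyToResource:    return CarrierState::Gather;
        case CarrierState::Gather:           return CarrierState::ReturnHome;
        case CarrierState::ReturnHome:       return CarrierState::FlyToResource; // next trip
        case CarrierState::WaitRespawn:      return CarrierState::SpawnAtMainframe;
    }
    return S;
}
```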

Sentry Drone Update

The Sentries should actually do damage to the attacker. “Well, attacker?” you might ask, and yes, we rebranded some of our terms. The FPS player, the hero, got renamed to “the attacker”, and the mobile device player, the villain, got renamed to “the defender”. Nevertheless, the Sentries have been shooting projectiles since last week, but those projectiles were not able to do damage. Daniel implemented the health component into the attacker so that it now has health points, and I gave the projectile the ability to alter that component.

To add some immersion, I told the Sentry to aim at the attacker’s camera instead of his belly. But that made the projectiles nearly unnoticeable, as you look directly into them most of the time, and they are not solid (like every 3D model). So I added a small random offset that makes the drones slightly miss the head.
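A minimal sketch of that aiming tweak. The offset magnitude is made up, since the source only says “slightly”, and `Vec3` again stands in for Unreal's `FVector`:

```cpp
#include <cmath>
#include <random>

// Stand-in for Unreal's FVector.
struct Vec3 {
    float X, Y, Z;
};

// Aim at the attacker's camera, then add a small random offset so the
// projectiles visibly whizz past the head instead of vanishing inside it.
// MaxOffset (in Unreal Units) is an assumption, not the project's value.
Vec3 ComputeAimPoint(const Vec3& CameraLocation, std::mt19937& Rng,
                     float MaxOffset = 30.0f) {
    std::uniform_real_distribution<float> D(-MaxOffset, MaxOffset);
    return { CameraLocation.X + D(Rng),
             CameraLocation.Y + D(Rng),
             CameraLocation.Z + D(Rng) };
}
```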

Also, I got a request from the designers asking whether the Sentry could keep a certain distance from the attacker. I created a new Environment Query that is triggered when the attacker gets too close to the drone. This query uses another ruleset:
1) generate some nodes in proximity
2) heavily prefer nodes that are not within the “minimum distance” requested by the designers
3) heavily prefer nodes that do not exceed the “maximum attack range” requested by the designers
4) prefer nodes that are as close as possible to the drone itself

These rules should satisfy both distance constraints at the same time (if possible), and the query should also resolve quickly, because rule #4 favors the shortest path length.
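Engine-free, the four rules could be combined into a single scoring function. Real EQS tests score and filter differently; the weights below are assumptions that merely mirror “heavily prefer” (big penalties) versus “prefer” (a small distance cost):

```cpp
#include <algorithm>
#include <vector>

// One candidate node from the query (rule 1 would generate these).
struct Node {
    float DistToAttacker;  // distance from node to the attacker
    float DistToDrone;     // distance from node to the querying drone
};

float ScoreNode(const Node& N, float MinDistance, float MaxAttackRange) {
    float Score = 0.0f;
    if (N.DistToAttacker < MinDistance)    Score -= 1000.0f;  // rule 2: too close
    if (N.DistToAttacker > MaxAttackRange) Score -= 1000.0f;  // rule 3: out of range
    Score -= N.DistToDrone;                                   // rule 4: shortest path wins
    return Score;
}

// Pick the best-scoring node out of a (non-empty) generated set.
Node PickBest(const std::vector<Node>& Nodes,
              float MinDistance, float MaxAttackRange) {
    return *std::max_element(Nodes.begin(), Nodes.end(),
        [&](const Node& A, const Node& B) {
            return ScoreNode(A, MinDistance, MaxAttackRange)
                 < ScoreNode(B, MinDistance, MaxAttackRange);
        });
}
```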

Defender and his drones

An issue appeared where the drones detected the defender as an enemy. I checked everything twice, but it just did not work as intended. By accident, I discovered a minor design choice by Epic that had a major impact on my code. I use the AI Perception system for every actor, and it felt very natural to me to give the eyes to the actors themselves. But Epic assumes that the controller implements the perception system, not the actor. Both are possible, as I had proven, but the IGenericTeamAgentInterface, the one that is used to distinguish friend from foe, only considers a perception system implemented on the controller. It is simply tedious and costs a lot of valuable development time to have to discover such important facts on my own because of the missing documentation…

To iterate on the way the defender will move the camera on the mobile device, I created a very simple UI with buttons to move the camera in every direction, including zoom. Later on, these buttons should be replaced by actual touch gestures: swiping away from the direction I want to move the camera toward, and the two-finger pinch gesture known from mobile web browsers, but zooming instead of scaling the view.
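A sketch of how those gestures could map onto the camera, assuming a simple 2D camera model; all names and factors here are mine, not the project's:

```cpp
// Illustrative 2D camera for the defender's top-down view.
struct Camera2D {
    float X;
    float Y;
    float Zoom;
};

// Swiping away from a direction moves the camera toward it, hence the
// inverted delta. PanSpeed is a made-up tuning factor.
void ApplySwipe(Camera2D& Cam, float DeltaX, float DeltaY,
                float PanSpeed = 1.0f) {
    Cam.X -= DeltaX * PanSpeed;
    Cam.Y -= DeltaY * PanSpeed;
}

// A two-finger pinch changes zoom instead of scaling the view:
// spreading the fingers apart zooms in proportionally.
void ApplyPinch(Camera2D& Cam, float OldFingerDistance,
                float NewFingerDistance) {
    if (OldFingerDistance > 0.0f)
        Cam.Zoom *= NewFingerDistance / OldFingerDistance;
}
```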

Mainframe got a HUD

The Mainframe is the single structure in the middle of the level that the Defender has to take care of. If all of its health points vanish, he loses the game. This central object should also provide the ability to spawn drones. I changed the old mechanic of just clicking anywhere to spawn a drone to one that only reacts if you hit the Mainframe object first. It then creates a widget in the actual game world around the Mainframe that automatically generates a button for every spawnable drone. Each button shows the drone’s name, a description, and its energy cost from the stock, all of which are now settable in each drone’s blueprint.
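Data-driven widgets like this usually boil down to a struct per drone plus a loop. A hedged sketch with made-up field names (in the project these are blueprint-exposed properties):

```cpp
#include <string>
#include <vector>

// Per-drone data as it might be exposed on each drone's blueprint.
// Field names are stand-ins, not the project's actual properties.
struct DroneSpawnInfo {
    std::string Name;
    std::string Description;
    int EnergyCost;  // energy taken from the stock when spawned
};

struct SpawnButton {
    std::string Label;
    int Cost;
};

// The world-space widget loops over every spawnable drone and
// generates one button per entry.
std::vector<SpawnButton> BuildSpawnButtons(
        const std::vector<DroneSpawnInfo>& Drones) {
    std::vector<SpawnButton> Buttons;
    for (const auto& D : Drones)
        Buttons.push_back({ D.Name + " - " + D.Description, D.EnergyCost });
    return Buttons;
}
```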

Attacker Pickups

Rebecca asked me for two simple pickup versions: one that adds health and one that adds energy to the Attacker. It was a fifteen-minute task, and I really like these easy tasks when I am struggling at another end of my code and just want a break. So it got implemented.
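Both pickups can be sketched as simple clamped additions; the maximums here are illustrative, not the project's tuning:

```cpp
#include <algorithm>

// Illustrative attacker stats touched by the two pickup variants.
struct AttackerStats {
    float Health;
    float Energy;
};

// Health pickup: add, but never exceed the maximum.
void ApplyHealthPickup(AttackerStats& S, float Amount,
                       float MaxHealth = 100.0f) {
    S.Health = std::min(S.Health + Amount, MaxHealth);
}

// Energy pickup: same pattern for the energy stock.
void ApplyEnergyPickup(AttackerStats& S, float Amount,
                       float MaxEnergy = 100.0f) {
    S.Energy = std::min(S.Energy + Amount, MaxEnergy);
}
```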

Sound

One of the lessons from older projects is to implement sound as early as possible, for several reasons:
1) It is easier to implement a sound hook while I am currently working on the code. Otherwise, I have to get back into the mindset I had while working on that code days before.
2) Adding even cheap placeholder sounds enables the team early on to get a feeling for what fits into the scene and what does not.
3) Compared to the complexity of art or the implementation of other mechanics, a simple sound is a lot easier to achieve and adds nearly the same value to the final product.

David provided some morse-code sounds that tell the player what a drone is currently doing, like a program printing to a console. There are also sounds now for spawning a drone and for shooting, plus an impact effect for the projectile.


That ends this week’s entry. Thanks for reading.