Unity and Refactoring

I use JetBrains’ Rider for C# development in Unity and, like all JetBrains IDEs, it comes with top-of-the-class refactoring support.

Unfortunately, this does not extend to Unity’s serialised object data and so any refactoring of serialised fields on your MonoBehaviours will break objects that are using that behaviour – both prefabs and objects in your scene.

I’m quite sure it’s perfectly possible to fix this in Rider’s Unity plugin (JetBrains: hint hint :)), but until that happens, this is how I deal with it:

Whenever I need to change a public field (or any field marked with [SerializeField]) on a MonoBehaviour that I know is already used on prefabs or in a scene, I first add the new structure to the MonoBehaviour and let it implement ISerializationCallbackReceiver. In OnAfterDeserialize, I then add code to copy the data from the old structure onto the new one. When this is done, I switch to Unity, which effectively causes it to reload all prefabs and scenes, applying my transformation code. I now have the old data in the new structure and can remove the old structure without losing anything.

For example, let’s say I have made this fantastic behaviour

public class Projectile : MonoBehaviour
{
    public float angleInRadians;
}

And a level designer calls me up and says: “Radians? WTF dude?”. I guess that means he’d probably prefer to specify this in degrees. I could just change the meaning in the code, but then the name is misleading and existing values will be wrong. So I would like “angleInRadians” to be called “angleInDegrees” – if I just rename it, whatever values are assigned to this in Unity will be lost, so we instead add angleInDegrees as a new field, and let’s make it an integer, just because:

public class Projectile : MonoBehaviour
{
    public float angleInRadians;
    public int angleInDegrees;
}

Then, let it implement ISerializationCallbackReceiver

public class Projectile : MonoBehaviour, ISerializationCallbackReceiver
{
    public float angleInRadians;
    public int angleInDegrees;

    public void OnBeforeSerialize() { }
    public void OnAfterDeserialize() { }
}

And add code to convert the value:

public class Projectile : MonoBehaviour, ISerializationCallbackReceiver
{
    public float angleInRadians;
    public int angleInDegrees;

    public void OnBeforeSerialize() { }
    public void OnAfterDeserialize()
    {
        angleInDegrees = (int)(180*angleInRadians/Mathf.PI);
    }
}

When we return to Unity, it will compile and run the code, updating our new structure with the correct value.

Now, this looks all fine and dandy in the editor, but there’s one little catch – the modifications we made went unnoticed by Unity’s change tracking system, so it does not think the objects have changed and thus will not save them unless we explicitly, and manually, cause it to.

Having to select each object and prefab manually kind of defeats the whole purpose of what we’re trying to do. Ideally, we’d want to automatically mark the objects dirty, but we can’t do that during de-serialisation since the APIs for this are off-limits in that context (for obvious reasons), so I have added two small scripts that help me do this:

One is just a static list of objects where I can track all the mutations I make on load:

public class MutatedOnDeserialisation
{
    protected static List<Object> _mutated = new List<Object>();

    public static void Register(Object obj)
    {
        _mutated.Add(obj);
    }
}

The other is an Editor script that extends the one above with a File menu command to traverse the list and mark all the objects as dirty:

public class MutatedOnDeserialisationEditor : MutatedOnDeserialisation
{
    [MenuItem("File/Mark Mutated Objects")]
    public static void MarkMutated()
    {
        foreach (var obj in _mutated)
            if (obj) { EditorUtility.SetDirty(obj); Debug.Log("Marked " + obj); }
            else Debug.Log("Skipping " + obj);
    }
}

I must admit that I don’t understand why there are null refs in there, but there are – I suspect a combination of temporary objects during load and Unity’s odd boolean operator overload to be playing tricks on me, but as far as I can tell it’s ok to just ignore them.

Also of note: Unity’s documentation says to use Undo.RecordObject() instead of SetDirty(), but for whatever reason, that did not work for me.

In any case, with these two scripts in place, we just need one additional line in the de-serialisation code to make it work, namely the call to register our mutated object:

public void OnAfterDeserialize()
{
    angleInDegrees = (int)(180*angleInRadians/Mathf.PI);
    MutatedOnDeserialisation.Register(this);
}

This is obviously (a lot) more work than simply pressing SHIFT+F6, but it’s also a lot less than going through every prefab and object instance in the project and updating it manually – especially in cases that are less trivial than this one, like changing from a primitive type to a list or struct – and it carries no risk of typos or accidentally missed occurrences.

Still, if anyone has a better way of doing this, I would love to hear about it 🙂


2016 MacBook PRO review

Update 2017/3/14:

I’ve been meaning to append an update to this post for a while, but have been too busy to do so. Today, I was once again reminded about it for reasons I’ll get back to shortly.

So, immediately after the original blog post, I had two major issues:

  1. It had a HW glitch in the keyboard (hubris for being too tough on Dell, I’m sure) – basically, the ‘O’ key only worked one out of three times it was pressed. Apple replaced it overnight, no problem.
  2. Then I was consistently getting a black screen on start-up if I had entered sleep mode with my external monitor attached. Apple tried for a short while to blame my external monitor… That stopped when they learned that it’s an Apple Thunderbolt display. There was another feeble attempt at blaming 3rd party software, so I reset the machine to factory settings and the problem didn’t go away. So, finally convinced, they offered to take it back, but I would then have to wait 3 months for a new one because it’s a custom build. I finally opted to just have it repaired. One more night in the shop, and everything has been fine since. So, early build, Monday model, lemon, I don’t know, but I guess no matter what you pay, you can never be sure HW won’t have issues. Lesson learned.

In both cases Apple Support has been top notch (save for the external monitor thing, but that was actually kind of funny 🙂 ). When things do go bad, and they do, good customer support is all I really ask for, and Apple’s is second to none. I leave them my number, 30 seconds later they call back and walk me through the various tests and problems until there’s a solution.

HW glitches aside, there are a couple of things that I wrote about originally that I now have a 3+ months’ perspective on:

Weight and size do matter. I was unsure if the modest reduction would make any difference, but it is noticeably nimbler on an everyday basis. It is also way quieter than the old one, and it does not have the same hotspots around the charging port, for example.

Performance – well, it’s the same story every time you upgrade. It only takes a few days to get used to it, and you then forget what it was like before. The only thing I’ve noticed consistently is that I have far fewer graphics issues in Unity. Before, I would get weird texture glitches every now and then (lack of GPU memory, probably) and had to restart Unity. I haven’t seen that since I got the new laptop.

The USB-C port, much as I predicted, has been a non-issue. Or rather, it has turned out to be more annoying for others because they can’t borrow my charger or my lightning cable, but since that greatly reduces my risk of losing either, I consider it an overall win 🙂

The OLED display, however… Well, it’s the reason I finally decided to write this, because it’s such a ridiculous non-feature that I just had to vent. It has no practical purpose whatsoever; it does nothing that I couldn’t do without it. Nothing. But if it was just that, if it was just a party trick, I could try to forget how much it probably added to the BOM and simply ignore it. But that’s exactly the problem: it’s not just useless. It is in fact incredibly annoying on a daily basis.

The bar has no tactile feedback, so even just using it as a row of regular old function keys is a step down from a normal keyboard. But it gets worse because of how sensitive the touch is. I usually hover my finger over the F7 and F8 keys when debugging, but now I can’t, because it causes random taps when I don’t want them. And don’t get me started on how many times I’ve been asked if I want to turn Siri on. I do not plan to talk to my laptop. Ever. So you can stop asking, but the damn button is right above the backspace key, so if your finger is just a wee bit off, Siri will think you called. Same story on the other side, where F1 is so easily activated that I get annoying help dialogs all over the place.

Apple, seriously, what were you thinking? Did this “feature” go straight from Ive’s skull directly into production – nobody actually tested it?

If you absolutely had to change the keyboard, how about you fix the goddamn cursor keys. Yes, I would like full size up/down keys also – I’m a grown up, I have grown-up-size fingers.

Yes, I could use an external keyboard, but at this price, I really shouldn’t have to.

Original post follows below:

Me and my laptop

It’s been 4 years since I bought my trusty MacBook Pro Retina. That’s easily a year more than I’ve ever had a laptop before, and the first time ever that I was in doubt as to whether it made sense to get a new one.

All the laptops I’ve had before (top-of-the-line Dells and Lenovos, typically) have started getting tired after less than 2 years, making the last year of ownership a real drag. Part of that has been due to Windows bloat and a build-up of crud in the registry and file system in general, so I suspect the MacBook has lived longer not just because of its hardware but also because of OS X – but the hardware is certainly part of it.

(On a side note, the Dells have been notorious for needing hardware replacements; from keyboards losing keys to graphics cards getting bugged colours or weird artefacts, I’ve made extensive use of the next-business-day service. In fact, my wife got a Dell not too long ago and had to replace the HDD after just a few weeks. They might be OK for home computers, but they’re anything but a professional tool, and I’ve had my last Dell for sure.)

Still, 4 years in with the MacBook, I feel like it could easily have done a year or two more, but two things were nudging me towards a new laptop: the 256GB SSD was constantly kissing the 95% mark, so I had to do frequent cleanups of my temp and trash folders to be able to work (not a bad thing per se 🙂 ), and the graphics card was sometimes struggling a bit with two large displays running Unity, Photoshop and Blender at the same time, causing the fan to be on more than I would have liked.

Enter the 2016 MacBook Pro.


First of all, let’s get the price out of the way: it’s ridiculous – there is no justification for it, other than Apple charging big bucks because they know they can. For me, as for most people in my situation I would guess, it is, however, also somewhat irrelevant.

It’s my primary workhorse – the only tool I own (aside from software licenses) – and I buy one every 3 (now 4) years and use it 10 hours every day. Saving $1000 and risking a Dell-esque piece-of-crap hardware that needs a new HDD in the middle of the week, costing me a couple of days’ worth of work, just isn’t an option. The fact that it lasts a year or two longer than the competition means I would probably pay even more if needed (don’t tell Apple that, though 😉 )

So, with that out of the way, what am I getting for all my cash?

It’s obviously lighter and thinner – I only just got it, so I can’t say if this is something I’ll really notice, but since I drag it around all the time, less is definitely better. The limited testing I’ve done suggests that it also stays cooler, which was my only real concern with the reduced size. The old one could get seriously hot around the hinge when Unity was pushing polygons at 60FPS, so I had a (seemingly unwarranted) suspicion that a smaller form factor might make it harder to get rid of the heat and thus make the problem worse.

I love the keyboard. It’s a bit clickety-clack where the old one was dead silent, but the feel is just phenomenal. I actually think I type faster and make fewer typos, though that could be a euphoria-induced illusion ;).

The touchpad is even more awesome. I loved the old one, which was leaps beyond anything I’d ever had on any other laptop, yet it pales next to the new one. Everything from the sound it makes to the very subtle, yet rewardingly haptic, feedback when you press it is just best-of-class – nothing I’ve tried on any other laptop (mine or others) is in the same ballpark – hell, not even in the same galaxy. It’s so good that I feel a bit bad for throwing $100 at a new Magic Mouse 🙂

The OLED display that replaces the function keys is… well, it’s a gimmick. A very expensive one, I presume. I really don’t see the point of it, and I seriously doubt I’ll ever use it, but I’m going to try to keep an open mind. Maybe there’s some kind of use for it and I just can’t see it, but this is probably the one thing I would have taken out if I could, just to save a few bucks.

USB-C… I *really* hate that I can no longer use my Thunderbolt display as a charger. This, to me, is the only really annoying “feature” of the new MacBook, but I still think it was the right thing to do. My old MacBook would get quite hot around the magnetic charging port, so I suspect the connection wasn’t all that fantastic, even if it was a great concept, and having a single shared port for everything just seems like the right thing to do. The world probably isn’t ready for it, but it wasn’t ready for tablets either, and that changed quickly enough. Having an adapter for my other legacy peripherals is a minor moan that I suspect will go away faster than I can possibly imagine. So thumbs up for paving the way for a world of fewer odd cables, though I wish I could have bought a mag-charger adapter.

I now have 512 GB of SSD, which should keep me running for the next 4 years (my 256 GB disk has been almost full for 4 years, so it’s not like my needs are skyrocketing, but every new project I start does tend to add a handful of GB, so it’s nice not to have to worry about it).

The new graphics card is fast. The old one was never a problem, but I can quite clearly feel the difference when working in Unity – projects that were just barely managing 60FPS in the editor are now keeping an almost constant 100FPS. I know this isn’t a cardinal point for everyone, but for me, this upgrade alone was worth the price.

It’s pretty much the same CPU I had in the old one, though the memory is faster, so I suspect a benchmark would tell me that everything runs a tad quicker – but probably not enough to have any real-world impact on my daily use. If I consider a typical day in my life, the amount of time I spend waiting for CPU bottlenecks is probably less than the time I spend getting coffee (yes, I drink too much coffee, but that can’t be helped). I much prefer a smaller, cooler, less battery-hungry machine over one that is 15% faster in those few cases where it actually matters, so the CPU was never a concern for me.

Finally, I love that I can have it in space grey 🙂

#MadeWithUnity in 7 days

7 Days

7 days after I got up in the middle of the night and – for no particular reason – decided to add my own peculiar contribution to a very long list of color-match puzzle games for mobile devices, I finished and uploaded the release build of Qube Crush 3D.

That’s 7 days (and nights) from initial spark to a completed F2P game with 50 levels of brain-bending puzzles, uploaded to the App Store.

Just to prove to myself that it wasn’t a fluke or a happy accident, and since I had a week or two to burn while waiting for the App Store review process to complete, I decided to do another game in the meantime.

This was to be a 2-player arena battle game, and the result is Magic Micro Battle Arena, which I completed in just 5 days – mostly because it has no IAPs or ads and no highscores, so less integration work and no horrible IAP testing. Ironically, it also got approved faster, because Qube Crush got rejected due to an issue with my AdColony integration.

So: two fully functional, playable games completed in two weeks. These are not Gears of War or Doom 8, and their feature lists have been cut to the bone, but anyone with an idea of just how much time the crud of project setup, tool-chain woes, app-store nightmares and 3rd party integration (not to mention play-testing) usually takes should realize what an amazingly short time-to-market that is.

The rapid development probably means I missed a bug here or there, and I could certainly have polished both of these games for another couple of weeks, or even months, but before I do, I’d like to know that there is at least someone out there who wants to play them ;).

Because my investment is so limited, I can try whatever crazy ideas I come up with, and if it fails, I can move on without bleeding to death.

Unity Rocks!

And that’s really why I’m writing this blog post: because of how this is all possible. Obviously, I *am* a pretty awesome developer (hah! 🙂 ) but, truth be told, I owe most of my productivity to the game-dev tools we have at our disposal today, and at the heart of those is (in my case at least) Unity3D.

I can’t say enough good things about Unity – sure, it has its bugs and quirks and annoying issues, but at the end of the day it’s just so insanely productive that I hardly notice.

I run a 4-screen setup with Unity and Blender on a big cinema display, Photoshop on my Cintiq and Mono on my laptop, with either my phone or iPad hooked up for testing – with Unity’s rapid automatic import there is no sense of context switching; it’s as if it’s one big application built up of state-of-the-art parts (minus Mono, which is anything but).

Testing and trying out stuff in Unity is so quick and easy that in two weeks I have not once had to start the debugger. Mind you, if launching the debugger from Mono wasn’t such a nightmare, I’d probably have been able to finish the games even without cancelling my weekends 🙂

So here’s to the guys at Unity: YOU ROCK!

(But please find a replacement for Mono. For those of us who’d rather chop off our left arm than install Windows, Mono is still (if only barely) the preferable option, but I’d really like to see a Unity/JetBrains C# IDE with bullet-proof indexing, a nice coder-friendly editor and proper refactoring that does not break script-to-game-object bindings.)

If you have an iOS device, you can get the apps on the App Store – Qube Crush is free* and Micro Arena is $1 (if you ask nicely, I may have a promo code for you 🙂 )



If you don’t have an iOS device, wait for an Android update, or check out a video of Qube Crush here:


*) Yes, I know what I’ve previously said about “free apps”. I still think it’s a fundamentally bad idea, but the sad fact of the matter seems to be that as an unknown IP or small game developer, you are left with a choice between “free” and, well, nothing else at all… So much for not being part of the problem, though.

C# and its broken scope rules

I just got bit by a real fluke in C#.

I’m sure one of Hejlsberg’s soldiers will dig deep into the spec and locate some addendum to a side-paragraph that explains it as perfectly reasonable behaviour, but before you do that, I’d like you to inspect the code and, without running or compiling it, tell me what you would expect to happen:

public void WeirdCSharp()
{
    int i=1;
    DateTime dt2 = new DateTime();
    switch(i)
    {
        case 0:
            DateTime dt = new DateTime();
            SomeFunction( () => { Debug.Log(dt2); Debug.Log(dt); } );
            break;
        case 1:
            SomeFunction( () => { Debug.Log(dt2); } );
            break;
    }
}

public void SomeFunction(VoidEvent d) { d(); }

I expected this to print out dt2.

Not so. In fact, it will throw a run-time error of enigmatic proportions:

ArgumentException: Value does not fall within the expected range.

Never mind that the error makes about as much sense as tea without biscuits – it is in fact thrown in case 1, when SomeFunction is called. Not because there is anything wrong with this line, but because (I assume) the dt variable from the case above is uninitialized yet gets captured in the call block due to the messed-up switch-level scope.

You can fix it by introducing local scope in the case (add curly brackets). Ironically, I’ve made a general habit of this exactly because C# has weird scope rules in switch statements, but missed it this once.
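For reference, this is what the workaround looks like – with the braces, each case gets its own scope, so nothing from case 0 can leak into the capture in case 1:

```csharp
switch(i)
{
    case 0:
    {
        // dt now lives only inside this block
        DateTime dt = new DateTime();
        SomeFunction( () => { Debug.Log(dt2); Debug.Log(dt); } );
        break;
    }
    case 1:
    {
        SomeFunction( () => { Debug.Log(dt2); } );
        break;
    }
}
```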

So yes, there’s a simple workaround, but obviously, C# should never have had scope at the switch level – it made some amount of sense in C, where you could fall through from one case to the next, but since C# does not allow this, there is absolutely no point in maintaining this arcane design.

And thus, no reason for this issue to exist.

Unity GameCenter Support

I’ve spent the last couple of days implementing GameCenter support in a Unity game and found more than a few pitfalls, all of which go back to a seriously broken Unity API.

So, if you are planning on using the Social API in Unity3D, here’s a couple of landmines you may not want to step on (I’m sure there are others, but these are the ones I wasted time on) – oh, and just for the record:

1) I love Unity – it’s an awesome platform. It’s just their GameCenter implementation of the Social API that could do with a bit of love 🙂

2) These issues were found in Unity 4.5 (but Google suggests they have been there forever, so I would not count on them being fixed anytime soon)

Loading scores

The first thing that bit me is that ILeaderBoard.LoadScores only supports a single request at a time. If you try to call it multiple times, only the last invocation will receive a callback (several times, whatever good that will do you). I implemented the following helper to work around the issue:

  struct LoadScoreRequest
  {
    public string            _id;
    public string[]          _userIds;
    public BoardLoadedEvent  _callback;
  }

  public delegate void BoardLoadedEvent(ILeaderboard board, HighScore[] highScore);

  private bool _friendsLoaded;
  private bool _loadScoreRequestPending;
  private List<LoadScoreRequest> _loadScoreRequests = new List<LoadScoreRequest>();

  public void CreateAndLoadLeaderBoard(string id, string[] userids, BoardLoadedEvent ondone)
  {
    _loadScoreRequests.Add( new LoadScoreRequest() { _id=id, _userIds = userids, _callback = ondone } );
    SendNextScoreRequest();
  }

  private void SendNextScoreRequest()
  {
    LoadScoreRequest req;

    // Wait until the friends list is in, and only run one request at a time
    if (!_friendsLoaded || _loadScoreRequestPending || _loadScoreRequests.Count == 0)
      return;

    _loadScoreRequestPending = true;
    req = _loadScoreRequests[0];
    _loadScoreRequests.RemoveAt(0);

    // No explicit user list given - default to the local user's friends
    if (req._userIds == null)
    {
      req._userIds = new string[Social.localUser.friends.Length];
      int i=0;
      foreach(IUserProfile friend in Social.localUser.friends)
        req._userIds[i++] = friend.id;
    }

    ILeaderboard board = Social.CreateLeaderboard();
    board.id = req._id;
    board.SetUserFilter(req._userIds);

    board.LoadScores( (bool scoresloaded) =>
    {
      _loadScoreRequestPending = false;
      req._callback(board, null); // conversion of board.scores to HighScore[] omitted here
      SendNextScoreRequest();     // process anything queued up in the meantime
    });
  }
Basically, it queues up all requests until the GameCenter authentication has been completed and the friends list has been loaded. It then starts loading the leaderboards one at a time, making sure each board gets returned to the proper callback.

The authentication code is not shown above, but it’s straightforward: it simply sets the _friendsLoaded member to true after loading the friends list and then calls SendNextScoreRequest in case any requests got queued up while authenticating.
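A minimal sketch of that authentication step might look like this (using the standard Social API calls; the method name is mine):

```csharp
void Authenticate()
{
    Social.localUser.Authenticate( (bool success) =>
    {
        Social.localUser.LoadFriends( (bool friendsLoaded) =>
        {
            // Flush any score requests that were queued while we authenticated
            _friendsLoaded = true;
            SendNextScoreRequest();
        });
    });
}
```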

Saving scores

IScore.ReportScore does not work.

At least, I could not get it to work. Use Social.ReportScore() instead; it seems to work as advertised and has the added advantage that you don’t need to carry around the ILeaderboard reference. Why the broken method is even in the public API, I can only guess.
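For example (the leaderboard ID is obviously a placeholder):

```csharp
Social.ReportScore(score, "my_leaderboard_id", (bool success) =>
{
    // Take 'success' with a grain of salt - see the error handling section
    Debug.Log("Score reported: " + success);
});
```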

Error handling

Basically, there isn’t any.

At least, you can forget about checking the boolean “success” parameter provided in the various callbacks. I’ve never seen it have a value other than “true”, despite a myriad of issues, none of which I’d personally characterise as a “success”.

Instead, check that the data you expected is available Рfor example, that the user- or highscore-lists are not empty.
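In practice that means something like this (a sketch; board is an ILeaderboard as in the code above):

```csharp
board.LoadScores( (bool success) =>
{
    // Don't trust 'success' - look at the actual data instead
    if (board.scores != null && board.scores.Length > 0)
    {
        // We got real results - safe to proceed
    }
    else
    {
        // Treat as a failure: retry, or fall back gracefully
    }
});
```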


Be really careful with Social.LoadAchievementDescriptions – it will load the associated images for all achievements and keep them in memory. They will not be released again.

In my case, it amounted to 12 images of 1024×1024 pixels which, with some overhead of unknown origin, amounted to 96MB each time the game refreshed the achievements list.

There’s a similar problem with loading user profiles – their profile images are also kept in memory and are not released when you let go of the profile itself.

Throw to hit

Most people probably don’t consider the intricacies of computer-game enemies shooting at them. In many cases rightfully so, since it’s mostly a very simple feature. The game code to send a fast-moving projectile towards you rarely involves more than subtracting two points to get a direction and multiplying by whatever speed bullets happen to travel at in the given game. If it’s fast enough, it’ll hit you. Other times, bullets are instant-hit, and the game simply checks for intersection with a line segment – no chance of escaping that one either.

In the game I’m currently working on, however, most of the enemies don’t shoot bullets; they throw water balloons. Just as in real life, timing and aiming a throw is quite a bit more involved than firing a bullet, and while most of us learn this as kids by trial and error, a game usually solves it with a bit of math.

It really isn’t much more complicated than the two scenarios I just described, but it does involve matching a polynomial against a linear interpolation, so if both of those sound alien to you and you happen to be faced with this problem, read on :).

The Stage

Before we look at the math, here’s a quick overview of the problem we’re faced with:

Throwing B at P

Let’s say we’re the enemy and we’re at location B0, throwing at the player who is currently at P0. What we want is to find the velocity F that our projectile B must travel with so that it will impact with P when the gravity (g) pulls the projectile to the ground. N is the current velocity of the player.

The path of the projectile is given by the second order polynomial

B = B0 + F*t + G*t²

The path of the player is given by the linear equation

P = P0 + N*t

Assuming, of course, that he does not change direction or speed – which is kind of the whole point of the game 🙂

Before we try to solve this, note that the G vector is special in that it only affects one axis. This simplifies matters a great deal since our second-order polynomial then only exists for this axis, and hence, we can separate the motion in the X/Z plane from that of the Y axis. Or more specifically, we can ignore X/Z and pick a velocity in Y that creates a nice curve and gives us a time of impact, and then adjust our projectile speed in X/Z to match the player’s location at the calculated time.

So let’s look at the equation for Y:

B0.y + F.y*t + G.y*t² = P0.y + N.y*t

For some rather game-specific reasons, I don’t care if the player moves in Y, so my N.y is always 0. It should be straightforward to change the code to take this into account; you just need to carry the N.y term along. Anyway, in my case, I can simplify to:

G.y*t² + F.y*t + (B0.y - P0.y) = 0

Which is the 2nd order polynomial we need to solve to determine our time of impact, t. If this doesn’t ring any bells, just google it – there’s a textbook solution for this, and chances are you’ve solved a million of these in school and just forgotten about it ;). Below, I’ll just go through the code.


This is what a prettified version of my implementation looks like:

Vector3 P0 = player.transform.position;
Vector3 B0 = enemy.transform.position + initialOffset;
Vector3 F  = (P0-B0);
float l = F.magnitude;
if (l > 0)
{
    F.y = l*.5f;

    Vector3 G = new Vector3(0,-9.81f/2,0);
    Vector3 N = player.transform.forward * player.runningSpeed;

    float t = SolvePoly(G.y, F.y, B0.y-P0.y);
    if (!float.IsNaN(t))
    {
        F.z = (P0.z-B0.z) / t + N.z;
        F.x = (P0.x-B0.x) / t + N.x;
        WaterBalloon b = (WaterBalloon)Instantiate(_waterBalloon);
        b.transform.position = B0;
        b.rigidbody.velocity = F;
    }
}

In short, this starts by picking a Y velocity proportional to the distance to the target. This creates a higher arc for targets that are further away. With F.y set, I can solve the second order polynomial for Y and derive the time of impact t, which I then use to calculate the remaining two components of F.

F is now my initial velocity, which I can assign directly to my (non-kinematic) rigidbody in Unity. If, for some reason, you are using a kinematic projectile, you’ll need to move it yourself in the FixedUpdate method. Something along the lines of

transform.position += _velocity * Time.fixedDeltaTime;
_velocity += _gravity * Time.fixedDeltaTime;

should do the trick.


There are a couple of details in this code that are maybe not immediately obvious:

Line 2: I add an initial offset to the enemy location to get the projectile start location. This is simply because the enemy is supposed to throw the balloon, not kick it – i.e. I want it leaving his hands, not his feet 🙂

Line 8: I set the Y velocity to half the distance between the enemy and the player. This is a more or less arbitrary choice – the smaller the number, the flatter the curve. Note that this value alone determines time-to-impact, so if you keep it constant, short throws will take just as long to hit as long throws.

Line 10: If you look closer at the definition of G, you’ll notice that gravity is defined to be only half of earth’s normal gravity of 9.81. The reason for this is that the polynomial is describing the curve of the projectile, which is the integral of its velocity, not the velocity itself. If you differentiate to get the velocity of the curve:

p = p0 + v*t + k*t² =>

p’ = v + 2*k*t

…and insert g/2 for k, you will see that the velocity contribution from gravity correctly becomes g*t.

Line 11: I calculate the player’s velocity using his forward direction and a speed – if the target is a non-kinematic rigidbody, you could probably use its velocity directly, but mine isn’t.

Line 13: As you probably know, a second order polynomial has two solutions – I, however, only calculate one of them. If you plot the curve, the reason will be obvious: the polynomial crosses zero twice between thrower and target. Once going up, and once coming down – we want the one coming down, which is the larger t, and because of the nature of our particular set of coefficients, this is always the solution described by s1 below:

private float SolvePoly(float a, float b, float c)
{
   float sqrtd = Mathf.Sqrt(b*b - 4*a*c);
   float s1 = (-b - sqrtd)/(2*a);
   return s1;
}

Line 14: It is worth noting that this *can* fail. If the chosen Y velocity is not sufficiently large to bring the projectile to the same height as the target location, the second order polynomial has no solutions, and sqrt will return NaN – hence the check for NaN. You *could* check for this in SolvePoly before doing the sqrt, but in my case it’s such a rare occurrence that I’m not too concerned with the performance penalty. I’m happy as long as it does not create any visual artifacts.

Lines 16-17: These two equations are easily derived from the original equations for the projectile position B and the player position P, like this:

B0 + F*t + G*t² = P0 + N*t =>

And then solve for F, noting that G is zero in the cases of X and Z which are the ones we are currently interested in.

F*t = P0 – B0 + N*t =>

F = (P0 – B0) / t + N
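As a quick numeric sanity check of that last equation (a Python sketch – variable names are mine, positions are (x, y, z) tuples):

```python
# F = (P0 - B0)/t + N for the X and Z axes, given the flight time t
# already found by the solver.  (Names are illustrative, not Unity's.)
def horizontal_velocity(b0, p0, player_vel, t):
    fx = (p0[0] - b0[0]) / t + player_vel[0]
    fz = (p0[2] - b0[2]) / t + player_vel[2]
    return fx, fz

# Thrower at the origin, target 10 units ahead strafing at 2 u/s on X,
# flight time 2 s: the throw leads the player's motion.
fx, fz = horizontal_velocity((0, 0, 0), (10, 0, 0), (2, 0, 0), 2.0)
# fx = 10/2 + 2 = 7.0, fz = 0.0
```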

Floating point numbers and colors…

… and why the idea of mapping a floating point color value of 1.0 to the byte value 255 was a really bad idea.

We’re stuck with it, I know, and I found a way to hack my way around the problem, but still: it’s one of those things that may have seemed like a good idea at the time, when the sensible thing would have been to map 1.0 to 256 – a lot of things would have worked much nicer that way.

This all started with me messing around with shaders in Unity. I needed to pass a bunch of data from the Unity application all the way down to the fragment shader, and decided that the best way to do this was to encode it in a texture – I mean 4 floats per pixel and relatively cheap access from the shader – what could go wrong?

The first couple of issues I ran into stemmed from various sub-systems’ over-zealous desire to “improve” my texture before handing it over. No thank you, I do not want that pixel averaged and mipmapped. This was mostly a matter of setting the right options and properties, so I got that sorted.

The real problem, however, was that I somehow managed to completely ignore what a texture really is: an array of 8-bit color values. While we may see pixel values as a nice vector of 4 floats in both Unity and the shader code, the color itself lives a small portion of its life in the cramped space of just 4 bytes.

And here comes the mapping issue: because 1.0 is mapped to 255, the nice round floating point value of 0.5 does not become a nice round byte value of 128. It becomes 127.5. You cannot store 127.5 in a byte, so it ends up as 127. When you convert 127 back to a float it, in turn, ends up as 127/255 = 0.498-something. Which is not just off from the 0.5 I was hoping for, it is also rounded down, so when I try to use it to find a tile in a texture, I end up in the wrong tile.


What I really needed to store was the integer values from 0 to 15, so here is my solution:

In Unity: color.r = N/16 + N/4080;

In my shader: N = floor(color.r*16)

Not rocket science; I convert the number to a float in the range [0;1], and add a magic offset. N/4080 is the smallest number that, after being converted to a byte, will not cause the conversion back to float to be rounded down. This number depends on N due to the nature of FP numbers. Why the constant is 4080 I don’t know, but I’m sure there’s an IEEE FP guru somewhere who can enlighten me.
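The round trip can be verified in exact rational arithmetic (a Python sketch using `fractions` to sidestep float rounding; one observation worth noting is that 4080 = 16 · 255, which makes N/16 + N/4080 equal to exactly 16·N/255):

```python
from fractions import Fraction

def to_byte(value):
    # Texture storage: multiply by 255 and round *down* into a byte.
    return int(value * 255)

def tile_index(byte):
    # Shader side: decode back to [0;1] and take floor(x * 16).
    return int(Fraction(byte, 255) * 16)

# Naive encoding of N = 8 as 8/16 = 0.5 stores byte 127, which decodes
# to 127/255 < 0.5 and lands one tile short:
assert tile_index(to_byte(Fraction(8, 16))) == 7

# With the N/4080 offset, N/16 + N/4080 == 16*N/255, so the stored byte
# is exactly 16*N and every N from 0 to 15 round-trips correctly:
tiles = [tile_index(to_byte(Fraction(n, 16) + Fraction(n, 4080)))
         for n in range(16)]
assert tiles == list(range(16))
```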

For now, I am just happy that my textures are tiling properly 🙂

Remember, remember, the month of December

It’s been mildly annoying for a while – how iOS, and UIKit in particular, treats even the most irrelevant little deviation in input data as a hard error and throws an exception, instead of just logging the problem and either ignoring the request or adjusting the input parameters.

This morning it became *really* annoying, but I’ll get back to that in a minute.

You may argue that the hard error causes more of these issues to be fixed by sloppy developers who would otherwise ignore the warnings and release code that, strictly speaking, isn’t correct. This is a fair argument but I really think it should be limited to debug code.

The problems I have with throwing hard errors on a visual rect that is 2 pixels too wide or, say, a “scroll to index path” call that has an invalid path (and this is what I will be getting back to, shortly), are:

1) The exception is caught at application level, and stops execution so further debugging is not possible. As a result, locating the actual code can be tedious – especially for cases that are not easily reproduced. If it had been a warning in the log, it would often be possible to step back and reproduce the action while the app is still running.

2) When this happens in released code you crash an app because of a glitch that would, in many cases, probably not even have been noticeable.

And (2), of course, is why I am even writing this – I got bit by this today, badly, as I realized that TempusCura basically does not work for the entire month of December.

The reason?

Well, as a courtesy to the end-user, the calendar view automatically scrolls to the current month when opening the main view. The current month, obviously, is a number from 1 to 12. The indices in the table that is showing the months, however, are 0-11. Now, for all other months, this creates a small offset as the view actually centers on the next month rather than the current – I never noticed this, and neither did anyone testing the beta version of the app. When you get to December, however, UIKit throws an exception because the index 12 is illegal for an index range of 0-11.

This is not really a critical feature in any way, yet it turned out to be a major showstopper, as the app is completely unusable for the entire month of December.

Yes, I know it’s a bug on my part, and yes, maybe I should have tested all use cases for all days of the calendar year, but we all know how realistic that is.

Yet, if UIKit had not been so anal about these utterly irrelevant details, and simply showed January, or capped the index at 11, the app would have worked just fine.
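Capping, for reference, is a one-liner (a Python sketch, not the app’s actual Objective-C):

```python
# Calendar months are 1-12, table rows 0-11.  Scrolling to `month`
# instead of `month - 1` is off by one row all year, and out of range
# in December, where index 12 falls outside rows 0-11.
def safe_scroll_row(month, row_count=12):
    # Defensive clamp: fix the off-by-one and cap to a valid row.
    return min(max(month - 1, 0), row_count - 1)
```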

And if Apple didn’t take 2 weeks to approve an app, the patch would already be on-line… Sigh!

** 2013/12/02 – UPDATE **

TempusCura is now available in V1.7.13, which fixes the December crash. Hats off to Apple, who got the app reviewed and approved in just one day! Awesome :).

Tempus Cura Testing


So, it’s crunch time again… Time to dig into those last frantic hours (days) before uploading a new app to the AppStore, making sure everything is correct, that all the texts and images that come with the release are in their proper place, all signatures and little iTunes oddities are sorted and, of course, that the App actually works as it should.

TempusCura complicates matters further by being a hosted multi-user App, introducing a variation in UI perspective defined by different user-roles, and a server backend that needs to be tested for vulnerabilities and scalability as well. I am using Google App Engine, which introduces a third backend concern: optimizing for cost. Sometimes the best performing solution is not the cheapest, and at the end of the day I *am* trying to run a business, so the numbers need to add up.

But that’s a topic for a different post – right now I wanted to share my pre-release test procedure for TempusCura.

During development I use TestFlight to manage testers and while that works for distribution of test binaries, it does not really help validate anything. Unit testing sorts out bugs in core functionality and is especially useful for backend testing, but as anyone who ever worked on an App (or any other UI intensive piece of software) can tell you, there’s a scary bug potential hiding in the UI which no amount of Unit testing will find.

So at the end of the day – or the end of development, rather – boring as it is, you do need to go through that end-to-end systems test. I guess there are many ways to do this, but for an iOS app, I like to sit down with InterfaceBuilder as a guide and the app running on a device and then go through every screen and every button while writing up a test protocol.

The test protocol is basically a check list of what should happen in various parts of the app when the user performs certain actions under some given circumstances.

To give an idea of how that may look, I’ve attached the TempusCura test protocol in its first revision as an example.


Some of the things that I try to keep in mind when writing the protocol are:

  1. Write each test as a simple one-line question/statement that can be easily validated as true or false.
  2. Try to group tests so that they follow the flow in the app naturally – this makes testing so much quicker.
  3. Make sure all UI paths are included.
  4. Make sure the expected result is obvious from the test description.

One of the interesting things with a test protocol is how many bugs you find just writing the document. For TempusCura I found a handful of small and large issues when writing the document, and 17 bugs (quite a few serious ones) the first time I ran the test.

Remember to always take the entire test from the start with the binary you intend to upload – we all know what usually happens when we “fix” bugs in the 11th hour 😉

Enclosure Progress

It’s my own damn fault for not paying attention to the little details, I know. But it was still annoying to find that even though Unity4 supports terrain objects on mobile devices, it doesn’t really “support terrain objects on mobile devices”. As soon as you include a terrain object in the scene, frame rates drop to “unplayable”, and none of the king’s tweaks or any of his optimisations will bring it back above 20fps again (if anyone thinks differently, please do let me know how you made it work).

So, after cursing for a little while, I accepted my fate and threw away the 30+ hours of work I’d spent getting the terrain just like I wanted it. The replacement is a flat plane, which came with the obvious problem of how to create holes. It’s easy enough to add geometry on top of the plane (like the rocks in the screenshot), but when you have objects that extend below the plane (like the crater), it doesn’t really work. I considered custom building the plane geometry at run-time, but was a little concerned with the potentially growing triangle count, so I ended up using a custom shader that allows me to “paint” holes in the plane at run time. While not as pretty as the terrain, it sure is a helluva lot faster.

I’ve done a lot of work on weapons and enemies and am quite close to having something playable – the environment is still a little pale, but since it’s purely for decoration, I want the gameplay done before I start polishing it.


So far the armoury includes 3 automatic sentries: the obligatory mini-gun, a flamethrower and a rocket launcher. The rocket launcher uses the Unity physics engine for rocket movement, creating some funky trajectories. On top of this I’ve added two manually controlled weapons to involve the player a little more in this aspect of the game: the plasma cannon is the surgeon’s tool for taking out individual enemies, while the not-so-subtle “Artillery” covers an entire area in a mayhem of fire and debris for a short period of time.

I’ve added limited ammo, including clips and reload delays to all weapons. This was not in the original game, but it’s another parameter to adjust when balancing the game later, and I think it will come in handy. I will also add the option to turn sentries on and off to make the most use of whatever ammo you have.

The beasts fall into two categories: walking and flying – both are done and working, although the flyers still need some tweaking… I have 6 unique beast types in total, 4 walking and 2 flying – some of them will appear in various skins with different attributes.