If you can’t measure it, you can’t manage it.
When bootstrapped, it is especially important to measure the output you achieve for any given input. You need this to weigh your opportunity costs and, where possible, the return on investment of the alternatives.
WHY is this important?
Because when bootstrapped you simply have a very finite amount of resources, most notably human ones (young founders tend to ignore this and fail badly – see also founder fever). You must be brutally aware, more than ever, of the output you generate for any given input. Repeat: the input-output relationship. Startups simply cannot afford to be wasteful with resources of any kind.
Hence, if the input-output relationship is not clear in front of you (or at least as clear as the ambiguity you are willing to tolerate), make it clear or stop doing it! Why would you do something whose impact you cannot see?
While this seems intuitive in theory, and the usual nodding in the crowd starts to beget moaning (or readers begin to wonder whether to continue reading), I have too often witnessed just the opposite. The reason: there are still four common pitfalls preventing rigorous application in “real life”:
1 Degree of data-drivenness
Unwillingness to apply a data-driven approach and/or to work outside the comfort zone. Example statements I have come across:
- “You cannot measure HR”
- “You cannot measure PR”
- “You cannot measure XYZ”
- …et cetera
There are two typical situations here:
- Your culture is not particularly data-driven: I would strongly recommend fixing this!
- Your culture is broadly fine (that is, data-driven in this case), but you have people on board who don’t get it or don’t support it: get going with feedback talks and tools to teach the basics of measuring output. Worst case: let the employees in question move on quickly!
2 It is tough to measure!
The qualitative nature of the subject indeed makes it tough to measure:
- “Well that’s brand budget – we don’t get any visible return to measure”
- “How do you want me to measure HR output?”
Just because it is tough to quantify something qualitative does not mean you should not try. After all, if it were easy, everyone would do it.
Build a matrix with two axes: accuracy and availability (the degree to which a metric is easily obtainable). Plot your potential measurement options and settle on the best proxy you are willing to accept as a compromise.
If the input then visibly impacts the proxy you chose as output, you are in the green.
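The matrix above can be sketched in a few lines of code. This is a minimal illustration, not a prescribed method: the candidate metrics and their 0–10 scores are invented for the example, and the combined score (simple sum) and minimum-accuracy cutoff are assumptions you would tune to your own tolerance for ambiguity.

```python
# Minimal sketch of the accuracy/availability matrix.
# Candidate metrics and all scores (0-10) are invented for illustration.
candidates = {
    "NPS survey":      {"accuracy": 9, "availability": 3},
    "love mails/week": {"accuracy": 5, "availability": 10},
    "support tickets": {"accuracy": 6, "availability": 8},
}

def best_proxy(candidates, min_accuracy=4):
    # Drop proxies below the accuracy you are willing to tolerate,
    # then pick the one with the best combined score.
    acceptable = {name: s for name, s in candidates.items()
                  if s["accuracy"] >= min_accuracy}
    return max(acceptable,
               key=lambda n: acceptable[n]["accuracy"] + acceptable[n]["availability"])

print(best_proxy(candidates))
```

Raising `min_accuracy` models a stricter compromise: with `min_accuracy=6` the easily available but contested proxy drops out and a more accurate one wins.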
Example: Customer Satisfaction
I used to measure customer satisfaction (the goal, or output) by the weekly volume of so-called “love mails” from customers (the chosen proxy). Though imperfect (the definition of a love mail was often contested), it directed the attention of our service team to where it rightfully belonged (exceeding the expected service level).
Note: I could also have opted for regular feedback surveys such as a Net Promoter Score, but that would have required time to set up and evaluate, plus potential costs for the respective tool, and so on. At that point in time I chose to tolerate the imperfect “love mail” proxy, trading off accuracy for time and cost.
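Tracking such a proxy need not be more complicated than a weekly tally. A minimal sketch, assuming someone tags the relevant mails and you have their dates (the dates below are made up):

```python
from collections import Counter
from datetime import date

# Hypothetical dates of mails a team member tagged as "love mails".
love_mails = [
    date(2024, 1, 2), date(2024, 1, 4),                      # ISO week 1
    date(2024, 1, 10), date(2024, 1, 11), date(2024, 1, 12), # ISO week 2
]

def weekly_volume(dates):
    # Group by (ISO year, ISO week) and count mails per week --
    # the weekly count is the proxy for customer satisfaction.
    return Counter(d.isocalendar()[:2] for d in dates)

print(weekly_volume(love_mails))  # per-week counts, keyed by (year, week)
```

A falling count week over week is the signal to investigate; the absolute number matters less than the trend.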
- A radical alternative is the “scream test”: stop doing it (the input) and see whether it really has no impact (the output). HR, for instance, will surely show its impact if it is not done thoroughly, or not done at all. You will sometimes be surprised by the results!
3 Technical shortcomings?
- “Google does not let me!”
- “Xyz is not working correctly”
Let’s assume the tools are installed and configured correctly; then the issue is simple:
- training and
- more training.
It’s a matter of personal technical ability and curiosity. End of story.
4 Technical shortcomings!
Let’s assume the tools really are “not working”.
Here, first define what “not working” means: is something being recorded but incorrectly, not recorded at all, or not (or poorly) displayed?
- Broken? Fix it! Duh.
- Something missing? Usually the tools are fine and only lack metrics or a visual representation. MS Excel hacks usually do the trick (but beware of short-term solutions becoming permanent). Before investing heavily in new tools (setup time, costs for upgrades, training), try to export the data to Excel and see if you can build something that works.
- Review again if you really cannot do without it. Can you settle for a proxy which you already measure or which is more easily obtainable? Beware of analysis-paralysis!
- Understand: finite resources allow you to do only so many things. Measure their output wisely.
- Move first toward a higher degree of data-drivenness as part of your company culture before you upgrade or buy tools. No tool, regardless how great, was ever used where data was not appreciated.