You know those bad stories you hear from designers and architects, for example? They deliver the project, the client says “nice!”, and only later do they discover that the client has completely changed the site, the logo, the architecture and much more, even though they paid for the work.
In the world of statistics, something similar can happen when you consult for companies that need staff but have no human resources employee. I’m taking an example from the world of micro and small companies, because this doesn’t happen from medium-sized companies upwards. Since the statistician sees every problem as a nail for their hammer, they can help with the selection by creating a structured quiz survey, to be filled out by the candidates.
The survey contains sections of theory questions that measure various aspects of the role. The questions are “googleable”, but not immediately, partly because you want to discourage candidates who would take hours to hunt down the perfect answers. In addition to the quiz, the candidate can be assessed with an oral interview run by the company’s technical employees, who obviously sit in and listen. It may happen that the employees give a positive opinion after the interview while the quiz points in the opposite direction. Ignoring that signal can cost several thousand euros, because it means hiring a person who matched subjective expectations but not the measured ones. The statistician may have written a quiz that is not consistent with what the technical employees asked for, but that also depends on how well they collaborated. In any case, a discrepancy between the quiz and the oral interview is effectively a wake-up call, and ignoring it costs money.
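To make the idea of a “discrepancy” concrete, here is a minimal sketch of how quiz scores and interview ratings could be set side by side. The data, column names and threshold are hypothetical, invented for illustration, not taken from any real engagement.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical data: one row per candidate, both scores rescaled to 0-100.
candidates = pd.DataFrame({
    "name": ["A", "B", "C", "D"],
    "quiz_score": [82, 45, 70, 30],        # structured quiz result
    "interview_score": [75, 80, 68, 85],   # average rating from the technical employees
})

# Rank correlation: do the two assessments broadly agree?
rho, p_value = spearmanr(candidates["quiz_score"], candidates["interview_score"])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.2f})")

# Flag candidates where the two evaluations diverge by more than a chosen threshold.
threshold = 25  # hypothetical cut-off, to be agreed with the company
candidates["discrepancy"] = (candidates["interview_score"] - candidates["quiz_score"]).abs()
print(candidates[candidates["discrepancy"] > threshold])
```

Candidates flagged this way are exactly the wake-up calls mentioned above: cases where the interview impression and the measured result tell two different stories.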
Another example: in a structured company with a marketing department, the statistician reports that campaign A does not earn significantly more than campaign B, with the same costs and other controls held equal. Ignoring this warning wastes thousands of euros on ineffective campaigns or advertisements.
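As a rough illustration of the kind of check behind that statement, here is a minimal sketch comparing the daily revenue of two campaigns with a Welch t-test. The numbers are simulated for the example, and a real analysis would also adjust for costs and the other controls, for instance with a regression.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)

# Hypothetical daily revenue (euros) generated by each campaign over one month.
revenue_a = rng.normal(loc=1050, scale=200, size=30)
revenue_b = rng.normal(loc=1000, scale=200, size=30)

# Welch's t-test: does campaign A earn significantly more than campaign B?
stat, p_value = ttest_ind(revenue_a, revenue_b, equal_var=False, alternative="greater")
print(f"t = {stat:.2f}, p = {p_value:.3f}")

if p_value >= 0.05:  # conventional threshold, used here only for illustration
    print("No evidence that A outperforms B: shifting budget to A is not justified.")
```

The point is not the specific test but the discipline: before declaring a campaign the winner, check whether the difference survives a comparison at equal cost.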
Or the statistician can point out that the marketing strategies do not have the desired effectiveness, at the risk of antagonizing the employees of that department, who may then avoid passing the result on to managers.
Another example: the company asks for a data dashboard containing a certain number of metrics deemed important. The statistician, partly for reasons of readability, may recommend reducing that number. The company rejects the advice. After a while the statistician realizes that nobody has looked at a dashboard crammed with too many numbers. The “Calm” in “STATiCalmo” also means making statistically informed decisions with supports that do not repel people because they contain too many numbers, graphs and so on.
Another example: the company asks straight away for a specific technical solution. The statistician, after exploring the reasons behind that request, says a different one is needed. In some cases the requested solution is unnecessarily more elaborate, and more costly, than the statistician’s proposal. The company proceeds with another professional who does not object, and ends up with a solution that perhaps requires more maintenance.
Another example: the company asks to optimize conversions at the end of a certain process. The statistician says it pays to have an overview of the process from the beginning, for instance a dashboard that feeds into a fishbone diagram, a tool that belongs to causal analysis (a term I rarely use). Having that overview costs more in terms of calls, organization and data collection, and the company doesn’t have the budget. In the end the company does nothing and carries on with two problems, which can be summarized as one.
Such a company will probably last less than 12 years, the average lifespan before accounting for context (sector, legal form, etc.).
If you think about it, these situations all involve opportunity costs, a concept any microeconomics student can explain. If you’re interested in avoiding them, we can start with a preliminary call.