Before the popularization of statistics in business, which dates back to the 1980s (later for data science), companies rarely had computers. As a result, they navigated by sight, making decisions mostly on intuition and experience. Simplifying, companies ran on people, calculators and printers, not on data.
Let me immediately add a clarification: the large companies of the time obviously had greater resources for statistical processing. The personal computers of the 1980s, combined with affordable programs and, later, open-source statistical languages, democratized statistics, making it accessible even to small and medium-sized businesses, which, outside of administered or command economies, represent the vast majority of the economic fabric.
Before the 80s, the math was done by hand or semi-automatically: pen and graph paper, typewriters, printing calculators such as the Olivetti Summa 24, and electronic calculators such as the HP 9100, which could not yet be called personal computers.
With the 1980s, electronic calculators took the leap and became personal computers. Programs such as Lotus 1-2-3, the ancestor of the ubiquitous Excel, popped up: its graph function, very popular in its time, dynamically linked the data on the worksheet to a chart, chosen from a range of five possible types: line, bar, stacked bar, pie and XY. It also allowed statistical inference: https://www.researchgate.net/publication/19471001_The_use_of_LOTUS_1-2-3_in_statistics
This doesn't mean it couldn't be done before, but you had to know a programming language, such as S, the ancestor of R. So a statistical programmer was needed, and one was much more expensive at the time.
A curiosity that really amazed me about Lotus: the graphing and charting routines were written in Forth by Jeremy Sagan, son of Carl Sagan, the science communicator I have appreciated since my adolescence.
Statistical programmers of the time might know Fortran, BASIC, C, SQL or, later in the 80s, C++.
Without a programmer, a person trained in specific packages was needed instead, such as SAS (1976, SAS Institute), Minitab (1972), or SPSS (1968). These came at considerable cost, which is why, as anticipated, only a few large companies could justify the computing and software-licensing expense (not counting training courses).
The 90s brought the languages that have become today's standard: R and Python.
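To give a sense of how far that democratization has gone: the kind of inference Lotus 1-2-3 once offered, linking worksheet data to a fitted line, now takes a few lines of standard-library Python, no license or specialist required. A minimal sketch with invented example data (the ad-spend and sales figures below are hypothetical):

```python
from statistics import mean

# Hypothetical data: monthly ad spend (x) vs. sales (y), in arbitrary units
x = [1, 2, 3, 4, 5]
y = [2.1, 4.0, 6.2, 7.9, 10.1]

# Ordinary least squares by hand: slope = cov(x, y) / var(x)
mx, my = mean(x), mean(y)
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
    (xi - mx) ** 2 for xi in x
)
intercept = my - slope * mx

print(f"sales ≈ {slope:.2f} * spend + {intercept:.2f}")
```

In practice one would reach for R's `lm()` or a Python library rather than hand-rolled formulas, but the point stands: what once required a statistical programmer or an expensive package is now a commodity.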
Do you want to make decisions based on statistics, which will improve the effectiveness and efficiency of your company? Let’s get to know each other first in a free call.