Neural network analytics
Neural net analytical techniques emulate, in computer software, the workings of an organic brain, identifying patterns and relationships in the data they are presented with. For example, we could present our neural net software with data on the branch operations of a multi-site service business. We could then build a model to explain which combinations of location, employees and customers lead to more profitable branches.
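As a minimal sketch of how such a model might be built - assuming, purely for illustration, a file branches.csv of branch records with hypothetical columns such as floor_space, staff_count, experience_years and annual_profit - a small feed-forward network can be fitted in a few lines of Python with scikit-learn:

    # A sketch only: the file name and column names below are illustrative
    # assumptions, not data from any real engagement.
    import pandas as pd
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    data = pd.read_csv("branches.csv")            # hypothetical branch records
    features = ["floor_space", "staff_count", "experience_years"]
    scaler = StandardScaler().fit(data[features])
    X = scaler.transform(data[features])
    y = data["annual_profit"]

    # A small feed-forward network: branch characteristics in, profit out.
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    model.fit(X, y)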
The model could then be interrogated to examine the effect of changing the team experience quotient in a particular branch; the model would show that including some individuals might lower the expected profitability, while others might increase it - we would therefore have derived a measure of the relative productivity of different employees. Another model might then be constructed relating this individual productivity to tenure, age, training and so on, leading to a value-based profile of the ideal employee.
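Continuing the sketch above, interrogating the model amounts to perturbing one input for a branch and observing the predicted change in output; the feature index used here is again a hypothetical illustration:

    import numpy as np

    branch = X[0].copy()                          # one branch's scaled inputs
    baseline = model.predict(branch.reshape(1, -1))[0]

    what_if = branch.copy()
    what_if[2] += 1.0                             # nudge experience_years upward
    change = model.predict(what_if.reshape(1, -1))[0] - baseline
    print(f"predicted change in annual profit: {change:+.0f}")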
Sometimes the data just isn't good enough to build reliable models - but neural net techniques are more likely to find relationships than any other method, particularly where data is limited.
Neural nets "learn" their way gradually towards a solution rather than solving equations for minimum error in the way that traditional statistical regression does. They are also capable of inferring relationships between "inputs" and "outputs" that may be highly non-linear. Unconstrained, though, they will derive models with a misleadingly high statistical "fit" between modelled and actual data.
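The contrast can be made concrete with a toy example: ordinary least-squares regression solves directly for the minimum-error coefficients, while a neural net inches towards its (possibly highly non-linear) fit one small gradient step at a time. The sketch below uses synthetic data and a hand-rolled one-hidden-layer network, purely to illustrate the mechanics:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-2, 2, size=(200, 1))
    y = np.sin(x) + 0.1 * rng.normal(size=(200, 1))   # a non-linear "truth"

    # Traditional regression: one closed-form solve for the best linear fit.
    X1 = np.hstack([x, np.ones_like(x)])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)

    # Neural net: repeated small corrections, gradually "learning" the shape.
    W1, b1 = 0.5 * rng.normal(size=(1, 16)), np.zeros(16)
    W2, b2 = 0.5 * rng.normal(size=(16, 1)), np.zeros(1)
    lr = 0.05
    for _ in range(5000):
        h = np.tanh(x @ W1 + b1)                      # hidden layer
        pred = h @ W2 + b2
        err = pred - y
        # Back-propagate the error and step a little way downhill.
        dW2, db2 = h.T @ err / len(x), err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h ** 2)
        dW1, db1 = x.T @ dh / len(x), dh.mean(axis=0)
        W1, b1 = W1 - lr * dW1, b1 - lr * db1
        W2, b2 = W2 - lr * dW2, b2 - lr * db2

    # Compare how closely each approach fits the curved truth.
    lin_err = ((X1 @ beta - y) ** 2).mean()
    net_err = ((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2).mean()
    print(f"mean squared error - regression: {lin_err:.4f}, neural net: {net_err:.4f}")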
By using the right training algorithm we can ensure that the most obvious lessons are learned first, so that the spurious "modelling" of noise - the Achilles' heel of regression - happens last. Our stopping algorithms are designed to halt the learning process close to the point where genuine predictive power is maximised, by testing the model on data set aside for the purpose. As long as the neural net is inferring relationships that are real, the "fit" of modelled output to actual output will improve for the "test" data as well as for the modelling set. When spurious inferences are drawn from the modelling set (called "over-training"), the fit of such models to the test set drops. Stopping modelling at the point of highest fit to the test set is the best way of ensuring the reliability of a model.
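A sketch of this stopping rule, continuing with the hypothetical branch model from earlier: part of the data is set aside, the fit to that set-aside data is measured after each training pass, and learning halts once that fit has stopped improving. (scikit-learn's MLPRegressor offers the same behaviour built in, via early_stopping=True.)

    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    X_model, X_test, y_model, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    # warm_start=True makes each call to fit() run one more training pass
    # (each one-pass fit emits a harmless convergence warning).
    net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=1,
                       warm_start=True, random_state=0)

    best_fit, passes_without_gain = float("-inf"), 0
    for epoch in range(500):
        net.fit(X_model, y_model)
        fit = net.score(X_test, y_test)       # R^2 against the set-aside data
        if fit > best_fit:
            best_fit, passes_without_gain = fit, 0
        else:
            passes_without_gain += 1
        if passes_without_gain >= 20:         # fit has peaked: over-training
            break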
The Alexander Partnership have been developing and using neural net techniques for many years in a wide variety of business applications, enabling us to attack problems that were previously impossible to analyse statistically - and to do so in a time-efficient manner.