Tuesday, November 6, 2012
Hello ANN.
Predictive Analysis Pro:
An Artificial Neural Network, or ANN, is a statistical computing model inspired by biological neural networks. It is made up of interconnected nodes, and each node's effect on the analysis can vary based on the input it receives from hidden neurons, much as our sensory organs affect how our brains process information. These systems are built around a learning algorithm and can take advantage of both parallel and sequential processing. The strength of a system like this lies in predicting patterns: rather than following a purpose-built input algorithm, it learns from historical data on the subject.
In biological systems, the discovery of the neuron was worth a 1906 Nobel Prize. Uniquely, a neuron has many inputs and connections but only one output. It works simply by adding together the signals from other neurons until a threshold is reached, at which point the neuron enters an excited (firing) state. The synapses joining neurons can attenuate those input signals. In the early 1940s this was modeled mathematically as an artificial neuron, built to produce a binary output. The real power of neurons emerges in an interconnected group: the network itself has the capacity to evolve, and it becomes capable of far more than the sum of its parts. A network can settle into a state where no further changes occur, and it may have more than one stable state, determined by the synaptic weights and thresholds of each neuron.
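The threshold behavior described above can be sketched in a few lines. This is a minimal, illustrative model in the spirit of those early artificial neurons, not any particular historical formulation; the weights and threshold values are made up for the example.

```python
# A minimal sketch of an artificial neuron: inputs are weighted
# (like synapses attenuating signals), summed, and compared against
# a threshold to produce a binary output.

def neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of inputs reaches the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two excitatory inputs and one inhibitory input (negative weight).
print(neuron([1, 1, 0], [0.6, 0.6, -1.0], 1.0))  # both excitatory fire -> 1
print(neuron([1, 1, 1], [0.6, 0.6, -1.0], 1.0))  # inhibition suppresses -> 0
```

Note that a single neuron like this only draws one linear boundary; the interesting behavior comes from wiring many of them together.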
A feed-forward configuration helps to model perception. It consists of a pyramid-shaped network of neurons: a large input layer forms the base, a smaller output layer caps it off, and a hidden layer of neurons sits in between. The more complex the system, the more neurons this structure contains. Minsky suggested that a full AI would need several tiers rather than the three of a simple structure; he remarked on nine, and it is easy to imagine that number rising to increase the power of a system.
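A forward pass through such a layered pyramid can be sketched as below. This is only an illustration of the shape of the computation, assuming a standard sigmoid activation in place of a hard threshold; the weights here are arbitrary placeholders, not trained values.

```python
import math

def forward(inputs, layers):
    """Propagate inputs through a feed-forward network.
    `layers` is a list of weight matrices; each matrix holds one
    weight list per neuron in that layer."""
    activations = inputs
    for weights in layers:
        activations = [
            # Sigmoid squashes each neuron's weighted sum into (0, 1).
            1.0 / (1.0 + math.exp(-sum(w * a for w, a in zip(row, activations))))
            for row in weights
        ]
    return activations

# Pyramid shape: 4 inputs -> 3 hidden neurons -> 1 output.
layers = [
    [[0.2, -0.1, 0.4, 0.3]] * 3,   # hidden layer: 3 neurons, 4 weights each
    [[0.5, 0.5, 0.5]],             # output layer: 1 neuron, 3 weights
]
out = forward([1.0, 0.0, 1.0, 0.5], layers)
print(out)  # a single activation between 0 and 1
```

Adding more tiers, as Minsky's remark suggests, just means adding more weight matrices to the `layers` list.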
Utilizing a system like this to predict patterns in vast amounts of historical data could give a business a real competitive advantage: a strong statistical model to support everyday decisions. Take, for example, a company that wishes to release a new product. It has a list of 1 million customers and wants to know what percentage would be interested without having to contact them all. The company has decided the first production run will be 20,000 units, and that serves as the threshold: it requires a 2 percent response rate across the full list to reach the goal. Instead of contacting the entire list, the system is first taught what to expect by seeing the responses for a given set, so only 100,000 customers are contacted. This subset teaches the network relationships and patterns it can use to predict which of the remaining 900,000 customers should be contacted. When the remaining customers' data are presented to the network, it identifies 32,000 individuals most likely to purchase the product. The savings are seen instantly in a case like this: instead of spending labor hours contacting the majority of the list, you focus on the goal and on the people who will likely buy your product.
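The arithmetic of the campaign above can be laid out explicitly. The numbers come straight from the example; the model itself is not shown here, so the 32,000 flagged buyers are simply taken as the network's output.

```python
# Campaign arithmetic from the example: all figures match the text.
customer_list = 1_000_000
production_run = 20_000
required_rate = production_run / customer_list
print(f"Required response rate: {required_rate:.0%}")  # 2%

# Train the network on a contacted sample, then score the rest.
training_contacts = 100_000
remaining = customer_list - training_contacts        # 900,000 left to score

# The trained network flags 32,000 likely buyers among the remainder,
# so the company contacts only those instead of all 900,000.
flagged = 32_000
contacts_saved = remaining - flagged
print(f"Calls avoided: {contacts_saved:,}")          # 868,000
```

Even before counting any sales lift, the saved labor on 868,000 calls is where the instant savings in the example come from.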
This type of system could be immensely useful in phone sales analysis, and I can see several other fields where predictive technologies are not only invaluable but practically the entire business. Fund managers, stock traders, and anyone else in financial services likely stand to gain the most, because their product is money. Weather modeling and even sports management are further candidates.