Computational Intelligence (CI), Machine Learning (ML) & Artificial Intelligence (AI)
- expert systems – logic based
- neural networks – based on biological neuron
- fuzzy inference systems – linguistic modelling
- neuro-fuzzy learning systems – combination of learning and rule based transparency
- genetic algorithms – evolutionary technique
- other biologically inspired – particle swarm optimisation (PSO), ant colony optimisation
(Ref: Artificial Intelligence in Mechatronics, University of Melbourne)
Epidemiology-based, information-driven dynamic sensor collaboration (DSC) provides
a flexible approach that often allows the accommodation of the opposing criteria of
saving energy and gaining information, and the provision of an optimal protocol.
The primary goal of algorithms/protocols for communication networks is rapid
information diffusion based solely on local information. Their effectiveness can be
measured by the total amount of information that is moved through the network to
accomplish the task, the time it takes for updates to be complete, the kind of global
information assumed and the environmental constraints.
When epidemiology is applied to networks, questions similar to those asked about the
epidemic spread of a disease arise:
1. How many hosts will have been infected when the outbreak dies out?
2. How many hosts are infected at any given time?
3. Which hosts are vulnerable?
4. Which hosts are critical?
These questions apply equally well to marketing and the propagation of word-of-mouth messages, as well as to bushfire forecasting.
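The first two questions can be explored with a minimal SI (susceptible–infected) simulation on a random graph. The Erdős–Rényi graph model, the transmission probability and all parameter values below are illustrative assumptions, not taken from any particular study:

```python
import random

def simulate_si(n, p_edge, p_transmit, steps, seed=0):
    """Simulate a simple SI epidemic on a random graph.

    Returns the number of infected hosts after `steps` rounds,
    answering 'how many hosts are infected at a given time'.
    """
    rng = random.Random(seed)
    # Build an Erdos-Renyi style random graph as adjacency sets.
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p_edge:
                adj[i].add(j)
                adj[j].add(i)
    infected = {0}  # patient zero
    for _ in range(steps):
        newly = set()
        for host in infected:
            for neighbour in adj[host]:
                if neighbour not in infected and rng.random() < p_transmit:
                    newly.add(neighbour)
        infected |= newly
    return len(infected)
```

Running the function for increasing `steps` with the same seed traces the growth of the outbreak; varying `p_edge` shows which topologies leave hosts effectively unreachable (the "vulnerable vs. critical" questions).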
Gossip algorithms were initially introduced for replicating databases; they were later
applied to peer-to-peer networking to extract aggregate information from the network.
On the implementation side, one reported design presents a sensor with two power
modes (sleeping and receiving transmission), which substantiates the feasibility of
using a gossip-based protocol for sensor activation with existing sensor technology.
We carried out studies on correlated and non-correlated chemical tracer fields, and on
random and regular grid topologies.
Gossip-based algorithms have been applied to information dissemination in
large-scale systems because of their simplicity of deployment, robustness and
resilience to failures; they mimic the spread of contagious diseases. Health-related
research focuses on stopping epidemics rather than encouraging them; when the same
models are applied to communication networks, the emphasis shifts to spreading
information as widely as possible. Gossip-based protocols are generally characterised
by the following parameters: buffer capacity, relay count and fan-out.
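A minimal push-gossip sketch illustrating two of these parameters, fan-out and relay count (buffer capacity is not modelled). The fully connected peer-sampling model and the parameter values are illustrative assumptions:

```python
import random

def gossip(n, fan_out, max_relays, seed=0):
    """Push-style gossip: each informed node forwards the message to
    `fan_out` peers chosen uniformly at random (possibly already
    informed), at most `max_relays` times. Returns the number of
    informed nodes when gossiping stops."""
    rng = random.Random(seed)
    informed = {0}                 # node 0 starts with the message
    relays_left = {0: max_relays}  # per-node relay budget
    active = [0]
    while active:
        node = active.pop()
        while relays_left[node] > 0:
            relays_left[node] -= 1
            for peer in rng.sample(range(n), fan_out):
                if peer not in informed:
                    informed.add(peer)
                    relays_left[peer] = max_relays
                    active.append(peer)
    return len(informed)
```

Even small fan-out and relay-count values inform most of the network, which is why these two knobs trade energy spent against coverage achieved.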
Particle swarm optimisation
Particle swarm optimisation (PSO) is an evolutionary algorithm based on a swarm of
particles. It was first introduced by Kennedy and Eberhart in 1995.
Similar to other population-based optimisation methods such as genetic algorithms,
PSO starts with the random initialisation of a population of particles in the search
space. It converges to the global best solution by simply adjusting the trajectory of
each individual particle towards its own best location and towards the best particle
of the entire swarm at each time step (generation).
This maps well onto team dynamics in the social sciences or in management, where the
best solution can be drawn from a diverse group.
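The trajectory-adjustment rule described above can be sketched as follows; the inertia weight `w` and acceleration coefficients `c1`, `c2` are common textbook values, not prescribed by the original paper:

```python
import random

def pso(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5,
        bounds=(-5.0, 5.0), seed=0):
    """Minimal global-best PSO minimising f over `dim` dimensions."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best location
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Pull towards own best and towards the swarm's best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

On a smooth test function such as the sphere, the swarm contracts onto the global best within a few dozen generations.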
Genetic algorithm (GA)
The genetic algorithm is an evolutionary algorithm that mimics the process of biological evolution.
It is a powerful optimisation tool that can be used to find maxima or minima in complex multidimensional
spaces. Unlike algorithms such as simulated annealing, which explore the space using a single Markov chain, the GA
uses a population of random vectors to search the space. Evolutionary algorithms
have been used to solve complex problems including inverse radiation problems,
assimilating sensor data in dispersion and finding optimal placement of biological sensors in a CBR fusion context.
In GA, a randomly initialised population of solution strings is evolved by manipulating the strings using a set of
operators such as crossover and mutation. At each iteration, the population evolves such that the fitness of the population improves in
a manner similar to the survival-of-the-fittest principle in the evolution of biological systems.
GA-b represents variables as encoded binary strings and works with the binary strings to minimise
the cost, while GA-c works with continuous variables.
The two genetic algorithms we implemented in MATLAB used the negative log
likelihood function as the fitness function. Selection of individuals for mating was based on a
chromosome’s rank ordering of fitness rather than the actual (or
scaled) fitness value. Relative to proportional selection, ranking selection reduces
selection pressure when fitness variance is high and increases selection pressure when
the variance is low. Ranking schemes can avoid premature convergence. Because
three parameters (x, y, α) are used to characterise a radiation source, the GA-c implementation
used a chromosome consisting of 3r floating point numbers, where r is the
number of sources. In GA-b, each parameter was represented using a 16-bit binary
string and, therefore, a binary chromosome of length 48r bits. Both implementations of
GA used a population size of 12 and the maximum number of iterations was 4500.
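The loop described above can be sketched in Python (the original implementations were in MATLAB). This is a GA-c-style optimiser with rank-based selection, arithmetic crossover, Gaussian mutation and elitism; the stand-in fitness function, mutation rate and iteration count are illustrative assumptions, not the settings of the original study:

```python
import random

def ga_minimise(f, dim, pop_size=12, iters=300, p_mut=0.1,
                bounds=(-5.0, 5.0), seed=0):
    """Continuous GA: evolve a population of real-valued chromosomes
    to minimise the fitness function f."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iters):
        ranked = sorted(pop, key=f)            # best chromosome first
        # Rank-based selection: parents weighted by rank, not raw fitness.
        weights = [pop_size - r for r in range(pop_size)]
        new_pop = [ranked[0][:]]               # elitism: keep the best
        while len(new_pop) < pop_size:
            a, b = rng.choices(ranked, weights=weights, k=2)
            alpha = rng.random()
            # Arithmetic crossover blends the two parents.
            child = [alpha * x + (1 - alpha) * y for x, y in zip(a, b)]
            if rng.random() < p_mut:
                d = rng.randrange(dim)
                child[d] += rng.gauss(0.0, 0.3)  # Gaussian mutation
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=f)
```

Replacing `f` with a negative log likelihood, as in the text, turns this into a maximum-likelihood source estimator; a GA-b variant would instead decode each 16-bit substring into a parameter value before evaluating `f`.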
Artificial Neural Networks
Artificial neural networks are modelled on the biological neurons of the human brain. The brain performs massively parallel computation: roughly 10^11 neurons with about 10^4 connections per neuron.
Neural networks can be trained for
- function approximation (non-linear regression),
- pattern association, or
- pattern classification.
The training process requires a set of examples of proper network behaviour
- inputs x and target outputs t
- during training, the weights and biases of the network are iteratively adjusted to minimise the network performance function.
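A minimal sketch of this training loop for function approximation: a single hidden layer of tanh units trained by stochastic gradient descent on squared error. The architecture, learning rate and epoch count are illustrative choices:

```python
import math
import random

def train_net(xs, ts, hidden=8, lr=0.05, epochs=5000, seed=0):
    """Train a 1-input, 1-hidden-layer, 1-output network on example
    pairs (x, t) and return the trained predictor function."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1, 1) for _ in range(hidden)]   # input -> hidden
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]   # hidden -> output
    b2 = 0.0
    for _ in range(epochs):
        for x, t in zip(xs, ts):
            # Forward pass: tanh hidden layer, linear output.
            h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
            y = sum(w2[j] * h[j] for j in range(hidden)) + b2
            err = y - t
            # Backward pass: adjust weights and biases downhill.
            for j in range(hidden):
                grad_h = err * w2[j] * (1 - h[j] ** 2)
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * grad_h * x
                b1[j] -= lr * grad_h
            b2 -= lr * err

    def predict(x):
        h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
        return sum(w2[j] * h[j] for j in range(hidden)) + b2
    return predict
```

Trained on a handful of samples of y = x², the network interpolates the curve: a small instance of non-linear regression as listed above.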
ANNs have been applied to forecasting heavy-vehicle traffic using piezo-electric cables, e.g. "Correlation of WIM Data with Classified Counts: Using Artificial Neural Networks to Identify Load Distribution at Classifier Sites".
Fuzzy inference systems
A fuzzy set is an extension of an ordinary (crisp) set: it allows partial membership, so an element may partially belong to a set. Membership of elements in a crisp set is described by its characteristic function.
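The contrast between a crisp characteristic function and a fuzzy membership function can be sketched as follows; the temperature set "warm" and its breakpoints are illustrative assumptions:

```python
def crisp_membership(x, lo=20.0, hi=30.0):
    """Characteristic function of the crisp set 'warm' = [20, 30] degC:
    an element is either fully in (1) or fully out (0)."""
    return 1.0 if lo <= x <= hi else 0.0

def fuzzy_membership(x, a=15.0, b=25.0, c=35.0):
    """Triangular membership function for the fuzzy set 'warm':
    rises from 0 at a to 1 at b, then falls back to 0 at c,
    so elements may belong to the set to a partial degree."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)
```

For example, 20 degC is fully "warm" under the crisp definition but only warm to degree 0.5 under the fuzzy one; this graded membership is what allows linguistic rules such as "IF warm THEN reduce heating" to act smoothly.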
Replacement of human operator by a FIS:
- Sendai subway (Hitachi), Elevator control (Hitachi, Toshiba)
- Nuclear reactor control (Hitachi)
- Automobile transmission (Nissan, Subaru, Honda)
- Video image stabilisation (Canon, Minolta)
Replacement of human expert by a FIS:
- medical diagnosis
- Fault diagnosis
- Credit worthiness
Fuzzy analysis has also been applied to ascertain traffic congestion (e.g. "Freeway Traffic Flow under Congested Conditions using Fuzzy Analysis").
Deep learning (other interesting reference)
Deep learning trains networks with many layers (versus shallow nets with just a couple of layers). The multiple layers work together to build an improved feature space: the first layer learns first-order features (e.g. edges), and the second layer learns higher-order features (combinations of first-layer features, such as combinations of edges). In current models the early layers often learn in an unsupervised mode and discover general features of the input space, serving multiple tasks related to the unsupervised instances (image recognition, etc.). The final-layer features are then fed into supervised layer(s), and the entire network is often subsequently fine-tuned with supervised training, starting from the weights learned in the unsupervised phase. Fully supervised versions are also possible (as in early backpropagation attempts).
(Source: CS 678 – Advanced Machine Learning and Neural Networks)
Markov Chain Monte Carlo
The basic idea of the Markov chain Monte Carlo (MCMC) method is to perform a random walk through the solution space, which is modified using a probabilistic rule, to sample a target distribution. Metropolis developed the probabilistic rule that allows the target distribution to be obtained in the most efficient manner. The algorithm results in a Markov process, as each random step depends only on the previous step. While the original Metropolis algorithm requires a symmetric proposal distribution, Hastings extended this algorithm to use asymmetrical proposal distributions.
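A sketch of the random-walk Metropolis rule with a symmetric Gaussian proposal, working in log space for numerical stability; the target distribution, step size and sample count below are illustrative choices:

```python
import math
import random

def metropolis(log_target, x0=0.0, step=1.0, n_samples=20000, seed=0):
    """Random-walk Metropolis sampler. Each step depends only on the
    current state (a Markov process); a move is accepted with
    probability min(1, target(x') / target(x))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)    # symmetric proposal
        log_accept = log_target(proposal) - log_target(x)
        if log_accept >= 0 or rng.random() < math.exp(log_accept):
            x = proposal                        # accept the move
        samples.append(x)                       # rejected moves repeat x
    return samples
```

Because the proposal is symmetric this is the original Metropolis rule; the Hastings extension mentioned above would divide the acceptance ratio by the forward/backward proposal densities to allow asymmetric proposals.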
Simulated annealing (SA), developed by Kirkpatrick, is a widely used method for finding minima of complex high-dimensional functions. The method simulates the physical annealing process in a hot molten material that is cooled down slowly to form a perfect crystal. Unlike gradient-based minimisation, where the search can only go downhill, simulated annealing allows the solution vector to sometimes move uphill, which gives SA some ability to avoid being trapped in local minima.
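A minimal sketch of this accept/reject rule with a geometric cooling schedule; the proposal distribution, starting temperature and cooling rate are illustrative choices, not Kirkpatrick's original settings:

```python
import math
import random

def simulated_annealing(f, x0, t0=10.0, cooling=0.999, iters=20000, seed=0):
    """Minimise f by simulated annealing: always accept downhill moves,
    accept uphill moves with probability exp(-delta / T), and slowly
    lower the temperature T."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.gauss(0.0, 1.0)     # random neighbouring solution
        fc = f(cand)
        delta = fc - fx
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc               # move (possibly uphill)
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling                       # geometric cooling schedule
    return best, fbest
```

At high temperature nearly all moves are accepted (broad exploration); as T falls the rule becomes greedy, freezing the search into a low minimum, which mirrors the crystal-forming analogy above.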
Updated on 2015-11-11T11:22:30+00:00.