For the central attribute you choose an inference method, and for each of the 'inference' links to its antecedents you may choose a unary operator and set a weighting.

Page under construction.

- Meaning: The more the better
- IM: Count % A =, with first antecedent being a TRUE constant.
- UOps: Direct
- Weights: - (You can achieve greater weightings on some antecedents by having two or more relationships to them.)
- Antecedents: Boolean. The first ante must be a TRUE constant. Usually all the others are presented on a single form so that the user can tick those that apply.

- Meaning: The more the better, until we have enough. Variant of Simple Picking List
- IM: Count % A =, with first antecedent being a TRUE constant.
- UOps: Direct
- Weights: -
- Antecedents: Boolean. The first ante must be a TRUE constant.
- Consequent: Follow this by a limiter which defines 'enough'. This can be either a consequent with inference method of "Whether A >" or a link whose unary operator is "Whether >".

- Meaning: One or a few will do, but the more we have, the more certain we become. Opposite effect of Paranoid. Belt and Braces asymptotically approaches 1.
- IM: ProbOr
- UOps: ProbAnd
- Weights: Between 0..1, higher for more important antes (weight gives maximum contribution of the ante, when true)
- Antecedents: Probability or Boolean
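
As a minimal Python sketch of this accumulation (the function name is mine, not Istar's): each antecedent probability is first scaled by its weight via the ProbAnd unary operator, then folded in with probabilistic OR, so the result climbs toward 1 but never reaches it.

```python
def belt_and_braces(antecedents, weights):
    """ProbOr accumulation: one strong antecedent will do, but every
    further piece of evidence pushes the result closer to 1.

    Each antecedent (a probability) is first scaled by its weight
    (ProbAnd), giving its maximum contribution when true."""
    result = 0.0
    for p, w in zip(antecedents, weights):
        contribution = p * w                     # ProbAnd with the weight
        result = result + contribution - result * contribution  # ProbOr
    return result
```

Two certain antecedents weighted 0.5 each give 0.5 + 0.5 - 0.25 = 0.75, and no amount of evidence takes the result past 1.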

- Meaning: Chain is only as strong as its weakest link.
- IM: Min
- UOps: ProbOr
- Weights: Between 0..1, lower for more important antes (weight gives lowest value the effect of ante can take the result down to)
- Antecedents: Probability or Boolean

- Meaning: If any one is missing then we are in trouble; and the more missing, the worse we get. Opposite effect of Belt and Braces.
- IM: ProbAnd
- UOps: ProbOr
- Weights: Between 0..1, lower for more important antes (weight gives lowest value the effect of ante can take the result down to)
- Antecedents: Probability or Boolean

- Meaning: The strongest of the antecedents. A bit like Belt and Braces but only the strongest antecedent makes a contribution and the others are effectively ignored.
- IM: Max
- UOps: ProbAnd
- Weights: Between 0..1, higher for more important antes (weight gives maximum contribution of the ante, when true)
- Antecedents: Probability or Boolean

- Meaning: The whole is more than the sum of its parts. In Belt and Braces the result is less than the sum of all its (weighted) antecedents, its graph always lying below a 45-degree straight line. With Synergistic the graph lies above the straight line; however, once it reaches 1 it stops there. Although this seems as though it could be meaningful, we could not think of a real-life example, so perhaps it is only meaningful in theory.
- IM: Complex in Istar. I think you would probably have to do something like the following: convert all the antecedents to numbers in the range 0..100, multiply them together, limit them to 100, and convert back to probability.
- UOps: As needed.
- Antecedents: Probability or Boolean.
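
Taking the suggested recipe literally, a Python sketch might look like the following (the function name is mine; Istar itself has no single method for this):

```python
def synergistic(antecedents):
    """Sketch of the recipe above: scale each probability to the range
    0..100, multiply the scaled values together, cap the product at 100,
    and convert back to a probability."""
    total = 1.0
    for p in antecedents:
        total *= p * 100.0            # convert 0..1 to 0..100
    return min(total, 100.0) / 100.0  # cap at 100, convert back
```

A single antecedent of 0.5 stays at 0.5 (on the 45-degree line), while two antecedents of 0.5 multiply to 2500, are capped at 100, and give 1.0, illustrating the above-the-line behaviour.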

- Meaning: Evidence for and against.
- IM: Bayesian
- UOps: Direct is taken to mean Bayesian Weighting automatically
- Weights: Bayesian Weights
- Antecedents: Bayesian, Probability or Boolean.

Unary operators are applied to inference relationships to modify the effect of the antecedent attribute before using it in the inference. At present they only apply to inference relationships.

Normally a relationship starts off with no unary operator, except as required by the inference method. For instance, the Bayesian inference method requires that its links hold Bayesian weights, so in this case the indicated unary operator is overridden.

The highest priority is given to the CONTROL unary operator. If one of these is found, and the antecedent is such that it can be converted into a truth value (i.e. of types Boolean, Probability, Proportion, Bayesian, Odds) then it might set the value of the consequent regardless of the other antecedents; see below. If it is not one of these types then the CONTROL is ignored.

The next highest priority is when the inference method of the consequent is Bayesian Accumulation of evidence. Then the operation is different. See below.

Otherwise, the operations are as follows:

- Integer, OZMO, Float: Takes negative of antecedent.
- Boolean: Takes boolean opposite.
- Probability: Takes probabilistic opposite (1 - p).
- Direction: Changes direction by 180 degrees.
- Otherwise: No change.

- Odds, Ratio: Exchanges numerator with denominator.
- Boolean: Takes boolean opposite.
- Probability: Takes probabilistic opposite (1 - p).
- Otherwise: No change.

- Integer, OZMO, Float: Takes absolute value of antecedent.
- Otherwise: No change.

- Integer, Float: Multiplies value by weight.
- Otherwise: No change. But in future e.g. Ratio might be affected.

- Integer, Float: Divides value by weight.
- Otherwise: No change.

- Integer, Float: Adds numeric value of weight.
- Otherwise: No change. But in future e.g. Ratio might be affected.

- Integer, Float: Subtracts numeric value of weight.
- Otherwise: No change. But in future e.g. Ratio might be affected.

- Probability, Proportion, Bayesian: Multiply the probability of the antecedent by the weight taken as a probability.
- Boolean: Convert to Probability type and treat as above, giving the value 0 if False and giving the weight value (taken as a Probability) if True.
- Otherwise: No change.

- Probability, Proportion, Bayesian: Combine antecedent probability value with that of weight as a probabilistic OR (p + q - pq).
- Boolean: Convert to Probability type and treat as above, giving the value Unity if True and giving the weight value (taken as a Probability) if False.
- Otherwise: No change.
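
The two probabilistic weighting rules above can be sketched in Python as follows (function names are mine). Note that converting a Boolean to a probability first (True as 1, False as 0) and then applying the same rule reproduces the stated Boolean cases exactly.

```python
def weight_and(p, w):
    """Multiply the probability by the weight taken as a probability."""
    return p * w

def weight_or(p, w):
    """Combine the probability with the weight as a probabilistic OR."""
    return p + w - p * w

def as_probability(value):
    """Booleans convert to probability first: True -> 1.0, False -> 0.0."""
    if isinstance(value, bool):
        return 1.0 if value else 0.0
    return float(value)
```

So `weight_and` gives 0 for False and the weight for True; `weight_or` gives Unity for True and the weight for False, as described above.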

- Integer only: Bitwise AND with weight bits.
- Otherwise: No change.

- Integer only: Bitwise OR with weight bits.
- Otherwise: No change.

- Integer only: Bitwise XOR with weight bits.
- Otherwise: No change.

- Integer: Randomly gives a value between zero and the antecedent value.
- Enumerated type, Ordinal: If the antecedent has an Ordinal Limit (as most will, but a few from old KBs will not) then a value from 1 to the limit is selected at random.
- Otherwise: No change.

- If True, then the consequent attribute is deemed relevant, and its inference proceeds as dictated by the inference method and the other antecedents.
- If False, then the consequent attribute is deemed irrelevant, and it is given an answered value found in the weight of this Control relationship.
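
The CONTROL behaviour can be sketched as follows. This is a hedged illustration: `infer_normally` is a hypothetical callable standing in for whatever the consequent's inference method would compute from the other antecedents.

```python
def apply_control(control_value, weight_value, infer_normally):
    """CONTROL unary operator (sketch): a True control antecedent lets
    inference proceed normally; a False one deems the consequent
    irrelevant and gives it the answered value held in the weight."""
    if control_value:
        return infer_normally()   # consequent relevant: normal inference
    return weight_value           # consequent irrelevant: take the weight
```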

- If the antecedent is of type Bayesian, Probability, Boolean, then it is
converted to Odds, using the weights found in the link, and the result is
used as an Odds multiplier for the consequent. If the unary operator is
(ie. has become) INVWEIGHT, then the weights are reversed before being
applied to generate the Odds.
- If the antecedent is Odds, then it is used directly as an Odds multiplier for the consequent (and so INVWEIGHT and WEIGHT have the same effect).

For each method we give the following information:

- esIM_XXX name
- Its normal label in a list view gadget - what the knowledge engineer is likely to see.
- What it does.
- What types of antecedents it acts on or requires.
- What types of consequents it requires.
- Advanced use.
- Any stopping rule that makes the result answered before all antecedents have been searched by backward chaining.

"X = A + B, C, ...", "X = A - B, C, ..."

**Antecedents**:
These act on numeric antecedents.

**Consequents**:
Integer, Float, Angle/Direction, Enum, Ordinal. Others in future,
perhaps.

**Advanced use**:
When B (and C etc.) is a PROPORTION the inference is different: it
adds that proportion of what is already accumulated. Thus (60 + 50%) gives
90, not 110. If C, D, etc. is a PROPORTION, it takes a proportion of what
is already accumulated, not just of A. Thus if A is 60, B is 50%, C is 20
and D is 10% the result is:

- take A to give 60
- add B=50% of that to give 90
- add C=20 to give 110
- add D=10% of that to give 121.
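
The worked example above can be sketched in Python (representing a PROPORTION antecedent here as a `('%', value)` pair; that encoding is mine, purely for illustration):

```python
def accumulate(first, *rest):
    """Addition where a PROPORTION antecedent adds that proportion of
    what is already accumulated, rather than a plain number."""
    total = first
    for item in rest:
        if isinstance(item, tuple) and item[0] == '%':
            total += total * item[1] / 100.0  # proportion of running total
        else:
            total += item                     # plain numeric antecedent
    return total

# The example from the text: A=60, B=50%, C=20, D=10% gives 121.
accumulate(60, ('%', 50), 20, ('%', 10))  # -> 121.0
```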

"X = A * B, C, ...", "X = A / B, C, ...)"

**Stopping**:
With multiplication, if an antecedent makes the result zero then no
further ones are taken.

**Antecedents**:
These act on numeric antecedents.

**Consequents**:
Integer, Float or Angle/Direction.

**Advanced use**:
The stopping rule can be used to control the order in which questions
are asked.

"X = A & B & C & ...," "X = A | B | C | ..."

**Stopping**:
With AND, if any antecedent in order A, B, C ... is FALSE, no others
are taken. With OR, if any is TRUE, no others are taken.

**Antecedents**:
These act on boolean antecedents. Any other type is converted to
boolean if possible before being accumulated into the result.

**Consequents**:
Boolean.

**Advanced use**:
The stopping rule can be used to control the order in which questions
are asked.
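
The stopping rule amounts to short-circuit evaluation. A sketch, where each antecedent is modelled as a zero-argument callable (for instance, one that would put a question to the user):

```python
def infer_and(antecedents):
    """Boolean AND with the stopping rule: evaluation proceeds in order
    and stops at the first False antecedent, so callables later in the
    list (i.e. later questions) are never invoked."""
    for get_value in antecedents:
        if not get_value():
            return False
    return True
```

Placing the cheapest or most decisive questions first in the list therefore controls which questions the user ever sees.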

"X = !(A)"

**Antecedents**:
This acts on a single boolean antecedent, and ignores all others.

**Consequents**:

**Advanced use**:

"p(X) = p(A) & p(B) &...", "p(X) = p(A) | p(B) |...", "p(X) = 1 - p(A)"

**Antecedents**:
These act on probabilistic antecedents. Proportions are treated as
probabilities. Bayesians have their main part treated as a probability.
Booleans are converted to probability. Odds are converted to probability
in the normal manner, viz. P = O / (1 + O).

**Consequents**:

**Advanced use**:
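
These probabilistic combinations, and the odds conversion used on their antecedents, can be sketched directly from the formulas above (function names are mine):

```python
def prob_and(p, q):
    """p(X) = p(A) & p(B): probabilistic AND."""
    return p * q

def prob_or(p, q):
    """p(X) = p(A) | p(B): probabilistic OR."""
    return p + q - p * q

def prob_not(p):
    """p(X) = 1 - p(A)."""
    return 1.0 - p

def odds_to_prob(o):
    """Odds convert to probability in the normal manner: P = O / (1 + O)."""
    return o / (1.0 + o)
```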

"Bayesian accumulation of evidence"

**Stopping**:
Normally there is no stopping rule. But you can define a Lower or
Upper Cut-Off to ignore any weak evidence once you are sufficiently
confident in the present result. Suppose you have six antecedents and a
Lower Cut-Off of 10% on the consequent, and that three of the antecedents
have been answered in such a way as to bring the belief in the consequent
down to 5%. Then if the other three (unanswered) antecedents have
sufficiently weak evidence that however they are answered their combined
effect will not bring the final result over 10%, then the consequent is
considered answered. So the three remaining antecedents are not searched
during the backward chaining process. Conversely for the Upper Cut-Off.

**Antecedents**:
This acts on Bayesian antecedents. Probabilities, proportions and
booleans are converted to Bayesian whose a-priori is 0.5 (50%).

**Consequents**:
Bayesian.

**Advanced use**:
Using the Lower and Upper Cut-Offs you can make the knowledge base
appear more 'intelligent' and less pedantic.
To understand Bayesian accumulation of evidence, read the following.
The consequent belief is an accumulation of the antecedent beliefs,
for and against. Each antecedent can have a different weight, so that
having a red breast is of greater weight (more conclusive) as evidence for
the bird being a robin than that the bird is small. Evidence against (such
as thick bill) is also indicated by the weights, and in this case the
weight would be inverted.
The weights are held as parameters of the relationship that joins the
antecedent to the consequent. All antecedent relationships must have a UOp
of either 'Normal' or 'Negate', and are expected to hold a Bayesian Weight,
which is two odds multipliers - four small positive integers in all. One
pair should give a ratio >= 1 and the other pair should give a ratio <= 1.
The further these ratios are from 1, the 'stronger' the weight for this
antecedent.
If belief in the antecedent is total (e.g. 100%) then the full weight
found in the relationship is taken and accumulated into the consequent
belief. But if the belief is partial then only part of the weight is
taken. By "total" we mean either that the antecedent is definitely known
to be true (100%) or definitely known to be false (0%).
The consequent and each antecedent has an a-priori belief, which is
the belief which would be taken if there were no evidence. (The a-priori
is often found from the statistical probability of the proposition being
true.) The a-priori belief of the consequent is the starting point for
accumulation. e.g. The a-priori belief that the bird is a robin might be
10%, and as evidence is accumulated for or against the belief varies from
this level. When the proposition is an antecedent of a Bayesian inference
then the a-priori is also used to calculate partial weights. If the belief
is precisely that of the a-priori then the antecedent has zero weight, no
effect. As the belief moves away from the a-priori of the antecedent the
amount of the weight taken increases until the whole is taken for a total
belief.
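
A rough Python sketch of this accumulation follows. The text gives the endpoints (zero effect at the antecedent's a-priori, the full odds multiplier at total belief) but not the exact curve between them, so linear interpolation is assumed here, in the style of PROSPECTOR-like systems; the pair of odds multipliers is reduced to two ratios, `for_mult` (>= 1) and `against_mult` (<= 1).

```python
def odds(p):
    """Probability to odds."""
    return p / (1.0 - p)

def prob(o):
    """Odds to probability: P = O / (1 + O)."""
    return o / (1.0 + o)

def partial_multiplier(belief, apriori, for_mult, against_mult):
    """Interpolated odds multiplier for a partially believed antecedent
    (ASSUMPTION: linear interpolation between the stated endpoints).
    At belief == apriori the multiplier is 1 (zero effect); at total
    belief 100% it is for_mult; at 0% it is against_mult."""
    if belief >= apriori:
        return 1.0 + (for_mult - 1.0) * (belief - apriori) / (1.0 - apriori)
    return against_mult + (1.0 - against_mult) * belief / apriori

def accumulate_evidence(consequent_apriori, evidence):
    """evidence: iterable of (belief, apriori, for_mult, against_mult),
    one tuple per antecedent relationship."""
    o = odds(consequent_apriori)
    for belief, apriori, f, a in evidence:
        o *= partial_multiplier(belief, apriori, f, a)
    return prob(o)
```

For the robin example: with an a-priori of 10% and one totally believed antecedent (red breast) carrying a 'for' multiplier of 9, the prior odds of 1/9 become 1, i.e. a posterior belief of 50%; an antecedent sitting exactly at its own a-priori leaves the belief at 10%.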

"Whether A = all", "Whether A <> all"

**Stopping**:
Answered as soon as it is known the result is false, so no further
antecedents will be searched during backward chaining.

**Antecedents**:
A: Any.
Others: Any convertible to type of A.

**Consequents**:
Boolean.

**Advanced use**:
Owing to the stopping rule, this can be used to control the order in
which questions are asked of the user.

"Whether A > all", "Whether A < all", "Whether A >= all", "Whether A <= all"

**Antecedents**:
A: Any numeric or ordinal.
Others: Any convertible to type of A.

**Consequents**:
Boolean.

**Advanced use**:

"% A =", "% A >", "% A <", "% A <>", "% A >=", "% A <="

**Antecedents**:
A: Any numeric or ordinal.
Others: Any convertible to type of A.

**Consequents**:
The result is a Proportion (or Probability or Bayesian).

**Advanced use**:

esIM_CEQ, esIM_CGT, esIM_CLT, esIM_CNE, esIM_CGE, esIM_CLE

"Count A =", "Count A >", "Count A <", "Count A <>", "Count A >=", "Count A <="

**Action**:
These perform comparisons similar to the above, but count the number
for which the comparison is true.

**Antecedents**:
A: Any numeric or ordinal.
Others: Any convertible to type of A.

**Consequents**:
Integer.

**Advanced use**:

"A is found in all B, C, ...", "How many A is in", "% A is in"

**Antecedents**:
A: String or integer.
Others: Must be like A or convertible to A's type.

**Consequents**:
esIM_HAS: Boolean.
esIM_CHAS: Integer.
esIM_PHAS: Proportion, Probability or Bayesian.

**Advanced use**:

"A contains all B, C, ...", "How many are in A", "% in A"

**Antecedents**:
A: String or integer.
Others: Must be like A or convertible to A's type.

**Consequents**:
esIM_HAS: Boolean.
esIM_CHAS: Integer.
esIM_PHAS: Proportion, Probability or Bayesian.

**Advanced use**:

"First Known"

**Antecedents**:
Any that can be converted to type of consequent.

**Consequents**:
Any.

**Advanced use**:
With this you can provide sophisticated strategies of questioning the
user or otherwise finding out information.

"Chooser"

**Antecedents**:
A: Anything that can be converted to integer.
Others: Converted to type of consequent.

**Consequents**:
Any.

**Advanced use**:
Note that in backward chaining, it first gets A answered and, once its
value is found, backward chains only up the chosen antecedent. In this
way you can cut out whole sections of questioning if you wish.
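
A sketch of this dispatch, modelling each branch as a zero-argument callable so that the unchosen antecedents are never evaluated (names are mine, for illustration):

```python
def chooser(get_a, branches):
    """Chooser sketch: first answer A (an integer), then evaluate only
    the chosen antecedent; the other branches are never searched."""
    index = get_a()
    return branches[index - 1]()   # A = 1 chooses the first antecedent
```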

"All Answered"

**Antecedents**:
Any.

**Consequents**:
Boolean.

**Advanced use**:

"How many Answered"

**Antecedents**:
Any.

**Consequents**:
Integer.

**Advanced use**:
Beware: The consequent is always known and answered.

"How many Known"

**Antecedents**:
Any.

**Consequents**:
Integer.

**Advanced use**:

"No of Antes"

**Antecedents**:
Any.

**Consequents**:
Integer.

**Advanced use**:

"X = A + (Rand * (B - A))"

**Antecedents**:
Any.

**Consequents**:
Integer.

**Advanced use**:

"Max", "Min"

**Antecedents**:
Anything that can be converted to type of consequent.

**Consequents**:
Numeric or string. The antecedents are converted to type of
consequent.

**Advanced use**:
These act on numeric or string antecedents. They return the maximum
or minimum value.

"Which Max", "Which Min"

**Antecedents**:
Most numerics and string.

**Consequents**:
Integer, ENUM or ORDINAL (in which case the answer is 1 if A is
max/min, 2 if B, and so on).
Or BLOCK, in which case the result is the DSAP of the antecedent
block.

**Advanced use**:
The use of BLOCK consequent is expected to become more versatile in
future, but at present you can find for instance the name of the block by
conversion to string at the next stage.

"Mean"

**Antecedents**:
Numeric.

**Consequents**:
Numeric.

**Advanced use**:

"Concat"

**Antecedents**:
If the antecedents are not string type then they are converted to
string if possible.

**Consequents**:
String.

**Advanced use**:

Copyright (c) Andrew Basden 1997. Last updated: 21 May 1999