11 Finding the right way – analysing decisions
Chapter objectives
This chapter will help you to:
■ work out expected values using probabilities
■ appreciate attitudes to risk and apply decision rules
■ construct decision trees and use them to decide between alternative strategies
■ ask ‘what if’ questions about conclusions from decision trees by employing sensitivity analysis
■ make use of Bayes’ rule to find posterior probabilities for decision trees
■ become acquainted with business uses of decision analysis
In the previous chapter we looked at how probability can be used to assess risk. In this chapter we will consider how probability is used in the analysis of decisions. We will begin with expectation, the process of multiplying probabilities by the tangible results of the outcomes whose chances they measure to obtain expected values of the process or situation under investigation. We will move on to examine various quantitative approaches to taking decisions, including decision trees.
11.1 Expectation
A probability assesses the chance of a certain outcome in general. Expectation is using a probability to produce a predicted or expected value of the outcome.

To produce an expected value we have to apply the probability to something specific. If the probability refers to a process that is repeated, we can predict how many times a certain outcome will occur if the process happens a specific number of times by multiplying the probability by the number of times the process happens.
Example 11.1
The probability that a customer visiting the Kenigar Bookshop makes a purchase is 0.35. If 500 customers visit the shop one day, how many should be expected to make a purchase?

Expected number of customers making a purchase = 0.35 × 500 = 175

The result we obtained in Example 11.1 is a prediction, and like any prediction it will not always be true. We should not therefore interpret the result as meaning that out of every 500 customers that visit the store exactly 175 will make a purchase. What the result in Example 11.1 does mean is that in the long run we would expect that the average number of customers making a purchase in every 500 that visit the store will be 175.
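To see this long-run interpretation in action, here is a minimal Python sketch (not from the text) that simulates many days of 500 customer visits with a purchase probability of 0.35 and reports the average number of purchases per day; the function and variable names are illustrative only.

import random

def purchases_per_day(n_customers=500, p_purchase=0.35):
    """Simulate one day: count how many of the customers make a purchase."""
    return sum(1 for _ in range(n_customers) if random.random() < p_purchase)

# The average over many simulated days approaches the expected value 0.35 * 500 = 175
n_days = 10_000
average = sum(purchases_per_day() for _ in range(n_days)) / n_days
print(f"Average purchases per 500 visitors over {n_days} days: {average:.1f}")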
In many business situations outcomes are associated with specific financial results. In these cases the probabilities can be applied to the monetary consequences of the outcomes to produce a prediction of the average amount of money, income or expenditure. These types of prediction are called expected monetary values (EMVs).
Example 11.2
A rail operating company incurs extra costs if its long-distance trains are late. Passengers are given a voucher to put towards the cost of a future journey if the delay is between thirty minutes and two hours. If the train is more than two hours late the company refunds the cost of the ticket for every passenger. Issuing the vouchers costs the company £500 and refunding all the fares costs the company £6000. The probability that a train is between thirty minutes and two hours late is 10% and the probability a train is more than two hours late is 2%. What is the expected monetary value of the operating company’s extra costs per journey?

To answer this we need to take the probability of each of the three possible outcomes (less than thirty minutes late, thirty minutes to two hours late, more than two hours late) and multiply them by their respective costs (£0, £500 and £6000). The probability that a train is less than thirty minutes late is 1 − 0.1 − 0.02 = 0.88. The expected monetary value is the sum of these results.

EMV = (0.88 × 0) + (0.1 × 500) + (0.02 × 6000) = 0 + 50 + 120 = 170

The company can therefore expect that extra costs will amount to £170 per journey.
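As a quick check, the calculation in Example 11.2 can be written as a short Python sketch; the probabilities and costs are the ones given above, and the function name is just for illustration.

def expected_monetary_value(outcomes):
    """Sum of probability * monetary consequence over all outcomes."""
    return sum(p * cost for p, cost in outcomes)

# (probability, extra cost in £) for each possible outcome on a journey
extra_costs = [
    (0.88, 0),     # less than thirty minutes late: no extra cost
    (0.10, 500),   # thirty minutes to two hours late: vouchers issued
    (0.02, 6000),  # more than two hours late: all fares refunded
]
print(expected_monetary_value(extra_costs))  # 170.0, i.e. £170 per journey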
At this point you may find it useful to try Review Questions 11.1 to 11.4 at the end of the chapter.
11.2 Decision rules
From time to time companies are faced with decisions that are pivotal to their future. These involve developing new products, building new facilities, introducing new working practices and so on. In most cases the managers who take these decisions will not know whether they have made the right choices for many months or years to come. They have to take these decisions against a background of either uncertainty, where they cannot attach a probability to each of the outcomes, or risk, where they can put a probability to each of the outcomes.

In this section we will look at decision rules, techniques available to managers taking decisions under conditions of both uncertainty and risk. All of these techniques assist managers by helping them analyse the decisions and the possible outcomes in a systematic way. The starting point is the pay-off table, in which the results or pay-offs of the different possibilities or strategies that could be chosen are arranged according to the conditions or states of nature affecting the pay-off that might prevail.
Table 11.1
Expected profits (in £m) for Soll and Perretts

                                  State of future demand
Strategy                Increasing        Steady        Decreasing
The pay-off table in Example 11.3 does not in itself indicate what strategy would be best. This is where decision rules can help. When you apply them remember that the decision you are analysing involves choosing between the available strategies, not between the states of nature, which are by definition beyond the control of the decision-maker.
11.2.1 The maximax rule
According to the maximax rule the best strategy is the one that offers the highest pay-off irrespective of other possibilities. We apply the maximax rule by identifying the best pay-off for each strategy and choosing the strategy that has the best among the best, or the maximum of the maximum pay-offs.

11.2.2 The maximin rule
If maximax is the rule for the optimists and the gamblers, maximin is for the pessimists and the risk-avoiders. The maximin rule is to pick the strategy that offers the best of the worst returns for each strategy, the maximum of the minimum pay-offs.
11.2.3 The minimax regret rule
This rule is a compromise between the optimistic maximax and the pessimistic maximin. It involves working out the opportunity loss or regret you would incur if you selected any but the best strategy for the conditions that come about. To apply it you have to identify the best strategy for each state of nature. You then allocate a regret of zero to each of these strategies, as you would have no regret if you had picked them and it turned out to be the best thing for that state of nature, and work out how much worse off you would be under that state of nature had you chosen another strategy. Finally, look for the largest regret figure for each strategy and choose the strategy with the lowest of these figures; in doing so you are selecting the strategy with the minimum of the maximum regrets.
These figures are the opportunity losses for the strategies under the increasing demand state of nature. The complete set of opportunity loss figures are given in Table 11.2.

From Table 11.2 the maximum opportunity loss from investing is £80m, from franchising £30m and from selling £50m. The minimum of these is the £30m from franchising, so according to the minimax regret decision rule this is the strategy they should adopt.

Table 11.2
Opportunity loss figures (in £m) for Example 11.3

                                  State of future demand
Strategy                Increasing        Steady        Decreasing

11.2.4 The equal likelihood decision rule

In decision-making under uncertainty there is insufficient information available to assign probabilities to the different states of nature. The equal likelihood approach involves assigning probabilities to the states of nature on the basis that, in the absence of any evidence to the contrary, each state of nature is as likely to prevail as any other state of nature; for instance, if there are two possible states of nature we give each of them a probability of 0.5. We then use these probabilities to work out the expected monetary value (EMV) of each strategy and select the strategy with the highest EMV.
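All four decision rules can be applied mechanically to any pay-off table. The sketch below is a minimal Python illustration; the pay-off figures and strategy names are invented purely for demonstration and are not the Soll and Perretts values, and the function names are illustrative only.

# Hypothetical pay-off table: strategy -> pay-offs under each state of nature
payoffs = {
    "Strategy A": [90, 30, -20],
    "Strategy B": [50, 45, 10],
    "Strategy C": [15, 15, 15],
}

def maximax(table):
    """Best of the best pay-offs."""
    return max(table, key=lambda s: max(table[s]))

def maximin(table):
    """Best of the worst pay-offs."""
    return max(table, key=lambda s: min(table[s]))

def minimax_regret(table):
    """Smallest maximum opportunity loss."""
    n_states = len(next(iter(table.values())))
    best_per_state = [max(table[s][j] for s in table) for j in range(n_states)]
    max_regret = {s: max(best_per_state[j] - table[s][j] for j in range(n_states))
                  for s in table}
    return min(max_regret, key=max_regret.get)

def equal_likelihood(table):
    """Highest EMV when every state of nature is equally likely."""
    return max(table, key=lambda s: sum(table[s]) / len(table[s]))

for rule in (maximax, maximin, minimax_regret, equal_likelihood):
    print(rule.__name__, "->", rule(payoffs))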
At this point you may find it useful to try Review Questions 11.5 to 11.9 at the end of the chapter.
11.3 Decision trees
The decision rules we examined in the previous section help to deal with situations where there is uncertainty about the states of nature and no probabilities are available to represent the chances of their happening. If we do have probabilities for the different states of nature we can use these probabilities to determine expected monetary values (EMVs) for each strategy. This approach is at the heart of decision trees.
As their name implies, decision trees depict the different sequences of outcomes and decisions in the style of a tree, extending from left to right. Each branch of the tree represents an outcome or a decision. The junctions, or points at which branches separate, are called nodes. If the branches that stem from a node represent outcomes, the node is called a chance node and depicted using a small circle. If the branches represent different decisions that could be made at that point, the node is a decision node and depicted using a small square.

All the paths in a decision tree should lead to a specific monetary result that may be positive (an income or a profit) or negative (a cost or a loss). The probability that each outcome occurs is written alongside the branch that represents the outcome. We use the probabilities and the monetary results to work out the expected monetary value (EMV) of each possible decision. The final task is to select the decision, or series of decisions if there is more than one stage of decision-making, that yields the highest EMV.
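One way to picture how chance and decision nodes combine is as a small recursive structure. The sketch below is a generic Python illustration (not the book’s notation and not one of its examples): decision nodes take the best EMV of their branches and chance nodes take the probability-weighted average; all numbers are made up.

def emv(node):
    """Evaluate a decision tree given as nested tuples.

    A leaf is a number (the monetary result of a path).
    ('decision', [branch, ...])              -> best EMV among the branches
    ('chance', [(probability, branch), ...]) -> probability-weighted EMV
    """
    if isinstance(node, (int, float)):
        return node
    kind, branches = node
    if kind == "decision":
        return max(emv(b) for b in branches)
    return sum(p * emv(b) for p, b in branches)

# Illustrative two-stage tree with invented pay-offs
tree = ("decision", [
    ("chance", [(0.5, 80),
                (0.5, ("decision", [30, ("chance", [(0.4, 10), (0.6, 50)])]))]),
    20,
])
print(emv(tree))   # 57.0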
Under the equal likelihood rule each of the three states of nature in Example 11.3 is given a probability of one-third, so the invest strategy has a one-third chance of a £100m pay-off, a one-third chance of a £60m pay-off and a one-third chance of a −£30m pay-off. To get the EMV of the strategy we multiply the pay-offs by the probabilities assigned to them:

EMV(Invest) = 1/3 × 100 + 1/3 × 60 + 1/3 × (−30) = 33.333 + 20 + (−10) = 43.333

Similarly, the EMVs for the other strategies are:

EMV(Franchise) = 1/3 × 40 + 1/3 × 50 + 1/3 × 0 = 13.333 + 16.667 + 0 = 30

EMV(Sell) = 1/3 × 20 + 1/3 × 20 + 1/3 × 20 = 20

According to the equal likelihood approach they should choose to invest, since it has the highest EMV.
Example 11.8
The proprietors of the business in Example 11.3 estimate that the probability that demand increases in the future is 0.4, the probability that it remains stable is 0.5 and the probability that it decreases is 0.1. Using this information construct a decision tree to represent the situation and use it to advise Soll and Perretts.
EMV for the Invest strategy = 0.4 × 100 + 0.5 × 40 + 0.1 × (−30) = £57m
EMV for the Franchise strategy = 0.4 × 60 + 0.5 × 50 + 0.1 × 0 = £49m
EMV for the Sell strategy = £20m

The proprietors should choose to invest.
Figure 11.1
Decision tree for Example 11.8
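The EMV calculations behind the tree can be reproduced in a few lines of Python. This is only a sketch of the arithmetic in Example 11.8, using the pay-off figures read off the calculations above; the names are illustrative.

# Probabilities of the states of nature (increasing, steady, decreasing demand)
probs = [0.4, 0.5, 0.1]

# Pay-offs in £m for each strategy under those states, as used in Example 11.8
payoffs = {
    "Invest":    [100, 40, -30],
    "Franchise": [60, 50, 0],
    "Sell":      [20, 20, 20],
}

emv = {s: sum(p * x for p, x in zip(probs, payoffs[s])) for s in payoffs}
print(emv)                    # {'Invest': 57.0, 'Franchise': 49.0, 'Sell': 20.0}
print(max(emv, key=emv.get))  # Invest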
The probabilities of the states of nature in Example 11.8 were provided by the decision-makers themselves, but what if they could commission an infallible forecast of future demand? How much would this be worth to them? This is the value of perfect information, and we can put a figure on it by working out the difference between the EMV of the best strategy and the expected value with perfect information. This latter amount is the sum of the best pay-off under each state of nature multiplied by the probability of that state of nature.
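Using the same figures as Example 11.8, a minimal Python sketch of this value of perfect information calculation might look like the following; it reproduces the arithmetic worked through in the passage below, with illustrative names.

probs = [0.4, 0.5, 0.1]                     # P(increasing), P(steady), P(decreasing)
payoffs = {"Invest": [100, 40, -30],
           "Franchise": [60, 50, 0],
           "Sell": [20, 20, 20]}

# Expected value with perfect information: take the best pay-off under each state
best_per_state = [max(payoffs[s][j] for s in payoffs) for j in range(len(probs))]
ev_perfect_info = sum(p * x for p, x in zip(probs, best_per_state))

best_emv = max(sum(p * x for p, x in zip(probs, payoffs[s])) for s in payoffs)
print(ev_perfect_info - best_emv)   # 67.0 - 57.0 = 10.0, i.e. £10m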
For the proprietors in Example 11.8 the expected value with perfect information is:

0.4 × 100 + 0.5 × 50 + 0.1 × 20 = £67m

From Example 11.8 the best EMV was £57m, for investing. The difference between this and the expected value with perfect information, £10m, is the value to the proprietors of perfect information.

The decision tree we used in Example 11.8 is a fairly basic one, representing just one point at which a decision has to be made and the ensuing three possible states of nature. Decision trees really come into their own when there are a number of stages of outcomes and decisions; when there is a multi-stage decision process.

Example 11.10

Sam ‘the Chemise’ has a market stall in a small town where she sells budget clothing. Unexpectedly the local football team have reached the semi-finals of a major tournament. A few hours before the semi-final is to be played a supplier offers her a consignment of the team’s shirts at a good price but says she can have either 500 or 1000 and has to agree to the deal right away.

If the team reach the final, the chance of which a TV commentator puts at 0.6, and Sam has ordered 1000 shirts she will be able to sell all of them at a profit of £10 each. If the team do not reach the final and she has ordered 1000 she will not sell any this season but could store them and sell them at a profit of £5 each next season, unless the team change their strip in which case she will only make a profit of £2 per shirt. The probability of the team changing their strip for next season is 0.75. Rather than store the shirts she could sell them to a discount chain at a profit of £2.50 per shirt.

If Sam orders 500 shirts and the team reach the final she will be able to sell all the shirts at a profit of £10 each. If they do not make the final and she has ordered 500 she will not have the option of selling to the discount chain as the quantity would be too small for them. She could only sell them next season at a profit of £5 each if the team strip is not changed and at a profit of £2 each if it is. Sam could of course decline the offer of the shirts.

Draw a decision tree to represent the situation Sam faces.

Figure 11.2
Decision tree for Example 11.10

A decision tree like the one in Figure 11.2 only represents the situation; the real point is to come to some recommendation. This is a little more complex when, as in Figure 11.2, there is more than one point at which a decision has to be made. Since the consequences for the first decision, on the left-hand side of the diagram, are influenced by the later decision, we have to work back through the diagram using what is called backward induction or the roll back method to make a recommendation about the later decision before we can analyse the earlier one. We assess each strategy by determining its EMV and select the one with the highest EMV, just as we did in Example 11.8.
Once we have come to a recommendation for the later course of action we assume that the decision-maker would follow our advice at that stage and hence we need only incorporate the preferred strategy in the subsequent analysis. We work out the EMV of each decision open to the decision-maker at the earlier stage and recommend the one with the highest EMV.
Example 11.11
Find the EMV for each decision that Sam, the market trader in Example 11.10, could take if she had ordered 1000 shirts and the team did not make it to the final.

EMV(Store) = 0.75 × 2000 + 0.25 × 5000 = £2750

Since this figure is higher than the value of selling the shirts to the discount chain, £2500, Sam should store rather than sell the shirts at this stage.
Figure 11.3
Amended decision tree for Example 11.10
We can indicate, as shown in Figure 11.3, that the option of selling the stock if she were to order 1000 and the team does not reach the final should be excluded. This makes the EMV of the decision to order 1000 shirts much easier to ascertain. In working it out we use the EMV of the preferred strategy at the later stage, storing the shirts, as the pay-off if the team were not to make the final.

EMV(Order 1000) = 0.6 × 10,000 + 0.4 × 2750 = £7100

In identifying the EMV of the decision to order 500 shirts we have to take account of the chance of the team strip being changed as well as the chance of the team reaching the final. This involves applying the multiplication rule of probability; the probability that Sam makes a profit of £2500 is the chance that the team fail to reach the final and don’t change their strip next season.

EMV(Order 500) = 0.6 × 5000 + 0.4 × 0.75 × 1000 + 0.4 × 0.25 × 2500 = £3550

We would recommend that Sam orders 1000 as the EMV for that strategy, £7100, is higher than the EMV for ordering 500, £3550, and the EMV of not making an order, £0.
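The whole roll back calculation for Sam’s problem can be condensed into a short Python sketch. It simply mirrors the arithmetic of Example 11.11 and the calculation above, with illustrative variable names.

P_FINAL, P_CHANGE = 0.6, 0.75   # P(team reach final), P(strip changed next season)

# Later decision: 1000 shirts ordered and the team miss the final -> store or sell?
store = P_CHANGE * 2000 + (1 - P_CHANGE) * 5000   # £2750
sell_to_chain = 2500
later_value = max(store, sell_to_chain)           # roll back the better option

# Earlier decision: order 1000, order 500 or decline the offer
emv_1000 = P_FINAL * 10_000 + (1 - P_FINAL) * later_value
emv_500 = (P_FINAL * 5000
           + (1 - P_FINAL) * (P_CHANGE * 1000 + (1 - P_CHANGE) * 2500))
emv_none = 0

print(emv_1000, emv_500, emv_none)   # 7100.0 3550.0 0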
The probabilities used in decision trees are often little more than educated guesses, yet they are an integral part of the analysis. It is therefore useful to see how the recommendation might change if the probabilities of the relevant outcomes are altered, in other words to see how sensitive the recommendation is to changes in these probabilities. Sensitivity analysis involves finding out by how much the probabilities would have to change for a different decision to be recommended.
Example 11.13
In Example 11.11 we recommended that Sam, the market trader in Example 11.10, should store rather than sell the shirts if she had ordered 1000 shirts and the team did not make it to the final. We worked out the EMV that led to this conclusion using the probability that the team would change its strip, 0.75. But what if it changed? At what point would we alter our advice and say she should sell the shirts to the discount chain instead?

If we use p to represent the probability the team strip changes and 1 − p to represent the probability it doesn’t, then the point at which the sell and store strategies have equal value is when:

p × 2000 + (1 − p) × 5000 = 2500
2000p + 5000 − 5000p = 2500
3000p = 2500
p = 2500/3000 = 0.833

This result suggests that if the probability of the team changing its strip is more than 0.833, then Sam should sell the shirts to the discount chain rather than store them. We can check this by taking a higher figure:

EMV(Store) = 0.9 × 2000 + 0.1 × 5000 = £2300

This is lower than the value of selling the shirts to the discount chain, £2500, so if the probability of the team changing its strip were 0.9, Sam should sell the shirts.

The probability in Example 11.13 would not have to change by very much, from 0.75 to 0.833, for our advice to change; the decision is sensitive to the value of the probability. If it needed a substantial shift in the value of the probability we would consider the decision to be robust.
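The break-even probability in Example 11.13 can also be found numerically. The sketch below, purely for illustration, computes the algebraic answer and then sweeps values of p in small steps to locate the point at which the recommendation switches from storing to selling.

def store_value(p_change):
    """EMV of storing 1000 shirts if the team miss the final."""
    return p_change * 2000 + (1 - p_change) * 5000

SELL_VALUE = 2500

# Algebraically, store_value(p) = SELL_VALUE when 3000p = 2500, i.e. p = 0.833
breakeven = 2500 / 3000
print(round(breakeven, 3))        # 0.833

# Numerical check: first p (in steps of 0.001) where selling is at least as good
p = next(i / 1000 for i in range(1001) if store_value(i / 1000) <= SELL_VALUE)
print(p)                          # 0.834, the first step past the break-even point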
The probabilities that we have used in the decision trees we have studied so far have been prior, or before-the-event, probabilities, and conditional probabilities. There are situations where we need to include posterior, or after-the-event, probabilities, which we can work out using the Bayes’ rule that we looked at in section 10.3.3 of Chapter 10.
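Before working through the survey example below, here is a minimal Python sketch of how Bayes’ rule turns a prior probability and survey reliability figures into the posterior probabilities a decision tree needs. All of the numbers in this sketch are invented for illustration and are not taken from the example that follows.

# Assumed figures, purely for illustration
p_sound = 0.7                       # prior probability the building is sound
p_says_sound_given_sound = 0.8      # survey says "sound" when the building is sound
p_says_sound_given_unsound = 0.1    # survey says "sound" when the building is unsound

# Bayes' rule: P(sound | survey says sound)
p_says_sound = (p_sound * p_says_sound_given_sound
                + (1 - p_sound) * p_says_sound_given_unsound)
posterior_sound = p_sound * p_says_sound_given_sound / p_says_sound
print(round(p_says_sound, 3), round(posterior_sound, 3))   # 0.59 0.949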
If the building proves to be sound they will make a profit of £15m, but if it is not sound the extra costs of extensive structural work will result in a profit of only £1m. They could decide at the outset to commission a full structural survey. The firm they would hire to carry this out have a good record, but they are not infallible; they were correct 80% of the time when a building they surveyed turned out to be sound and 90% of the time when a building they surveyed turned out to be unsound. The surveyor’s report would only be available to Karovnick, so whatever the conclusions it contains the company could still sell the site for £4m.
We can draw a decision tree to represent this situation:
The decision nodes in Figure 11.4 have been labelled A, B, C and D to make it easier to illustrate the subsequent analysis. Note that although we have included all the pay-offs in this decision tree, the majority of outcomes do not have probabilities. This is because we do not know, for instance, the probability that the building turns out to be sound given that the surveyor’s report predicts the building is sound. This probability