Consider a purely probabilistic game with n potential outcomes x1, x2, ..., xn (each a specified gain or loss in dollars), which occur with probabilities p1, p2, ..., pn respectively (where p1 + p2 + ... + pn = 1 and 0 <= pi <= 1 for each i). Assume that x1, x2, ..., xn and p1, p2, ..., pn are known quantities. In answering each question below, please be as specific as you can.
(2a) Theoretically speaking, how would you define and measure the risk of playing this game?
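As a point of reference while you think about (2a), one common (though not the only) risk measure is the standard deviation of the outcome distribution. A minimal sketch, using made-up illustrative outcomes and probabilities:

```python
def expected_value(outcomes, probs):
    # Mean dollar outcome of one play: sum of x_i * p_i.
    return sum(x * p for x, p in zip(outcomes, probs))

def risk_std_dev(outcomes, probs):
    # One candidate risk measure: the standard deviation
    # (square root of the variance) of the outcome distribution.
    mu = expected_value(outcomes, probs)
    variance = sum(p * (x - mu) ** 2 for x, p in zip(outcomes, probs))
    return variance ** 0.5

# Hypothetical example: win $10 or lose $5, each with probability 0.5.
outcomes = [10, -5]
probs = [0.5, 0.5]
print(expected_value(outcomes, probs))  # 2.5
print(risk_std_dev(outcomes, probs))    # 7.5
```

Other defensible measures include the variance itself, the probability of losing money, or the size of the worst-case loss; part of the question is arguing for one.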
(2b) Assuming that YOU personally could play this game at most one time, how would you decide whether you would want to play it? Unless risk doesn't matter to you for some reason, your solution should use your measure of risk from (2a).
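One classical way to frame the single-play decision in (2b) — offered as an illustration, not as the expected answer — is expected utility: play only if the expected utility of your wealth after the game exceeds the utility of your current wealth. The concave log utility below builds in risk aversion; the wealth figure is a made-up assumption:

```python
import math

def should_play(outcomes, probs, wealth):
    # Expected-utility rule with log (risk-averse) utility:
    # play iff E[log(wealth + X)] > log(wealth).
    # Assumes wealth + x > 0 for every possible outcome x.
    eu_play = sum(p * math.log(wealth + x) for x, p in zip(outcomes, probs))
    return eu_play > math.log(wealth)

# With $100 of wealth, a 50-50 game of +$10 / -$5 is worth one play:
print(should_play([10, -5], [0.5, 0.5], 100))   # True
# But +$10 / -$11 at 50-50 is not, despite a near-zero expected loss:
print(should_play([10, -11], [0.5, 0.5], 100))  # False
```

Note how the curvature of the utility function, rather than an explicit risk term, is what penalizes the spread of outcomes here; an answer that instead trades off expected value against the risk measure from (2a) directly is equally legitimate.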
(2c) Assuming that you could play this game one time per day for as many consecutive days as you like and that the quantities p1, p2, ..., pn and x1, x2, ..., xn never change, how would you decide how many days to play it? Assume that you must decide IN ADVANCE (before the first play) how many days you are going to play for.
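A key fact for (2c): over k independent, identically distributed plays, expected total winnings grow like k while the standard deviation of the total grows only like sqrt(k), so risk per play shrinks as k grows. A small sketch of that scaling, using the same illustrative game as above:

```python
import math

def k_day_stats(outcomes, probs, k):
    # For k independent plays of the same game,
    # means add (k * mu) and variances add (k * var).
    mu = sum(x * p for x, p in zip(outcomes, probs))
    var = sum(p * (x - mu) ** 2 for x, p in zip(outcomes, probs))
    return k * mu, math.sqrt(k * var)

# Hypothetical game: +$10 / -$5 at 50-50 (mu = 2.5, std dev = 7.5).
total_mu, total_sd = k_day_stats([10, -5], [0.5, 0.5], k=100)
print(total_mu)  # 250.0  (grows like k)
print(total_sd)  # 75.0   (grows like sqrt(k), i.e. 10 * 7.5)
```

This is why a game with positive expected value but uncomfortable single-play risk can become attractive when committed to for many days.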
(2d) How does your answer to (2c) change if you are allowed to stop playing the game after any number of days that you like?
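For (2d), the new freedom is that your stopping decision can depend on the outcomes observed so far. One hypothetical adaptive rule (an assumption for illustration, not a recommendation) is to quit on reaching a profit target or a loss limit, which can be simulated:

```python
import random

def play_until_stop(outcomes, probs, target, loss_limit, max_days, rng):
    # Hypothetical stopping rule: play daily until cumulative winnings
    # reach `target`, fall to `loss_limit`, or `max_days` elapse.
    total = 0.0
    for day in range(1, max_days + 1):
        total += rng.choices(outcomes, weights=probs)[0]
        if total >= target or total <= loss_limit:
            return day, total
    return max_days, total

rng = random.Random(0)  # fixed seed so the simulation is reproducible
days, winnings = play_until_stop([10, -5], [0.5, 0.5],
                                 target=50, loss_limit=-25,
                                 max_days=365, rng=rng)
```

Whether such a rule actually improves on a fixed-in-advance horizon is part of what the question asks you to reason about.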
Note: Though question 2 is central to much of human activity, there is no single agreed-upon answer. We are interested in the reasoning you use to arrive at a reasonable solution.