In each round you bid to get first choice of the available mines. Each mine has a production value (1 through 6 gold) and a number that specifies when it triggers, based on the roll of two dice each turn. Further, collecting a set of mines from the same town gives you the mayorship, which lets you collect additional gold from players who build mines in that town (or are forced to by the available options.)
Position is important because choice of mines proceeds clockwise from the winner, and bid payments are paid counterclockwise. (The player to the winner's right gets half, the next player gets 1/4, etc., with any remainder going back to the bank.)
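The payout schedule can be sketched in a few lines. The exact rounding rule isn't stated here, so I assume shares round up (which matches the arithmetic in the two-player bidding example below, where a bid of 3 sends 2 gold to the other player):

```python
import math

def payout_shares(bid, n_opponents):
    """Distribute a winning bid counterclockwise from the winner:
    half to the first opponent, a quarter to the next, and so on,
    rounding each share up (assumption) and never exceeding what
    remains of the bid. Returns (shares, remainder-to-bank)."""
    shares = []
    remaining = bid
    for i in range(1, n_opponents + 1):
        share = min(math.ceil(bid / 2**i), remaining)
        shares.append(share)
        remaining -= share
    return shares, remaining
```

For example, `payout_shares(3, 1)` gives `([2], 1)`: the lone opponent receives 2 gold and 1 goes back to the bank.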
So, each mine has value in a variety of different ways:
* Its "equity" value: the production value translates directly into points at the end of the game
* The expected stream of future payments from production (more important in a small game than a big one). Usually the mines with the rarer numbers (2 and 12) don't have a production value big enough to compensate for the low trigger probability. This can be calculated exactly.
* Its contribution towards winning a mayor, or increasing the value of the mayor's office
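The "calculated exactly" part is just two-dice arithmetic. A minimal sketch, assuming a mine pays its full production value on each turn its trigger number is rolled:

```python
def trigger_probability(trigger):
    """Probability that two fair dice sum to `trigger`."""
    ways = sum(1 for a in range(1, 7) for b in range(1, 7) if a + b == trigger)
    return ways / 36

def expected_income(value, trigger, turns_remaining):
    """Expected total gold produced over the rest of the game."""
    return value * trigger_probability(trigger) * turns_remaining
```

This makes the point about rare numbers concrete: a production-6 mine on 7 expects 6 × 6/36 = 1 gold per turn, while the same mine on 12 expects only 6/36 ≈ 0.17 per turn.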
The mayor has the same components:
* A 5-point value at the end of the game
* The expected stream of future payments from mines in the matching city. This is somewhat dependent on future bidding but the number of undiscovered mines in that city is known.
Both of these need to be risk-weighted against the possibility of losing the mine (to a special card) or the mayor's office (due to somebody outbidding.) So valuation is complex enough to be interesting. I wonder whether tools from conventional finance are worth using here --- does it make sense to apply a discount rate to future returns? I think not, because there is no risk-free reward to measure against.
However, the bid payout mechanism (and the fact that you get one card every round no matter what) makes bidding nontrivial as well. Let's ignore coalitions for a moment--- they're hard anyway--- and just look at two players, A and B. Let's also ignore any + or - value to position (the winning bidder bids first the next round.)
mine X: value 3 to A, value 0 to B (a production-3 mine in A's city, assuming A has three mines in that city)
mine Y: value 3 to A, value 6 to B (a production-6 mine in B's city, mirrored assumption)
The global optimum is that B gets Y and A gets X. But because this is a competitive game, A prefers (AY,BX). So we can recast this in terms of A's utility:
* X to B (so A takes Y): +3
* X to A (so B takes Y): -3
B's utility is the opposite (in games with more players we can't make this simplification). But A can't bid 3, because 2 gold of that bid would go to B. Writing things out:
A bids 3 and wins: +3 -3 -2 = -2
A bids 2 and wins: +3 -2 -1 = 0
A bids 1 and wins: +3 -1 -1 = +1
Is that enough for A to win? Well, from B's perspective (where more negative is better) he can bid 2 and get an improved result--- the payoffs are all reversed:
B bids 3 and wins: -3 +3 +2 = +2
B bids 2 and wins: -3 +2 +1 = 0
B bids 1 and wins: -3 +1 +1 = -1
So if A goes first he should bid 1, forcing B to bid 2--- or he can bid 2 himself and achieve the same payoff (but B might make a mistake either way.)
In a two-player game, is there always a way to force no net gain? No, because the mine values may be fractional due to the future revenue stream, while only whole-value bids are accepted. In that case, the winning strategy is to immediately bid the amount which produces a small (<1) positive result for the first bidder; the second player cannot improve on that bid without going negative himself, since with fractional values and whole-gold bids a payoff of exactly zero is impossible.
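To illustrate with a hypothetical fractional value: suppose the differential value of winning the choice is d = 3.5 instead of 3 (the extra 0.5 coming from a future revenue stream). Using the same payoff model as the worked example (winner pays the bid, opponent receives half rounded up):

```python
import math

def winner_payoff(d, bid):
    """Net gain for the winner in the symmetric two-player model:
    value of first choice d, minus the bid, minus the opponent's
    half-share (rounded up, per the worked example)."""
    return d - bid - math.ceil(bid / 2)

# With d = 3.5, the first bidder bids 2 and nets +0.5.
# Outbidding at 3 would net 3.5 - 3 - 2 = -1.5, so the
# second player declines, and the +0.5 is locked in.
print(winner_payoff(3.5, 2))   # 0.5
print(winner_payoff(3.5, 3))   # -1.5
```

Each bid increment costs at least 1 more gold in total, so once a bid lands in the (0, 1) window, any higher bid is strictly negative --- which is exactly the first-bidder advantage described here.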
But this strategy suggests there is an advantage to bidding first, equal to the fractional payouts in future rounds. So, confusingly, it might be worth overpaying in round 1 if you could go first on all subsequent rounds. But the other player could compensate by overbidding in round 2. I don't know what the end effect of this line of reasoning would be (it might not even be feasible with limited bankrolls)--- it might make an interesting toy game to study all by itself.