Gambit's model for the numerical data of a game is that all payoffs (and chance probabilities) are stored as exact numbers. These numbers can be specified as decimal or rational, and their input text representation is retained.
However, in the graphical interface, payoffs input as decimal numbers are displayed as rational numbers - for example, a payoff entered as 2.1 will be displayed as 21/10. Nevertheless, the original 2.1 is retained: in a .nfg save file it is recorded as 2.1, and in pygambit it appears as 2.1.
The reason this happens is that the display is representation-agnostic, in the following sense. The graphical interface can display the reduced strategic form of an extensive game. In the presence of chance nodes, the entries of that table must be computed, and Gambit does no arithmetic with decimal numbers: all calculations are carried out in rational numbers, since every decimal number with finitely many digits is a rational number. Iterating over or accessing payoffs therefore always returns rational numbers.
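The relationship between finite decimals and rationals, and why exact rational arithmetic is used for the computed table entries, can be sketched with Python's standard-library fractions and decimal modules (this is an illustration of the idea, not Gambit's actual internal code):

```python
from fractions import Fraction
from decimal import Decimal

# A finite decimal like 2.1 is exactly a rational number: 21/10.
# Constructing the Fraction from the text representation keeps it exact.
payoff = Fraction(Decimal("2.1"))
assert payoff == Fraction(21, 10)

# By contrast, going through a binary float would NOT preserve the value:
# 2.1 has no exact binary representation, so the rational differs from 21/10.
assert Fraction(2.1) != Fraction(21, 10)

# With a chance node, a table entry is an expected payoff. Computing it
# in rationals stays exact; e.g. a 1/3 chance of 2.1 and 2/3 chance of 3:
expected = Fraction(1, 3) * Fraction(Decimal("2.1")) + Fraction(2, 3) * Fraction(3)
assert expected == Fraction(27, 10)  # (1/3)*(21/10) + (2/3)*3 = 27/10
```

This is why the computed entries naturally come out as rationals: the exact result of mixing 2.1 with chance probabilities generally has no short decimal form, even though 2.1 itself does.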
Although on the surface there seems to be an obvious fix (allowing for the display to know about the underlying representation and behave accordingly), we believe there are some more subtle UX considerations at play and so this isn't a good issue for someone new to Gambit and the codebase.