# In research, optimise for bitrate

(A simplification of Steinhardt’s post.)

When deciding what experiments to run, you often face a trade-off: experiments vary in how easy they are to run and in how likely they are to come out the way you expect.

When picking which experiments to run, estimate the entropy of the experimental results and the time the experiment takes to complete. The ratio between the two is your experimental bitrate. The higher the bitrate, the faster you’re learning.

The information content can be derived from the distribution of expected results. If you’re distinguishing between two hypotheses, and you think the experiment has a 90% chance of coming out in favour of your first hypothesis, then your entropy is $$H = -0.9 \log_2 0.9 - 0.1 \log_2 0.1 \approx 0.469$$ bits. If you can instead run an experiment with even odds, you get $$H = -0.5 \log_2 0.5 - 0.5 \log_2 0.5 = 1$$ bit. So experiments whose outcomes you’re more uncertain about yield more bits.
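As a quick sanity check of the numbers above, here is a minimal sketch of the entropy calculation (the probabilities are the ones from the example, not real experimental data):

```python
import math

def entropy_bits(probs):
    """Shannon entropy (base 2) of a discrete outcome distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A 90/10 experiment carries less information than an even-odds one.
print(round(entropy_bits([0.9, 0.1]), 3))  # 0.469 bits
print(entropy_bits([0.5, 0.5]))            # 1.0 bit
```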

Of course, the perfect experiment can yield lots of bits but be unwieldy to run, hence we should normalise by time to get a bitrate, and maximise information over time rather than per experiment.
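To make the trade-off concrete, here is a sketch comparing two hypothetical experiments; the outcome probabilities and run times are illustrative assumptions, not from the post:

```python
import math

def entropy_bits(probs):
    """Shannon entropy (base 2) of a discrete outcome distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bitrate(probs, hours):
    """Expected information gained per hour of experiment time."""
    return entropy_bits(probs) / hours

# Hypothetical numbers: a 'perfect' even-odds experiment that takes a
# full work week vs. a lopsided but quick one.
slow_clean = bitrate([0.5, 0.5], hours=40)  # 1.0 / 40  = 0.025 bits/hour
fast_messy = bitrate([0.9, 0.1], hours=2)   # 0.469 / 2 ≈ 0.235 bits/hour
print(fast_messy > slow_clean)  # True
```

Under these assumptions the quick, lopsided experiment wins despite yielding fewer bits per run, which is exactly the point of normalising by time.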