[Image: Integer collage]

Here’s a thing users really hated about Beeminder until today: when Beeminder told them they needed to floss their teeth 0.23 times or archive 6.8 emails or whatever nonsensical / physically impossible thing. I personally shrugged that off for years. Obviously you just mentally take the ceiling of that number, right? If you have to floss 0.23 times and your possible choices are flossing zero times and one time, then the requirement to floss 0.23 times is logically equivalent to a requirement to floss exactly once.

No. Bad. Illusion of transparency! At least one user, arguably more mathy than me, said that when he first saw something like “+0.23 due by midnight” he assumed it meant the deadline for the actual +1 was some pro-rated amount of time after midnight. More commonly, users were genuinely baffled or walked away in disgust at what they saw as blatant brokenness.

So, point taken. As in, we got those decimal points taken out. Hooray!

The main reason we’re telling you this is that we promised to, by the Pareto Dominance Principle. We think we implemented this really nicely so that you’ll only see decimal places when it actually makes sense to, namely, when you have at least some datapoints that are not integers. But it’s possible that you really liked seeing those decimals even on your patently integery goals. (“Integery” is what we call goals where only whole numbers make sense.) If so, we just made Beeminder slightly worse for you. But only slightly! We now have a per-goal “precision” setting so you just have to go set that to however many decimal places you like.

Most of you, however, do not in fact like decimals and for you, all the numbers should be much saner as of today. If you did once enter a 1/2 for a datapoint value but still only want to see integers, you too can force that by setting your goal’s “precision” field to 1.

Everyone’s happy!

Do I need to read the rest of this?

You absolutely do not! In fact, it’s way more nitty-gritty than you can reasonably care about. But since some of you unreasonably care about such things, you can read on for the rest of the story…

Wait though, before you go, here’s a handy list of the main user-visible improvements:

  1. More consistent/correct conservative rounding so you’re never misled about how much you have to do.
  2. Goals default to integery until you enter a datapoint that’s not a whole number.
  3. We generalized the “integery” field to “precision”. Instead of integery (only available for custom goals) you can now set precision=1 on any goal.
  4. You can customize how many decimal places you see by setting precision < 1.
  5. You no longer see a superfluous “.0” appended to every dang datapoint value.

Timey-Wimey Goals

There’s one case where we show fractional amounts without decimal points: goals whose datapoints and bare-min or hard-cap values are shown in HH:MM format. That’s a separate checkbox in settings:

[ ] Show data in HH:MM format

If checked then every datapoint value, bare min, hard cap, safety buffer, etc — including the “precision” field [1] — is displayed as HH:MM.
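For the curious, the conversion is nothing fancy. Here's a simplified sketch (not our production code), assuming, as an illustration, that timey values are stored as fractional hours:

    def to_hhmm(hours):
        """Display a fractional-hours value as HH:MM, e.g. 1.5 -> "1:30"."""
        sign = "-" if hours < 0 else ""
        total_minutes = round(abs(hours) * 60)
        return f"{sign}{total_minutes // 60}:{total_minutes % 60:02d}"

    # to_hhmm(1.5)   -> "1:30"
    # to_hhmm(0.25)  -> "0:15"
    # to_hhmm(1/60)  -> "0:01"  (the finest quantum for a timey-wimey goal)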

Integery Goals

“Users were genuinely baffled or walked away in disgust”

Dealing with integery goals correctly is a special case of conservative rounding. If you have a hard cap of turning your phone’s screen on 4.92 more times, then Beeminder better let you know you can do it up to 4 more times, not 5. In general, we want to round up in the case of an integery do-more goal, or down in the case of do-less. Even more generally, we want to round in the direction of the good side of the yellow brick road.

And it’s all the same principle no matter what we’re rounding to, so behind the scenes we implemented this nice and generally.
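Here's a simplified sketch of that integery special case (illustrative Python, not our production code):

    import math

    def conservaround_int(amount, do_less=False):
        """Round toward the safe side: up for the bare min of a do-more goal,
        down for the hard cap of a do-less goal."""
        return math.floor(amount) if do_less else math.ceil(amount)

    # "+0.23 due by midnight" on a do-more goal really means +1:
    assert conservaround_int(0.23) == 1
    # A hard cap of 4.92 more screen-unlocks means at most 4 more, not 5:
    assert conservaround_int(4.92, do_less=True) == 4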

In particular, in addition to the “timey” field, every Beeminder goal now has two new fields: “precision” and an internal field keeping track of whether the precision was set explicitly by the user.

The “Precision” Field

You can think of the precision as the granularity of the thing being measured, though Beeminder will always count any value you give it towards your goal. All numbers displayed for what you need to do for the goal are rounded conservatively to the nearest quantum as defined by this field. (Note: we never round the datapoint values themselves — those always keep whatever precision they were born with.) If precision is 1 then the goal is integery. For dollar amounts you might want a precision of 0.01. Or 0.1 for a weight goal if your scale weighs to the nearest tenth of a kilogram.
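Here's the integery sketch from above generalized to an arbitrary quantum, which is all the "precision" field really is (again illustrative, not our production code):

    import math

    def conservaround(amount, precision=1, round_up=True):
        """Round to a multiple of `precision`, erring toward the good side of
        the road: up for a do-more bare min, down for a do-less hard cap."""
        quanta = amount / precision
        rounded = math.ceil(quanta) if round_up else math.floor(quanta)
        return round(rounded * precision, 10)  # round() just tidies float noise

    # Weight goal with a scale that reads to the nearest 0.1 kg (do-less hard cap):
    assert conservaround(4.92, 0.1, round_up=False) == 4.9
    # Dollar amounts on a do-more goal:
    assert conservaround(0.004, 0.01, round_up=True) == 0.01
    # With precision=1 this reduces to the integery case above.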

(Terminological warning: If we think of the amount to round to as the precision then it’s confusing to talk about greater precision when we mean a smaller value for the field. Instead, we always refer to finer or coarser precision.)

The “precision” field is found in goal settings, defaults to 1, and has explanatory/help text as follows:

Display Precision: _______
Determines how many decimal places to display. E.g., your weight has precision 0.1 if that’s what your scale measures to. Use “1” if you never want to see decimals.

In theory you could have a precision coarser than 1. We don’t know of a use case where that would be better than precision 1 so we haven’t worried our pretty heads about that so far. (Let us know if you have a use case!)

A precision of zero means no rounding at all — full machine precision. No one wants that and the UI enforces a precision of 0.00001 or coarser.

Again, if timey is true then “precision” in the above UI is also shown in HH:MM format. [1]

In practice, as users have been vociferously pointing out, the overwhelming majority of goals are integery or timey-wimey. So the first obvious decision was to make integery goals (precision = 1) the default, or precision = 1/60 for timey-wimey goals. That alone should mean newbees never have to think about any of this!

Whether The User Explicitly Set a Precision

We also store a flag indicating if the precision field was set explicitly by the user. It’s initially false. The first time the user submits a value for precision in goal settings, the flag is set permanently to true.

Every time a datapoint is added, if the flag is false, we set precision to the min of itself and however much precision the datapoint value actually has. (But also not finer than 0.00001.) For example, from a datapoint value of 1.23 we infer a precision of 0.01 (2 decimal places). We worked out all the code for this — which we call conservarounding — at conservaround.glitch.me.
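If it helps to see that inference spelled out, here's a simplified sketch of one way to do it (not our production code; the names are made up for illustration):

    from decimal import Decimal

    FINEST = Decimal("0.00001")  # never infer anything finer than this

    def inferred_precision(value):
        """Precision implied by one datapoint value, e.g. 1.23 -> 0.01."""
        decimals = max(0, -Decimal(str(value)).normalize().as_tuple().exponent)
        return max(Decimal(10) ** -decimals, FINEST)

    def updated_precision(current_precision, datapoint_value):
        """Min of the current precision and what the new datapoint implies."""
        return min(Decimal(str(current_precision)),
                   inferred_precision(datapoint_value))

    # A goal stays integery as long as the data is whole numbers...
    assert updated_precision(1, 3) == 1
    # ...but a datapoint of 1.23 makes the inferred precision 0.01:
    assert updated_precision(1, 1.23) == Decimal("0.01")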

Which is now almost surely further down that rabbit hole than you wanted to go, but just in case you want to go further, there are a few more assorted issues and questions we wrestled with that we’ll relegate to footnotes! (What about so-called fractional beeminding? [2] What about entering fractions like 1/3? [3] What about rescaling graphs? [4] What about weirdo pessimistic presumptive reports? [5])

Thanks for putting up with those infuriating decimal places all those years!


 

Thanks to Queen Bee for coauthoring this, to the daily beemail subscribers for resoundingly rejecting a version of this that twisted itself into a pretzel trying to avoid having an explicit user setting, to Oliver Mayor for reminding me to consider data rescaling, and to Zedmango for setting me straight on the question of rounding datapoint values as well as help with the webcopy.


 

Footnotes

[1] Turns out the “precision” field does not respect the “timey” field as we go to press. We intend to delete or update this footnote when the implementation matches the spec here!

[2] Fractional beeminding works fine. That’s when you have an inherently integery metric but you treat it as non-integery and enter fractional amounts. It’s a thing, it’s fine, none of this impacts it.

[3] If someone enters a fraction like 1/3 as a datapoint value, that gets macro-expanded in the UI to “0.333333333333333333”. Similarly, datapoints submitted via the API could accidentally have a stupid amount of precision. Decision: Tough cookies, go fix the precision if it’s messed up. (But let us know if it’s too ugly because we may adjust the limit on how fine the precision can be.)

[4] What happens to the precision when you rescale your data and the graph? If the flag that says the user explicitly set precision is true then we just rescale the precision as well. Otherwise, set precision = 1 and then rescale all the datapoints, triggering an update for each of them. That sets precision to whatever makes sense according to the precision-inferring function, based on the new data. What if rescaling yields stupidly fine precision? This we callously ignore. If you’re rescaling then you can set your own dang precision.
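In pseudo-Python, that rule looks something like this (a rough sketch with made-up field names, not our actual data model):

    def rescale_goal(goal, factor):
        """Sketch of footnote [4]; `goal` is just a dict here for illustration."""
        if goal["precision_set_by_user"]:
            # The user chose a precision explicitly: rescale it along with the data.
            goal["precision"] *= abs(factor)
        else:
            # Otherwise reset to integery. In the real flow, each rescaled
            # datapoint update then re-infers precision via the min rule
            # described above; that step isn't repeated in this snippet.
            goal["precision"] = 1
        goal["datapoints"] = [value * factor for value in goal["datapoints"]]
        return goal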

[5] What if you have a totally integery do-less goal with a rate of 2/7 so the PPRs are .571429 or whatever? Do they just ruin the integeriness of your goal until you go set an explicit precision = 1? (Deleting the PPR doesn’t help since inferred precision isn’t recomputed when datapoints are deleted.) Answer: No, because we don’t update precision based on PPRs. Those aren’t user-originating datapoints.
