Last updated: 2019-11-20
More user-oriented version now on the blog
Beeminder users really hate it when Beeminder tells them they need to floss their teeth 0.23 times or archive 6.8 emails or whatever nonsensical / physically impossible thing. I personally shrugged that off for years. Obviously you just mentally take the ceiling of that number, right? If you have to floss 0.23 times and your possible choices are flossing zero times and one time, then the requirement to floss 0.23 times is logically equivalent to a requirement to floss exactly once.
No. Bad. Illusion of transparency! At least one user, arguably more mathy than me, says that when he first saw something like “+0.23 due by midnight” he assumed that that implied the deadline for the actual +1 was some pro-rated amount of time after midnight. More commonly, users are genuinely baffled or walk away in disgust at what they see as blatant brokenness.
So, point taken. As in, let’s get those decimal points taken out. It’s a big deal.
There’s one case where we show fractional amounts without decimal points:
goals whose datapoints and bare-min or hard-cap values are shown in HH:MM format.
That’s determined by a separate boolean goal field,
timey, and is shown as a checkbox in settings:
[ ] Show data in HH:MM format
If timey = true then every datapoint value, bare min, hard cap, safety buffer, etc (including the quantum field described below) is displayed as HH:MM.
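For concreteness, here's a minimal sketch of HH:MM display, assuming timey values are stored in hours (the helper name to_hhmm is hypothetical, not Beeminder's actual code):

```python
def to_hhmm(hours: float) -> str:
    """Format a value in hours as HH:MM, e.g. 1.5 -> '1:30'."""
    sign = "-" if hours < 0 else ""
    total_minutes = round(abs(hours) * 60)  # nearest whole minute
    return f"{sign}{total_minutes // 60}:{total_minutes % 60:02d}"

to_hhmm(1.5)   # → '1:30'
to_hhmm(0.25)  # → '0:15'
```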
We call goals where only whole numbers make sense integery. Dealing with them correctly is a special case of conservative rounding. If you have a hard cap of turning your phone’s screen on 4.92 more times, then Beeminder better let you know you can do it up to 4 more times, not 5. In general, we want to round up in the case of an integery do-more goal, or down in the case of do-less. Even more generally, we want to round in the direction of the good side of the yellow brick road.
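As a sketch of that principle (the canonical definition is the conservaround function at conservaround.glitch.me; this Python version is just illustrative):

```python
import math

def conservaround(x: float, quantum: float, direction: int) -> float:
    """Round x to a multiple of quantum, erring toward the good side of
    the yellow brick road: direction=+1 rounds up (do-more), -1 rounds
    down (do-less). quantum=0 means no rounding at all."""
    if quantum == 0:
        return x
    y = x / quantum
    y = math.ceil(y) if direction > 0 else math.floor(y)
    return round(y * quantum, 12)  # shave off floating-point noise

conservaround(0.23, 1, +1)   # flossing 0.23 times means flossing once → 1
conservaround(4.92, 1, -1)   # a hard cap of 4.92 screen-ons means 4 → 4
```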
And it’s all the same principle no matter what we’re rounding to, so behind the scenes we’re implementing this nice and generally.
In particular, in addition to timey, every Beeminder goal shall have two new fields: quantum, AKA precision, and quantex. The quantum field gives the granularity of the thing being measured.
All numbers displayed for what you need to do for the goal will be rounded conservatively to the nearest quantum. (Note: we never round the datapoint values themselves; those always keep whatever precision they were born with.) If quantum is 1 then the goal is integery.
For dollar amounts you might want a quantum of 0.01, or 0.1 for a weight goal if your scale weighs to the nearest tenth of a kilogram.
(Terminological warning: if we think of the amount to round to as the precision then it's confusing to talk about greater precision when we mean a smaller value for the quantum. Instead, always refer to finer or coarser precision.)
The user-facing manifestation of quantum is a field in goal settings called "precision", defaulting to 1, with explanatory/help text as follows:
E.g., your weight has precision 0.1 if that’s what your scale measures to. Use “1” if you never want to see decimals.
In theory you could have a quantum greater than 1. We don't know of a use case where that would be better than quantum = 1 so we won't worry our pretty heads about that until we do. A quantum of zero means no rounding at all: full machine precision. No one wants that and the UI should enforce quantum > 0.000001. But the implementation is fine with quantum = 0; it just can't be negative.
If timey is true then quantum (or "precision" in the above UI) is also shown in HH:MM format.
In practice, as users
have been vociferously advocating, the overwhelming majority of goals are integery or timey-wimey.
The first obvious decision is to default to integery goals (quantum = 1) and to quantum = 1/60 for timey-wimey goals.
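In code, that default might look like this (default_quantum is a hypothetical helper name):

```python
def default_quantum(timey: bool) -> float:
    # Whole units by default; one minute (1/60 of an hour) for HH:MM goals
    return 1/60 if timey else 1
```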
The quantex field is a flag indicating whether the quantum field was set explicitly by the user. It's initially false. The first time the user submits a value for quantum (AKA "precision") in goal settings, quantex is set permanently to true. Setting quantex := true is part of the submit action for the precision field.
Mostly we don’t want newbees to have to think about this and newbees almost always want
quantum = 1, timey = false) or
quantum = 1/60, timey = true) goals.
So that’s what
quantum is set to by default.
Every time a datapoint is added, if quantex is false, set quantum to the min of itself and quantize(x), where x is the datapoint value, as a string, the way it was entered, and quantize() is defined as at conservaround.glitch.me.

All the numbers displayed for the bare min, hard cap, and safety buffer expressed in goal units are rounded with the conservaround function as defined at conservaround.glitch.me, with the goal's yaw field as the error direction parameter.
For example, a do-more goal (good side up) will round up.
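The inference step above, shrinking quantum to match the finest-grained datapoint the user has entered, might be sketched like so (this quantize is a simplification of the one defined at conservaround.glitch.me and only handles plain decimal strings):

```python
def quantize(x: str) -> float:
    """Granularity implied by a datapoint value as entered:
    '3' -> 1, '1.5' -> 0.1, '0.25' -> 0.01."""
    if "." not in x:
        return 1
    return 10 ** -len(x.split(".", 1)[1])

# Running-min update as datapoints come in (quantum starts at 1):
quantum, quantex = 1, False
for entered in ["2", "1.5", "3"]:
    if not quantex:  # stop inferring once the user sets precision explicitly
        quantum = min(quantum, quantize(entered))
# quantum is now 0.1, thanks to the '1.5'
```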
Fractional beeminding works fine. That’s when you have an inherently integery metric but you treat it as non-integery and enter fractional amounts. It’s a thing, it’s fine, none of this impacts it.
If someone enters a fraction like 1/3 as a datapoint value, that gets macro-expanded in the UI to “0.333333333333333333”.
Similarly, datapoints submitted via the API could accidentally have a stupid amount of precision.
Decision: Tough cookies, go fix the precision if it’s messed up.
(But also try it after this is all shipped and, if it's too ugly, enforce a minimum quantum in the back end as well as the UI.)
What happens to the precision when you rescale your data and the graph? If quantex is true then just rescale quantum as well. If quantex is false, reset quantum = 1 and then rescale all the datapoints, triggering an update for each of them; that sets quantum to whatever makes sense according to the quantize function based on the new data.
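A sketch of that rescaling logic (rescaled_quantum and quantize are hypothetical helpers; this quantize only handles plain decimal strings):

```python
def quantize(x: str) -> float:
    # granularity implied by decimal places: '1.5' -> 0.1, '2' -> 1
    return 10 ** -len(x.split(".", 1)[1]) if "." in x else 1

def rescaled_quantum(quantum: float, quantex: bool, scale: float,
                     rescaled_values: list) -> float:
    """The precision after rescaling all the data by `scale`.
    rescaled_values: the post-rescaling datapoint values, as strings."""
    if quantex:
        return quantum * scale  # explicitly set precision rescales along
    q = 1                       # otherwise reset to 1...
    for v in rescaled_values:   # ...and re-infer from the rescaled data
        q = min(q, quantize(v))
    return q
```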
What about if rescaling yields stupidly fine precision?
This we’ll callously ignore.
If you’re rescaling then you can set your own dang precision.
Possible Pareto-dominant proposal that avoids the precision setting:
Hold off on the quantex field and have the quantum field be strictly inferred from datapoints. The quantum field defaults to 1 (or 1/60 if timey) but if you ever enter a non-integer (or non-divisible-by-1/60) datapoint it permanently reverts to the status quo where we target 4 sigfigs.
Tentative decision: given the existing half-assed integery setting, it's too hard for this to quite Pareto-dominate the status quo.
Also we’re feeling ready to just take the plunge now that this spec is a Heartbreaking Work of Staggering Genius.
What if you have a totally integery do-less goal with a rate of 2/7 so the PPRs (pessimistic presumptive reports) are .571429 or whatever?
Do they just ruin the integeriness of your goal until you go set an explicit quantum?
(Deleting the PPR doesn’t help since inferred quantum isn’t recomputed when datapoints are deleted.)
Answer: No, because we won’t update
quantum based on PPRs.
Those aren’t user-originating datapoints.
Thanks to Bee Soule for pretty much coauthoring this, to the daily beemail subscribers for resoundingly rejecting a version of this that twisted itself into a pretzel trying to avoid having an explicit user setting, Oliver Mayor for reminding me to consider data rescaling, and to Zedmango for setting me straight on the question of rounding datapoint values as well as help with the webcopy.