Moving from https://github.com/catapult-project/catapult/issues/3769
Users often search for alerts by revision. In the V1 UI, this is /group_report?rev=N. In v2spa, this will be #alerts&rev=N.
A user who is reviewing a set of alerts may naturally ask, "Is this all? Or will there be more alerts here tomorrow?"
When users are investigating a group of alerts or the overall impact of a particular CL, there is some probability that, as more data is uploaded to the dashboard, additional alerts will be detected.
Low-tech:
* show a red badge in alerts-section if the given revision is less than 1 day old
* show a yellow badge if the revision is between 1 and 2 days old
* show a green badge if the revision is more than 2 days old
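The low-tech badge logic above could be sketched roughly like this (the function name and `now` parameter are illustrative, not part of any existing dashboard API):

```python
from datetime import datetime, timedelta

def badge_color(revision_timestamp, now=None):
    # Pick a freshness badge color for a revision's alert group.
    # Thresholds follow the low-tech proposal: red under 1 day old,
    # yellow under 2 days, green otherwise.
    now = now or datetime.utcnow()
    age = now - revision_timestamp
    if age < timedelta(days=1):
        return 'red'
    if age < timedelta(days=2):
        return 'yellow'
    return 'green'
```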
Slightly higher-tech:
A bot's "completeness score" is the number of data points uploaded to the bot since the revision.
Normalize bots' completeness scores linearly: a bot is 100% past a revision when 40 data points (or some other large number) have been uploaded after it.
Consider all bots that use the same revision schedule as the given revision, or, if a set of test suites is selected, all bots that upload those suites.
A revision's "completeness score" is the average of the normalized bots' completeness scores.
The number of data points uploaded since a revision can be fetched using /api/timeseries2?min_revision
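The scoring scheme above could be sketched as follows; the function names and the `full_count=40` default are assumptions taken from the proposal, and fetching the per-bot point counts from /api/timeseries2 is left to the caller:

```python
def bot_completeness(points_after_revision, full_count=40):
    # A bot is 100% past a revision once `full_count` data points
    # (40, or some other large number) have been uploaded after it;
    # below that, completeness scales linearly.
    return min(points_after_revision / full_count, 1.0)

def revision_completeness(points_per_bot, full_count=40):
    # A revision's completeness score is the average of the
    # normalized completeness scores of the relevant bots.
    if not points_per_bot:
        return 0.0
    scores = [bot_completeness(n, full_count) for n in points_per_bot]
    return sum(scores) / len(scores)
```

For example, two bots with 40 and 20 points past the revision would give a revision completeness of 0.75.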