> The moment of capturing a measurement is known as a metric event
Which suspiciously reads like a log.
In practice, a metric is an aggregate of events (the "metric events") for when you're not interested in the individual event but in the aggregate itself. For practical reasons this is not implemented with logs but with a more primitive form of event emission.
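To make the aggregate-vs-event distinction concrete, here's a minimal sketch (hypothetical names, no real metrics library) of a handler that could log each event individually but instead only updates aggregates:

```python
from collections import defaultdict

# Hypothetical in-process metric store: we keep only aggregates,
# never the individual "metric events".
counters = defaultdict(float)

def handle_request(path: str, duration_ms: float) -> None:
    # A log line would record each event individually, e.g.:
    #   logger.info("handled %s in %.1fms", path, duration_ms)
    # A metric only folds the event into the aggregate:
    counters[("requests_total", path)] += 1
    counters[("request_duration_ms_sum", path)] += duration_ms

for _ in range(3):
    handle_request("/home", 12.5)

print(counters[("requests_total", "/home")])          # 3.0
print(counters[("request_duration_ms_sum", "/home")]) # 37.5
```

The individual events are gone after aggregation, which is exactly the trade-off: cheap to store and query, but you can no longer reconstruct any single request from the metric alone.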
These are not fundamentally incompatible notions. If you do an electrocardiogram, you might be interested in your BPM, but it is deduced from the full log of each beat. The segregation we do in computing is more practical than fundamental.
Completely agree on the confusing terminology there. IMO that should be:
> The moment of capturing a measurement is known as a metric sample.
The mental model I hold is that the metric is the actual value. This may be discrete (e.g. a packet counter) or some continuous value (e.g. a voltage in your ECG example). It can then be observed at some time/value delta interval or summarised into other time series based on what you're hoping to capture.
In essence:
Logs mark some event in the system.
Metrics model some measurable, quantifiable state.
In high volume systems both can then be observed through various sampling techniques. A key point is that sampling is best handled separately from the application logic that creates those signals, since the sampling strategy may change over time or be dynamic.
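One way to picture that separation, sketched with hypothetical names: the application only mutates the metric state, and an independent sampler decides when and how often to observe it, so the sampling policy can change without touching the instrumented code.

```python
# Hypothetical packet counter: the application mutates state...
packet_count = 0

def on_packet() -> None:
    global packet_count
    packet_count += 1

# ...while a separate observer samples that state on its own
# schedule. Changing the interval, or summarising samples into
# another time series, never touches on_packet().
def sample(n: int) -> list[int]:
    return [packet_count for _ in range(n)]

for _ in range(5):
    on_packet()
print(sample(3))  # → [5, 5, 5]
```

In a real system the sampler would run on a timer or be scraped by a collector, but the shape is the same: signal creation and signal observation are decoupled.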