Measuring time budget deviations

So I have a subject and his time budget (a log or diary of activity durations) over 24 hours.

Let’s say the subject sleeps 8 hours, works 9 hours, eats for 1 hour, exercises for 2 hours, and watches TV for 4 hours every day, with small variance in every feature.
I observe the subject for a week or two (about 10 samples) and work out what the normal time budget for a day looks like: I can calculate the mean and the variance of every feature.
Now I want to be alerted when a day’s time budget deviates from this “standard time budget” in a meaningful way. Say the subject exercises much less but sleeps more, so I know something is off with him today. Or he skips his usual eating time.
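For concreteness, here is a minimal sketch of the kind of check I have in mind, with invented baseline numbers and an arbitrary three-sigma threshold (activity names and values are just placeholders):

```python
import numpy as np

# Invented baseline: 10 observed days (rows) of five activities (columns),
# in hours: sleep, work, eat, exercise, tv.
baseline = np.array([
    [8.1, 9.0, 1.0, 2.1, 3.8],
    [7.9, 9.2, 1.1, 1.9, 3.9],
    [8.0, 8.8, 0.9, 2.0, 4.3],
    [8.2, 9.1, 1.0, 2.2, 3.5],
    [7.8, 9.0, 1.1, 1.8, 4.3],
    [8.0, 9.3, 0.9, 2.0, 3.8],
    [8.1, 8.9, 1.0, 2.1, 3.9],
    [7.9, 9.0, 1.0, 1.9, 4.2],
    [8.0, 9.1, 1.1, 2.0, 3.8],
    [8.0, 8.6, 0.9, 2.0, 4.5],
])
activities = ["sleep", "work", "eat", "exercise", "tv"]

mu = baseline.mean(axis=0)            # per-feature mean
sigma = baseline.std(axis=0, ddof=1)  # per-feature sample std

def flag_deviations(day, threshold=3.0):
    """Return the activities whose z-score against the baseline exceeds the threshold."""
    z = (np.asarray(day) - mu) / sigma
    return [a for a, zi in zip(activities, z) if abs(zi) > threshold]

# More sleep, much less exercise -> both should stand out:
print(flag_deviations([9.5, 9.0, 1.0, 0.5, 4.0]))
```

The obvious drawback is that this treats the features independently and ignores that the 24 hours are coupled (more sleep forces less of something else), which is part of why I am asking.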

I tried a chi-squared test, but it depends strongly on the measurement scale: if I record the times in seconds instead of hours, even the slightest deviations become statistically significant, which I do not want.
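Here is a small example of the scale dependence I mean (hypothetical numbers): the same mildly unusual day looks harmless in hours but wildly “significant” in seconds, because Pearson’s statistic scales linearly with the total count:

```python
from scipy.stats import chisquare

expected_hr = [8, 9, 1, 2, 4]      # "standard" budget in hours (sums to 24)
observed_hr = [9, 8.5, 1, 1, 4.5]  # a mildly different day (also sums to 24)
stat_hr, p_hr = chisquare(observed_hr, f_exp=expected_hr)

# Exactly the same day, expressed in seconds:
observed_s = [3600 * x for x in observed_hr]
expected_s = [3600 * x for x in expected_hr]
stat_s, p_s = chisquare(observed_s, f_exp=expected_s)

print(f"hours:   stat={stat_hr:.3f}, p={p_hr:.3f}")  # comfortably non-significant
print(f"seconds: stat={stat_s:.1f}, p={p_s:.2e}")    # essentially zero p-value
```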

How could I properly measure the deviation of a day’s time budget from the “standard time budget”?


Fortran must die
I am not sure what “proper” means here. I think you have to measure the deviation and then make a substantive, not statistical, determination of how much deviation it takes to matter.


Ambassador to the humans
I feel like the Dirichlet distribution is what is needed here. Maybe a likelihood ratio test against the null hypothesis that the ratios of the alphas give the desired proportions? Just spitballing here. Maybe there is a prebuilt test for this, but I don't know of one.
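A crude sketch of that idea: fit a Dirichlet to the baseline day-proportions by moment matching (simpler than a formal likelihood ratio test, so take it as an illustration only; the scipy calls are real, the baseline numbers are invented), then flag any day whose likelihood under the fit falls below that of every baseline day:

```python
import numpy as np
from scipy.stats import dirichlet

# Invented baseline: 10 days x 5 activities (sleep, work, eat, exercise, tv), in hours.
baseline = np.array([
    [8.1, 9.0, 1.0, 2.1, 3.8],
    [7.9, 9.2, 1.1, 1.9, 3.9],
    [8.0, 8.8, 0.9, 2.0, 4.3],
    [8.2, 9.1, 1.0, 2.2, 3.5],
    [7.8, 9.0, 1.1, 1.8, 4.3],
    [8.0, 9.3, 0.9, 2.0, 3.8],
    [8.1, 8.9, 1.0, 2.1, 3.9],
    [7.9, 9.0, 1.0, 1.9, 4.2],
    [8.0, 9.1, 1.1, 2.0, 3.8],
    [8.0, 8.6, 0.9, 2.0, 4.5],
])
P = baseline / baseline.sum(axis=1, keepdims=True)  # daily proportions, rows sum to 1

m = P.mean(axis=0)
v = P.var(axis=0, ddof=1)
# Method-of-moments concentration, averaged across activities; then alpha = s * m.
s = np.mean(m * (1 - m) / v - 1)
alpha = s * m

def day_logpdf(hours):
    p = np.asarray(hours, dtype=float)
    return dirichlet.logpdf(p / p.sum(), alpha)

# Crude cutoff: any day less likely than the least likely baseline day is "unusual".
cutoff = min(dirichlet.logpdf(row, alpha) for row in P)

def is_unusual(hours):
    return day_logpdf(hours) < cutoff

print(is_unusual([9.5, 9.0, 1.0, 0.5, 4.0]))  # more sleep, much less exercise
print(is_unusual([8.0, 9.0, 1.0, 2.0, 4.0]))  # a typical day
```

With only ~10 days the alpha estimate is very noisy, and the min-likelihood cutoff is arbitrary (a quantile of the baseline log-likelihoods would be less brittle); a proper likelihood ratio test would refit alpha with and without the hypothesized proportion constraint.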