iPhone sensor data mining to predict events

#1
I'm working on a concept for a mobile device programming class. I have no clue whether or not it will work. Basically my plan is to mine all of the sensor data I can (magnetometer, gyroscope, noise level, etc.) during different types of events. These might be things like walking, being at work, or driving to work. I'm also taking random sample points every few minutes as "noise" data. The numbers from most of these sensors are arbitrary values, usually between -1 and 1 or 0 and 1.
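To give a concrete picture, here's roughly what one labeled sample looks like in my logging (a minimal sketch; the field names and normalization are placeholders I made up, not the actual iOS APIs):

```python
import time
from dataclasses import dataclass

@dataclass
class SensorSample:
    timestamp: float     # seconds since epoch
    label: str           # "driving", "walking", "at work", ... or "noise"
    mag: float           # magnetometer reading, roughly normalized to [-1, 1]
    gyro: float          # gyroscope rotation rate, same rough range
    noise_level: float   # ambient noise estimate in [0, 1]

def record_sample(label, mag, gyro, noise_level):
    """Wrap one polling pass of the sensors into a labeled sample."""
    return SensorSample(time.time(), label, mag, gyro, noise_level)
```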

My ultimate goal is to design an app that can try and learn what you're doing based on past sensor data.

The result of this data mining for one event, "driving", is attached. The data points for driving are in red and the noise points are in grey. These histograms show relative percentages: the number of points in each bucket divided by the total number of points (so the y-axis runs from 0 to 1).
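For reference, I'm building the histograms roughly like this (a minimal NumPy sketch; the bucket count and value range are just what I happened to pick):

```python
import numpy as np

def relative_histogram(values, bins=20, value_range=(-1.0, 1.0)):
    """Relative-frequency histogram: count in each bucket / total count."""
    counts, edges = np.histogram(values, bins=bins, range=value_range)
    return counts / counts.sum(), edges

# event_hist, edges = relative_histogram(driving_gyro_values)   # red points
# noise_hist, _     = relative_histogram(random_gyro_values)    # grey points
```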

Usually the red and grey histograms look approximately the same, but there are always a few sensors that appear to carry the "signature" of a given event. Maybe in one case the change in magnetic field and the time of day show I'm at work; in another, the change in speed, gyroscope activity, and ambient light level show I'm driving. I really want to design an algorithm that applies to any kind of general continuous sensor data.

So this is my question for you guys:

Can I take these histograms and work backwards to find the confidence that an event is happening, based on the probability distribution? If I have an instantaneous sensor value (or maybe a moving average or a range of values from the last few seconds), can I look it up against the histogram to get the percent chance that the event is actually happening? And if I can do that, what's the best way to combine the data from all sensors?
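To make the question concrete, the kind of thing I'm imagining is a naive-Bayes-style lookup: treat each sensor's histogram as an empirical likelihood, look up the bucket the current value falls into, and multiply across sensors under an independence assumption. A rough sketch, assuming the histograms above and with all names made up:

```python
import numpy as np

def bin_probability(value, hist, edges):
    """Look up the relative frequency of the bucket this value falls into."""
    idx = np.searchsorted(edges, value, side="right") - 1
    idx = np.clip(idx, 0, len(hist) - 1)   # clamp out-of-range values to the end buckets
    return hist[idx]

def event_confidence(readings, event_hists, noise_hists, prior=0.5, eps=1e-6):
    """
    readings:     {sensor_name: current value (or short moving average)}
    event_hists:  {sensor_name: (hist, edges)} built from the red points
    noise_hists:  {sensor_name: (hist, edges)} built from the grey points
    Returns P(event | readings) assuming the sensors are independent.
    """
    p_event, p_noise = prior, 1.0 - prior
    for name, value in readings.items():
        eh, ee = event_hists[name]
        nh, ne = noise_hists[name]
        p_event *= bin_probability(value, eh, ee) + eps   # eps avoids empty buckets zeroing everything
        p_noise *= bin_probability(value, nh, ne) + eps
    return p_event / (p_event + p_noise)
```

I realize the independence assumption is probably too strong (the gyroscope axes obviously move together), and with many sensors the products should probably be done in log space to avoid underflow, but that's the shape of the idea.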

I have a few ideas that may or may not work, but I'd love to hear approaches others might think of that I haven't.