Piecewise regression


Bats tend to be active near hedgerows and the activity drops off as they move into adjacent pasture. The problem is to determine at what distance the activity can be considered zero.
I have activity levels from a series of scattered monitoring stations at various distances from the hedgerow, and as distance increases the activity eventually drops to zero at the most distant stations. My idea is to model the data out to some as-yet-unknown distance as a decline (hopefully linear, after a suitable transformation) and as zero beyond that point.
The problem is to estimate the breakpoint between the declining part and the zero part, together with a margin of error for it. The exact form of the decline is less important.
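One way to sketch this (a hockey-stick or broken-stick model) is to write the mean activity as a linear decline that hits zero at the breakpoint c, i.e. f(d) = b·max(c − d, 0), and fit b and c together by nonlinear least squares. The data below are simulated purely for illustration; with your real distances and activity levels you would replace them and check the fit, since the standard error for c from the covariance matrix is only an approximate (Wald-type) margin of error.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Simulated data for illustration only: activity declines linearly
# and reaches zero at a true breakpoint of 60 m from the hedgerow.
dist = rng.uniform(0, 100, 80)
true_slope, true_break = 0.5, 60.0
activity = true_slope * np.maximum(true_break - dist, 0) \
    + rng.normal(0, 1.5, dist.size)

# Hockey-stick model: linear decline that is exactly zero beyond c.
def hockey(d, b, c):
    return b * np.maximum(c - d, 0.0)

popt, pcov = curve_fit(hockey, dist, activity, p0=[1.0, 50.0])
b_hat, c_hat = popt
se_b, se_c = np.sqrt(np.diag(pcov))

print(f"slope      = {b_hat:.3f} ± {1.96 * se_b:.3f}")
print(f"breakpoint = {c_hat:.1f} m, approx 95% CI ± {1.96 * se_c:.1f} m")
```

A grid search over candidate breakpoints (fitting an ordinary linear regression to the stations closer than each candidate) would give much the same point estimate; a bootstrap over stations is a more robust route to the margin of error if the residuals look non-normal.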
Any ideas? Thanks, kat