I am trying to regress two parameters, 'A' and 'B', simultaneously (i.e., as a multivariate response) on matched observations of predictors (I have one 'A' and one 'B' for each set of predictor values, and all variables are continuous).

The 'A' and 'B' parameters are outputs of another regression of the form y = 1 - exp(-((x-1)/B)^A) (attached figure). Creating "observations" of the 'A' and 'B' parameters is easy: I fit that curve separately to each data set. The difficult part is forming relationships between the fitted 'A' and 'B' values and external predictors ('A' and 'B' are assumed linear in the predictor coefficients).
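To make the first stage concrete, here is a sketch of how one pair of (A, B) "observations" could be produced, using synthetic data (the true values 2.0 and 3.0 and the noise level are made up for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def saturation_curve(x, A, B):
    """y = 1 - exp(-((x - 1)/B)^A); defined for x >= 1 with A, B > 0."""
    return 1.0 - np.exp(-(((x - 1.0) / B) ** A))

# Synthetic data set standing in for one real data set
rng = np.random.default_rng(0)
x = np.linspace(1.01, 10.0, 200)
y = saturation_curve(x, 2.0, 3.0) + rng.normal(0.0, 0.01, x.size)

# One (A, B) "observation" comes from a nonlinear least-squares fit
(A_hat, B_hat), _ = curve_fit(saturation_curve, x, y, p0=[1.0, 1.0],
                              bounds=([1e-6, 1e-6], [np.inf, np.inf]))
```

Repeating this per data set yields the sample of (A, B) pairs that the second-stage regression is built on.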

The 'A' parameter is reasonably well fit by a normal distribution, but the 'B' parameter is necessarily positive with a large standard deviation, and its distribution resembles a gamma.

Because I require simultaneous regression of 'A' and 'B', because 'B' is non-normally distributed, and because 'A' and 'B' are likely to have different predictors, I have chosen Vector Generalized Linear Models (VGLMs) fit by Maximum Likelihood Estimation as my regression strategy. This allows a normal family with an identity link for 'A' and a gamma family (with, e.g., a log link to keep it positive) for 'B'.
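Here is a minimal sketch of the joint likelihood I have in mind, assuming (A, B) are conditionally independent given the predictors: normal with identity link for A, gamma with a log link for B, and both log-likelihoods summed and maximized together. The predictors x1, x2 and all coefficients are made-up values for illustration:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm, gamma

rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(size=n)   # hypothetical predictor for A
x2 = rng.normal(size=n)   # hypothetical predictor for B

# Simulate the (A, B) observations: A normal, B gamma with log-link mean
A_obs = 1.0 + 0.5 * x1 + rng.normal(0.0, 0.3, n)
shape_true = 4.0
mean_B = np.exp(0.2 + 0.8 * x2)
B_obs = rng.gamma(shape_true, mean_B / shape_true)

def negloglik(theta):
    b0a, b1a, log_sig, b0b, b1b, log_shape = theta
    mu_A = b0a + b1a * x1                 # identity link for A
    mu_B = np.exp(b0b + b1b * x2)         # log link keeps B's mean positive
    shape = np.exp(log_shape)
    ll_A = norm.logpdf(A_obs, loc=mu_A, scale=np.exp(log_sig))
    ll_B = gamma.logpdf(B_obs, shape, scale=mu_B / shape)
    # joint likelihood = product over observations, so log-likelihoods add
    return -(ll_A.sum() + ll_B.sum())

res = minimize(negloglik, x0=np.zeros(6), method="BFGS")
```

This is only a hand-rolled stand-in for what a VGLM fitter (e.g. R's VGAM::vglm) does internally; the point is that the objective is the summed log-density of the (A, B) observations, not the saturation curve itself.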

I understand that MLE works by maximizing the joint likelihood of all observations with respect to the parameters. My questions are:

1. How do I define this pdf? It can't simply be the equation that defines 'A' and 'B', because that is a monotonically increasing function of x (the maximum y in the figure always occurs at the maximum x-value), not a probability density.

2. Is there a different estimation method that would be better than MLE and still work with VGLMs?

3. Is there a better modeling approach than VGLMs altogether?

PS: I am working in MATLAB, but suggestions for Python or (maybe) R are also appreciated.