If you know some theory that supports choosing a certain kind of parametric model, or a semi-parametric model like the one Karabiner suggested, you can always do so - but you need to check whether such a model gives a good fit to the data. A model of this kind may give you a more elegant result, a nicer interpretation, a more powerful prediction, and it may be easier to develop further theory on top of it. But such an elegant model may not even exist, and it may suffer from mis-specification error.

So back to your question - if you have a parametric model, you can of course obtain the multivariate distribution by estimating the parameters, and thus also obtain an estimated conditional distribution. After estimating all the necessary parameters, one can simulate a multivariate random vector accordingly. Whether an easy/good simulation method exists is another issue; e.g. for the multivariate normal you may use the Cholesky decomposition, but such an elegant method may not exist for other kinds of multivariate distributions.
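To illustrate the multivariate normal case, here is a minimal sketch of the Cholesky approach in Python/NumPy. The mean vector and covariance matrix below are made-up placeholders standing in for your estimated parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical estimated parameters for a 3-dimensional normal model.
mu = np.array([1.0, 0.0, -1.0])
sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])

# Cholesky factor L with sigma = L @ L.T (sigma must be positive definite).
L = np.linalg.cholesky(sigma)

# If z has i.i.d. standard normal entries, x = mu + L z has mean mu
# and covariance sigma.
z = rng.standard_normal((100_000, 3))
x = mu + z @ L.T

print(np.allclose(x.mean(axis=0), mu, atol=0.05))        # sample mean ~ mu
print(np.allclose(np.cov(x, rowvar=False), sigma, atol=0.05))  # sample cov ~ sigma
```

The same linear-transform trick is what `numpy.random.Generator.multivariate_normal` does internally (by default via SVD rather than Cholesky).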

As a non-parametric alternative: you actually have a discrete multivariate random vector with a total of \( 11^k \) support points, where \( k \) is the number of variables, i.e. the dimension of the vector. With a sufficient amount of data, you can always obtain a good estimate of the empirical multivariate probability mass function, and from it everything you want. The difficulty is that you may not have that much data, especially when \( k \) is large, and you may need to fall back on a parametric approach to lessen this requirement.
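The non-parametric route can be sketched as follows: tabulate relative frequencies of the observed \( k \)-tuples to get the joint pmf, read conditional pmfs off the joint counts, and simulate by resampling observed rows. The data here are randomly generated stand-ins for your 11-level variables:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)

# Hypothetical data: n observations of a k = 2 vector, each entry in 0..10,
# so there are 11**2 = 121 possible support points.
n, k = 5000, 2
data = rng.integers(0, 11, size=(n, k))

# Empirical joint pmf: relative frequency of each observed k-tuple.
counts = Counter(map(tuple, data))
pmf = {point: c / n for point, c in counts.items()}

# Empirical conditional pmf of X2 given X1 == 3, read off the joint counts.
slice_counts = {p[1]: c for p, c in counts.items() if p[0] == 3}
total = sum(slice_counts.values())
cond = {v: c / total for v, c in slice_counts.items()}

# Simulating from the empirical joint distribution is just resampling
# observed rows with replacement (i.e. a bootstrap sample).
sample = data[rng.integers(0, n, size=10)]
print(len(pmf) <= 11**k)   # observed support cannot exceed 121 points
```

With \( k = 2 \) and 5000 observations each cell gets roughly 40 observations on average; with larger \( k \) the \( 11^k \) cells quickly outrun any realistic sample size, which is exactly the curse-of-dimensionality problem mentioned above.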