Machine Learning Models Are Conditional, Right?

My Googling is turning up no great results, but given the terminology differences between statistics and computer science, that may not mean much.

My basic question is whether machine learning models such as neural networks, random forests, and gradient boosted machines are all classified as Conditional Models by statistical standards, or whether there is some algorithm type or aspect of them that could be considered Marginal. I am aware of procedures that integrate out many of the variables in a random forest or GBM to obtain a 'marginal effect', but I am skeptical that this equates to a population-level effect.
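For concreteness, here is roughly the kind of procedure I mean: a hand-rolled partial-dependence calculation that averages a fitted random forest's predictions over the observed values of the other features. The dataset and model here are made up purely for illustration.

```python
# Sketch of "integrating out" the other variables for one feature of a
# fitted random forest (manual partial dependence). Illustrative data only.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=500, n_features=5, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

def partial_dependence(model, X, feature, grid):
    """For each value in `grid`, clamp `feature` to that value and
    average the predictions over the observed rows of X."""
    values = []
    for v in grid:
        X_mod = X.copy()
        X_mod[:, feature] = v                      # fix the feature of interest
        values.append(model.predict(X_mod).mean()) # average out everything else
    return np.array(values)

grid = np.linspace(X[:, 0].min(), X[:, 0].max(), 20)
pd_curve = partial_dependence(model, X, feature=0, grid=grid)
```

My worry is that this only averages over the empirical joint distribution of the other features in the training sample, so it is a sample analogue of a population-averaged quantity at best.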

If anyone has opinions or papers to share in this area, I would be very appreciative. Thanks!

Machine learning attempts to make predictions for individuals, so it is fully conditional on the features it uses, subject to whatever constraints it places on functional form. For example, a machine that bins a continuous variable is conditional at the bin level but not at the original predictor-value level. After prediction, the algorithm can be used to obtain conditional differences (e.g., treatment effects) or marginal differences.
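A minimal sketch of that last distinction, using made-up data and a gradient boosted classifier. The binary "treatment" indicator in column 0 is purely illustrative; the point is only that the conditional contrasts are per-individual, while the marginal contrast averages them over the sample.

```python
# Conditional vs. marginal differences from a fitted predictive model.
# Illustrative data only; column 0 stands in for a binary treatment flag.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=500, n_features=5, random_state=0)
X[:, 0] = (X[:, 0] > 0).astype(float)   # pretend feature 0 is the treatment
model = GradientBoostingClassifier(random_state=0).fit(X, y)

X1, X0 = X.copy(), X.copy()
X1[:, 0] = 1.0                          # everyone set to "treated"
X0[:, 0] = 0.0                          # everyone set to "untreated"

# Conditional (individual-level) differences in predicted risk
cond_diff = model.predict_proba(X1)[:, 1] - model.predict_proba(X0)[:, 1]

# Marginal difference: average the conditional contrasts over the sample
marg_diff = cond_diff.mean()
```

The marginal quantity is only as population-relevant as the sample you average over; the conditional contrasts are what the model natively produces.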