Assuming you have two sound sources emitting sound at the same time, you can emulate distance by using delays: a single listener position will receive the more distant source later, so set the delay of the woodwinds to a bigger value. However, if you have a recording that already contains this delay, you may want to either keep the delay setting the same for both instruments, or set the delay of the violin (which is closer) to a bigger value so that the sound of the two instruments arrives at the same time. You can compute the amount of delay from the speed of sound in the room and the distance.
However, this will not simulate the room's acoustic behaviour: the acoustics you experience in a concert hall, especially the early reflections, depend on the source and listening positions, so adding delays is not enough.
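The delay-from-distance rule above is simple to sketch in code. This is a minimal illustration, not a plugin-accurate model; the distances (violins at 10 m, woodwinds at 14 m) are hypothetical examples:

```python
C = 343.0  # approximate speed of sound in air at room temperature, m/s

def distance_delay_ms(distance_m):
    """Propagation delay in milliseconds for a given source-listener distance."""
    return distance_m / C * 1000.0

# Hypothetical positions: violins 10 m from the listener, woodwinds 14 m
extra = distance_delay_ms(14.0) - distance_delay_ms(10.0)
print(f"extra delay for the woodwinds: {extra:.2f} ms")
```

The extra 4 m of path works out to roughly 12 ms, which is the kind of value you would dial into the woodwinds' delay (or, for a recording that already contains it, into the violins' delay to re-align them).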
OK, the delay should be bigger for an instrument in the back row. But not really, because the musicians compensate for this - strings always play a little behind the beat anyway.
The point is: the instrument in the back row is closer to the *rear* wall (the wall the spectators are looking at) than the instrument in the front row. So the overall delay should be bigger, but the early reflections should arrive sooner - relative to the direct sound itself.
You can try this yourself - apply very late early reflections to a solo instrument and it sounds as if it were at the front edge of an empty opera stage, with a nice distance between the player and all walls. But the sound of an instrument/singer way back at the rear wall arrives more or less at the same time as its early reflections.
Yes, you can also consider the side walls and the wall behind you, but I think the rear wall is the first-order effect.
I have had previous discussions with Beat on the topic of pre-delay. Beat's tutorials are excellent, but I take the opposite approach to pre-delays for the distinct instrument groups. Let's define pre-delay as the time difference between the direct sound and the first reflection.
Here is a simple approach:
Make a drawing of a concert stage and put yourself in row 10 in the hall.
Draw a direct line from the strings/conductor's position to your listening position.
Try to draw first reflection lines via the walls from the strings to your ears.
Repeat the same for the percussion in the back of the stage.
You will see by simple geometry that the closer instruments have MORE pre-delay (i.e. time to the first early reflection) than the instruments further back. Early reflections from the closer instruments travel a relatively longer path; early reflections from the back instruments arrive quite close to the direct sound.
The crux of this discussion is that we tend to confuse the distance between listener and instrument with the relative ER "distance".
Of course instruments in the back have a longer direct-sound path, which some people tend to include in the pre-delay calculation. However, we typically want all instruments to play in sync, so we are better off removing the direct signal path from the calculation. What is left: early reflections from the front instruments have longer paths, and ERs from the back instruments have shorter paths.
Make a drawing; it makes this a lot clearer.
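If you prefer numbers to a drawing, the same geometry can be checked with a small image-source calculation. This is a simplified 1-D sketch (source and listener on a line perpendicular to the rear wall, only the rear-wall reflection considered); all positions are hypothetical:

```python
C = 343.0  # approximate speed of sound in air, m/s

def predelay_ms(source_y, listener_y):
    """Gap in ms between direct sound and the first rear-wall reflection.

    The rear wall is at y = 0. By the image-source method, the reflected
    sound behaves as if emitted by a mirror image of the source at -source_y,
    so the reflected path is listener_y + source_y while the direct path
    is listener_y - source_y.
    """
    direct = listener_y - source_y
    reflected = listener_y + source_y
    return (reflected - direct) / C * 1000.0

# Hypothetical layout: listener in row 10 at y = 25 m,
# strings near the stage edge (8 m from the rear wall),
# percussion close to the rear wall (2 m from it).
front = predelay_ms(source_y=8.0, listener_y=25.0)
back = predelay_ms(source_y=2.0, listener_y=25.0)
print(f"front instrument pre-delay: {front:.1f} ms")
print(f"back instrument pre-delay:  {back:.1f} ms")
```

The extra reflected path is simply twice the source's distance to the rear wall, so the front instrument gets a pre-delay several times longer than the back instrument - exactly the "closer instruments have MORE pre-delay" result from the drawing.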
In my DAW's mixer I use two true stereo IR sets (from my own libs) and to my ears this "ER" placement works fine for making a difference in front and back instruments. If I had more CPU power I'd use three sets.