Sure. I haven't tested this but hopefully it'll put you on the right track. Assuming you're using the code from the triangle picking sample linked above, you need to modify the custom model processor to store UV coordinates for the model in the Tag data. So first add a member variable to the TrianglePickingProcessor class to accumulate texture coordinates in:
|List<Vector2> texCoords = new List<Vector2>();
In the FindVertices method, inside the foreach loop that iterates through the mesh's GeometryContents (but outside the inner foreach loop that iterates through the Indices), get the texture coordinates for each piece of
geometry and store them:
|IEnumerable<Vector2> geometryTexCoords = geometry.Vertices.Channels.Get<Vector2>(VertexChannelNames.TextureCoordinate(0));
What you get back from Channels.Get is just a list of all the texture coordinates for that GeometryContent, in the same order as the vertices. (So index 0 of texCoords contains the UV coordinates for index 0 of vertices.) You can access any of the model's non-positional data this way -- colors, normals, etc.
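Putting those two pieces together, the loop might look something like this (untested, and assuming the sample's structure, where the vertices list is built up per index inside the inner loop). Note that the UV for each index is added inside the inner loop so that texCoords stays parallel to the expanded vertices list:

```csharp
foreach (GeometryContent geometry in mesh.Geometry)
{
    // Fetch the channel once per geometry, and copy it into a list
    // so it can be indexed below.
    List<Vector2> geometryTexCoords = new List<Vector2>(
        geometry.Vertices.Channels.Get<Vector2>(
            VertexChannelNames.TextureCoordinate(0)));

    foreach (int index in geometry.Indices)
    {
        // The sample adds the transformed vertex position here; add the
        // matching UV as well so both lists share the same indexing.
        texCoords.Add(geometryTexCoords[index]);
    }
}
```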
Then in the Process method where the tagData dictionary is built, add the texture coordinates alongside the vertices and bounding sphere that are already being added (the key string is up to you, as long as the game reads it back with the same one):
|tagData.Add("TexCoords", texCoords);
That's it for the ModelProcessor. Next, modify the RayIntersectsModel method that does the triangle test so it will output the indices of the closest intersecting triangle's vertices -- right now it just outputs the vertex positions. This is in the innermost block of the method, where it stores the information for the closest detected intersection:
|// If so, is it closer than any other previous triangle?
|if ((closestIntersection == null) || (intersection < closestIntersection))
You just need to add an integer output parameter for the method and store the value of the i counter. If RayIntersectsModel returns with a hit, you use i, i+1, and i+2 as the indices into the texcoord list you have stored in the model's tag data.
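A sketch of that change (untested; the out parameter name is mine, and the ray/triangle test is whatever the sample already does):

```csharp
// Inside RayIntersectsModel, after adding "out int closestIndex" to its
// signature. -1 means "no triangle was hit".
closestIndex = -1;
float? closestIntersection = null;

for (int i = 0; i < vertices.Length; i += 3)
{
    // Does the ray hit this triangle? (The sample's existing test for
    // vertices[i], vertices[i + 1], vertices[i + 2].)
    float? intersection = RayIntersectsTriangle(ref ray,
        ref vertices[i], ref vertices[i + 1], ref vertices[i + 2]);

    if (intersection != null)
    {
        // If so, is it closer than any other previous triangle?
        if ((closestIntersection == null) ||
            (intersection < closestIntersection))
        {
            closestIntersection = intersection;
            closestIndex = i;   // remember which triangle won
        }
    }
}
```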
Those three pairs of texture coordinates will tell you which part of the model texture covers the triangle that was hit, and proportionally where you should draw the damage onto your damage texture map, so that when you blend the damage texture on the model the bullet holes / explosion marks / whatever show up in the right place. The simplest way to process the texture coordinates just to get something working would probably be to average them and then center your damage graphic on that point. But to make it look good you will need some kind of logic that works out the appropriate scale of the damage graphic based on how large the intersected triangle is. (And what the range of UV values for the triangle is, indicating the scale of the texture on that part of the model.)
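For the simple version, the math is just this (names are illustrative; i is the index returned by RayIntersectsModel, and texCoords is the list read back from the model's Tag):

```csharp
Vector2 uv0 = texCoords[i];
Vector2 uv1 = texCoords[i + 1];
Vector2 uv2 = texCoords[i + 2];

// Simplest placement: center the damage graphic on the average of the
// three UVs.
Vector2 decalCenter = (uv0 + uv1 + uv2) / 3f;

// Rough scale hint for the fancier version: the UV-space extent of the
// intersected triangle.
Vector2 uvMin = Vector2.Min(uv0, Vector2.Min(uv1, uv2));
Vector2 uvMax = Vector2.Max(uv0, Vector2.Max(uv1, uv2));
Vector2 uvExtent = uvMax - uvMin;
```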
When it comes to creating the damage texture, most likely you would have sprites with the different possible damage representations. For each model you'd have a Texture2D with the same proportions as the model's main texture (not necessarily the same resolution, just the same width/height ratio), as well as an array of Colors with (textureWidth * textureHeight) elements. Initially all the elements would probably be transparent black. Then whenever there's a hit you copy one of the damage sprites into the array at the location and scale indicated by the texture coordinates. To index into an array like this that you're creating a texture from, you use
|colorArray[y * textureWidth + x]
to get the pixel at coords (x, y). (Here colorArray is whatever you've called your Color array.) Whenever the array changes because of new hits you call SetData on the Texture2D that stores the damage, passing in the Color array. (Not every frame! Only when it actually changes. SetData is slow.)
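The copy itself could look something like this (untested sketch; all the names and parameters are illustrative, and it assumes you've already read the damage sprite's pixels out into spriteColors with GetData):

```csharp
// Stamp a damage sprite into the damage map, centered on a UV coordinate.
void StampDamage(Color[] damageColors, int damageWidth, int damageHeight,
                 Color[] spriteColors, int spriteWidth, int spriteHeight,
                 Vector2 centerUV)
{
    // Convert the UV center to pixel coordinates of the sprite's top-left.
    int left = (int)(centerUV.X * damageWidth) - spriteWidth / 2;
    int top  = (int)(centerUV.Y * damageHeight) - spriteHeight / 2;

    for (int sy = 0; sy < spriteHeight; sy++)
    {
        for (int sx = 0; sx < spriteWidth; sx++)
        {
            int dx = left + sx;
            int dy = top + sy;
            if (dx < 0 || dx >= damageWidth || dy < 0 || dy >= damageHeight)
                continue;   // clip at the texture edge

            Color src = spriteColors[sy * spriteWidth + sx];
            if (src.A > 0)  // only copy non-transparent sprite pixels
                damageColors[dy * damageWidth + dx] = src;
        }
    }
}

// After stamping one or more hits:
// damageTexture.SetData(damageColors);
```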
Does this help?