Read Back Buffers from GPU DirectX 11


In DirectX 11, I'm looking to read back some buffer data that is on the GPU. I've done a great deal of research and I feel I'm almost there. I've been able to copy the buffers from GPU memory to system memory for CPU access. What I want to do now is access certain vertices so I can do some calculations.
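For context, here's roughly what my readback path looks like - a minimal sketch, with pSourceBuffer standing in for whatever GPU buffer you want to read, and pDevice/pContext being the usual device and immediate context:

// Re-describe the source buffer as a staging resource the CPU can map.
D3D11_BUFFER_DESC desc = {};
pSourceBuffer->GetDesc(&desc);
desc.Usage = D3D11_USAGE_STAGING;
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
desc.BindFlags = 0;
desc.MiscFlags = 0;

ID3D11Buffer* pStaging = nullptr;
if (SUCCEEDED(pDevice->CreateBuffer(&desc, nullptr, &pStaging)))
{
    // Copy GPU -> staging, then map the staging copy for CPU reads.
    pContext->CopyResource(pStaging, pSourceBuffer);

    D3D11_MAPPED_SUBRESOURCE mapped = {};
    if (SUCCEEDED(pContext->Map(pStaging, 0, D3D11_MAP_READ, 0, &mapped)))
    {
        // mapped.pData now points at a CPU-readable copy of the buffer.
        pContext->Unmap(pStaging, 0);
    }
    pStaging->Release();
}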

I have been told that to loop through the buffer, you need to know its structure. For example, this link describes how a vertex buffer is created:

https://docs.microsoft.com/en-us/windows/win32/direct3d11/overviews-direct3d-11-resources-buffers-vertex-how-to

This is the vertex structure they use to initialize it:

// Define the data-type that
// describes a vertex.
struct SimpleVertexCombined
{
    XMFLOAT3 Pos;  
    XMFLOAT3 Col;  
};
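If you do know the layout, as in that example, walking the copy is just stride arithmetic. A sketch, reusing mapped and desc from my snippet above, and assuming the buffer really does hold SimpleVertexCombined vertices (Stride being the stride reported for the slot, as in my code below):

// Interpret the mapped bytes as an array of SimpleVertexCombined,
// assuming Stride == sizeof(SimpleVertexCombined) and no base offset.
const BYTE* pBytes = static_cast<const BYTE*>(mapped.pData);
const UINT vertexCount = desc.ByteWidth / Stride;

for (UINT i = 0; i < vertexCount; ++i)
{
    const SimpleVertexCombined* v =
        reinterpret_cast<const SimpleVertexCombined*>(pBytes + i * Stride);
    // v->Pos and v->Col are now readable for calculations.
}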

I'm using the IAGetVertexBuffers() method to grab some buffers at runtime. Is there a way for me to query their vertex structure (similar to the above example)?

Here is a small snippet of my code.

ID3D11Buffer* buff = nullptr;
UINT Stride = 0, veBufferOffset = 0;
D3D11_BUFFER_DESC vedesc = {};

// Note: IAGetVertexBuffers AddRefs the returned buffer, so Release it when done.
pContext->IAGetVertexBuffers(0, 1, &buff, &Stride, &veBufferOffset);
if (buff)
    buff->GetDesc(&vedesc);

Update… After some additional research I found that it's not possible to get the layout description of a vertex buffer unless you were the one who originally created it.

Vertex buffers are just untyped data blobs. You can actually store multiple different vertex arrangements in the same buffer if you wish, and there's nothing in the API to prevent you from doing it.

What you really need to do is query the input layout in use as well as the bound buffers at the time a draw call is issued, and you can then use that to at least figure out the data types used in vertex buffers for that draw call. But bear in mind that applications can source vertex data from constant or other buffer types, shader resources, procedural generation, or even draw without buffers.


@21st Century Moose I have tried using IAGetInputLayout(), but it doesn't seem to yield anything useful, at least for revealing data types. Maybe I'm not using it the right way. Is that what you meant?

I'm talking about using Shader Reflection - https://docs.microsoft.com/en-us/windows/win32/api/d3d11shader/nn-d3d11shader-id3d11shaderreflection - although that's not going to be a completely robust solution either, as it's valid for e.g. a vertex shader to consume float4 but the input layout to only specify float3 - the input will be automatically expanded.
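A rough sketch of what I mean, assuming you've already captured the vertex shader bytecode somewhere (pShaderBytecode/BytecodeLength are placeholders for that blob, and you'd link against d3dcompiler.lib):

#include <iostream>
#include <d3d11shader.h>
#include <d3dcompiler.h>

// Reflect a captured vertex shader blob and list its input signature.
ID3D11ShaderReflection* pReflector = nullptr;
if (SUCCEEDED(D3DReflect(pShaderBytecode, BytecodeLength,
                         IID_ID3D11ShaderReflection,
                         reinterpret_cast<void**>(&pReflector))))
{
    D3D11_SHADER_DESC shaderDesc = {};
    pReflector->GetDesc(&shaderDesc);

    for (UINT i = 0; i < shaderDesc.InputParameters; ++i)
    {
        D3D11_SIGNATURE_PARAMETER_DESC paramDesc = {};
        pReflector->GetInputParameterDesc(i, &paramDesc);

        // SemanticName/ComponentType/Mask describe what the shader consumes -
        // which, as noted, can be wider than what the input layout supplies.
        std::cout << paramDesc.SemanticName << paramDesc.SemanticIndex << std::endl;
    }
    pReflector->Release();
}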

Another way of doing it might be via a proxy dll that intercepts the CreateInputLayout call.
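The hook itself just needs to match ID3D11Device::CreateInputLayout's signature, log the element descs, then forward to the real function. Roughly, with fpRealCreateInputLayout being a hypothetical pointer your hooking framework fills in with the original vtable entry:

#include <iostream>
#include <d3d11.h>

using CreateInputLayout_t = HRESULT (STDMETHODCALLTYPE*)(
    ID3D11Device*, const D3D11_INPUT_ELEMENT_DESC*, UINT,
    const void*, SIZE_T, ID3D11InputLayout**);

CreateInputLayout_t fpRealCreateInputLayout = nullptr; // set during hook setup

HRESULT STDMETHODCALLTYPE Hooked_CreateInputLayout(
    ID3D11Device* pDevice,
    const D3D11_INPUT_ELEMENT_DESC* pInputElementDescs,
    UINT NumElements,
    const void* pShaderBytecodeWithInputSignature,
    SIZE_T BytecodeLength,
    ID3D11InputLayout** ppInputLayout)
{
    for (UINT i = 0; i < NumElements; ++i)
    {
        // SemanticName is a plain C string; Format is a DXGI_FORMAT enum.
        std::cout << pInputElementDescs[i].SemanticName << " "
                  << pInputElementDescs[i].Format << std::endl;
    }
    return fpRealCreateInputLayout(pDevice, pInputElementDescs, NumElements,
                                   pShaderBytecodeWithInputSignature,
                                   BytecodeLength, ppInputLayout);
}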


Those are great ideas! Especially that second one. I went ahead and hooked into CreateInputLayout().

I tried the following:

std::cout << pInputElementDescs->SemanticName << std::endl;

std::cout << pInputElementDescs[0].SemanticName << std::endl;

Both cases printed the following text, which makes no sense:

ï Uï∞ u ï u?â└îPΦ⌐▌≥ ]┬

ï Uï∞ïE?Vï÷@u ╞@| âΦÇPìå¼j

The other arguments from CreateInputLayout() also return values that don't make sense based on what the docs say.

I was going to try printing like so:

std::cout << pInputElementDescs[0] << std::endl;

But cout doesn't know how to print that type. A separate issue, I suppose.
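(For what it's worth, there's no operator<< overload for D3D11_INPUT_ELEMENT_DESC, so printing the members individually seems to be the way to go - a hypothetical sketch:)

const D3D11_INPUT_ELEMENT_DESC& e = pInputElementDescs[0];
std::cout << e.SemanticName << " index=" << e.SemanticIndex
          << " format=" << e.Format << " offset=" << e.AlignedByteOffset
          << std::endl;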

Any thoughts on why that might be?

You know what… my hook wasn't working; I had the wrong function offset, so that garbage output makes total sense. I'm all set now. Thanks again for your help.

