I am writing a C# wrapper for a static C++ library, and I am not familiar with C++. In an unsafe context, a pointer to a byte array is returned through a parameter of type void*. How do I get this array into C# for further processing? (It is text that needs to be converted from ANSI to Unicode.) Thanks in advance.

Prototype:

MPFUN int MPAPI SignBufferEx(void **out_buf, int *out_len); 

The call in a working C++ program:

 void* buf = NULL;
 SignBufferEx(&buf, &ln);

The call in my C# code:

 [DllImport(pathdll)] public static unsafe extern int SignBufferEx(void** out_buf, int* out_len); 

I added this code, but it does not return what I expected :(

 var buffor = new byte[ln];
 var pBuffor = (byte*) buf;
 for (var i = 0; i < ln; i++)
 {
     buffor[i] = *(pBuffor + i);
 }
 Encoding textEnc = new UnicodeEncoding();
 Console.Out.WriteLine("textEnc.GetString(buf) = {0}", textEnc.GetString(buffor));
 sign = textEnc.GetString(buffor); 

Closed as off-topic by Abyx, aleksandr barakin, user194374, LEQADA, PashaPash Jan 27 '16 at 8:58.

This question does not appear to fit the subject of the site. Those who voted to close it gave the following reason:

  • "This question was caused by a problem that can no longer be reproduced, or by a simple typographical error. While similar questions may be on-topic here, this one is unlikely to help future visitors. You can usually avoid such questions by writing and researching a minimal program that reproduces the problem before posting." – Abyx, aleksandr barakin, Community Spirit, LEQADA, PashaPash
If the question can be reworded to fit the rules set out in the help center, edit it.

  • Add to the question the prototype of the function that returns the pointer to the array; right now it is not entirely clear what exactly is returned. – Nicolas Chabanovsky
  • When converting an array of bytes (if you mean a string, i.e. char* in C++) to Unicode (which in C# is probably 2 bytes per character), you need to take the locale into account. The array itself is just consecutive bytes: take them one at a time and convert each to two bytes. – avp
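avp's comment points at the actual bug in the question's snippet: the buffer holds single-byte ANSI characters, but UnicodeEncoding decodes it as UTF-16, so every pair of bytes is fused into one wrong character. Below is a minimal self-contained sketch of the correct byte-copy-and-decode step; since the real SignBufferEx library is not available here, the unmanaged buffer is simulated with Marshal.StringToHGlobalAnsi, and Encoding.Default stands in for whatever ANSI code page the library actually uses:

```csharp
using System;
using System.Runtime.InteropServices;
using System.Text;

class AnsiBufferDemo
{
    static void Main()
    {
        // Simulated stand-in for the pointer SignBufferEx would return:
        // an unmanaged buffer of single-byte ANSI characters.
        IntPtr nativeBuf = Marshal.StringToHGlobalAnsi("signed data");
        int len = 11; // byte length of the ANSI text above

        try
        {
            // Copy the unmanaged bytes into a managed array in one call
            // instead of a manual byte* loop.
            var managed = new byte[len];
            Marshal.Copy(nativeBuf, managed, 0, len);

            // Decode with an ANSI-compatible encoding. UnicodeEncoding
            // (UTF-16) would misread these single-byte characters.
            string text = Encoding.Default.GetString(managed);
            Console.WriteLine(text);
        }
        finally
        {
            // We own this simulated buffer, so we free it; who frees the
            // real library's buffer depends on the library's contract.
            Marshal.FreeHGlobal(nativeBuf);
        }
    }
}
```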

2 Answers

Try using ref IntPtr out_buf instead of void** out_buf in the function declaration:

 [DllImport(pathdll)] public static extern int SignBufferEx(ref IntPtr out_buf, ref int out_len); 

then declare the output variables and call the function:

 IntPtr out_buf = IntPtr.Zero; int out_len = 0; int result = SignBufferEx(ref out_buf, ref out_len); 

and, with the address of the string now in out_buf, convert it:

 string out_str = Marshal.PtrToStringAnsi(out_buf); 

    My code is working now. I also had to convert to Base64 before converting to Unicode, but that is another story. Thanks, everyone!
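    For anyone following up: a signing function typically returns raw binary bytes, and encoding them to Base64 first turns them into plain ASCII text that survives any later string handling. A short sketch of that step, with made-up sample bytes rather than the library's actual output:

```csharp
using System;

class Base64Demo
{
    static void Main()
    {
        // Hypothetical raw signature bytes, standing in for what
        // SignBufferEx might place in the output buffer.
        byte[] raw = { 0x01, 0x02, 0xFF, 0x10 };

        // Base64-encode the binary data; the result is safe ASCII text,
        // so no ANSI/Unicode code-page issues can corrupt it.
        string b64 = Convert.ToBase64String(raw);
        Console.WriteLine(b64); // prints "AQL/EA=="
    }
}
```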