In my Visual C++ project I want to convert a CString object to a char*&. How can I do that?
The project is built with UNICODE support.
Comments
nv3 12-Mar-13 5:50am    
Why do you need exactly a "char*&" ? If you explain what you are really trying to do, it will be easier for us to help you.
Programmady 12-Mar-13 5:55am    
Actually, this argument is required by a function. That function is written in another module.

Ok, so a function you call wants an argument of type char*&. That probably means it takes a character string in the form of a char* as input, but wants the freedom to allocate a new buffer and return it via this parameter (hence the reference).

Unfortunately, the function wants a char*& and not a const char*&. That means the function also reserves the right to modify the string you pass in. Consequently, you must allocate a new buffer for that string; you cannot use the internal buffer of your CString object.

Here is what I would do:
C++
CString myString ("abcdef");
...
char* pBuffer = new char[myString.GetLength() + 1]; // +1 for the terminating NUL
strcpy (pBuffer, myString); // note: compiles like this only in a non-Unicode build
char* pArg = pBuffer; // copy pBuffer, because OtherFunction might
                      // return a different buffer in pArg and would
                      // overwrite our pBuffer
OtherFunction (pArg);
...
... look at what OtherFunction has returned via pArg
...
delete [] pBuffer;
...
... possibly you will have to delete [] pArg too, depending on the
... interface of OtherFunction.


As you can see, this is an extremely ugly and dangerous interface, and I would try to avoid such constructs whenever possible.
 
Comments
Eugen Podsypalnikov 12-Mar-13 7:14am    
The UNICODE version of the solution may not copy all characters... :)
Ian A Davidson 12-Mar-13 7:43am    
Best answer so far. Just note, as Eugen has commented, that if you are compiling in Unicode you should use a wide-character conversion function, e.g. "WideCharToMultiByte", instead of strcpy, and that in doing so you cannot rely on "GetLength" to give you the size required for your buffer. A good starting point in that case is to assume that you need twice the length and allocate a buffer accordingly, then check for ERROR_INSUFFICIENT_BUFFER, resize accordingly, and loop until the string is successfully converted.
Regards, Ian.
nv3 12-Mar-13 7:49am    
Thanks, Ian, for that valuable hint. I forgot that the application is in Unicode and that the OP needs an additional conversion step to multi-byte or even ASCII.
Ian A Davidson 12-Mar-13 7:57am    
Oops. Yes, I see now that Programmady DID say it is in Unicode. I've edited my embarrassing comment! :) Another useful hint: if your code does or can use the STL, allocate the buffer using a vector of chars (std::vector&lt;char&gt;); then you can let it handle the memory management and don't need the delete. Then take the reference to the first char in the buffer as you would for a char array: "char* pArg = &buffer[0];", where buffer is the vector. Ian.
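The std::vector idea could look like the sketch below. OtherFunction and MakeBuffer are hypothetical stand-ins here: one for the external function taking char*&, one for the copy step from the converted string.

```cpp
#include <cstring>
#include <vector>

// Hypothetical stand-in for the external function taking char*&.
void OtherFunction(char*& arg)
{
    arg[0] = 'X'; // the callee may modify the buffer in place
}

// Build a modifiable, NUL-terminated copy of a narrow string in a
// vector; the vector owns the memory, so no explicit delete is needed.
std::vector<char> MakeBuffer(const char* converted)
{
    return std::vector<char>(converted, converted + std::strlen(converted) + 1);
}
```

Usage would then be: `std::vector<char> buffer = MakeBuffer(...); char* pArg = &buffer[0]; OtherFunction(pArg);`. Even if the callee rebinds pArg to its own allocation, the vector still frees the original copy automatically.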
The "on-stack only" code :) :
C++
{
  CString cszTest(_T("abc"));

#ifdef _UNICODE
  USES_CONVERSION;
  const char* pCh1(T2A(cszTest));
#else
  const char* pCh1(cszTest);
#endif
  const char*& pCh2(pCh1);

  //..
}


C++
// Actually, this argument is required by a function. That function is written in another module.


The other module may allocate the pointer itself :), so
please read its documentation and then check this variant:
C++
{
  char* pChToBeAllocatedAndFilled(NULL);
  pModul->Call(pChToBeAllocatedAndFilled);

  CString cszResult(pChToBeAllocatedAndFilled); // convert it to UNICODE string

  // Warning !
  // Now we must know how the pointer has been allocated:

  //free(pChToBeAllocatedAndFilled);     // or
  //delete [] pChToBeAllocatedAndFilled; // or
  delete pChToBeAllocatedAndFilled;      // whichever matches the module's allocation
}
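The ownership warning above could look like this in practice. ModuleCall is a hypothetical stand-in for pModul->Call; the real module's documentation must say which allocator it uses.

```cpp
#include <cstdlib>
#include <cstring>

// Hypothetical stand-in for pModul->Call: allocates the buffer itself
// with malloc and hands ownership back through the reference.
void ModuleCall(char*& out)
{
    out = static_cast<char*>(std::malloc(4));
    std::strcpy(out, "abc");
}

void Caller()
{
    char* pChToBeAllocatedAndFilled = NULL;
    ModuleCall(pChToBeAllocatedAndFilled);
    // ... use the string ...
    std::free(pChToBeAllocatedAndFilled); // must match the allocator:
                                          // malloc -> free, new[] -> delete[]
}
```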
 
Comments
nv3 12-Mar-13 7:48am    
Nice solution to use T2A, which is worth mentioning in its own right. +5
Ian A Davidson 12-Mar-13 10:44am    
Not convinced. The code that T2A uses looks horrendous. Granted, it uses WideCharToMultiByte, but (in the code that comes with Visual Studio 9) it also uses _alloca to allocate the buffer. This function is deprecated because it's not secure. I guess MS might have updated their libraries from the one I'm looking at, but I doubt it somehow; I imagine they are more focused on .NET these days. If you read the information about this function, there are all kinds of issues around exception handling.
Secondly, T2A calculates the buffer length required assuming that every 2-byte character in the Unicode string will fit into 1 character of the multi-byte string (which seems to me to be a contradiction in terms). If the string cannot be converted to fit in the buffer, the function simply fails. There is no check for overrun to reallocate the buffer accordingly. In fact, there's even a comment in MS's code to this effect!
I think you'd be better off calling WideCharToMultiByte yourself in a loop, checking that the converted string fits in the allocated buffer and reallocating accordingly.
Regards,
Ian.
nv3 12-Mar-13 12:08pm    
The implementation of T2A tries to avoid a heap allocation for the output buffer, as that would imply a relatively costly thread sync. Instead it uses a kind of dirty trick to put that buffer on the stack; that's what _alloca is all about. Microsoft uses this macro in its own code, so I was assuming that it's safe enough. But I have to admit that I haven't used it often enough to be a good witness.

As far as I can see in the code, they reserve 2*(n+1) bytes for the output buffer, where n is the number of characters in the source string. So that should cover even the worst case.

I find Eugen's solution very elegant and think it's nicer than my code. Nevertheless, if I had to do this in only one place, I would agree with you, walk the safe way, and use WideCharToMultiByte, which is easy enough to use.
Hi,

You don't need to worry about the "*&"; it means the data behind the given pointer may be modified inside that function. Just convert the CString to char* and pass it to that function. But all of this depends on the "another module" implementation.
 
Comments
Ian A Davidson 12-Mar-13 7:49am    
You do need to worry about *&. Because you're passing the pointer by reference, you can't just take the address of a buffer element (e.g. "char buffer[50]; function(&buffer[0]);" will not compile!). Also, you forget that CString's conversion operator, operator LPCTSTR, returns a const pointer, so you'd have to cast away the const as well, which should only be done when absolutely necessary and is not a good idea when passing to an external function (and this is ignoring the fact that the project is being compiled in Unicode, where LPCTSTR is a const wchar_t*). So if you weren't compiling in Unicode you COULD do something like "CString myString("Hello World"); char* p = (char*)(LPCTSTR)myString; function(p);", but you wouldn't. I hope! Regards, Ian.
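The pointer-lvalue point can be demonstrated in a few lines. Here `function` is a hypothetical stand-in for the external routine from the question, not a real API.

```cpp
#include <cstring>

static char storage[] = "rebound";

// Hypothetical stand-in for the external routine taking char*&;
// the reference lets it rebind the caller's pointer.
void function(char*& p)
{
    p = storage;
}

void Demo()
{
    char buffer[50] = "original";
    // function(&buffer[0]);  // will not compile: &buffer[0] is an rvalue,
    //                        // but char*& needs a modifiable pointer lvalue
    char* p = buffer;         // a named pointer variable works
    function(p);              // p now points at storage
}
```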

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)