Dear fellow devs,

I'm working on converting some C++ code into C#. To be honest, I'm not very fluent in C++, so maybe I'm missing something simple.

Anyway, I have this unsigned char array of 64 elements called buffer, with the following contents:

[0] = 'C'
[1] = 'P'
[2] = '0'
[3] = '4'
...
[12] = '€' (the byte 0x80, which the debugger shows as '€')

The rest of the elements are all \0 (not '0').

Then this unsigned char array is cast to an unsigned long array (wrapped in a struct) as follows:

C++
typedef struct _SHA_MSG {
    unsigned long W[80];
} sha1_key;

unsigned char buffer[64]={0};

buffer[0] = 'C';
buffer[1] = 'P';
buffer[2] = '0';
buffer[3] = 4 + '0';
buffer[12] = 0x80;

sha1_key* k = (sha1_key*)buffer;


However at this point, things change drastically! For example, k->W contains:

[0] = 875581507
[3] = 128
[16] = 3435973836
[17] = 2641335238
[18] = 3865956
[19] = 18366288
[20] = 0
[21] = 0
[22] = 2130567168
[23] = 3435973836

etc., with most of the remaining elements having the value 0 or 3435973836.

What has just happened? Can someone kindly suggest a C# equivalent of this operation that yields the same results (ideally without relying on IntPtr)?

Thanks a lot!

Edit:

My C# code of this part is:

C#
struct sha1_key
{
    uint[] w;

    public uint[] W
    {
        get { return w; }
        set { w = value; }
    }
}

sha1_key key = new sha1_key();
key.W[0] = 'C';
key.W[1] = 'P';
key.W[2] = '0';
key.W[3] = (char)(year + '0');

key.W[12] = (char)0x80;


As you can see, I have so far retained much of the code as it is in C++. I will improve upon it later, when I manage to get it working :)
Comments
Philippe Mori 2-Sep-13 12:21pm    
Is there any reason that in the C++ code the array is interpreted as an array of unsigned long but filled as an array of unsigned char? Maybe the best approach is to fix the C++ code first and avoid that conversion by using the appropriate type.
Trapper-Hell 2-Sep-13 13:58pm    
Hi Philippe,

Thanks for your comment. To be honest, I have no idea why it's declared as unsigned long but filled with chars instead. However, it produces the required results (even if, code-wise, it wasn't created through the best approach). No point in fixing the C++ code now, since the idea is to implement it in C#.

In C++, the unsigned char type is a single byte. If you cast a pointer to an array of unsigned char to a pointer to an array of unsigned long, you are viewing the same memory as if it were organized into 32-bit words instead of single bytes.

k->W[0] contains 875581507, which is 0x34305043 in hexadecimal. On a little-endian machine the least significant byte comes first in memory, so starting from the [0]th byte of the original unsigned char array you have 0x43 ('C'), 0x50 ('P'), 0x30 ('0'), and 0x34 ('4'). Likewise k->W[3] covers bytes 12-15, giving 0x00000080 = 128. Elements 16 and up read past the end of the 64-byte buffer: 3435973836 is 0xCCCCCCCC, the pattern MSVC debug builds use to fill uninitialized stack memory, which is why it shows up so often there.
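
In C#, one way to get the same view of the data without IntPtr or unsafe code is to copy the bytes into a uint array. Here is a minimal sketch (assuming a little-endian machine, which matches the C++ output above; the buffer and W names follow the question):

C#
// Build the 64-byte buffer as in the C++ code.
byte[] buffer = new byte[64];
buffer[0] = (byte)'C';
buffer[1] = (byte)'P';
buffer[2] = (byte)'0';
buffer[3] = (byte)('0' + 4);
buffer[12] = 0x80;

// Reinterpret the 64 bytes as 32-bit words, like the C++ pointer cast.
uint[] W = new uint[80];
Buffer.BlockCopy(buffer, 0, W, 0, 64);
// W[0] == 875581507 (0x34305043) and W[3] == 128; W[16..79] stay 0
// here instead of picking up leftover memory as the C++ version does.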
 
Comments
Trapper-Hell 2-Sep-13 8:50am    
Thank you for your prompt response!

So this means that the 0th element contains the original chars appended (as hex)? What about the rest of the elements?

How can I achieve something similar in C# ?

I have edited my question to reflect my C# code so far.

Thanks :)
[no name] 2-Sep-13 9:14am    
Your comment does not really make sense so far. Do you mean the C++ code? What exactly are you trying to do? Is this in C# or C++? Is the answer so far what you expected? Remember that in C# chars are Unicode (2 bytes).
Trapper-Hell 2-Sep-13 9:20am    
I'm trying to port the original C++ code to C#. The code shown in my question is mostly in C++ (only after the Edit: tag is my C# code).

The answer helps answer one of my queries, but I think it is not sufficient for me to solve the problem I'm encountering.

Thanks for pointing out that in C# chars are Unicode, but what changes shall I make then to match the C++ functionality?
For ASCII characters:

C#
byte[] bytes = { (byte)'C', (byte)'P', (byte)'0', (byte)('0' + 4) };

UInt32 i = BitConverter.ToUInt32(bytes, 0);



http://msdn.microsoft.com/en-us/library/system.bitconverter.touint32.aspx[^]
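
Extending that, a sketch for filling the whole word array from the 64-byte buffer might look like this (hypothetical code, not from the original project; BitConverter uses the machine's byte order, so on a little-endian machine it matches the C++ cast):

C#
byte[] buffer = new byte[64];
buffer[0] = (byte)'C';
buffer[1] = (byte)'P';
buffer[2] = (byte)'0';
buffer[3] = (byte)('0' + 4);
buffer[12] = 0x80;

// Convert each 4-byte group into one 32-bit word, mirroring the C++ cast.
uint[] W = new uint[80];
for (int i = 0; i < 16; i++)
{
    W[i] = BitConverter.ToUInt32(buffer, i * 4);  // W[0] == 875581507, W[3] == 128
}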
 
Comments
Trapper-Hell 2-Sep-13 9:50am    
Wow! Thanks pwasser. As for the 1st element, it works exactly as I wanted it to!

But what has happened to the rest of the elements in the array? For example buffer[3] = '4' (byte 52), but k.W[3] = 128 etc.
[no name] 2-Sep-13 9:55am    
Have you looked at the provided link?
Trapper-Hell 2-Sep-13 10:17am    
Yes I have. Basically the method (and the link you provided) seem to convert four bytes as a 32-bit unsigned integer.

And this works well for setting the value of k.W[0] = 875581507. It also works for k.W[3], which gets the value 128 from BitConverter.ToUInt32(buffer, 12).

But what about k.W index 16+ ? What values are they getting?

- Worth noting that when the buffer gets cast to type sha1_key, the k.W unsigned long array is 80 elements long.
[no name] 2-Sep-13 18:00pm    
Your (new) example is nonsense. The char array buffer is 64 bytes. That is enough for exactly 16, and only 16, four-byte integers. k.W[16] and beyond are past the end of the 64-byte array and could contain anything. This is the power and the danger of pointers. To allow for W[80] you would need to allocate 80 * 4 = 320 bytes for buffer.
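
In the C# port, that sizing could look like this (a hypothetical sketch, combining the point above with the BitConverter approach from the answer):

C#
// Size the buffer to cover all 80 words (80 * 4 = 320 bytes), so every
// W[i] is defined instead of reading leftover memory past the buffer.
byte[] buffer = new byte[320];
buffer[0] = (byte)'C';
buffer[1] = (byte)'P';
buffer[2] = (byte)'0';
buffer[3] = (byte)('0' + 4);
buffer[12] = 0x80;

uint[] W = new uint[80];
for (int i = 0; i < 80; i++)
{
    W[i] = BitConverter.ToUInt32(buffer, i * 4);
}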
Trapper-Hell 2-Sep-13 18:08pm    
I don't know whether this is a flaw in the current implementation then, but as I understand it, the end result is correct. What this is all about is deriving a different type of key from a supplied key - so I really don't know if all of this works coincidentally or by design, but I'm trying to achieve the same functionality (i.e. result-wise).

As I understand it, k.W can hold 80 elements because the struct sha1_key is declared to contain an array of size 80. However, as to how those elements get their values, I really don't know...



