|
Hi, can you please help me convert a string into Unicode bytes?
I do it like this:
CString Hex(long val)
{
    CString Buf;
    Buf.Format(_T("%X"), val);
    return Buf;
}
int Asc(const CString &cs)
{
    unsigned char ch = cs[0];
    return ch;
}
int AscW(const CString &cs)
{
    // Numeric value of the first (wide) character.
    return (int)cs[0];
}
CString EncodeUserData_16_bit(CString strUserData)
{
    CString strEncoded;
    for (int i = 0; i < strUserData.GetLength(); i++)
    {
        CString c = strUserData.Mid(i, 1);
        CString hex_string = Hex(AscW(c));
        if (hex_string.GetLength() < 4)
        {
            hex_string = PadZero(hex_string, 4);
        }
        strEncoded += hex_string;
    }
    return strEncoded;
}
It converts the string into Unicode bytes; the only problem is that
when I convert "hello" it returns "00680065006C006C006F", but this result causes a mismatch at the end of the process.
If the value were "680065006C006C006F00" instead of "00680065006C006C006F", the process would finish successfully.
If you can guide me on how to solve this, I would be very grateful.
Thanks in advance.
|
|
|
|
|
Try something like ...
CString EncodeUserData_16_bit(CString strUserData)
{
    CString strEncoded;
    for (int i = 0; i < strUserData.GetLength(); ++i)
    {
        CString unext;
        unsigned int chnext = strUserData[i];
        unext.Format(_T("%.4X"), chnext);
        strEncoded += unext;
    }
    return strEncoded;
}
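Note that both versions emit each UTF-16 code unit high byte first ("0068..."), i.e. big-endian. The value the receiving process apparently expects ("6800...") is the same data with the two bytes of each code unit swapped, i.e. little-endian. A plain-C++ sketch of the little-endian variant (using std::wstring instead of CString; the function name is made up for illustration):

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Encode a wide string as hex, low byte of each 16-bit code unit first
// (little-endian), matching the "680065..." ordering the process expects.
std::string EncodeUtf16LE(const std::wstring& s)
{
    std::string out;
    char buf[5];  // 4 hex digits + terminator per code unit
    for (wchar_t wc : s)
    {
        unsigned int ch = static_cast<unsigned int>(wc) & 0xFFFFu;
        // Low byte first, then high byte.
        std::snprintf(buf, sizeof(buf), "%02X%02X", ch & 0xFFu, (ch >> 8) & 0xFFu);
        out += buf;
    }
    return out;
}
```

In the CString version the same effect can be had by formatting `(ch & 0xFF)` and `(ch >> 8)` separately instead of one `%.4X`.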
Veni, vidi, abiit domum
|
|
|
|
|
This returns the same value as my function...
|
|
|
|
|
Well it works fine for me. Maybe you should explain why you think the result is not correct.
Veni, vidi, abiit domum
|
|
|
|
|
Does anybody know how to program a PIC micro in C to detect a gradient? I have set up the ADC pins and each LDR is detecting light. I just need to arrange that a specific gradient of light detected by an LDR will switch a light on.
|
|
|
|
|
Quote: gradient
Do you mean "range of frequencies" (or "range of wavelengths")?
Programming the PIC is relatively simple, I believe. You have first to:
- Be sure your light detector is sensitive to such frequency range.
- Filter in the light of such frequency range.
Once such conditions are satisfied you'll get a high value on the ADC measure when the 'gradient of light' is detected.
Veni, vidi, vici.
|
|
|
|
|
I am new to programming, especially with PICs. I am measuring the voltage reading from each LDR; it looks like a capacitor charging/discharging. I have an idea of what I have to do (detect min and max values, calculate the gradient, etc.); it's just a matter of actually putting it in code.
|
|
|
|
|
Senned wrote: I have an idea of what I have to do detect min and max values calculating the gradient etc Please provide an example.
BTW what PIC family are you using?
Veni, vidi, vici.
|
|
|
|
|
For instance, in an ideal world the LDR measures 5V when it is dark and decreases to 0V in light. My PIC is constantly scanning, so I will probably need three variables each for min and max: previous min, current min and main min; previous max, current max and main max. It stores the previous value, then takes a new reading; if the current min is less than the previous min, it replaces the main min, and likewise if the current max is greater than the previous max, it replaces the main max. With these values I can calculate my gradient and determine whether it is detecting light from a certain light source or just background light. I am using a PIC24F16KL402.
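The min/max bookkeeping described above can be sketched in plain C++ (the struct and function names are illustrative, not PIC-specific):

```cpp
#include <cassert>

// Running min/max tracker for successive ADC readings.
struct LdrTracker
{
    int minVal;
    int maxVal;
    bool seeded;  // false until the first reading arrives
};

void trackerInit(LdrTracker* t)
{
    t->seeded = false;
    t->minVal = 0;
    t->maxVal = 0;
}

void trackerUpdate(LdrTracker* t, int reading)
{
    if (!t->seeded)
    {
        t->minVal = t->maxVal = reading;  // first sample seeds both
        t->seeded = true;
        return;
    }
    if (reading < t->minVal) t->minVal = reading;  // new main min
    if (reading > t->maxVal) t->maxVal = reading;  // new main max
}

// The "gradient" taken here as the swing between the darkest and
// brightest reading seen so far; compare it against a threshold
// to decide whether a real light source (not background) was seen.
int trackerSwing(const LdrTracker* t)
{
    return t->maxVal - t->minVal;
}
```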
|
|
|
|
|
Since you can write your application in C, it shouldn't be difficult to implement such logic. Or are you having trouble dealing with the ADC?
Veni, vidi, vici.
|
|
|
|
|
Yeah, that proved to be a problem when I tried it. This PIC has only 2 buffers and I have usually had more to work with in the past (assigning 1 LDR per buffer). Do you have any idea how to assign more than 1 LDR per buffer? I could send you my code if you like.
|
|
|
|
|
Quote: Do you have any idea how to assign more than 1 LDR per buffer?
That is not a problem, actually. Just perform the process sequentially on the LDRs, e.g.
- Sample and convert LDR1, take the result from ADC1BUF0, store it in a variable, say ldr1.
- Sample and convert LDR2, take the result from the same buffer ADC1BUF0, store it in the variable ldr2.
- ...
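The sequential scheme can be sketched with the hardware access mocked out (on a real PIC24 the read would select the channel, start the conversion, wait for the done flag, and read ADC1BUF0; here readAdc() and its canned values are made up so the flow can be shown off-target):

```cpp
#include <cassert>

// Canned readings standing in for real conversions (10-bit ADC counts).
static const int fakeAdcValues[2] = { 341, 702 };

// Stand-in for the real PIC24 sequence: select the channel, start
// sampling, wait for the conversion-done flag, then read ADC1BUF0.
int readAdc(int channel)
{
    return fakeAdcValues[channel];
}

// One scan: both LDRs are read through the same (single) buffer, one
// after the other, each result stored in its own variable.
void sampleLdrs(int* ldr1, int* ldr2)
{
    *ldr1 = readAdc(0);  // LDR1, e.g. on AN0
    *ldr2 = readAdc(1);  // LDR2, e.g. on AN1, same buffer reused
}
```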
Veni, vidi, vici.
|
|
|
|
|
How do I read the index.dat file written by the dcmqrscp application, and what is its record format?
Is there any example available for reading it and getting the name of the file that was moved?
Thanks
Andrea
|
|
|
|
|
|
Thanks for the hint.
I had a look at the links,
but it is still not clear how to read index.dat to get the names of the files retrieved. Do you know where I can find some example?
Andrea
|
|
|
|
|
Member 10174363 wrote: Do you know where i can Get some example? Google is the only place for questions like this. I have had another look but could not find anything that may be relevant. I would suggest you try http://medical.nema.org/dicom/[^].
Veni, vidi, abiit domum
|
|
|
|
|
I understand Visual Studio 2012 and 2013 use Microsoft Foundation Classes version 11, but only in the more professional editions like Ultimate.
Is it true that Microsoft is still using MFC for its own products, such as Office?
What should one conclude from the fact that sources based on MFC 6 still build without any errors in VS2012? What other libraries or templates can offer the same?
Many people feel insecure about choosing a library to work with; after some years the same questions appear again.
Using another library makes sources completely unrecognizable to other users.
Is it still wise to use MFC? Is it easy to make connections to code based on the STL? Is it a wise choice when you want to write managed code where possible; in other words, can it be coupled nicely to .NET code?
I tried to find a consistent policy on MSDN. More and more people within Microsoft have never even heard of MFC.
Finding out the latest version took me some research:
http://msdn.microsoft.com/en-us/library/d06h2x6e.aspx[^]
modified 16-Jan-14 19:30pm.
|
|
|
|
|
Is it still wise? ... Sure, why not? You're always going to have some amount of risk when using a framework, but given the alternative of developing everything from scratch, the benefits outweigh the risks.
Considering MFC is essentially a class-based wrapper around the Win32 API (with a lot of helper classes), I doubt it's going to disappear any time soon. It has a long history and I just don't see it ending soon. As for which editions of Visual Studio include MFC, it's usually the Professional editions (and up); that's probably because part of the cost of those editions funds the maintenance of the libraries. Usually every version of Visual Studio also comes with a new version of MFC (although I'm not sure that's always been the case).
If you have a completely new project, you could always opt for C# and the .NET Framework, or Java and the JRE, but if you already have quite a bit of experience with MFC versus the alternatives, it's probably safe to stick with it for the time being.
|
|
|
|
|
You can easily find free or commercial alternatives to MFC; you don't need to hand-code anything yourself at all. That is not a valid reason to justify using MFC.
GOTOs are a bit like wire coat hangers: they tend to breed in the darkness, such that where there once were few, eventually there are many, and the program's architecture collapses beneath them. (Fran Poretto)
|
|
|
|
|
True, there are always open source alternatives like Qt and wxWidgets, but those also have their inherent risk. When the Qt project changed ownership, everyone was worried that they would start charging for it. So I guess when I said MFC vs. hand-coding a framework, I meant using an existing framework vs. hand-coding one. There's always a risk in using someone else's framework. Big companies like L3 even make their own frameworks so they won't be reliant on anybody else.
|
|
|
|
|
Quote: Is it truth that Microsoft is still using MFC for its own products such as Office?
Did they ever do that?
Veni, vidi, vici.
|
|
|
|
|
Personally I've been avoiding MFC wherever I could for at least the last 10 years. That said, I'm probably not the best person to ask about its current state in version 11.
I do know, however, that MFC never changed its abysmal design of event-handling functions; I'll just say two words: LPARAM and WPARAM. The need to convert, and sometimes split up and reinterpret, parts of these event parameters is assembler-level coding: it's type-unsafe, prone to errors and misunderstanding, requires a thorough understanding to do right, the resulting code is difficult to maintain, and it is likely to break when you switch from 32-bit to 64-bit.
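To illustrate the kind of packing being complained about: a mouse message packs two 16-bit coordinates into one LPARAM, and the handler must split them back apart. A minimal plain-C++ imitation (LParamDemo, packPoint and the unpack helpers are made-up stand-ins for the real LPARAM and the GET_X_LPARAM/GET_Y_LPARAM macros):

```cpp
#include <cassert>
#include <cstdint>

// Stand-in for Win32's pointer-sized LPARAM.
typedef std::intptr_t LParamDemo;

LParamDemo packPoint(short x, short y)
{
    // Low word carries x, high word carries y (as in mouse messages).
    return (static_cast<LParamDemo>(static_cast<std::uint16_t>(y)) << 16) |
           static_cast<std::uint16_t>(x);
}

// The handler-side unpacking; getting the sign extension wrong here
// (e.g. using LOWORD/HIWORD directly) is a classic multi-monitor bug.
short unpackX(LParamDemo lp) { return static_cast<short>(lp & 0xFFFF); }
short unpackY(LParamDemo lp) { return static_cast<short>((lp >> 16) & 0xFFFF); }
```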
There are both free and commercial frameworks available if you're looking for an alternative; Qt and CodeJock's Xtreme ToolkitPro come to mind, but you can easily find more on the web.
GOTOs are a bit like wire coat hangers: they tend to breed in the darkness, such that where there once were few, eventually there are many, and the program's architecture collapses beneath them. (Fran Poretto)
|
|
|
|
|
Assembler-level coding? That's a bit of an exaggeration; it's C-style coding. As for 32- vs. 64-bit, well, a lot of things break when that happens, so you're not likely to find a universal answer there anyway.
|
|
|
|
|
I think that is a very useful reaction. Just because of the flexibility to use ANSI and Unicode, and to prepare for 32-bit and 64-bit side by side, it turned out to be wise to take MFC (and ATL). I understand MFC and ATL are more or less integrated and coupled now.
Being critical of the role of Microsoft is wise too. Meanwhile (2018) the version of MFC is 14;
newer versions of Visual Studio aren't linked to new versions of MFC anymore.
But ... I hate to say it ... sometimes the one you criticize (or who criticizes you) ... is right.
For example, the preparation for Unicode wasn't understood by a lot of MFC users (for a long time including me). I'm sure it even gave MFC a bad reputation. What the hell is up with CString, TCHAR, LPCTSTR, LPTSTR, TEXT, etc.?
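That TCHAR machinery boils down to one compile-time switch. A minimal imitation (the demo names here are made up; the real ones are TCHAR and _T()/TEXT() from <tchar.h>, driven by the _UNICODE macro):

```cpp
#include <cassert>

// One source line compiles to either a narrow or a wide string,
// depending on a single preprocessor symbol.
#if defined(DEMO_UNICODE)
typedef wchar_t DemoChar;
#define DEMO_T(s) L##s
#else
typedef char DemoChar;
#define DEMO_T(s) s
#endif

// Narrow "hello" unless DEMO_UNICODE is defined, then L"hello".
const DemoChar* greeting = DEMO_T("hello");
```

This is why MFC code is full of CString, LPCTSTR and _T(): each is the character-width-neutral spelling of a string, a const pointer, and a literal.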
The required level of understanding of the preprocessor is high, so even more people will drop out as fans of MFC. My own understanding of the preprocessor is very modest.
Some quick testing showed that MFC can still be used in 64-bit builds. The CDAO... classes might be an exception, although Microsoft's communication on DAO in general was like a large dragon with two heads (or even more) speaking with two mouths, and thus speaking the truth always ... or never. I didn't test it, but perhaps those CDAO... classes can be used in 64-bit code as well. Let's be aware that 64-bit compilers are often 32-bit programs themselves, and the IDE for 64-bit development is often 32-bit. Real programmers have to laugh at requests for 64-bit software made out of the belief that it will surely be faster. Even Microsoft again turns out to be that multi-mouthed dragon: some people hope to sell us new 64-bit software ... because of ...
In my humble opinion MFC can be used seamlessly alongside raw Win32 code. Many may smile now; it comes close to the argument against MFC that it is just a thin wrapper around Win32. It can also be used alongside managed .NET Framework code, although I have been bracing for disappointments there for years. For many developers that sounded like crazy sadomasochism.
How many developers understand (and use!) the concept of deploying debug builds with runtime support and surveillance tools on site with the customers? Do developers have to prepare for that before compile time? Do developers have to prepare for testing? As a developer, my ego tends to shrink and shrink.
Hearing experiences from others is still welcome. I still use MFC, but still ... critically ... I hope.
|
|
|
|