|
Don't have access to a PC right now?
|
|
|
|
|
of course
You know some birds are not meant to be caged, their feathers are just too bright.
|
|
|
|
|
sprintf(res, str, xman);
Here xman is an undeclared identifier.
|
|
|
|
|
suthakar56 wrote: here xman is undeclared identifier..
What do you expect? You did not declare that variable.
|
|
|
|
|
Tritva wrote: can we achieve this?
Have you tried?
"Old age is like a bank account. You withdraw later in life what you have deposited along the way." - Unknown
"Fireproof doesn't mean the fire will never come. It means when the fire comes that you will be able to withstand it." - Michael Simmons
|
|
|
|
|
In my application, I have created two list boxes in the view class. Initially only one list box should be displayed; after pressing a button in the menu, both the first and second list boxes should be displayed one below the other.
I do not know how to hide the second list box initially.
int CAlarmView::OnCreate(LPCREATESTRUCT lpCreateStruct)
{
pAlrmListBox->Create(WS_CHILD | WS_VISIBLE | LBS_STANDARD | WS_VSCROLL | LBS_OWNERDRAWVARIABLE, CRect(0, 0, 600, 400), this, ID_ALARMLIST);
...
pAlrmFilterBox->Create(WS_CHILD | WS_VISIBLE | LBS_STANDARD | WS_VSCROLL | LBS_OWNERDRAWVARIABLE, CRect(0, 0, 600, 200), this, ID_ALARMLIST); // note: this reuses ID_ALARMLIST; the second list box should get its own control ID
}
Anu
|
|
|
|
|
Remove the WS_VISIBLE flag from Create(...). Use the ShowWindow() API to make the list box visible later.
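As a sketch against the poster's code (not compiled here; ID_ALARMFILTER, the OnShowFilter handler name, and the rectangle coordinates are assumptions — adjust them to your project):

```cpp
// In CAlarmView::OnCreate: create the second list box WITHOUT WS_VISIBLE,
// so it starts out hidden.
pAlrmFilterBox->Create(WS_CHILD | LBS_STANDARD | WS_VSCROLL | LBS_OWNERDRAWVARIABLE,
                       CRect(0, 400, 600, 600), this, ID_ALARMFILTER);

// In the menu button's handler: reveal it below the first list box.
void CAlarmView::OnShowFilter()
{
    pAlrmFilterBox->ShowWindow(SW_SHOW);
}
```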
|
|
|
|
|
bool Message::unpack() {
struct tuple{
qint32 op;
union {
qint32 val;
quint32 uval;
};
};
changeByteOrder();
//no data?
if (data() == NULL)
return false;
//invalid header?
if (*(qint32 *)data() != VALID_DGM)
return false;
//skip header --- here is my problem, I need the source IP
tuple *pt = (tuple *)(data() + 4);
//reset active sensors
actSens = 0;
//read all sent values
for (int i = 4; i < size(); i+=sizeof(tuple)) {
switch (pt->op) {
case OP_TSLEEP:
tsleep = pt->uval;
break;
case OP_TRECV:
trecv = pt->uval;
break;
case OP_ID:
id = pt->uval;
break;
case OP_TEMP:
temp = pt->uval;
actSens |= SENSOR_LINE_TEMP;
break;
case OP_AI0:
ai0 = pt->val;
actSens |= SENSOR_LINE_A0;
break;
case OP_AI1:
ai1 = pt->val;
actSens |= SENSOR_LINE_A1;
break;
case OP_AI2:
ai2 = pt->val;
actSens |= SENSOR_LINE_A2;
break;
case OP_DIO0:
dio0 = pt->uval;
actSens |= SENSOR_LINE_DIO0;
break;
case OP_DIO1:
dio1 = pt->uval;
actSens |= SENSOR_LINE_DIO1;
break;
case OP_DIO2:
dio2 = pt->uval;
actSens |= SENSOR_LINE_DIO2;
break;
case OP_DIO3:
dio3 = pt->uval;
actSens |= SENSOR_LINE_DIO3;
break;
case OP_CURRENT:
current = pt->uval;
actSens |= SENSOR_LINE_CURRENT;
break;
case OP_VOLTAGE:
voltage = pt->uval;
actSens |= SENSOR_LINE_VOLTAGE;
break;
case OP_VBATT:
vbatt = pt->uval;
actSens |= SENSOR_LINE_VBATT;
break;
case OP_RSSI:
rssi = pt->val;
actSens |= SENSOR_LINE_RSSI;
break;
case OP_LAST_ACK:
lastAck = pt->uval;
break;
default:
//cout << "UNKNOWN VALUE" << endl;
break;
}
pt++;
}
return true;
}
I would like to not skip the header so I can get the source IP.
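Note that in unpack() the first four bytes are the VALID_DGM magic, so the skipped header does not itself carry the source IP; if the datagram arrives over a QUdpSocket, readDatagram() already reports the sender address without touching the payload. If instead your sender embeds the source address as an extra 32-bit field in the datagram (an assumption about your protocol layout), decoding it is straightforward. A minimal, portable sketch, with ipv4ToString as a hypothetical helper:

```cpp
#include <cstdio>
#include <string>

// Hypothetical helper: decode a 32-bit IPv4 address stored in network
// byte order (most significant octet first) into dotted-decimal text.
std::string ipv4ToString(const unsigned char *p)
{
    char buf[16];
    std::snprintf(buf, sizeof(buf), "%u.%u.%u.%u", p[0], p[1], p[2], p[3]);
    return std::string(buf);
}
```

In unpack() you would call it on the bytes at the assumed offset, e.g. `ipv4ToString((const unsigned char *)data() + 4)` if the address followed the magic, and advance `pt` past it accordingly.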
|
|
|
|
|
Hi,
I have an application that controls a hardware system including a camera. I have noticed (after studying the logs) that when the application is running, the system clock loses time gradually over an hour and then catches up the lost time in one stroke. This causes a big jump - 2 to 3 minutes - in the log times. To further validate my theory I ran perfmon.exe and configured it to capture CPU usage every 30 seconds. In the log generated by perfmon I notice the same jump in the time.
Another interesting fact is that the interval between 2 jumps is exactly 1 hour. I have tried shutting down unnecessary services/processes to reduce the number of processes that are running but the problem still persists.
I'd appreciate any feedback and/or solutions
|
|
|
|
|
Member 519651 wrote: I have an application that controls a hardware system including a camera.
Is your application blocking the interrupt system in any way, so preventing clock ticks from being seen?
Does this happen when your application is not running?
|
|
|
|
|
I don't handle the clock interrupt in my application, but maybe one of the drivers (camera) does. I wrote a sample app to display the values returned by GetSystemTimeAdjustment and found that the interval and increment values are 15.625 ms and that adjustment is disabled. The MSDN help says that when time adjustment is disabled, the clock will synchronize using "other mechanisms" and make a noticeable jump.
I am guessing that the system clock is synchronizing every hour to the RTC on the motherboard. The questions I have are:
1. Why is it syncing exactly every hour?
2. What can I do to make it sync say every 10 minutes or so?
I'd appreciate any thoughts/ideas on this matter.
|
|
|
|
|
Member 519651 wrote: 1. Why is it syncing exactly every hour?
No idea I'm afraid. I have never seen this problem, and Google does not seem to have any relevant hits. You may like to try some of the Microsoft forums to see if anyone has the answer. It may be something connected to your specific hardware configuration, in which case you need to talk to the manufacturer.
|
|
|
|
|
Member 519651 wrote: ...I ran perfmon.exe and configured it to capture CPU usage every 30 seconds. In the log generated by perfmon I notice the same jump in the time.
Which would indicate it has nothing to do with your code or this forum, correct?
"Old age is like a bank account. You withdraw later in life what you have deposited along the way." - Unknown
"Fireproof doesn't mean the fire will never come. It means when the fire comes that you will be able to withstand it." - Michael Simmons
|
|
|
|
|
It happens only when my application is running. I ran perfmon in parallel to my application to see if this is really happening and not some bug in the logger code.
|
|
|
|
|
Member 519651 wrote: It happens only when my application is running.
Hi,
Usermode applications should have no effect on the system clock. I would suspect that a device driver may be affecting the clock interrupt. In fact, what you are describing sounds somewhat normal: it is called clock crystal drift.
You should investigate the GetSystemTimeAdjustment Function[^] if you want to attempt to compensate for the clock drift.
If you need a higher-resolution clock, then you may need to purchase special hardware. It's as simple as that.
Anyway here is what Larry Osterman had to say about it:
One in a million redux[^]
Best Wishes,
-David Delaune
|
|
|
|
|
Hi,
Thanks for the response.
I have looked into the GetSystemTimeAdjustment function, and when I call it on my target system it returns 15.625 ms for both interval and increment and TRUE for the disabled flag. According to the help, if time adjustment is disabled then the interval is added to the time at each clock interrupt, and the time may be synchronized using "other mechanisms".
I do see a task running in the task manager that syncs the clock to RTC causing the jump in the time. I am not sure if it is timeserv.exe or not. It came and went so fast.
Is there a way to change the sync interval to something else other than an hour?
|
|
|
|
|
|
I am writing a program that runs in the background and controls a text editor. My program sends key sequences to the editor to open the Find and Replace dialog box of the editor (this is a dialog box customized by the editor).
So far, this method works great. The only problem I am having is that the dialog box quickly appears and disappears (since I send the key sequences plus the Enter key as well).
So now I need to find a way to make the Find and Replace dialog box invisible to the user when I send my keys, yet it must still be able to accept my key sequences.
I was thinking of the following options:
- Each time I want to trigger the Find and Replace dialog, I will set its size to 0 and restore the size later, after I have executed my command => Will this work if it is a fixed-size dialog box?
- Set the transparency level to X so that the dialog box is no longer visible => Is there such an X? It looks like the dialog caption is always blue
- Or other methods ?
Would really appreciate your help on this.
|
|
|
|
|
You could try those methods.
You could also try moving it outside the screen coordinates.
If even that doesn't work, you might need to write a shell hook which can notify you when the dialog is created and before it becomes visible.
|
|
|
|
|
Hello,
I have a "how-to" question to you in the COM area: I have a non-MFC application and must now expose some of the objects via dual COM interfaces.
In MFC based applications there are ways to make my life easier, I could use MFC classes (e.g. CCmdTarget) and nice macros (mfcdual.h, etc.).
My question: what can I do in my non-MFC Windows app? Is there something similar available (in MFC or ATL) to ease my job? Or do I have to write the COM interface by hand?
Kind Regards
modified on Wednesday, November 11, 2009 5:53 PM
|
|
|
|
|
Umm - ATL has plenty of classes to help you with that. Have a look at the documentation[^].
Java, Basic, who cares - it's all a bunch of tree-hugging hippy cr*p
|
|
|
|
|
OK, I'll take a look at the documentation.
Thanks
|
|
|
|
|
Right click on the project in solutions explorer.
Select Add -> Class.
Select ATL -> Simple Object.
Follow the wizard that pops up and you will be able to add dual-interface COM objects.
|
|
|
|
|
I developed a customized PageSetupDlg that has some additional controls, including a "Restore Defaults" button. When this button is clicked, the current settings for margins, orientation and paper size should be reset to certain default values without closing the dialog box. This is not a problem for margins and orientation, but I haven't found a way to handle the dropdown list for paper size, i.e. to find the appropriate list entry that my app must select for the desired paper size. Parsing the text of the entries doesn't seem to be a very safe way.
I think there are at least three possible methods to achieve this but I couldn't get any of them to work:
- simply tell the dialog or list box to "select paper size x by y"
- a safe way to enumerate the paper sizes corresponding with each of the list entries (in the correct order!)
- "refresh" the dialog box with a predefined DEVMODE struct
Grateful for any clues,
hx2000
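One common approach to the second bullet above is to enumerate the driver's papers yourself with DeviceCapabilities (DC_PAPERS / DC_PAPERSIZE / DC_PAPERNAMES) and fill the combo box from those parallel arrays, so every list index maps to a known size. The Windows calls aside, selecting the right entry then reduces to finding the enumerated size closest to the desired one. A sketch of that matching step only — findPaperIndex is a hypothetical helper, and the sizes are in tenths of a millimetre, as DC_PAPERSIZE reports them:

```cpp
#include <cstdlib>
#include <vector>

// Paper size in tenths of a millimetre, mirroring the POINT values that
// DeviceCapabilities(DC_PAPERSIZE) returns (x = width, y = height).
struct PaperSize { int w; int h; };

// Return the index of the enumerated paper closest to the requested
// width/height, or -1 if the list is empty. On Windows you would fill
// `papers` from DeviceCapabilities; any data source works for testing.
int findPaperIndex(const std::vector<PaperSize> &papers, int w, int h)
{
    int best = -1;
    long bestDiff = 0;
    for (std::size_t i = 0; i < papers.size(); ++i) {
        long diff = std::labs(papers[i].w - w) + std::labs(papers[i].h - h);
        if (best < 0 || diff < bestDiff) {
            best = static_cast<int>(i);
            bestDiff = diff;
        }
    }
    return best;
}
```

With the index in hand you can select the corresponding combo entry, or map it back to the DMPAPER_* code from the DC_PAPERS array and set dmPaperSize in a DEVMODE, which corresponds to the third option in the list above.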
|
|
|
|
|