|
Hi All, many thanks up-front to anyone who can help here.
I have a client who wants me to migrate a CTreeView-derived class to one that works like a docking pane. The class is so large now that rewriting everything isn't feasible.
So... I still have the underlying CTreeCtrl-derived class (more or less untouched), which is now a child of a class derived from CDockablePane (migrated from CView). I've also derived from CMultiPaneFrameWnd (used when the pane is floating) to try to trap events.
I've got it working about 80%; I need help/direction with the other 20%. The CDockablePane-derived class handles the bulk of the MESSAGE_MAP events (menu handlers, ON_UPDATE_COMMAND_UI, etc.) correctly. The events associated with the mouse are the ones giving me grief.
I see three standard modes for the dockable pane: floating, docked, and tabbed. They all behave differently with respect to mouse events, and the behaviour also depends on the docking mode.
For example:
* If the docking mode is DT_SMART *and* the pane is NOT floating, then drag&drop and double-clicking work correctly.
* If the docking mode isn't DT_SMART, then drag&drop usually tries to drag the complete docking pane instead of working correctly (regardless of whether the pane is floating, docked, or tabbed).
* When the pane is floating, I can't trap WM_LBUTTONDBLCLK (unless you click on the lines and little [+] boxes...) or the respective notify or reflect messages - even Spy++ doesn't record them. I just don't see any left-button double-click events at all! (A sketch of the handlers I'm trying is below.)
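For reference, this is roughly how I'm trying to trap the double click in the CTreeCtrl-derived class (message-map excerpt only; CMyTreeCtrl is just a stand-in name for my class):

BEGIN_MESSAGE_MAP(CMyTreeCtrl, CTreeCtrl)
    ON_WM_LBUTTONDBLCLK()                                   // direct mouse message
    ON_NOTIFY_REFLECT(NM_DBLCLK, &CMyTreeCtrl::OnNMDblclk)  // reflected notification
END_MESSAGE_MAP()

void CMyTreeCtrl::OnLButtonDblClk(UINT nFlags, CPoint point)
{
    TRACE(_T("WM_LBUTTONDBLCLK reached the tree\n"));   // never fires while floating
    CTreeCtrl::OnLButtonDblClk(nFlags, point);
}

void CMyTreeCtrl::OnNMDblclk(NMHDR* /*pNMHDR*/, LRESULT* pResult)
{
    TRACE(_T("NM_DBLCLK reflected to the tree\n"));      // doesn't fire either
    *pResult = 0;
}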
Google searches show that I'm not the only one with these types of issues. I've read and tried pretty well every suggestion...and I still have these issues. I'm now concerned about bugs or assumptions in the framework after reading this:
http://connect.microsoft.com/VisualStudio/feedback/details/641096/cdockablepane-calls-releasecapture[^]
This link: http://www.johnbyrd.org/blog/index.php?itemid=405[^] helped a lot but didn't solve all issues either.
I can also replicate these issues (unable to trap double-clicking when floating) in the VisualStudioDemo feature pack demo.
Any suggestions on how to get a CTreeCtrl to work inside a CDockablePane - consistently - would be really appreciated. I feel that many of the issues are related to CMultiPaneFrameWnd, but I can't resolve them.
Thanks!
|
|
|
|
|
Hm... my trees do register double clicks in their floating panes.
For example, it is possible to expand a node by double-clicking its item...
The containment model in my case is (from the outside in): CXDocablePane -> CXFrameWndEx -> CXTreeCtrl
They sought it with thimbles, they sought it with care;
They pursued it with forks and hope;
They threatened its life with a railway-share;
They charmed it with smiles and soap.
|
|
|
|
|
Found it! There is an intermediate class that inherits from CTreeCtrl. It was originally set up to support a CView and forwarded the WM_LBUTTONDOWN event to its parent. Well, now that the parent is a dockable pane, the behaviour changed a lot! Removing that handler got lots of stuff working!
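For anyone hitting the same thing, the handler in question looked roughly like this (names changed and simplified - illustrative only, not the exact code):

void CIntermediateTree::OnLButtonDown(UINT nFlags, CPoint point)
{
    // Forward the click to the parent. Harmless when the parent was a CView,
    // but it confuses the CDockablePane/CMultiPaneFrameWnd drag handling.
    ClientToScreen(&point);
    GetParent()->ScreenToClient(&point);
    GetParent()->SendMessage(WM_LBUTTONDOWN, (WPARAM)nFlags,
                             MAKELPARAM(point.x, point.y));
}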
|
|
|
|
|
I know accessing the parallel port was fine with Windows 98/ME, but with Windows 7 it cannot be done the same way.
I have read that it can be accessed by writing a driver, or by using some extra special driver for this. Help needed - I'm really having a hard time! I work on Windows 7 now, and getting hold of Windows 98 is a big issue for me. Help!
|
|
|
|
|
What exactly is your C/C++/MFC question? What code have you put together that you need help with?
"One man's wage rise is another man's price increase." - Harold Wilson
"Fireproof doesn't mean the fire will never come. It means when the fire comes that you will be able to withstand it." - Michael Simmons
"Show me a community that obeys the Ten Commandments and I'll show you a less crowded prison system." - Anonymous
|
|
|
|
|
If a PC running Windows 7 has a parallel port then you will probably find that it also has a driver. Now try posting a proper question, and stop filling your post with all those pointless emoticons.
Use the best guess
|
|
|
|
|
The Windows API supports parallel port access the same way it does the serial port. You need to do a CreateFile() on \\\\.\\LPT1 or some such. Look at the DeviceIoControl codes you can use to control it.
Don't try writing a driver; you will probably spend a year getting it working.
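Something along these lines (untested sketch; whether the device name is LPT1, and whether a plain WriteFile is enough, depends on what's attached to the port):

#include <windows.h>
#include <stdio.h>

int main()
{
    // Open the parallel port device. "\\.\LPT1" is an assumption - adjust to your port.
    HANDLE hLpt = CreateFile(TEXT("\\\\.\\LPT1"),
                             GENERIC_READ | GENERIC_WRITE,
                             0, NULL, OPEN_EXISTING, 0, NULL);
    if (hLpt == INVALID_HANDLE_VALUE)
    {
        printf("CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    const char data[] = "Hello, parallel port!";
    DWORD written = 0;
    if (!WriteFile(hLpt, data, sizeof(data) - 1, &written, NULL))
        printf("WriteFile failed: %lu\n", GetLastError());

    CloseHandle(hLpt);
    return 0;
}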
==============================
Nothing to say.
|
|
|
|
|
Here[^] is a good example that shows you how to access the parallel port in any OS. I hope this helps...
|
|
|
|
|
Hi Folks... I'm having a hard time figuring out whether we can call REST-based web service APIs in C++. I need to pass a secret key to the server and get back the response. Any idea, pointer, or framework in C++ would be useful.
Any sample application would also be useful.
Can it be done using sockets programming?
|
|
|
|
|
raghunath sahoo wrote: Having a hard time to figure out if we can call Rest Based Web Service API's in C++
You can do just about anything in C++. Having said that, it might not be your easiest solution. Higher level languages tend to have better abstraction for certain things.
raghunath sahoo wrote: Can it be done using sockets programming?
Of course it can be done using socket programming; how exactly do you think web services work? The problem with developers who work at a high level is that they don't seem to know about the magic happening in the background. Simple answer: the web works over sockets.
I just typed "REST C++" into Google and got a bunch of results; why don't you try one of those?
|
|
|
|
|
Thanks Albert... I had Googled for this earlier; the links had no relevant information. The few pointers that come up use frameworks from other languages to establish the connection.
I would rather just check out how Winsock works.
Because of the urgency I need to write the code entirely in C++, using any service or framework available. Guidance from anyone who has worked on this would be helpful, as I'm really clueless.
|
|
|
|
|
http://www.codeproject.com/Articles/3849/Simple-HTTP-Client-using-WININET
This link is helpful.
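For reference, here's a stripped-down WinINet sketch of the kind of request that article covers (the URL and the X-Api-Key header are placeholders for your endpoint and secret key):

#include <windows.h>
#include <wininet.h>
#include <stdio.h>
#pragma comment(lib, "wininet.lib")

int main()
{
    HINTERNET hInet = InternetOpen(TEXT("RestClient"),
                                   INTERNET_OPEN_TYPE_PRECONFIG, NULL, NULL, 0);
    if (!hInet) return 1;

    // Extra request header carrying the secret key (header name/value are placeholders).
    const TCHAR headers[] = TEXT("X-Api-Key: <your-secret-key>\r\n");

    HINTERNET hUrl = InternetOpenUrl(hInet,
                                     TEXT("https://api.example.com/v1/resource"),
                                     headers, (DWORD)-1,
                                     INTERNET_FLAG_SECURE | INTERNET_FLAG_RELOAD, 0);
    if (hUrl)
    {
        char buf[4096];
        DWORD read = 0;
        while (InternetReadFile(hUrl, buf, sizeof(buf), &read) && read > 0)
            fwrite(buf, 1, read, stdout);   // dump the raw response body
        InternetCloseHandle(hUrl);
    }

    InternetCloseHandle(hInet);
    return 0;
}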
|
|
|
|
|
Network stack goes something like this...
Socket->Raw Ethernet->IP->TCP->HTTP->Web Pages/Services
So basically, the sockets can be programmed in whatever language you want, and everything else is layers that go above (or below) your base communication layer. Some languages offer higher abstraction, so, for example, you can get straight to web services without knowing anything about the other layers. Sometimes it's beneficial to keep things abstract to get things done quickly, but I'd personally recommend you at least know about the inner workings of things.
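To make that concrete, here's a bare-bones sketch of an HTTP GET spoken directly over a Winsock socket (host and path are placeholders, no TLS, error handling trimmed to almost nothing):

#include <winsock2.h>
#include <ws2tcpip.h>
#include <stdio.h>
#pragma comment(lib, "ws2_32.lib")

int main()
{
    WSADATA wsa;
    if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0) return 1;

    // Resolve the host (placeholder) and open a TCP connection to port 80.
    addrinfo hints = {}, *res = NULL;
    hints.ai_family = AF_INET;
    hints.ai_socktype = SOCK_STREAM;
    if (getaddrinfo("example.com", "80", &hints, &res) != 0) return 1;

    SOCKET s = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    connect(s, res->ai_addr, (int)res->ai_addrlen);
    freeaddrinfo(res);

    // HTTP is just text on top of the socket.
    const char req[] = "GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n";
    send(s, req, sizeof(req) - 1, 0);

    char buf[2048];
    int n;
    while ((n = recv(s, buf, sizeof(buf), 0)) > 0)
        fwrite(buf, 1, n, stdout);      // raw HTTP response, headers and all

    closesocket(s);
    WSACleanup();
    return 0;
}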
Here's a project[^] for C++ that deals with REST, it's from MS (I believe).
|
|
|
|
|
Here is the test code:
http://codepad.org/zKTatneG[^]
#include <iostream>
#include <string>
using std::cout;
using std::endl;

struct Empty
{
};

struct JustLong
{
    long data;
};

struct LongAndEmpty : public JustLong, private Empty
{
};

struct StringAndEmpty : public std::string, private Empty
{
};

int main()
{
    cout << "Empty:" << sizeof(Empty) << endl;
    cout << "JustLong:" << sizeof(JustLong) << endl;
    cout << "String:" << sizeof(std::string) << endl;
    cout << "LongAndEmpty:" << sizeof(LongAndEmpty) << endl;
    cout << "StringAndEmpty:" << sizeof(StringAndEmpty) << endl;
    return 0;
}
When you try it in another compiler (for example, see the codepad.org link above), the results are correct, i.e. the output looks like this:
Empty:1
JustLong:4
String:4
LongAndEmpty:4
StringAndEmpty:4
Now, when you try this in VC++ (2010), the size of the struct with std::string is always (debug and release) 4 more than the size of just std::string. All other structs/classes are the correct size; only those with std::string are bigger than they should be. Am I missing something, or is this an MS bug?
|
|
|
|
|
First, it's not technically a bug unless it either doesn't work or doesn't meet the C++ spec. I don't think either of these is the case.
Second, it looks much more likely that StringAndEmpty is larger because it's derived from an already complex class whose v-table is compiled into an external module. It's a guess, but I reckon you've got an extra pointer in there to look up the pre-existing std::string vtable.
I have read a full explanation of the various sizes and layouts that MSVC and other compilers use for minimal/empty/simple/complex structs and classes, with and without virtual inheritance. I can neither remember all the details nor where I read it, but I do remember that there were a lot more variations of 4-byte, 8-byte, 12-byte, 16-byte, and 20-byte 'headers' than I would ever have thought, and much of the variation was within one compiler rather than between them. If I remember where that detailed research is, I'll post a link.
"The secret of happiness is freedom, and the secret of freedom, courage."
Thucydides (B.C. 460-400)
|
|
|
|
|
Well, consider the following struct:
struct JustString: public std::string
{
};
It has the same size as just std::string, so no, there is no extra pointer to a VFT. My question is about the Empty Base Class Optimization - the size of an empty base class should be reduced to zero in a descendant if possible. Yes, it is not a requirement, only a suggestion in the standard. The problem is, I don't understand this: it works for non-MS compilers; in the MS compiler it works for all cases EXCEPT std::string. I will check later with VC++ 2012 to see if they made it work, but right now I have no other explanation than some problem with the MS compiler. Or is there a compiler option specifically for this case that I don't know about?
|
|
|
|
|
The example you give would not need an 'extra' pointer because it would only need the one to the external vtable; everything internal can be optimised away. So far you have found that the size increases only when you have both an internal and an external base. It's likely that even in that case the internal base could be optimised away, but it isn't, which may be a flaw in the Microsoft compiler, or there may be some practical reason why that shouldn't be done. As I mentioned, there are quite a number of combinations: single vs. multiple inheritance, virtual and non-virtual bases, local and external bases, exported and non-exported, complete and incomplete classes at the point of first declaration. Some optimisations, such as empty base class removal, will be available under some combinations and not others. For a comparison you could try the Clang compiler, which is more C++11 compliant than MSVC and, being open source, has support forums where you can ask exactly why they did it a particular way. Some odd behaviour in this area in MSVC is more about backward compatibility of the resulting binaries with older versions of the compiler and of Windows than about mistakes in the current version.
"The secret of happiness is freedom, and the secret of freedom, courage."
Thucydides (B.C. 460-400)
|
|
|
|
|
Kosta Cherry wrote: ...size of empty base class should be reduced to zero in descendant if possible.
The size of an empty class must not be zero, to ensure that the addresses of two different objects will be different.
"One man's wage rise is another man's price increase." - Harold Wilson
"Fireproof doesn't mean the fire will never come. It means when the fire comes that you will be able to withstand it." - Michael Simmons
"Show me a community that obeys the Ten Commandments and I'll show you a less crowded prison system." - Anonymous
|
|
|
|
|
For a standalone one - yes. But we are talking here about descendants, where the empty base class/struct can be eliminated. This is called the Empty Base Class Optimization (you can Google it).
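A minimal example of what I mean (the sizes in the comments are typical, not guaranteed - they vary by compiler and platform):

#include <iostream>

struct Empty { };                           // standalone empty class: sizeof >= 1

struct HasMember { Empty e; long data; };   // empty *member*: no EBCO, size grows
struct HasBase : Empty { long data; };      // empty *base*: EBCO may shrink it away

int main()
{
    std::cout << sizeof(Empty)     << '\n'  // typically 1
              << sizeof(HasMember) << '\n'  // typically 8 on a 32-bit build (1 + padding + 4)
              << sizeof(HasBase)   << '\n'; // typically 4 when EBCO applies
    return 0;
}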
|
|
|
|
|
Kosta Cherry wrote: Am I missing something or this is MS bug?
The former.
As you have already Googled this, you understand that it is an optimization. Failure to optimize is not a "bug".
|
|
|
|
|
Hello Friends
I am creating an MFC application in which I am saving an image from a DC into a BMP. I have applied different effects to that image using the Gaussian filter formula, following this link:
http://lodev.org/cgtutor/filtering.html[^]
I am applying blur, emboss, etc. by getting the image pixels and then setting them back.
Now I want to add a noise effect to the image based on some range (0-100): if 10, fewer particles in the image; if 20, more particles; and so on.
Using the same Gaussian approach, I couldn't come up with a filter that gives a noise effect.
I also tried my own formula, but it is not adding random particles to the image.
Here is my own way to set noise: I get the pixels of the image and randomly change some of them to black, but that was not producing noise in a uniform manner. I want it to be applied in some regular manner.
int n = 0;
int randomFactor = 0;               // grows as we move across the image
for (int x = 0; x < nWidth; ++x)
{
    for (int y = 0; y < nHeight; ++y)
    {
        ++n;
        if (n > 9 + randomFactor)   // blacken roughly every (9 + randomFactor)th pixel
        {
            n = 0;
            randomFactor += x;      // note: not actually random - it only stretches the gap
            memDC.SetPixel(x, y, RGB(0, 0, 0));
        }
    }
}
I want to add noise randomly, but distributed evenly across the image.
Any Ideas??
Regards
Y
|
|
|
|
|
Well... I have to admit I have no idea about the Gaussian formula, but I found a page with formulas which may help you:
IA State[^]
|
|
|
|
|
Two things: You are not really adding noise, but just setting a pixel to black here and there.
The idea of noise is to modify all color components of all pixels by adding a random value (plus or minus). So, start off by using a random number generator like rand(), scale its return value to the amount of noise you want, and add the outcome pixel by pixel. You may think of it as computing a noise image and adding that image to your original image.
If you want to refine that, don't use equally distributed random numbers, but a Gaussian distribution or another one. And you may apply filtering (low-pass, high-pass, etc.) to the noise image before adding it, to modify the spectral distribution of the noise.
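A minimal sketch of that idea, sticking to the GetPixel/SetPixel style of your snippet (the AddNoise name and the noiseAmount parameter are just illustrative, and for real images you should work on the DIB bits instead, because SetPixel is very slow):

#include <cstdlib>      // rand()
#include <afxwin.h>     // CDC (MFC)

// Adds +/- noiseAmount of uniformly distributed noise to every pixel of the memory DC.
void AddNoise(CDC& memDC, int nWidth, int nHeight, int noiseAmount)
{
    for (int x = 0; x < nWidth; ++x)
    {
        for (int y = 0; y < nHeight; ++y)
        {
            COLORREF c = memDC.GetPixel(x, y);

            // Random offset in [-noiseAmount, +noiseAmount]
            int offset = (rand() % (2 * noiseAmount + 1)) - noiseAmount;

            int r = min(255, max(0, (int)GetRValue(c) + offset));
            int g = min(255, max(0, (int)GetGValue(c) + offset));
            int b = min(255, max(0, (int)GetBValue(c) + offset));

            memDC.SetPixel(x, y, RGB(r, g, b));
        }
    }
}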
|
|
|
|
|
Hope you don't mind me posting this as a second question, but I wanted to catch the attention of anyone who is especially knowledgeable about timeGetTime as well.
Here is the simple code I'm using:
DWORD dwt = timeGetTime();
static DWORD dwt_was;
if (dwt_was != 0)
    if (dwt < dwt_was)
        dwt = dwt;          // deliberate no-op, just somewhere to hang a breakpoint
dwt_was = dwt;
Also - it's not timer wrap-around.
I have just caught it here: set a breakpoint on the dwt = dwt line and it shows:
dwt_was 54493247
dwt 54493246
How can timeGetTime() go backwards in time?
In case it is relevant this is on a Lenovo Thinkpad Edge e520
ALL FIVE TIMERS ARE INACCURATE - even when they are constrained to a single core:
GetSystemTimeAsFileTime(..)
timeGetTime()
GetTickCount()
QueryPerformanceCounter()
RDTSC()
They all jump back in time frequently on this machine. I have tested them all constrained to a single core.
The only thing I can think of is to try a best-guess averaging process as described in this article, but I'm not convinced it will be much of an improvement or worth the work involved:
http://www.mindcontrol.org/~hplus/pc-timers.html[^]
Discussion here
http://devmaster.net/forums/topic/4670-need-help-with-queryperformancecounter-and-dual-processors/[^]
On the whole it works reasonably well most of the time. Only now and again do I get these glitches; they go back in time perhaps every second or two, by a ms or two - and with the metronome typically playing only a few notes per second at most, the chance of hitting a major glitch is fairly small.
I think that's the only reason it is working as a functional metronome on my computer. It's actually pretty good, but I wanted to make it more accurate than it already is.
Also see this discussion:
http://stackoverflow.com/questions/2904887/sub-millisecond-precision-timing-in-c-or-c[^]
There's also KeQueryPerformanceCounter, but I think that might just be for driver writers - and with all the other ones not working, I don't know if it would fix the issue anyway.
There's the Windows timestamp project here:
href="http://windowstimestamp.com/description
[^]
but it seems to be a work in progress; I don't think you can actually use it in your apps yet.
If it can't be fixed, I'm also interested in any setting I could ask users to change on their computer if they want the highest possible timing accuracy - e.g. switching off some power-saving frequency-adjusting feature or something?
The only other thing I can think of is to write to MIDI, convert the MIDI to audio, and play the audio - that is well timed, but not possible in real time.
(sorry forgot you can edit these posts - hence all those deleted extra messages below).
modified 15-Mar-13 3:39am.
|
|
|
|
|