|
A single instance of a .NET control cannot be "in" two Forms at the same time.
So, some clarification from you would be helpful:
1. I assume you are using Windows Forms, but please confirm that. Some folks use the term "Form" (incorrectly) when they are describing WPF.
2. Is it the case that at any given time only one of the two Forms will be visible to the end-user, or are there cases where both will be visible? If both Forms can be visible, and you want to show your GroupBox on only one of them: what determines on which visible Form it is shown? Another way of asking this question: is the on-screen location of both Forms the same? In that case, if their size and other attributes, like FormBorderStyle, are the same, one will "cover" the other.
3. Are both Forms identical in every way but location: size, FormBorderStyle, StartPosition, appearance, etc.?
4. Are you using the standard WinForms model, in which the Program.cs file, in its static class 'Program', contains a static 'Main' method with the usual code to start your application, like Application.Run(new MainForm()); ? That may seem like an odd question, but there is a reason I'm asking it, which I'll explain once you provide more information.
5. What do you mean by "without any issue of screen resolution"? What could a screen resolution issue be? Could you mean, by this statement, a possible overlapping of the two Forms when both are displayed (made visible)?
6. Are you familiar with using static classes now?
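For reference, the standard startup model that question 4 refers to looks something like the sketch below; 'MainForm' is a placeholder for whatever the main Form class is actually named:

```csharp
using System;
using System.Windows.Forms;

static class Program
{
    [STAThread]
    static void Main()
    {
        Application.EnableVisualStyles();
        Application.SetCompatibleTextRenderingDefault(false);
        // Run the message loop with MainForm as the application's main window;
        // the application exits when this Form closes.
        Application.Run(new MainForm());
    }
}
```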
bill
Google CEO, Eric Schmidt: "I keep asking for a product called Serendipity. This product would have access to everything ever written or recorded, know everything the user ever worked on and saved to his or her personal hard drive, and know a whole lot about the user's tastes, friends and predilections." 2004, USA Today interview
modified 6-Sep-13 2:42am.
|
|
|
|
|
|
For static members, do you prefer
public static Type Member; or
static public Type Member; where public can be any of the visibility keywords.
I've found I use both, and I'm trying to find out if there's a best practice out there that makes sense.
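For what it's worth, the two orderings are interchangeable to the compiler; a hypothetical class like the following declares the same thing either way:

```csharp
using System;

public class Counters
{
    // Both orderings produce an identical public static field;
    // only the source text differs.
    public static int CountA;
    static public int CountB;
}
```

Conventions such as StyleCop's modifier-ordering rule put the access modifier first, but that is a style preference, not a language requirement.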
Software Zen: delete this;
|
|
|
|
|
I prefer option 1, but whichever option you pick, consistency is the key!
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
I use both too, but tend to group the statics at the bottom of the class in its own region.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
I prefer to put visibility first, and for what it's worth, I don't think I would make a static field public.
|
|
|
|
|
Espen Harlinn wrote: "I don't think I would make a static field public."
I normally don't, but for the purposes of the example...
Software Zen: delete this;
|
|
|
|
|
Fair enough
But then you never know who is following these discussions, and some will surely copy your code without a second thought ...
|
|
|
|
|
If they're copying the millifragment I posted, they're in more trouble than they know.
Software Zen: delete this;
|
|
|
|
|
Agree, but then, on the other hand, they probably wouldn't notice ...
|
|
|
|
|
I prefer the following.
static public...
private static...
I do it that way to make the basic nature of the items more visible to all.
|
|
|
|
|
Hmm. Some of my usage variation seems like that, as if my intent was to call attention to the fact that something was <span style="font-size:72pt">static</span>.
Software Zen: delete this;
|
|
|
|
|
Then you should have said so.
|
|
|
|
|
I'm with Richard Deeming. Additionally, I always specify the access modifier; I never allow a default to apply -- I think the defaults should be removed from the language.
|
|
|
|
|
I always specify the access modifier as well. I've always distrusted my memory when it comes to default behaviors with subtle consequences. I explicitly parenthesize as well.
Software Zen: delete this;
|
|
|
|
|
I agree with Richard Deeming's response focusing on "consistency." And I emphatically agree with PIEBALDConsult's response, when he says/implies that one should always specify the scope with public/private, and that it would be a good thing for the .NET compiler to "enforce" this.
I prefer to use: public/private static variabletype.
It's interesting to me that if you write this (C#, WinForms) inside the scope of a Form's class definition:
public static bool availableOutsideFormClassScope = true; ReSharper 8 will "suggest" that the use of 'public' is redundant.
But if you write this:
static bool notAvailableOutsideFormClassScope = true; you have, in effect, declared this variable as private.
I have never had occasion to use a Form-scoped static variable, method, embedded class, etc.; I only use static when I want something to be accessible outside a Form's class scope. But perhaps if I were dealing with multiple instances of the same application, or multi-threading, there would be an "organic" need for this?
When I write applications with multiple independent Forms (in WinForms this can be done by modifying the Program.cs file so it calls an initializer method to "run" the application ... and that initializer creates and "shows" all Form instances ... rather than doing the usual Application.Run(new SomeMainForm());), I usually use a static class defined as a top-level application class (i.e., added via VS's Project/Add Class menu).
I know there are people who feel an almost "religious zeal" about not using static anything, anywhere, perceiving the use of static whatever as a violation of the No-Globals-Strongly-Typed supreme being, doomed to provoke the wrath of the Threading sub-deities. I have yet to have cause to feel such a conviction.
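A minimal sketch of that multiple-independent-Forms startup might look like the following; the Form class names are hypothetical:

```csharp
using System;
using System.Windows.Forms;

static class Program
{
    [STAThread]
    static void Main()
    {
        Application.EnableVisualStyles();
        Application.SetCompatibleTextRenderingDefault(false);

        // Show both Forms; neither one "owns" the message loop.
        var formOne = new FormOne();
        var formTwo = new FormTwo();
        formOne.Show();
        formTwo.Show();

        // An empty ApplicationContext keeps the message loop running
        // until Application.Exit() is called explicitly (e.g. from a
        // FormClosed handler on one of the Forms).
        Application.Run(new ApplicationContext());
    }
}
```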
Bill
Google CEO, Eric Schmidt: "I keep asking for a product called Serendipity. This product would have access to everything ever written or recorded, know everything the user ever worked on and saved to his or her personal hard drive, and know a whole lot about the user's tastes, friends and predilections." 2004, USA Today interview
|
|
|
|
|
BillWoodruff wrote: "I know there are people who feel an almost "religious zeal" about not using static anything/anywhere, perceiving the use of static whatever as a violation of the No-Globals-Strongly-Typed supreme being"
I have a feeling you and I are of a vintage, Bill. We are grizzled combat veterans of the FORTRAN wars. We remember when men were men, women created compilers[^], and you knew how to handle globals safely.
Software Zen: delete this;
|
|
|
|
|
Well, I'd be pleased as punch to be of the same vintage as you ... but I suspect the year of my exfiltration from the womb, 1943, is some ways "off" from your ship-date.
However, I only started messing with computers in 1982 (after my use-by date), and I turned to vinegar early.
I do, though, use 'static' anything only in circumstances where I feel there is an absolute need to do so, although I suspect I use them more frequently than the mythical "average" .NET programmer.
bill
Google CEO, Eric Schmidt: "I keep asking for a product called Serendipity. This product would have access to everything ever written or recorded, know everything the user ever worked on and saved to his or her personal hard drive, and know a whole lot about the user's tastes, friends and predilections." 2004, USA Today interview
|
|
|
|
|
Hmm. Physically, I'm a mite younger than you, since I was born in 1961. Professionally, I'm older, as I've been working as a programmer since 1980, when I started part-time my sophomore year of college.
Compared to our compatriots here at CP (nice little alliteration there), we're both Jurassic.
Software Zen: delete this;
|
|
|
|
|
Joining the Jurassic crew:
Born: 1957. Wrote first FORTRAN program: early 1970. Managed to waste quite a few reams of greenbar paper due to a column or semicolon error. Got a job while in junior high school punching cards for the university. Took more FORTRAN. Included my own punch cards in stacks I'd punch on my job. Got in trouble for using computer time as an unauthorized user. It worked for a while, but I didn't think they would check CPU use that closely. Dang.
Played Star Trek on a paper terminal at the university in the mid-70's. Got billed for excessive use of paper (AGAIN). Got access to a VAX, first running VMS, later converted to Unix. Got the first C compiler for it sometime in the early 80's. My first C program: output "shift up/shift down" terminal commands to random VT terminals connected to the VAX. I got a little foolish and had the program fork itself about 5 times. Made quite a mess of the input terminals over in the student records department. I had the service guys swapping terminals, changing cables. I feel bad now. It was not the nicest thing to do. Punishment: had to write the professor's grade book keeper in C. My grade shot to A and stayed there.
Went to the West Coast Computer Faire. Bought an Apple I. Loved it. Bought an Apple II. Sort of loved it. Hated BASIC, so started to program 6502 assembly. Hated the fact that some registers could not be used for some things. Why not!? Got a 300 baud modem. Screaming now! Lit up the BBSs. Posted little programs written in 6502 assembly. One of my first was a terminal program lots of people used to send/receive files via Xmodem. Used the graphics page of the Apple II as a buffer. Watching programs come in was sort of psychedelic as it wrote binary data to the graphics page.
Got a 1200 baud modem! So happy! I have speed! Could not talk to anyone else for some time. No one else had 1200 baud, and 300 was broken in this modem.
Got an IMSAI 8080 with broken switches. Fixed switches. Wrote two programs before arthritis began to set into my switch-toggle fingers. Turned off the IMSAI 8080. Never turned it on again.
Got a job coding in COBOL. Lasted about 6 weeks. Could not hack it.
Got an IBM PC. Wrote loan software in C. Beta tested the first ever LAN for the IBM PC. Davong or something like that. Man, that was fast! I could move a 330k file to my co-worker's computer in a minute! Outstanding! No more walking around with stupid 5 inch floppies. Oh yeah... a 330k file at least initially needed to be put on several floppies. Used the Digital Research C compiler. The bug list was 298 pages. Example:
sprintf(). This function converts data from variables into a string using the standard C format string characters.
Errata: This function does not work. Avoid the use of this function.
Had to write my own sprintf, and much of the other "stdlib" in that damned compiler. As soon as the compiled program went over 32K, segment registers would get messed up and the program would crash. Had to patch that in the compiler lib too. Learned to hate segmented processor architecture.
Got a contract to write a meeting manager system in Digital Research Pascal. The customer owned the compiler, so I had to use it. Same deal. 300+ pages of things that didn't work. Fought the compiler at every turn. Got the program running with lots of self-written "standard library" functions.
One week later a friend of mine sends me Turbo Pascal 1.0. Opened it, looked at it. The compiler and editor were 32k. The editor was nice. First real IDE I'd seen. Instructions said hit F-"whatever" to compile. Hit it, screen blinked, nothing happened. Great, another broken POS compiler. Looked on the disk; there was my "program" .com file. Hmm... Loaded up my meeting manager. Commented out all the self-written patches. Hit compile. Screen blinked; on disk was MtgMgr.com. Ran it. It worked perfectly. Every function spot on. No glitches.
Stuck with TP for a while. Did a number of successful contracts in it. Still have some "Delphi" contracts and large projects. Still one of the fastest compilers out there.
During the early years of Delphi got a contract doing high-speed A/D acquisition and analysis for a drop tester. They wanted to use Borland C++. Oh man, was that ever a slow POS! Every little change gave me time to walk around the building. Wrote the code, got it working using a mini desktop-sized drop tester breaking popsicle sticks. Got invited to see it work for real on the "real" drop tester. Standing behind a 3 inch thick plexiglass barrier about 20 yards from a tower almost 50 feet in the air with close to 5 tons loaded on it. They hit the switch and down that damn thing came. The floor jumped about a foot when it hit. It broke a truck axle in half like a knife chopping a carrot. On the screen a beautiful little graph appeared of the force, along with various computations. It was a good feeling to see something work the first time.
Used more C to do a large number of embedded projects. Did embedded assembler as well.
Now working mostly in VS and C#. Love lots about it. Hate some things. But yes... I'm Jurassic too. Been around a bunch and done quite a few things.
My motto these days:
KISS first
Then you can OOP, SOLID, DI, and framework to your heart's content.
|
|
|
|
|
MarkRHolbrook wrote: "Used the graphics page of the Apple II as a buffer. Watching programs come in was sort of psychedelic as it wrote binary data to the graphics page."
Reminds me of the graphics machines at school. They were 64K Z-80 systems running CP/M. The lower 48K was bank-switched with the display buffer, so you had to call a routine 'up high' to draw a graphics primitive. It would switch the program memory out and the display bank in, draw the point/line/whatever, and then switch the program memory back in. Occasionally the bank switch didn't work properly, and you would end up watching your program on the screen.
Kind of nifty, especially when you'd been in the lab for 40 hours straight subsisting on vending machine coffee and three packs of cigarettes.
Software Zen: delete this;
|
|
|
|
|
Hi Mark, you are a true "veteran," a true example of technical "ontology recapitulates hardware phylogeny."
I really enjoyed following the course of your pilgrimage in the digital realm. My first computer, circa 1981, was a little Radio Shack Color Computer, on which I taught myself 6809 assembly language and, of course, BASIC. My second computer was a Mac that a very young Jamaican guy in Berkeley, a genius with hardware, had got from some bootlegger of rejects Apple intended to go to a recycler, or to be destroyed. Robert fixed it up, and when I bought it, for about US $600 around 1984, it had a Spanish keyboard, and all the second-row keys from the bottom were somehow shifted right one character, so that when you typed an "m" you got a comma.
Thanks for the memories,
bill
Google CEO, Eric Schmidt: "I keep asking for a product called Serendipity. This product would have access to everything ever written or recorded, know everything the user ever worked on and saved to his or her personal hard drive, and know a whole lot about the user's tastes, friends and predilections." 2004, USA Today interview
|
|
|
|
|
Many of us here have been "around". I've read your stories and love every one. It's hard to be 56, have all this experience, and have a current "director" of software development who is about 30. He's a great guy, extremely knowledgeable and good at what he does, but he lacks the experience people like you and I have had. Doesn't make him a bad person. We all start somewhere. He just has a blank stare on his face when I talk about 300 baud modems. LOL.
|
|
|
|
|
Hi Gary, I am sure our souls are equally eternal, even if I do have more years on the meat-package than you. My career in software was, in the context of the "big picture" of my life, a strange diversion that unfolded by the principles of serendipity's quantum mechanics (and lots of hard work).
People who had known me before my first career, in social work, psychotherapy, and academic social science, were quite astounded that a former mad-poet and ersatz yogi from the 1960's cultural topsy-turvy maelstrom ever made it back to Earth in his thirties, let alone made it into university-level academia. They were even more shocked when they heard I had been published by Addison-Wesley by age 44, and later, that I worked at Adobe at age forty-five, and they saw my name in the about-box of Adobe Illustrator 3.2 and 5.0, or heard I was one of three people who created the proof-of-concept prototype that later became Acrobat (under the keen eye, and subtle guidance, of John Warnock, the greatest technical mind I've ever had the privilege of working directly with).
My "true love" is literature, and writing poetry, and fiction, and I'm delighted that about age fifty-one I "re-designed" my life, and moved to Asia, and returned to the worship of the Muses. But, I still enjoy the hell out of programming, still occasionally program, or do web-related stuff, or graphic design, for $, so I have not deserted the shrine of the Goddess Techne.
Along the way: wives (virtual, and legal): six. Children: zero. Ex-wives (legal): two.
Life is as good as I allow it to be !
bill
Google CEO, Eric Schmidt: "I keep asking for a product called Serendipity. This product would have access to everything ever written or recorded, know everything the user ever worked on and saved to his or her personal hard drive, and know a whole lot about the user's tastes, friends and predilections." 2004, USA Today interview
|
|
|
|
|
Just to clarify:
static bool notAvailableOutsideFormClassScope = true;
declared as a class member is in fact private; members without a modifier default to private. It is top-level classes (and structs) that default to internal (scope=assembly). As defaults go, those make the most sense in my opinion (regardless of whether having a default makes sense at all)
I still specify the visibility for every class (and member). Just because I don't like relying on defaults
Edit:
See Access Modifiers (C# Programming Guide) (MSDN)[^], first paragraph of the section "Class and Struct Accessibility".
Edit2:
Regarding ReSharper (at least in version 6.1), you can have ReSharper suggest explicit access modifiers:
-> Resharper -> Options... -> Code Editing -> Formatting Style -> Other -> Modifiers -> "Use explicit private modifier" / "Use explicit internal modifier"
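To make the defaults concrete, here is a small illustrative sketch (all names are hypothetical):

```csharp
using System;

// Declared directly in a namespace with no modifier,
// a class defaults to internal (visible assembly-wide).
class DefaultAccessDemo
{
    // A member with no modifier defaults to private.
    static bool privateByDefault = true;

    // An explicitly internal member is visible throughout the assembly.
    internal static bool AssemblyWide = true;

    // Accessor so the private field can be observed from outside the class.
    public static bool ReadPrivate() { return privateByDefault; }
}
```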
modified 5-Sep-13 9:35am.
|
|
|
|
|