|
For about as long as Windows has supported multiple displays, people have generally recommended against mixing video chipsets from different manufacturers--which would mean disabling the onboard video and instead using a video card that supports two independent output ports. In reality--and I can only speak for myself--I've personally never experienced a display problem I could attribute to mixing video cards from different manufacturers.
Maybe I've just been lucky, but I can definitely see how there could be room for strange behavior when using video drivers and add-on display software from different manufacturers. YMMV, but IMO if you're going to get a card anyway to provide a second display, then it would make sense to get a card that can provide two outputs and then just disable the built-in one.
Another possibility (which is not necessarily as cheap as a low-end video card) - there are USB to VGA adapters that act as an extra video card and let you send video to a monitor through a USB port if, for some strange reason, adding a video card is not an option. You can chain them with no problem - at one point, just for the hell of trying it out, I had 3 of them hooked up to a system that already had 2 "regular" displays, for a total of 5 monitors. But I will point out that the key here is to use a USB 3 port - when testing the configuration I mentioned, I could have 5 different HD videos playing full-screen, independently of each other, without a stutter. Full-screen HD video over USB 2 did NOT work as smoothly; even just moving a single window on a USB 2-connected display showed the image getting clipped as it was being dragged.
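For what it's worth, the USB 2 stutter is easy to explain with back-of-envelope math. DisplayLink-style adapters compress the stream before sending it over USB, but the raw numbers show how little headroom USB 2 has to start with. A rough sketch (0.48 and 5 Gbit/s are the nominal USB 2 / USB 3 signaling rates; real-world throughput is lower, and this ignores blanking intervals):

```python
def raw_video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed bandwidth a video mode needs, in Gbit/s (active pixels only)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 1080p60 needs about 3 Gbit/s uncompressed:
hd = raw_video_bandwidth_gbps(1920, 1080, 60)
print(f"1080p60 uncompressed: {hd:.2f} Gbit/s")
print(f"Exceeds USB 2 signaling rate (0.48 Gbit/s): {hd > 0.48}")
print(f"Under USB 3 signaling rate (5 Gbit/s) even uncompressed: {hd < 5.0}")
```

So over USB 2 the adapter has to compress roughly 6x just to keep up, while USB 3 barely has to compress at all - which matches the smooth-vs-stuttery behavior described above.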
And TBH that was years ago when HDMI still wasn't all that common as a PC connector. I can't imagine there wouldn't be equivalent USB to HDMI adapters nowadays.
[Edit]
..and sure enough, they exist, and they're even cheaper (CAD$25) than what I had paid for the USB to VGA adapters I experimented with.
|
|
|
|
|
dandy72 wrote: Going as far back as Windows has had the ability, people have generally been recommending against mixing video chipsets from different manufacturers together, so that would mean disabling the onboard video and using instead a video card that supports two independent output ports. In reality--and I can only speak for myself--I've personally never experienced any display problem that I could attribute to mixing video cards from different manufacturers together.
If you go back far enough (win95? NT4????) you actually had to use different brands because something in the driver model choked on 2 instances of the same GPU.
I ran into NVidia vs ATI driver problems years ago with XP (Vista?): I was running 3 monitors in an era when GPUs only had 2 outputs, upgraded my main card from NVidia to ATI, and ended up having to buy a second cheap ATI card because I could never get the new main card and the low-end NVidia one to play nicely together. On more modern systems I've never had an issue with Intel and AMD/NVidia/USB; I've never tried mixing AMD and NVidia, though. (My current systems are Intel, and since cards started offering more outputs I haven't needed to double up just for that.)
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
|
|
|
|
No doubt the problems existed--they were being reported for a reason. I was just saying I was fortunate enough not to have run across them in any of my own little experiments.
|
|
|
|
|
What OS?
Without waxing too descriptive then: get a PCIe video card, but know that purchasing one will require some homework on your part, mainly brushing up on what types of connections you'll be supporting once you decide how many dollars you're going to spend and why.
Why "what OS?" ... because the blue screen of death has evolved recently.
And with that warning, hang on to your old DB-15 (VGA) monitor for a while--or at least until you've done everything in your power to resist the nostalgia of sticking with an old operating system just because your hardware seems to run optimally on it.
I'll end this here.
But remember, the onboard VGA connection CAN BE USURPED by your BIOS--many boards automatically disable onboard video when a discrete card is installed.
modified 4-Feb-21 16:44pm.
|
|
|
|
|
I've run dual monitors for many, many years and I now find using a computer with a single monitor very restrictive.
In my case I've always used a PCIe graphics card with multiple outputs and that has worked well for me.
I'm considering going to three monitors (two for primary work area, one for systems monitoring, tv, or debug output, depending on what else I'm doing) and I'll probably get another graphics card to do it.
This is currently on Windows 10 (and worked the same on earlier Windows versions).
|
|
|
|
|
Personally I don't like any setup with less than two monitors. I guess I'm spoiled that way. I've had 3 monitors in the past for work, but the 3rd one was normally just dedicated to keeping an eye on my email.
One of the fellows I worked with had 5 monitors. I think he was using them to simulate a complete environment, via virtual machines, for testing and development purposes.
Another guy set up a complete ring of monitors to turn MS Flight Simulator into a 360-degree simulation in his basement. He ended up getting his pilot's license in record time.
INTP
"Program testing can be used to show the presence of bugs, but never to show their absence." - Edsger Dijkstra
"I have never been lost, but I will admit to being confused for several weeks. " - Daniel Boone
|
|
|
|
|
I agree with those before me: extra monitors are a significant improvement in efficiency, and after years of working that way I can hardly get any work done on a single monitor. Sometimes when I travel I bring extra-long DisplayPort and HDMI cables with me so I can use the TV in the room as a second monitor.
I would suggest a laptop with a dock and at least two external monitors; skip the 1920x1080 FHD versions and go directly to 2K monitors.
Consider the Dell XPS or the Precision mobile workstations. The latter can be had inexpensively, as companies near tech centers don't hold on to them long--they upgrade to get the latest and greatest every few years.
|
|
|
|
|
Just an odd thought to throw out into the mix, but working remotely when you have two displays in the shop is a real pain. Instead of two displays, I upped the size of the display at work to allow space for an output area, and working remotely became painless. I have a coworker who has a two-display setup at work and he is constantly complaining, reminding me of what I am not missing.
|
|
|
|
|
I'm not sure I'm following. Are you saying you have two displays at work, but prefer a single-display setup when working remotely?
|
|
|
|
|
Yes. The problem with two at work and one at home is that anytime you move your mouse against the side where the second monitor sits, the remote screen on your local display scrolls in that direction. If the mouse move was just a mouse move, you then have to scroll back to where you were working--a delay and a break in your train of thought.
|
|
|
|
|
Let me clarify a bit more. I ONLY have a single large display at home. Having dual displays at work is what caused the problem; replacing the two smallish displays with one larger display at work solved the scrolling issue. I have no room at home for a second display, which would also have solved the problem. A 27" display was < $200 USD, so it was a solution the owner approved of.
|
|
|
|
|
Right...I despise working on a laptop because I'm used to multiple displays, so I find one display cramped to the point of being unusable...but I don't feel the solution is to stick with one display everywhere for the sake of a consistent experience no matter where I am. If that's what you're saying.
And are you saying 27" is "large"? What's the pair of "smallish" displays that you got replaced at work?
|
|
|
|
|
I offered this as an alternative, not as the end-all. My dual displays were a 21" and an old 19", so 27" is large. The company I work for has 25 employees, so being the third developer means cast-offs.
I read the other responses and saw many bits of wisdom and offered this as an alternative that might be useful to you, not as an insult. What works for me might not work for you, but I thought you might consider it if it helped.
|
|
|
|
|
Well...size is not all that matters; resolution matters too.
I have a 40" 4K display as my primary monitor, and I'll make the claim that it's a better setup than the 4 1080p monitors of equivalent total resolution would be. That being said, when I got it, I did not get rid of the two 1080p monitors I already had.
|
|
|
|
|
Well, that depends on how you're accessing the remote machine.
For example, with MS's RDS, your displays are driven by the remote machine.
With Google's, the resolution and number of monitors are driven by the host machine.
Horses for courses.
|
|
|
|
|
Funny - it was only a couple of weeks ago that I found out none of my remote team in India and Bangladesh were using dual displays, so I had them all go out and buy large monitors so they could have two monitors - their laptop screen and their external monitor.
Then, 2 days ago, I was on a remote session with one of my guys and it looked like he was using only his laptop screen. He assured me that he was using his beautiful large monitor (32 inches), but the resolution looked terrible, and it was clear that he was on a single screen.
Turned out, he had dutifully plugged in his monitor and then closed his laptop, using an external mouse and keyboard--forcing the external monitor to the same crummy resolution as the laptop. We went through the steps to show him how to use both screens simultaneously, which roughly tripled his working desktop space: double resolution on the external monitor plus the original laptop screen. We put the debug window on the laptop, ran his program, and instantly found that his web system was making an extraneous SQL call for every action, plus a separate bug where a server call was being made when the data was already in memory.
I cannot overstate the benefit of being able to have a live console output/debug log for applications, especially web apps.
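On that note, a few lines of instrumentation are often all it takes to make redundant calls jump out in a console log. A minimal, hypothetical Python sketch--the `log_queries` wrapper and the stand-in `execute` function are mine for illustration, not from any particular framework:

```python
import functools
import logging
import sys

logging.basicConfig(stream=sys.stderr, level=logging.DEBUG,
                    format="%(asctime)s %(message)s")

def log_queries(execute):
    """Wrap a DB execute function so every statement is echoed to the console,
    with a running count per statement -- duplicates become obvious."""
    counts = {}

    @functools.wraps(execute)
    def wrapper(sql, *args, **kwargs):
        counts[sql] = counts.get(sql, 0) + 1
        logging.debug("SQL (call #%d this run): %s", counts[sql], sql)
        return execute(sql, *args, **kwargs)

    wrapper.counts = counts  # expose the tally for inspection
    return wrapper

# Usage with a stand-in execute function:
@log_queries
def execute(sql):
    return []  # pretend result set

execute("SELECT * FROM orders WHERE id = 1")
execute("SELECT * FROM orders WHERE id = 1")  # the running count reveals the repeat
```

Watching counts climb live while clicking through the app is exactly how an extra query per action gets spotted.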
And for me, having used double/triple monitors for years, I don't care what the outputs are--VGA, DVI, HDMI, DisplayPort. My primary tasks are writing and coding, so speed is of very little concern to me. Once I needed a dual-link DVI cable to get full resolution on my ultra-high-res monitor, but that was it.
<hr>
"Qulatiy is Job #1"
|
|
|
|
|
That's an "attaboy" post if I ever saw one.
Agreed wholeheartedly. I've been using at least dual displays for over a decade, and there's just no going back. A laptop is just something to get by with when I'm away from my desk; there's no way I could sit down and try to do something useful with one display.
|
|
|
|
|
David Carta wrote: I had them all go out and buy large monitors so they could have two monitors
Wow, your employees are very lucky. Most employers, in my experience, would require severe grovelling before agreeing to budget for new hardware like that.
|
|
|
|
|
Probably a benefit of having a CEO who started the company as the primary developer.
IMO, hardware expenditures, even extravagant ones, are so minuscule compared to an employee's salary that I generally don't dicker when an employee needs better hardware, whatever it might be: computer, RAM, SSD, monitors, software, etc. It makes the employee happy and usually more productive, far outweighing the costs.
Even when some of your dev team is in relatively cheap places like India or Bangladesh!
<hr>
"Qulatiy is Job #1"
|
|
|
|
|
I think you'll do well by adding a video card that has a DisplayPort (DP) output. With DP you can daisy-chain 2 or more monitors with DisplayPort cables. You'll also need monitors that support DP (specifically DP 1.2 MST, with a DP output for the chain). However, not all video cards support more than 2 monitors, so verify that first.
My PC has an i5-9600K CPU, and its integrated graphics supports 3 monitors quite well. I'm using 3 Dell U2415's arranged in a semi-circle.
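One caveat with daisy-chaining: every monitor in the chain shares the bandwidth of the single DP link back to the card, so it's worth a quick feasibility check before buying. A rough sketch--the 17.28 Gbit/s figure is the effective DP 1.2 (HBR2) payload rate, and the 1.2 factor is my approximation for blanking overhead, so treat the results as ballpark only:

```python
DP12_GBPS = 17.28  # effective DisplayPort 1.2 (HBR2) payload bandwidth, Gbit/s

def fits_dp12(monitors, bits_per_pixel=24, blanking_overhead=1.2):
    """Rough check: can these (width, height, refresh_hz) modes share one DP 1.2 link?"""
    needed_gbps = sum(w * h * hz * bits_per_pixel * blanking_overhead
                      for (w, h, hz) in monitors) / 1e9
    return needed_gbps <= DP12_GBPS

# Two QHD (2560x1440) 60 Hz monitors fit on one chain:
print(fits_dp12([(2560, 1440, 60), (2560, 1440, 60)]))
# Two 4K 60 Hz monitors exceed the link budget:
print(fits_dp12([(3840, 2160, 60), (3840, 2160, 60)]))
```

This is why chained setups often work fine at 1440p but force 4K monitors down to 30 Hz further along the chain.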
|
|
|
|
|
At the onset of the pandemic the company sent everyone home to work, but didn't let us take any monitors home. The LG 29" in the bedroom was commandeered as my second monitor--HDMI port from the laptop. Couldn't have been easier.
|
|
|
|
|
David,
I did that with the Monochrome Monitor. Wow... Norton Guides... [I Created NG for the Windows.h files]
Anyways, USB-C is good enough. It all depends on where you want the monitor.
I went from 2 monitors, to 3, and recently transitioned to a 4K 55" TV (the equivalent of 4 monitors on one screen).
It sits on my desk; my email lives in the bottom-right quarter. I use this plus multiple virtual desktops.
It's so much easier/cleaner than 4 monitors, with no gaps/seams.
BUT for you, with a desktop machine: just add another video card. If you end up not using it, the cost was low. If you use 1920-type resolution and travel (I use a laptop), consider what I did: I have 2 external/portable USB-type monitors. Nice and thin, they all fit in the computer bag, so I can set up 2 or 3 monitors wherever I am.
The ONE downside to the huge monitor: it's a postage stamp on someone else's screen if I share the whole thing... LOL.
Oh, and make sure you don't have to LOOK UPWARDS all day. It can throw your "bite" off, like TMJ...
|
|
|
|
|
Curious to know what portable USB type monitors you prefer.
<hr>
"Qulatiy is Job #1"
|
|
|
|
|
I have an AOC and a Colzer; I prefer the latter. It comes with a tablet-type "casing" that unfolds into a stand, uses USB-C for everything (power and video), and has easy brightness adjustments.
The AOC requires special software to adjust any of its internal settings; I've never bothered.
So if you love to tinker with Contrast/Brightness, etc... Keep that in mind when looking.
For me, they are viewing devices "under duress"--I'm usually flying out somewhere trying to solve some complicated problem. They're a fallback. I often get put in a conference room where I hijack the projectors, etc.
I have connected it up, to be able to constrain a presentation...
My next one will be touch screen...
|
|
|
|
|
Buy a dedicated video card with its own RAM. Most integrated GPUs use main system RAM, which takes away from what's available for everything else. You don't need to go high-end; a mid-range board is sufficient for non-gaming needs.
A few years ago I purchased Skyrim and my old (low-end) card couldn't handle it. I purchased a mid-range card, which fixed that problem. It also sped up the system overall, which surprised me at the time, but it makes sense: the board provides dedicated hardware and reduces the load on the CPU.
|
|
|
|
|