|
|
Thanks!
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
But before you try that, just try disabling Samsung TV Plus.
It can't be removed, but it can be disabled.
|
|
|
|
|
I have an LG 4k TV (my 3rd LG in a row). Display quality is excellent. I don't use it as my main monitor (its main purpose is as a TV) but as a monitor on a mini-PC I have in the lounge. There are no issues with that. (There is an option to turn off overscan for the relevant input, which is important, and it remembers the setting.)
The TV has good built-in apps for Netflix and other stuff.
It remembers the last selected source. You can create "shortcuts" on the remote (you hold down a number for 2 seconds) and use one to select a particular source.
The TV's OS and apps have been updated numerous times over the years and none of them have been intrusive. There are plenty of junk channels and apps available but they're not forced on you.
Incidentally, you can turn WiFi (and wired Ethernet) on and off easily.
Phil
The opinions expressed in this post are not necessarily those of the author, especially if you find them impolite, inaccurate or inflammatory.
|
|
|
|
|
Thank you!
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Are you wirelessly streaming your computer to the TV/monitor? I also have a Samsung 55", with a Dell Dock wired to it. No issues with it remembering the last source. I also have my Xbox connected as a separate source, and if it is the last source it comes up when I turn the TV on.
|
|
|
|
|
Is it a smart TV? This only affects recent smart TVs, AFAIK.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Yes, it is. It even has that silly Samsung Channel, among other such streaming apps.
|
|
|
|
|
Hmm, this started doing it *right after* a firmware update on the TV. I got notified of the update, and then it restarted my TV on the Samsung channel. Ever since then it has ignored my last-source setting in favor of Samsung TV. It's possible the firmware update itself didn't make it that way directly, but knocked something loose somewhere else. Someone else posted some ways to reset my TV. I have yet to try it, because I was having a terrible day yesterday and I didn't want to make it worse. Meh.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Understood. Don't mess with my settings, right? I detest it when services that think they know better screw up what I have set up...
|
|
|
|
|
All my TVs are hooked up via HDMI. My UHD monitor runs TV via a TV app on the PC, which is a (full-screen or windowed) browser app.
I assume your provider has a "TV app"?
(Yes ... I also find "smart" TVs can play dumb.)
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
|
|
|
|
|
I don't have a cable provider. I use streaming only, but I like having my remote, and I want to keep it.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
I've been using various TVs as monitors for a number of years. The big drawback is that they think they are TVs and often have some start page that they like to power up on, not the HDMI-in port that I only and always want to use.
When I enter my workroom and power up my desk, an MCU with an IR LED attached sends out commands to power on the TV, waits a bit, and then sends commands to select the HDMI port.
The Samsung TV tends to switch itself off periodically (in spite of settings telling it not to), so the MCU sends an unused button press every half hour to keep the TV awake.
At the end of my session, the power goes off and in its dying throes the MCU sends a power-off command to put the TV into standby.
If your TV has IR, this is possible. No damage to the TV, and independent of upgrades (maybe).
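Something along these lines would do it. This is only a rough sketch, assuming an Arduino-class MCU with the older IRremote 2.x API, an IR LED on the library's default send pin, and placeholder Samsung codes (capture your own TV's codes with an IR receiver first):

```cpp
// Rough sketch of the IR "babysitter" described above.
// Assumes IRremote 2.x with an IR LED on the hardware PWM send pin.
#include <IRremote.h>

IRsend irsend;

const unsigned long POWER_CODE  = 0xE0E040BF; // placeholder: power toggle
const unsigned long SOURCE_CODE = 0xE0E0807F; // placeholder: HDMI source select
const unsigned long NOOP_CODE   = 0xE0E0F00F; // placeholder: any harmless button

void setup() {
  irsend.sendSAMSUNG(POWER_CODE, 32);   // wake the TV
  delay(8000);                          // give it time to boot
  irsend.sendSAMSUNG(SOURCE_CODE, 32);  // jump to the HDMI input
}

void loop() {
  delay(30UL * 60UL * 1000UL);          // every half hour...
  irsend.sendSAMSUNG(NOOP_CODE, 32);    // ...poke the TV so it stays awake
}
```

The power-off-at-shutdown part needs a bit of hold-up capacitance (or a separate always-on supply) so the MCU survives long enough to send the standby command.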
Andy
|
|
|
|
|
That's an interesting widget you made and a great potential IoT project for this site.
My TV doesn't go to sleep until my PC does. In fact, before the firmware update I had very little trouble with it aside from the inconvenience of navigating an on-screen menu to switch sources, and the cheapo directional pad on the remote, which is erratic about registering presses.
I'm not sure if the Samsung smart TV remotes are IR. There is no visible IR transmitter on the widget, but I can control the TV over WiFi, badly, with my phone, except powering it on, which makes it kind of useless.
I'm probably getting an LG. It has source buttons on the remote, and nobody I've talked to has had any problems with them (other than old, eventually defective units) that would prevent their use as a monitor.
Plus the 120Hz OLEDs are supposed to be visually fantastic, although that's almost certainly true of many other brands as well.
I found one for $1,100 USD on Amazon in the 55" size I want/need for my mount. I'll buy locally though, even if I pay a bit more, since I'm averse to shipping those things more than necessary.
Thanks for your input!
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Gold providing cold ? hole (7)
Gold: Or
providing: if
cold ?: ice
hole: Orifice
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
|
|
|
|
I got hung up on Gold ... Au and never thought of French.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
I think OR is from the heraldic name for gold, Or (also oro in Italian), very commonly used in cryptics.
Or
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
|
|
|
|
I had orifice, but I couldn't make "if" stick.
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
Do you think the clue was poorly written, Peter? I did think of using "maybe" for "if", but it didn't read well.
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
|
|
|
|
Probably a bit clumsy, but no worse than many of mine.
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
Edit: To be clear, I'm talking about user-facing machines rather than servers or embedded systems, and about a hypothetical ideal. In practice CPUs need about 10% off the top to keep their scheduler working, for example, and there are a lot of details I'm glossing over in this post, so it would be a good idea to read the comments before replying. There has been a lot of ground covered since.
When your CPU core(s) aren't performing tasks, they are idle hands.
When your RAM is not allocated, it's doing no useful work. (Still drawing power though!)
While your I/O is idle, it could be preloading something for you.
I see people complain about resource utilization in modern applications, and I can't help but think of the above.
RAM does not work like non-volatile storage, where it's best to keep some free space available. Frankly, in an ideal world, your RAM allocation would always be at 100%.
Assuming your machine is performing any work at all (and not just idling), ideally it would do so utilizing the entire CPU, so it could complete quickly.
Assuming you're going to be using your machine in the near future, your I/O may be sitting idle, but ideally it would be preloading things you're planning to use, so they could launch faster.
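To make that last point concrete, here's a tiny sketch of the kind of hint an application can give, assuming a POSIX system and a made-up file name:

```cpp
// Sketch only: ask the kernel to pull a file into the page cache while
// the app is otherwise idle, so the first real read is served from RAM.
// "big_asset.dat" is a placeholder; any file you expect to need soon works.
#include <fcntl.h>
#include <unistd.h>
#include <cstdio>

int main() {
    int fd = open("big_asset.dat", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    // Hint: we will need the whole file soon; start reading it ahead of time.
    posix_fadvise(fd, 0, 0, POSIX_FADV_WILLNEED);

    // ... later, when the data is actually needed, the reads hit warm cache
    // instead of waiting on the disk.
    close(fd);
    return 0;
}
```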
My point is this:
Utilization is a good thing, in many if not most cases.
What's that old saw? Idle hands are the devil's playground. Your computer is like that.
I like to see my CPU work hard when it works at all. I like to see my RAM utilization be *at least* half even at idle. I like to see my storage ticking away a bit in the background, doing its lazy writes.
This means my computer isn't wasting my time.
Just sayin'
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
modified 8-Sep-23 9:15am.
|
|
|
|
|
I recall reading a short essay years ago by a senior OS engineer (Microsoft or Apple, not sure) that said much the same. It makes good sense IMO.
Thanks for the reminder.
|
|
|
|
|
I think you didn't think that through to the end...
If any single piece of software took up all the resources, it would kill any real productivity...
Let's say VS takes all the memory just from opening a solution... Now I ask it to compile that solution... VS - by default, IIRC - will compile 8 projects in parallel, so it will try to fire up 8 instances of msbuild... But there is no memory left, so before each of those 8 instances can run, the OS has to swap... And the swap for the 4th instance of msbuild may take memory from the 1st instance, as it may be blocked on I/O and considered inactive... And memory swapping is very expensive...
I do agree that any app should utilize all the resources it needs, but it should also release them the moment it no longer needs them...
"If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization." ― Gerald Weinberg
|
|
|
|
|
I did. I said in an ideal world RAM utilization would always be at 100%. That's a hypothetical. It's not intended to be real world, but rather illustrative of a point: RAM is always drawing power, even at idle. The most efficient way to use it is to allocate it for something, even if you do so ahead of time.
I did not say that it would or even should be utilized by one application.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
If I may interject: memory is always used at 100% by an app called the "operating system". Parts that are not urgently needed are relinquished to other apps upon request.
In the scenario Peter pointed out, how is VS going to know how much memory the MSBuild instances are going to need? Should they ask VS pretty please to release the memory? Is VS going to act as some kind of surrogate OS?
Memory hogging is not a disease of VS only; it's a virus that has spread to browsers and many others.
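You can see this on any Linux box with a quick sketch like the one below (assuming /proc/meminfo is available): MemAvailable is usually far larger than MemFree, because most of the difference is cache the OS will give back on demand.

```cpp
// Quick illustration: print how much RAM is "free" vs. actually available.
// The gap is mostly page cache the OS relinquishes when apps ask for memory.
#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream meminfo("/proc/meminfo");
    std::string line;
    while (std::getline(meminfo, line)) {
        if (line.rfind("MemTotal", 0) == 0 ||
            line.rfind("MemFree", 0) == 0 ||
            line.rfind("MemAvailable", 0) == 0 ||
            line.rfind("Cached", 0) == 0) {
            std::cout << line << '\n';
        }
    }
    return 0;
}
```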
Mircea
|
|
|
|