You have to check with the hosting service how MySQL is configured. You have two main options:
1. MySQL is accessible only from applications running on the hosted server.
2. MySQL is made public and you can use it from the outside world.
In my experience it is always the first option you get.
But even in this case you can access your data from outside, by adding a software layer, some kind of API, on top of it.
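A hedged sketch of such an API layer, assuming a small Python service deployed on the hosting server next to the database. The table name, data, and port are invented for illustration, and sqlite3 stands in for the real MySQL driver so the sketch runs anywhere; on the server you would swap in your MySQL connector and credentials.

```python
# Sketch of a thin read-only HTTP API that runs on the hosting server,
# so outside clients talk to the API instead of to MySQL directly.
# sqlite3 is a stand-in for the real MySQL driver (an assumption made
# so this example is self-contained); the table and rows are invented.
import json
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

DB = sqlite3.connect(":memory:", check_same_thread=False)
DB.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
DB.execute("INSERT INTO items (name) VALUES ('alpha'), ('beta')")

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/items":
            rows = DB.execute("SELECT id, name FROM items").fetchall()
            body = json.dumps(
                [{"id": r[0], "name": r[1]} for r in rows]
            ).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep the sketch quiet

# To serve for real: HTTPServer(("0.0.0.0", 8080), ApiHandler).serve_forever()
```

Since the API itself runs on the hosted server, it connects to MySQL locally (option 1 above), while the outside world only sees plain HTTP. Remember to add authentication before exposing anything like this.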
"As I understand it, your MySQL database is hosted somewhere by a professional hosting company. These companies won't just allow outside connections, even if you know your DB login credentials."
Yes, you are right ... and yes, I already added my IP to the server whitelist, but it's not working! The solution given by the server administrators was to buy VPS hosting on their server, or a dedicated server ... for me, at this stage, that is not worth it ... I wonder if there is another workaround?
I have developed a Windows service and deployed it to the server. The service runs on my local machine without any issues, but on the server it stops abruptly. Below are some errors I captured in the Event Viewer:
Faulting application name: MyAppService.WindowsService.exe, version: 18.104.22.168, time stamp: 0x5537a89d
Faulting module name: clr.dll, version: 4.0.30319.18063
Exception code: 0xc00000fd
Faulting application path: D:\App\MyAppService.WindowsService.exe
Faulting module path: C:\windows\Microsoft.NET\Framework\v4.0.30319\clr.dll
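Exception code 0xc00000fd is STATUS_STACK_OVERFLOW: each thread has a fixed stack, and in a .NET service the usual culprit is unbounded or cyclic recursion. Since the service's source isn't shown, here is a minimal illustration of the failure mode in Python (the function name is invented; Python surfaces the overflow as a `RecursionError` instead of killing the process):

```python
# Illustration of the failure mode behind 0xc00000fd (stack overflow):
# a recursive call with no reachable base case exhausts the stack.
def find_parent(node):
    # Bug: if the data has a cycle, or the base case is wrong,
    # this never terminates and each call consumes another stack frame.
    return find_parent(node)

try:
    find_parent(object())
except RecursionError as exc:  # Python converts the overflow to an exception
    print("stack exhausted:", type(exc).__name__)
# prints: stack exhausted: RecursionError
```

In the crashing service, look for recursive methods, re-entrant event handlers, or a property getter that calls itself; a crash dump opened in WinDbg with `!clrstack` will show the repeating frames.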
We have a production environment with three nodes behind a load balancer. The nodes are Windows 2008 R2 servers with IIS 7.5, and the web application is installed on all three. Recently, many users have had issues working with the web application, such as:
They get logged out (I think the session is lost)
The application automatically takes the user to a different page from the one they are working on
Sometimes the CSS and styles do not load completely
This is turning out to be critical in our case, since we have more than 200 live users working with the system and they are all experiencing problems. At first we thought the issue was with the load balancer, so we had users hit a server directly, bypassing the load balancer, but we saw the same issues there.
Symantec 11 was installed on these servers and everything was working fine. Recently we upgraded to 12.1.3001.165.
Is this related to Symantec 12.1.3001.165?
I enabled failed request tracing and found a few errors like:
Event: MODULE_SET_RESPONSE_ERROR_STATUS (Module Name: IIS Web Core)
Event: MODULE_SET_RESPONSE_ERROR_STATUS (Module Name: DynamicCompression)
Could anyone help us fix the above issues? Any help will be appreciated.
Lost sessions are usually caused by combining in-process session state with multiple processes or servers. If it's still happening without the load balancer, check the "Maximum Worker Processes" setting for your AppPool (Advanced Settings -> Process Model). Anything other than "1" means you have a "web garden", with multiple processes serving requests for the same application.
Maximum Worker Processes = 1
The same machine key is specified in all the servers.
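For reference, keeping sessions out of the worker process (so a recycle or a second node doesn't drop them) and sharing a machine key are both configured in web.config. This is only a sketch; the state-server host and the key values are placeholders, not taken from the question:

```xml
<!-- Sketch only: StateServer (or SQLServer) mode keeps session state out of
     the worker process, so a process recycle or failover doesn't lose it. -->
<system.web>
  <sessionState mode="StateServer"
                stateConnectionString="tcpip=state-server-host:42424"
                timeout="20" />
  <!-- Explicit keys, identical on all three nodes; values are placeholders. -->
  <machineKey validationKey="[64+ hex chars, same on every node]"
              decryptionKey="[48 hex chars, same on every node]"
              validation="SHA1" decryption="AES" />
</system.web>
```

If the machine keys already match and worker processes are at 1, in-process session loss points at app-pool recycles; check the recycle events in the System event log next.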
CSS works fine, but when the user does a Ctrl+F5 to reload, sometimes (very rarely) the CSS does not load and the site looks odd without the styles.
I am not sure if the session is getting lost, IIS is crashing, or Symantec Endpoint is blocking something... it's a bit weird.
Management wants people to use remote desktop applications to access testers (multi-million-dollar units located in factories throughout the world) for debugging; for example, people in the US can use the testers in Asia during non-production hours. The goal is to save costs by maximizing tester utilization and minimizing travel.
Management reports that people have complained that they do not want to do this because the remote desktop connection is slow. I have asked many people who do this whether there is an issue, and they generally do not report any, but management insists there is.
Management has therefore asked me to review the remote application software used (currently VNC and Microsoft Remote Desktop) and quantify, with data, the performance of these or other options.
My initial thought was to capture screen frame rates using something like Fraps, but Fraps does not seem to catch many non-game applications, and installing games on the testers is probably not going to fly.
So, given the background, does anyone have thoughts on how I might measure this?
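One cheap, scriptable proxy for "the connection is slow" is network round-trip time to the remote host's RDP or VNC port, sampled repeatedly and summarized. It won't capture frame rate, but it does quantify the latency users feel on every click. A hedged sketch (host, port, and sample count are placeholders you would point at a real tester):

```python
# Sketch: sample TCP connect round-trip times to a remote host's
# RDP (3389) or VNC (5900) port and summarize them, as a rough,
# repeatable proxy for remote-session responsiveness.
import socket
import statistics
import time

def sample_connect_latency(host, port, samples=20, timeout=2.0):
    """Return a list of successful TCP connect times in milliseconds."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                pass
        except OSError:
            continue  # skip failed attempts rather than abort the run
        times.append((time.perf_counter() - start) * 1000.0)
    return times

def summarize(times):
    """Return (min, median, max) in ms, or None if nothing succeeded."""
    if not times:
        return None
    return min(times), statistics.median(times), max(times)

# Example (placeholder host): summarize(sample_connect_latency("tester-asia-01", 3389))
```

Run the same script from each site against each tester and you get comparable numbers to show management; medians much above ~100-150 ms would support the "it feels slow" complaints, since interactive remote-desktop traffic pays that latency constantly.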
We have remote desktop servers installed at the office for remote programmers to use.
I did a lot of research on this, and there is special hardware that optimizes the remote workstation experience.
Plus, you can set up a hypervisor and run several virtual machines that serve as remote workstations, so you can load up two or three of them in the same box. For graphics work, you need one hell of a GPU, like an NVIDIA Quadro card, for fast processing.
Most slow remote sessions are caused by the host not being able to run fast enough to send the remote screen data across the wire and back.
I know of a lot of companies in the Middle East whose employees remote in from the UAE into Saudi Arabia, so they don't have to live there to work.
I just bought a Dell Solution and let my friend set it up for me, and it works fine. No complaints so far from my remote contractors.
As far as speed testing goes, I did planning first, so it wasn't really an issue.