Not that I have come across. The best way to write valid XHTML is to take full advantage of layout: be very strict about tabs and indentation, and pretend you are writing XML (which you essentially are). Dreamweaver has a reasonably good tool, but I would not rely on it.
Another good possibility is the many Firefox add-ons, which are free and often quite good.
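The "pretend you are writing XML" advice can be checked mechanically: any well-formed XHTML page must parse as XML. As a minimal sketch (in Python, just to illustrate the idea; this checks well-formedness only, not validity against the XHTML DTD), the standard library's XML parser will reject markup with unclosed tags:

```python
import xml.etree.ElementTree as ET

def is_well_formed(markup: str) -> bool:
    """Return True if the markup parses as well-formed XML."""
    try:
        ET.fromstring(markup)
        return True
    except ET.ParseError:
        return False

# An unclosed <br> is tolerated in HTML but is not well-formed XML/XHTML:
print(is_well_formed("<p>hello<br></p>"))    # False
print(is_well_formed("<p>hello<br /></p>"))  # True
```

The same strictness that the parser enforces here is what the careful tabbing and layout habit is meant to encourage in the first place.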
- Christian Graus on "Best books for VBscript"
A big thick one, so you can whack yourself on the head with it.
I have to play videos in my aspx pages, but I can't find any control for that. Is there a control I can download into my application to play videos? In HTML I know how to use the media player control, but I need to code it in C#.
Christian Graus - Microsoft MVP - C++ Metal Musings - Rex and my new metal blog
I have to implement CSS in my C# ASP.NET web application.
I made a CSS class and used it in my aspx form, but the formatting is applied only to the background of the form; formatting on web controls like buttons and labels is not applied. Can anyone guide me to solve this problem?
This is Ashok Varma V. S. working with Cibernet, India.
I find you to be the right person to answer my question.
I have XSD, XML files and want to validate the XML against the XSD and then only proceed with parsing the XML file.
I used the XML::SAX::ParserFactory and XML::Validator::Schema modules, but these are the issues I faced:
1. Validation failed with this reason: elementFormDefault in <schema> must be 'unqualified'; 'qualified' is not supported.
I modified the 'elementFormDefault' attribute value to 'unqualified' and it worked (though in practice I should not have to do this).
2. The module that I am currently using (given below in the code snippet) does not support the <import> tag.
Can anyone please suggest a better module which handles the above?
Code snippet I used:
use XML::SAX::ParserFactory;
use XML::Validator::Schema;
my $xsd_file = '/Users/vashokvarma/EID/xmlfile/EID.xsd';
my $xml_file = '/Users/vashokvarma/EID/xmlfile/EID.xml';
my $validator = XML::Validator::Schema->new(file => $xsd_file);
my $parser = XML::SAX::ParserFactory->parser(Handler => $validator);
eval { $parser->parse_uri($xml_file) };   # dies on validation errors
die "Validation failed: $@" if $@;
Is there a way to do the following on a PHP web page hosted on an Apache server?
1) Limit the number of users that can download a given file at any given time
2) Throttle the download speed so that, coupled with the concurrent-user limit described above, it would effectively reduce the total bandwidth used per day
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997 ----- "...the staggering layers of obscenity in your statement make it a work of art on so many levels." - Jason Jystad, 10/26/2001
I'm no expert in PHP or Apache, but the following could be one way:
Firstly, the URL for the file should point to a PHP script.
Use a text file containing simply a number. Each time the script runs, it opens the file at the start, checks the number, and if it's over the limit, script execution ends. If not, it increments the number and continues (at this point you'd also have to close the file handle so that other requests can open it).
Set the Content-Type header to the type of file so the browser handles it correctly. Also set the content length to the size of the file.
To throttle the download speed, open the file to download in the script and read a limited number of bytes at a time (I think the maximum you can actually read at once is 4096, though I'm not sure about that). Flush those bytes to the response, then suspend the script for a few hundred milliseconds, and repeat until the whole file has been sent. I'm not 100% sure how this should be done to support several simultaneous requests; it might be a case of reading the whole file at once into a byte array and closing the file handle before sending any response (of course, this depends on the size of the file and the amount of memory available).
Open the text file again and decrement the number.
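The read-a-chunk-then-sleep loop described above can be sketched as follows. This is in Python purely to illustrate the technique (the original answer is about PHP); the chunk size and delay are arbitrary example values, not numbers from the post, and `write` stands in for whatever sends bytes to the client:

```python
import time

CHUNK_SIZE = 4096   # bytes sent per burst (arbitrary example value)
DELAY = 0.2         # pause between bursts, in seconds (arbitrary)

def send_throttled(path, write, chunk_size=CHUNK_SIZE, delay=DELAY):
    """Stream a file through `write` in small bursts with a pause
    between them, capping throughput at roughly chunk_size/delay
    bytes per second."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break          # end of file
            write(chunk)       # flush this burst to the client
            time.sleep(delay)  # throttle before the next burst
```

One caveat the counter-file idea glosses over: two requests can read the same count simultaneously and both decide they are under the limit, so the counter file needs an exclusive lock (e.g. flock in PHP) while it is read, incremented, and written back.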
I have been in an argument at work involving this and I just want other people to hopefully back me up, but if I am wrong PLEASE let me know.
The current belief is that if you check for enabled scripting (denying anyone who has it turned off) and disable right-click and Ctrl-N, you can prevent users from seeing the source of the page. This matters because, as I mentioned, information is being stored in hidden variables instead of session variables. I am pushing for this change, but for the time being that is the mindset. What I am asking is: will this work or not, and if not, what can I tell these guys to convince them that they are wrong?