1 - I wasn't calling ob_flush() inside the while loop. Adding it freed up a lot of memory, enabling larger downloads, but not unlimited ones.
For example, with memory_limit = 128M I could now download more than 40 MB; in fact I could get up to around 200 MB, but that is where it failed again. Still, the first memory problem was solved.
LESSON 1: Flush your output buffers!
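The pattern above can be sketched roughly like this (the file path, chunk size, and function name are illustrative, not from the original code):

```php
<?php
// Hypothetical sketch: stream a large file to the client in small chunks,
// flushing PHP's output buffer after each one so memory use stays flat
// instead of accumulating the whole download in the buffer.
function stream_file(string $path, int $chunkSize = 8192): void
{
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');

    $fp = fopen($path, 'rb');
    while (!feof($fp)) {
        echo fread($fp, $chunkSize); // emit one chunk
        if (ob_get_level() > 0) {
            ob_flush();              // push PHP's buffer down a level...
        }
        flush();                     // ...and the SAPI buffer to the client
    }
    fclose($fp);
}
```

The `ob_get_level()` guard just avoids a notice when no output buffer is active.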
2 - I was using mysql_query to retrieve the results of my SQL query. The problem is that it buffers the entire result set in memory, and this was adding to my memory-limit issue.
I ended up using mysql_unbuffered_query instead, and this now works flawlessly.
It does come with a limitation, though: the result set stays open on the server while you read it, which can keep the table locked until you have fetched everything.
LESSON 2: Don't buffer MySQL results if you don't need to! (within programmatic limitations)
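For reference, the old `mysql_*` functions are gone in modern PHP, but mysqli offers the same unbuffered behaviour via the `MYSQLI_USE_RESULT` result mode. A minimal sketch, assuming hypothetical credentials and a `products` table:

```php
<?php
// Sketch: stream rows from the server one at a time instead of buffering
// the whole result set into PHP memory first (the mysqli equivalent of
// mysql_unbuffered_query). Credentials and table name are placeholders.
$db = new mysqli('localhost', 'user', 'pass', 'shop');

$result = $db->query('SELECT id, name, price FROM products', MYSQLI_USE_RESULT);
while ($row = $result->fetch_assoc()) {
    // Process one row at a time; memory stays flat regardless of row count.
    echo $row['id'], ',', $row['name'], ',', $row['price'], "\n";
}
$result->free();   // must free the result before issuing another query on $db
$db->close();
```

The trade-off is the same one described above: until the result is freed, the connection is busy and the server keeps the result open.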
All of these fixes work; however, they need some more testing to ensure there are no problems with the combination of them.
I have also learned a lot more about objects and PHP memory allocation. I just wish there were a way to visually debug the process a little better than what Xdebug offers; if anyone has ideas on how Xdebug could have shed more light on this, please let me know in the comments.
In the email header you are setting the MIME content type as HTML using the ISO-8859-1 character set. That's the Latin-1 set, which might explain why the Arabic is coming out wrong.
The simplest solution would be to change the header to use "charset=utf-8" instead of iso-8859-1, and also to check that your application uses UTF-8 throughout (in HTTP headers, on the web page, in form fields, database fields, etc.).
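A minimal sketch of the header change (addresses and body are placeholders, not from the original question). Note that a non-ASCII subject line needs its own MIME "encoded-word" encoding, separate from the body charset:

```php
<?php
// Sketch: sending HTML mail declared as UTF-8 rather than ISO-8859-1.
$htmlBody = '<p>مرحبا</p>';   // placeholder Arabic content

$headers  = "MIME-Version: 1.0\r\n";
$headers .= "Content-Type: text/html; charset=utf-8\r\n";
$headers .= "From: sender@example.com\r\n";

// Non-ASCII subjects must be MIME-encoded separately:
$subject = '=?UTF-8?B?' . base64_encode('مرحبا') . '?=';

// mail('recipient@example.com', $subject, $htmlBody, $headers);
```

The `mail()` call is commented out here since it depends on a configured mail transport.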
The problem we would like your help in solving is as follows:
We have an Excel sheet containing some data.
We want to import this data into a MySQL database to be used by a certain application.
After converting the sheet to CSV format, the data was imported successfully, but the Arabic text is not displayed in the application, even though it displays correctly when browsing the database directly in MySQL.
If the data is entered through the application and then retrieved, the Arabic text is displayed correctly.
We need to import this data (the Excel sheet) into MySQL so that it is displayed correctly inside the application.
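A common cause of this symptom (data readable in the database but garbled in the app, while app-entered data round-trips fine) is a connection-charset mismatch between the import step and the application. A hedged sketch, with placeholder credentials, file path, and table name:

```php
<?php
// Sketch: make the import use the same connection charset the application
// uses. If the app writes/reads as utf8mb4 but the CSV was imported under
// a different connection charset, the stored bytes won't match.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$db->set_charset('utf8mb4');   // connection charset must match the data

// If importing the CSV via LOAD DATA, declare the file's charset too
// (requires local_infile to be enabled on both client and server):
$db->query("LOAD DATA LOCAL INFILE '/tmp/sheet.csv'
            INTO TABLE products
            CHARACTER SET utf8mb4
            FIELDS TERMINATED BY ','");
$db->close();
```

It is also worth confirming the table columns themselves are declared with a UTF-8 collation, and that the CSV was actually saved as UTF-8 when exported from Excel.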
I traditionally live in the client-app world, where minimizing lines of code matters more than performance, so my coding style leans heavily on common functions, etc.
Now that I am working on a web app (PHP and MySQL), I find myself second-guessing everything. Here is an example. I am working on a shopping cart for an existing site. There are multiple places where the price of a particular product is returned:
Cart overview
Buy process page
In most cases, the product data (price) is already available, but the price can vary under different circumstances. I'm considering one function that gets passed the relevant variables, calculates the price of the product, and returns it. However, doing so requires querying the database every time a price is requested (in some cases that means multiple prices on one page, so multiple database calls while the page is loading).
So my question is really one of best practice. How much of a noticeable performance hit does each call to the database take? Is it common to try to reduce the number of MySQL calls, or do you not worry about it? Should each page have its own code, copy-pasted everywhere, with each place that shows common data tweaked individually?
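One middle ground between "one function that hits the database every time" and "copy-paste everywhere" is a per-request memoised lookup. A sketch under stated assumptions: the function and loader below are illustrative, with the loader standing in for the real `SELECT price FROM products WHERE id = ?` query:

```php
<?php
// Sketch: memoise price lookups for the lifetime of one page request.
// The first call per product invokes $loader (the real DB query); repeat
// calls on the same page are served from the static cache, so showing the
// same price in several places costs only one database round-trip.
function get_price(int $productId, callable $loader): float
{
    static $cache = [];
    if (!isset($cache[$productId])) {
        $cache[$productId] = $loader($productId); // one DB hit per product
    }
    return $cache[$productId];
}
```

This keeps the single-function design (one place where pricing logic lives) while bounding the number of queries per page at one per distinct product.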
There are a lot of how-to questions (I have posted some myself), but I am hoping the community here can offer some higher-level theory.
Thank you in advance!
First of all: if you can create a common function that will be used by multiple pages, then make it one function.
From my experience: building software in PHP often helps a lot, especially with development time. But I have run into cases where I could not depend on PHP.
I faced a problem in one of my applications. I had to calculate stock on a daily basis, and it is not a simple calculation along the lines of (inserts into stock) - sales + (returns from customers); it is more complicated than that, because we use a different kind of stock management depending on the type of product. The result was that the PHP page suffered badly: it could not show the result and in most cases ran out of memory. So I solved it in C; I rebuilt that page's logic in C.
Another example is report generation. I used the PDF-generation class mPDF (the project seems to have vanished; I can't find their site anymore), but it cannot handle big data sets without running out of memory, so now I am building those reports in C as well. My point is that PHP gives you a great way of programming, but it is not the best choice in every case.
I find that iostat on Linux gives an average disk queue length different from Windows.
Windows uses Little's Law, but Linux uses the "weighted # of milliseconds spent doing I/Os" (the last field in /proc/diskstats) divided by the duration of the sampling interval. I have searched Google but cannot find a mathematical explanation of this.
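The two definitions are closer than they look: the kernel's weighted-milliseconds field increments by (number of in-flight I/Os) × (elapsed ms), so it is effectively the integral of queue depth over time, and dividing its delta by the interval gives the time-averaged queue length, which is what Little's Law computes from arrival rate × mean residence time. A worked sketch of the Linux-side arithmetic, with made-up sample values:

```php
<?php
// Sketch of how iostat derives avgqu-sz from /proc/diskstats. Field 11
// ("weighted # of milliseconds spent doing I/Os") is a running counter;
// iostat samples it twice and divides the delta by the interval length.
// The numbers below are invented for illustration.
$weightedMs1 = 1200000;   // field 11 at the first sample
$weightedMs2 = 1212500;   // field 11 one second later
$intervalMs  = 1000;      // sampling interval in milliseconds

$avgQueueLen = ($weightedMs2 - $weightedMs1) / $intervalMs;
echo $avgQueueLen;        // 12.5 requests in flight, on average
```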