Please confirm that I understand your question correctly; look at my comment to the question.
Here is my idea: at the compiler level, it's really hard to do this optimization, and in some cases the compiler simply does not have enough information. Suppose there were a way to determine that a function is "pure". The analysis would have to check not only the body of the function itself, but also all called functions, recursively. But not all of those functions come with source code. What would the optimizer do then? As long as all the called code is .NET code, it is possible to analyze the IL. But face it: ultimately, some code will always make unmanaged calls (unless the OS is something like Singularity, a purely managed OS). And the CLI standard does not define a metadata slot carrying information on the "purity" of a function, not even for the .NET BCL.
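To illustrate the last point with a hypothetical example of mine: even a method that looks harmless bottoms out in an unmanaged call, at which point any purity analysis hits a wall:

using System.Runtime.InteropServices;

static class PurityWall {
    // A real Win32 API: its body is native code, not IL, so a hypothetical
    // purity analyzer has nothing to inspect, and CLI metadata carries no
    // "purity" flag it could trust instead.
    [DllImport("kernel32.dll")]
    static extern uint GetTickCount();

    static long NotProvablyPure(int value) {
        return value + GetTickCount(); // in fact impure: the result varies with time
    }
}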
So, those were my arguments for why I don't think such an optimization makes sense (again, please validate that I understood your idea correctly; look at my comment to the question). But in any case, you can easily find out what is going on in reality: compile with optimization, take the resulting assembly (IL code), and decompile it. This is easy to do, with very good quality, using .NET Reflector or the open-source ILSpy:
http://en.wikipedia.org/wiki/.NET_Reflector,
http://ilspy.net.
[EDIT]
Please see also my comment to the question, where I question the practical value of this optimization, given that such a function is typically called with different arguments each time.
For those rare cases where it does make practical sense (the function is really slow to compute, and calls with the same parameters, or with no parameters, are quite likely), you can easily introduce such an optimization at the application level: create a dictionary of function results, looked up by a compound key built from the input arguments.
You could have one dictionary per function (you can hardly have many such functions), so the key does not need to carry all the information about the parameters; it could be just a hash value combined from the parameter values. One caveat: two different argument sets can collide on the same hash, in which case the wrapper below would return a wrong cached result, so use this shortcut only if that risk is acceptable (otherwise, key on the full argument values). Say,
using System.Collections.Generic;

// Assume FunctionType is the return type of the function being memoized;
// the alias below is just to keep this sketch compilable:
using FunctionType = System.Object;

Dictionary<int, FunctionType> myFunctionOptimizationDictionary =
    new Dictionary<int, FunctionType>();

FunctionType MyFunction(params object[] parameters) {
    // ... the expensive calculation itself goes here ...
    return null;
}

FunctionType MyFunctionWrapper(params object[] parameters) {
    int key = 0;
    foreach (object @object in parameters)
        key ^= @object.GetHashCode(); // compound key from all arguments
    FunctionType result;
    if (!myFunctionOptimizationDictionary.TryGetValue(key, out result)) {
        result = MyFunction(parameters);
        myFunctionOptimizationDictionary.Add(key, result);
    }
    return result;
}
Simple, isn't it? You can easily make this algorithm generic; see the sketch below.
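For example, here is a minimal generic sketch for functions of one argument (the names Memoizer and Memoize are mine, purely illustrative):

using System;
using System.Collections.Generic;

static class Memoizer {
    // Wraps any one-argument function in a caching version of itself:
    public static Func<TArg, TResult> Memoize<TArg, TResult>(Func<TArg, TResult> function) {
        var cache = new Dictionary<TArg, TResult>();
        return argument => {
            TResult result;
            if (!cache.TryGetValue(argument, out result)) {
                result = function(argument); // first call with this argument: compute and remember
                cache.Add(argument, result);
            }
            return result;
        };
    }
}

Then something like var fastFunction = Memoizer.Memoize<int, long>(SlowFunction); gives you a cached version without touching the original function.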
However, a side effect of such an optimization can be overuse of memory for storing the memoized keys and values. It's always a trade-off; and this fact, too, suggests that the practical value of this optimization at the CLR level would be quite questionable.
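If memory growth becomes a problem, one crude mitigation (purely illustrative; a production cache would rather use a real eviction policy such as LRU) is to cap the dictionary used in the wrapper above:

const int maximumCacheSize = 10000; // illustrative limit, tune per application

// in MyFunctionWrapper, before adding a new entry:
if (myFunctionOptimizationDictionary.Count >= maximumCacheSize)
    myFunctionOptimizationDictionary.Clear(); // crude eviction: forget everything
myFunctionOptimizationDictionary.Add(key, result);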
I hope we can close this issue now. What do you say?
—SA