Hi,
What is the difference between Decimal and decimal in C#?
Which one should I use to declare a variable?
Thanks
They're the same thing. Use whichever one you prefer.
 
 
Comments
arkiboys 26-Sep-11 10:13am    
Thanks
Decimal is the CLR type (System.Decimal) and decimal is the C# keyword that aliases it.

They are the same thing as far as your program is concerned: the compiler maps the decimal keyword to System.Decimal behind the scenes.
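For example, here's a quick console sketch (the variable names are just for illustration) showing that the two spellings are interchangeable:

using System;

class Program
{
    static void Main()
    {
        decimal a = 1.5m;          // declared with the C# keyword
        Decimal b = 2.5m;          // declared with the CLR type name
        System.Decimal c = a + b;  // fully qualified; still the same type

        Console.WriteLine(c);           // 4.0
        Console.WriteLine(c.GetType()); // System.Decimal, whichever spelling you used
    }
}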
 
 
Comments
arkiboys 26-Sep-11 10:13am    
Thanks
To declare a variable, use the decimal keyword by convention; it denotes the same 128-bit data type, System.Decimal.
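For instance, a minimal sketch (the variable name is made up) of declaring a decimal; the m suffix marks a decimal literal, and sizeof confirms the 16-byte (128-bit) size:

using System;

class Demo
{
    static void Main()
    {
        decimal price = 19.99m;              // m suffix makes the literal a decimal
        Console.WriteLine(price);            // 19.99
        Console.WriteLine(sizeof(decimal));  // 16 bytes = 128 bits
        Console.WriteLine(decimal.MaxValue); // 79228162514264337593543950335
    }
}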
 
 
Comments
Anuja Pawar Indore 26-Sep-11 10:06am    
Refer to http://msdn.microsoft.com/en-us/library/1k2e8atx(v=VS.71).aspx
arkiboys 26-Sep-11 10:14am    
Thanks
To be more precise:

Both are the same; decimal is a C# alias for System.Decimal.
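You can verify the alias yourself with a quick sketch:

using System;

class AliasCheck
{
    static void Main()
    {
        Console.WriteLine(typeof(decimal) == typeof(Decimal)); // True: same runtime type
        Console.WriteLine(typeof(decimal).FullName);           // System.Decimal
    }
}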
 