|
The central goal behind Ada was to make it difficult to make common programming mistakes, and to have the compiler enforce as many conditions for correctness as possible. The language design succeeded in many respects, but it also made the language difficult to write code in. Learning how to appease the compiler was very frustrating.
In my case, we were beta testers for one of the first VAX/VMS Ada compilers. It was difficult to tell the difference between genuine flaws in our source and possible compiler bugs.
Software Zen: delete this;
|
|
|
|
|
Well
This C++11 trick is very powerful, really!
I prefer to do all the hard work from scratch (ok, have R #).
I believe that this threshold "semantic" is outside the domain of a programming language (commonly, it is a "system domain" concept), because it is difficult to predict the particularities of a user-defined type (conversion, comparison, integrity, serialization, etc.).
|
|
|
|
|
greydmar wrote: I believe that this threshold "semantic" is outside the domain of a programming language (commonly, it is a "system domain" concept),
Yes, that's what makes it interesting to look at.
Marc
|
|
|
|
|
Well, let's consider the following task:
Imagine the same issue when your development includes (for example) some SQL tables with some (or a lot of) semantic fields (unit of measure, complex number, currency, etc.) and you decide (of course, you are a "programmer"!! ) to perform some calculations using SQL dialects (T-SQL, stored procedures, UDFs, packages, PL/SQL)...
So, it's a kind of art to do ordinary operations (conversion, arithmetic) without "semantic types", no?
|
|
|
|
|
"which completely loses the concept that 51 is an age (in years)."
so write
int ageInYears = 51;
There are many ways to do it in C# (pass the int value in a constructor, make an implicit cast operator, ...), but IMHO the best way is sticking with a plain int, as that is exactly what "age in years" is.
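For comparison, the two approaches mentioned (a plain value vs. a wrapper type) can be sketched in Python; the `Age` class and its names here are illustrative, not anything from the original post:

```python
class Age:
    """Wrapper: carries the 'age in years' semantics explicitly."""
    def __init__(self, years: int):
        self.years = years

    def __int__(self):
        # closest Python analogue of a cast-to-int operator
        return self.years

age_in_years = 51      # the plain-int approach the post favours
wrapped = Age(51)      # the wrapper approach
assert int(wrapped) == age_in_years
```

The trade-off is exactly the one debated in this thread: the plain int costs nothing, while the wrapper makes the unit visible in the type at the price of explicit unwrapping.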
|
|
|
|
|
CHILL (CCITT HIgh Level Language - CCITT was the old name of ITU-T) probably has the best-developed strict type system of any industrial language. One of its features was the distinction between SYNMODE and NEWMODE definitions (MODE is the CHILL term for type/class): a SYNMODE defines restrictions (like value subranges) or aggregates (like arrays) of existing types, but remains (within the restrictions) fully compatible with the base mode. A NEWMODE is similar, but defines an incompatible mode. So if you make new integer modes AppleCount and OrangeCount using SYNMODE, you can add apples and oranges. If you define them using NEWMODE, the compiler won't allow you to add apples and oranges without an explicit cast.
CHILL was developed to be the ITU standard for programming telephone switches, but the language design is just as general as, say, C or Java. It never had any success in non-telephone environments (and even there it never took more than about half of the market), which is a pity: CHILL is one of the most thoroughly well-designed languages there is. But then again: the marketplace isn't known for always selecting the best designs... (I won't give C as an example of that; it could hurt some people's feelings.)
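The SYNMODE/NEWMODE distinction has a rough (and admittedly one-directional) analogue in Python's typing module, sketched below: a plain alias stays fully compatible with the base type, while `typing.NewType` creates a distinct name that a static checker such as mypy only lets you obtain through an explicit constructor call. `AppleCount` and `OrangeCount` are just the illustrative names from the post above:

```python
from typing import NewType

AppleCount = int                           # like SYNMODE: a fully compatible alias
OrangeCount = NewType("OrangeCount", int)  # like NEWMODE: a distinct type

apples: AppleCount = 3       # fine: AppleCount is just int
oranges = OrangeCount(5)     # the explicit "cast" a checker insists on;
                             # a bare `oranges: OrangeCount = 5` would be flagged

assert apples + 2 == 5
assert oranges == 5          # at runtime NewType adds no wrapper at all
```

Unlike NEWMODE, the enforcement is static-checker-only (nothing happens at runtime), and a `NewType` value still flows back into plain-int contexts, so it is only a sketch of the idea, not an equivalent.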
|
|
|
|
|
Member 7989122 wrote: CHILL (CCITT HIgh Level Language - CCITT was the old name of ITU-T)
Interesting! Your description of restrictions, aggregates, base modes, and incompatibility reminds me of what I was reading about Ada.
Marc
|
|
|
|
|
Marc has already pointed to Smalltalk, someone else pointed out C++, I'd like to add Haskell to the mix.
Getting type safety would be easy:
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
newtype Meter a = Meter a deriving (Show, Num, Eq, Ord)
newtype Foot a = Foot a deriving (Show, Num, Eq, Ord)
-- Usage (ghci> is the interactive prompt)
ghci> let m1 = Meter 5
ghci> let m2 = Meter 10
ghci> let m3 = m1 + m2
ghci> m3
Meter 15
ghci> let f1 = Foot 3
ghci> let f2 = Foot 4
ghci> let f3 = f1 + f2
ghci> f3
Foot 7
ghci> let e1 = f1 + m1 -- won't compile: couldn't match type 'Foot a' with 'Meter a'
With more work, it can be extended to add support for conversions, magnitudes (nm, mm, m, km, ...).
Indeed someone has already done all this work for us: Dimensional[^] and Dimensional using type families[^].
All available at the standard repository Hackage[^].
"If you don't fail at least 90 percent of the time, you're not aiming high enough."
Alan Kay.
|
|
|
|
|
Fascinating - I was just reading the Python example below and then looked at your Haskell example. Thanks! Also, I appreciate the links.
Marc
|
|
|
|
|
Don't know if you're familiar with Haskell at all, but it's a wonderful thing. One day I'll understand it well enough to use on a real-world project.
The example given is relatively simple. I particularly like being able to derive properties (type classes) like Ord (a bit like comparable), Num (numeric), Eq (equatable) and Show (convertible to string) automatically for many types.
Be aware that "type classes" are not classes in an OO sense at all. More like interfaces. Even more like "concepts" in C++ (when they finally make it into the language).
I think I've learned more from learning Haskell than any language since I learned Smalltalk.
"If you don't fail at least 90 percent of the time, you're not aiming high enough."
Alan Kay.
|
|
|
|
|
Rob Grainger wrote: Don't know if your familiar with Haskell at all, but its a wonderful thing.
Nope, but I've been reading up on it since your previous post!
Rob Grainger wrote: I think I've learned more from learning Haskell than any language since I learned Smalltalk.
Interestingly, I learned more about the principles of programming from this book[^] than I ever have from any actual comp-sci book. I kid you not - biology and programming have a lot in common.
Marc
|
|
|
|
|
I am not surprised. Alan Kay has stated that biology heavily influenced the invention of OOP. I think he touches on that in this interview[^]
"If you don't fail at least 90 percent of the time, you're not aiming high enough."
Alan Kay.
|
|
|
|
|
Rob Grainger wrote: Alan Kay has stated that biology heavily influenced the invention of OOP
The ancient Greeks established the paradigm of capturing world events in models (schemes of abstract concepts) without caring about empirical verification...
Biology and statistics, with their main concepts (store, compare, infer!), have heavily influenced everything from medicine to the mechanical sciences, working from a different perspective: world evidence first (facts, events, observations); model concepts later.
An interesting book about this, available only in Portuguese (Behind the Scenes of Science - Scientists' Resistance to Scientific Innovation), is very enlightening.
|
|
|
|
|
In Python you can inherit from int (something like):
class AgeInYears(int):
    unit = "years"

    def __new__(cls, age: int):
        return int.__new__(cls, age)

    def __str__(self):
        return "{} {}".format(int(self), self.unit)

    def __add__(self, value: int):
        # return the semantic type, not a plain int
        return AgeInYears(int(self) + value)

class User():
    def __init__(self, age: int):
        self.age = AgeInYears(age)

    def __str__(self):
        return "User: age: {}".format(self.age)
And then use it like:
user = User(29)
print(user)
print(user.age)
print(type(user.age))
user.age += 2
print(user.age)
print(type(user.age))
Of course, you still have to override some default methods of int (new, add).
This should cover the semantics nicely: you have an int, with unit, you can do basic int operations.
Ed
|
|
|
|
|
Well, that is very snazzy - I'll take a closer look at Python now. Thank you for taking the time to put together that example.
Marc
|
|
|
|
|
I'm sorry but this has to be amongst the most pointless discursions this forum has ever seen!
Quote: int age = 51;
A perfect encapsulation of an immutable truth ... age is just a number! What more could you possibly need?
|
|
|
|
|
Member 9082365 wrote: A perfect encapsulation of an immutable truth ... age is just a number! What more could you possibly need?
OK, what's 34 (besides "just a number") ?
Marc
|
|
|
|
|
Unless you're a thoroughgoing Pythagorean that question is no more meaningful than 'how is 34?' or 'why is 34?' It is just a number and as we all know (post pi) number doesn't exist in any real sense at all. As written it is an arbitrary typographical representation of a concept within an equally arbitrary logical system that bears no relation other than by extrapolation with what is, was or will be. It is an abstraction. A useful abstraction but an abstraction nonetheless.
And doubly so once you bring time into it! Contrary to popular opinion even if we were born simultaneously our real ages are almost certainly different! That's relativity for you!
|
|
|
|
|
Actually, most people would consider that numbers are as real as most human concepts, if not more so due to their universal applicability.
By your logic trees don't exist because "tree" is purely a label we've conjured up to describe the commonality of all trees. Colours do not exist because "colour" is a word we use to describe the colours of all objects. So if anything exists then numbers are as real as anything else - just because they have no physical form does not render them nonexistent. If you reject naming things, then you reject language and symbolic systems as a means of communicating. Good luck with that.
I'd suggest Wittgenstein's Philosophical Investigations for a more thorough consideration of this area.
More interesting questions he tackles are of the form "what is a game" - now that really is hard to tie down.
"If you don't fail at least 90 percent of the time, you're not aiming high enough."
Alan Kay.
|
|
|
|
|
But trees don't exist and 'tree' is indeed a label we've conjured up not to describe the commonality of certain plant species but to generalise sufficiently to deal with the fact that there is very little commonality in reality but life is way too short to demand strict accuracy in language. I'm surprised that you find anything to the contrary in Wittgenstein (although I appreciate that interpreting Wittgenstein depends very much on what mood he was in at the time of writing!)
And where do I reject symbolic systems and naming? I simply accept Kant's logic that we never do and never can know what is really there. In other words the fact that you can name something or fit it perfectly into a logical system does not make it an ontological necessity. The ability to speak of 'number' or 'trees' and use those concepts in the most complicated yet logical manner does not mean that number or tree actually exist. Even the most rigorous of ontological 'proofs' will always fall prey to the fact that they are a product of the logical system in which they operate and therefore require an unjustifiable acceptance of that system.
Of course, for the conduct of everyday life I bandy terms like tree and 34 about without the slightest embarrassment. It keeps one out of mental health institutions for starters! But that doesn't require commitment to the notion that they connote anything real. If Hume could play billiards and at the same time believe that there was no such thing as cause and effect, a little duality of thinking can't do any harm!
|
|
|
|
|
|
Member 10199043 wrote: PROLOG
Hmmm.
Prolog is an untyped language. Attempts to introduce types date back to the 1980s,[42][43] and as of 2008 there are still attempts to extend Prolog with types.[44] Type information is useful not only for type safety but also for reasoning about Prolog programs.
Marc
|
|
|
|
|
You could use C/C++
typedef int AgeInYears;
AgeInYears myAge = 51;
No need to define operators etc.
You could even use Ada
type AgeInYears is new Integer;
type AgeInMonths is new Integer;
myAge: AgeInYears;
yourAge: AgeInMonths;
Ada won't let you type
myAge := yourAge / 12;
unless you cast it.
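The gap between the two options can be sketched in Python (since a C typedef behaves just like a bare alias): the alias accepts any int silently, while distinct wrapper types force the explicit conversion Ada demands. The class and function names here are illustrative only:

```python
AgeAlias = int                 # like the C typedef: an alias, no checking at all

class AgeInYears(int): pass    # distinct Ada-style types
class AgeInMonths(int): pass

def to_years(months: AgeInMonths) -> AgeInYears:
    # the explicit conversion Ada would insist on
    return AgeInYears(int(months) // 12)

my_age = AgeAlias(51)          # any int slips through the alias unchecked
your_age = AgeInMonths(612)
assert to_years(your_age) == AgeInYears(51)
```

Python only enforces this at runtime through the conversion function, of course; Ada's compiler rejects the mixed assignment outright.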
|
|
|
|
|
An interesting language for doing this type of thing is Julia (http://julialang.org/[^]). In Julia you can A) Use typedefs to indicate that AgeInYears is a typedef for Int or B) Make a new type AgeInYears that is a subtype of Integer and implement a converter function (these are a standard Julia concept) from Int to AgeInYears so that age::AgeInYears = 5 will resolve correctly.
|
|
|
|
|
aschmahmann wrote: An interesting language for doing this type of thing is Julia (http://julialang.org/[^]).
Reading the docs, that looks very very interesting! Thank you for pointing out Julia!
Marc
|
|
|
|