Damn R.I.P.
The less you need, the more you have.
Even a blind squirrel gets a nut...occasionally.
JaxCoder.com
I never heard of him personally, but then again I am not from England... or 1968...
But that doesn't mean he wasn't famous or important.
R.I.P.
I think I'm having Double Vision over that Cold as Ice comment.
Don't say anything to make me Hot Blooded.
I’ve given up trying to be calm. However, I am open to feeling slightly less agitated.
Hey no disrespect, my dad when he was alive probably listened to him...
Okay, like I bet some people here might never have heard of Bob Saget, and he just died...
I was just saying that I didn't know who he was, but I wish he hadn't died...
That was a "whoosh."
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
Fu. I still love Foreigners... as do most of the QA peons, it's always Urgent.
GCS d--(d-) s-/++ a C++++ U+++ P- L+@ E-- W++ N+ o+ K- w+++ O? M-- V? PS+ PE- Y+ PGP t+ 5? X R+++ tv-- b+(+++) DI+++ D++ G e++ h--- r+++ y+++* Weapons extension: ma- k++ F+2 X
How do you categorize languages? A recent article on a certain mailing list has debunked "compiled" vs "interpreted". I have long stood by my 3-mutually-exclusive-category system:
Type 1) "Hey, look what I can do in only 7 lines!" (Python, C#, most new languages, etc.)
Type 2) "Hey, look what I can do in only 7 characters!" (Perl, awk, golf-oriented gibberish)
Type 3) The good ones.
A lack of planning on your part does not constitute an emergency on mine.
Memtha wrote: debunked "compiled" vs "interpreted"
I don't think that "debunked" is quite the best term, but yes: there's nothing about a language itself that means it must or must not fit into only one of those buckets. For the most part, any language could be in either or both; it's just a matter of implementation difficulty.
"Turing-complete" (vs not) and "general purpose" vs "domain specific" are decent attributes though. As well as how rich the set of supported datatypes is.
I prefer "strongly typed" and "cr@p".
But that's just an opinion based on seeing the maintenance problems you can avoid with strongly typed languages.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
Do latently-typed or duck-typed languages count as "strongly typed"? This is always a point of confusion for me with strong-vs-weak distinctions: there is no line in the sand. Except for the outliers (rigorously typed vs. untyped), you can make arguments for the majority of type systems being both strong and weak in different regards. I think TypeScript is a great example of this, since it has a gradual, structural type system.
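TypeScript illustrates both halves of that. A minimal sketch (the names here are mine, not from the thread): the first call is accepted purely on shape, i.e. compile-time duck typing, while `any` shows the gradual side, where a value opts out of checking entirely.

```typescript
// Structural typing: compatibility is decided by shape, not by declared name.
interface Point { x: number; y: number; }

function norm(p: Point): number {
  return Math.sqrt(p.x * p.x + p.y * p.y);
}

// This literal never mentions Point, but it has the right shape,
// so the call type-checks: duck typing, verified at compile time.
const v = { x: 3, y: 4 };
console.log(norm(v)); // 5

// Gradual typing: `any` opts the value out of checking entirely.
const loose: any = { x: "three", y: 4 };
// The same call still compiles, but the guarantee is gone at runtime.
console.log(norm(loose)); // NaN
```

So whether TypeScript is "strong" or "weak" really does depend on which corner of the type system you are standing in.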
In my opinion: No.
Typing should be explicitly visible in the program text, and clearly identified as a type.
Polymorphism, through subclasses, is OK. You can force run-time type errors through casting, but at least the cast is explicit.
As pointed out: No language is absolutely bound to being interpreted or compiled, but strict typing leans quite strongly towards a complete parsing of the source code before execution starts. When you do that, why not go all the way and generate the code? So strong typing leans towards compilation rather than interpretation, although not by definition.
That's really interesting that you consider readability an attribute of strong typing. Not that it's wrong in any way; it's just that most criteria I've seen people come up with have been more function-oriented. Since your criteria have more to do with form, what's your opinion on inferred type systems like those of F# and Haskell?
I have not spent much time with pure functional languages at all - not enough to have any qualified opinion. For one project, 30 years ago, we evaluated Erlang, but rejected it for our use. My main experience with Lisp is from emacs. F# and Haskell I know at "Wikipedia level", not from any practical use.
My professional upbringing is from the days when software designers still did data modelling. I know it is not comme il faut to state, in 2022, that Entity Relationship modelling has something to be said in favor of it, but I have seen ER being used very successfully in a number of projects to really get a grasp on the problem domain. And, it is an excellent tool for communicating with a customer: They will easily understand the concepts so that they can participate in development of the ER model, and when that is in place, they can successfully teach you the operations to be done on the data. (And sometimes they also realize that the current state of data collections and handling procedures is a mess ...)
So I have always been on the data model side, rather than the function oriented one. I guess that is an essential reason why I consider strong typing essential.
Agreed that strongly typed is very important. Example: I semi-recently had to debug a C# method that had not been touched in 3+ years. The return type was expando. There were 6 returns, and one of them was missing a property. It took 3 years to hit the niche case in prod where that return was reached. If C# had been used correctly - strongly typed - it would have been a compiler error 3 years ago and no problem in prod. It took 2 days to find, because the line that attempted to access the missing property was in a completely different part of the app: the function that called the expando method converted the result to JSON and stuck it in the db, for later retrieval by the method that would eventually fail on a monstrous line that should have been 20 separate lines, and the error was an unhelpful null reference that could have been any property. Which is also why I try to avoid fluffy shortcuts like anonymous methods and types.
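For what it's worth, the same failure mode is easy to reproduce in TypeScript (a hypothetical sketch, not the actual C# code): `any` plays the role of expando, and a declared return type is exactly what would have turned the omission into a compile error.

```typescript
// With `any`, one branch can silently omit a property.
function loadLoose(premium: boolean): any {
  if (premium) return { id: 1, name: "widget" };
  return { id: 2 }; // `name` missing; `any` hides the mistake
}

// With a declared return type, the same omission refuses to compile.
interface Item { id: number; name: string; }
function loadStrict(premium: boolean): Item {
  if (premium) return { id: 1, name: "widget" };
  return { id: 2, name: "basic" }; // omitting `name` here would be a compile error
}

// The loose version only blows up when the bad branch is finally hit,
// far from the line that actually caused the bug:
const record = loadLoose(false);
try {
  record.name.toUpperCase();
} catch (e) {
  console.log("runtime failure:", (e as Error).name); // TypeError
}
```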
That is what I call a submarine error.
You are sailing along nicely, and out of nowhere it pops up and blows you out of the water.
“Where did that (null) come from?”
Nice catch.
This is where a constructor that requires all values might protect you.
Or else have a validate method, so that you
return obj.validate();
I remember one particularly nasty method, with only 4 code paths.
I defined a handful of booleans and left them uninitialized. In each code branch, I initialized some of the booleans. At the bottom of the method, I added a check like
bool cya = b1 || b2 || b3; // etc.
The compiler would catch me if I tried to use an uninitialized variable.
After testing, you can remove the extra checks.
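TypeScript's definite-assignment analysis (under --strict) supports the same trick. A small sketch of the idea, with invented names:

```typescript
// Under --strict, TypeScript refuses to read a variable on any path
// where it might still be unassigned, so the OR at the bottom acts
// as a compile-time cross-check on every branch.
function route(path: 1 | 2): boolean {
  let b1: boolean;
  let b2: boolean;
  if (path === 1) {
    b1 = true;
    b2 = false;
  } else {
    b1 = false;
    b2 = true;
  }
  // If any branch above had forgotten b2, this line would not compile:
  // "Variable 'b2' is used before being assigned."
  return b1 || b2;
}
console.log(route(1), route(2)); // true true
```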
Quote: Typing should be explicitly visible in the program text, and clearly identified as a type.
I agree that it should be, but I do not believe this is a requirement for a language to be strongly typed. Specifically, C++ auto and C# var break this rule, yet both are strongly typed, because misusing an object is still likely to result in a compiler error.
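A quick TypeScript illustration of the same point: inference hides the type name from the program text, but it is local inference, not weak typing.

```typescript
// `let` with an initializer is inferred, much like C# `var` or C++ `auto`.
let count = 42;        // inferred as number, not `any`
let label = "widgets"; // inferred as string

// The inferred types are enforced exactly as if they had been written out:
// count = "many";     // compile error: string is not assignable to number

console.log(label.length + count); // 49
```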
var is still strongly typed, it is simply syntactic sugar so we can write and read the code more fluently. For ALL intents and purposes it represents a strong type reference that must be pre-compiled before execution.
For some uses of var: yes, it passes as syntactic sugar.
Using var with LINQ, for example, can be extremely useful and flexible, but it can hardly be called 'strongly typed'. More like "you just have to take what you get".
Exactly. Each use of var represents a strong type, though which type is not explicitly visible in the program text. C# is strongly typed despite having var. That's what I was saying.
With var, we can write the code more easily, but reading it is, in some cases, open to interpretation. It should not be necessary to look up a function's return type just to know what type goes into a variable declared as var.
I'd say C/C++ are weakly typed because you can always convert one type into some random other type. Java/C# only allow limited conversions, between related types.
IMO, the difference in that respect between C++ and C# is part of the broader difference that C#'s runtime second-guesses your every instruction while C++ takes your word for it. In both cases, the cast makes it past the syntax check with the same meaning: "trust me, these are the same". The exception is that C#/Java will fail compilation if the known static type cannot possibly also be an instance of the casted type; that check is not applicable to C++ because, with multiple inheritance, an object could always exist that inherits from both of them.
C#'s second-guessing compares the actual type, via reflection, to the casted type, whereas most if not all C/C++ programs at runtime keep no type information and have no support for reflection; but that's a runtime distinction, not part of the language, for the same reason that compiled vs. interpreted is not part of the language. As far as whether the language itself is strongly typed, they are the same. A C++ compiler and runtime could be invented that does the same as C# without changes to the language itself, or vice versa. I believe such a C++ compiler could even be standard-compliant, with enough effort.
The same applies to accessing an array with an invalid index: C# throws an error right away, because it first checks that the index you gave it is valid, whereas C++ will trust you and perform the operation, likely resulting in a later error (because the CPU second-guesses you at a more security-oriented level, e.g., DEP).
Which eliminates most uses of the C# var keyword.
and then there's APL
let max = list[0];
for (let i = 1; i < list.length; i++) {
  max = Math.max(list[i], max);
}
or the APL version
⌈/
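For comparison, the APL max-reduction maps directly onto a fold; a TypeScript sketch over an assumed nonempty list:

```typescript
// ⌈/ is a max-reduction; reduce expresses the same fold directly.
const list = [3, 1, 4, 1, 5, 9, 2, 6];
const max = list.reduce((a, b) => Math.max(a, b));
console.log(max); // 9
```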
The less you need, the more you have.
Even a blind squirrel gets a nut...occasionally.
JaxCoder.com