|
Quote: I fondly remember discovering that the byte type is signed (why, really why ???)
I know the answer to this one: Java specifies the representation of signed integers (two's complement), so bit-shifting them (and other bit operations) is well defined, unlike in C++, which historically did not pin down the integer representation.
In C++, for example, you cannot write this (to check whether variable 'a' overflowed):
if (a + 1 < a) { ... }
and expect it to work on all conforming compilers, because signed overflow is undefined behaviour there (and, before C++20, the representation wasn't even guaranteed to be two's complement). In Java the representation is guaranteed to be two's complement, and overflow wraps around.
In C++, the result of:
a = b >> 3;
is implementation-defined if 'b' is signed and negative. In Java it's defined: '>>' is an arithmetic (sign-extending) right shift of the two's-complement value, and '>>>' is the logical one.
Sure, those are sh*tty reasons to leave out unsigned types, but they're still reasons. The only place you're likely to run into real trouble is unsigned 64-bit arithmetic: values that overflow the signed 64-bit type have no natural representation.
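To illustrate the difference the post is describing, here is a minimal sketch (the variable names are mine, not from the post) showing that both wrap-around overflow and right shifts of negative values are fully defined in Java:

```java
public class OverflowDemo {
    public static void main(String[] args) {
        // In Java, int overflow is defined: it wraps around (two's complement),
        // so the "a + 1 < a" overflow check is legal and reliable.
        int a = Integer.MAX_VALUE;
        System.out.println(a + 1 < a);   // true: MAX_VALUE + 1 wraps to MIN_VALUE

        // Right shift of a negative value is also defined:
        // '>>' is arithmetic (sign-extending), '>>>' is logical (zero-filling).
        int b = -16;
        System.out.println(b >> 3);      // -2 (sign bit preserved)
        System.out.println(b >>> 3);     // 536870910 (zero-filled)
    }
}
```

In C++ the first check would be optimized away (signed overflow is undefined), and before C++20 the shift result was implementation-defined.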
|
Thanks for the explanation.
Member 13301679 wrote: Sure, those are sh*tty reasons to leave out unsigned types, but they're still reasons. The only place you're likely to run into real trouble is unsigned 64-bit arithmetic: values that overflow the signed 64-bit type have no natural representation.
I am OK with the design decisions made; the issue there was conflicting expectations. The Java language designers chose to treat bytes as signed integers, while in the embedded world a byte is simply a string of 8 bits plus an endianness convention.
In my case, the problem wasn't math but reading data from a serial port. I was getting "garbage" until my colleagues told me that in Java bytes are signed.
|
Quote: I am OK with the design decisions made, the issue there was conflicting expectations.
Yes, that is the problem: if you're going to have a type called 'byte' in your language, it had better represent a pattern of bits (whatever the size of the byte itself).
As I get older, I get more and more irritated that languages do not keep to the Law of Least Astonishment: they have rules, then exceptions to those rules, then overloaded meanings for keywords, then overloaded meanings for characters ...
I keep meaning to design a language that is easier to read than to write: most modern languages make reading very difficult unless you have an unreasonable amount of context memorised. C++ is a great example of producing "easy to write, but impossible to read" compound statements: understanding most lines of C++ requires knowing complex rules, some part of the standard, and some part of the program.
In particular, I'd like to shoot whoever decided that pass-by-reference should be indicated only in the declaration, and "helpfully" filled in by the compiler at the point of call. It means that no function or method call can be read in isolation, even if you just want to skim past it while looking for something else.
|
NelsonGoncalves wrote: I fondly remember discovering that the byte type is signed (why, really why ???) Since my student days (long ago!) I have been fighting this concept that "inside a computer, everything is a number". No, it isn't! Data are bit patterns: bit patterns, not "zeroes and ones". A type defines which bit patterns are used (on a given machine) to represent various values, such as 'x' or 'y'. They are letters, da**it, not any sort of 'numbers'. Similarly, colors are colors. Dog breeds are dog breeds. Weekdays are weekdays. Seasons are seasons.
One problem is that computer professionals are among the fiercest defenders of this 'number' concept, arguing that 'A' really is 65 (or, as most would prefer, 0x41, but still a 'number'). They think it perfectly natural that dividing 'o' by two gives not 'c' (as you might think from the graphical image) but '7', and that this is a perfectly valid operation, because 'o' is really not a letter but the numeric value 111, and '7' is really 55.
Even programmers who have worked with objects, and abstractions, and abstractions of abstractions, are still unable to see a bit pattern as directly representing something non-numerical. They cannot relate to the bit pattern as a representation of abstract information of arbitrary type, but must go via a numeric interpretation.
So we get this idea that an uninterpreted octet (the ISO term, partially accepted even outside ISO), a.k.a. an 8-bit 'byte', in spite of its uninterpretedness, does have a numeric interpretation: it is signed.
I shake my head: how much has the IT world progressed, in the last three to four decades (i.e. since high-level languages took over), towards being a 'scientific discipline'? When we can't even manage abstractions at the octet level, but insist on a numeric interpretation where there is none, I think we are quite remote from a science on a solid academic foundation.
The bad thing is that we are not making very fast progress. 40+ years ago, in Pascal, you could declare 'type season = (winter, spring, summer, fall)', and the values were not numeric: you could not divide summer by two to get spring (the way you can in C and lots of its derivatives). There is no strong movement among software developers for a proper enumeration (discrete value) concept: we have written so much software that depends on spring+2 being fall. It would create havoc if we abandoned the idea of a season as an integer value.
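For what it's worth, Java's own enums did eventually adopt the Pascal view: they are a distinct type, and any numeric relationship has to be spelled out explicitly. A minimal sketch (the `advance` helper is mine, purely illustrative):

```java
public class SeasonDemo {
    // As in Pascal, a Java enum is a distinct type, not an integer:
    // expressions like SUMMER / 2 or SPRING + 2 simply will not compile.
    enum Season { WINTER, SPRING, SUMMER, FALL }

    // Any "spring + 2 = fall" arithmetic must be made explicit via ordinal().
    static Season advance(Season s, int steps) {
        Season[] all = Season.values();
        return all[(s.ordinal() + steps) % all.length];
    }

    public static void main(String[] args) {
        System.out.println(advance(Season.SPRING, 2)); // FALL
    }
}
```

Whether forcing the conversion through `ordinal()` is a safeguard or a nuisance is, of course, exactly the disagreement in this thread.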
The entire software field really should run a complete set of regression tests every month; if any regression is detected (say, from the Pascal enumeration concept to that of the C-derived languages), action should be taken immediately to remedy it and bring the profession back to a proper state!
|
trønderen wrote: We have written so much software that depends on spring+2 being fall. It would create havoc if we abandoned the idea of a season as an integer value.
I actually think it is a good thing that we can do 'fall = spring + 2'. First, because it makes intuitive sense. Second, because although computers hold bits, you need to do operations on those bits, otherwise computers are not useful; and (some sort of) math seems to be a good enough common denominator that nearly everybody can rely and build on.
Personally I tend to view software development more as art, and less as science.
|
It is your view because you are a programmer; ask Ops.
As mentioned above: versioning. Name me another piece of software with so many different version numbers referring to the same thing.
Versioning 2.0: Have you found your version number? Then here is another error message for you: the class cannot be loaded because it is version 52, but your JVM only supports up to version 51.
Well, disk space was cheap THEN, so maybe it was not an issue that a fourth-digit upgrade installed the full thing in a new directory. Or at least it was not a problem until someone set JAVA_HOME to one of them, and yum removed the other one...
The usual Java software is customizable to an amazing degree via command-line switches and config files, but very slow to start up, because it has to read all those small XML files.
Config 2.0: the main selling point of one commercial Tomcat clone is that you don't have to hunt through tens of config files to set the thing up.
|
Quote: I don't understand the snarky comments one sees about Java.
Because language-hating on $OLD_AND_STABLE_LANGUAGE is currently fashionable. FCOL, 3 years ago I was reading commenters on reddit asking "Why would anyone in their right mind start a new project in C when Rust is available?".
In that time, however, Rust itself has changed enough to be slightly incompatible with its former self.
My observation is that, feature-wise, all languages converge towards Lisp, while syntax-wise, all languages converge towards C++ (which itself is, syntax-wise, already past the madness event-horizon and still accelerating).
Java (and C#) were, when initially released, fairly easy to read for anyone coming from almost any existing language (C, PHP, Perl, etc.). Each new feature added to the syntax instead of replacing existing grammar (so old programs kept compiling), leading to the current situation where a symbol may mean almost anything.
I look forward to a future where source code looks more like BF than like Pascal /s ...
|
i personally dislike javans because they look down on other people. that is something you haven't noticed because you were on "the winning" team, java and oop. haven't you heard jokes about perl, c and javascript in the last 20 years?
it's a personal feeling you get that you cannot describe to others. it's subjective. let me try to shed some light on it by using this totally unrelated article:
Linux Mint users are surprisingly irresponsible regarding updates
who is this guy telling other people how to use their computers?
now imagine this fairy tale rewritten to fit java propaganda from 10 years ago (now javans are starting to taste the truth that perl and javascript were vastly more powerful than java since inception, and the only thing java's got is the story of oop): everyone is irresponsible for using languages like c, tcl, javascript... any language other than java is irresponsible. who needs all those obsolete languages, and why are they not facing the inevitable: that there should, and will, be only java and oop.
all this without any real reason for javans to look down upon others.
java was never a better c++ as it was advertised. it may be better for some people, but not for others.
"Java owes much of its initial popularity to the most intense marketing campaign ever mounted for a programming language." - Bjarne Stroustrup
and in that campaign, advocates of java visited corporations telling people bad things about c++, and what not about c and other languages.
every now and then an article pops up, like "why does the c language refuse to die?!". why is it so difficult to understand that some people want revenge? even if we have to replace java with something more bizarre, like a purely functional, statically typed dictatorship of a language that looks down on java and everybody else at the same time. that's so lame.
here is an article from a java oop programmer: Execution in the Kingdom of Nouns
here is a lisp programmer's take: Why Lisp? | Hacker News
notice the: "A Lisp programmer who notices a common pattern in their code can write a macro to give themselves a source-level abstraction of that pattern. A Java programmer who notices the same pattern has to convince Sun that this particular abstraction is worth adding to the language"
the same goes for c: "These languages [Java] solve problems by adding more language features. A language like C solves problems by writing more C code."
all in all, i dislike java because i was walking along my way when i heard shouting from the other side of the street, 'hey, you are no good. i'm better', for which i could not find a reasonable explanation.
i disliked having to answer the job interview question "what are the benefits of oop" for 15 years straight, and the bs article posts by people who tried to stay relevant and prove the benefits of oop by showing a small and elegant oop example against a grotesque, usually procedural, code sample that resembles someone learning basic with line numbers. but it was never the other way around. they never show you the flip side.
Alan Kay (1997) “I invented the term object-oriented, and I can tell you I did not have C++ in mind.” and “Java and C++ make you think that the new ideas are like the old ones. Java is the most distressing thing to happen to computing since MS-DOS.”
sadly i could go on forever, which would be a waste of time. i have a particular bookmarked folder for every anti-class-based-oop post i have stumbled upon written by people who, in my opinion, have competent or expert programming skills, including in oop.
|
Quote: i have a particular bookmarked folder for every anti-class-based-oop post i have stumbled upon written by people who, in my opinion, have competent or expert programming skills, including in oop.
Maybe you should share that
|
From the C# side, it feels clunky and verbose; despite having added a few modern features like lambdas, it overall feels like I'm stuck in C# 1.x.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
Going to Java after C# is really tedious, especially for multi-threaded applications.
|
I tried writing an accounting program in Java, but gave up when I got tired of trying to outwit UI classes that didn't do the job, and of having to write factory classes to construct other classes, doing the simplest things in the most complicated way.
|
Were you using Swing or JavaFX?
Get me coffee and no one gets hurt!
|
It was a very long time ago, back when Swing was still being developed, so they may have fixed some of the formatting issues since. I wanted an HTML-like table with wrapped text in the cells. I think I succeeded, but the low-level code I had to write to wrap text was not pretty. I still have a problem with all of those factory classes.
|
Well, today the task would probably be a lot simpler: create a TableView with a TextArea in the cells. That should do it, though you would need JavaFX. I have created a number of TableViews with controls like CheckBoxes and Rectangles in the cells.
Get me coffee and no one gets hurt!
|
That's nice to hear! Just about 20 years too late for me, but still nice to hear, in case I ever use Java again.
|
As an embedded engineer, I've only worked on one project that had enough memory to support Java. The only snark I have for Java is the lack of unsigned integers! I will say that my favorite editor is Java-based (jEdit).
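For what it's worth, since Java 8 the boxed primitive classes offer unsigned reinterpretations of the signed bit patterns, which softens (but doesn't remove) the pain. A minimal sketch:

```java
public class UnsignedHelpers {
    public static void main(String[] args) {
        // Java has no unsigned types, but Java 8 added methods that
        // reinterpret the existing signed bit patterns as unsigned values.
        byte b = (byte) 0xF0;
        System.out.println(Byte.toUnsignedInt(b));        // 240

        int i = -1;                                       // bit pattern 0xFFFFFFFF
        System.out.println(Integer.toUnsignedString(i));  // "4294967295"
        System.out.println(Integer.toUnsignedLong(i));    // 4294967295
        System.out.println(Integer.divideUnsigned(i, 2)); // 2147483647
    }
}
```

The storage is still a signed type; only the interpretation at the call site changes, which is exactly the kind of ceremony an embedded programmer would rather not need.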
|
It depends on what domain you are working in. I wrote a DAB (Digital Audio Broadcasting) decoder in C++, and simplified versions, just as a programming exercise, in Ada and in Java.
That type of decoder requires extensive interaction with libraries written in C (to name a few: device handling, FFT transforms, and AAC decoding). In my personal opinion, binding Java structures to C libraries is a crime.
Btw, the Ada binding is simpler, since I was using the Gnat compiler system, but even then ...
Java is just a language. It is not my choice, but for many applications it seems more or less OK.
Personally I do not like the GUI handling, but that is probably a matter of taste.
The misery of binding to non-Java (read: C) libraries is such that I would not recommend Java for applications that depend on that kind of library.
(I dislike all kinds of so-called integrated environments such as IntelliJ or whatever; right now I am writing some stuff where I more or less have to use VS as the development environment. It is probably my ignorance, but I absolutely dislike the destruction of the formatting I use in my coding, and the error messages are a horror. For me, the command-line tools such as vim, qmake, make and the GCC suite, with gdb as debugger under Linux, are the ideal development tools.)
|
What's wrong with Java is that it's 30 years old or something, and its architecture is a prisoner of what was available at the time: it no longer makes any sense now that richer and more capable systems exist. And it doesn't have properties, which is ridiculous.
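For readers who haven't used C#: a "property" lets field-like syntax invoke getter/setter logic. A minimal sketch of the boilerplate Java requires instead (the `Counter` class is hypothetical, purely illustrative):

```java
// What C# writes as "public int Count { get; set; }" takes this much Java:
public class Counter {
    private int count;

    // Conventional accessor pair standing in for a property.
    public int getCount() { return count; }
    public void setCount(int count) { this.count = count; }

    public static void main(String[] args) {
        Counter c = new Counter();
        c.setCount(42);                   // C#: c.Count = 42;
        System.out.println(c.getCount()); // C#: Console.WriteLine(c.Count);
    }
}
```

IDEs generate these pairs automatically, which is arguably an admission that the language is missing the feature.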
|
Haters are going to Hate! No matter what.
The VB guys have been living with B.S. for years.
Wear a mask! Wash your hands too. The life you save might be your own.
|
Oracle.
They're really bad at making software ecosystems that are pleasant to work with.
Nothing wrong with the language on its own, though.
|
I've done a lot of work in Java, and I can say that I'm not a huge fan of it. If it works well for you and you like it, then there is nothing wrong with that.
I can't even place exactly why I don't like Java. I like C#, VB.NET (nostalgia), C, Rust, JavaScript. It's funny, because at one time I didn't like C and bashed on it because C++ was "better"; getting older now, I actually enjoy C more than C++, and literally haven't touched C++ in a decade at least.
There's no point in bashing languages; if they become unpopular enough, they go away on their own or adapt.
|
Yes, 'journalism' is in quotes, because that's being very generous - and I just couldn't think of anything else to call it. Suggestions welcome!
A couple of days back, ZDNet published an article entitled: "What is Agile software development? Everything you need to know about delivering better code, faster". So what's my gripe?
* Agile must have been around for nearly 20 years by now, so this isn't news.
* The article is a one minute read, so I don't think it covers everything we need to know.
* It's just regurgitating stuff that's been said a million times already.
* Without making any effort to show proof, it repeats the Agile marketing mantra: "better code, faster". See sub-rant below!
I'm guessing what happened is:
ZDNet Editor: "Guys we don't have any stuff for our site today".
Mark The Trainee: "I've heard of this thing called 'Agile'. Should I do something on that?".
ZDNet Editor: "Ermmmmmm, anyone else got anything?".
Everyone else: Carries on playing PacMan [They are 20 years behind!]
ZDNet Editor: "OK Mark, that will have to do!"
Sub-rant:
Yes, I want proof that we get "better code, faster". Did anyone ever actually put this claim to the test? I'd like to see two teams develop exactly the same application: one using Agile, the other using a non-Agile methodology of their choosing, e.g. Waterfall. Record the man-hours taken, and have the results independently code-reviewed and tested. Yes, I'm a sceptic. About pretty much everything. Is that a bad thing?
I hate to give the article any more hits than it deserves (which is zero), but here's the link:
What is Agile software development? Everything you need to know about delivering better code, faster | ZDNet[^]
|
I agree.
So many writers/bloggers just don't understand the subject, and continually repeat wrong information and hype; it becomes a sad, sad cycle.
On the other hand... v1 is never good, but by v3 things start to get good, because you have user feedback. So getting to v3 faster is a good thing.
And Agile is all about getting that user feedback sooner rather than later.
But Agile will not magically make v1 good.
Of course, you may also irritate your early adopters by constantly changing the app on them, so maintaining a dialogue with them is important.
|