|
Nagy's doing okay. He's still a happy worker bee.
|
|
|
|
|
I'm fine, just busy with things.
watches another deadline flying past
veni bibi saltavi (I came, I drank, I danced)
|
|
|
|
|
Hello people, my name is Emmanuel Katto and I'm from Uganda. I want to learn Java. Can anyone help me learn the Java language?
|
|
|
|
|
The official documentation is probably a good place to start:
The Java™ Tutorials[^]
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
|
Here is an online tutorial that allows you to compile examples in the browser:
Java Tutorial[^]
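If you want to see what a complete Java program looks like before diving in, here is a minimal sketch (the class name HelloWorld is just a placeholder, not something from the tutorial):

    // HelloWorld.java - a minimal first Java program
    public class HelloWorld {
        public static void main(String[] args) {
            // Print a greeting to standard output
            System.out.println("Hello, Java!");
        }
    }

Save it as HelloWorld.java, compile with javac HelloWorld.java, run with java HelloWorld, or just paste it into the browser tool above.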
|
|
|
|
|
Thank you so much for providing the guide.
|
|
|
|
|
I learnt Java from here.
https://see.stanford.edu/Course/CS106A
Though this series is dated, it is indeed a great way to learn programming in general. There may be a few syntax differences between it and today's Java, but the concepts sink in for sure.
Many of my juniors have also thoroughly enjoyed the teaching style there.
|
|
|
|
|
Amarnath S wrote: Though this series is dated
So is Java
|
|
|
|
|
What I meant was that Java has had so many deprecations since then. Though there may be many syntax changes, the concepts do remain the same.
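To illustrate (my own made-up sketch, not something from that course): collection handling is one place where the syntax has moved on since those lectures, but the underlying concept is the same:

    import java.util.ArrayList;
    import java.util.List;

    public class ThenAndNow {
        public static void main(String[] args) {
            // Old style (pre-Java 5): raw types and explicit casts
            List oldNames = new ArrayList();
            oldNames.add("Ada");
            for (int i = 0; i < oldNames.size(); i++) {
                System.out.println((String) oldNames.get(i));
            }

            // Modern style: generics and the enhanced for loop
            List<String> newNames = new ArrayList<>();
            newNames.add("Ada");
            for (String name : newNames) {
                System.out.println(name);
            }
        }
    }

Same loop, same concept; only the notation has changed.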
|
|
|
|
|
|
Yes, Java is getting long in the tooth, but I nevertheless enjoyed learning it. AND: it caused me to gravitate toward learning to program Android apps (through the Kotlin language), and that was even more fun! Now Lenovo is going to bring out desktops that run on Android! Will the fun never stop?
Ok, I have had my coffee, so you can all come out now!
|
|
|
|
|
Has no idea about this reduced. (8)
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
Clueless ?
This = clue
reduced = less
Has no idea = Clueless
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
|
|
|
|
YAUM! Care to edit in the explanation?
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
Should that be
this == clue
?
😊
Tricky!
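In Java terms the distinction is real, not just a typographical nitpick: = assigns, == compares. A throwaway sketch (the variable names are mine):

    public class EqualsDemo {
        public static void main(String[] args) {
            boolean clue = true;

            // '=' is assignment: copies the value of clue into hasIdea
            boolean hasIdea = clue;

            // '==' is comparison: evaluates to true because the values match
            System.out.println(hasIdea == clue);   // prints true
        }
    }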
|
|
|
|
|
I'm currently moving all of my professional and hobby project development over to the ARM Cortex family of platforms.
ARM Cortex M7 > ESP32
Microsoft is doing similar with their operating system.
Apple already has, with the M1 and M2, AFAIK.
ARM > Intel
There's no getting around that x86 is showing its age architecturally. Even discounting all the ancient backward compatibility, like "real mode", it's getting awkward.
I read this thread with some interest. Aside from some disagreements in the comments, overall it was very interesting, if taken with a grain of salt.
the_end_for_isa_x86[^]
One nice advantage for me is that the ARM Cortex architecture is largely continuous from their little M0 real-time chips all the way up to their multicore A line.
That means I can create code that will perform well across little devices and PCs.
This also has to be a huge win for developers of phone and tablet applications, since their work is now more transferable to future PCs.
The fact that ARM doesn't manufacture is also a huge win. They leave fabrication to outfits like NXP. ARM just designs chips. I read somewhere that their time to market for a new offering is about half that of Intel's.
Start moving your stock.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
My wife, who is highly technical, bought an Acer laptop after her HP constantly ran its fan flat out. The Acer did the same thing right out of the box. I am typing this on an ASUS Q502 i5 with Windows 7; this is my main dev ride, and it only runs its fan when I make it do so by accident. Long story short, my wife ditched both aforementioned machines for an M1 Mac with NO FAN - sorcery, I say. She recently traded that one in for an M2. After having been saddled with CPU coolers since the 386SX (with MMX), it's crazy to see a machine so quick and responsive with no rotational cooling going on.
|
|
|
|
|
I have been fighting x86 since 1992-93, but lost the first battle: The IT department of the Tech. College where I was teaching had two alternatives for a set of 30 new machines, to be used for Unix software and also one of my courses, Computer Architecture (with assembler coding). The choice was made in a democratic manner: The educational staff of the department, including me, came out in favor of an M68030 based system. The department head was in favor of the x86 based solution. When he saw that the majority went against his preference, he announced: I can't be the head of a department that works against me. I quit! Find another department head! So the next day, we repeated the democratic voting, and this time the majority was in favor of the department head's preference, and he didn't quit. I had to teach Introduction to Computer Architecture on the messiest architecture around.
(Btw, Denmark got into the EU by a similar democratic vote. They had a referendum, giving a 'no' to joining the EU. The Danish authorities told the people that the answer was wrong, and gave the people another chance to give the right answer. The second time, The People understood what was expected of them, and Denmark joined the EU. Hooray for democratic processes! At least as long as they give The Right Answer.)
M68K didn't survive in the big markets. If it had, the RISC wave would have been mostly superfluous. So let's cross our fingers that the ARM architecture will be strong enough to fight down the x86/x64.
Although ARM started as a 'clean' RISC, it certainly isn't any more today! The very first 'Thumb' instruction set laid the ground for irregular instruction coding, the need for an intermediate decoding level, and reduced regularity of the instruction set. That has grown 'worser and worser' with every new architecture revision; it is today very far from the RISC ideal of instruction word bits directly activating the various logic circuits. They have had to introduce caching and pipelining and lookahead and speculative execution and out-of-order execution and what-have-you of hardware speedup techniques. The instruction set has grown and grown and grown and ... certainly not always in an orderly, well-designed manner. AArch64 hasn't had as many years as x86/x64 to grow cancer, but the old saying that 'any sufficiently high-versioned standard is indistinguishable from a can of worms' is beginning to bite ARM as well.
Note that the discussion you are referring to is more than three years old. The thread is almost void of references to the AArch64 alternatives that were available even then, but has plenty of references to the M1 of 2007. It is tempting to suspect that a fair share of the commenters were not fully aware of the more recent (even then) updates to the architecture.
If you go into the details, 'the ARM Cortex architecture is largely continuous from their little M0 real time chips all the way up to their multicore A line' does hold true for a sizable common core, but not for the Thumb instruction sets. A number of the 'ordinary' instructions didn't make it to the 64-bit architecture. Compatibility at the binary level is significantly less than at the assembler source code level; some of the top AArch64 models have completely dropped support for AArch32. Vector instructions are now in the second version of their second generation.
Yet: I do like the general ARM architecture. I have come to love the register-based philosophy, with less reliance on the stack. I have seen how the system architecture for 'peripherals' integral to the CPU is great for extending the CPU in a SoC. I am really hoping that traditional PC manufacturers will soon come up with a broader range of ARM-based machines, covering even the more 'classical' kind of desktop machines in large cabinets, allowing for extensions with peripherals, memory, etc. that you can't add to a portable or tablet.
'The fact that ARM doesn't manufacture is also a huge win' - it is, but don't overestimate it. ARM provides a CPU core for anyone else to extend with their own (on-chip) peripherals, several architectural features are optional, and every manufacturer will pack the chip to their own preferences. So you will rarely, if ever, see a 'plug-in compatible' chip from an alternate vendor. If you have to switch to another chip manufacturer, be prepared for a different pin layout; maybe your old chip had some useful peripherals that are missing in the new one (and if the new one has a similar peripheral, it is almost certainly managed differently), and some instruction codes may be invalid because that option was left out of your new replacement chip.
A common core is of course a great win. But the salesman speak is often a lot more rosy than realities, especially if you are making use of optional functions and on-chip peripherals.
|
|
|
|
|
Good info/background.
Thanks
|
|
|
|
|
ARM is interesting, and certainly the Raspberry Pi situation has made ARM almost a household name. I'm watching RISC-V with interest. As a royalty-free instruction set, it might have legs. On the other hand, one of the big RISC-V development companies, SiFive (SiFive - Leading the RISC-V Revolution[^]), just laid off 20% of its workforce. So maybe RISC-V is not quite the industry darling some make it out to be.
I'm curious if anyone has any experience with RISC-V, and if so, is it great or meh?
Keep Calm and Carry On
|
|
|
|
|
I've only tinkered slightly with some of the RISC-V based ESP32s. Nothing special about them to me.
Sure, the instruction set is open, but RISC-V isn't entrenched. Inertia is everything in this arena, so for better or worse, I think ARM is the future, at least in the near to mid term. I don't think RISC-V will get the traction necessary to unseat it, particularly when you have everyone from Qualcomm to NXP manufacturing ARM chips.
I think RISC-V will find its niche in IoT more than anything, with companies like Espressif using it to spin off cheap MCUs, but I'd be surprised to find it in things like high-end phones.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
RISC-V is suffering from the same problem as a lot of 'open' projects: It is open to anyone to make their own additions, extensions, modifications, all in different directions. And people do. It may work out if there is a strong core under central management. I am not sure that the core is strong enough, and the central management tight enough.
From the outset, the architecture looks like it tries to be everything to everybody: address spaces of 32, 64, or 128 bits. Big endian and little endian. Lots of what is basic functionality in a modern x86/x64 are extensions that may or may not be there, and anyone can make their own proprietary extensions. The CPU is fundamentally 32 bits, but then come the 64-bit extensions. An opening for an alternate 16-bit instruction set (similar to ARM Thumb).
I suspect that the flexibility and openness will create such a "rich" (another possible word is "messy") world of options and extensions that it will lack the focus to become a mainstream success in general markets, where you are dependent on a lot of manufacturers offering identical facilities, to run identical programs in identical ways.
My guess is that it has a greater future in fixed code applications, like embedded/IoT, where the core functionality is more significant than the extensions and compatibility with other software is almost irrelevant. Also, for embedded/IoT solutions, the architecture license fee makes up a larger fraction of the unit cost, compared to e.g. a desktop computer, giving RISC-V a competitive advantage.
I'm happy with RISC-V entering my micro devices, but I strongly doubt that my next desktop machine will have a RISC-V CPU.
|
|
|
|
|
I agree that x86 is old and limited, but I'll believe that ARM is taking over once I begin to see ARM PC devices for sale on NewEgg.
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
I mean, they don't sell Apples at Newegg AFAIK, but given that Apple has two ARM based offerings now, it's only a matter of time before other manufacturers follow suit.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|