|
abmv wrote: select * from usa_digital_currency
No, the codez is
SELECT MONEYZ FROM DUAL;
Oh sanctissimi Wilhelmus, Theodorus, et Fredericus!
|
|
|
|
|
Microsoft is finally planning to block Visual Basic for Applications (VBA) macros by default in a variety of Office apps. Feels like I've reported this many times before, but that was just Excel
|
|
|
|
|
What's changed is that instead of the Mark of the Web triggering a banner you have to click in Excel to enable macros (along with all the other read-only crippling), you now need to clear the mark in Explorer, because dismissing the read-only banner no longer enables macros in downloaded files.
Since read-only mode disables features as basic as sorting and filtering, this is an improvement for anyone who doesn't use macros, even if it's also stupid.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
|
|
|
|
|
Having a design system makes it easy for developers to onboard new team members, quickly find answers to basic questions without having to DM someone and wait for an answer, download common assets like logos and icons, and so much more. But I thought all developers were great at design?
|
|
|
|
|
We have a folder containing project documents, what do we call it?
A Design System!
|
|
|
|
|
Because "a folder" isn't manager-sexy!
TTFN - Kent
|
|
|
|
|
Looking at Microsoft and their icon fever... they are.
M.D.V.
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
The grandly-named Internet Computer is, at first glance, something between a blockchain and a decentralized cloud provider. Oh, we're back on the blockchain gang
|
|
|
|
|
'The cloud' is a synonym for 'someone else's computer'.
Any processing provided across the internet is the cloud.
|
|
|
|
|
Cross-compilers aren't good enough if you want to encourage developers to port apps for Windows on Arm. We have no right to bear ARM?
|
|
|
|
|
I always wished that ARM had termed their GPUs 'Low Energy Graphics'.
I always wanted a machine with two CPU chips and two GPUs working in tandem, having two ARMs and two LEGs.
|
|
|
|
|
But how much would it cost?
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
at least an arm and a leg plus a pound of flesh ?
«The mind is not a vessel to be filled but a fire to be kindled» Plutarch
|
|
|
|
|
When you’re learning computer science, you typically learn that programming languages fall into two categories. I guess it depends on how you interpret your compiler?
|
|
|
|
|
Kent Sharkey wrote: how you interpret your compiler? A magical program that emits error messages.
|
|
|
|
|
Kent Sharkey wrote: I guess it depends on how you interpret your compiler?
Wrong! It's all about how you compile your interpreter!
"The only place where Success comes before Work is in the dictionary." Vidal Sassoon, 1928 - 2012
|
|
|
|
|
Kornfeld Eliyahu Peter wrote: It's all about how you compile your interpreter!
Exactly. Interpreters are compiled. Then again, somewhere in the dark recesses of ancient history I remember writing an interpreter on top of an interpreter. Probably in my Commodore PET / C64 days.
|
|
|
|
|
Quote: If we take just the definitions above, we see they don’t really mean anything:
1. Any “interpreted language” can be compiled, by making a compiler which emits the interpreter bundled with the source program.
2. Any “compiled language” can be interpreted, by making an interpreter which compiles the program and then immediately runs it.
Yes, but then in case 1, you've now created a compiled language, and in case 2 (gads, interpreting machine code???) you've written an interpreter.
He clearly needs to take some courses in logic.
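For what it's worth, point 2 is easy to demonstrate in Python, whose built-in compile()/exec() pair is literally an "interpreter which compiles the program and then immediately runs it" (the source string and names below are just made up for illustration):

```python
# Point 2 above in miniature: compile first, then immediately run.
source = "result = sum(n * n for n in range(5))"

# Step 1: a real compilation step, producing a bytecode object.
code_obj = compile(source, "<demo>", "exec")

# Step 2: immediately execute the compiled bytecode.
namespace = {}
exec(code_obj, namespace)

print(namespace["result"])  # 0 + 1 + 4 + 9 + 16 = 30
```

Which is rather the point: once "compiled" and "interpreted" can be stacked like this, they describe implementations, not languages.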
|
|
|
|
|
I'd argue that the IL (Intermediate Language) bytecode for both dotNet and Java is interpreted in some instances and compiled in others.
I think the real distinction comes from the fact that good compilers do a lot of static analysis of the code to identify errors before the code is executed, while interpreters wait for code execution before doing even this level of analysis.
|
|
|
|
|
obermd wrote: I'd argue that the IL (Intermediate Language) bytecode for both dotNet and Java is interpreted in some instances and compiled in others. Are there any interpreters of dotNet IL in existence? The language was never designed for direct interpretation, and my gut feeling is that it would require quite some machinery to realize. And I don't see any advantage of direct interpretation, rather than doing JIT compilation the "normal" way for IL.
Java bytecode was, on the other hand, explicitly designed for direct interpretation (strongly inspired by the quite successful Pascal P4 bytecode). For years, it also was interpreted directly: Compilation down to native code didn't come until the critical voices about performance became too strong. (I don't know when the first Java bytecode compiler was created, but it sure took a number of years after the release of Java to become widespread.)
In the first years of the JVM, it was marketed as a front-end solution: the code would be dynamically deployed to (typically) web browsers, to be interpreted in the JVM of the browser. There were such browser implementations; I am not sure whether I ever touched one personally. It was a big flop, anyway. No one in those days talked about the browser compiling the bytecode to native code.
I guess that the dotNet people had similar visions of IL being distributed to browsers, and JIT compiled there. Maybe it could have succeeded if Javascript hadn't been around. I wouldn't exactly say that JS is any "better" solution, from a technical viewpoint, but we just have to accept that it won the battle for browser control. We have to live with it.
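As a point of comparison for bytecode that is designed for direct interpretation: CPython still works much the way early JVMs did, and the standard dis module will show you the stack-machine instructions its interpreter loop executes (the function here is just a throwaway example):

```python
import dis

def square_plus_one(x):
    return x * x + 1

# CPython compiled this function to stack-machine bytecode; the
# interpreter loop executes these instructions one at a time.
dis.dis(square_plus_one)

print(square_plus_one(3))  # 3 * 3 + 1 = 10
```

The exact opcode names vary between CPython versions, which is itself a reminder that the bytecode is an implementation detail, not part of the language.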
|
|
|
|
|
IL isn't interpreted. It's simply an intermediate symbolism that is then Just-in-Time compiled to the target processor's instruction set so it can be optimized for the particular processor.
|
|
|
|
|
Any interpreter that before execution makes a complete syntactical analysis of the source code, as well as all the semantic analysis that can reasonably be performed statically, is indistinguishable from a compiler.
But what's the use then? If you spend all those resources on detecting syntactic and semantic errors in code that is not yet to be executed (and may end up not being executed at all), you will be doing the major part of the work required for generating 'compiled' code. So why not go the full length?
Ten or twenty years ago, people were still arguing in favor of interpretation because 'You don't have to wait for a lengthy compile'. Last time I met this argument, about fifteen years ago, I dug up the log from my last compile, showing up to eight module compilations completing per second. (We used an old style make system that compiled only modules affected by code changes; the number of recompiled modules was usually quite limited.) Later, I have laughed off such arguments.
I'd much rather have a compiler telling me of a problem in, say, an exception or error handling code before the software is shipped to the customer. Even if the execution start is delayed by a whole second (often it is far less, on today's machines), I think it is worth it.
My first statement is not completely correct, though: A modern compiler will often do a lot more, such as code optimization and flow analysis to determine stack requirements. I take this as extra bonus advantages of compilation. Even without it, the complete syntactical check and static semantic analysis can alone justify a delay of a second, or even two seconds if required.
Sure, there are linters and similar tools adapted to interpreted languages. Except for huge, costly static analysis software (in the class of, say, Coverity), they do a much poorer analysis job than a decent compiler, and certainly no code optimization. If you think that the F5 delay in Visual Studio is frustrating, so you'd rather run lint before execution (not to speak of Coverity!), then you are on the wrong track.
A much more essential distinction between languages is the type strictness (and static-ness). To some degree, this correlates to interpreted vs. compiled: On the average, compiled languages have stricter type control, and are able to point out semantic issues at compile time. Few languages designed for interpretation provide the same type control as those designed for compilation. This is just correlation, not causality. So the real issue is kind of type control, regardless of how the code is processed.
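The "when are errors caught" distinction is easy to see even inside one language. In Python, a syntax error is caught by a compilation step before anything runs, while an undefined name sails through that step and only fails at run time (the snippets are made up for illustration):

```python
# A syntax error is caught before any code executes:
try:
    compile("x = = 1", "<demo>", "exec")
except SyntaxError as err:
    print("caught before execution:", err.msg)

# An undefined name compiles fine and fails only when executed:
code_obj = compile("print(undefined_name)", "<demo>", "exec")
try:
    exec(code_obj, {})
except NameError as err:
    print("caught only at run time:", err)
```

A compiler for a stricter language would move that second class of error to the left as well, which is really what the type-strictness point is about.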
|
|
|
|
|
When you tell people something important, you want it to be easily understood. I'm not sure this is. "Marketing is far too important to be left only to the marketing department!."
|
|
|
|
|
|
Make your codebase easy for everyone to get acquainted with "Welcome to my nightmare. I think you're gonna like it. I think you're gonna feel you belong"
|
|
|
|