|
Sander Rossel wrote: I thought all that old stuff was abbreviated to save memory.
Sort of. I seem to recall that early linkers had only an 8 (or maybe 16) character limit for external identifiers, so that too played a part in the naming of system functions.
Keep Calm and Carry On
|
|
|
|
|
I am sure that you are right.
My next question is how much time your typical application spends inside strpbrk(). I can imagine that you can set up testbeds where it exceeds one percent of the total CPU load. But that is a testbed.
Can you set up a true, user-level application, solving a true user problem, where more than a single percent of the CPU time is spent inside strpbrk()? At a single percent, doubling the speed of strpbrk() might speed up the application by a whopping half percent. Woooah!
Sure: I see that thirty or seventy-five such optimizations together might be significant, taken as a whole. So go ahead with the twenty-nine, or seventy-four, other optimizations. Then serve the pudding.
The proof of the pudding is the pudding you serve to the end user.
|
|
|
|
|
I am using real-world data collected from an online repository at TMDB.com. I have 200kB of actual data from their repository, and then I synthesized 20MB of similar data in the same schema. I could have downloaded 20MB of JSON from TMDB.com. The only problem is that then I'm downloading 20MB of data from TMDB.com, and their rate limiting will hate me.
Now, for a real-world scenario where you're actually using TMDB's data, you'll likely end up mirroring their repository as you retrieve parts of it. For example, their repository contains every show and movie you'll find at IMDb.com, but in JSON format. Now if I only want shows from 2019, I can get those, but the point is the process is fetch on request, then cache. If you were to retrieve all the data, the entire repository would be mirrored locally.
It is from this mirror that I'd want to extract data.
So yes, that's a real world scenario. I even have a C# library that does this for tmdb.com already, but not using this json parser, which is in C++.
I've profiled it using the GNU profiler on Linux, but nothing else.
Most of the function time is in strpbrk(), at least for long scans.
More importantly, I know my actual throughput. I'm currently getting 2/3 of the throughput I got on a Linux machine, on a Windows machine whose hardware is maybe 10 times as fast or more.
And I know which function primarily impacts that throughput, because I've already profiled.
It's skipToAny(), which in the best case uses strpbrk() - it can't on Arduinos, but it will on Windows.
Real programmers use butterflies
|
|
|
|
|
Want to use MS VC++ under windows with VS Code?
Good luck. Microsoft in their infinite wisdom
A) Set it so you can't use MSVC without running a batch file first
B) Made the batch file completely unreadable. I can't even tell where it sets PATH. How do you even do that?
C) Is just generally terrible.
D) Negates all the "Run VS Code here" shell extensions, since you need to launch code from the batch-file environment in order for the compiler to work.
Tell me: Why in the world would *anyone* think it was a good idea to install MSVC++, not put it in the PATH, and then make it near impossible for you to do it yourself? Why?
Do they *want* me to move away from Windows for all my C++ development?
Real programmers use butterflies
|
|
|
|
|
Well, possibly they want you to move away from C++ for all your Windows development ...
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Probably. I just opened a nastygram of an issue over at the VS Code C++ extension's github repo.
This is just unacceptable.
Real programmers use butterflies
|
|
|
|
|
Hmmm, all you need to do is execute vcvarsall.bat to set up your environment?
honey the codewitch wrote: Tell me: Why in the world would *anyone* think it was a good idea to install MSVC++, not put it in the PATH Because most C++ developers are using multiple tools and compilers (and multiple compiler versions). Keep in mind that Visual Studio allows you to compile with older versions of CL and ancient linkers.
|
|
|
|
|
Yes I know how to do that.
That is not the problem.
The problem is that that kills workflow. I can no longer click on my project folder and go "Open with VS Code" because Microsoft stinks.
If microsoft didn't stink (like that will ever happen) they would run vcvars from inside VS Code when you're using the C++ extension
Adding: "people use ancient compilers" isn't really an acceptable justification. Workflow should not be killed when using the standard compiler just because you might use an ancient one. You can OVERWRITE environment variables, after all. Linux gets it. Microsoft is clueless.
Real programmers use butterflies
|
|
|
|
|
Well,
VS Code is open source under the MIT license, so you are free to modify the behavior. Or you can open an issue to request a new feature. It sounds like a great feature to add to the VS Code C++ extension.
|
|
|
|
|
I've opened an issue already.
Real programmers use butterflies
|
|
|
|
|
honey the codewitch wrote: I've opened an issue already. Great.
In the old days I would always manually remove the build tools from the %PATH% environment variable.
I don't know how long you've been developing with C++ on Windows, but in the old days (90's-2000's) the build tools were added to the %PATH% environment variable. That caused a lot of problems, because developers would install the Windows SDK, which had its own compiler and linker. Also, device driver developers would install the DDK, which yet again had its own compiler and linker. Then there were some guys (like me) that would have VC6, VS2005, VS2008, VS2010, VS2012.NET, and VS2013 all installed on the same workstation. I was so happy when VS2015 allowed me to compile with older versions. It meant that I didn't have to install 5 different versions of Visual Studio.
Best Wishes,
-David Delaune
|
|
|
|
|
Randor wrote: which had its own compiler and linker
I think I see the problem right there. I wonder why microsoft didn't?
Randor wrote: Then there were some guys (like me) that would have VC6,VS2005,VS2008,VS2010,VS2012.NET,VS2013 all installed on the same workstation.
Sane thing (meaning not in the cards for MS): set the env vars to the latest compiler, and allow the user to run a batch file to set the env vars for the older compilers. Better yet, make the latest compiler support older compilation**
** which would have been easier if microsoft hadn't spent years ignoring the C++ standard
Insane thing: Make everyone's life harder by not having sane defaults, and by using crap compilers for years before finally deciding that standards matter.
Real programmers use butterflies
|
|
|
|
|
It was a really long time ago, but I believe with VS2005 there was a post-install step, "Add build tools to environment" or some such. It caused a lot of problems, because developers would install VS2005 *after* VS2008/VS2010, and then they would go and compile the Boost library or something, and BJAM would use the older VS2005 compiler.
Anyway, what you are proposing is more viable today, now that VS integrates the older build tools.
|
|
|
|
|
Yeah, see I wouldn't have made that decision. I would have included "Add build tools to environment" with every version, and made it replace the old PATH variables and such. More work, but way better in the end. I want to say (but it's a guess) that this is what happens when GCC is installed, or something similar, but I could be wrong.
Real programmers use butterflies
|
|
|
|
|
gcc (on Linux) generally installs different versions as symlinks in /usr/bin to architecture/version specific executables. Each symlink has the version appended - this is what my /usr/bin contains:
lrwxrwxrwx 1 root root 22 Dec 4 2019 /usr/bin/gcc-7 -> x86_64-linux-gnu-gcc-7
lrwxrwxrwx 1 root root 22 Mar 10 2020 /usr/bin/gcc-8 -> x86_64-linux-gnu-gcc-8
lrwxrwxrwx 1 root root 22 Apr 23 2020 /usr/bin/gcc-9 -> x86_64-linux-gnu-gcc-9
And then gcc (with no version) is a symlink to one of the versioned symlinks (e.g. gcc -> gcc-8 ). All this means that using a specific (major) version of gcc is pretty simple...
If you want to compile 32-bit code rather than 64-bit, then install g++-multilib and compile with the -m32 flag.
Java, Basic, who cares - it's all a bunch of tree-hugging hippy cr*p
|
|
|
|
|
Randor wrote: Then there were some guys (like me) that would have VC6,VS2005,VS2008,VS2010,VS2012.NET,VS2013 all installed on the same workstation
As far as I'm concerned, VMs are the best thing to have been invented to keep my build systems clean and manageable. Separation of concerns and that type of thing.
My understanding is that containers are intended to take things one step further. Maybe it's because I'm now set in my ways, but nothing I've read so far about containers actually seems to make things enough simpler to make me want to change my methodology.
Same with multibooting. Why bother? In this day and age, as far as I'm concerned, rebooting is a sin. If someone has to abandon what they're working on to boot another OS, IMO they're doing it wrong.
|
|
|
|
|
Yeah,
The only time I use a VM is when I am doing device driver development. That way when I break the operating system I can just rollback. It's also a necessity for attaching WinDbg and debugging from the host machine.
Regarding the VC6, VS2005, VS2008, VS2010, VS2012.NET, VS2013 setup... I've got a second workstation right next to me with that setup. There are zero problems running all those versions of Visual Studio, as long as you do not have default build tools in your %PATH% environment variable.
dandy72 wrote: Same with multibooting. Agree, my workstation is too fast, I can run 3 operating systems in a VM concurrently and it doesn't seem to impact anything. The solid-state drives make a huge difference. I don't even need this much speed.
|
|
|
|
|
Randor wrote: I can run 3 operating systems in a VM concurrently and it doesn't seem to impact anything
My current VM host is an i7-4820K (Ivy Bridge generation, I believe), overclockable but I never bother with this sort of stuff. While it's getting long in the tooth in terms of age, I currently have 9 VMs running simultaneously as I'm writing this (roughly twice as many powered off), with individual VMs being allocated anywhere between 2 and 16GB of RAM.
With a total of 64GB of RAM, memory limitations are a problem long before CPU performance is. Mind you, I've never run (for example) VS natively on the host, so I don't have figures to compare with, but suffice it to say that I've never felt that anything running on any of the VMs was "slow" in the sense that I'd expect it to run much faster on the raw hardware.
And really, the only reason my VM host was upgraded from a Sandy Bridge CPU is that the board it was sitting on had a limit of 32GB of RAM, and I was constantly running out. That machine got repurposed from a VM host into a gaming rig. My current VM host will probably go through the same path, once I decide my gaming machine is getting too slow. Right now, I fully expect that, when I decide to replace the machine, it'll be because I want to double the RAM rather than anything to do with CPU performance.
|
|
|
|
|
Moving in that direction myself. Some of the tools out there that you think you want can really fry your machine. I cite two examples:
1) Platform Builder from MS. After installing it, it wiped my build environment. Never again.
2) Intel VTune. Ouch again.
You just have to protect yourself. Rollbacks don't cut it any longer.
Charlie Gilley
Stuck in a dysfunctional matrix from which I must escape...
"Where liberty dwells, there is my country." B. Franklin, 1783
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
|
|
|
|
|
Why, and I ask in all professional seriousness, would you want an application using a variety of compilers and linkers, and hoping Microsoft got it right?
Sounds like a recipe for a) suicide b) grief c) project failure d) infinite income (I'm a contractor).
Charlie Gilley
Stuck in a dysfunctional matrix from which I must escape...
"Where liberty dwells, there is my country." B. Franklin, 1783
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
|
|
|
|
|
Hmmmm,
charlieg wrote: would you want an application using a variety of compilers and linkers Due to how C++ dependencies work (and Windows SDK incompatibilities/changes), most project maintainers use the same compiler version for patching and bug fixing on older projects. If the software you are maintaining is 10 years old, then it's probably running on older hardware and operating systems.
|
|
|
|
|
CALL "C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Auxiliary\Build\vcvarsall" x64
in a command script does it for me (or x86 if you want 32 bit).
I can then run the compiler (cl) from the command prompt. I have tried to use VSCode extensions and projects but never managed to get it to work.
|
|
|
|
|
Yes I know how to do that.
That is not the problem.
The problem is that that kills workflow. I can no longer click on my project folder and go "Open with VS Code" because Microsoft stinks.
If microsoft didn't stink (like that will ever happen) they would run vcvars from inside VS Code when you're using the C++ extension
Real programmers use butterflies
|
|
|
|
|
You're not the only person to feel that pain...
As it happens, I use MSVC with VS Code for most of my development. Intellisense works fine, and I build from the integrated terminal. This relies on a batch script I wrote years ago that would call the appropriate Visual Studio setup script before either dropping back to the CMD prompt or jumping into a bash shell (used to be msys2 bash, now I use WSL bash). The way I've configured it, VSCode uses the script to set its integrated terminal up for MSVC2017/x86.
Here's the Visual Studio 2015 part of that script's code (my script supports MSVC 2003, 2008, 2010, 2013, 2015, 2017 and 2019 - fortunately, 2013 is the oldest one I use at all now!):
if not exist "C:\Program Files (x86)\Microsoft Visual Studio 14.0\vc\vcvarsall.bat" (
call :not_installed "Visual Studio 2015"
exit /b 1
)
rem %~1 is either x86 or amd64
call "C:\Program Files (x86)\Microsoft Visual Studio 14.0\vc\vcvarsall.bat" %~1
goto :eof
Yes, hard-coded paths suck, but ¯\_(ツ)_/¯
Java, Basic, who cares - it's all a bunch of tree-hugging hippy cr*p
|
|
|
|
|
I like the way that tweet described the batch file.
Also, thanks!
Real programmers use butterflies
|
|
|
|
|