|
They only said yes because they thought he was overpaying in the first place.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
|
|
|
|
|
Dan Neely wrote: They only said yes because they knew he was overpaying in the first place.
FTFY.
|
|
|
|
|
He's got enough headaches!
Even though I've never been on Twitter, from what I've seen it would have been a refreshing change.
The most expensive tool is a cheap tool. Gareth Branwyn
JaxCoder.com
|
|
|
|
|
I keep a twitter account because local municipalities will use it to post information.
I think today I got a road closure announcement from one of the city departments I subscribe to.
My local police (and since I'm in the US, my state troopers) post events and sometimes that's helpful for knowing where to avoid. City council meetings get posted in my area as well.
That's my use case and why I would recommend using it for that narrow band of information. I don't see the point in following celebs or politicians, personally. Or arguing with people over meaningless things since twitter isn't an online community, unlike this forum.
That's the value I see.
Here's a sample:
In case you missed Gilbert Police Dept.'s Tweet
Heads up #GilbertAZ! 🚔
We have multiple reports of downed trees and debris blocking roads throughout town in the area between Williams Field and Baseline Roads and Greenfield and Cooper Roads.
Please avoid travel if possible while crews work to clear the roads. https://pic.twitter.com/FBUzCxKzzg
|
|
|
|
|
It has its uses.
I had a FB account for many years so I could keep up with my Nam buddies, but it just got to be politics, arguments, and dirty laundry so I deleted my account. IMHO FB is the scourge of the earth.
BTW: I grew up in Phoenix, last time I visited my brother we went to Gilbert. Nice town!
The most expensive tool is a cheap tool. Gareth Branwyn
JaxCoder.com
|
|
|
|
|
I got sucked into forums with long political fights ages before facebook so I knew what I'd get if I signed up for that one. I never crossed that line in spite of pressure from friends. Sometimes they'd only post a party invite on Facebook and I'd chew them out over that.
Cool that you grew up in Phoenix. I grew up in Denver, so of course I'm not used to the heat here.
But the environment here is very healthy for my body overall, which is why I moved. Denver was killing me, and it's lightweight compared to some places with snow. The oxygen is nice, too.
|
|
|
|
|
scott mcnulty 2021 wrote: so of course I'm not used to the heat here.
Takes a while to get used to the dry heat. Where I'm at now (FL) it's hot and humid and I'm old so it's hard on me.
The most expensive tool is a cheap tool. Gareth Branwyn
JaxCoder.com
|
|
|
|
|
I don't know, you might think that someone forking over $44B might have done a hell of a lot of due diligence before making the offer.
Unless you're a petulant child who wants it, and wants it now!
|
|
|
|
|
Due diligence isn't normally done until an offer is made, and provisionally accepted: it requires a detailed review of a company including information that isn't normally made public.
Make sure a business is worth buying: due diligence[^]
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Thank you. I was going to make that very point.
An offer is made.
Pending due diligence, and usually other stipulations (like not reorganizing, not firing key people, and not making earth-shattering side deals, such as selling the company's assets to another entity).
And I'm not sure whether "not having LIED to the SEC about the NUMBERS you report" is actually put in writing, or whether it's covered by one of the many "...assuming there are no material abnormalities with publicly available information..." clauses (which I'm confident any billion-dollar offer comes with).
|
|
|
|
|
Why are people defending a well-known corporation that lies and bans people who don't adhere to state narratives... and then just straight-up lies about it?
Most tribalistic NPC stuff I've ever seen.
|
|
|
|
|
You spend 15 minutes trying to work out why your SQL table only has 1000 rows.
And then, after reviewing every piece of code, every config setting, every stored procedure and every hacky line of code that may say "let's cap this at 1000 in case we run out of space" you realise you're looking at output generated by SELECT TOP (1000)...
cheers
Chris Maunder
|
|
|
|
|
Rookie mistake!
Or maybe you need a better night's sleep?!
|
|
|
|
|
Usually such a fixed number of records sounds suspicious...
diligent hands rule....
|
|
|
|
|
One of the rules I follow when programming is never to make infinite loops. For-loops with an explicit test on an iteration counter are reasonably "safe", and I prefer those when they don't mess up too much.
Sometimes while-loops come a lot more natural. Unless there are other explicit and obvious tests that guarantee loop termination, regardless of (loop) external conditions, I always add something like
int iterationCount = 0;
while (<some 'unsafe' condition>) {
    if (iterationCount++ > 10000) break;
    // ... loop body ...
}
The limit is arbitrarily chosen, but must of course be larger than the actual iteration count in any 'normal' situation. If the loop still runs after (say) 10000 iterations, something is surely wrong.
Yes, it takes two extra lines, and a couple of extra instructions per iteration. But it makes sure the code is never stuck in an infinite loop when it shouldn't be.
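The same guard, written out as a runnable sketch (Python here for brevity; the helper name and the 10000 cap are illustrative, not from the original post):

```python
def iterate_until_stable(step, value, max_iterations=10000):
    """Run 'step' until the value stops changing, guarding the
    potentially 'unsafe' loop with an iteration counter."""
    iteration_count = 0
    while step(value) != value:   # may never become false on its own
        value = step(value)
        iteration_count += 1
        if iteration_count > max_iterations:
            # far beyond any 'normal' iteration count: something is wrong
            raise RuntimeError("probable infinite loop: gave up after "
                               f"{max_iterations} iterations")
    return value

print(iterate_until_stable(lambda v: v // 2, 1_000_000))  # halving reaches 0
```

A diverging step function (say, `lambda v: v + 1`) trips the guard and raises instead of hanging the program.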
|
|
|
|
|
I use techniques like that and usually make the safety stop a multiple of the container size being processed.
I always use this pattern when I am removing processed items from a list and expect the list to be empty as the normal loop termination.
After loop exit, dump any items that do remain in the list
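That pattern might be sketched like this (Python; the `process` callback and the pass count of 3 are made-up placeholders): the cap is a multiple of the container's size, an empty list is the normal exit, and whatever survives the loop is dumped back to the caller.

```python
def drain(items, process, max_passes=3):
    """Pop items and process them; unprocessable ones are re-queued.
    Normal termination: the list becomes empty. Safety stop: a cap
    that is a multiple of the container size being processed."""
    limit = max_passes * max(len(items), 1)
    iterations = 0
    while items:
        iterations += 1
        if iterations > limit:
            break                    # safety stop hit
        item = items.pop(0)
        if not process(item):
            items.append(item)       # not ready; retry on a later pass
    return items                     # after loop exit: dump any leftovers

leftovers = drain([1, 2, 3], process=lambda x: x != 2)
print(leftovers)  # the item that could never be processed: [2]
```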
|
|
|
|
|
It's been a while, so this story may be dated. I was using some recursive tree-balancing code I got from Dr. Dobb's Journal (for the young-uns: a very cool magazine). It was very solid and got used all over the place. I decided to test it on Windows 10 (64-bit) with Visual Studio. Once recursion hit 64 levels, Windows halted the program with no messages, errors or otherwise. It just quit. Not documented anywhere that I could find, so go figure. Sort of like the 1000 limit you encountered: a hard, round number is clearly not an accident.
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
I'm curious: What did the 64+ recursion levels express? My first guess is that descending one level in the tree would lead to another recursion level. But did you really handle a tree 64 levels deep? What the elephant kind of information did that tree store? What kind of tree was it? Binary? Even if binary, it would have to be extremely unbalanced if it had a 'reasonable' number of nodes, yet 64 levels deep.
Even if recursion was used for purposes other than descending one tree level, exceeding 64 recursion levels is almost always the result of a faulty termination test, a data structure fault (such as a circularity in the tree), or some other error. Mathematicians are allowed to use unbounded recursive definitions, but they don't map well to program code.
Obviously, you should have been given some sort of error message. I have had stack overflows that were not caught, just crashing the way you report your program did; chances are the real problem was a stack overflow. I have worked with people who strongly object to all sorts of 'unnecessary' runtime tests, arguing that if you really need more stack space, it is your responsibility to set it up, not the runtime's to check for it. Maybe those who made the runtime system you were using were of that kind.
|
|
|
|
|
The input to the tree-balancing code was a sorted list of names (a tree n deep) and various randomly ordered lists of names (randomly distributed trees of mixed depths). I don't recall how long the lists were, but pretty long.
I was curious about the code because it was so terse and worked like magic, if not for the stack overflow. Somewhere I have that old code. If I find it I will post it.
I agree that one should make one's own stack, because you can control the stack depth and test for circularity, etc.
I have written a number of tree libraries (n-ary mostly) and that's the way I do it.
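For a binary tree, "making your own stack" can look like the sketch below (Python; the `Node` shape and the depth cap are assumptions for illustration, not from the libraries mentioned). Depth is then limited by heap memory rather than the call stack, and the explicit stack can be tested against a depth cap, which also catches circularities.

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def inorder(root, max_depth=10_000):
    """In-order traversal with an explicit stack instead of recursion."""
    result, stack, node = [], [], root
    while stack or node is not None:
        while node is not None:
            stack.append(node)
            if len(stack) > max_depth:     # our stack, our depth test
                raise RuntimeError("tree deeper than expected: cycle?")
            node = node.left
        node = stack.pop()
        result.append(node.value)
        node = node.right
    return result

tree = Node(2, Node(1), Node(3))
print(inorder(tree))  # sorted order for a binary search tree: [1, 2, 3]
```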
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
|
Yes, I'm aware of those earlier journals. Had a subscription for many years.
The balanced-tree code came from a 1992 Dr. Dobb's (the January edition, I think).
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
Yes, a balanced binary tree is the result of the routine.
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
use
SELECT TOP (1000) instead of
SELECT TOP (1024)
"In testa che avete, Signor di Ceprano?"
-- Rigoletto
|
|
|
|
|
The old Macs counted 1000 bytes as a kB.
They always boasted of having 2.4% more disk space than DOS/Windows with the same hard drive.
|
|
|
|
|
Disk manufacturers always specify capacity in decimal units - kilo, mega, giga, tera. I recently bought a 16 TB disk: 'Properties' says that it holds 16 000 881 782 784 bytes, or 14.5 TB. (It is a pity that Windows won't use the proper unit: TiB, not TB!)
Network people also use decimal units. Plus, they count bits, not bytes. When we switched from analog modems to ISDN, some people were complaining loudly: they expected a B-channel to carry 65 536 bytes/sec, while it carried 64 000 bits/sec (or 8000 bytes/sec).
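The arithmetic behind both examples, as a quick sanity check (Python):

```python
TB = 10**12    # decimal terabyte: what disk manufacturers sell
TiB = 2**40    # binary tebibyte: what Windows actually computes (but labels "TB")

drive_bytes = 16_000_881_782_784       # the '16 TB' disk above
print(drive_bytes / TB)                # ~16.0: the vendor's number
print(drive_bytes / TiB)               # ~14.55: what Windows shows as 14.5 'TB'

# ISDN B-channel: 64 kbit/s counted in decimal bits, not 64 KiB/s
print(64_000 / 8)                      # 8000.0 bytes/sec, far short of the
                                       # 65 536 bytes/sec some people expected
```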
|
|
|
|