|
Bing pays me to use them. (Seriously, $5 gift cards for Amazon.)
|
|
|
|
|
I use it too - it's the only search engine accessible from my workplace network. It couldn't find the broad side of the only barn in a desert even if you manually rotated its head toward the barn and pointed at it with both fingers.
GCS d-- s-/++ a- C++++ U+++ P- L+@ E-- W++ N+ o+ K- w+++ O? M-- V? PS+ PE- Y+ PGP t+ 5? X R+++ tv-- b+(+++) DI+++ D++ G e++ h--- ++>+++ y+++* Weapons extension: ma- k++ F+2 X
|
|
|
|
|
Sounds like you've either got some kind of malware, or your profile is corrupt. I've been using FF since it first came out, and I've literally never seen it reset my default search engine.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
And I remember "cursing in tongues" when they were pushing Yahoo.
Director of Transmogrification Services
Shinobi of Query Language
Master of Yoda Conditional
|
|
|
|
|
Yup, that's 11GB for 35,000 rows of data in one table.
That's 323K per row.
Why?
Because it's storing XML data, and the XML data is huge. And this is how the company stores its mission critical customer data. As XML blobs. Now, mind you, that's just a small subset. The complete set has several million records. And this isn't the only table that contains these XML blobs, as they get replicated into different tables as the part of the various workflows.
To me, that just seems in[s]ane. But the core software is built on a third-party product that puts everything in XML because that way it's, um, extensible!!!
[update]SQL Server's "sp_spaceused" says that the index_size is 49,024,928 KB [/update]
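For the curious, the per-row figure follows directly from the totals in the post (the 11GB and 35,000 rows are from above; everything else is plain arithmetic, and the 323K quoted is the same ballpark once you allow for the table total being rounded):

```python
# Rough per-row cost of the XML-blob table described above.
table_bytes = 11 * 1024**3      # 11 GB reported for the table
rows = 35_000                   # row count reported

per_row_kb = table_bytes / rows / 1024
print(f"~{per_row_kb:.0f} KB per row")  # → ~330 KB per row
```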
|
|
|
|
|
Wow...
We have tables with hundreds of millions (8 zeroes) of rows, but the largest DB (of the largest client) doesn't cross 90GB...
Skipper: We'll fix it.
Alex: Fix it? How you gonna fix this?
Skipper: Grit, spit and a whole lotta duct tape.
|
|
|
|
|
another advantage you missed: xml compresses really well.
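True - all those repeated element names make XML a compressor's best friend. A minimal illustration with a synthetic blob (made-up record shape, zlib from the standard library):

```python
import zlib

# Synthetic XML blob: the same element names repeat for every "row",
# which is exactly the kind of redundancy DEFLATE loves.
record = (
    "<customer><id>{0}</id><name>Customer {0}</name>"
    "<status>active</status></customer>"
)
blob = "<customers>" + "".join(record.format(i) for i in range(1000)) + "</customers>"

raw = blob.encode("utf-8")
packed = zlib.compress(raw, level=9)
print(len(raw), len(packed), f"{len(raw) / len(packed):.1f}x smaller")
```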
Signature ready for installation. Please Reboot now.
|
|
|
|
|
Jeremy Falcon
|
|
|
|
|
Think about that. The XML tags are basically the metadata of the columns -- metadata describing the data.
Those are analogous to relational DB columns (names and type metadata).
So, what do people do? Store that metadata inside a database that already has its own way of tracking metadata.
Sounds efficient.
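The redundancy is easy to quantify: the relational model stores the column names once in the schema, while every XML record carries its own copy of them. A toy comparison (hypothetical three-column record, not the actual product's schema):

```python
# Hypothetical 3-column record stored two ways.
rows = 10_000
values = ("12345", "Jane Doe", "active")

# XML: element names repeat in every single record.
xml_row = "<row><id>{}</id><name>{}</name><status>{}</status></row>"
xml_bytes = sum(len(xml_row.format(*values)) for _ in range(rows))

# Relational-ish: column names stored once, rows hold values only.
schema = "id,name,status"
csv_bytes = len(schema) + sum(len(",".join(values)) + 1 for _ in range(rows))

print(xml_bytes, csv_bytes)  # tag overhead dominates as rows grow
```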
|
|
|
|
|
raddevus wrote: Those are analogous to relational db columns (names and type metadata).
Exactly, except that essentially, the schema keeps changing, so instead of doing DB migrations, they just add elements to the XML.
|
|
|
|
|
As one that does database administration, I would find that preferable. Less work for me. All the rework would be on the programming side of the house (also me) to incorporate the new XML schema changes.
if (Object.DividedByZero == true) { Universe.Implode(); }
Meus ratio ex fortis machina. Simplicitatis de formae ac munus. -Foothill, 2016
|
|
|
|
|
At my previous site, we had a vendor application where most of the data was in a single table: the 'Test' table. Another site decided they wanted ALL site data and overdid the pull: their database grew past 100GB, and they couldn't even back it up (not enough room).
We implemented a solution to 'archive' older test data to another database with joins in the stored procedures to get the data as needed.
Insanity reigns.
|
|
|
|
|
It can be optimized
One DB
One Table
One Field
One Record
Why else do we have linq2xml?
[Edit]
For the sake of backup convenience, maybe a second int autokey column would be appropriate. Record '0' is then the current version and all the others are backups.
modified 5-Dec-17 10:25am.
|
|
|
|
|
JustWatchLittle wrote: Why else do we have linq2xml?
because we needed a new acronym for GIGO
- the latter is last century's jargon... and as the kids say, "that's like 100 years ago."
Signature ready for installation. Please Reboot now.
|
|
|
|
|
I see, I'm definitely too old to discuss these things
|
|
|
|
|
JustWatchLittle wrote: It can be optimized
I could live with:
Three servers for the IT Department basement trolls
Seven databases for the Manager-lords in their window'd offices
Nine tables for Mortal Men doomed to their cubicles
One record for the CEO to monitor them all!
|
|
|
|
|
Cold be RAM and disk and case
And cold be the CPU under stress
Never more to access a data set
Never, till the CEO fails and the Company's dead
In the back room the DB shall die
And still on paper let them write
Till the dark admin lifts up his hand
And reboots the server at 3 AM
GCS d-- s-/++ a- C++++ U+++ P- L+@ E-- W++ N+ o+ K- w+++ O? M-- V? PS+ PE- Y+ PGP t+ 5? X R+++ tv-- b+(+++) DI+++ D++ G e++ h--- ++>+++ y+++* Weapons extension: ma- k++ F+2 X
|
|
|
|
|
And a partridge in a pear tree?
Someone's therapist knows all about you!
|
|
|
|
|
One ring to rule them all ...
I'd rather be phishing!
|
|
|
|
|
One Ring to bring them all and in the darkness bind them ...
|
|
|
|
|
I think that's why people are turning towards NoSQL platforms nowadays.
Not that it's an easy solution, but it does stop bogging down relational database engines that were never built for XML.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
This is a time bomb. Any day now, for any reason, this "database" will crash.
Press F1 for help or google it.
Greetings from Germany
|
|
|
|
|
KarstenK wrote: This is a time bomb. Any day now, for any reason, this "database" will crash.
Yup. I made that prediction within a month of being here. People look at me like I'm one of those homeless people holding up a "the world will end tomorrow" sign.
|
|
|
|
|
What you should do is set up a one-time agent that exports the data into a new table that contains a reasonable schema so you can illustrate the folly of their ways. It shouldn't take you more than an hour or so to do.
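In that spirit, a sketch of such a one-time shred: parse each XML blob and land it in a table with a real schema (standard library only; the element names and sqlite target here are hypothetical stand-ins, not the actual product):

```python
import sqlite3
import xml.etree.ElementTree as ET

# Pretend source: one XML blob per record (element names are made up).
blobs = [
    "<customer><id>1</id><name>Alice</name><status>active</status></customer>",
    "<customer><id>2</id><name>Bob</name><status>closed</status></customer>",
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, status TEXT)")

# One-time shred: parse each blob and insert it as a proper row.
for blob in blobs:
    doc = ET.fromstring(blob)
    conn.execute(
        "INSERT INTO customer (id, name, status) VALUES (?, ?, ?)",
        (int(doc.findtext("id")), doc.findtext("name"), doc.findtext("status")),
    )

print(conn.execute("SELECT * FROM customer ORDER BY id").fetchall())
# → [(1, 'Alice', 'active'), (2, 'Bob', 'closed')]
```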
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
|
|
|
|
|
Never mind - I just saw that the data keeps changing. In that case, saving the data as a comma-delimited blob would be better, because then you could include the header row, and thus you can be reasonably confident that you can correctly parse it out of the blob.
If you need it, I have a CSV parser that parses the data into a DataTable without intervention from the programmer. Just aim a stream at it and let it eat (and it even determines appropriate data types, unlike Microslop's importer).
Go here for an explanation of the class and to get the download - it's part of a larger application, but it should be a simple matter to extract what you need.
SQLXAgent - Jobs for SQL Express - Part 3 of 6[^]
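For reference, the "comma-delimited blob with a header row" idea takes only a few lines with the standard library (the type inference below is a deliberately naive sketch, not the parser from the linked article):

```python
import csv
import io

def parse_blob(blob: str):
    """Parse a comma-delimited blob (header row first) into typed dicts."""
    def coerce(s):
        # Naive type inference: try int, then float, else leave as text.
        for cast in (int, float):
            try:
                return cast(s)
            except ValueError:
                pass
        return s

    reader = csv.DictReader(io.StringIO(blob))
    return [{k: coerce(v) for k, v in row.items()} for row in reader]

blob = "id,name,balance\n1,Alice,10.5\n2,Bob,7\n"
print(parse_blob(blob))
# → [{'id': 1, 'name': 'Alice', 'balance': 10.5}, {'id': 2, 'name': 'Bob', 'balance': 7}]
```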
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
modified 5-Dec-17 14:24pm.
|
|
|
|