If we use bit groupings for the ranks of each candidate, that would look something like:
Bits per candidate: floor(log_2(candidateCount)) + 1
Bits per vote: candidateCount * bitsPerCandidate
Total space for election: voters * bitsPerVote
So given 300,000,000 voters with 12 candidates:
space = 300,000,000 * 12 * (floor(log_2(12)) + 1)
space = 14,400,000,000 bits or 1.8 GB
Given the same number of voters with 24 candidates:
space = 300,000,000 * 24 * (floor(log_2(24)) + 1)
space = 36,000,000,000 bits or 4.5 GB
Space definitely isn't the issue. What about the algorithm?
1) Bin every vote by first-pick candidate.
2) If no bin holds >50% of the votes, eliminate the smallest bin, move each of its votes to that vote's next-ranked surviving candidate, and repeat #2.
3) The >50% bin is your winner.
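The three steps above can be sketched as follows. For simplicity this version re-bins every ballot each round instead of only redistributing the eliminated bin, and it assumes each ballot is a list of candidate indices in preference order; tie-breaking is left to whatever `min`/`max` happen to pick.

```python
from collections import defaultdict

def run_election(ballots):
    """Instant-runoff count: eliminate the smallest bin until one has a majority."""
    eliminated = set()
    while True:
        # 1) Bin every ballot by its highest-ranked surviving candidate.
        bins = defaultdict(int)
        for ballot in ballots:
            for candidate in ballot:
                if candidate not in eliminated:
                    bins[candidate] += 1
                    break
        # 3) A bin with a strict majority is the winner.
        total = sum(bins.values())
        leader = max(bins, key=bins.get)
        if bins[leader] * 2 > total:
            return leader
        # 2) Otherwise eliminate the smallest bin and go around again.
        eliminated.add(min(bins, key=bins.get))
```

For example, with candidates 0, 1, 2 and eight ballots where 0 and 1 each lead on three and 2 leads on two, candidate 2 is eliminated first and its ballots decide the winner.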
I honestly don't see why this would be an issue. Worst case, the run-time is something like v + (c-2)*(v/2), where v is the vote count and c is the number of candidates. The v/2 is just a napkin-math average, since the redistributed bins start out small and grow as the candidate list shrinks.
Again, just a bunch of napkin math, so I probably missed something, but I never bought the "it's too hard to compute" excuse either. And I'm sure there are clever tricks to reduce the time required.
EDIT: For reference, if processing a vote takes:
1 microsecond: 30 minutes to determine a winner @ 300M votes, 12 candidates
1 millisecond: 20 days, 20 hours to determine a winner @ 300M votes, 12 candidates
And this is a single-threaded, un-optimized, brute-force approach.
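The timing figures follow directly from the worst-case estimate; here's the arithmetic spelled out (my own check, counting one "operation" per vote touched):

```python
def worst_case_ops(votes, candidates):
    """Worst-case vote operations: v + (c - 2) * (v / 2)."""
    return votes + (candidates - 2) * (votes // 2)

ops = worst_case_ops(300_000_000, 12)  # 1.8 billion vote operations
print(ops * 1e-6 / 60, "minutes at 1 microsecond per vote")   # ~30 minutes
print(ops * 1e-3 / 86400, "days at 1 millisecond per vote")   # ~20.8 days
```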
modified 6-Jul-21 21:48pm.