[svlug] Curing Cancer requires windows
shaeffer at got.net
Mon Apr 9 07:47:01 PDT 2001
On Mon, Apr 09, 2001 at 07:28:36AM -0700, Erik Steffl wrote:
> Aaron Lehmann wrote:
> > But it isn't foolproof. If by freak chance you give two clients the
> > same data for redundancy and they both return bogus data, AND that
> > data was the prizewinning RSA key or whatever, you're screwed.
> > Redundancy helps but it's not a _solution_. It's more of a
> > symptom-reliever than a cure.
> you can say the same about e.g. cryptography - there is a chance of
> data being decrypted. the point is to design the system so that the
> probability of failure is low enough... few things in life are
> deterministic (I guess you could say none in real life...)
> the question in this case is whether you can achieve a reasonable
> probability of getting good results (whether positive or negative) without
> losing too many resources...
And in this case, you don't really need a large sample. If ten randomly
selected clients perform the same calculations, the details of their results
tell you quite a bit. Consider negative results: if all ten produce the
same negative result, you can be confident it really is negative. So the point
is to transmit descriptive parameters that characterize each result in
sufficient detail to ascertain whether two negatives are exactly the same.
The question becomes: do the negatives all report the same _negative_
solution? If not, you probably want to repeat the process with a second set of
randomly selected clients, and then have a hierarchically dependent algorithm
consider all the results. Even in the worst case, I can't imagine needing more
than 3 or 4 iterations to conclude the results with a high degree of
certainty. Indeed, the original sample of 10 could probably be reduced to
somewhere between 5 and 8.
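The scheme above can be sketched in a few lines of Python. This is only an illustration of the voting idea, not any actual distributed-computing client; the sample size, round limit, and the notion of a client as a callable returning a descriptive result string are all assumptions for the sketch:

```python
import random
from collections import Counter

def verify(work_unit, clients, sample_size=10, max_rounds=4):
    """Send the same work unit to a random sample of clients and accept a
    result only when the whole sample agrees; on disagreement, retry with
    a fresh random sample (the iterative re-selection described above)."""
    answer = None
    for _ in range(max_rounds):
        sample = random.sample(clients, sample_size)
        # Each client returns a descriptive result string -- the
        # "descriptive parameters" that let us compare two negatives.
        results = [client(work_unit) for client in sample]
        tally = Counter(results)
        answer, votes = tally.most_common(1)[0]
        if votes == sample_size:
            return answer  # unanimous agreement: accept with confidence
        # Disagreement: loop and resample a second (third, ...) set.
    # After max_rounds, fall back to the plurality answer.
    return answer
```

With mostly honest clients, a bogus result from one client simply forces another round with a new sample, so the odds of accepting bad data shrink with each iteration.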
Neuralscape; Santa Cruz, Ca. 95060
shaeffer at neuralscape.com http://www.neuralscape.com