YHBW.

Jan 26, 2010 23:22

Observation just now from Radu Sion during the FC rump session: in the cloud, it costs about $5 million to brute-force 64 bits of symmetric key.
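
(A quick back-of-the-envelope check on the scale, not from the talk: a 64-bit keyspace is 2^64 ≈ 1.8 × 10^19 keys, so $5M spread over the full space works out to about $5 × 10^6 / 2^64 ≈ 2.7 × 10^-13 per key, i.e. roughly 27 picocents per key tried; on average an attacker only has to search half the space, so the expected spend is nearer $2.5M.)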

security

Leave a comment

krfsm January 27 2010, 09:38:12 UTC
What time-scale are we looking at? Or is this a pure time-vs-power tradeoff, so $5M buys me enough computations, either in parallel or over time, to brute-force? Five minutes but very visible to the cloud providers, or five days, but much less visible?

Reply

vatine January 27 2010, 11:50:03 UTC
I think CPU pricing is in CPU-seconds, so using X CPUs for time 2T costs (approximately) the same as using 2X CPUs for time T (though there's probably a RAM charge that makes the totals 2TX + 2RT versus 2TX + RT).

Reply
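
A toy version of that billing model, with invented rates and the assumption (matching the formulas above) that the RAM charge scales with wall-clock time rather than with the number of CPUs:

    # Toy cloud billing: CPU charged per CPU-second, RAM charged per second of
    # wall-clock time the allocation is held. Rates are invented for illustration.

    CPU_RATE = 0.01    # dollars per CPU-second (made up)
    RAM_RATE = 0.002   # dollars per second of wall-clock time (made up)

    def job_cost(cpus, seconds):
        return CPU_RATE * cpus * seconds + RAM_RATE * seconds

    print(job_cost(cpus=100, seconds=7200))  # X CPUs for 2T: CPU term 2TX, RAM term 2RT
    print(job_cost(cpus=200, seconds=3600))  # 2X CPUs for T:  same CPU term, RAM term RT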

krfsm January 27 2010, 14:54:44 UTC
Why am I thinking of the Mailman from "True Names" here?

Reply

jrtom January 27 2010, 18:11:52 UTC
Presumably it depends on how parallelizable (or, since they apparently have a specific method in mind, parallelized) the computation is.

What I'd like to see is a graph of bucks-per-bits (i.e., how many dollars it takes to brute-force a 96-bit key, and so on).

Reply
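
A rough sketch of that bucks-per-bits curve, assuming the $5M-at-64-bits figure and the simple doubling-per-extra-bit rule discussed just below; this is pure extrapolation, not data from the talk:

    # Extrapolate brute-force cost from the $5M / 64-bit data point, assuming
    # the cost doubles with each extra key bit (extrapolation, not Sion's data).

    BASE_BITS = 64
    BASE_COST = 5e6  # dollars, the rump-session figure

    def brute_force_cost(bits):
        return BASE_COST * 2 ** (bits - BASE_BITS)

    for bits in (64, 72, 80, 96, 128):
        print(f"{bits:3d} bits: ${brute_force_cost(bits):,.0f}")

By this rule of thumb 80 bits comes out around $330B (in the same ballpark as the ~$384B figure recalled further down), and 128 bits is on the order of 10^26 dollars.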

docstrange January 27 2010, 20:43:09 UTC
Given a crypto algo without known weaknesses, it should double per extra bit, no?

Reply

docstrange January 27 2010, 20:43:51 UTC
(And given we're talking brute force, the weakness isn't relevant to the measurement...)

Reply

jrtom January 27 2010, 21:05:13 UTC
That's what I'd expect, yes. But what I expect is not always what is true, so it's good to have data that confirm (or deny) my understanding. Also, I'm not a cryptanalyst, so I don't know whether there are any nuances to the "double per extra bit" rule of thumb.

Reply

vatine January 28 2010, 13:58:09 UTC
I think the right answer is "roughly". It depends on how much the extra key space influences the actual encryption. A typical example would be 3DES, which has triple the number of key bits but only squares the amount of brute-force effort (as to exactly why that is, ask a cryptographer; I can sorta see it but not explain it).

Reply
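
(The standard explanation, for the curious: 3DES has 3 × 56 = 168 key bits, but a meet-in-the-middle attack lets you work the construction from both ends. Encrypt the plaintext under all 2^56 candidates for the first key and store the results, then decrypt the ciphertext under the 2^112 combinations of the other two keys and look for a match in the middle. The total work is on the order of 2^112 rather than 2^168, so tripling the DES key length only squares the 2^56 single-DES effort instead of cubing it.)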

maradydd January 29 2010, 06:52:25 UTC
I think he's done that, though it wasn't in the talk. 80 bits (IIRC, and this was three days and a continent ago) was something like $384B.

Reply

maradydd January 29 2010, 06:51:26 UTC
Yeah, he started by figuring out what a cycle costs in picocents in various environments. The cloud is far cheaper than desktops or even in-house server installations due to economies of scale -- it turns out that once you get above a certain size, power consumption outpaces the cost of support, which is traditionally the limiting factor.

Reply
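
A sketch of how a picocents-per-cycle figure rolls up into a dollar cost for the whole search. The per-cycle price and the cycles-per-trial below are placeholder guesses, not numbers from the talk; the point is just the unit conversion:

    # Roll a per-cycle price up to a whole-keyspace brute-force cost.
    # Both constants are illustrative guesses, NOT figures from Sion's talk.

    PICOCENTS_PER_CYCLE = 0.3    # hypothetical cloud price per CPU cycle
    CYCLES_PER_TRIAL    = 100    # hypothetical cycles to test one candidate key

    def search_cost_dollars(key_bits):
        trials = 2 ** key_bits
        picocents = trials * CYCLES_PER_TRIAL * PICOCENTS_PER_CYCLE
        return picocents * 1e-12 / 100   # picocents -> cents -> dollars

    print(f"${search_cost_dollars(64):,.0f}")  # ~$5.5M with these placeholder numbers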

maradydd January 29 2010, 06:49:38 UTC
$5M buys you enough computation and it parallelizes easily. He didn't discuss visibility, as it was a five-minute rump session talk, but I'm sure he'd be up for talking about that.

Reply

