In cryptography, rubber-hose cryptanalysis is a euphemism for the extraction of cryptographic secrets (e.g. the password to an encrypted file) from a person by coercion or torture[1]—such as beating that person with a rubber hose, hence the name—in contrast to a mathematical or technical cryptanalytic attack.

Details

According to Amnesty International and the UN, many countries in the world routinely torture people.[2][3][4][5] It is therefore logical to assume that at least some of those countries use (or would be willing to use) some form of rubber-hose cryptanalysis. In practice, psychological coercion can prove as effective as physical torture. Methods that are not physically violent but are highly intimidating include tactics such as the threat of harsh legal penalties. The incentive to cooperate may be some form of plea bargain, such as an offer to drop or reduce criminal charges against a suspect in return for full co-operation with investigators. Alternatively, in some countries threats may be made to prosecute as co-conspirators (or inflict violence upon) close relatives (e.g. spouse, children, or parents) of the person being questioned unless they co-operate.[3][6]

In some contexts, rubber-hose cryptanalysis may not be a viable attack because of a need to decrypt data covertly; information such as a password may lose its value if it is known to have been compromised. It has been argued that one of the purposes of strong cryptography is to force adversaries to resort to less covert attacks.[7]

The earliest known use of the term was on the sci.crypt newsgroup, in a message posted 16 October 1990 by Marcus J. Ranum, alluding to corporal punishment:

...the rubber-hose technique of cryptanalysis. (in which a rubber hose is applied forcefully and frequently to the soles of the feet until the key to the cryptosystem is discovered, a process that can take a surprisingly short time and is quite computationally inexpensive).[8]

Although the term is used tongue-in-cheek, its implications are serious: in modern cryptosystems, the weakest link is often the human user.[9] A direct attack on a cipher algorithm, or the cryptographic protocols used, is likely to be much more expensive and difficult than targeting the people who use or manage the system. Thus, many cryptosystems and security systems are designed with special emphasis on keeping human vulnerability to a minimum. For example, in public-key cryptography, the defender may hold the key to encrypt the message, but not the decryption key needed to decipher it. The problem here is that the defender may be unable to convince the attacker to stop coercion.

In plausibly deniable encryption, a second key is created which unlocks a second convincing but relatively harmless message (for example, apparently personal writings expressing "deviant" thoughts or desires of some type that are lawful but taboo), so the defender can prove to have handed over the keys whilst the attacker remains unaware of the primary hidden message. In this case, the designer's expectation is that the attacker will not realize this, and will forgo threats or actual torture. The risk, however, is that the attacker may be aware of deniable encryption and will assume the defender knows more than one key: the attacker may then refuse to stop coercing the defender even after one or more keys are revealed, on the assumption that the defender is still withholding additional keys holding additional information.
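The two-key idea behind deniable encryption can be illustrated with a toy one-time-pad construction: because a one-time pad XORs the message with the key, one ciphertext can be made to "decrypt" to any same-length decoy simply by deriving a second key from the ciphertext and the decoy. This is only a minimal sketch of the principle; real deniable-encryption systems (such as hidden-volume tools) use quite different mechanisms, and the messages and key names below are illustrative.

```python
import secrets

def xor_pad(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each byte of data with the key (same length)."""
    return bytes(d ^ k for d, k in zip(data, key))

# The real message and a harmless decoy of the same length (illustrative).
real  = b"launch code 0042"
decoy = b"my secret diary."
assert len(real) == len(decoy)

# Encrypt the real message with a random key...
key_real = secrets.token_bytes(len(real))
ciphertext = xor_pad(real, key_real)

# ...then derive a second key so the SAME ciphertext yields the decoy:
# key_decoy = ciphertext XOR decoy.
key_decoy = xor_pad(ciphertext, decoy)

# Handing over key_decoy reveals only the decoy message.
assert xor_pad(ciphertext, key_real) == real
assert xor_pad(ciphertext, key_decoy) == decoy
```

The sketch also shows the weakness discussed above: an attacker who knows this trick exists has no way to verify that a surrendered key is the only one, so revealing `key_decoy` may not end the coercion.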

In law

In some jurisdictions, statutes assume the opposite—that human operators know (or have access to) such things as session keys, an assumption which parallels that made by rubber-hose practitioners. An example is the United Kingdom's Regulation of Investigatory Powers Act,[10][11] which makes it a crime not to surrender encryption keys on demand from a government official authorized by the act.

According to the Home Office, the burden of proof that an accused person is in possession of a key rests on the prosecution; moreover, the act contains a defense for operators who have lost or forgotten a key, and they are not liable if they are judged to have done what they can to recover a key.[10][11]

Possible case

In the lead-up to the 2017 Kenyan general election, the head of information, communication, and technology at the Independent Electoral and Boundaries Commission, Christopher Msando, was murdered. He had played a major role in developing the new voting system for the election. His body showed apparent marks of torture, and there were concerns that the murderers had tried to get password information out of him.[12]

In popular culture

  • A well-known xkcd comic (xkcd 538: Security) illustrates the issue. In the first panel, a crypto nerd imagines that his advanced encryption (4096-bit RSA) will ultimately defeat the crackers, despite their access to million-dollar hardware. In the second panel, the people who want the information instead use a five-dollar wrench and torture to coerce the nerd into giving them the password.[13]

See also

  • Black-bag cryptanalysis – Acquisition of cryptographic secrets via burglary, or other covert means
  • Deniable encryption – Encryption techniques where an adversary cannot prove that the plaintext data exists
  • Key disclosure law – Legislation that requires individuals to surrender cryptographic keys to law enforcement
  • Social engineering – Psychological manipulation of people into performing actions or divulging confidential information
  • Rubberhose – Deniable encrypted filesystem created by Julian Assange

References

  1. Schneier, Bruce (October 27, 2008). "Rubber-Hose Cryptanalysis". Schneier on Security. Retrieved August 29, 2009.
  2. Pincock, Stephen (November 1, 2003). "Exposing the horror of torture". The Lancet. 362 (9394): 1462–1463. doi:10.1016/S0140-6736(03)14730-7. PMID 14603923. S2CID 54239764. Retrieved August 29, 2009.
  3. "Many countries still appear willing to use torture, warns UN human rights official" (Press release). UN News Service. October 27, 2004. Retrieved August 28, 2009.
  4. Modvig, J.; Pagaduan-Lopez, J.; Rodenburg, J.; Salud, CMD; Cabigon, RV; Panelo, CIA (November 18, 2000). "Torture and trauma in post-conflict East Timor". The Lancet. 356 (9243): 1763. doi:10.1016/S0140-6736(00)03218-9. PMID 11095275. S2CID 43717344. Archived from the original on June 8, 2011. Retrieved August 29, 2009.
  5. Iacopino, Vincent (November 30, 1996). "Turkish physicians coerced to conceal systematic torture". The Lancet. 348 (9040): 1500. doi:10.1016/S0140-6736(05)65892-8. PMID 11654536. S2CID 20335366. Retrieved August 29, 2009.
  6. Hoffman, Russell D. (February 2, 1996). "Interview with author of PGP (Pretty Good Privacy)". High Tech Today. Retrieved August 29, 2009.
  7. Percival, Colin (May 13, 2010). "Everything you need to know about cryptography in 1 hour (conference slides)" (PDF). Retrieved December 29, 2011.
  8. Ranum, Marcus J. (October 16, 1990). "Re: Cryptography and the Law..." Newsgroup: sci.crypt. Usenet: 1990Oct16.050000.4965@decuac.dec.com. Retrieved October 11, 2013.
  9. "The Weakest Link: The Human Factor Lessons Learned from the German WWII Enigma Cryptosystem". SANS. Retrieved 6 June 2013.
  10. "The RIP Act". The Guardian. London. October 25, 2001.
  11. "Regulation of Investigatory Powers Bill; in Session 1999-2000, Internet Publications, Other Bills before Parliament". House of Lords. 9 May 2000. Archived from the original on 8 November 2011. Retrieved 5 Jan 2011.
  12. Pilling, David (August 11, 2017). "Ghost of Chris Msando haunts Kenyan election". Financial Times. Archived from the original on 2022-12-10.
  13. Schneier, Bruce. "xkcd on Cryptanalysis". Schneier on Security.
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.