
Steven Checkoway, response to NSA project

Nov 26th, 2013
I apologize for the length of this email, but I think this is important.

I find parts of this to be unethical and completely unacceptable.

Part of doing responsible computer security research involves considerations of disclosure. A researcher who has discovered a vulnerability has several choices. I'd like to lay them out briefly, make a case for several of them, and then discuss the proposals.

1. Full disclosure. Full disclosure happens when you publicly reveal all details of a vulnerability.
2. Limited, public disclosure. Here, you reveal that there is a problem and give enough detail that others can be convinced that you have actually found a problem. Typically, there is an initial step where the vendor is notified with full details and given time to fix the problem.
3. No public disclosure; selling the vulnerability on the grey market to security vendors.
4. Selling the vulnerability to a government.
5. Selling the vulnerability on the black market or exploiting it yourself.

There are strong proponents of full disclosure. Typically, the argument goes that the only way to get the vendor to fix the vulnerability is to publicly shame them into doing so. Well-known security expert Bruce Schneier has stated that he thinks full disclosure is a good idea. This can follow disclosing to the vendor and seeing no action on their part.

Limited, public disclosure (not a standard name) is my preferred method of disclosure. Unless a company has a demonstrated track record of attacking security researchers with legal threats, I try to give the company as much advance notice as possible. After the public disclosure, skilled individuals will likely, with significant effort, be able to duplicate my work. I claim that the good done by disclosure outweighs the harm, since I have taken steps to minimize the harm.

Selling the vulnerability on the grey market is tricky, ethically. Some security researchers, such as Charlie Miller, believe that this is acceptable since, generally, these vendors are not going to be releasing the attacks in the wild.

Selling (or giving) a vulnerability to a government without public disclosure is never acceptable. If you think this position is too harsh, consider selling to a government with which your own government is in conflict.

Selling the vulnerability on the black market or exploiting it yourself is likewise never acceptable.

So what about these projects? Let me address them in turn:

1. Reverse engineering a technically noteworthy exploit is a great project! At least, it's a great project as long as the exploit is already known or you disclose it. Developing your own payload is almost certainly ethically unacceptable. In the security community there are two standard proofs of concept: a root shell on a *nix machine ("shell code") and opening the calculator program on Windows ("calc code"). By themselves, these demonstrate that a whole host of other malicious behavior is possible, without the danger that an exploit will pose an unacceptable level of harm if it escapes the confines of your research machine(s). Anything beyond that is almost certainly unacceptable.

2. Without knowing more, this seems fine. It's essentially asking the question: Can you find malware? This seems pretty similar to anti-virus. Of course, the flip side is that if it cannot be done, then one can hide malware safe in the knowledge that it won't be detected. I don't know how one would prove that negative, though.

3. Reverse engineering a significant piece of malware is a great project! Again, it's a great project assuming you follow one of the standard disclosure models. Repurposing the malware to perform additional functions is absolutely unacceptable. This is, in effect, creating new malware that has not only your new functionality but also all of the original functionality.

Anyone undertaking a project that involves revealing new, sensitive information to the NSA without going public at the same time is acting unethically.