Camille Kokozaki
Guest
I attended tonight an event planned, organized, and presented by the IEEE Communications Society of Santa Clara Valley (ComSocSCV) and co-sponsored by the IEEE Computer Society of Silicon Valley: a presentation by Ed Talbot and Tom Kroeger of Sandia National Laboratories (Livermore) entitled Demystifying Cybersecurity - Myths vs. Realities. It highlighted some commonly perceived criteria of goodness for security that are in fact myths. The most common myths (it seems there were many; the speakers had 14 at last count, and I am somewhat paraphrasing) were:
- The more layers of defense, the better.
- Burdensome security is better security (like strong passwords).
- You are secure because you run your own executables, on your own data, on a system you control.
Counterexamples were provided, with an interesting set of cases, graphics, and pictures illustrating the fallacies, pitfalls, and risks of the false sense of security (pun intended) that comes from believing those precepts. The speakers highlighted the reality that countermeasures are invariably defensive and asymmetric: defensive measures must guard against every possible exploit, whereas an attacker needs to find only one unprotected vulnerability. With implementation-specific vulnerabilities being effectively limitless, and with threats exploiting them faster than they can be detected, the presenters argued that we are reaching the point where the issue must be addressed comprehensively, with formal quantitative assertions, new creative approaches, and foolproof methods rather than reactive, trial-and-error rules of thumb.
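The asymmetry argument above can be made concrete with a toy probability model (my own illustration, not from the talk; the numbers and the independence assumption are purely hypothetical): if a system has n potential vulnerabilities and the defender closes each one with probability p, the attacker wins if even a single one remains open.

```python
def attacker_success_probability(n: int, p: float) -> float:
    """Probability that at least one of n vulnerabilities remains
    unpatched, assuming each is independently patched with probability p.
    This is 1 minus the probability that all n are patched."""
    return 1 - p ** n

# Even a 99%-effective defense fares poorly as the attack surface grows:
for n in (10, 100, 1000):
    print(f"n={n}: attacker succeeds with probability "
          f"{attacker_success_probability(n, 0.99):.3f}")
```

With p = 0.99, the attacker's odds climb from under 10% at n = 10 to near certainty at n = 1000, which is the speakers' point: the defender's burden grows with every vulnerability, while the attacker's does not.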
Without getting into more detail on the presentation (which will be made available here if and when it gets posted), the talk was interesting and sprinkled with actual cases, quotes, analogies, bons mots, and thoughtful observations. An animated discussion ensued, with skeptical comments along the lines of 'these were issues known in the seventies' and 'you will always have social engineering and insiders defeating any security measure.' One attendee pointed out that one consequence of WikiLeaks was making data useless (to hackers and users alike) by the mere fact of its being revealed. Good point.
I managed to throw in what I termed a 'radical' suggestion to address the issue. I posited that if we started declaring any computer device with a virus on it a defective device subject to return, the computer industry would quickly find ways to solve this problem in a major way (even if only 80% of problem cases were fixed, that would be a good thing). Instead of a cottage industry selling anti-virus software to consumers, those vendors could better serve the public by helping the computer industry build more immune hardware and operating systems. If that means paying an extra $50 for the hardware, I can live with it. Come to think of it, this may be why McAfee was bought by Intel. I also happen to resent having to pay continually to fix things that should work by definition. Would you pay yearly to keep every appliance you own (microwave, washer-dryer) immune to threats?
I also think we need to quit trying to protect everything and instead prioritize the mission-critical threats and problems, working very hard to protect those. Some problems are inherently entropic and disruptive by definition, and we cannot make them behave. This reminds me of Deming's advice to manage quality with a built-in steady-state run rate rather than trying to fit every peg into every slot on a production line, because you will never get there. But I digress, and I probably overdid the analogy thing.
What do you think? Is this a solvable problem? Are things out of control?
As always, all errors of transcription are mine. If anyone else attended, feel free to chime in, opine, correct, point out omissions, counterpoint, and all that.