Wednesday, June 23, 2010
On Buffer Overflow Attacks
Although the issue of buffer overflows existed as early as the 1980s, it gained attention as an immediate security threat only after the wrath of the infamous Morris Worm [1] took hold. As the first well-known computer worm to exploit a buffer overflow, the Morris Worm set in motion an effort among computer professionals to develop tools and techniques to mitigate and prevent further exploitation of such a common vulnerability.
Several defenses have since been developed that aim to prevent or stop exploitation of such vulnerabilities, and each was proven successful in its own right [1, 4]. But as time went by, loopholes were found in these preventive mechanisms as well, defeating the security they provide [4, 5].
Although newer programming technologies, i.e. type-safe languages with built-in bounds checking such as Java, have decreased if not completely eradicated this threat, the existence of legacy systems and their continued support keep the risk posed by such attacks high. In fairness to a language like C, I believe the problem is not really in the language per se; rather, it lies with the individuals who write programs in it. Without proper awareness, and under the pressure of competing concerns during software development, it becomes harder for developers to examine every aspect of a program. Unfortunately, security is one aspect that is simply forgotten during development. Although some studies are being conducted in this particular area, for example [3], the field still lacks the major breakthroughs that would enable the adoption of a standard security framework for building secure software applications.
Because it is in the nature of computer hackers to unravel vulnerabilities in systems widely used by the computing community and exploit them “for fun and profit” [2], security professionals are in a constant struggle to keep pace with them and move ahead.
References:
[1] Crispin Cowan, Perry Wagle, Calton Pu, Steve Beattie, and Jonathan Walpole. Buffer Overflows: Attacks and Defenses for the Vulnerability of the Decade.
[2] “Aleph One”. Smashing The Stack For Fun And Profit. Phrack, 7(49), November 1996
[3] Gary McGraw, Brian Chess, Sammy Migues. Software [In]security: The Building Security In Maturity Model (BSIMM). InformIT, at [http://www.informit.com/articles/article.aspx?p=1332285]. March 16, 2009
[4] Alexander Sotirov, Mark Dowd. Bypassing Browser Memory Protection: Setting Back Browser Security by 10 Years.
[5] Bulba and Kil3r. Bypassing StackGuard and StackShield. Phrack, at [http://www.phrack.org/issues.html?issue=56&id=5], May 2000.
Saturday, June 12, 2010
On the CRS Report for Congress “Botnets, Cybercrime, and Cyberterrorism: Vulnerabilities and Policy Issues for Congress” [6]
The report articulated several scenarios in which institutions and groups with interests against the US government could make use of existing technologies to cripple the country’s economy. It explored the possibility of a coordinated attack against US government-owned IT infrastructure. Although the possibility exists, the agencies concerned downplay the extent of the real damage such attacks could cause. They argue that recovery could be handled much as they have handled natural calamities, e.g. flooding, earthquakes, or random machine breakdowns, in the past. They also argue that the cost of mounting such attacks outweighs the benefits, which would deter anyone from even attempting them.
Another concern discussed in the report is the commercialization of the tools and technical skills needed to commit cybercrimes. The ease with which one can profit from stealing financial information, trade secrets, etc. and selling them on underground markets lures more “brilliant” individuals into this kind of activity. The motivation behind these attacks is no longer purely financial in nature; some are initiated by groups pushing political and social reforms [6].
The report made mention of botnets throughout. Russian-based Kaspersky Lab has reported that the major threat plaguing the Internet today is the threat of botnets [1]. Botnets (bot networks) are networks of compromised machines controlled by an attacker called the “bot master” [2]. Botnets are largely responsible for the spread of malware across the Internet, which leads to theft of personal information and other sensitive data from government institutions and from companies that store confidential customer information. Furthermore, botnets have been used in DDoS attacks and have proven very effective [4]. In most cases, the infected machines are home personal computers that are unprotected or whose owners are not aware of these security threats.
One of the major problems security researchers face in dealing with botnets and similar threats is the high level of technical proficiency of the individuals behind them. The technical complexity of the tools and techniques these attackers use, i.e. code encryption and obfuscation, keeps growing, so researchers struggle to get close to them. In many cases, such individuals monitor the activities of the security researchers pursuing them, which enables them to develop even better ways to evade detection [5].
Another factor is the severity of the infection botnets have already caused across the Internet. The large number of infected computers and established C&C servers makes the complete takedown of these networks much more difficult [3]. The use of peer-to-peer network architecture instead of the traditional C&C structure in botnets surfacing nowadays makes it even harder for security professionals to alleviate the severity of the threats they pose.
In most cases, the proliferation of malicious programs or Trojan horses (which turn a computer into a zombie) can be attributed largely to unsuspecting Internet users who are unaware of the different security risks lurking on the World Wide Web. I believe a sufficient and massive information campaign about these security risks, aimed at the majority of Internet users, should be considered. It is safe to assume that most Internet users are not really aware of these prevailing security issues, which makes them even more vulnerable. We should increase everyone’s awareness of these security trends; preventive security should be initiated at the end-user/client level. I don’t mean to cause paranoia among individuals who do not want to be bothered with these things. But since unsuspecting Internet users play a major role in the spread of these botnets, we have no choice but to force the issue on them. I am not saying this will put a stop to these cyber-attacks, but at the very least, such an initiative should decrease the number of infected systems and possibly prevent further infections in the future. And in times like these, every bit of help counts.
References:
1. [http://searchsecurity.techtarget.com/sDefinition/0,,sid14_gci1030284,00.html]
2. Grizzard, Julian B., et al. “Peer-to-Peer Botnets: Overview and Case Study”. USENIX.org, at [http://www.usenix.org/event/hotbots07/tech/full_papers/grizzard/grizzard_html/]
3. Fisher, Dennis. “Botnets using ubiquity as security”. ThreatPost.com, at [http://threatpost.com/en_us/blogs/botnets-using-ubiquity-security-060710]
4. “Robot Wars – How Botnets Work”. WindowSecurity.com, at [http://www.windowsecurity.com/articles/Robot-Wars-How-Botnets-Work.html]
5. VitalyK. “Gumblar: Farewell Japan”. Securelist.com, at [http://www.securelist.com/en/blog/2132/Gumblar_Farewell_Japan].
6. Wilson, Clay. “Botnets, Cybercrime, and Cyberterrorism: Vulnerabilities and Policy Issues for Congress”. CRS Report for Congress, January 29, 2008.
Tuesday, June 8, 2010
On Ken Thompson's Reflection on Trusting Trust
Early on, one can hear the humility in his voice. He expressed the importance of teamwork by recognizing the other individuals who collaborated and worked with him. When individuals in a team perform synergistically, complementing each other’s weaknesses and taking advantage of each other’s strengths, it leads to a situation where “the whole is greater than the sum of its parts”1.
Following his lecture closely, one can see that his delivery was invigorating in the way he showed the gradual build-up of his capability as a programmer. He seems to be saying, implicitly, that anyone with interest and determination can excel in the field as long as they are persistent. For the progress and advancement of the literature in our field, this thought is very welcome. The real issue arises when the intentions of individuals possessing such valuable knowledge are questioned: are they the bad guys or the good guys?
On the subject of security, specifically in programming and software development, one can see the dilemma we are facing. In a field where “don’t reinvent the wheel” is the common mantra, we face trust issues each time we decide to use a third-party application or API in an application we develop. When we reuse a piece of code, it is easy enough to check whether it contains malicious instructions. But when the integrity of the low-level parts of our programming environment, i.e. the compiler, assembler, etc., is questioned, the problem arises, especially if they are not open source.2
Ken Thompson’s choice of topic was very timely, delivered at a time when practitioners in our field faced moral and ethical issues because of the boom of self-styled hackers undermining the integrity of the majority. He vividly expressed his position on the importance of honesty and trust in our kind of profession. His lecture was, more than anything, an open challenge to us practitioners to be morally and ethically ready in our work.
1 Attributed to Aristotle.