
Aug 21, 2007

Open Source and ClamAV

I've been a user of ClamAV (and its Windows cousin, ClamWin) for years. Needless to say, I was very pleased to hear about the AntiVirus Fight Club Results. This was, according to the site, an "all-out public test of different anti-virus vendors to see how they really compare." The field was impressive, though there were some players I would like to have seen included (specifically F-Secure, NOD32, AVG, Trend Micro, and Panda). Having done a fair amount of research on AV solutions a little over a year ago, I wasn't surprised to see Kaspersky at the top of the heap. I was, however, pleasantly surprised to see ClamAV right up there alongside Norton; in some cases, ClamAV was substantially better than the other choices. Having only personal experience to go on, I always thought ClamAV was one of the best, and as an advocate for all things Open Source, I actively hoped it was, but I never had the time (or, to be honest, the inclination) to do extensive side-by-side testing, so I never had solid proof. Until now. Kudos to the ClamAV folks. Nicely done.

On a related note, ClamAV was recently acquired by Sourcefire, the folks who brought us Snort. As you may recall, Sourcefire went public this last March, which was, I think, a good thing. I've used Snort for so long I don't even remember when I first started tinkering with it. Now with the acquisition of ClamAV, the idea of further integration between Snort and ClamAV is certainly appealing. I do have one concern with regard to Sourcefire and Snort, though. Prior to the release of GPL 3.0, the Snort license stated that it was covered by GPL 2.0 or later. Once GPL 3.0 was released, however, the license was quietly changed to state explicitly that Snort was covered by only GPL 2.0. What does this mean? Frankly, I'm not completely sure. I've read a lot of posts from Marty Roesch (Mr. Snort himself) and lots of others. Some claim that the change means nothing. Others are claiming that this is the death knell. Personally, I'm not sure what to think. I haven't stopped using Snort. I still love Snort and don't have any plans to give it up. Not yet, anyway. I have, however, started brushing up on Bro IDS, just in case I need to jump ship.


Jun 22, 2007

Use of Language

I just finished reading a very entertaining post on Ars Technica on "The ten most hated words on the Internet." Though I'm in the field of information security, my undergraduate and graduate years were spent in the field of English Literature1, so I always appreciate posts like the one from Ars Technica. After reading the post, I got to thinking about the various uses (and misuses) of the English language that drive me nuts, so I thought I'd post them here just for fun. I'd enjoy hearing about the words or phrases that drive you nuts. My feeling is that while it isn't necessary for a person to be a superlative writer/speaker, one should at least have a firm grasp of the fundamentals of one's own language. Is that too much to ask? So without further ado...

Words and Phrases That Should Be Banned2

  • ping - As in "I need to ping Bob about that meeting tomorrow." Grrrrrr.
  • irregardless - This one makes my skin crawl. The word is "regardless."
  • the misuse of "me" and "I" - Sadly, this one is so common, most people probably aren't even aware of the fact that it is often used incorrectly. Here's an example of misuse I just heard earlier today: "If you have any questions, give John or I a call." **shudder** Here's the trick I learned from my 6th grade teacher. If you aren't sure whether to use "me" or "I," drop the other part (in the above example we would drop "John or") and the answer becomes obvious. You wouldn't say "If you have any questions, give I a call," so the correct word in this context is "me."
  • iAnything - Personally, I thought this got old after iMac.
  • moot vs. mute - As in "that's a mute point." Fortunately, I don't hear this one as often as I used to, but I still hear it with some regularity. The word is "moot." A moot point is a point that needn't be decided as the result of a change in circumstances.
  • incentivize - There are lots of words like this, where people tack on an "ize" ending and try to make a verb out of a noun. Don't do it. As soon as I hear someone use one of these made up *ize words, my first thought is "Oh, you're one of those."
  • "blog" as a verb - I don't really care for this word at all, but I can deal with it as a noun, as in "Have you read my blog?" What I can't abide, though, is its use as a verb. "I'll have to blog about this," or "I blogged about that yesterday."

1Specifically, Medieval English Literature, with secondary foci on Shakespeare and Classical Greek Drama. Not the most useful of skills by today's standards, but if you ever need to conjugate a verb or decline a noun in Middle English, I'm your guy.

2If not outright banned, at a minimum there should be penalty of a heavy fine and 20 hours of community service for each infraction.

Jun 20, 2007

The Anti-Mentor

I just finished reading an interesting article that brushed up against a theory that I've had for a while. In the article, the author refers to the "Anti-Mentor," a manager or boss who provides ample learning opportunities by way of what not to do. Specifically, the author points to the "polished veneer" of his Anti-Mentor. In part, this comes down to integrity, which I discussed in a previous post. Beyond that, though, we move into the area of my theory: that such pathologically disingenuous behavior is a form of psychosis. When that thought first occurred to me, it was very much tongue-in-cheek. After years of working in numerous environments, however, the facetiousness of that statement has steadily decreased. Consider the definition of psychosis from the American Heritage Stedman's Medical Dictionary: "A severe mental disorder, with or without organic damage, characterized by derangement of personality and loss of contact with reality and causing deterioration of normal social functioning." Speaking for myself, I can't count the number of times I've had one of these Anti-Mentors change personalities right in front of my eyes, or (my personal favorite) be helpful and supportive to me and then turn right around and try to sell me out in an attempt to conceal their own incompetence. It reminds me of a good ol' Southern phrase I heard a long time ago: "What do you expect from a pig but a grunt?" I realize that a.) I am not a medical professional and am in no way qualified to make a diagnosis such as psychosis; and b.) I am stretching the definition of psychosis to (and probably past) the breaking point. Even so, it does help to cast the situation in a different light. These Anti-Mentors are infuriating, to say the very least. However, it is probably worth viewing them with understanding and a touch of pity.
When confronted with an Anti-Mentor, know them for what they are and expect that they will fundamentally always be true to their Anti-Mentor nature. Knowing what they are and what to expect from them makes dealing with them a little less painful.

Jun 15, 2007

Collaborative Incident Response

This is an idea I've had in my head for a while now, and in light of the recent DDoS attack against Estonia, I got to thinking about it again: the need for collaborative incident response and investigation. The attack against Estonia (which was significant enough to attract the attention of NATO) was effective and performed by individuals who were at least somewhat more sophisticated than your average script kiddie. Before going any further, let me provide a quick background on the idea of collaborative incident response.

Several years ago, I was in charge of designing, training, and implementing a Computer Security Incident Response Team (hereafter referred to as CSIRT) at the local hospital where I was employed at the time. The team was well organized and broken into complementary (and slightly overlapping) areas of expertise. Once everyone on the team was trained and familiar not only with their own role but with the roles of the rest of the team members, we began performing fire drills in earnest. Scenarios were devised that the CSIRT would address, first as simple roundtable exercises, and then finally as real-time, live drills. The idea of the drills was not only to hone the skills of the team, but to identify areas of weakness that we would then attempt to address before the next drill (or actual incident). All told, the drills were effective, useful, and, to be perfectly honest, fun. It was at this point that I was contacted by our company's disaster preparedness person, who told me that there was actually going to be a city-wide disaster drill, and wanted to know if the CSIRT wanted to be included. Naturally, I said yes. I was given the basic scenario for the city-wide drill (a plane crash at the local airport), and then I devised the CSIRT drill around that. Thinking on a city-wide scale really got the ol' wheels turning. What would we do in the event of a security event massive enough to exceed the resources and abilities of the CSIRT? If it were a city-wide (or larger) event, other organizations would potentially be in the same boat, and perhaps we would be able to assist each other.

The ideal situation would be to have local businesses and other organizations come together in a community CSIRT that could be called upon in the event of a significant security incident. Obviously I'm not talking about giving people from other organizations the keys to the kingdom. Far from it. What I am suggesting is having the community CSIRT function primarily in a research and logistical support capacity. In addition, in most cases it would be possible to do this without divulging too much about your inner workings. Let's consider an example. For the sake of our discussion, let's say that our organization, Company Q, is hit with a massive security incident. Key servers are unreliable, portions of our network infrastructure are up and down, and workstations all over the enterprise are crashing in cascading fashion. An event of this size would push the security team of any organization to (and quite probably past) the breaking point. Here's where the community CSIRT could come into play. Folks from our organization would be able to sit down with the community CSIRT and begin to dissect the problem. We're in the heat of the battle, so having the assistance of some people who aren't directly affected could be very useful. The initial Crisis Action Meeting would consist not only of our people but of key people from the community CSIRT. This would be particularly helpful in identifying the problem, as different people from different organizations will bring with them their own experience which, by definition, will be unique. They'll be able to look at the problem in ways that we might not be able to. And if this is a large enough problem, what about fatigue? On the CSIRT that I put together, we implemented the rule that during an incident, a CSIRT member could only put in 10 to 12 consecutive hours before being required to stand down and get some rest. Don't ever underestimate the impact of fatigue during a prolonged engagement.
Having some extra people who could be doing things like parsing logs and the like will allow our people to focus on mitigation and recovery, and, as needed, get some rest.

To be certain, there are a number of key points that would need to be worked out in advance. The idea is that it would be a mutually beneficial relationship for all involved. Company Q has an incident so they engage the community CSIRT which consists of Company R and Company S. A month from now, maybe Company R is the one having the problem, and we (Company Q) and Company S lend a hand. It works sort of like the way villages used to fight fires back in the old days; everyone came out to help, because the next house to catch on fire could be yours. Even if your competitors are part of the community CSIRT, they can still be valuable resources.

Obviously, the members of the CSIRT would have to hold themselves to an extraordinarily high ethical standard. Members would have to be chosen carefully. A good place to start may be the local InfraGard chapter. I am the Vice President of my local chapter, InfraGard Springfield. An advantage to starting with InfraGard is that each InfraGard member is vetted by the FBI. Not to say that only InfraGard members could be on the CSIRT, but it is as good a place to start as any. With some effort, a community CSIRT could become the de facto hub for local IT security matters.

Jun 7, 2007

Remote log injection

I love a good, clever hack. In the past, I've espoused the virtues of OSSEC, and I find more interesting and creative ways to use it on an almost daily basis. Recently, OSSEC author Daniel Cid posted a great paper on remote log injection entitled "Attacking Log Analysis Tools." I just finished reading it and found it very interesting and a little disturbing. I've tinkered with one of the vulnerable tools he mentions, DenyHosts, and thought it was actually a fairly handy tool. After reading Daniel's paper, though, I'll have no choice but to make sure that it isn't running on any of my systems until a patch is released.
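
Just to illustrate the general class of attack (to be clear, this is my own hypothetical sketch of the idea, not an example taken from Daniel's paper, and the exact patterns real tools use differ): if an attacker controls any field that ends up in a log, such as an SSH user name, a naive log parser can sometimes be tricked into blaming the wrong host.

```shell
# Hypothetical log injection sketch. A legitimate failed-login line
# looks roughly like:
#   sshd[1234]: Failed password for invalid user bob from 203.0.113.9 port 4022 ssh2
# Suppose the attacker supplies the user name "admin from 127.0.0.1",
# so the logged line now contains two "from <ip>" clauses:
logline='sshd[1234]: Failed password for invalid user admin from 127.0.0.1 from 203.0.113.9 port 4022 ssh2'

# A naive parser that latches onto the first "from <ip>" blames the
# injected address instead of the attacker's real one:
banned=$(echo "$logline" | sed -n 's/.*invalid user .* from \([0-9.]*\) from .*/\1/p')
echo "naive parser would ban: $banned"    # 127.0.0.1
```

An attacker could use this sort of trick to get a log-watching tool to block arbitrary addresses (a victim, your DNS server, even localhost) rather than the attacker's own.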

Nice paper, Daniel.

Jun 1, 2007

Just in case you weren't paranoid enough...

As I've mentioned previously, I'm a big fan of Richard Bejtlich's TaoSecurity blog. Yesterday, he made a post entitled "I Have Seen the Future, and It Is Monitored." The post is interesting and very, very disturbing. From the perspective of the Infosec professional, it is enlightening and provides ample material for further research. From the perspective of the gearhead, though, it scares me silly. Given the widespread use of National Security Letters, one doesn't have to be a conspiracy theorist or paranoiac with an Orwellian, dystopian view of the future to see where this could lead. The points both for and against the level of access described in Bejtlich's post are numerous and compelling. I certainly can see the need, but I'm also not comfortable with the potential ramifications. One would hope that such access wouldn't be misused, but human nature being what it is, I've got a dollar that says it would start being misused the instant it became available.

May 25, 2007

Confusing Strategy with Tactics

This seems to happen all the time. I, personally, encounter it with disturbing frequency. One of the most common mistakes I see made is the mixing of strategy and tactics by managers. I can feel some of you pulling away from me. Before you do that, though, let me explain. It is the job of management to set enterprise strategy, or "corporate vision," if you prefer cheesy management buzz phrases. With that, I couldn't agree more. Management has the encompassing goals and they are the ones who must solidify these goals and transmit them to the rest of the enterprise. In short, they define the "what" portion of the overall equation, in the sense of "Here is what we are going to do." Where I often see things go awry, though, is when management then goes into micromanagement mode and proceeds to tell the engineers how to meet these goals.

It is important to understand that I'm not saying that management should be completely hands off. It is their job to provide guidance and to set the rules of engagement, so to speak. Once those rules are set, though, they should step back and let the engineers work their mojo. Having been given the strategy ("what"), which includes the rules of engagement, it is the job engineers to determine the tactics ("how") appropriate for accomplishing that strategy. Let me provide an example to illustrate my point.

I once worked for Company X in the capacity of information security analyst. Though the company was fairly large, there were only two of us who were security analysts, so we were stretched pretty thin. We had a monthly enterprise vulnerability assessment that required sifting through a mountain of data. We were using a commercial vulnerability scanner that could only export the data as PDF or HTML files; since PDF was out of the question for doing any real work with the data, we had to dump it as HTML. Fine. An inelegant solution, but still workable. Once exported to HTML files, though, each file had to be opened individually and certain rows of data had to be exported to Excel. From there, we performed some calculations and data normalization (much of which had to be done by hand), and then that data was copied into a Word document, which was, in turn, converted into a PDF file for the final report. When I first encountered this absurd series of events, my first thought was "wow...too much effort. I'll put together a Linux box with Nessus and then I'll write some Perl scripts to parse the Nessus data and create the reports." So with the approval of my manager, I did this and successfully reduced the amount of time required to collect the data and generate the final report from several weeks to a little more than a day. I showed the end result to my manager and he was pleased, so we were poised to roll out the new solution. As luck would have it, just prior to the rollout, the CIO decided to put forth a new corporate vision: we were to be a 100% Microsoft shop. I assumed that he meant we were to use Microsoft products except in those cases where there was no Microsoft product, as in the case of the vulnerability scan and the subsequent report. As it turned out, I was wrong. So we had to scrap the whole project just before it was ready to go into production.
The situation was explained to the CIO by my manager, at which time the CIO actually became somewhat combative and slammed his fist on his desk saying "I said that we are going to be Microsoft only, and I meant it!"

So there I was, back to having to jump through a comical number of hoops to generate what ended up being a report of less than 10 pages. "Well," I thought, "instead of having to extract this data by hand from these HTML files, I'll put together a perl script to do it for me. That'll speed things up a little bit." It was at this time that I got a call from another of the managers.

"Uh, yeah, you can't do that."

I blinked, stunned. "Can't do what, exactly?"

"Those scripts. Yeah, you can't write those in perl."

"Why not?"

"Perl isn't an approved language here at Company X."

"But they're just for my own use on my own machine. I'm just going to use them to parse some data that I'd otherwise have to parse by hand."

"Yeah, I know, but you still can't write them in perl. If you want to write them in Javascript, which is approved, that'd be ok. But you just can't write them in perl."

So there you have it. What I desperately wanted to say (and rightly did not) was "I have a job to do. If you want me to do it, then get out of my way and let me do it." This is what happens when management confuses strategy with tactics.

Managers: set the strategy, define the goals, set the rules of engagement, and convey that information to the masses. Then kindly step back a little bit and let the engineers do their jobs. Don't micromanage. It is irritating and insulting to the engineers and doesn't speak too well of your management skill.

Engineers: listen carefully and respectfully when you are given strategy, goals, and rules of engagement. Then do everything within your power to achieve those goals as quickly, efficiently, and effectively as possible. Play by the rules and give regular progress reports. And don't be patronizing. It gives us all a bad name. Besides, it never pays to antagonize management.

Mar 9, 2007

Clobbering Spam

Chalk one up for the good guys. Yesterday, the SEC announced that it has suspended trading in 35 companies whose shares have been hyped in stock spam. Spam is an enormous problem for everyone. I have an email address that I haven't even told anyone about and haven't actually used for anything yet, and it is already receiving spam. Hopefully, the actions of the SEC portend the fall of the leviathan that is spam. We can all do our own part, too. If you haven't already, I strongly suggest you join KnujOn. Join SpamCop. Support Spamhaus. If we all take a tiny, incremental chunk out of spammers, it will be to everyone's benefit.


Mar 7, 2007

Integrity and the lack thereof

Recently, I ran into a situation that highlights the absolute necessity for integrity among information security professionals. Unfortunately, in this case, I got to see what could happen when someone else demonstrates a significant lack of integrity.

In many regards, security professionals are not unlike attorneys or psychiatrists in the sense that during the course of their duties, they may become privy to certain information that, under no circumstances, can be shared. Obviously there are certain ethical obligations that come into play here. If you become aware of illegal activity or something along those lines, you are duty-bound to report it. However, when the information is clearly sensitive and there is no reason to divulge it (other than to attempt to display to others how much you are "in the know"), to reveal it is egregiously unethical. Here's the story that brought this to light. I'll try to keep it brief. All names have been removed from the information below.

I currently work for Company A. Several months ago, Company B, a consulting firm, approached me and asked if I would be interested in looking at a few positions they had open. Let me emphasize that they came to me. I was content with my work at Company A, but in my experience, it always pays to keep your options open. So I agreed to hear about these positions. Here's where an unfortunate series of coincidences comes into play. A person currently working for Company B (whom I have never met, by the way) used to hold my position at Company A. Let's call him Bob. Further, when Bob held my position at Company A, he worked for the same manager that I currently work for. Let's call the manager Tom. So Bob is a security person. His focus in the security field is substantially different from mine, but he's a security person nonetheless. For reasons I don't entirely understand, Company B asked Bob to take a look at my resume. At this point, Bob, who is ethically obligated to keep company-sensitive information private, promptly got in touch with my manager (and his former manager, Tom) and said, "Hey, Kurt is looking for a new job." A couple weeks later, Company B made me an offer that I'd have been a fool to decline, so I took it. I then went to my manager, Tom, and put in my two weeks' notice. Imagine my surprise when it became clear that he already knew about this position. I did a little investigation and quickly discovered the chain of events outlined above. By blind luck, there don't appear to have been any negative ramifications of this. (Or, at least, none that I'm aware of at the moment.) But that doesn't excuse the fact that it happened in the first place. If I'd had a different manager (I have a pretty good professional relationship with Tom), this could have gone very badly, very quickly. I could have been fired, it could have besmirched my professional reputation, etc., etc.
In this particular case, I appear to have dodged a bullet, but I'm still pretty ticked that I got shot at in the first place. I'm reminded of the line from Shakespeare's Othello: "...he who filches from me my good name, robs me of that which enriches him not and makes me poor indeed."

Here's the deal. Those of us who are security people need to hold ourselves to a very high ethical standard. Let's be honest: at some point in the past, we've all probably done things (hopefully very minor things) we shouldn't have, or possibly used our position to our advantage. To some degree, that's human nature. (Think of a police officer pulling strings to get out of a speeding ticket, for example.) The key words there, though, are "in the past" and "used our position to our advantage." In this case, Bob had absolutely nothing to gain by releasing this information, other than to attempt to impress his former manager, Tom, with how "wired-in" he is. Were there some sort of governing body for security professionals, I would have reported Bob in a heartbeat. There isn't, though, so Bob gets to go on his merry way, coming into contact with sensitive information and potentially divulging it to others as he sees fit. In short, Bob should be ashamed of himself. It is incumbent upon us as professionals to give careful thought to the potential ramifications of leaking information to which we become privy. The actions of Bob were disgraceful, and we, as professionals, must do our best to stamp out such behavior whenever and wherever we find it.

Feb 2, 2007

Linux Service Boot Order

I'm going to start including little notes and tidbits here for my own reference and hopefully for the reference of others. I'll label these as "notes."

To change the load order of services at boot time, first determine the runlevel ([root@host]# runlevel). Once done, go to the appropriate runlevel directory. I'm using CentOS and I'm running at runlevel 3, thus the directory I want is /etc/rc.d/rc3.d. There are two groups of scripts in this directory: those that start with K (these are the kill scripts) and those that start with S (these are.....SURPRISE! the startup scripts). A representative file listing might look like this:

  K05saslauthd
  K35smb
  S10network
  S12syslog
  S55sshd
  S99local
  S99shorewall

And so on ad infinitum. The number represents the execution order. Most recently, I wanted to move the order in which Shorewall was started. By default it was "S99shorewall." I wanted it to start right after networking (which was S10network), so I renamed the file to "S11shorewall". Simple as that.
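
For reference, the whole operation boils down to a single rename, since SysV init runs the S* scripts in lexical order. Here's a sketch using a scratch directory as a stand-in for /etc/rc.d/rc3.d:

```shell
# Sketch of reordering a service at boot on a SysV-style init.
# A scratch directory stands in for the real /etc/rc.d/rc3.d.
rcdir=$(mktemp -d)
touch "$rcdir/S10network" "$rcdir/S99shorewall"

# S* scripts run in lexical order, so renaming S99shorewall to
# S11shorewall makes it start right after S10network:
mv "$rcdir/S99shorewall" "$rcdir/S11shorewall"

ls "$rcdir"    # now lists S10network and S11shorewall
```

(On Red Hat-ish systems, chkconfig will manage these symlinks for you, but it's worth knowing what it's actually doing under the hood.)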

Dec 11, 2006

An Open Letter to the Open Source Community

Sorry for the delay between posts. Between the whole holiday season thing, having a cold, having a 1st birthday for my younger daughter, etc., time sorta got away from me. So I figure I'll get things restarted with something that has irked me for quite some time, and it came to the surface again this morning.

This morning, I got an IM from a friend of mine. Here it is: "...but I'm NOT using ANYTHING called Ubuntu: Feisty Fawn. What kind of idiot slapped that on?" My friend touched upon something that is, I think, indicative of a significant hurdle that Open Source projects will need to overcome if they ever expect to be taken seriously and to have even the tiniest chance of stepping out of the shadows. Before I dive in, let me state for the record that I am a die-hard member of the Open Source community. I am an ardent supporter of Open Source; if there is an Open Source equivalent for something, I'm using it. That being the case, while the following may come off as a bit vitriolic here and there, it is not to be taken as a slap at the Open Source community in general. It is merely an attempt at a wake-up call to the community members and, hopefully, a call to action.

In short, I humbly ask the Open Source Community to please, please, please stop giving software (and branches, tags, and sub-versions thereof) stupid names. Seriously. I know that you may think it is funny, but it really isn't. The aforementioned "Feisty Fawn" thing just illustrates the point. There are tons of such names out there, ranging from absurd to, quite frankly, offensive. Every place I've worked, I have been a major advocate for Open Source software. It is very difficult to be taken seriously in meetings with management when you say "I have a potential solution," and then explain that your solution involves the use of Feisty Fawn, Tiny Sofa, Oinkmaster, BitchX, SheepShaver, awffull, lame, moomps, seahorse, smeg, gimp, spit, yoltia, suck, torsmo, valknut, vomit, and/or zile. Naming things, whether we're talking about naming software, children, or pets, can be a difficult process. When giving something a name, though, you have to ask yourself a few simple questions.

  1. Am I using this name because I think it is clever or cute? If the answer is "yes," then keep looking. You might think it is cute or particularly clever today, but odds are that you won't always find it so amusing. (Here I cite a person my sister-in-law knows whose first name is Frodo. Yeah, as in Baggins. I'm sure Frodo's parents thought the name was funny and probably even a little cute. I've got a dollar, though, that says if we asked our friend Frodo what he thought of his name, he'd have a somewhat different opinion.)
  2. Am I using this name because it is an inside joke? This is really just a slight variation on the previous question. Again, if you answer "yes," do yourself and everyone else a favor and keep looking.
  3. Is this a name that I'll be happy with 10 years from now? This one seems pretty obvious, but I'm always shocked at the number of people who don't really think this one all the way through.
  4. Is this name something I would be embarrassed to say in front of my grandmother? I like to call this one "the grandma rule." Here I cite such names as "suck" and "vomit." Inherently offensive? Not necessarily. Good names for software? Not even close.
  5. And finally, is this a name that I'll get tired of hearing?
While we're still on the subject of what is and isn't good naming style for an Open Source project, let me touch briefly on the subject of acronyms and initialisms. In general, try to avoid them. Sometimes it works; take PERL and even NATO, for example. Most of the time, though, it doesn't. It usually ends up producing some sort of gibberish that is difficult to spell, impossible to pronounce, and equally impossible to remember. Even in cases where you can pronounce and remember the acronym, it still may be a bad idea. The definitive example of this is GIMP (GNU Image Manipulation Program). This acronym is derogatory and offensive. I can hear people already: "but it was a joke" (see question #2, above), or "it isn't intended to be insulting." To this I reply that, in general, things operate not on reality but on the perception of reality. It may not have originally been intended to be insulting, but it is. So change it, simple as that. Ethereal successfully changed its name to Wireshark, so if they can do it, so can GIMP. (The Wireshark name change came about for legal reasons, so they had no choice, but the name change concept applies equally well to GIMP.)

And then, of course, we have the matter of recursive acronyms. Once upon a time, this apparently seemed like a good idea, and it has since become a strange tradition. Here are a few examples: GNU stands for "GNU's Not Unix." Clever, huh? PHP stands for "PHP Hypertext Preprocessor." And LAME stands for "LAME Ain't an MP3 Encoder." Please oh please oh please put an end to this. It never was funny or clever, and over time it has only become more and more annoying.

So what conclusions can we draw from all of this? Basically, take care when naming Open Source projects. If Open Source is ever to come into its own, it must be taken seriously by those who develop it. While GIMP and PHP and Oinkmaster may have become serious, production-quality software, their names suggest that at the early stage, they were each named because someone thought it was funny. If we, as members of the Open Source community, want our efforts, our software, and our plight to be taken seriously by the industry at large, we must first take ourselves seriously. This is the root of much of the resistance to Open Source software. Even Microsoft's previous attempts at disinformation about Open Source software hinge upon this. How could we expect others to take us seriously when we (apparently) don't even take ourselves seriously? Am I saying that Open Source software needs to become stuffy and boring? Of course not. But the Weltanschauung of the industry at large stems predominantly from how we perceive ourselves. Times have changed and as Darwin suggests, we must either adapt or die. As such, we must treat our work within the Open Source community with care and humility, and perhaps even a touch of reverence. To do otherwise is a disservice to our work, to ourselves, and to our community.

Don't Kill the Penguin!
Recursive Acronym
Ubuntu Development Code Names

Nov 14, 2006

About @#$%ing time...

Microsoft has finally released a Hotfix for the Windows XP Wireless Client, and all I can say is that it's about friggin' time. The Internet Storm Center has a description of the Hotfix HERE. Among other things, this fix addresses one of the most annoying things (from a Windows XP wireless perspective) I've encountered in a long time: the random Windows XP wireless network. If you've ever used Kismet in the vicinity of Windows XP machines, you know what I'm talking about. Not only does XP continually cycle through its list of preferred wireless networks (which leaks far too much information and makes it waaaaaaay too easy to determine whose laptop you're looking at), but you also get the weird random SSID strings. If you just let Kismet run for days or weeks at a time, it isn't at all uncommon to end up with a list of several hundred or even several thousand probe requests because of this odd XP behavior alone. Here's a little piece from the Hotfix page:

In Windows XP with Service Pack 2, Wireless Auto Configuration tries to match preferred wireless networks to wireless networks that broadcast their network name. If no network matches a preferred wireless network, Wireless Auto Configuration sends probe requests to determine whether the preferred networks are nonbroadcast networks. In this manner, a Windows XP wireless client advertises its list of preferred wireless networks. An observer may monitor these probe requests and configure a wireless network by using a name that matches a preferred wireless network. If the wireless network is not secured, this network could enable unauthorized connections to the computer.
I understand Microsoft's intent in designing their wireless client to work this way: obviously, they are trying to make connecting to wireless networks easy. But they've made it easy at the expense of security, and on an OS that is notoriously difficult to protect without extensive third-party software.
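The behavior the Hotfix page describes boils down to a simple two-pass algorithm. Here's a rough Python sketch of the logic as I understand it; the function and parameter names are mine, not anything from the actual Wireless Auto Configuration code:

```python
def choose_network(preferred, broadcasting, probe_response):
    """Approximate the preferred-network matching described above.

    preferred:      SSIDs in the user's preference order
    broadcasting:   set of SSIDs currently heard in beacon frames
    probe_response: callable simulating a directed probe for a
                    nonbroadcast SSID (True if the AP answers)
    Returns (chosen_ssid_or_None, list_of_ssids_leaked_on_the_air).
    """
    # Pass 1: match preferred networks against broadcast SSIDs.
    for ssid in preferred:
        if ssid in broadcasting:
            return ssid, []
    # Pass 2: nothing matched, so probe each preferred network by
    # name -- this is the step that leaks the whole list on the air.
    leaked = []
    for ssid in preferred:
        leaked.append(ssid)
        if probe_response(ssid):
            return ssid, leaked
    return None, leaked
```

Every SSID that ends up in `leaked` goes out in a cleartext probe request, which is exactly what Kismet picks up in the parking lot.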

By strange coincidence, this Hotfix was released almost to the day of the 5th anniversary of the release of Windows XP, and this unusual wireless behavior has been a known issue since that time. Why in the world did it take 5 years to release a fix for this? Ok, I grant you that some of the other things this Hotfix addresses weren't big issues 5 years ago. But that strange "parking" behavior? C'mon. If I'm a Bad Guy, all I have to do is sit in the parking lot with Kismet running and listen for Windows XP machines to start cycling through their lists of preferred networks. Depending upon the number and frequency of these probes, I can start making some fairly educated guesses about these wireless clients, and with a little extra effort on my part, I could set up my trusty Linux laptop in AP mode and start trying to trick unsuspecting users into connecting to me, at which time I can start collecting usernames and passwords and whatnot. If I'm so inclined, I can then take this information, compare it to data that I pull down from WiGLE, and even start making guesses about where these users are located and the places they frequent, based solely on this hemorrhaging of information from the Windows XP Wireless Client. If you use Windows XP wirelessly, install this Hotfix immediately. In addition, be very careful about who you are talking to wirelessly. You never know who might be listening.
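To make the parking-lot scenario concrete, here's a small hypothetical Python sketch of the first step an attacker would take with harvested probe requests: group them by client and see which SSID is the most promising one to impersonate. The (mac, ssid) pair format is purely illustrative; it is not Kismet's actual log format.

```python
from collections import Counter, defaultdict

def summarize_probes(probes):
    """Given (client_mac, ssid) pairs harvested from probe requests,
    return each client's leaked preferred-network list plus the most
    commonly probed SSID -- the obvious candidate for a rogue AP.
    """
    per_client = defaultdict(list)
    counts = Counter()
    for mac, ssid in probes:
        # Skip blank/broadcast probes; count each SSID once per client.
        if ssid and ssid not in per_client[mac]:
            per_client[mac].append(ssid)
            counts[ssid] += 1
    best = counts.most_common(1)[0][0] if counts else None
    return dict(per_client), best
```

A few hundred probes collapse into a handful of per-laptop network lists, which is exactly why this behavior leaks so much.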

Nov 8, 2006

Tools of the Trade, Part III

A few more "must-have" tools to keep on hand:

  • 3D Traceroute. Portable! Gotta have a good traceroute program, and 3D Traceroute is about as good as it gets.
  • Sam Spade. Fantastic tool for IP lookups, DNS info, etc. The site appears to be unavailable at the moment, but the Sam Spade tool is available for download at lots of sites around the net.
  • Wireshark. A quality packet sniffer is just something you must have; you can't even hope to dig into what is going on throughout your network without one. Formerly known as Ethereal, Wireshark is the cream of the crop.
  • Cygwin. Cygwin provides a Linux-like environment in Windows. If you can afford the disk space, it is probably worth doing a full install. Tons of tools that we know and love from Linux are now available in Windows; for me, it makes life much less stressful.
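Since these tools are all, at bottom, ways of getting structured data about your network, it pays to be able to pull their output into scripts. As a small illustration, here's a rough Python sketch that parses classic traceroute-style hop lines; output formats vary quite a bit by platform and version, so treat this as a starting point, not a robust parser:

```python
import re

def parse_traceroute(output):
    """Parse traceroute/tracert-style hop lines into a list of
    (hop_number, host) pairs; '*' timeouts become None.
    """
    hops = []
    for line in output.splitlines():
        # Hop lines start with a hop number, then the host (or '*').
        m = re.match(r"\s*(\d+)\s+(\S+)", line)
        if m:
            host = None if m.group(2) == "*" else m.group(2)
            hops.append((int(m.group(1)), host))
    return hops
```

Feed it the text output of your traceroute of choice and you get data you can diff, graph, or alert on.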

Mapping wireless networks

I recently had reason to do a little wireless investigation at work. There was some concern that a wireless access point might have been attached to the network and set up insecurely. So I grabbed my laptop and my USB GPS device and scampered off like a kid on his way to the candy store. I did some passive investigation from the parking lot with and . If you aren't familiar with these tools, I can't recommend them strongly enough. Used together, the WiFi data you can collect is amazing, especially in conjunction with GPS. Ok, so you've got all this data...now what? That's where WiGLE comes into play. WiGLE, the Wireless Geographic Logging Engine, is a clearinghouse for files collected by people all over the world while wardriving, warwalking, wardancing, or warskippingaboutlikealoon. You upload your file to the WiGLE site, and it crunches the data and makes the results available for download. Using one of the WiGLE clients (I really like the Java-based client, JiGLE), you can download data for any number of areas, and it gives you maps and locations of all of the identified APs. JiGLE also lets you view area polygons displaying the coverage area of a given AP.

With a little bit of effort, you can even import JiGLE data into Google Earth. Now that, friends and neighbors, is cool; simple as that. WiGLE is a great tool to have in your back pocket.
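Speaking of Google Earth, the glue needed for that kind of import is tiny. Here's a minimal, hypothetical Python sketch that turns (ssid, lat, lon) tuples, the essentials of any wardriving data set, into bare-bones KML; this is illustrative KML, not JiGLE's actual export format:

```python
def aps_to_kml(aps):
    """Render (ssid, lat, lon) tuples as a minimal KML document
    that Google Earth can open, one Placemark per access point.
    """
    placemarks = "".join(
        "<Placemark><name>{}</name>"
        "<Point><coordinates>{},{}</coordinates></Point>"
        "</Placemark>".format(ssid, lon, lat)   # KML wants lon,lat order
        for ssid, lat, lon in aps
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2">'
            '<Document>{}</Document></kml>').format(placemarks)
```

Write the returned string to a `.kml` file, open it in Google Earth, and every AP shows up as a pin on the map.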

Nov 6, 2006

Who says network people aren't funny?

I was working on a couple ideas for a few new posts and I happened to blindly stumble across this story: . With a title like that, I had to investigate. Ahhhhh.....good humor. Don't get me wrong, it won't have you howling with laughter or anything, but it was just the thing to lighten up an otherwise dreary Monday morning.