A couple of years ago I started a mailing list where folks not necessarily involved with the vetted, trusted, closed and snobbish circles of cyber crime fighting (some founded by me) could share information and be informed of threats.
In this post I explore some of the history behind information sharing online, and explain the concept behind the botnets mailing list. Feel free to skip ahead if you find the history boring. Also, do note that the history in this post is mixed with my own opinions. As I am one of the only people who were there in the beginning and lived through all of it, I feel free to do so (in my own blog post).
As I conclude, we may not be able to always share our resources, but it is time to turn the tide of the cyber crime war, and strategize. One of the strategies we need to use, or at least try, is public information sharing of “lesser evils” already in the public domain.
History
It was my strong conviction that the bad guys (criminals!) already had access to all of this data; now we know they do. Further, they could test their own creations against anti-virus detection, either on their own, to verify they are not detected, or using a tool such as VirusTotal. They could use honey pots and any number of other sources of public information. They could also always measure their success ratios, and they do.
On the other hand, the Good Guys (TM) did not share. What sharing did happen was very limited and limiting. Aside from that, because it was so scarce, it was (and to a degree still is) kept secret within a select group of friends. Others would not be allowed in very easily, nor should they have been, for obvious trust reasons.
System administrators and security researchers had to get their information from their own logs or from public vendor reports of limited value. This secrecy also had the consequence that the public was not aware a cyber crime problem even existed and, later on, was always roughly three to six years behind the curve in accepting what was actually happening.
By extension, when many countries and organizations became, quite literally, scared after the Estonian “war”, they started creating tech policy based on misconceptions and on information gleaned from the news media and vendor reports.
The black hat effect
The anti-virus industry has a history of being strict on sharing. That is as it should be, and quite proper. In the early 1990s roughly one virus was released every month. Then someone published a study of one, and within a month 50 new variants came out. Disclosure was a bad idea. However, times, they are a-changing.
When malware could be found by anyone running a honey pot, surfing the web, opening their inbox or Googling for it, the strong restrictions on sharing made little sense as far as “aiding the bad guys” (read: criminals) was concerned. The strong argument that remained for being strict on sharing was “we are not black hats, we are careful with these things!”
This is fine, and acceptable. It is also burying our heads in the sand. Sympathetic as that argument was, change was required once the big worms were out (circa 2003-4) and security professionals all over the world had no information. Worse, while most security vendors, and therefore the media, were concentrating on the big worms, exponentially bigger botnets were out there, undisturbed.
A new industry formed which would later be called “Anti Trojan”, as its products would detect these bots (Trojan horses) and remove them, while many anti-virus products did not consider them their problem.
Beginning in 1997, I made many approaches and tried to get the anti-virus industry involved, telling them they were only detecting 20 to 30 per cent of all malware, to no avail. In 2004-5 they started playing catch-up. This happened again with spyware, two to three years late (a new industry formed, with the anti-virus vendors late to the game once more), and again, two to three years late, with rootkits.
At that point in time active sharing was established between vendors (not just anti-virus), academia and others. Companies such as Checkpoint, Cisco or, “God forbid”, Microsoft had “no business” dealing with samples according to the anti-virus industry, so they went elsewhere, with people such as myself driving this sharing and, yes, taking the heat.
The strict sharing policies had an extra motive (on the part of the anti-virus industry), which made little sense except as business sense. They had every marketing intention of maintaining an iron grip on malware samples, so that only they could sell products and control the information flow. It was brilliant for a few years, but they also marginalized themselves and, unable to change in time, were later forced to become more generic security vendors just to catch up.
They now had massive competition and were out of touch. This reminds me of the copyright wars in the music industry.
This grip was broken as such information became readily available (a fact which was, as mentioned, ignored by the anti-virus community). I can take a very big part of the credit for breaking this iron grip, by facilitating sharing communities where vendors, researchers, law enforcement and others not directly of the anti-virus world could exchange samples as well as analysis. As I was a part of the anti-virus world myself, this made me persona non grata with some, but thankfully not for more than a year or so.
Still, vetting and silence were prerequisites in the newly formed communities. Trust was key. Some of the new mailing lists and communities formed by me were DA and MWP. Later copycats include malaware and II (not as vetted, but now more relevant as far as malware sharing goes).
Still others, such as the ISP world, fighting this problem on the network side, would have to create their own communities. They would later refuse to accept the researchers, much like the researchers would not accept them, and for the very same reasons, only to change their minds once these folks started working on their own (on mailing lists such as DA and MWP).
No one wants to be considered a black hat, but times change and necessities facilitate evolution.
Sharing C&C information
It was a long journey, but we kept running into the same problems. We’d be fighting malware infecting a hundred thousand to three million users a day, with hundreds of such incidents every single day. Yet the public did not know about it, and the security vendors would be behind, each naturally concentrating on its own niche.
We changed the world, enabled better sharing and created new trust models. And still, we would not truly cooperate. Cooperation and resource sharing aside (after all, many in the industry have financial agendas, as they should), we could not get the bigger picture straightened out. We needed to share intelligence on millions of stolen identities every day, but still couldn’t get this malware sharing out of the way.
Command and control (C&C or C2) information for botnets, for example, was barred and restricted by the newly formed security and network operations communities. After all, sharing it would mean helping the criminals. No? More than that, we’d no longer have control.
Much like the anti-virus industry before them, the anti-terrorism folks in government and any other reactive fighters, the ISPs and operations professionals (me included) were indeed doing great work. We’d be fighting malware and botnets, but the problems just got worse, even if we were more organized.
A couple of years later, getting these C&Cs offline was no longer useful, as botnets had graceful degradation and backups, immediately “jumping” somewhere else, undisturbed.
New researchers and organizations were refused acceptance once again, and started working on the problem on their own, sharing their information and eventually outgrowing the original communities, now set in their ways. Such is the way of the world. This showed me how sometimes diversity, rather than cooperation, can be great. Watching the old mistakes repeated, and seeing that they were no longer mistakes because the landscape had changed, was something I came to appreciate.
My advocacy was to treat C&Cs as intelligence sources rather than targets, but the intelligence discussion is for another time in another post.
Soon, C&C information was publicly available, and yet—to the public and policy makers, the cyber crime problem did not exist.
Enter the botnets@ mailing list
It was time for a change. Facing much resistance, I created a public mailing list where the public, the sysadmins and the security researchers could share information, learn and fight cyber crime.
The response was staggering. Dozens of contributors emailed in with detailed information, and yet we felt uncomfortable about it. We treated folks as if they were doing something wrong by sharing in public, and sent mixed messages.
New groups were formed, and older groups got new recruits (such as Shadowserver, which the mailing list helped). It was still a win, but the mailing list had to go.
Today, about two years later, the botnets mailing list has been revived and in the past day the response has once again been staggering.
Folks share their information, get informed of new threats in a language they understand (tech) and talk to each other. Moreover, they understand the risks, and the ugly face of Internet security is out there for all to see. This time we need to be ready to accept this change.
Public fighting
Sharing information with the public has always been something I was personally attacked for, and yet, how else are you supposed to win a war if the people you fight for don’t even know it is happening, or that fighting it is needed?
Last year, Estonia was attacked on the Internet by Russians [PDF]. It cannot be proven whether it was an Internet-style public uprising or a state-sponsored action. Still, it re-affirmed some of my beliefs about effecting change and community forming.
To fight a war, you have to be involved and engaged. On the Internet that is very difficult, but the Russians found a way. It is a fact that while we made much progress in our efforts fighting cyber crime, we had nearly no effect whatsoever on the criminals and the attackers. None. They maintain their business and we play at writing analysis and whack-a-mole.
With the botnets mailing list, I am borrowing a page from the apparent Russian cyber war doctrine: getting people involved and engaged, personally aware and a part of what’s going on.
It can’t hurt us, and perhaps now, four years overdue and two years after the previous attempt, we may be ready to give it a go and test the concept.
Perhaps now regular malware can become something regular professionals deal with, low anti-virus detection of samples can become public knowledge, and vetted communities can think strategically and respond to more problematic matters, such as intelligence handling of millions of stolen identities, or criminal organizations operating not only in Russia and China, but from the San Francisco Bay Area.
We may not be able to always share our resources, but it is time to turn the tide of the cyber crime war, and strategize. One of the strategies we need to use, or at least try, is public information sharing of “lesser evils” already in the public domain.