Threat Intelligence
As discussed in the several studies on name collisions published to date, determining which queries are at risk, and thus how to mitigate the risk, requires qualitative analysis. Blocking a second level domain (SLD) simply on the basis that it was queried for in a past sample set runs a significant risk of false positives. SLDs that could have been delegated safely may be excluded on quantitative evidence alone, limiting the value of the new gTLD until the status of the SLD can be proven otherwise.
As each day brings new revelations about surveillance online, we are starting to see increasing activity in national legislatures intended either to establish more control over what the security services can do to their own nationals (in countries like the US), or to limit access by foreign secret services to the personal information of their citizens (in countries like Brazil). Unfortunately, neither of these approaches addresses the underlying problem: we have a paradigm for surveillance that's fit for the analogue past, not the digital present, let alone the future.
For several years, DNS-OARC has been collecting DNS query data "from busy and interesting DNS name servers" as part of an annual "Day-in-the-Life" (DITL) effort (an effort originated by CAIDA in 2002) that I discussed in the first blog post in this series. DNS-OARC currently offers eight such data sets, covering the queries to many but not all of the 13 DNS root servers (and some non-root data) over a two-day period or longer each year from 2006 to present.
As widely discussed recently, observed within the ICANN community several years ago, and anticipated in the broader technical community even earlier, the introduction of a new generic top-level domain (gTLD) at the global DNS root could result in name collisions with previously installed systems. Such systems sometimes send queries to the global DNS with domain name suffixes that, under reasonable assumptions at the time the systems were designed, may not have been expected to be delegated as gTLDs.
It is a safe assumption that if you are reading this post, you like technology. If that is the case, then you understand the tremendous economic, cultural, and human rights benefits an open, universal, and free Internet provides. That freedom is under attack. And it is our responsibility, as stakeholders in a successful Internet, to act as a counterweight to governments and to keep an open dialog on the topic.
How do we harden the Internet against the kinds of pervasive monitoring and surveillance that have been in the news recently? While full solutions may require political and legal action, are there technical improvements that can be made to the underlying Internet infrastructure? As discussed by IETF Chair Jari Arkko in a recent post on the IETF blog, "Plenary on Internet Hardening", the Technical Plenary at next week's IETF 88 meeting in Vancouver, BC, Canada, will focus on this incredibly critical issue.
DNS tunneling -- the ability to encode the data of other programs or protocols in DNS queries and responses -- has been a concern since the late 1990s. If you don't follow DNS closely, however, DNS tunneling likely isn't an issue you would be familiar with. Originally, DNS tunneling was designed simply to bypass the captive portals of Wi-Fi providers, but as with many things on the Internet it can be used for nefarious purposes. For many organizations, tunneling isn't even a known suspect, and it therefore poses a significant security risk.
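To make the technique concrete, the sketch below shows the basic idea behind tunneling: arbitrary data is base32-encoded (DNS names are case-insensitive and restricted to a small character set) and split into labels of at most 63 characters, each sent as a lookup under a domain the tunnel operator controls. The domain `t.example.com` and the sequence-number scheme are illustrative assumptions, not any particular tool's protocol; real tunnels also carry data back in responses (e.g. TXT records), which is omitted here.

```python
import base64

def encode_chunks(data: bytes, domain: str = "t.example.com", max_label: int = 63):
    """Split data into DNS-safe base32 labels and build query names.
    't.example.com' is a hypothetical tunnel endpoint for illustration."""
    b32 = base64.b32encode(data).decode().rstrip("=").lower()
    # DNS labels are limited to 63 octets, so split the encoding accordingly.
    labels = [b32[i:i + max_label] for i in range(0, len(b32), max_label)]
    # Prefix each name with a sequence number so the far end can reassemble.
    return [f"{seq}.{label}.{domain}" for seq, label in enumerate(labels)]

def decode_chunks(qnames):
    """Recover the original payload from observed query names."""
    ordered = sorted(qnames, key=lambda q: int(q.split(".")[0]))
    b32 = "".join(q.split(".")[1] for q in ordered).upper()
    b32 += "=" * (-len(b32) % 8)  # restore base32 padding
    return base64.b32decode(b32)

queries = encode_chunks(b"exfiltrated secret")
assert decode_chunks(queries) == b"exfiltrated secret"
```

Because each query looks like an ordinary lookup for a long hostname, this traffic passes through any network that allows outbound DNS, which is why defenders watch for unusually long labels and high query volumes to a single domain.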
Google has received a lot of press regarding its Project Shield announcement at the Google Ideas Summit. The effort is being applauded as a milestone in social consciousness. While on the surface the endeavor appears admirable, the long-term impact of the service may be greater than Google had hoped for. Project Shield is an invite-only service that combines Google's DDoS mitigation technology and Page Speed service...
Today at the RIPE 67 event in Athens, Greece, IETF Chair Jari Arkko gave a presentation on "Pervasive Monitoring and the Internet", where he spoke about the ongoing surveillance issues, asking: What do we know? What are the implications? What can we do? Similar to his earlier article on the topic, Jari looked at the overall issues and spoke about how Internet technology should better support security and privacy.
I often think there are only two types of stories about the Internet. One is a continuing story of prodigious technology that keeps shrinking in physical size while continuing to dazzle and amaze us... The other is a darker evolving story of the associated vulnerabilities of this technology, where we've seen "hacking" turn into organised crime and from there into a scale of sophistication that is sometimes termed "cyber warfare". And in this same darker theme one could add the current set of stories about various forms of state-sponsored surveillance and espionage on the net.