“Smartphones (and tablets, WdN) are invading the battlefield”, the Economist reports on its website of 8 October 2011. On the same day, several news sites report on the hacking of U.S. drones. (“They appear friendly.” Keyloggers?) Is this a coincidence?
Because we can
Yes, of course it is, but still. This hack is one in a long line of reports on hacking and vulnerabilities over the past year: private companies, websites, security certificates, the EU Commission, smartphones, politicians’ voice mail boxes; the list is endless. And then comes this highly optimistic article on how the Internet will invade battlefields and make the lives of our young soldiers safer. Some quotes.
GPS will show a soldier’s own position and that of friendlies as well, so friendly fire becomes less of a risk. Through an app, information from drones and reconnaissance balloons can be shown on the screen in real time. The handheld devices will be made more robust and battery life can be extended. Apps can be downloaded in seconds and, if need be, adjusted within hours. All this good stuff will come from consumer companies, as their research and development budgets vastly exceed those of the military. And, hang on here, soldiers will bring their own devices to the battlefield.
I was already getting a bit perplexed reading this article, but that last line made me fall out of my chair. Once again, it seems people are lured by technical possibilities, combined here with stringent budget cuts. The image of war as a computer game, played with real kids, also sprang to mind.
What are the potential risks?
Let’s look at this at a very basic level. The U.S. military is going to make use of smartphones and tablets on the battlefield. Just a few thoughts.
a) Let’s start with where almost all devices are made nowadays. China? If I remember correctly, Misha Glenny already wrote about unsolicited software being inserted onto devices in Chinese factories in his book McMafia (2008).
b) The software mentioned in the article is Android. As far as I am aware, from reports I have read, Android is the least safe operating system, because it offers an open platform.
c) The first malware-hosting apps, keyloggers and autodialers for smartphones have already been reported on.
d) AV security is not delivered as standard on smartphones. Remember the bring-my-own sentence?
e) If I can think up the possibility of a GPS hack, some smart hacker will probably figure out how to do it eventually. What a soldier is shown on the screen of his smartphone after that is anybody’s guess. And who else can read that soldier’s location?
f) And what can they do with a hacked drone?
g) What is the security level of the private companies involved? Who checks these levels, and where is their software made?
h) What does a young soldier, bored out of his mind, deep in the desert, do with smart devices in his spare time? Especially when they are his own? Let’s guess.
The truth is that the chances of a hack through unsafe use, or worse, just plain use, are not imaginary; they are a potential threat to every soldier, every army, every battle. The Internet was never created for this sort of use. And still people push forward into this uncharted cyber environment. They boldly go, because they can.
Cyber awareness seems fundamentally lacking
Cyber awareness is at present one of the most underestimated measures in the world. It is time that the people responsible for strategic choices, whether in the military, government, industry or elsewhere, become aware of the issues at stake and the risks involved, and stop being mesmerized by possibilities (and by saving money). (The same goes for everyone else, for that matter.) Not everything that is technically possible is also smart to follow up on. When something cyber is involved, for one reason or another people tend to stop thinking clearly. I am not in a position to judge whether the security level of everything mentioned in this article is adequate or not. For now I will say that I will not be surprised when something goes horribly wrong. Not even when the rest of the world delivers its usual line: “Now how was that possible? We never foresaw that this could happen.”
The match between possible and careful
Sometimes it seems to me as if the people defending their systems from the Internet and the people inventing, deciding on and adopting new tools live on different planets, with the first group always losing. Or is it that the knowledge of decision makers and their teams is not profound enough? There needs to be a match between these two groups. I can’t remember who first told me to “think before you act”. It was a long, long time ago. Take that lesson to heart, I’d say, and get the right people in before you decide on your next Internet steps.
1. The U.S. Army Predator drone malware turned out to be tracking/keylogger software added by the army itself, but without telling anybody else; something about tracking the efficiency of Predator drone operators.
2. An open platform doesn’t necessarily make an OS insecure (Linux, say, is a good counter-example). Any OS at all, for a phone or a PC, can be designed and configured with sensible security defaults.
Portable phones/smartphones are already being made milspec, with standard military security policies enforced [standard security policies are well known in the corporate world as well, but for a phone to be classed milspec the engineering has to be rather more robust, and there are other requirements]; a sketch of what such an enforced policy baseline can look like follows below.
Soldiers on active deployment may or may not be allowed to take along personal gear of any sort (they are sometimes asked to leave even letters from home behind, let alone personal cellphones).
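To make the point about enforceable defaults a bit more concrete: on Android, the platform named in the article, an organisation can push a baseline security policy onto managed handsets through the platform’s device administration API. The sketch below is only an illustration of that idea, not anything quoted from the article or from an actual military deployment; the PolicyReceiver class name is a hypothetical placeholder for a device admin component the user has already activated.

    // Minimal sketch of a baseline device policy on Android (API level 11+ assumed).
    // PolicyReceiver is a hypothetical, already-activated DeviceAdminReceiver subclass.
    import android.app.admin.DevicePolicyManager;
    import android.content.ComponentName;
    import android.content.Context;

    public class BaselinePolicy {

        public static void apply(Context context) {
            DevicePolicyManager dpm =
                    (DevicePolicyManager) context.getSystemService(Context.DEVICE_POLICY_SERVICE);
            ComponentName admin = new ComponentName(context, PolicyReceiver.class);

            // Require an alphanumeric unlock code of at least eight characters.
            dpm.setPasswordQuality(admin, DevicePolicyManager.PASSWORD_QUALITY_ALPHANUMERIC);
            dpm.setPasswordMinimumLength(admin, 8);

            // Force the screen to lock after at most one minute of inactivity.
            dpm.setMaximumTimeToLock(admin, 60 * 1000L);

            // Request full storage encryption where the hardware supports it.
            dpm.setStorageEncryption(admin, true);
        }
    }

None of this makes a handset milspec, of course; it only shows that sensible defaults can be enforced centrally rather than left to the individual soldier.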
In a statement published by the Washington Post, the U.S. military claimed that the malware was a keylogger designed to capture passwords in a gaming environment. In other words, the issue is being downplayed. Remember the claim that “it appeared friendly” at first?
The first questions that spring to mind are:
- What else did the keylogger record?
- What was done with that information?
- How did it get there in the first place?
The bored soldier? I can’t really think of another way. These are very fundamental and worrying questions, as we are discussing national security.
The concept of bring-your-own is popular. It is, for example, a topic in the seminars around an IT trade fair in the Netherlands next month. But let’s face it: ICT people in organisations are asked to connect private devices to the organisation’s network on a daily basis. Do they have the internal position, as in power, to say “no”, at least for as long as it takes to make an inventory of the consequences, and do they have the knowledge to make that risk assessment? Reputation has a tendency to leave an organisation quite fast, but where ICT is concerned the people involved don’t seem to realise this.
I liked this quote (sorry, I don’t know by whom): “No matter how foolproof we make our software, someone will come up with a bigger fool.” So isn’t it also about giving fools less opportunity?