Android, apps, Computers and Internet, Encryption, Security, Trust

Code Identity and the Android Master Key Bug

Android invasion, Sydney, Australia (Photo credit: Pranav Bhatt)

I was part of the effort to drag MSFT security into the modern era. It was extremely painful. I assumed (perhaps stupidly) that our highly-public lessons would mean other late-comers to the security party would look at our wrecked living room, burned furniture and bad tattoos and then not make the same mistakes we made in our irresponsible youth.

But perhaps no. This Android bug could prove to be extraordinarily bad.

Blowing hash and signing functions so that the underlying code can be changed without the hash and sigs changing is horrifyingly atrocious. This is the code equivalent of impersonating a person with a mask so good nobody, not even the real person themselves, can tell the difference.

The entire value of a chain of trust is that you are limiting the surface area of vulnerability to the code-signing and hashing itself. This bug, if it’s as described, destroys the chain. All bets are off. You’d be better off without the assertions and chain at all: Treat everyone as adversarial and move all critical operations off-device and into something you know you can trust.
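To make that concrete, here is a minimal Java sketch of the divergence pattern Bluebox described. This is not Android’s code: the file name evil.apk and the first-wins/last-wins roles are illustrative assumptions. The point is that the ZIP format happily carries two entries with the same name, and two readers that resolve the collision differently will disagree about what “classes.dex” contains.

    import java.io.ByteArrayOutputStream;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.util.Arrays;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipInputStream;

    // Sketch of the divergence pattern behind the bug. "evil.apk" is a
    // hypothetical archive carrying TWO entries named "classes.dex";
    // the ZIP format itself does not forbid duplicate names.
    public class DuplicateEntryDemo {

        // Walk the archive, keeping either the first or the last copy of
        // each name. A verifier that builds a name->bytes map gets
        // last-wins for free; a naive extractor often behaves first-wins.
        static Map<String, byte[]> read(String path, boolean lastWins) throws IOException {
            Map<String, byte[]> entries = new HashMap<>();
            try (ZipInputStream zis = new ZipInputStream(new FileInputStream(path))) {
                for (ZipEntry e; (e = zis.getNextEntry()) != null; ) {
                    ByteArrayOutputStream out = new ByteArrayOutputStream();
                    byte[] buf = new byte[8192];
                    for (int n; (n = zis.read(buf)) != -1; ) out.write(buf, 0, n);
                    if (lastWins) entries.put(e.getName(), out.toByteArray());
                    else entries.putIfAbsent(e.getName(), out.toByteArray());
                }
            }
            return entries;
        }

        public static void main(String[] args) throws IOException {
            byte[] checked = read("evil.apk", true).get("classes.dex");
            byte[] loaded = read("evil.apk", false).get("classes.dex");
            // If these differ, the bytes that were checked and the bytes
            // that run were never the same file. That is the whole attack.
            System.out.println("readers agree: " + Arrays.equals(checked, loaded));
        }
    }

Which component takes which copy is an implementation detail; any disagreement at all is enough to break the identity claim the signature is supposed to make.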

I am not saying this automagically makes Android phones infinitely vulnerable to horrible deeds. It doesn’t. As of July 4th, 2013, there are no known exploits in the wild that make use of this attack. To really hit something out of the park based on this bug, the bad guys are going to need a way to get an offending app onto a phone. This means getting it through a heretofore unknown exploit in Google Play, or onto the phone via side-loading or another distribution method.

So we’re all okay, right? Well, no. Not necessarily. Perimeter security – which is what Google uses to keep bad apps off of phones in the first place – is notoriously bad. It’s so bad that Google (and Apple, MSFT, and everybody else) use techniques like sandboxing (perimeters within perimeters), privilege separation, code signing and code validation to make up for its deficiencies.

Malicious software has an annoying habit of finding its way onto devices with considerably stronger perimeters than Android’s, so validation of the code that is already on the system is critically important.
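For what “validating the code that is already on the system” means mechanically, here is a minimal sketch using the stock JDK jar verifier. The path app.apk is a placeholder, and this is the generic JAR mechanism, not Android’s installer: each entry’s digest is checked against the signed manifest as the entry is read in full.

    import java.io.IOException;
    import java.io.InputStream;
    import java.util.Enumeration;
    import java.util.jar.JarEntry;
    import java.util.jar.JarFile;

    // Validate a signed archive already on disk by reading every entry in
    // full; the verifier checks each digest against the signed manifest
    // and throws if the bytes were tampered with after signing.
    public class ValidateOnDevice {
        public static void main(String[] args) throws IOException {
            try (JarFile jar = new JarFile("app.apk", true /* verify */)) {
                Enumeration<JarEntry> entries = jar.entries();
                byte[] buf = new byte[8192];
                while (entries.hasMoreElements()) {
                    JarEntry entry = entries.nextElement();
                    try (InputStream in = jar.getInputStream(entry)) {
                        while (in.read(buf) != -1) { /* drain to force the check */ }
                    } catch (SecurityException tampered) {
                        System.err.println("tampered entry: " + entry.getName());
                        return;
                    }
                }
                System.out.println("all entries match their signed digests");
            }
        }
    }

The uncomfortable footnote, given this bug, is that verification like this only helps if the verifier and the loader agree on which bytes each entry name refers to.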

Unfortunately it’s not just the exploit that is distressing. One of the things we eventually got good at at MSFT, back when we routinely had our pants around our ankles on security, was our responses. There’s no way you can survive forever in an environment of constant adversarial attack if you don’t get much better at defending yourself technically AND much better at working with the public about what you’re doing.

In this blog post, Google advocate that companies “should fix critical vulnerabilities within 60 days” and that “after 7 days have elapsed without a patch or advisory, we will support researchers making details available so that users can take steps to protect themselves”.

Google espouses 60 days to fix exploitable bugs and going public one week after private notification. According to Bluebox, they told Google about this via bug 8219321 in February 2013. Seeing as it’s now July, that’s roughly 150 days, which (and I’m not very good at math, so bear with me here) is at least twice 60. It’s especially more than 7 days. I’m not sure how Google are following their own disclosure policy.

Let me speak from personal experience (again): you need to be really good at dealing with the public on security stuff. If you are going to make clear and solid statements that have numbers in them (e.g. 60, 7) then you really REALLY need to make sure you are always on the right side of those numbers.

I am also not saying this automagically makes Google evil. As I said at the beginning of this post – I’ve been there when it was bad. Sometimes you are trying your hardest to be good but you’re tripping and falling down. People see you fucking up and assume it means you are evil when really you’re just trying to stay alive long enough to fix your broken shit and learn so you can be better.

I don’t wish anyone at Google any ill will over this. I’ve been there, it’s no fun.


15 thoughts on “Code Identity and the Android Master Key Bug”

  1. Pingback: On the Android security bug | Ediary Blog

  2. Chris says:

    From http://thedroidguy.com/2013/07/why-you-should-not-worry-too-much-about-the-android-master-key-flaw/

    “Last April, Google tightened security on the Google Play store by forbidding Android app developers from issuing updates to apps available on Google Play outside of the store. So as of now, if an Android app is downloaded from the Google Play store, it will only be updated from the Play Store.”

    That was probably in response to this bug from Bluebox.

  3. Seems like something you might do on purpose.
    With the Snowden ordeal, we can quote Frank Zappa and say, “the odds be fifty-fifty,” that this would make a pretty barn-sized backdoor for programs such as PRISM. It would be asinine to dismiss the thought.

  4. Grugnog says:

    “This means getting it through a heretofore unknown exploit in Google Play or onto the phone via side-loading or another distribution method.”

    If they are able to get an unknown exploit into Google Play, then it seems that would be an issue without the signing bug (since Google would just sign the exploit).

    In my experience, side loading with “untrusted sources” enabled is pretty rare, since most packages users side load are unsigned. Hence, the exploit doesn’t affect most users who actually use side loading.

    Given this, the above methods you mentioned don’t seem like viable attacks – rather the issue would seem to be things like MITM attacks (e.g. triggering updates via untrusted wifi and adding the exploit into the package), or as an assist to another exploit (e.g. making a bug that allows forced side loading practical to exploit without needing “untrusted sources” enabled).

    • blunden says:

      What are you talking about? Google does not sign Play Store apps. They are signed by the developer before being uploaded.

      All Android apps have to be properly signed (though it can be a self-signed cert) to be installable. There is no such thing as “sideloading unsigned apps”. (A short sketch of reading an APK’s signer cert appears after the comments.)

  5. Pingback: Android bug - Sports Forums, Sports Hoopla College Football Forums

  6. Pingback: On the Android security bug

  7. thrace says:

    “Google espouses 60 days to fix exploitable bugs and going public one week after private notification. ” — I think that’s incorrect, the link you point to talks about exploits under active attack. As you call out, this bug is not under active attack, which leaves it in a gray area, I guess.

    • Waiting around until you’re under active attack is one of those things those Google security researchers would probably make fun of, don’t you think?

      There’s always a “how bad will we look when people find out?” axis as well.

  8. Pingback: On the Android security bug

  9. Pingback: Instanews #117: In Soviet Russia, America Spies on You!
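Following up on blunden’s point about self-signed certificates: here is a minimal Java sketch of reading the signer certificate off an entry in a signed APK. The path app.apk and the entry name are placeholders, and this is the generic jar mechanism rather than Android’s installer. Android checks that an update’s cert matches the cert of the already-installed version rather than chaining it to a CA, which is why a self-signed cert is enough.

    import java.io.IOException;
    import java.io.InputStream;
    import java.security.cert.Certificate;
    import java.security.cert.X509Certificate;
    import java.util.jar.JarEntry;
    import java.util.jar.JarFile;

    // Read the signer certificate of one entry in a signed archive.
    // "app.apk" and the entry name are placeholders.
    public class WhoSignedThis {
        public static void main(String[] args) throws IOException {
            try (JarFile jar = new JarFile("app.apk", true)) {
                JarEntry dex = jar.getJarEntry("classes.dex");
                if (dex == null) {
                    System.out.println("no classes.dex entry");
                    return;
                }
                try (InputStream in = jar.getInputStream(dex)) {
                    byte[] buf = new byte[8192];
                    // Certificates are only populated after a full read.
                    while (in.read(buf) != -1) { /* drain */ }
                }
                Certificate[] certs = dex.getCertificates();
                if (certs == null) {
                    System.out.println("classes.dex is not signed");
                    return;
                }
                for (Certificate c : certs) {
                    // For a typical self-signed cert, subject == issuer.
                    System.out.println(((X509Certificate) c).getSubjectDN());
                }
            }
        }
    }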
