Bill Gates says big tech companies are inviting government regulation

I don’t necessarily disagree with Bill Gates on this: big tech companies are doing things that concern people in the government, and those concerned officials have the power to write laws forcing the tech industry to comply.

The case Gates references is the California worker who shot up a Christmas party because he was angry at his colleagues. The suspect had an iPhone (a 5C, as I recall) and he had locked it down. The FBI wanted to read his iMessages but couldn’t crack the phone’s encryption, so they asked Apple to get into the phone for them. Apple refused, citing its privacy policy, and also said it was unable to crack its own encryption even if it wanted to.

This is a tough case overall. Apple had a corporate policy of protecting its users’ data, and that policy was backed up by a security system on the phone that not only encrypted the data but would wipe it forever after enough failed unlock attempts. Apple claimed they had no way to break their own security measures. They also said they had no ‘backdoor’ into the phone. Lastly, they said that even if they could get into the phone, they wouldn’t, because of their commitment to user security.
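
To give a rough feel for how that ‘too many wrong guesses and the data is gone’ behavior works, here’s an illustrative sketch in Python. This is not Apple’s actual code; iOS has an optional erase-after-10-failed-attempts setting, and everything else here (the class, the names) is made up for the example.

```python
import os

MAX_ATTEMPTS = 10  # iOS offers an "erase data after 10 failed passcode attempts" option


class LockedPhone:
    """Toy model of a device that destroys its data after too many bad guesses."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed = 0
        self._wiped = False

    def unlock(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("data has been wiped and is unrecoverable")
        if guess == self._passcode:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            # On a real device this means destroying the encryption key,
            # which makes the encrypted data permanently unreadable.
            self._passcode = os.urandom(32).hex()
            self._wiped = True
        return False
```

That auto-wipe is exactly why the FBI couldn’t simply brute-force the passcode: a handful of wrong guesses and the evidence they wanted would be gone for good.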

This caused a big kerfuffle in the tech industry and in law enforcement. The LEO community was angry because they were investigating a mass shooting. They wanted to look at the suspect’s phone to see if he’d had any help from other people. It sounds like a logical request: help us solve the murder of 14 people by letting us into this guy’s phone.

On the other side of the road is Apple. One of the selling points of the iPhone is its security. There are passcodes, fingerprint scanners, and now facial recognition systems that keep people out of your phone and keep the data on it safe. On top of that, the data on the phone itself is encrypted. Apple itself doesn’t have access to that data, because data isn’t their business (unlike Google); hardware is.
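
Roughly speaking, the reason Apple can’t just read a locked phone’s data is that the encryption key is derived from the user’s passcode combined with a secret baked into the device’s own hardware, so the key never exists anywhere Apple can reach. Here’s a hand-wavy sketch of that idea in Python; the names, iteration count, and details are my own stand-ins, not Apple’s.

```python
import hashlib
import os

# Illustrative stand-in for a per-device secret fused into the chip at the factory.
DEVICE_UID = os.urandom(32)


def derive_data_key(passcode: str) -> bytes:
    # The key protecting the user's data depends on BOTH the passcode and the
    # device-unique secret, so it can only be derived on the device itself.
    # Neither Apple's servers nor an off-device password cracker ever see it.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 200_000)
```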

Apple has intentionally built a phone with strong, secure login features and even stronger data security underneath. They designed it in such a way that even they can’t break into it (more on this in a bit). They told the FBI all of this. The FBI’s response was to tell Apple to build something that could break into the phone without the data being wiped and lost for good. Here’s where it gets juicy: Apple said no to the FBI.

Apple (rightly so, imo) refused to develop a tool to crack the phone’s passcode and access the data on it. They had several reasons, but I’ll give you what I think are the key ones. First was the ‘contract’ they made with their customers. They sold the iPhone touting its security features. How would it look to current (and, more importantly, potential future) customers if Apple cracked its own security measures? How safe would users feel about their supposedly ‘secure’ phone? Second, breaking the passcode would tell hackers and otherwise unsavory types that the phone wasn’t as secure as previously thought. Every black-hat hacker out there would be banging away at the iPhone, trying to break in the way Apple did. It would be a worthwhile effort considering the sheer number of iPhones out in the wild; that’s a treasure trove of information for the bad guys to steal.

Last of all was an offshoot of the first two: the idea of hardware and software makers putting ‘backdoors’ into their products that would let LEOs access the data on a device by bypassing all of the manufacturer’s security features. This is the idea that sent shivers down my spine, and down a lot of other people’s too. The FBI was suggesting that a magic door, à la the back door to Smaug’s lair in The Hobbit, be built into software.

This is a TERRIBLE idea. It once again tells hackers that despite all of the layers of security on a device, there’s a secret way in. All it would take is one bad line of code, or one bad character in that line, to completely expose a device to hackers. In the end, Apple didn’t comply with the FBI’s request. The FBI ended up paying an outside company some crazy amount of money to break the passcode and get into the phone. If I recall correctly, a court ruled that the FBI didn’t have to release the name of the company it used or the amount of money it paid.
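
To make the ‘one bad line of code’ point concrete, here’s a deliberately contrived sketch (hypothetical names and values, nobody’s real product): a single hardcoded master key quietly bypasses every other protection on the device.

```python
import hmac

MASTER_KEY = "g0vt-acce55-2016"  # the mandated "backdoor" secret (purely illustrative)


def unlock(stored_passcode: str, attempt: str) -> bool:
    # This one extra condition is the entire backdoor.
    if hmac.compare_digest(attempt, MASTER_KEY):
        return True
    return hmac.compare_digest(attempt, stored_passcode)
```

The moment that string leaks, via a firmware dump, a breach, or a disgruntled insider, every device that shipped with it is wide open, and all the encryption layered on top of it no longer matters.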

Now, if this outside company could break into the phone, I have little doubt that Apple could too. They chose to stick with their principles, their customers, and their company culture, and not assist with breaking the passcode on the murder suspect’s phone.

Here’s where Bill Gates’ comments come into the story. He used this example as a possible reason why lawmakers in the US Congress would consider writing legislation forcing companies to comply with requests like the FBI’s in the shooting investigation. Lawmakers could very easily be swayed into this kind of legislation because it’s for the ‘greater good’ of society (they use the same tactic with anti-gun legislation when they ban certain types of guns). The logic goes something like this: there could be more attacks coming, and getting into the phone could identify them, or provide evidence that no other attacks were planned. It could also identify people who aided the attacker and who could be fleeing right now; the sooner you figure out who they are, the better.

My real fear is that lawmakers will require backdoors in software. That’s the most frightening thing I can think of right now. Considering how stupid most legislators are when it comes to technology (remember the ‘series of tubes’ comment?), I’m fearful of any law written by these neophytes. How can you write a law on a subject you don’t understand?

I’m not suggesting that tech companies should comply with these requests. I’m a firm believer in privacy and privacy rights. I believe in the 4th Amendment, and I believe it extends to my online activity and my phone. But we’re also living in a time of ever-expanding applications with secure logins and end-to-end encryption.

Fuck, I use the Signal messaging app for exactly that reason. It’s so secure that Edward Snowden uses it. I use it when I want to make sure my conversations can’t be tracked or monitored, and I’ve set it up to delete conversations after 15 minutes. Since messages are encrypted end to end, there’s no way anybody is going to read what I’ve been saying. But if I’m using that app, maybe terrorists and other bad guys are too. How do you combat that?
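
Signal’s real protocol (the double ratchet) is a lot more sophisticated, but the core end-to-end idea can be sketched with a public-key library like PyNaCl (assuming `pip install pynacl`): only the recipient’s private key can open the message, so whoever relays it sees nothing but ciphertext.

```python
from nacl.public import PrivateKey, Box

# Each party generates a key pair; the private halves never leave their devices.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# The server relaying this only ever sees `ciphertext`; it cannot decrypt it.
# Bob opens it with his private key and Alice's public key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```

Disappearing messages, by the way, are an app-level feature layered on top of that encryption: the clients delete the messages after the timer runs out.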

I really don’t have an answer to that question. I know that legislation wouldn’t solve the problem, but I don’t know how else the government can respond to this growing issue.

I do know that the FBI, CIA, and NSA have ever-growing capabilities to spy on people. Edward Snowden’s release of information on the NSA’s capabilities was just mind-blowing. The things they are capable of doing are the stuff of TV shows and movies, and their reach is global. The more society moves toward living a digital life, the greater the chance and the ability the alphabet agencies have of accessing your information.

I think that’s why I’m so against backdoor legislation, or laws mandating compliance with law enforcement requests to break into a secure account. These spy agencies already have so much access that I truly fear giving them more. That’s probably why I love using Signal. It’s not that I have anything to hide; I just like knowing they couldn’t read my messages if they tried.

[End of Line]
