Over the course of any given day, I have at least a half dozen applications open on my Mac. Safari, Ulysses, OmniFocus, Fantastical, Spotify, and Tweetbot are all open at once while I’m working. That many windows are difficult to manage and organize on their own, but recently I’ve started watching Netflix while I work, which has made things even harder.
For a lot of people, watching video during work may seem like an unruly distraction, but to me it’s relaxing. I don’t often watch serious shows; instead, I opt for comedies like Family Guy or The Office.
It’s not difficult for me to write an article while an episode of The Office is playing, but what has proved irritating is managing the browser window. Normally, I’d open a new window with Netflix and resize and position it so that no other window overlapped it. It’s almost like a puzzle, one I have to solve every single day.
For a while it was tolerable, but it has quickly become a nuisance.
Google introduced encryption on Android in 2011, but it was buried deep within a phone’s settings. Not until late 2014 did Google begin asking customers if they wanted to encrypt their phones during the setup process.
Although 97% of Android phones offer encryption as an option, fewer than 35% of their owners were actually prompted to turn it on when they first activated the phone. Even then, not everybody chooses that extra layer of security.
“If a person walks into a Best Buy and walks out with an iPhone, it’s encrypted by default. If they walk out with an Android phone, it’s largely vulnerable to surveillance,” said Christopher Soghoian, the principal technologist at the American Civil Liberties Union.
Can’t say this is surprising.
I’ve long struggled with Mac Mail through countless bugs and performance issues. I’ve remained (mostly) loyal to the default Mail application because, simply, it’s the default Mail application.
Next to Safari, Mail is the app I use most frequently on my Mac, and thus the second most important. Up until about a year ago I got by without too many headaches, but once I started using Google Apps for business it became unbearable.
Simply put, Gmail support in Mac Mail is wretched. It continually disconnects, presents random errors and takes way too long to pull new data from Gmail.
I think safety of the public is incredibly important. Safety of our kids, safety of our families is very important. The protection of people’s data is incredibly important and so the tradeoff here is that we know that doing this could expose people to incredible vulnerabilities. This is not something that we would create. This would be bad for America and it would also set a precedent that I believe many people in America would be offended by. And so when you think about those which are knowns, compared to something that might be there, I believe we are making the right choice.
Some things are hard, some things are right. Some things are both. This is one of those things.
You can watch the video in its entirety here. It’s a long interview with a bit of redundancy, but what stuck in my mind was that Tim Cook focused on this case and, in particular, its effect on the future. He mentioned several times that this ‘backdoor’ would be the equivalent of a software cancer, one that harms not only the safety of the public but also their civil liberties.
All in all, this interview with Tim Cook was poignant, and it reminded me of why he was the right person to lead a company as important as Apple. I’m proud to say I’m an owner and advocate of their products.
I’ve been a long-time user of Day One. I love to journal and write down my daily thoughts; I find it relaxing and therapeutic. Out of the available options (of which there are plenty), Day One is the best iOS/Mac app for doing so.
Day One offers all the necessary journaling features (and no unnecessary frills) in a beautiful, well-designed interface. When I first used Day One 2, I asked myself: how can the best get better?
Turns out, Bloom Built, the developers behind Day One, knew exactly what to do – and that was to add new features without taking away from the simplicity that was the original Day One.
Matthew Panzarino, writing for TechCrunch:
The point is that the FBI is asking Apple to crack its own safe, it doesn’t matter how good the locks are if you modify them to be weak after installing them. And once the precedent is set then the opportunity is there for similar requests to be made of all billion or so active iOS devices. Hence the importance of this fight for Apple.
This is why the debate around this particular order should not focus overmuch on the technical aspects — but on the fact that the government would be weakening the security of a private company’s product, potentially impacting the civil liberties of American citizens and foreign nationals worldwide that use those products.
This is why Apple is fighting this request, because giving in would create a legal precedent which could permit more requests in the future. It’s been brought up that this request in particular may not be possible on newer iPhones, but that’s not necessarily the case:
There has been some chatter about whether these kinds of changes would even be possible with Apple’s newer devices. Those devices come equipped with Apple’s proprietary Secure Enclave, a portion of the core processing chip where private encryption keys are stored and used to secure data and to enable features like TouchID. Apple says that the things that the FBI is asking for are also possible on newer devices with the Secure Enclave. The technical solutions to the asks would be different (no specifics were provided) than they are on the iPhone 5c (and other older iPhones), but not impossible.
This solution is, frankly, unacceptable, and it’s not simply an issue of privacy: it’s one of security. A master key, contrary to conventional wisdom, is not guessable, but it can be stolen; worse, if it is stolen, no one would ever know. It would be a silent failure allowing whoever captured it to break into any device secured by the algorithm in question without those relying on it knowing anything was amiss. I can’t stress enough what a problem this is: World War II, especially in the Pacific, turned on this sort of silent cryptographic failure. And, given the sheer number of law enforcement officials that would want their hands on this key, it landing in the wrong hands would be a matter of when, not if.
What Ben is saying here is a key point in this whole situation. This ‘backdoor’ isn’t limited to one device; it can’t be. If it got into the wrong hands, it could create huge security issues for iOS users.
This is why I’m just a tiny bit worried about Tim Cook drawing such a stark line in the sand with this case: the PR optics could not possibly be worse for Apple. It’s a case of domestic terrorism with a clear cut bad guy and a warrant that no one could object to, and Apple is capable of fulfilling the request. Would it perhaps be better to cooperate in this case secure in the knowledge that the loophole the FBI is exploiting (the software-based security measures) has already been closed, and then save the rhetorical gun powder for the inevitable request to insert the sort of narrow backdoor into the disk encryption itself I just described?
Ben raises a valid question: it would have been easier, and perhaps more strategic, for Apple to quietly give in to this request and live to fight another day. That is what makes Apple’s stand so commendable: they are willing to oppose a request tied to such a public event.
Tim Cook in an open letter posted on Apple’s website:
The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.
This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.
This kind of letter is unprecedented. It personifies Apple and highlights why they are one of the most important companies in the world.
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.
We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.
I highly encourage you to read the entire letter posted here.