I’m a real fan of Border Security, a reality show about the work of the Canada Border Services Agency. There’s a recurring storyline that viewers see often: someone enters the country, the border agent goes through their cell phone checking text and email messages, and finds that the owner of the phone is planning to work without the proper authorization.
I remember the first time I saw such an episode. I had lots of questions:
- Can they search your cell phone like that?
- Who would unlock their phone so that someone else can look through it?
- What happens if you refuse?
So, I did some digging and found this interesting document about entry to Canada. Scroll down to the section on Customs Searches and you’ll see that your device is indeed subject to search. I do remember that a few years ago there was a big concern about laptop computers going through inspection; you were encouraged to put your laptop to sleep instead of turning it off, so that you could quickly pass through the checkpoint and prove it was a legitimate computing device just by opening the lid. It was never a big deal for me since I never turn anything off. I’m just too impatient to wait for a computer to boot.
The inspiration for this post, of course, is the current battle between Apple and the FBI over unlocking the device of a shooter in San Bernardino. I’ve been reading everything that I can about this because I find it one of the more important privacy/security issues of the moment. No single resource has all the details, so I’m on a mission to read as many as I can.
The issue, as I see it, isn’t about the right to protect the information and privacy of this one person. The issue is everyone’s right to privacy with their information. It’s not cut and dried (at least here in Canada). From the document above:
“When such searches have been challenged in court, judges have typically recognized that people should have reduced expectations of privacy at border points.”
The bigger issue is whether a third party, in this case Apple, should be forced to help access that information. According to what I’ve read, the backup of information to iCloud hasn’t provided the details desired. At present, no technology exists to access the information on the locked iPhone, or at least that’s the story. Apple is being asked, and now ordered, to write one.
I think it’s somehow comforting to know that such a tool doesn’t currently exist. That adds credibility to claims about the device’s security, and it’s nice to know there isn’t one already written by hackers and in the wild, so the question comes back to the developer. How this will play out is anyone’s guess. I find it fascinating to follow. Ever the educator, though, I think it’s fodder for questions and discussion among students.
- Should Apple be forced to write the code?
- Is Apple’s code more secure than Android’s, since it’s developed in a closed shop and the source isn’t freely available? Or is it the other way around?
- Is your device more secure with a passcode, a pattern, or biometrics?
- What’s on your phone right now that you wouldn’t want revealed to others? Will you still keep it on your phone?
- Do you store your own information in the cloud? Is it secure?
- Who owns your data if you store it on the school’s fileserver?
- Should your school be able to force you to store your work in the cloud?
- If you attend a BYOD school and use your own device for work, do your teacher and school have the right to search your device?
- If Apple does eventually write the software to access this device, will it ever be used again?
- Has your thinking about your privacy or security changed since this incident?
- Have you ever read an Acceptable Use Policy?
- Have you ever read and understood the agreement for software or web services before you click “I Agree”?
Your turn. What other questions would lead to a good classroom discussion?