Facebook is on the hook again for having too much access to user data, and users deserve a detailed explanation.
If you thought that Facebook’s “record audio,” “read and edit text messages,” and other permissions were problematic, you’re not going to like the latest information about the Messenger app.
Jonathan Zdziarski, an iOS security researcher and forensics expert, played a key role in exposing a number of security issues within iOS that were unnecessary for everyday user activity. His exposure of iOS loopholes raised questions that Apple has yet to answer.
Now, Zdziarski is back with even more troubling news: Facebook Messenger for iOS is more invasive than many first believed. Sure, like most apps, it records your activity so that Facebook can target you with mobile ads that bring in more money for the company. At the same time, however, Facebook is doing far more tracking than basic user analytics require. As Zdziarski wrote in an email, “[Facebook is] using some private APIs I didn’t even know were available inside the sandbox to be able to pull out your Wi-Fi SSID (which could be used to snoop on which Wi-Fi networks you’re connected to) and are even tapping the process list for various information on the device.”
Have you ever seen the iOS prompt asking you to give Facebook access to your network? Yes, we’ve all seen it at some point, but apparently Facebook wants to know which networks you’re on. What reason does the company have to want such information? And simply saying, “Well, all companies do it” doesn’t suffice.
Zdziarski also added that Facebook’s Messenger app has “more spyware type code in it than I’ve seen in products intended specifically for enterprise surveillance” and “there is a lot of code that suggests Facebook is running analytics on nearly everything it possibly can monitor on your device.” In other words, Facebook’s getting as much feedback as it can, on a regular basis, from the devices we use that have the Messenger app installed.
As we asked about the “record audio,” “read and edit text messages,” and other creepy permissions in the Messenger app, what’s the point of it all? Why does Facebook feel the need to have its hands so deep in the cookie jar that it can cull every “cookie” available?
Facebook claims that it’s doing all this in order to improve the user experience. “These accusations are completely unjustified. Privacy is core to our approach with Messenger, and like any developer, we analyze usage trends to make our apps better, faster, and more efficient. As an example, with regard to where people tap – when we noticed that people were using the ‘like’ stickers a lot, we modified the app so that people could send them with fewer taps,” a Facebook spokesperson said.
Okay, so let’s assume that the spokesperson is telling us the truth. After all, Facebook has put usage data to good use before, tuning Messenger so that users receive messages faster than they once did. What’s so bad about Facebook trying to improve the user experience?
At the same time, why not explain this to Facebook users and keep the suspicion down? Why do things like this have to surface through security researchers, when companies like Facebook should alert users on the front end? It is reminiscent of the so-called social experiment Facebook conducted in January, after which the company added a social-experiment clause to its user policy months after the experiment was over. Why not just inform users about the experiment before it was conducted? Why not come out and say, “We’re going to test a new feature in the next few weeks for ‘x’ reason”?
Facebook says that all of the suspicion surrounding the social network is unjustified, but it doesn’t seem that way. After all, it’s no different from an individual who has to account for his alibi on the night someone was murdered. If he is innocent, why not tell the police about his whereabouts? Why shroud them in mystery and avoid answering the question? When someone dodges a direct question, it’s a sign that the individual is either ashamed of the answer or guilty. It is the possibility of guilt that makes Facebook’s answer a little too general, broad, and mysterious.
And, sadly, when we hear things like this about Facebook, and see the company’s response to these questions, it doesn’t put our minds at ease; it’s rather unnerving. After all, Facebook has been battling the American government over user data that state police departments request.
Whenever Facebook is under pressure, it turns around, resists providing the user data, and then takes to its own blog to say, “we’re protecting your interests.” If you’re looking out for our interests, Facebook, why not tell us up front what you intend to do with our data, so that when things like this happen, we can shrug them off as you do?
At the same time, however, Facebook is not alone in this. It seems as though Facebook enjoys this level of access because of the company’s partnership with Apple on iOS. When Apple decided to integrate Facebook into iOS, it essentially gave Facebook access to user data that most companies do not have. This reveals something troubling about Apple as well: the company that claims to keep user data safe and to care about user privacy is the same one allowing an excess of user data to reach companies like Facebook. And, as Facebook has shown with its social experiment, it can’t be trusted in everything.
It is an issue of trust, Facebook: just as a faithful wife expects her faithful husband to come clean and answer certain questions, we expect you to come clean about why you need so much access to our data and devices. Honesty has always been, and will forever be, the best policy.