While much has been said of the A7 chip in the new iPhone 5S — arguably the world’s first 64-bit ARM-based system-on-a-chip in a consumer phone — its companion M7 coprocessor was surprisingly under-hyped, by both the industry media and Apple itself.
For the first time, motion sensing occurs in a separate processor, which makes constant activity tracking using the gyroscope, compass, and accelerometer sensors more power-efficient without turning on the rest of the A7 chip. This means we’ll start to see more Quantified Self (QS) tracking apps detecting steps and stair-climbing, bringing Fitbit and Jawbone capabilities to our phones. And the M7 does all this without a noticeable drain on the battery.
But that’s the sticking point: noticeable. The introduction of the M7 means Apple could collect this activity and movement data in the background without affecting our iPhone experience. Apple says that the M7 coprocessor only stores accelerometer data for up to seven days, but the capability remains.1
Activity tracking used to be a conscious, active decision. There was a process of deciding what to track, and perhaps buying a device or turning on an app to track it. We also had to remember to put on our wristbands or clip our Fitbits to our clothes.
Now, with the M7, activity tracking comes as an automatic feature on the device that most of us carry with us all day, every day (Google and Motorola’s Android-based Moto X features a similar coprocessor). The capability to track activity using the existing sensors in our smartphones was there in previous models, and apps like Moves, Human, and Saga had started to take advantage of accelerometer, gyroscope, and GPS information to turn the phones we already carry into activity trackers. But these early applications were still fairly battery-intensive.
With the M7, the phones in our pockets can keep on tracking — and possibly do more. Because where we go, so go our phones.
We have reason to be wary. We were surprised to find out that Google had been tracking our walking and bicycling activities when it first surfaced the data in Google Now cards. And when we discovered a couple of years ago that the iPhone was storing a location cache file based on cell tower and Wi-Fi network triangulation, Apple didn’t even show us that data — it had to be hacked.
It’s not so far-fetched to imagine that companies like Apple and Google would have an interest in gathering such large-scale activity data now that they have sensors in place to capture this information efficiently. Maybe they just want to know how we use our phones to deliver a better experience. Maybe they want to make better products. For Apple, large-scale activity data from all its iPhone 5S users could provide development fodder for its much-rumored wearable smartwatch. Apple already uses that data to optimize battery life: to stop network pinging when our phones haven’t moved for a while (sensing that we are likely sleeping), or to not try to pick up Wi-Fi signals as we fly past in a car (sensing that we don’t need it then).
This goes beyond context-aware performance optimization, and beyond surveillance of communications metadata. When a phone becomes a powerful body-activity tracker, it’s a whole different story.
The M7 coprocessor introduces functionality that some may instinctively identify as “creepy.” Even Apple’s own description hints at eerie omniscience: “M7 knows when you’re walking, running, or even driving…” While it’s quietly implemented within iOS, it isn’t hidden from third-party apps, which must request access through a pop-up notification and can be managed through the phone’s Privacy settings.2 But as we know, most users blindly accept these permissions.
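That opt-in flow is exposed through Apple’s Core Motion framework. As a rough sketch (in modern Swift, assuming the `CMMotionActivityManager` API that shipped with iOS 7 — the first query is what triggers the system permission prompt), a third-party app reads the M7’s data roughly like this:

```swift
import CoreMotion

let activityManager = CMMotionActivityManager()

if CMMotionActivityManager.isActivityAvailable() {
    // Live updates: the coprocessor classifies motion without waking the main CPU.
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        if activity.walking    { print("walking") }
        if activity.running    { print("running") }
        if activity.automotive { print("driving") }
    }

    // Historical query: the M7 retains roughly the last seven days of activity.
    let weekAgo = Date(timeIntervalSinceNow: -7 * 24 * 60 * 60)
    activityManager.queryActivityStarting(from: weekAgo, to: Date(), to: .main) { activities, _ in
        print("samples in the last week: \(activities?.count ?? 0)")
    }
}
```

Once the user has granted access, nothing in this code path surfaces to them again; revoking it later requires a trip to the phone’s Privacy settings.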
It all comes down to a question of agency in tracking our physical bodies.
The fact that my Fitbit tracks activity without matching it up with all my other data sources, like GPS location or my calendar, is comforting. These data silos can sometimes be frustrating when I want to query across my QS datasets, but the built-in divisions between data about my body — and data about the rest of my digital life — leave room for my intentional inquiry and interpretation.
We’re already living in a world where our clicks and queries are analyzed as signals of our digital lives; it’s the exchange we’ve made for a free, advertising-supported internet. Are we now entering a phase where the physical world — through the interface of our bodies — is also subject to surreptitious tracking in exchange for staying connected while mobile?
There aren’t yet many apps designed to take advantage of the M7’s low-power activity tracking, so coverage of the Apple announcement could only acknowledge its potential and point to the trend of “ambient intelligence.”
Of course we can get excited about the potential new applications of the latest smartphone features, but we shouldn’t let that excitement blind us to the more insidious potential uses of those same features. We put a lot of faith and trust in our favorite tech companies to do the right thing with our data, and to make our lives easier and better with it. But we need to have a critical dialogue about the uses of data that we are and are not comfortable with.
What if the M7 sensed our sedentary lives and those data points were offered to insurers for underwriting, for instance? Without some transparency about how phone manufacturers intend to use the data, or confirmation of whether they even collect it outside of a specific application’s use, we can’t be sure.
Smartphones are just the start. Sensors are showing up in more of our appliances and devices. With activity tracking on our phones, a few quantified selves turn into a quantified society … whether we are aware of it or not.
1 Correction 3:35 EST 10/10/13 This line was clarified to explain how the M7 stores data after Apple responded to the author’s request for comment.
2 Correction 3:35 EST 10/10/13 This line was clarified to explain that users give permission to track motion activity in third party apps.