Apple Pay arrived in the UK this week, nine months after launching in the US. All the signs are that it'll have significant uptake. The majority of the British banks are on board and, thanks to the existing widespread use of contactless card readers, the adoption hurdle is fairly low. Apple Pay is unlikely to make a difference to the day-to-day business of retailers. It's also doubtful that many people will consider it a particularly revolutionary change in how they pay for things. After all, tapping your mobile on a pay point instead of a debit card is not a massive difference. However, Apple Pay is still set to have a profound impact. It will be another landmark in the concentration of personal data in the hands of a major tech company. Like Google and Facebook, Apple knows a lot about how people lead their lives. In our new digital world, such knowledge is power - and, increasingly, money.
It's worth considering how much personal information people who use Apple products share with the tech giant: name, home address, phone number, credit card info, email address, current location, activity on platforms such as iTunes, information you share with other people, the list goes on. Apple Pay could add purchasing information to this list - potentially an invaluable source of insight into how people live.
Should we care? On one hand, consumers still have the power to limit what they share - and to control the accuracy of some of it. People don't have to buy or use Apple products; those who do are probably aware that they sacrifice some privacy. If you read the very small print in Apple's terms and conditions, it spells out what information is collected.
On the other hand, the legal protection against the exploitation of personal information is limited. Data protection legislation in the UK was drafted in the 90s, ten years before the first iPhone was launched. The concentration of so much information in one company carries the inherent risk that someone will choose to misuse this data. Edward Snowden revealed that security agencies can, with relative ease, gain access to large caches of personal information held by tech companies. When this happens, individuals have little to no recourse. If Apple decided to change its policy and share information with other companies or agencies, people would again be largely powerless to prevent it.
Having such a valuable cache of information also naturally makes Apple a very attractive target for hackers. Recent history has shown us that tech companies do not necessarily report such breaches immediately. It's fair to ask just how safe our personal information really is - and, if someone else gained access to it, whether we would ever know.
The reality is that, to effectively use certain products and services, you have to be willing to provide some personal information. It's naive to think we can have a scenario where individuals can exist online, use tech products, and still be the sole proprietors of their own data. My concern is that by allowing a few companies to hold onto so much information, we concentrate a great deal of power and responsibility in only a few hands. As Apple Pay shows us, each new innovation has the capacity to increase this power.
I'm not arguing that this should provoke a legislative response. Indeed, history shows that it is incredibly difficult to create a law that can adapt to each new tech innovation, while allowing tech businesses to continue to operate normally and still provide enough protection for individuals.
Instead, as individuals, we need to be more conscious of how much information we are providing to the same company through different services. We should be more willing to question how our data are used. From a practical perspective, it makes sense to deliberately vary the companies we get our tech services from as much as possible. Putting all our personal data in the same basket concentrates risk.
Tech companies also need to be aware of their responsibilities. It's incumbent on them to keep personal data as secure as possible and to work under strict and transparent ethical guidelines. This isn't just a moral argument; there's an essential business case for it. Misusing data or failing to keep them secure is a sure-fire way to erode consumer trust and can fatally damage a company - or even an industry. As Uncle Ben said, 'with great power comes great responsibility'.
Mike Weston is CEO of data science consultancy Profusion