How private is your smartphone keyboard?


Recent revelations about how smartphone keyboards access and use personal data have shown that not all digital conversations are private, and that user data can be put to a variety of uses.

Smartphone users are becoming increasingly conscious of risks to their data privacy – but most still misunderstand exactly how their data is leaked.

Many smartphone users have flocked to the likes of WhatsApp and Telegram – perceived safe havens for privacy thanks to their end-to-end encryption.

However, even when using such end-to-end encrypted apps, data snoopers and harvesters can still gain access to private data via an unlikely source.

By connecting with the cloud, some mobile keyboards used for streamlining and personalising typing can access and use data from your device.

Personal conversations

Anything you type – from personal conversations to passwords and credit card details – has the potential to leave your device via many keyboard apps.

Such data can be leaked whenever keyboard apps sync with the cloud. ‘Smart Suggestions’ are another security risk: to offer more intuitive suggestions, some virtual keyboards upload information as you type, which carries the risk of leaking your personal information.

There have been several notable cases of data leaks in recent years. The personal data of over 31 million users of the AI.type virtual keyboard app leaked online in 2017 after the company failed to secure the database’s server. Names, phone numbers, location data and Google searches were all found to have been leaked.

Users of another keyboard extension, SwiftKey, reported in 2016 that their keyboards were suggesting the email addresses and search phrases of other users. The bug was traced to SwiftKey’s cloud sync service, which had to be suspended.

Innovative designs

While being investigated for intrusive ads in 2017, GO Keyboard, a widely used custom Android keyboard app, was found to be collecting extensive user data, including Google account information and even the user’s location.

GO Keyboard was also found to be running external code and was connected to dozens of third-party trackers and ad networks, with estimates of the number of affected users ranging from 200 million to 1 billion.

Even Google’s own Gboard keyboard extension gives the company another avenue to harvest its users’ search queries, regardless of whether it is used in conjunction with end-to-end encryption apps.

Despite these problems, third-party keyboard apps have grown in popularity, mainly due to the improved usability, new features, innovative design themes and smart text prediction that they offer.

This means that the onus is on keyboard providers to regain the trust of their users, particularly in light of Next-Service Prediction (NSP) – the latest innovation.

Smart technology

This new smart technology suggests restaurants, bars, cafes, shops, or even brands based on what the user is typing, letting users instantly access content and information from the web and reach different apps within a single chat.

For example, offering to “go for pizza” with a friend could bring up suggestions of local pizzerias, while suggesting a “meeting next week” with a colleague could trigger your phone’s calendar.

But as such smart NSP algorithms are designed to comprehensively learn and predict user behaviour, particular care must be taken to ensure data privacy.

Fleksy is currently the only keyboard app that does not share its user data, which is retained on the device so that it can’t be leaked via the cloud. This is a standard security measure I expect users to demand from keyboard apps in the future.

Due to its privacy credentials, Fleksy has been licensed to smartphone manufacturers and governments around the world, while many other institutions are growing sceptical about the security of encrypted apps such as Telegram and WhatsApp.

Encrypted messages

In April, the French government announced its intention to switch to its own encrypted messaging service this summer, over fears that foreign entities could spy on officials using foreign-built encrypted apps that do not have servers in France.

This is almost certainly just the start of a new trend of governments, and possibly even large corporations, turning to their own messaging services to avoid the possibility of ‘data leaks’ – intentional or otherwise.

In light of growing data privacy concerns among governments, security agencies and regular smartphone users, brands must now take steps to renew trust. More and more users are both aware of and concerned by privacy issues, and as a result are becoming less willing to ignore what happens to their data behind the curtain.

People are also losing patience with companies using their data to sell them products they don’t want or, as in the case of Cambridge Analytica, to influence them in even shadier ways.

Data privacy

The days of ticking the T&Cs without reading them are disappearing, and if brands want to survive and compete, they need to respect the privacy of their customers and ensure their data is kept private.

In the meantime, as a user, take a closer look at the messaging and email apps you’re using. The first thing to check is whether they offer genuine end-to-end encryption. One good alternative to WhatsApp and Telegram – whose limitations I highlighted earlier – is Signal, which has strong encryption credentials to ensure the privacy of your conversations.

You should also make sure you review the free services offered by any app and understand what data you’re giving away in return for the service. For instance, using Google as your search engine exposes your personal data and behaviours, but alternatives, such as Qwant, respect your privacy.

It’s always worth doing some research into the many private alternatives out there.

Written by Olivier Plante, CEO, Fleksy
