It has become increasingly important for Internet users to know how their information is being used. Very little of what we do on the Internet is private: because most traffic travels unencrypted, it is readable at numerous points along the way. Communications can be collected for analysis, as social media companies do for marketing purposes. Encryption is a possible solution, but it is tedious and time consuming: keys must be exchanged, slip-ups can compromise security, and users must also convince their contacts to install the required software.
Gilbert created a method intended for the average user that takes a ‘hidden-in-plain-sight’ approach to dealing with prying eyes. It appropriates a technique generally used to counter eavesdroppers: transforming outgoing messages to make them vaguer. The method was implemented as a browser-based tool for Gmail to demonstrate how it can increase privacy.
The method aims to limit the amount of information available to eavesdroppers while keeping text understandable to intended recipients. Messages are transformed by analyzing the text and replacing keywords with vaguer terms; for example, ‘New York City’ becomes ‘[location]’. The sender can approve the transformation before sending and modify the message if it is too abstract. The underlying assumption is that correspondents share enough history to decipher each other’s messages with relative ease. Ten participants were enlisted to test this assumption. Each participant, having received no training, wrote an email that was transformed using the Gmail browser plugin. The transformed email was then sent to their five most emailed personal contacts, creating a group of 40 remote participants. The intended recipients correctly interpreted the keyword in 95.2% of cases, with relative ease. Unintended human recipients, given the same task, correctly interpreted only 2.3% of the keywords and reported high stress. Testing with machines revealed that machine-learning algorithms have trouble recovering authorship information from an email corpus to which the method is applied.
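The core transformation can be illustrated with a minimal sketch. This is not Gilbert's implementation: the keyword-to-category map below is a hypothetical stand-in for whatever analysis (e.g., named-entity recognition) the real tool uses to find sensitive terms; the function names are invented for illustration.

```python
import re

# Hypothetical map from sensitive keywords to vaguer category placeholders.
# A real system would detect these automatically rather than use a fixed list.
ABSTRACTIONS = {
    "New York City": "[location]",
    "Friday": "[day]",
    "Alice": "[person]",
}

def vague_transform(message: str) -> str:
    """Replace each known keyword with its vaguer placeholder."""
    for keyword, placeholder in ABSTRACTIONS.items():
        message = re.sub(re.escape(keyword), placeholder, message)
    return message

draft = "Meet Alice in New York City on Friday."
transformed = vague_transform(draft)
print(transformed)  # -> Meet [person] in [location] on [day].
```

As described above, the sender would review output like this before sending and restore any detail that makes the message too abstract for the recipient to decode.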
The algorithm can play a vital role in extending privacy to the average user. For many, encryption is viewed as too technical, cumbersome, and perhaps unnecessary. The program lets the average user limit the amount of information available to communications analyzers without the technicalities of encryption.
Security is currently limited to the technologically adept. Exploring this method in depth could drastically change how the general public views their privacy.
Private information exchanged between friends can be hidden in plain sight by introducing vagueness that can be deciphered without complex keys.