In a roundabout way: hackers "hacked" the iPhone through its headphones
The security of Apple's mobile operating system has become almost legendary. Despite the occasional leak, the closed nature of iOS generally keeps attackers at bay and denies them access to the data on the device. But where brute force fails, ingenuity finds a way: instead of breaking into the iPhone's software, hackers decided to take a different route. Ironically, their target this time was an ordinary wired headset.
Inquisitive minds at ANSSI, the French government agency responsible for information security, managed to remotely gain partial control of smartphones through their built-in voice assistants. To do this, the researchers exploited a simple property of headphones: when plugged in, the cable can act as an antenna, picking up radio signals that the device then interprets as voice commands from its owner. With a compact version of the transmitter the range is about two meters; with the equipment mounted in a van, the radius more than doubles.
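The principle can be illustrated in miniature: a command is amplitude-modulated onto a radio carrier, the headset cable picks it up as an induced voltage, and the phone's audio front end effectively demodulates it back into an "audio" signal. The sketch below is only a toy model of that idea; the frequencies and the crude envelope detector are illustrative assumptions, not the researchers' actual parameters.

```python
import numpy as np

fs = 200_000          # sample rate, Hz (illustrative)
carrier_hz = 20_000   # "radio" carrier, well above the audio band
t = np.arange(0, 0.05, 1 / fs)

# The spoken command, modeled here as a simple 440 Hz audio tone.
command = np.sin(2 * np.pi * 440 * t)

# The attacker transmits the command as AM on the carrier; the headset
# cable, acting as an antenna, picks this up as an induced voltage.
transmitted = (1 + 0.5 * command) * np.sin(2 * np.pi * carrier_hz * t)

# The phone's input stage effectively envelope-detects the signal:
# rectify, then low-pass filter (moving average) to strip the carrier.
rectified = np.abs(transmitted)
kernel = np.ones(50) / 50
envelope = np.convolve(rectified, kernel, mode="same")
recovered = envelope - envelope.mean()

# The recovered baseband correlates strongly with the original command,
# which the assistant would then process as if the owner had spoken it.
corr = np.corrcoef(command, recovered)[0, 1]
print(f"correlation with original command: {corr:.2f}")
```

In the real attack the payload is of course a recorded voice phrase rather than a tone, and the signal is injected at the cable's resonant frequency, but the modulate-and-demodulate chain is the same.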
It sounds like a rather menacing discovery: with a very simple kit of a laptop, a power amplifier, an antenna, and special software, the victim's device can be made to perform any action available to virtual assistants such as Siri or Google Now. And the capabilities of the latter keep growing: today assistants can send messages, post to social networks, dial any number, and so on. The researchers demonstrated their invention on an Android smartphone.
Of course, the idea is impressive, but Cupertino has an answer ready. When setting up "Hey Siri," new iPhone owners are asked to record a voice sample for later use as a reference. As it turns out, this feature prevents not only accidental but also quite deliberate use of Siri via the technique described above. Something similar exists in recent versions of Android with Google Now, but owners of older devices who are concerned about the possibility of eavesdropping may want to disable access to the voice assistant on the lock screen. On the iPhone this is easy to do in the "Touch ID & Passcode" section of the Settings menu.
It is reassuring that the probability of such an attack succeeding is low, especially given that the attacker needs to be in the immediate vicinity of the victim. On the other hand, by simulating a press of the headset button in a public place, a hacker could easily make smartphones dial premium-rate numbers or visit potentially dangerous sites, for example. For the latest iPhones this threat barely exists, but if your device stores top-secret data, the advice to limit Siri's powers could still be useful. As for ANSSI, the staff who discovered the vulnerability have already contacted Apple and Google, proposing hardware fixes: better shielding of headphone cables, or special chips that can detect and block unwanted radio signals. The researchers also noted simpler solutions, such as replacing the assistants' standard activation commands with a custom phrase, or automatic voice recognition of the owner. Neither company has responded yet, but it is possible that Apple will expand the list of models that support voice authorization in newer versions of iOS. The updated "Hey Siri" feature could be the first step in that direction.
According to Wired