iPhone location tracking is a security risk

There is no such thing as absolute privacy or security for smartphone users. The only way to stay in control is not to store information you want to keep secret on your phone in the first place.

As Apple CEO Tim Cook said last year, “The people who track on the internet know a lot more about you than if somebody’s looking in your window, a lot more.” It should make us pause to think about how we use our phones.

Apple, according to Zak Doffman, believes it is “privacy protector-in-chief,” and iOS 14 is intended to demonstrate its privacy-first approach. Doffman points to the ongoing battle between Apple and Facebook over ad tracking, remarking, “Exploitation of our personal data has become a commodity traded between the world’s largest organisations.”

However, iOS users were surprised when Apple explained its location tracking. It is an invasive feature, and as Doffman says, “a perfect illustration of just because you can, doesn’t mean you should.”

Were you aware that location tracking builds up a record of all the places you have visited, including the times, the dates, the type of transport you used to get there and how long you stayed?

Jake Moore of ESET commented, “significant locations is one of those features hidden within the privacy section which many users tend not to be familiar with. I cannot think of a positive or useful reason why Apple would include this feature on any of their devices.”

If you check out the data repository on your iPhone, you will likely see that it stores certain places, times and dates, and that is because it is trying to work out if this might be important for a photo memory or a calendar entry. But do you really want this? I agree with Doffman when he says, “I don’t need my phone tracking every single location I visit and deciding which it deems significant to save me a few seconds of effort.”

According to Apple, the device wants to “learn the places that are significant to you.” However, you can breathe a small sigh of relief when you learn that the “data is end-to-end encrypted and cannot be read by Apple.”

What this illustrates is that even though the data is encrypted, you still don’t have absolute control over the security of your iPhone. John Opdenakker, an information security expert, said, “While Apple’s encryption and device-only restriction certainly reduce the security and privacy risks, I personally switched this feature off because it doesn’t offer real benefits and just feels creepy.” He added, “What worries me from a privacy perspective is that this feature is enabled by default and that the setting is hidden away such that the average user probably doesn’t find it.”

Don’t forget that you can turn off other location-based services on your Apple device, such as ads and alerts. Want to know where to find them all? Just go to Settings > Privacy > Location Services > System Services > Significant Locations.

Would you pay a ransom for your cup of joe?

If you’re a gadget-loving person, and you enjoy your coffee, then there is a very good chance that you have a coffee machine. However, I don’t suppose you’ve ever thought it might be a cybersecurity threat.

Davey Winder, a tech journalist, points caffeine addicts in the direction of a new report by security firm Avast, which discovered that “smart coffee machines can not only be hacked but can be hacked with ransomware.”

One of Avast’s senior researchers, Martin Hron, wrote in a recent blog, “The fresh smell of ransomed coffee”, about how he proved a myth was true when he turned a “coffee maker into a dangerous machine asking for ransom by modifying the maker’s firmware.”

Proving a myth

Hron goes on to say: “I was asked to prove a myth, call it a suspicion, that the threat to IoT devices is not just to access them via a weak router or exposure to the internet, but that an IoT device itself is vulnerable and can be easily owned without owning the network or the router.”

What Hron discovered was that the coffee machine acted as a Wi-Fi access point when switched on, establishing an unencrypted, unsecured connection to its companion app. From that point he was able to explore the machine’s firmware update mechanism, finding that because the updates were unencrypted, no authentication code was required. Hron, behaving as a hacker would, then reverse engineered the firmware stored in the machine’s Android app.
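The weakness is easiest to see from the defender’s side: the machine accepted firmware with no authentication at all. Here is a minimal sketch, in Python, of the kind of signed-update check that was missing. The key name and the HMAC scheme are my own illustration, not the vendor’s actual mechanism; real devices typically use public-key signatures verified by a bootloader.

```python
import hashlib
import hmac

# Hypothetical shared secret, used here only to keep the sketch self-contained.
# A real device would verify a vendor signature with an embedded public key.
SECRET_KEY = b"vendor-provisioning-key"

def sign_firmware(image: bytes) -> bytes:
    """Vendor side: tag the firmware image before publishing it."""
    return hmac.new(SECRET_KEY, image, hashlib.sha256).digest()

def accept_update(image: bytes, tag: bytes) -> bool:
    """Device side: reject any image whose tag doesn't verify."""
    expected = hmac.new(SECRET_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

official = b"\x00firmware-v2.0"
tampered = b"\x00firmware-v2.0-with-ransomware"

tag = sign_firmware(official)
print(accept_update(official, tag))   # True
print(accept_update(tampered, tag))   # False
```

With no check like this, any image pushed over the machine’s open Wi-Fi connection is installed as-is, which is exactly what made Hron’s attack possible.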

Crypto or coffee?

Perhaps you’ll smile at what Hron tried next. He attempted to turn the coffee machine into a cryptocurrency miner, something he found would be possible in principle, but impractically slow given the machine’s CPU. What he did instead was perhaps more dramatic. Imagine your coffee machine starts making an ear-splitting noise and there is nothing you can do to stop it. Hron created a noise malfunction that could only be stopped by paying a ransom, or by pulling the plug on your morning coffee forever.
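The “impractically slow” claim is easy to sanity-check with back-of-the-envelope arithmetic. The numbers below are illustrative assumptions (a handful of hashes per second for a tiny embedded CPU, a Bitcoin-scale network), not measurements from Hron’s experiment:

```python
# Why mining on a coffee maker is futile: all figures are assumed, not measured.
machine_hashrate = 10.0      # hashes/second - tiny embedded CPU (assumption)
network_hashrate = 150e18    # hashes/second - Bitcoin-scale network (assumption)
blocks_per_day = 144         # roughly one block every 10 minutes

# Your expected share of blocks is proportional to your share of total hashrate.
share = machine_hashrate / network_hashrate
days_per_block = 1 / (share * blocks_per_day)
years_per_block = days_per_block / 365

print(f"Expected wait for one block: {years_per_block:.3g} years")
```

Under these assumptions the expected wait runs to hundreds of trillions of years, which is why Hron abandoned mining in favour of something noisier.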

A noisy attack

He effectively produced a ransomware attack that nobody could ignore. Winder writes, “The trigger for the attack was the command that connects the machine to the network, and the payload some malicious code that ‘renders the coffee maker unusable and asks for a ransom.’”

Hron also went a bit further. He inserted code that permanently turned on the hotbed and water heater as well as the coffee grinder.

If you have a coffee machine connected to the internet, you are probably safe for now, but it’s useful to know that these machines can be attacked. I do wonder, though: would you pay the ransom to have your smart coffee machine return to normal breakfast duties, or would you pull the plug and go back to an old-school method of brewing a cup of joe?

Why you should be careful about the cable you use to charge your iPhone

If your home, or your office, is anything like mine, you’ll know that it’s not unusual for family members and colleagues to borrow charging cables for their phones. What follows relates specifically to iPhones.

Zak Doffman writes: “How do you fancy an iPhone charging cable that looks like an Apple original and acts like one as well, but which will tap into a connected device and steal all its secrets, and which has its own radio transmitter to send all that stolen data over the air to a waiting attacker?” Did you ever think that could happen?

Last year the early version of the O.MG cable caused a stir at Def Con. It’s an iPhone Lightning cable that has been configured to enable remote, malicious access to a computer. It looks and works like an original Apple USB cable, but when the O.MG cable is used to connect a phone to a Mac, it enables an attacker to mount a wireless hijack of the computer.

According to its developer, Mike Grover (the MG in O.MG) in the past you could tell the difference between his cable and an original Apple model, but that is not the case now. He has been working on it over the last year and told Doffman that the result has been “a game-changing improvement to replicate the Apple original.”

As Doffman points out, this is not a major threat to us iPhone users, but it has the potential to be a problem. For example, Doffman says, “If you plug it into your Mac, it will access your computer and log your keystrokes.” It could also drop malware into your devices: “In short, you’re compromised—an attack vector usually consigned to the dark web is now openly for sale online,” Doffman states.

It is unlikely that those of us who swap cables with our kids or colleagues will be affected. But imagine that you work in a government service, the military or you’re a trade negotiator, and you travel frequently, staying in numerous hotels and waiting at airports, then you may be more at risk.

It is easy for criminals to target such iPhone users. In a 2019 article Doffman highlighted the dangers of using public USB charging points, because there is a risk that those USB sockets carry data as they deliver power: “criminals load malware onto charging stations or cables they leave plugged in.”

On the other hand, Grover has had positive feedback about the O.MG cable. He said: “I have received countless stories from people who tell us that the O.MG Cable has become one of the most powerful tools for delivering security training, and many aspiring hackers tell us how fast the UI is for rapid payload development. And that really makes it worth the effort we put into continually improving our work.” So, let’s be clear, Grover is not the bad guy here. His invention is meant as a security training aid; its focus is on logging credentials, both username and password, as they’re entered into the computer, enabling a payload to be pushed across.

However, as Grover and Doffman report, “once you open up the world of mass-produced replicas, you can start to play with power adapters and other cables, and you can shift the focus from key logging to infection, and from computers to smart devices.”

As cyber threats grow, it’s time to think more seriously about how we connect our devices and what we use to do that, wherever we are.

The cybersecurity of your front door key

Cybersecurity is one of my main interests, so when I spotted this article by Davey Winder titled “How Hackers Use Sound To Unlock the Secrets of Your Front Door Key”, I was intrigued, not least because smart houses are something of a passion of mine.

The smart lock is the risk in question, and Winder remarks that when he asked 549 security professionals whether they would use a smart lock, 400 of them said “No”, with some adding “get in the sea.”

What are the smart lock security issues?

Reports suggest that smart locks have a number of vulnerabilities, from snooping via Wi-Fi to smart hub weaknesses. One expert, Craig Young from Tripwire, found that one smart lock could easily be bypassed by a hacker with “a media access control (MAC) address and a smartphone app.” Young himself says that he generally doesn’t advise consumers to use internet-connected locks. “If the risk of strangers finding and opening your lock isn’t enough discouragement,” Young says, “just consider what you will do if you’re locked out because the lock maker got hit with ransomware or simply pushed a bad update.”

Winder poses another question: “what if hackers had figured out a way of unlocking the secrets of your actual, physical, door key just by listening to the sound it makes when being inserted into the lock?”

Hackers show how simple it is to open the door

Fortunately it was researchers, not criminals: a group of ‘hackers’ at the National University of Singapore have developed an “attack model” they call SpiKey, which infers the key shape that will open a given tumbler lock. They say SpiKey “significantly lowers the bar for an attacker” when compared to a more traditional lock-picking attack. Their methodology is surprisingly simple: it is a matter of listening for the sound of the key as it moves past the tumbler pins when the key is inserted.

The Singapore ‘hackers’ have been using “a simple smartphone to record the sound of the key being inserted, and withdrawn,” and then observing “the time between each tumbler pin click using their custom key reverse-engineering application,” as reported by Hackster.io. The group’s research paper states, “SpiKey infers the shape of the key, it is inherently robust against anti-picking features in modern locks, and grants multiple entries without leaving any traces.”
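The timing idea can be sketched as a toy model. The insertion speed and click timestamps below are hypothetical, and the real SpiKey pipeline does far more signal processing, but the core observation holds: if the key goes in at a roughly constant speed, the time between the clicks of successive pins converts directly into the distance between the ridges on the key blade.

```python
# Toy model of SpiKey's timing trick (all figures are illustrative assumptions).
insertion_speed_mm_s = 80.0  # assumed constant insertion speed, mm per second
click_times_s = [0.000, 0.025, 0.060, 0.085, 0.120]  # hypothetical pin clicks

# Each inter-click interval maps to a distance travelled along the blade.
ridge_spacing_mm = [
    (t2 - t1) * insertion_speed_mm_s
    for t1, t2 in zip(click_times_s, click_times_s[1:])
]
print(ridge_spacing_mm)
```

An attacker could match those recovered spacings against a lock manufacturer’s standard bitting specifications, which is why the researchers say the approach dramatically narrows the set of candidate keys.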

Of course, the real world presents other challenges, the biggest one being “that the current attack mode requires the threat actor to be within a few inches of the lock to make that recording,” which means they need to be literally outside your front door.

However, if you already use a smart lock, don’t panic. For the moment, a smart lock that isn’t connected to any network is still doing its job of protecting you and your property.