"Thirty-four mins to Piha South Rd Reserve" said the Google Maps notification as I got into my car at 5:15 on a Tuesday afternoon.
It was a pre-emptive push toward a destination I had made a habit of driving to after work most days.
This was the third notification of this nature I had received that day; the first told me how long it would take to get to work, and the second how long it would take to get home.
Add to this a recent update that now has the app telling you how far you are from your parked car and you have an omniscient system that knows where you are, where you're going and how long it will take you to get there. The hypothetical microchip that was once predicted to end up in our arm has instead found our pocket – and we've put it there ourselves.
A less Orwellian view is to see each of these features as things that make it easier to get from A to B on a daily basis.
Until now, New Zealanders have been willing to hand over their personal information in exchange for convenience, seeing it as a fair price for a good service.
But the question is whether these features are starting to overstep convenience, edging into a creepier space where every targeted ad and push notification is a reminder that tech companies' algorithms know your every move.
While a general sense of apathy has always typified the issue of data security and privacy, there are indications Kiwis are increasingly concerned about what they're sharing.
According to Google data, people in New Zealand visited the company's MyAccount website almost 6 million times in 2017, about 80,000 people completed a privacy check-up at least once, and 500,000 completed a security check-up at least once.
This is important because it implies informed consent and gives tech platforms the defence that people should know what they're signing up for. It's essentially the digital version of caveat emptor, or 'buyer beware'.
As accessible as tech companies' terms and conditions might be, most users only come to understand what they've agreed to when they see it playing out in the real world.
And increasingly, our friends, co-workers and family members are sharing stories that bring issues to life.
One such story comes from NZME regional news director Edward Rooney, who recently took a day trip to Whanganui from Auckland.
When Rooney caught a taxi from a stand in the CBD to the airport, the driver offered to be waiting at the airport in the afternoon for the return ride.
Rooney sent a text to the driver's mobile number to initiate a connection – which at that point was the only link between the two.
Within minutes Rooney's mobile phone alerted him to a new potential Facebook friend – the taxi driver. They could now view each other's Facebook profiles and posts.
Asked about how this could happen, a Facebook spokesperson said the company does not log people's call or SMS text history without their permission.
But there is an opt-in feature for people using Messenger or FB Lite on Android that connects people who text each other.
Rooney doesn't recall opting into this feature.
Another common anecdote of the real and digital worlds blurring involves face-to-face conversations later manifesting as ads online.
One person who spoke to the Herald on condition of anonymity met a friend for lunch, during which soft shell crab was mentioned. Only a few hours later, a HuffPost article on soft shell crab appeared on the friend's social feed.
This could easily be derided as paranoia, but anecdotes like this have become so common they're difficult to ignore. To even the most sceptical observer, there comes a time when a series of coincidences congeals into a legitimate concern.
A researcher who recently spoke to Vice claims that keywords and phrases picked up by your phone can be accessed by third-party apps, like Instagram and Twitter, when the right permissions are enabled.
This means when you chat about needing new jeans, or plans for a holiday in Senegal, apps can plaster your timeline with adverts for clothes and deals on flights.
Peter Henway, a senior security consultant for cybersecurity firm Asterisk, told Vice our phones are constantly listening out for a wake word ("Hey Siri" or "OK Google") to activate.
"From time to time, snippets of audio do go back to [apps like Facebook's] servers but there's no official understanding what the triggers for that are," Henway said.
"Whether it's timing or location-based or usage of certain functions, [apps] are certainly pulling those microphone permissions and using those periodically. All the internals of the applications send this data in encrypted form, so it's very difficult to define the exact trigger."
He said digital companies could have a range of thousands of triggers to kickstart the process of mining your conversations for advertising opportunities.
Henway said he wouldn't be surprised if numerous apps were using trigger words to target consumers based on their conversations.
"It makes good sense from a marketing standpoint and their end-user agreements and the law both allow it, so I would assume they're doing it, but there's no way to be sure."
The point here is that if one app can be triggered with a phrase such as "OK Google", then why can't another be triggered with a reference to cat food, soft-shell crabs or deep tissue massages?
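For readers who want to see what that gate looks like in practice, here is a minimal sketch of the standard check an Android app performs before it can open the microphone. It illustrates the platform's public permissions API only – not the internals of Facebook's, Google's or any other company's apps – and the function name is our own.

```kotlin
// Minimal sketch of the standard Android microphone-permission check.
// An app that passes this check may open the microphone; when and how it
// actually does so afterwards is entirely up to the app's own code.
import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

fun ensureMicrophonePermission(activity: Activity) {
    val granted = ContextCompat.checkSelfPermission(
        activity, Manifest.permission.RECORD_AUDIO
    ) == PackageManager.PERMISSION_GRANTED

    if (!granted) {
        // Shows the familiar system dialog; the user can still decline.
        ActivityCompat.requestPermissions(
            activity, arrayOf(Manifest.permission.RECORD_AUDIO), 1
        )
    }
}
```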
When the Herald raised the concern with the New Zealand Privacy Commission, a spokesman declined to comment, saying he didn't have enough information on the issue, and noted instead that all agencies operating in New Zealand are required to abide by local law.
Asked about the legitimacy of these anecdotes, a Facebook spokesperson said the company does not listen to conversations through its apps.
"We only access your microphone if you have given our app permission and if you are actively using a specific feature that requires audio – for example recording a video or using an optional feature we introduced two years ago to include music or other audio in your status updates. We show ads based on people's interests and other profile information – not what you're talking out loud about," the spokesperson said.
This statement was mirrored by a Google spokesperson, who said the company does not use ambient sound from any device to target ads.
"If someone explicitly chooses to interact with Google—by speaking a hot word like 'Ok Google', for example—then we will translate their voice recording into text and it will be treated like any other search query," the spokesperson said.
Google and Facebook are, of course, not the only companies whose apps tap into the functionality lying dormant in every smartphone.
Mobile app developer Matthew Moulin, who works at Flipmind, recently wrote to the Herald expressing concern about a locally produced app's use of the camera on his phone.
While watching TV, with his phone sitting on the coffee table, he noticed an alert notification light up on the screen.
"I opened it to see that [the app] was trying to use my camera in the background. In fact, it has been doing this consistently every few days," he said.
Concerned, Moulin put his app development skills to work and downloaded the back-end file, which features a rundown of all the permissions the app had historically activated.
He was alarmed to discover his camera had been activated numerous times in May, despite the fact it wasn't until June 2 that he first gave the app permission to use the feature to scan a barcode on a store-bought item.
Moulin said this may well have been just a glitch in the app's design, but he expressed concern about what the camera captured while it was activated and where that footage ended up.
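For anyone inclined to do their own digging, Android's public PackageManager API will at least list which permissions an installed app has requested and which of those are currently granted. The sketch below is a generic illustration of that API – the function name is our own, it shows the current state rather than a history of use, and it is not the file Moulin examined.

```kotlin
// Generic sketch: list the permissions an installed Android app has requested
// and whether each one is currently granted. This reflects the current state
// only; it is not a log of when the permissions were actually used.
// On Android 11+ the calling app may also need a <queries> declaration in its
// manifest to be able to see other packages at all.
import android.content.Context
import android.content.pm.PackageInfo
import android.content.pm.PackageManager

fun dumpPermissions(context: Context, packageName: String) {
    val info: PackageInfo = context.packageManager
        .getPackageInfo(packageName, PackageManager.GET_PERMISSIONS)

    val requested = info.requestedPermissions ?: return
    requested.forEachIndexed { index, permission ->
        val granted = (info.requestedPermissionsFlags[index] and
            PackageInfo.REQUESTED_PERMISSION_GRANTED) != 0
        println("$permission -> ${if (granted) "granted" else "not granted"}")
    }
}
```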
The problem with digital tech is that it has capitalised on our ingrained behaviour of agreeing to terms and conditions without reading them. And the horror of this only shows its face when things go wrong.
It is analogous to buying a lemon from a car dealer. Even if we lift the hood and have a look inside, there's no guarantee we're going to spot the problems in the moment.
We'll only realise we've made a mistake in agreeing to the terms and conditions when we end up in a collision with the next big data scandal.
In this sense, every app on our mobile phone is a potential lemon. And the only question that remains is how much the lemons we bought today will cost us in the future.
How to protect your data:
Cybersecurity expert Peter Bailey, the general manager at Aura Information Security, says it's impossible to protect your data completely if you're using a mobile phone but recommends users stay vigilant in the following ways.
• Do your homework if you are going to share your location information. Decent websites should give you access to the company's privacy policy, telling you what they are doing with your data. Always read the terms and conditions so that you know what you are agreeing to.
• While there was some intention a few years ago for apps to be developed with "privacy by design", this isn't closely monitored. Many apps may leak location data, and users should therefore check their own cell phone settings – both for the phone in general and each app.
• Try not to enable location services unless you really need them, and think carefully about who is getting this information. Now that you are aware, you may be able to prevent some unauthorised use of your location data – a rough illustration of how that permission gate works follows below.
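For the technically minded, the sketch below shows the gate that stands between an Android app and your precise location. It is a generic illustration of the platform's permission check, with a function name of our own choosing: if you never grant the location permission, this check fails and the operating system won't hand the app a location fix.

```kotlin
// Generic sketch of the check an Android app must pass before it can read
// your precise location. If the user has not granted the permission, this
// returns false and the operating system will not supply a location fix.
import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import androidx.core.content.ContextCompat

fun canReadPreciseLocation(context: Context): Boolean =
    ContextCompat.checkSelfPermission(
        context, Manifest.permission.ACCESS_FINE_LOCATION
    ) == PackageManager.PERMISSION_GRANTED
```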