Top 5 Picks For The Creepiest Technologies Today

It’s that time of year again. In the spirit of Halloween, we’ve put together our top five innovations and uses of tech that are creeping us out this year. 🦇

1. Smart dust – nanotech as small as dust 💨

Smart dust is the collective term for tiny IoT sensors, some approaching the nano scale, that can be deployed by the millions or billions across sectors. While smart dust offers great opportunities for data collection and insights, it's problematic from a privacy perspective.

Smart dust is made up of microelectromechanical systems (yup, MEMS for short) that can detect, for example, light, temperature, vibration, magnetism, or chemicals.

Again, this might seem like a goldmine for the business-minded among you, as these sensors can flag problems in your supply chain very early on, problems that could have detrimental effects if they went unnoticed.

But the technology will no doubt be used for less morally sound gains. For example, small variations in temperature can impair cognitive abilities, particularly the ability to make good decisions. In the future, who’s to say smart dust won’t monitor your temperature and trigger a specific action – like targeted advertising for something you’d normally disregard – when you’re in a vulnerable state?

While everyone may be freaking out over facial recognition – and so they should – smart dust is much more intrusive.

2. Drone swarms – think robotic bees 🐝

While swarm robots can be pretty alluring (see this Reddit post), the drones, which organize themselves through interactions with each other and adapt to the situation to achieve a common goal, can no doubt be used maliciously.
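To give a flavour of how that self-organization works, here's a toy sketch (ours, not any real drone control stack) of the classic boids-style rules, where each agent steers using only what its nearby neighbours are doing: drift toward them, don't crowd them, match their heading.

```python
# Toy boids-style swarm: each agent steers using only its neighbours' positions
# and velocities -- no central controller. Purely illustrative.
import numpy as np

N, STEPS, RADIUS = 30, 200, 5.0           # agents, simulation steps, neighbour radius
rng = np.random.default_rng(0)
pos = rng.uniform(0, 50, size=(N, 2))     # 2D positions
vel = rng.uniform(-1, 1, size=(N, 2))     # 2D velocities

for _ in range(STEPS):
    for i in range(N):
        diff = pos - pos[i]
        dist = np.linalg.norm(diff, axis=1)
        mask = (dist > 0) & (dist < RADIUS)                 # local neighbours only
        if mask.any():
            cohesion   = diff[mask].mean(axis=0)                               # drift toward the group
            separation = -(diff[mask] / dist[mask, None] ** 2).sum(axis=0)     # avoid crowding
            alignment  = vel[mask].mean(axis=0) - vel[i]                       # match neighbours' heading
            vel[i] += 0.01 * cohesion + 0.05 * separation + 0.05 * alignment
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    vel *= np.minimum(1.0, 2.0 / np.maximum(speed, 1e-9))   # cap speed
    pos += vel

print("swarm spread after simulation:", pos.std(axis=0))
```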

Earlier this year, the British defense secretary said "swarm squadrons" will be used by armed forces in the coming years. The US has also been testing swarm drones, which are inspired by swarms of insects like bees and ants. Watch this space.

[Image: I said bees, not lego. Photographer: James Pond | Source: Unsplash]

3. Deep fakes – can you tell what’s real? 🖥️

Deep-fake fear is dominating the media. But just what is a deep fake? A deep fake is a *fake* image or video made using *deep* learning, a branch of AI. It uses real images to create fake ones that appear genuine.
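The article doesn't name a specific technique, but one common recipe behind face-swap deep fakes is a shared encoder with one decoder per person: encode a photo of person A, decode it with person B's decoder, and out comes B wearing A's expression. The sketch below (toy dimensions, untrained weights, our own illustration) only shows that structure, not a working deepfake pipeline.

```python
# Minimal sketch of the shared-encoder / two-decoder idea behind classic
# face-swap deepfakes. Untrained and deliberately tiny.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),                          # shared latent code
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a = Decoder()   # in a real pipeline, trained to reconstruct person A
decoder_b = Decoder()   # trained to reconstruct person B

face_a = torch.rand(1, 3, 64, 64)       # stand-in for a real photo of person A
swapped = decoder_b(encoder(face_a))    # A's expression rendered as B's face
print(swapped.shape)                    # torch.Size([1, 3, 64, 64])
```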

The most dangerous thing about deep fakes is not the fakes themselves (content has long been edited and faked, after all) but the fact that the general public will be unable to trust even genuine content. It’s their realism that sets deep fakes apart from other types of misinformation: they really do push us into a post-truth world.

The technology can be used for just about anything, from putting words into politicians' mouths to revenge porn. The true goal of misinformation is not to make you believe a lie; it’s to make you doubt the truth, writes Isabelle Roughol. And that’s exactly what deep fakes will achieve.

Can you imagine living in a paranoid world where we just can’t trust anything?

>>> Some people are even referring to deep fakes as AI clones, since your image and voice can be snapped up to create an artificial you.

4. Smart speakers – more specifically, those that can monitor your baby 👶

For smart home devices to be worthwhile and effective, they need to be constantly listening, watching and tracking you.

While smart speakers haven’t crawled into every home just yet, they do have a presence in a fair few. Now, you can install an app on them that monitors your baby’s breathing and movements as it sleeps.

BreathJunior plays white noise from the device, then records how that noise is reflected, which allows it to detect the motion of a child’s breathing. Using this method, the app can also pinpoint where in the room the baby is sleeping.
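As a rough illustration of the principle (not BreathJunior's actual algorithm, which we don't have), the toy sketch below fakes a reflected white-noise signal whose loudness is gently modulated by chest movement, then reads the breathing rate off the low-frequency peak of that loudness envelope.

```python
# Toy illustration of recovering a breathing rate from a reflected white-noise
# signal. The real system is far more sophisticated (it also localises the
# baby using the speaker's microphone array).
import numpy as np

fs = 8000                          # audio sample rate (Hz)
duration = 30                      # seconds of "recording"
t = np.arange(fs * duration) / fs

breathing_hz = 0.4                 # ~24 breaths per minute
noise = np.random.randn(t.size)    # the white noise the speaker plays
# Chest movement slightly modulates the amplitude of the reflection:
reflection = (1.0 + 0.05 * np.sin(2 * np.pi * breathing_hz * t)) * noise

# Envelope: mean absolute amplitude in 0.1 s windows (a crude demodulation).
win = fs // 10
envelope = np.abs(reflection)[: t.size // win * win].reshape(-1, win).mean(axis=1)
envelope -= envelope.mean()

# The breathing rate shows up as the strongest low-frequency peak in the envelope.
spectrum = np.abs(np.fft.rfft(envelope))
freqs = np.fft.rfftfreq(envelope.size, d=win / fs)
band = (freqs > 0.1) & (freqs < 1.0)               # plausible breathing band
estimate = freqs[band][np.argmax(spectrum[band])]
print(f"estimated breathing rate: {estimate * 60:.1f} breaths per minute")
```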

This could be a life-changing app for some families, but there’s something eerie about any tech monitoring children, especially considering IoT devices are notoriously vulnerable to hackers.

[Image: Don't stare at me, Alexa. Photographer: Rahul Chakraborty | Source: Unsplash]

5. Tech that sees through walls – um, yeah 🔎

This one’s new, but just as problematic for privacy.

Researchers at MIT have used AI and radio waves to develop a system that detects what people are doing, even behind walls and in darkness. Using the data it collects, the system can recreate the scene as 3D stick figures.

Trained on video footage paired with radio-wave representations of the same scenes, the system isn’t always accurate – but it’s roughly as accurate as the same kind of system working on fully visible scenes.
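For the curious, here's a toy sketch of the "radio heatmaps in, body keypoints out" shape of the problem. It isn't MIT's actual model (which uses FMCW radio data and learns from labels produced by a camera-based pose network); the network and dimensions below are our own stand-ins.

```python
# Toy sketch: map a stack of radio heatmaps to per-keypoint confidence maps.
# Untrained and purely illustrative of the input/output shapes involved.
import torch
import torch.nn as nn

N_KEYPOINTS = 14   # e.g. head, shoulders, elbows, wrists, hips, knees, ankles

rf_to_pose = nn.Sequential(
    nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),   # 2 channels: horizontal + vertical radio heatmaps
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, N_KEYPOINTS, 1),               # one confidence map per body keypoint
)

radio_frame = torch.rand(1, 2, 64, 64)           # synthetic stand-in for one RF snapshot
keypoint_maps = rf_to_pose(radio_frame)          # shape: (1, 14, 64, 64)

# The peak of each map is the network's guess for that joint's location --
# enough to draw the stick figures the researchers describe.
coords = [divmod(int(m.argmax()), m.shape[-1]) for m in keypoint_maps[0]]
print(coords[:3])   # (row, col) guesses for the first three keypoints (untrained, so random)
```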

Though the tech is still in MIT’s lab, there’s no doubt bad actors will soon leverage it. Smart technology is constantly collecting data about you when you’re around it. In the streets, in the workplace, and now, in the home. Today, we can opt to avoid tech and not invite it into our living rooms, but as time goes on this becomes gradually more difficult. There will be few ways to avoid surveillance, and hiding behind walls won’t be one of them.

👻So which one will you go as for Halloween?
