My family uses ours daily; they're fully integrated into our routines. They're the alarm clocks that wake us up, and how we check the weather to decide how to dress. I ask them to turn on the lights in the morning and to turn them off at night. Alexa locks my front door and closes my garage before bed. In the kitchen, every time we cook, we ask Alexa to preheat the oven or air fryer and to set timers for whatever we're making.
It's not good practice to connect your security systems to your voice assistant, especially door locks. Maybe this isn't the case in your home, but in smaller apartments it's certainly possible for a malicious actor to say "Hey {voice_assistant}, unlock front door" and gain access. It's the modern-day "open sesame".
They thought of that many years ago, before adding lock support to Alexa. You can lock the door with a simple command, but unlocking or opening requires a pin code. Nobody can get into my house by shouting through the windows.
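The asymmetric policy described above (any voice can lock, but unlocking requires a spoken PIN) can be sketched as a simple command handler. Note this is a hypothetical illustration of the design, not Alexa's actual API; the class and method names are invented.

```python
# Hypothetical sketch of an asymmetric lock policy: locking is always
# permitted by voice, but unlocking requires the correct PIN, so a
# shouted "unlock front door" from outside a window is rejected.
class LockController:
    def __init__(self, pin: str):
        self._pin = pin
        self.locked = False

    def handle(self, command: str, spoken_pin: str = "") -> str:
        if command == "lock":
            self.locked = True  # locking is low-risk, so no PIN needed
            return "Door locked."
        if command == "unlock":
            if spoken_pin != self._pin:
                # Reject unlock attempts without the PIN
                return "A PIN is required to unlock."
            self.locked = False
            return "Door unlocked."
        return "Unknown command."
```

The point of the asymmetry is that the dangerous direction (unlocking) carries an extra secret, while the safe direction (locking) stays frictionless.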
> it's certainly possible for a malicious actor to say "Hey {voice_assistant}, unlock front door" and gain access. It's the modern-day "open sesame".
I use Google Home, and it has a voice match feature that only accepts commands from enrolled voices. Some people might not like this for privacy reasons, though.
Unless they've improved it over the last couple of years, Google's voice match is pretty easy to fool: when I played a recording of my friend's voice saying "hey Google" and then completed the sentence myself, the Home Mini thought I was him and let me access his calendar, even though we have completely different voices.
Of course this is all academic. Anyone can get into a home via the windows. It's just that half of all Americans live in a home that has a gun, so getting out alive might not be as easy as getting in.