Any experiences with a self-hosted assistant comparable to the modern Google Assistant? I'm looking for something LLM-powered that's smarter than older assistants, which would just try to call third-party tools directly and miss or misunderstand requests half the time.

I’d like integration with a mobile app so I can use it from the phone and while driving. I see Home Assistant has an Android Auto integration. Has anyone used this, or another similar option? Any glaring limitations?

  • Avid Amoeba@lemmy.ca · 4 points · 1 day ago

    HA with a local LLM on Ollama. You can integrate the Android app as the default phone assistant. I don’t think it can use a wake word on the phone though; I invoke it by holding the power button, like a walkie-talkie.

  • wildbus8979@sh.itjust.works · 20 points · 2 days ago

    Home Assistant can absolutely do that. If you’re okay with simple intent-based phrasing, it’ll do it out of the box. If you want complex understanding and reasoning, you’ll have to run a local LLM, like Llama, on top of it.
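
    To illustrate what "intent-based phrasing" looks like without any LLM: Home Assistant's built-in intents can be extended with a custom sentences file. A hedged sketch — `HassTurnOn` is a real built-in intent, but the file name, phrase, and `config` root directory here are just example values:

```shell
# Hedged sketch: teach Home Assistant's built-in intent matcher an extra
# phrasing, no LLM involved. Paths follow HA's custom_sentences convention.
mkdir -p config/custom_sentences/en
cat > config/custom_sentences/en/lights.yaml <<'EOF'
language: "en"
intents:
  HassTurnOn:            # real built-in intent for turning entities on
    data:
      - sentences:
          - "switch on [the] {name}"   # {name} matches an exposed entity
EOF
```

    With this in place, "switch on the kitchen light" matches the built-in intent directly; anything phrased less rigidly is where the LLM layer earns its keep.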

      • lyralycan@sh.itjust.works · 3 points · 2 days ago (edited)

        I don’t think there’s a straightforward way yet, like a HACS integration, but you can access Ollama from the web with open-webui and save the page to your home screen.

        Just be warned: you’ll need a lot of resources depending on which model you choose and its parameter count (4B, 7B, etc.). Gemma3 4B uses around 3 GB of storage, 0.5 GB of RAM, and 4 GB of VRAM to respond. It’s a compromise, since I can’t get replacement RAM, and it tends to be wildly inaccurate with large responses. The one I’d rather use, Dolphin-Mixtral 22B, takes 80 GB of storage and a minimum of 17 GB of RAM, the latter of which I can’t afford to take from my other services.
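
        For reference, the open-webui route above is typically run via the project's documented Docker quick start. A sketch, assuming Ollama is already running on the host at its default port (11434); the port mapping and volume name are the project's defaults:

```shell
# Sketch of open-webui's documented Docker quick start, pointed at an
# Ollama instance on the host machine. Assumes Docker is installed.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
# Then browse to http://localhost:3000 and save that page to your
# phone's home screen.
```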

    • Eager Eagle@lemmy.world (OP) · 5 points · 2 days ago

      Ah yes, I stopped watching the guy because of that and the clickbait, but he does make some interesting content sometimes.

  • James R Kirk@startrek.website · 2 points · 2 days ago

    Maybe things have improved, but the last time I tried the Home Assistant, er, assistant, it was garbage at anything other than the most basic commands given perfectly.

        • Avid Amoeba@lemmy.ca · 2 points · 1 day ago (edited)

          Install Ollama on a machine with a fast CPU or GPU and enough RAM. I currently use Qwen3, which takes 8 GB of RAM; it runs on an NVIDIA GPU, but running it on CPU is also fast enough. There’s a 4 GB version which is also decent for device control. Add the Ollama integration in Home Assistant and connect it to the Ollama instance on the other machine. Then add Ollama as the conversation agent for Home Assistant’s voice assistant, and expose the HA devices you want to be controllable. That’s about it at a high level.
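
          The steps above can be sketched as follows. The model tags and the `llm-box` hostname are assumptions (Ollama's library does carry `qwen3` in 8B and 4B variants); the Home Assistant side is done in the UI, so it appears as comments:

```shell
# On the machine with the fast CPU/GPU ("llm-box" is an example hostname):
curl -fsSL https://ollama.com/install.sh | sh   # install Ollama
ollama pull qwen3:8b    # ~8 GB model mentioned above; qwen3:4b is the smaller one

# Expose Ollama on the LAN so Home Assistant can reach it
# (by default it only listens on localhost):
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# Then in the Home Assistant UI, not the shell:
#   Settings > Devices & Services > Add Integration > Ollama
#     URL: http://llm-box:11434, model: qwen3:8b
#   Settings > Voice assistants > set Ollama as the conversation agent
#   Expose the devices you want controllable to Assist
```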

  • artyom@piefed.social · 3 up / 16 down · 2 days ago

    Multi-billion-dollar companies like Google and Apple can’t even figure this shit out; I doubt some nerd is gonna do it for free.

      • artyom@piefed.social · 1 up / 4 down · 2 days ago

        LOL, they can’t even reliably turn the lights on; WTF are you talking about?

        • Avid Amoeba@lemmy.ca · 1 point · 1 day ago

          I’m using Home Assistant with Qwen locally. It works better than any version of Google Home I’ve had: it understands me without my having to think about how I phrase things, I can ask it for one or several things at the same time, and I can even make it pretend to be Santa Claus while responding. My wife was ecstatic when she heard the “Ho-ho-ho” while asking it to turn on the coffee machine on Christmas.

        • Eager Eagle@lemmy.world (OP) · 2 up / 1 down · 2 days ago

          Maybe the last time you tried it was over 6 months ago, maybe you’re using the old Google Assistant, or idk, but it definitely works for me.

              • NarrativeBear@lemmy.world · 2 up / 1 down · 2 days ago

                Gemini is a hot pile of garbage.

                When I ask Gemini for directions, it starts giving me a definition instead of opening Maps and showing me the way. If I ask it to turn off the lights, I get a conversation, and I end up walking to the light switch myself.