BTW, is there any chance of a trial key (even one day)? My phone runs GrapheneOS and I'd need to see whether everything I want works (or whether I can make it work).
Maybe a beta programme?
yincrash 2 days ago [-]
Unfortunately, it's my first time using the discount code, and I didn't realize it required an in-app code to enable it. There is a new build that should now have the discount. I will extend it another week.
I like the trial idea. Will see if I can get that out quickly.
yincrash 2 days ago [-]
Free trial implemented: a 5-day trial.
subscribed 1 day ago [-]
Thanks, trial works.
Onboarding with the custom API is very tedious (I used OpenRouter: it took me 20 minutes to guess the URL and model format you expected, with plenty of 404 and 200 errors along the way).
I added the model, the connection test worked, I granted all the requested permissions and set Aide as the default assistant, but unfortunately after tapping the microphone I get "Client-side error", and after asking it something super basic via text (e.g. "what's the time now") it responded with "Provider error".
I'll probably try tomorrow with my local model so I have full visibility of the traffic, because frankly the app doesn't help much :)
Looks sleek, I hope it works too ;)
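For anyone else stuck on the same step: OpenRouter exposes an OpenAI-compatible API, so a custom endpoint generally wants the `/chat/completions` path and a `vendor/model` model id. A minimal sketch of the URL and payload shape (the model name `openai/gpt-4o-mini` is just an example; substitute your own):

```python
import json
import urllib.request

OPENROUTER_BASE = "https://openrouter.ai/api/v1"

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build an OpenAI-compatible chat-completions request: (url, headers, body)."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        # OpenRouter model ids use "vendor/model" form, e.g. "openai/gpt-4o-mini"
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, json.dumps(body).encode()

url, headers, body = build_chat_request(
    OPENROUTER_BASE, "sk-or-...", "openai/gpt-4o-mini", "what's the time now"
)
# Actually sending it requires a real key:
# req = urllib.request.Request(url, data=body, headers=headers)
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Getting the base URL right (with `/api/v1` but without `/chat/completions`, or vice versa, depending on what the app appends) is exactly the kind of thing that produces those 404s.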
yincrash 8 hours ago [-]
I appreciate the feedback. I could probably smooth out the custom API to just put in the full endpoint URL.
subscribed 1 day ago [-]
Fantastic! Thank you so much.
newsdeskx 4 days ago [-]
Does this work with purely local models through Ollama, or do you still need the Ollama server running on another machine? I've been looking for something that actually works offline for basic voice commands.
yincrash 4 days ago [-]
Still needs a server. You could run one locally if you have a model your device can handle, then point Aide at the localhost URL.
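To sketch what "point it at localhost" means: Ollama listens on port 11434 by default and also exposes an OpenAI-compatible API under `/v1` (this assumes a model such as `llama3.2` has already been pulled):

```python
import json
import urllib.request

# Ollama's default address; the /v1 prefix is its OpenAI-compatible API.
OLLAMA_BASE = "http://localhost:11434/v1"

def local_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a chat request against a locally running Ollama server."""
    url = OLLAMA_BASE + "/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )

req = local_chat_request("llama3.2", "what's the time now")
# urllib.request.urlopen(req)  # works once `ollama serve` is running
```

In the app you would then enter `http://localhost:11434/v1` (or the LAN IP of the machine running the server) as the custom endpoint.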
subscribed 3 days ago [-]
New phones can run Gemma 4 quants pretty nicely. It's a surprisingly good model. Google's Edge Gallery also offers some models to try.
subscribed 3 days ago [-]
Missed the edit window: I agree that ideally I'd have a tiny local MoE-style model able to gauge the complexity of each request, route simple requests to the instantly available local agent, and route everything else out (to one of several remote models).
It's the second day of the first week (per Google Play), and it already shows $9.99 (£8.99 in the Play Store).
I'm not saying it's expensive (feature-wise it's awesome), I'm saying it's inconsistent :)))