Local LLM Access on Mobile Devices

Triplo AI now enables users to access Local LLMs directly from their mobile devices. This feature is available with all PRO multi-device licenses. To use this functionality, ensure you have the following client versions installed:

  • Triplo AI for Desktop: Version 5.0.0 or higher

  • Triplo AI for Mobile: Version 3.4.0 or higher

System Requirements

While larger LLMs require powerful computers with dedicated GPUs and substantial VRAM and RAM, smaller, simpler models can run on more modest machines, such as an Intel Core i5 with 16 GB or 32 GB of RAM. It's essential to choose a model that matches your machine's specifications.
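
If you want a quick way to gauge whether a model of a given size will fit on your machine, the Python sketch below may help. It assumes the third-party psutil package is installed (pip install psutil), and the memory figure per billion parameters is only a rough rule of thumb for 4-bit quantized models, not an official requirement.

    import psutil  # third-party package: pip install psutil

    def rough_fit(model_billions: float, gb_per_billion: float = 0.75) -> None:
        """Very rough check: would a ~4-bit quantized model of this size fit in RAM?"""
        total_gb = psutil.virtual_memory().total / (1024 ** 3)
        needed_gb = model_billions * gb_per_billion + 2.0  # ~2 GB headroom for OS and runtime
        verdict = "should fit" if total_gb >= needed_gb else "is likely too large"
        print(f"{model_billions}B model needs ~{needed_gb:.1f} GB; "
              f"you have {total_gb:.1f} GB, so it {verdict}.")

    rough_fit(2)   # e.g. Gemma 2 2B
    rough_fit(3)   # e.g. Llama 3.2 3B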

Getting Started

To make the most of this feature, follow these steps:

1. Install Required Software

Download and install either Ollama or LM Studio.

2. Download Models

Access and download models from the Ollama model library (https://ollama.com/library) or through LM Studio's built-in model search.

Recommendation: Start with smaller models like Gemma 2 2B or Llama 3.2 3B for optimal performance on standard hardware.
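
If you use Ollama, models can also be pulled through its local REST API once the server is running. The sketch below uses Ollama's /api/pull endpoint; the gemma2:2b tag is just an example and assumes that model exists in the Ollama library.

    import json
    import urllib.request

    OLLAMA_URL = "http://localhost:11434"      # default Ollama address on this machine

    def pull_model(tag: str) -> None:
        """Ask the local Ollama server to download a model by tag (blocks until done)."""
        body = json.dumps({"model": tag, "stream": False}).encode()
        # Note: older Ollama releases expect the field "name" instead of "model".
        req = urllib.request.Request(f"{OLLAMA_URL}/api/pull", data=body,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            print(tag, "->", json.load(resp).get("status"))

    pull_model("gemma2:2b")   # example tag for the small model suggested above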

3. Activate Server Options

  • Ollama: Run the command ollama serve in your Terminal

  • LM Studio: Press CTRL+R, then enable the "Serve on Local Network" option under Developer Mode / Settings
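
Before moving on, you can confirm the server is actually reachable. The sketch below queries Ollama's /api/tags endpoint and LM Studio's OpenAI-compatible /v1/models endpoint on their default local ports; adjust the host or port if your setup differs.

    import json
    import urllib.request

    # Default local endpoints; adjust the host if your server prints a LAN address instead.
    ENDPOINTS = {
        "Ollama":    "http://localhost:11434/api/tags",
        "LM Studio": "http://localhost:1234/v1/models",
    }

    for name, url in ENDPOINTS.items():
        try:
            with urllib.request.urlopen(url, timeout=3) as resp:
                json.load(resp)                       # valid JSON means the server answered
                print(f"{name}: reachable at {url}")
        except Exception as exc:                      # connection refused, timeout, etc.
            print(f"{name}: not reachable ({exc})")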

4. Copy Server URLs

Once your server is running, copy the URL it displays:

  • Ollama: Default URL is http://192.168.x.x:11434 (shown in the terminal)

  • LM Studio: URL is http://192.168.x.x:1234 (found under the Developer tab in the Developer Logs section)
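
If you are not sure which 192.168.x.x address belongs to your machine, the sketch below uses a common trick: opening a UDP socket toward a public address (no data is sent) and reading back the local address the operating system chose. The printed URLs simply combine that address with the default ports listed above.

    import socket

    def lan_ip() -> str:
        """Return this machine's LAN address as other devices on the network see it."""
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        try:
            s.connect(("8.8.8.8", 80))   # UDP connect sends no packets; it only picks a route
            return s.getsockname()[0]
        finally:
            s.close()

    ip = lan_ip()
    print(f"Ollama URL candidate:    http://{ip}:11434")   # default Ollama port
    print(f"LM Studio URL candidate: http://{ip}:1234")    # default LM Studio port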

5. Configure Triplo AI Desktop

  • In Triplo AI Desktop, navigate to Settings

  • Click on "Local LLM Settings" (the second icon next to the Licensing section)

  • Paste the copied URL into the "Local LLM Provider URL" field

  • Enable the "Enable Local LLM" option

6. Verify Available Models

The models installed on your Local LLM Provider will appear in the "Local LLM Models" list.
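
You can cross-check this list by asking the provider directly. The sketch below lists installed models from Ollama (/api/tags) and LM Studio (/v1/models), assuming the default local ports; run only the call for the provider you actually use.

    import json
    import urllib.request

    def ollama_models(base: str = "http://localhost:11434") -> list:
        """Model tags installed in Ollama."""
        with urllib.request.urlopen(f"{base}/api/tags", timeout=5) as resp:
            return [m["name"] for m in json.load(resp)["models"]]

    def lmstudio_models(base: str = "http://localhost:1234") -> list:
        """Models exposed by LM Studio's OpenAI-compatible server."""
        with urllib.request.urlopen(f"{base}/v1/models", timeout=5) as resp:
            return [m["id"] for m in json.load(resp)["data"]]

    print("Ollama:   ", ollama_models())      # comment out whichever provider you don't run
    print("LM Studio:", lmstudio_models())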

7. Share Access with Mobile Devices

To share access to the Local LLMs running on your Desktop with mobile devices under your Triplo AI License:

  • Toggle on the "Share access with mobile devices" option

  • Copy the "Encryption key"

8. Configure Triplo AI Mobile

On the Triplo AI Mobile app, navigate to Settings (gear icon at the top left).

9. Enable Shared LLMs

Enable the Shared LLMs option (the last item in the list) and tap its Settings icon (the gear icon on the right).

10. Select LLM Device

Select the LLM device you wish to use (if you have multiple devices serving models to your Triplo AI license).

11. Paste Encryption Key

Paste the same Encryption key from your Desktop (copied in step 7).

12. Access Local Models

The models running locally on your Desktop are now accessible on your mobile device, providing you with direct and secure (encrypted) access via Triplo AI Mobile.
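
As a final sanity check, you can send a short test prompt straight to the provider on your Desktop before relying on the mobile side. The sketch below targets Ollama's /api/chat endpoint; LM Studio exposes the equivalent OpenAI-style /v1/chat/completions on port 1234. The model tag is an example and must match one you actually installed.

    import json
    import urllib.request

    OLLAMA_CHAT = "http://localhost:11434/api/chat"   # LM Studio equivalent: /v1/chat/completions on port 1234

    body = json.dumps({
        "model": "gemma2:2b",                 # example tag; use one from your "Local LLM Models" list
        "messages": [{"role": "user", "content": "Reply with one word: ready?"}],
        "stream": False,                      # return a single complete JSON response
    }).encode()

    req = urllib.request.Request(OLLAMA_CHAT, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=120) as resp:
        print("Model replied:", json.load(resp)["message"]["content"])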

Supercharge Your Productivity with Triplo AI

Unlock the ultimate AI-powered productivity tool with Triplo AI, your all-in-one virtual assistant designed to streamline your daily tasks and boost efficiency. Triplo AI offers real-time assistance, content generation, smart prompts, and translations, making it the perfect solution for students, researchers, writers, and business professionals. Seamlessly integrate Triplo AI with your desktop or mobile device to generate emails, social media posts, code snippets, and more, all while breaking down language barriers with context-aware translations. Experience the future of productivity and transform your workflow with Triplo AI.

Try it risk-free today and see how it can save you time and effort.
