How to integrate RouterLink in LobeChat
Overview
This tutorial guides you through integrating RouterLink, WORLD3's decentralized AI model routing infrastructure, with LobeChat, an open-source multi-modal AI chat platform.
Purpose
RouterLink provides a unified API endpoint that routes requests to multiple AI model providers through a decentralized network. By integrating RouterLink with LobeChat, you gain access to a diverse range of large language models (LLMs) without managing multiple API keys or provider-specific configurations.
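To make that concrete, here is a minimal sketch of what the unified endpoint looks like from code. It assumes RouterLink's API follows the OpenAI/OpenRouter chat-completions schema (the same compatibility the LobeChat integration below relies on); "provider/model-name" is a hypothetical placeholder, not a real model ID.

```python
import requests

# Assumption: RouterLink exposes an OpenRouter/OpenAI-compatible API at this
# base URL (the same values you will enter in LobeChat below).
BASE_URL = "https://router-link.world3.ai/api/v1"
API_KEY = "YOUR_ROUTERLINK_API_KEY"  # from the Quick Start Guide

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        # One endpoint, many providers: switching models is just a matter of
        # changing this ID. "provider/model-name" is a hypothetical
        # placeholder; use an ID from the RouterLink model documentation.
        "model": "provider/model-name",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Because every model sits behind the same endpoint and key, the only per-model detail you manage is the model ID in the request body.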
What You Will Learn
By completing this tutorial, you will:
Configure LobeChat to connect with RouterLink's OpenRouter-compatible API endpoint
Authenticate using your RouterLink API credentials
Add and manage AI models available through the RouterLink network
Execute inference requests through LobeChat's conversational interface
Prerequisites
Before proceeding, ensure you have:
A RouterLink API key (obtain one from the Quick Start Guide)
A web browser with internet access
Basic familiarity with API configuration concepts
Introduction to LobeChat

LobeChat is an open-source, multi-modal AI chat platform and framework that serves as a unified interface for various large language models (LLMs). Designed as a personal AI "operating system," it enables users to access, orchestrate, and interact with different AI models and tools through a single, cohesive interface.
Key features include:
Multi-provider model support via the OpenRouter protocol
Plugin ecosystem for extended functionality
Customizable conversation management
Cross-platform accessibility
Step 1: Access LobeChat
Navigate to https://lobechat.com and authenticate using your preferred sign-in method (e.g., GitHub, Google, or email).

Step 2: Navigate to AI Service Provider Settings
Locate and click your user avatar in the top-left corner of the interface.
Select "Settings" from the dropdown menu.

In the Settings panel, navigate to "AI Service Provider" in the left sidebar.
Scroll down and locate the OpenRouter provider option.

Step 3: Configure RouterLink API Connection
Configure the OpenRouter provider with your RouterLink credentials:

API Configuration Parameters
API Key: your RouterLink API key from the Quick Start Guide
API Proxy URL: https://router-link.world3.ai/api/v1
After entering the credentials, click the "+" button adjacent to "Fetch Models" to add a custom model configuration.
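If you want to see what "Fetch Models" retrieves, or to confirm your key and proxy URL before touching the UI, the same endpoint can be queried directly. This sketch assumes RouterLink follows the OpenRouter convention of a GET /models route returning a "data" array; that route name and response shape are assumptions, not documented behavior.

```python
import requests

BASE_URL = "https://router-link.world3.ai/api/v1"
API_KEY = "YOUR_ROUTERLINK_API_KEY"

# Assumption: a GET /models route exists, following the OpenRouter/OpenAI
# convention of returning {"data": [{"id": ...}, ...]}.
resp = requests.get(
    f"{BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
resp.raise_for_status()
for model in resp.json().get("data", []):
    print(model["id"])
```

A 200 response with a list of model IDs confirms that both the API key and the proxy URL are correct.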


Model Registration
Specify the model you wish to use. This example demonstrates the configuration for Gemini 3 Pro Image Preview.

Model ID Format:
Retrieve the model identifier from the example code section on your desired model's documentation page, then enter the fully-qualified model ID.

Click "OK" to confirm the model configuration.

Enable the Provider
Toggle the OpenRouter master switch in the top-right corner to enable the provider connection.

Step 4: Initiate a Conversation
Click "Chat" in the top-left navigation area to access the conversation interface.

Select "Just Chat" and choose your configured OpenRouter model from the model selector.

Summary
You have successfully integrated RouterLink with LobeChat. You can now:
Send prompts to various AI models through RouterLink's decentralized routing network
Switch between different models available on the RouterLink platform
Leverage LobeChat's interface features while utilizing RouterLink's infrastructure
For advanced configuration options, model availability, and API documentation, visit the RouterLink Documentation.
Troubleshooting
Authentication failed
Verify your API key is correctly copied without leading/trailing spaces
Model not responding
Confirm the model ID matches the exact format from the RouterLink documentation
Connection timeout
Check your network connectivity and ensure the API Proxy URL is correct
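The three failure modes above can be told apart with one quick script. This is a rough diagnostic sketch under the same assumptions as the earlier examples (an OpenRouter-style GET /models route at the proxy URL): a 401/403 points to the API key, a timeout or connection error points to the network or URL, and a successful listing that lacks your model points to the model ID.

```python
import requests

BASE_URL = "https://router-link.world3.ai/api/v1"
API_KEY = "YOUR_ROUTERLINK_API_KEY"

try:
    resp = requests.get(
        f"{BASE_URL}/models",
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
except requests.exceptions.Timeout:
    print("Connection timeout: check network connectivity and the API Proxy URL.")
except requests.exceptions.ConnectionError:
    print("Connection failed: the API Proxy URL may be wrong or unreachable.")
else:
    if resp.status_code in (401, 403):
        print("Authentication failed: re-copy your API key (no stray whitespace).")
    elif resp.ok:
        print(f"OK: endpoint reachable, {len(resp.json().get('data', []))} models listed.")
    else:
        print(f"Unexpected response: HTTP {resp.status_code}")
```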
For additional support, please open a ticket on Discord.
Links
Website: https://routerlink.world3.ai
Genesis Program (Alpha Phase): https://routerlink.world3.ai/?tab=alphaprogram
Twitter: @world3_ai
Discord: discord.gg/world3