How to integrate RouterLink in LobeChat

Note: This documentation is provided for informational purposes only and demonstrates how to configure and use our API with third-party AI chat interfaces. Any third-party software, websites, or services mentioned are not operated, controlled, or endorsed by us.

Overview

This tutorial guides you through integrating RouterLink, WORLD3's decentralized AI model routing infrastructure, with LobeChat, an open-source multi-modal AI chat platform.

Purpose

RouterLink provides a unified API endpoint that routes requests to multiple AI model providers through a decentralized network. By integrating RouterLink with LobeChat, you gain access to a diverse range of large language models (LLMs) without managing multiple API keys or provider-specific configurations.
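To make the "unified API endpoint" idea concrete, the sketch below builds an OpenAI-style chat completion request against the RouterLink base URL used later in this guide. The `/chat/completions` path and request shape follow the OpenRouter-compatible convention and are assumptions here; the API key and model ID are placeholders, not real values.

```python
import json
import urllib.request

# Base URL from this guide; the /chat/completions path and payload shape
# follow the OpenRouter/OpenAI-compatible convention (assumed, not confirmed).
BASE_URL = "https://router-link.world3.ai/api/v1"

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder key and model ID -- substitute your own:
req = build_chat_request("YOUR_API_KEY", "provider/model-id", "Hello!")
print(req.full_url)  # https://router-link.world3.ai/api/v1/chat/completions
# To actually send the request: urllib.request.urlopen(req)
```

Because the endpoint speaks a single protocol regardless of which upstream model serves the request, only the `model` field changes when you switch providers.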

What You Will Learn

By completing this tutorial, you will:

  • Configure LobeChat to connect with RouterLink's OpenRouter-compatible API endpoint

  • Authenticate using your RouterLink API credentials

  • Add and manage AI models available through the RouterLink network

  • Execute inference requests through LobeChat's conversational interface

Prerequisites

Before proceeding, ensure you have:

  • A RouterLink API key (obtain one from the Quick Start Guide)

  • A web browser with internet access

  • Basic familiarity with API configuration concepts


Introduction to LobeChat

LobeChat is an open-source, multi-modal AI chat platform and framework that serves as a unified interface for various large language models (LLMs). Designed as a personal AI "operating system," it enables users to access, orchestrate, and interact with different AI models and tools through a single, cohesive interface.

Key features include:

  • Multi-provider model support via OpenRouter protocol

  • Plugin ecosystem for extended functionality

  • Customizable conversation management

  • Cross-platform accessibility


Step 1: Access LobeChat

Navigate to https://lobechat.com and authenticate using your preferred sign-in method (e.g., GitHub, Google, or email).


Step 2: Navigate to AI Service Provider Settings

  1. Locate and click your user avatar in the top-left corner of the interface.

  2. Select "Settings" from the dropdown menu.

  3. In the Settings panel, navigate to "AI Service Provider" in the left sidebar.

  4. Scroll down and locate the OpenRouter provider option.


Step 3: Configure the OpenRouter Provider

Configure the OpenRouter provider with your RouterLink credentials:

API Configuration Parameters

  • API Key: Your RouterLink API key from the Quick Start Guide

  • API Proxy URL: https://router-link.world3.ai/api/v1
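Before entering these values in LobeChat, you can sanity-check them directly. The sketch below constructs a request to a `GET /models` route, which OpenRouter-style APIs conventionally expose to list available models; that route is an assumption here, and the key is a placeholder.

```python
import urllib.request

# Proxy URL from the configuration table above.
API_BASE = "https://router-link.world3.ai/api/v1"

def models_request(api_key: str) -> urllib.request.Request:
    """Build a request to the (assumed) OpenRouter-style model listing route."""
    req = urllib.request.Request(f"{API_BASE}/models")
    req.add_header("Authorization", f"Bearer {api_key}")
    return req

req = models_request("YOUR_API_KEY")  # placeholder key
print(req.full_url)  # https://router-link.world3.ai/api/v1/models
# A 200 response with a JSON model list confirms the key and URL are valid:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```

If this check fails outside LobeChat, the problem is with the credentials or URL rather than with the LobeChat configuration.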

After entering the credentials, click the "+" button adjacent to "Fetch Models" to add a custom model configuration.

Model Registration

Specify the model you wish to utilize. This example demonstrates configuration for Gemini 3 Pro Image Preview.

Note: For optimal compatibility with the OpenRouter protocol, select models with the Provider designation "WORLD3 Router North America".

Model ID Format:

Retrieve the model identifier from the example code section on your desired model's documentation page, then enter the fully-qualified model ID in the custom model dialog.

Click "OK" to confirm the model configuration.
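OpenRouter-style model identifiers typically take the form `provider/model-name`; the helper below is a hypothetical sanity check for that shape, and the example ID is illustrative only (always copy the real ID from the model's documentation page).

```python
def looks_like_model_id(model_id: str) -> bool:
    """Loose sanity check for an OpenRouter-style "provider/model" identifier."""
    parts = model_id.split("/")
    # At least two non-empty segments separated by "/".
    return len(parts) >= 2 and all(p.strip() for p in parts)

# Hypothetical IDs in the provider/model shape (check the real one in the docs):
print(looks_like_model_id("google/gemini-3-pro-image-preview"))  # True
print(looks_like_model_id("gemini-3-pro"))                       # False (no provider prefix)
```

Entering only the bare model name without its provider prefix is a common cause of "model not found" errors in OpenRouter-compatible clients.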

Enable the Provider

Toggle the OpenRouter master switch in the top-right corner to enable the provider connection.


Step 4: Initiate a Conversation

  1. Click "Chat" in the top-left navigation area to access the conversation interface.

  2. Select "Just Chat" and choose your configured OpenRouter model from the model selector.

Note: To streamline model selection, navigate to Settings and disable unused model providers, leaving only your RouterLink-configured models active.


Summary

You have successfully integrated RouterLink with LobeChat. You can now:

  • Send prompts to various AI models through RouterLink's decentralized routing network

  • Switch between different models available on the RouterLink platform

  • Leverage LobeChat's interface features while utilizing RouterLink's infrastructure

For advanced configuration options, model availability, and API documentation, visit the RouterLink Documentation.


Troubleshooting

  • Authentication failed: Verify your API key is correctly copied, with no leading or trailing spaces.

  • Model not responding: Confirm the model ID matches the exact format from the RouterLink documentation.

  • Connection timeout: Check your network connectivity and ensure the API Proxy URL is correct.
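The first and third issues above can be caught before they reach LobeChat. This minimal sketch (the helper name and messages are illustrative, not part of any RouterLink tooling) flags stray whitespace in a pasted key and a mistyped proxy URL:

```python
EXPECTED_URL = "https://router-link.world3.ai/api/v1"  # documented API Proxy URL

def check_config(api_key: str, proxy_url: str) -> list[str]:
    """Return a list of configuration problems (empty list means OK)."""
    problems = []
    if api_key != api_key.strip():
        # Stray whitespace is a common artifact of copy-pasting keys.
        problems.append("API key has leading/trailing whitespace")
    if proxy_url.rstrip("/") != EXPECTED_URL:
        problems.append("API Proxy URL does not match the documented endpoint")
    return problems

print(check_config(" my-key", "https://router-link.world3.ai/api/v1"))
# ['API key has leading/trailing whitespace']
```

An empty result does not guarantee the key is valid, only that the two most common copy-paste mistakes are absent.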

For additional support, please open a ticket on Discord.
