Stop wrestling with config files! Switch LLM API keys dynamically with Spring AI, no service restarts required!


When building on large language models, have you run into these problems?

A single API key hits its usage limit, and you have to swap keys by hand to keep the service running

Different environments (dev / test / prod) need different keys, and every switch means editing config and restarting the service

A key expires and needs an emergency replacement, and you worry the downtime will hurt users

In this post we use Spring AI to switch LLM API keys dynamically, with no restarts at all, covering multi-key setups, key rotation, and environment isolation.

01

Core Principle: From "Static Configuration" to "Dynamic Retrieval"

Traditionally, the API key is hard-coded in a configuration file (such as application.yml); Spring loads it once at startup, and it can't be changed afterwards. The core idea behind dynamic switching is to move key retrieval from "read the config at startup" to "fetch it at call time", so every request picks up the latest valid key.
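The shift can be sketched in plain Java before touching Spring AI: instead of capturing the key once at construction, the client holds a resolver that is consulted on every call. This is a minimal, framework-free sketch; `ApiClient` and `DynamicKeyDemo` are hypothetical names for illustration, not Spring AI types.

```java
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Supplier;

// Hypothetical sketch: the client asks a resolver for the key on every call,
// so swapping the key takes effect immediately, with no restart.
class ApiClient {
    private final Supplier<String> keyResolver;

    ApiClient(Supplier<String> keyResolver) {
        this.keyResolver = keyResolver;
    }

    String authHeader() {
        // Resolved per request, not captured at startup
        return "Bearer " + keyResolver.get();
    }
}

public class DynamicKeyDemo {
    public static void main(String[] args) {
        AtomicReference<String> currentKey = new AtomicReference<>("sk-old");
        ApiClient client = new ApiClient(currentKey::get);

        System.out.println(client.authHeader()); // Bearer sk-old
        currentKey.set("sk-new"); // e.g. an admin updates the database row
        System.out.println(client.authHeader()); // Bearer sk-new
    }
}
```

In the rest of the article, the "resolver" is a database table of model configurations, and the rebuilt object is a cached `ChatClient` per model.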

Environment Setup

Add the dependency

Because the major vendors all follow the OpenAI API specification, pulling in Spring AI's OpenAI starter is enough to implement dynamic key switching while remaining compatible across providers.

  
<dependency>  
   <groupId>org.springframework.ai</groupId>  
   <artifactId>spring-ai-starter-model-openai</artifactId>  
   <exclusions>  
       <exclusion>  
           <artifactId>spring-ai-autoconfigure-model-openai</artifactId>  
           <groupId>org.springframework.ai</groupId>  
       </exclusion>  
       <exclusion>  
           <artifactId>spring-ai-autoconfigure-model-chat-memory</artifactId>  
           <groupId>org.springframework.ai</groupId>  
       </exclusion>  
   </exclusions>  
</dependency>

Note: the two auto-configuration modules above must be excluded; otherwise Spring will try to auto-load OpenAI settings from the configuration file at startup and fail.

Create the model configuration table

  
CREATE TABLE `llm_model_config`  (  
  `id` bigint UNSIGNED NOT NULL COMMENT 'Primary key',  
  `model_code` varchar(64) CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci NOT NULL COMMENT 'Unique model code, e.g. gpt-4o, DeepSeek',  
  `model_name` varchar(128) CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci NOT NULL COMMENT 'Model deployment name, e.g. DeepSeek',  
  `model_type` varchar(32) CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci NOT NULL COMMENT 'Model type: chat, embedding, tts, vision',  
  `api_key` varchar(512) CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci NOT NULL COMMENT 'API Key',  
  `base_url` varchar(255) CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci NOT NULL COMMENT 'API base URL; supports self-hosted deployments',  
  `completions_path` varchar(128) CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci NOT NULL DEFAULT '/v1/chat/completions' COMMENT 'Completions endpoint path',  
  `request_headers` json NULL COMMENT 'Extra request headers, e.g. X-Request-ID, stored as JSON',  
  `temperature` double NULL DEFAULT 0.7 COMMENT 'Sampling temperature, range 0.0-1.0',  
  `top_p` double NULL DEFAULT 1 COMMENT 'Nucleus sampling threshold, range 0.0-1.0',  
  `model_description` text CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci NULL COMMENT 'Model description',  
  `is_default` tinyint(1) NOT NULL DEFAULT 0 COMMENT 'Default model flag (1 = yes, 0 = no)',  
  `is_enabled` tinyint(1) NOT NULL DEFAULT 1 COMMENT 'Enabled flag / soft-delete marker (1 = enabled, 0 = disabled)',  
  `create_time` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT 'Creation time',  
  `update_time` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP COMMENT 'Last update time',  
  PRIMARY KEY (`id`) USING BTREE,  
  INDEX `idx_model_type`(`model_type` ASC) USING BTREE,  
  INDEX `idx_is_default`(`is_default` ASC) USING BTREE,  
  INDEX `idx_is_enabled`(`is_enabled` ASC) USING BTREE,  
  INDEX `idx_model_code`(`model_code` ASC) USING BTREE  
) ENGINE = InnoDB CHARACTER SET = utf8mb4 COLLATE = utf8mb4_unicode_ci COMMENT = 'LLM model configuration table' ROW_FORMAT = Dynamic;
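The `ModelConfig` entity used throughout the service layer is never shown in the article. The sketch below is a plausible plain-Java mapping of the table above; field names mirror the columns, but the real entity in the project presumably carries ORM annotations (e.g. MyBatis-Plus `@TableName`) and Lombok getters, all of which are assumptions here.

```java
import java.util.Map;

// Hypothetical POJO mirroring llm_model_config; the actual entity in the
// project likely uses MyBatis-Plus annotations and Lombok instead.
public class ModelConfig {
    private Long id;
    private String modelCode;   // e.g. "gpt-4o", "DeepSeek"
    private String modelName;   // deployment name passed to the API
    private String modelType;   // chat, embedding, tts, vision
    private String apiKey;
    private String baseUrl;
    private String completionsPath = "/v1/chat/completions";
    private Map<String, String> requestHeaders; // stored as JSON in MySQL
    private Double temperature = 0.7;
    private Double topP = 1.0;
    private Boolean isDefault = false;
    private Boolean isEnabled = true;

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getModelName() { return modelName; }
    public void setModelName(String modelName) { this.modelName = modelName; }
    public String getApiKey() { return apiKey; }
    public void setApiKey(String apiKey) { this.apiKey = apiKey; }
    public String getBaseUrl() { return baseUrl; }
    public void setBaseUrl(String baseUrl) { this.baseUrl = baseUrl; }
    public String getCompletionsPath() { return completionsPath; }
    public Map<String, String> getRequestHeaders() { return requestHeaders; }
    public Double getTemperature() { return temperature; }
    public Double getTopP() { return topP; }
    public Boolean getIsDefault() { return isDefault; }
    public void setIsDefault(Boolean isDefault) { this.isDefault = isDefault; }
    // ... remaining getters/setters follow the same pattern
}
```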

Implementing Dynamic Key Switching

Create the ILLmService interface

  
public interface ILLmService {  

    /**  
     * Get the ChatClient for the default model.  
     * @return the default ChatClient, or null if no model is configured  
     */  
    ChatClient getDefaultChatClient();  

    /**  
     * Get (or lazily build) the ChatClient for a specific model.  
     * @param model the model configuration  
     * @return the ChatClient for the given model  
     */  
    ChatClient getChatClient(ModelConfig model);  
}

Create the ILLmService implementation

  
@Service  
@Slf4j  
@RequiredArgsConstructor  
public class LLMService implements ILLmService {  

    @Override  
    public ChatClient getDefaultChatClient() {  
        return null;  
    }  

    @Override  
    public ChatClient getChatClient(ModelConfig model) {  
        return null;  
    }  

}

Filling in the implementation

1. Fetch the default model from the database; if none is flagged as default, fall back to the first model found

  
private final IModelConfigService modelService;  

private ChatClient chatClient;  

private final Map<Long, ChatClient> clients = new ConcurrentHashMap<>();  

@Override  
public ChatClient getDefaultChatClient() {  
    QueryWrapper<ModelConfig> wrapper = new QueryWrapper<>();  
    wrapper.lambda().eq(ModelConfig::getIsDefault, true);  
    ModelConfig defaultModel = modelService.getOne(wrapper);  
    if (defaultModel == null) {  
        log.error("No default model found");  
        List<ModelConfig> models = modelService.list();  
        if (models.isEmpty()) {  
            log.error("No model found");  
        } else {  
            defaultModel = models.get(0);  
            log.warn("Using first model as default: {}", defaultModel.getModelName());  
        }  
    } else {  
        log.info("Found default model: {}", defaultModel.getModelName());  
    }  
    if (defaultModel != null) {  
        return getChatClient(defaultModel);  
    }  
    log.warn("Cannot find any model; the ChatClient will be initialized once a model is configured");  
    return null;  
}

2. Fetch the client for a specific model on demand

  
@Override  
public ChatClient getChatClient(ModelConfig model) {  
    ChatClient cached = clients.get(model.getId());  
    if (cached != null) {  
        return cached;  
    }  
    return this.buildOrUpdateChatClient(model);  
}
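One caveat: a check-then-build lookup is not atomic, so two concurrent first requests for the same model can each build a client (harmless here, since the second build simply overwrites the first in the map, but it wastes work). `ConcurrentHashMap.computeIfAbsent` closes that gap, provided the build function returns the client rather than putting it into the map itself (recursive modification of the same key inside `computeIfAbsent` is not allowed). A stand-in sketch, using a string and a counter instead of a real `ChatClient` to make the effect observable:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

public class ClientCacheDemo {
    static final Map<Long, String> clients = new ConcurrentHashMap<>();
    static final AtomicInteger builds = new AtomicInteger();

    // Stand-in for buildOrUpdateChatClient: counts how many times it runs.
    // Note: it must NOT put into the map itself when used with computeIfAbsent.
    static String buildClient(Long modelId) {
        builds.incrementAndGet();
        return "client-for-" + modelId;
    }

    static String getClient(Long modelId) {
        // Atomic get-or-build: the mapping function runs at most once per key
        return clients.computeIfAbsent(modelId, ClientCacheDemo::buildClient);
    }

    public static void main(String[] args) {
        getClient(1L);
        getClient(1L); // cached, buildClient is not invoked again
        System.out.println(builds.get()); // 1
    }
}
```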

3. On startup, instantiate a ChatClient for the database's default model

  
@PostConstruct  
public void initializeChatClients() {  
    ChatClient client = getDefaultChatClient();  
    if (client != null) {  
        this.chatClient = client;  
        log.info("Initialized default chat client successfully");  
    } else {  
        log.warn("Cannot find any model; the ChatClient will be initialized once a model is configured");  
    }  
}

4. The key part: dynamically switching the API key

Use OpenAiApi to set the model's baseUrl and apiKey at runtime, build a ChatClient on top of it, and cache the instance in the clients map.

  
private ChatClient buildOrUpdateChatClient(ModelConfig model) {  
    Map<String, String> headers = model.getRequestHeaders();  
    OpenAiApi openAiApi = OpenAiApi.builder()  
            .baseUrl(model.getBaseUrl())  
            .apiKey(model.getApiKey())  
            .completionsPath(model.getCompletionsPath())  
            .build();  
    OpenAiChatOptions.Builder chatOptionsBuilder = OpenAiChatOptions.builder().model(model.getModelName());  
    if (model.getTemperature() != null) {  
        chatOptionsBuilder.temperature(model.getTemperature());  
    }  
    if (model.getTopP() != null) {  
        chatOptionsBuilder.topP(model.getTopP());  
    }  
    chatOptionsBuilder.internalToolExecutionEnabled(false);  
    OpenAiChatOptions chatOptions = chatOptionsBuilder.build();  
    if (headers != null) {  
        chatOptions.setHttpHeaders(headers);  
    }  
    OpenAiChatModel openAiChatModel = OpenAiChatModel.builder()  
            .openAiApi(openAiApi)  
            .defaultOptions(chatOptions)  
            .build();  
    ChatClient client = ChatClient.builder(openAiChatModel)  
            .defaultAdvisors(new SimpleLoggerAdvisor())  
            .build();  
    clients.put(model.getId(), client);  
    log.info("Build or update dynamic chat client for model: {}", model.getModelName());  
    return client;  
}  
  
private void initializeChatClients(ModelConfig model) {  
    this.chatClient = buildOrUpdateChatClient(model);  
}

5. Finally, use a Spring event listener so that whenever a model is updated, the cached ChatClient is rebuilt in sync

  
@EventListener  
public void onEvent(TaiChuEvent<ModelChangeEvent> event) {  
    log.info("Received ModelChangeEvent...");  
    if (event.getEntity() == null) {  
        log.warn("ModelChangeEvent entity is null");  
        return;  
    }  
    log.info("Model updated");  
    ModelConfig modelEntity = event.getEntity().getModelConfig();  
    initializeChatClients(modelEntity);  
}
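`TaiChuEvent` and `ModelChangeEvent` are project-specific classes the article never shows. A plausible minimal shape for the generic wrapper is sketched below; this is an assumption, and the real classes live in the tai-chu-agent repository. `ModelChangeEvent` would similarly be a small holder exposing `getModelConfig()`, and with Spring the event is typically delivered by publishing it through an `ApplicationEventPublisher`.

```java
// Hypothetical reconstruction of the generic event wrapper the listener
// receives; the listener unwraps the payload via getEntity().
public class TaiChuEvent<T> {
    private final T entity;

    public TaiChuEvent(T entity) {
        this.entity = entity;
    }

    public T getEntity() {
        return entity;
    }

    public static void main(String[] args) {
        TaiChuEvent<String> evt = new TaiChuEvent<>("model-updated");
        System.out.println(evt.getEntity()); // model-updated
    }
}
```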

Implementing the model update

Update the default model

  
@RestController  
@AllArgsConstructor  
@RequestMapping("/models")  
public class ModelController extends BaseController<IModelConfigService, ModelConfig> {  
    private final IModelConfigService modelConfigService;  

    /**  
     * Set the default model.  
     * @param modelId the ID of the model to mark as default  
     * @return {@link Boolean}  
     */  
    @PostMapping("/set-default/{id}")  
    public ResponseResult<Boolean> setDefault(@PathVariable("id") Long modelId) {  
        return ResponseResult.success(modelConfigService.setDefault(modelId));  
    }  
}

Set the default model and publish an event so the listener rebuilds the ChatClient

  
@Override  
@Transactional(rollbackFor = Exception.class)  
public boolean setDefault(Long modelId) {  
    // Validate parameters  
    if (modelId == null) {  
        throw new IllegalArgumentException("modelId cannot be null");  
    }  
    // Clear the current default model first  
    QueryWrapper<ModelConfig> queryWrapper = new QueryWrapper<>();  
    queryWrapper.lambda().eq(ModelConfig::getIsDefault, true);  
    ModelConfig config = this.getOne(queryWrapper);  
    if (config != null) {  
        config.setIsDefault(false);  
        this.updateById(config);  
    }  
    // Mark the new default model  
    ModelConfig modelConfig = this.getById(modelId);  
    if (modelConfig == null) {  
        throw new IllegalArgumentException("ModelConfig not found with id: " + modelId);  
    }  
    modelConfig.setIsDefault(true);  
    ModelChangeEvent modelChangeEvent = new ModelChangeEvent(modelConfig);  
    boolean updated = this.updateById(modelConfig);  
    if (!updated) {  
        throw new RuntimeException("Failed to update modelConfig with id: " + modelId);  
    } else {  
        log.info("Set model: {} as default model", modelConfig.getModelName());  
        publisher.publishChange(modelChangeEvent);  
    }  
    return updated;  
}

Getting a DeepSeek key

You can grab the DeepSeek key I've prepared for you on AI SPACE.

02

Test It!

Write a quick endpoint to try it out

  
@RestController  
@AllArgsConstructor  
@RequestMapping("/test")  
public class TestController {  
    private final LLMService llmService;  
    @GetMapping("/chat")  
    public ResponseResult<String> chat(@RequestParam String question) {  
        String content = llmService.getDefaultChatClient().prompt(question).call().content();  
        return ResponseResult.success(content);  
    }  
}

Calling the endpoint hits the default model and returns a response as expected.

After dynamically switching models, the console log confirms the active model has changed to DeepSeek-R1, and a follow-up request returns a response from the new model.

Full source code: https://gitee.com/guocjsh/tai-chu-agent

03

About AISPACE

We've pushed LLM pricing to the lowest on the web.

We've also launched a Knowledge Planet community for LLM application development.

Join the community now and get two months of free, unlimited LLM tokens.

Details in the Feishu doc:

https://aispace-api.feishu.cn/wiki/EMTnwGf4piadDJk43RUcEO4Vneh
