- Ollama Model Status Indicator in Model Selector: Instantly see which Ollama models are currently loaded via a clear indicator in the model selector, helping you stay organized and optimize local model usage.
- Unload Ollama Model Directly from Model Selector: Release memory and resources by unloading any loaded Ollama model right in the model selector, streamlining hardware management without switching pages.
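Under the hood, Ollama's documented HTTP API makes this unload operation straightforward: a generate request with `keep_alive` set to `0` evicts the model from memory, and `GET /api/ps` lists the currently loaded models. A minimal sketch of the request body (the model name is illustrative):

```python
import json

def unload_request(model: str) -> bytes:
    """Build the JSON body that asks Ollama to unload a model.

    Per the Ollama API docs, a generate request with keep_alive=0
    (and no prompt) unloads the named model from memory immediately.
    """
    return json.dumps({"model": model, "keep_alive": 0}).encode()

# POST this body to http://localhost:11434/api/generate to unload;
# GET http://localhost:11434/api/ps lists currently loaded models.
print(unload_request("llama3.2").decode())
```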
- User-Configurable Speech-to-Text Language Setting: Improve transcription accuracy by letting individual users explicitly set their preferred STT language in their settings; ideal for multilingual teams and clear audio capture.
- Granular Audio Playback Speed Control: Instead of just presets, you can now choose a precise audio speed using a numeric input, giving you complete control over playback pace in transcriptions and media reviews.
- GZip, Brotli, ZStd Compression Middleware: Enjoy significantly faster page loads and reduced bandwidth usage with new server-side compression, giving users a snappier, more efficient experience.
- Configurable Weight for BM25 in Hybrid Search: Fine-tune search relevance by adjusting the BM25 weight in hybrid search from the UI, letting you tailor knowledge search results to your workflow.
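To see what this weight controls, consider a simple weighted-sum fusion of the two retrieval signals. This is an illustrative sketch, not necessarily the exact fusion formula Open WebUI implements: it assumes both score sets are already normalized to [0, 1].

```python
def hybrid_scores(bm25: dict, vector: dict, bm25_weight: float = 0.5) -> dict:
    """Blend normalized BM25 (keyword) and vector-similarity (semantic)
    scores per document. A higher bm25_weight favors exact keyword
    matches; a lower one favors semantic similarity."""
    docs = set(bm25) | set(vector)
    return {
        doc: bm25_weight * bm25.get(doc, 0.0)
             + (1 - bm25_weight) * vector.get(doc, 0.0)
        for doc in docs
    }

# Doc "a" matches keywords strongly; doc "b" is semantically closer.
scores = hybrid_scores({"a": 1.0, "b": 0.2}, {"a": 0.3, "b": 0.9}, bm25_weight=0.8)
# With bm25_weight=0.8, the keyword-heavy doc "a" outranks "b".
```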
- Bypass File Creation with CTRL + SHIFT + V: When “Paste Large Text as File” is enabled, use CTRL + SHIFT + V to skip the file creation dialog and instantly upload the text as a file; perfect for rapid document prep.
- Bypass Web Loader in Web Search: Optionally skip web content loading and use search snippets directly, delivering faster, more reliable results when page loads are slow or blocked.
- Environment Variable WEBUI_AUTH_TRUSTED_GROUPS_HEADER: Sync and manage user groups directly via a trusted HTTP header, unlocking smoother single sign-on and identity integrations for organizations.
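In a trusted-header setup, a reverse proxy or SSO gateway authenticates the user and injects a header naming their groups, which the backend then parses. The sketch below assumes a comma-separated header value; check your proxy's documentation for the exact format it emits:

```python
def parse_trusted_groups(header_value: str) -> list[str]:
    """Split a trusted-groups header into clean group names.

    Assumes a comma-separated value such as "admins, engineering";
    empty segments and surrounding whitespace are discarded.
    """
    return [g.strip() for g in header_value.split(",") if g.strip()]

# Example: with WEBUI_AUTH_TRUSTED_GROUPS_HEADER=X-User-Groups, the
# proxy sends "X-User-Groups: admins, engineering" on each request.
print(parse_trusted_groups("admins, engineering, "))  # ['admins', 'engineering']
```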
- Workspace Models Visibility Controls: You can now hide workspace-level models from both the model selector and shared environments, keeping your team focused and reducing clutter from rarely used endpoints.
- Copy Model Link: You can now copy a direct link to any model, including those hidden from the selector, making sharing and onboarding more seamless.
- Load Function Directly from URL: Simplify custom function management; just paste any GitHub function URL into Open WebUI and import new functions in seconds.
- Custom Name/Description for External Tool Servers: Personalize and clarify external tool servers by assigning custom names and descriptions, making integrations easier to manage in large-scale workspaces.
- Custom OpenAPI JSON URL Support for Tool Servers: Specify any custom OpenAPI JSON URL, unlocking more flexible tool-call integration with any backend.
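An OpenAPI-described tool server works because each operation in the spec can be mapped to a callable tool. The hypothetical helper below illustrates that idea under the OpenAPI 3.x structure (`paths` → HTTP methods → `operationId`); it is a sketch, not Open WebUI's actual loader:

```python
def list_tool_operations(spec: dict) -> list[str]:
    """Collect operation IDs from an OpenAPI document.

    Each operationId is what a tool-calling backend would expose as a
    callable tool name; method + path is used as a fallback label.
    """
    ops = []
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            ops.append(op.get("operationId", f"{method} {path}"))
    return ops

# A tiny inline spec standing in for the JSON fetched from a custom URL.
spec = {"paths": {"/weather": {"get": {"operationId": "get_weather"}}}}
print(list_tool_operations(spec))  # ['get_weather']
```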
- Source Field Now Displayed in Non-Streaming Responses with Attachments: When files or knowledge are attached, the “source” field now appears in all responses, even in non-streaming mode, enabling an improved citation workflow.
- Pinned Chats: Reduced payload size on pinned-chat requests, leading to faster load times and less data usage, especially on busy workspaces.
- Import/Export Default Prompt Suggestions: One-click import/export of prompt suggestions makes it much easier to share, reuse, and manage best practices across teams or deployments.
- Banners Now Sortable from Admin Settings: Quickly re-order or prioritize banners, letting you highlight the most critical info for your team.
- Clearer Ollama Support Labels for Advanced Chat Parameters: Parameters and advanced settings now explicitly indicate whether they are Ollama-specific, reducing confusion and improving setup accuracy.
- Improved Scroll Bar Thumb Visibility: Enhanced scrollbar styling makes navigation more accessible and visually intuitive.
- Modal Redesign for Archived and User Chat Listings: A clean, modern modal interface for browsing archived and user-specific chats makes locating conversations faster and more pleasant.
- Add/Edit Memory Modal UX: Memory modals are now larger and have resizable input fields, supporting easier editing of long or complex memory content.
- Translation & Localization Enhancements: Major upgrades to Chinese (Simplified & Traditional), Korean, Russian, German, Danish, and Finnish; not just typo fixes, but improved consistency, tone, and terminology for a more natural native-language experience.
- General Backend Stability & Security Enhancements: Various backend refinements deliver a more resilient, reliable, and secure platform for smoother operation and peace of mind.