{"id":2147,"date":"2026-01-11T19:11:35","date_gmt":"2026-01-11T19:11:35","guid":{"rendered":"https:\/\/ai-box.eu\/?p=2147"},"modified":"2026-01-11T19:14:14","modified_gmt":"2026-01-11T19:14:14","slug":"install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models","status":"publish","type":"post","link":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/","title":{"rendered":"Install LibreChat on Gigabyte AI TOP ATOM: Professional Chat Interface for Local LLM Models"},"content":{"rendered":"<p>After showing in my previous posts how to install Ollama, Open WebUI, ComfyUI, LLaMA Factory, vLLM, and LM Studio on the <strong>Gigabyte AI TOP ATOM<\/strong>, here comes another interesting alternative for everyone looking for a professional chat interface with advanced features like RAG (Retrieval-Augmented Generation), multi-user support, and a plugin system: <strong>LibreChat<\/strong> is an open-source alternative to ChatGPT with extensive configuration options and support for local Ollama models.<\/p>\n<p>In this post, I will show you how I installed <a href=\"https:\/\/github.com\/danny-avila\/LibreChat\" target=\"_blank\" rel=\"noopener\"><strong>LibreChat<\/strong><\/a> on my Gigabyte AI TOP ATOM and configured it to work with the already running Ollama server. Since the system is based on the same platform as the <strong>NVIDIA DGX Spark<\/strong> and utilizes the ARM64 architecture (aarch64) with the NVIDIA Grace CPU, the official containers work excellently as long as you make a few specific adjustments for the ARM architecture. It didn&#8217;t work quite easily out of the box, but with my guide here, you should have everything set up in about 30 minutes. 
Most of those 30 minutes will be spent downloading the container images; at least, that&#8217;s how it was for me, since my internet isn&#8217;t that fast.<\/p>\n<p><strong>Note:<\/strong> For my experience reports here on my blog, I was loaned the Gigabyte AI TOP ATOM by the company <a href=\"https:\/\/www.mifcom.de\/\" target=\"_blank\" rel=\"noopener\">MIFCOM<\/a>.<\/p>\n<div id=\"attachment_2144\" style=\"width: 1034px\" class=\"wp-caption alignnone\"><a href=\"https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM-1024x522.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-2144\" class=\"size-large wp-image-2144\" src=\"https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM-1024x522.jpg\" alt=\"LibreChat Gigabyte AI TOP ATOM\" width=\"1024\" height=\"522\" srcset=\"https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM-1024x522.jpg 1024w, https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM-300x153.jpg 300w, https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM-768x391.jpg 768w, https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM-1536x783.jpg 1536w, https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM-2048x1044.jpg 2048w, https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM-1080x550.jpg 1080w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/a><p id=\"caption-attachment-2144\" class=\"wp-caption-text\">LibreChat Gigabyte AI TOP ATOM<\/p><\/div>\n<div id=\"ez-toc-container\" class=\"ez-toc-v2_0_82_2 counter-hierarchy ez-toc-counter ez-toc-grey ez-toc-container-direction\">\n<div class=\"ez-toc-title-container\">\n<p class=\"ez-toc-title\" style=\"cursor:inherit\">Table of Contents<\/p>\n<span class=\"ez-toc-title-toggle\"><a href=\"#\" class=\"ez-toc-pull-right ez-toc-btn ez-toc-btn-xs 
ez-toc-btn-default ez-toc-toggle\" aria-label=\"Toggle Table of Content\"><span class=\"ez-toc-js-icon-con\"><span class=\"\"><span class=\"eztoc-hide\" style=\"display:none;\">Toggle<\/span><span class=\"ez-toc-icon-toggle-span\"><svg style=\"fill: #999;color:#999\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" class=\"list-377408\" width=\"20px\" height=\"20px\" viewBox=\"0 0 24 24\" fill=\"none\"><path d=\"M6 6H4v2h2V6zm14 0H8v2h12V6zM4 11h2v2H4v-2zm16 0H8v2h12v-2zM4 16h2v2H4v-2zm16 0H8v2h12v-2z\" fill=\"currentColor\"><\/path><\/svg><svg style=\"fill: #999;color:#999\" class=\"arrow-unsorted-368013\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"10px\" height=\"10px\" viewBox=\"0 0 24 24\" version=\"1.2\" baseProfile=\"tiny\"><path d=\"M18.2 9.3l-6.2-6.3-6.2 6.3c-.2.2-.3.4-.3.7s.1.5.3.7c.2.2.4.3.7.3h11c.3 0 .5-.1.7-.3.2-.2.3-.5.3-.7s-.1-.5-.3-.7zM5.8 14.7l6.2 6.3 6.2-6.3c.2-.2.3-.5.3-.7s-.1-.5-.3-.7c-.2-.2-.4-.3-.7-.3h-11c-.3 0-.5.1-.7.3-.2.2-.3.5-.3.7s.1.5.3.7z\"\/><\/svg><\/span><\/span><\/span><\/a><\/span><\/div>\n<nav><ul class='ez-toc-list ez-toc-list-level-1 ' ><ul class='ez-toc-list-level-3' ><li class='ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-1\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#The_Basic_Idea_Professional_Chat_Interface_with_Advanced_Features\" >The Basic Idea: Professional Chat Interface with Advanced Features<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-2\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Phase_1_Check_System_Requirements\" >Phase 1: Check System Requirements<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-3\" 
href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Phase_2_Clone_LibreChat_Repository\" >Phase 2: Clone LibreChat Repository<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-4\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Phase_3_Prepare_Directories_and_Set_Permissions_Important\" >Phase 3: Prepare Directories and Set Permissions (Important!)<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-5\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Phase_4_Configure_Environment_Variables_env\" >Phase 4: Configure Environment Variables (.env)<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-6\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Phase_5_The_ARM_Fix_for_MongoDB_Meilisearch\" >Phase 5: The ARM Fix for MongoDB &amp; Meilisearch<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-7\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Phase_6_Configuration_for_Ollama_librechatyaml\" >Phase 6: Configuration for Ollama (librechat.yaml)<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-8\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Phase_7_Create_Docker_Compose_Override_File\" >Phase 7: Create Docker Compose Override File<\/a><\/li><li class='ez-toc-page-1 
ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-9\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Phase_8_Verify_Ollama_Service\" >Phase 8: Verify Ollama Service<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-10\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Phase_9_Start_LibreChat\" >Phase 9: Start LibreChat<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-11\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Phase_10_Remove_Unwanted_Endpoints_Optional\" >Phase 10: Remove Unwanted Endpoints (Optional)<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-12\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Phase_11_Access_and_First_Test\" >Phase 11: Access and First Test<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-13\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Troubleshooting_Tips\" >Troubleshooting Tips<\/a><\/li><\/ul><\/li><li class='ez-toc-page-1 ez-toc-heading-level-2'><a class=\"ez-toc-link ez-toc-heading-14\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#Summary_Conclusion\" >Summary &amp; Conclusion<\/a><\/li><\/ul><\/nav><\/div>\n<h3><span class=\"ez-toc-section\" id=\"The_Basic_Idea_Professional_Chat_Interface_with_Advanced_Features\"><\/span>The Basic Idea: 
Professional Chat Interface with Advanced Features<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>LibreChat is a fully self-hosted alternative to ChatGPT with functions like RAG for document integration and multi-user support. The system consists of several containers: an API server, MongoDB for the database, Meilisearch for search, and PostgreSQL\/pgvector for RAG functionality.<\/p>\n<p>The special thing about the AI TOP ATOM: Docker Compose automatically recognizes the <strong>aarch64 architecture<\/strong>. However, recent MongoDB releases expect specific CPU features (on x86, for example, the AVX instructions) and, in my experience, can be unstable on ARM systems, so I&#8217;ll show you the path via the more stable <strong>Version 6.0<\/strong> on the DGX Spark architecture.<\/p>\n<p><strong>What you need:<\/strong><\/p>\n<ul>\n<li>A Gigabyte AI TOP ATOM (or comparable Grace system with ARM64\/aarch64 architecture)<\/li>\n<li>Ollama already installed (see my <a href=\"https:\/\/ai-box.eu\/en\/large-language-models-en\/ollama-on-the-gigabyte-ai-top-atom-central-llm-server-for-the-entire-network\/1898\/\" target=\"_blank\" rel=\"noopener\">previous blog post<\/a>)<\/li>\n<li>Docker and Docker Compose<\/li>\n<li>Terminal access (SSH or direct)<\/li>\n<li>Git for cloning the repository<\/li>\n<li>Several Ollama models already downloaded (e.g., ministral-3:14b, qwen3:4b, qwen3:30b-thinking, qwen3-coder:30b, deepseek-ocr:latest, gpt-oss:20b)<\/li>\n<\/ul>\n<h3><span class=\"ez-toc-section\" id=\"Phase_1_Check_System_Requirements\"><\/span>Phase 1: Check System Requirements<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>First, we check in the terminal whether the architecture and Docker are correctly available:<\/p>\n<p><strong>Check architecture:<\/strong> <code>uname -m<\/code> (should output <code>aarch64<\/code>)<br \/>\n<strong>Check Docker:<\/strong> <code>docker compose version<\/code><\/p>\n<h3><span class=\"ez-toc-section\" 
id=\"Phase_2_Clone_LibreChat_Repository\"><\/span>Phase 2: Clone LibreChat Repository<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>We create a directory and download the current code from GitHub:<\/p>\n<p><strong>Command:<\/strong> <code>mkdir -p ~\/librechat &amp;&amp; cd ~\/librechat<\/code><\/p>\n<p><strong>Command:<\/strong> <code>git clone https:\/\/github.com\/danny-avila\/LibreChat.git .<\/code><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Phase_3_Prepare_Directories_and_Set_Permissions_Important\"><\/span>Phase 3: Prepare Directories and Set Permissions (Important!)<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>This step is critical to avoid permission errors on ARM systems!<\/strong><\/p>\n<p>Create the required directories:<\/p>\n<p><strong>Command: <\/strong><code>mkdir -p data\/db data\/meili logs images uploads<\/code><\/p>\n<p>Set the correct ownership and permissions:<\/p>\n<p><strong>Command:<\/strong> <code>sudo chown -R 1000:1000 data logs images uploads<\/code><\/p>\n<p><strong>Command: <\/strong><code>sudo chmod -R 775 data logs images uploads<\/code><\/p>\n<p><em>Why this is important:<\/em> Without these permissions, the MongoDB and Meilisearch containers cannot write to the data directories on ARM systems.<\/p>\n<h3><span class=\"ez-toc-section\" id=\"Phase_4_Configure_Environment_Variables_env\"><\/span>Phase 4: Configure Environment Variables (.env)<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>Copy the example environment file:<\/p>\n<p><strong>Command: <\/strong><code>cp .env.example .env<\/code><\/p>\n<p><strong>Generate security keys:<\/strong><br \/>\nFor LibreChat to start, we must fill the keys in the <code>.env<\/code>. 
This is the fastest way:<\/p>\n<p><strong>Command: <\/strong><code>sed -i \"s\/CREDS_KEY=.*\/CREDS_KEY=$(openssl rand -hex 32)\/\" .env<\/code><br \/>\n<strong>Command: <\/strong><code>sed -i \"s\/CREDS_IV=.*\/CREDS_IV=$(openssl rand -hex 16)\/\" .env<\/code><br \/>\n<strong>Command: <\/strong><code>sed -i \"s\/JWT_SECRET=.*\/JWT_SECRET=$(openssl rand -hex 32)\/\" .env<\/code><br \/>\n<strong>Command: <\/strong><code>sed -i \"s\/JWT_REFRESH_SECRET=.*\/JWT_REFRESH_SECRET=$(openssl rand -hex 32)\/\" .env<\/code><\/p>\n<p><strong>Tip for the AI TOP ATOM:<\/strong> Set your user IDs in the <code>.env<\/code> to avoid permission errors:<br \/>\n<strong>Command: <\/strong><code>echo \"UID=$(id -u)\" &gt;&gt; .env &amp;&amp; echo \"GID=$(id -g)\" &gt;&gt; .env<\/code><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Phase_5_The_ARM_Fix_for_MongoDB_Meilisearch\"><\/span>Phase 5: The ARM Fix for MongoDB &amp; Meilisearch<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>Here comes the most important part for the <strong>AI TOP ATOM<\/strong>. The standard configuration of LibreChat uses MongoDB 8.0, which often leads to crashes on the Grace CPU (ARM64). We are switching the system to the more stable version 6.0.<\/p>\n<p>Open the <code>docker-compose.yml<\/code>:<\/p>\n<p><strong>Command:<\/strong> <code>nano docker-compose.yml<\/code><\/p>\n<p>Find the <code>mongodb:<\/code> section and adjust the image and user tag:<\/p>\n<pre><code>  mongodb:\r\n    container_name: chat-mongodb\r\n    image: mongo:6.0  # IMPORTANT: Version 6.0 for ARM stability\r\n    restart: always\r\n    user: \"1000:1000\" # Prevents permission errors\r\n    volumes:\r\n      - .\/data\/db:\/data\/db\r\n    command: mongod --noauth<\/code><\/pre>\n<p>The same applies to <strong>Meilisearch<\/strong>. 
Ensure that a fixed user is also defined here so that the search indices can be written:<\/p>\n<pre><code>  meilisearch:\r\n    container_name: chat-meilisearch\r\n    image: getmeili\/meilisearch:v1.12.3\r\n    user: \"1000:1000\"\r\n    environment:\r\n      - MEILI_HOST=http:\/\/0.0.0.0:7700\r\n      - MEILI_MASTER_KEY=${MEILI_MASTER_KEY}\r\n    volumes:\r\n      - .\/data\/meili:\/meili_data<\/code><\/pre>\n<p><strong>Important changes:<\/strong><\/p>\n<ul>\n<li>Changed MongoDB image from default (8.0) to <code>mongo:6.0<\/code><\/li>\n<li>Added <code>user: \"1000:1000\"<\/code> to both services (MongoDB and Meilisearch)<\/li>\n<li>This ensures that containers run with correct permissions on ARM systems<\/li>\n<\/ul>\n<h3><span class=\"ez-toc-section\" id=\"Phase_6_Configuration_for_Ollama_librechatyaml\"><\/span>Phase 6: Configuration for Ollama (librechat.yaml)<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>We copy the example configuration and adjust it:<\/p>\n<p><strong>Command: <\/strong><code>cp librechat.example.yaml librechat.yaml<\/code><br \/>\n<strong>Command: <\/strong><code>nano librechat.yaml<\/code><\/p>\n<p>Add your Ollama models under <code>endpoints: custom:<\/code>. 
The <code>baseURL<\/code> is important:<\/p>\n<pre><code>endpoints:\r\n  custom:\r\n    # Ollama Local Models\r\n    - name: 'Ollama'\r\n      apiKey: 'ollama'\r\n      baseURL: 'http:\/\/host.docker.internal:11434\/v1\/'\r\n      models:\r\n        default:\r\n          - 'ministral-3:14b'\r\n          - 'qwen3:4b'\r\n          - 'qwen3:30b-thinking'\r\n          - 'qwen3-coder:30b'\r\n          - 'deepseek-ocr:latest'\r\n          - 'gpt-oss:20b'\r\n        fetch: true\r\n      titleConvo: true\r\n      titleModel: 'current_model'\r\n      summarize: false\r\n      summaryModel: 'current_model'\r\n      forcePrompt: false\r\n      modelDisplayLabel: 'Ollama'<\/code><\/pre>\n<p><em>Note: <code>host.docker.internal<\/code> is mandatory so that the container can reach the Ollama service on the host.<\/em><\/p>\n<p><strong>Important configuration notes:<\/strong><\/p>\n<ul>\n<li><code>baseURL<\/code>: Must use <code>host.docker.internal<\/code> instead of <code>localhost<\/code> when LibreChat runs in Docker<\/li>\n<li><code>apiKey<\/code>: Set to &#8216;ollama&#8217; (required, but ignored by Ollama)<\/li>\n<li><code>fetch: true<\/code>: Automatically discovers new models added to Ollama<\/li>\n<li><code>titleModel: 'current_model'<\/code>: Prevents multiple models from being loaded simultaneously<\/li>\n<\/ul>\n<h3><span class=\"ez-toc-section\" id=\"Phase_7_Create_Docker_Compose_Override_File\"><\/span>Phase 7: Create Docker Compose Override File<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>The standard <code>docker-compose.yml<\/code> of LibreChat does not mount the <code>librechat.yaml<\/code> by default. 
We create an override file to include it:<\/p>\n<p><strong>Create file:<\/strong> <code>docker-compose.override.yaml<\/code><\/p>\n<pre><code># Docker Compose override file to mount librechat.yaml configuration\r\nservices:\r\n  api:\r\n    volumes:\r\n      # Mount the librechat.yaml configuration file\r\n      - .\/librechat.yaml:\/app\/librechat.yaml<\/code><\/pre>\n<p><em>Why this was necessary:<\/em> Without this mount, LibreChat could not read the configuration file and displayed the error: <code>ENOENT: no such file or directory, open '\/app\/librechat.yaml'<\/code><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Phase_8_Verify_Ollama_Service\"><\/span>Phase 8: Verify Ollama Service<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>Check if Ollama is running and accessible:<br \/>\n<strong>Command: <\/strong><code>systemctl status ollama<\/code><br \/>\n<strong>Command: <\/strong><code>ollama list<\/code><br \/>\n<strong>Command: <\/strong><code>curl http:\/\/localhost:11434\/api\/tags<\/code><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Phase_9_Start_LibreChat\"><\/span>Phase 9: Start LibreChat<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>After correcting the paths and versions, we restart the container structure. 
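<\/p>\n<p><strong>Optional check:<\/strong> Before starting the stack, you can also test the OpenAI-compatible endpoint that LibreChat will talk to (the same API that the <code>baseURL<\/code> in <code>librechat.yaml<\/code> points at). The model name here is only an example; use one from your <code>ollama list<\/code> output:<\/p>\n<p><strong>Command: <\/strong><code>curl http:\/\/localhost:11434\/v1\/chat\/completions -H \"Content-Type: application\/json\" -d '{\"model\": \"qwen3:4b\", \"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]}'<\/code><\/p>\n<p>If this returns a JSON answer, the Ollama side is ready.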
Docker Compose will now automatically download the appropriate <strong>ARM64 images<\/strong> for the selected versions:<\/p>\n<p><strong>Command:<\/strong> <code>docker compose up -d<\/code><\/p>\n<p>Check the status after about 20 seconds:<\/p>\n<p><strong>Command:<\/strong> <code>docker compose ps<\/code><\/p>\n<p>If a friendly <code>Up<\/code> appears in the <strong>STATUS<\/strong> column for all containers (api, mongodb, meilisearch, rag_api, vectordb), the installation was successful.<\/p>\n<p><strong>If the start failed initially:<\/strong><br \/>\nIf the <code>librechat.yaml<\/code> was not mounted, restart after creating the override file:<br \/>\n<code>docker compose down<\/code><br \/>\n<code>docker compose up -d<\/code><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Phase_10_Remove_Unwanted_Endpoints_Optional\"><\/span>Phase 10: Remove Unwanted Endpoints (Optional)<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>If you don&#8217;t need other providers (Groq, Mistral, OpenRouter, Helicone, Portkey), remove them from the <code>librechat.yaml<\/code>:<\/p>\n<p><strong>Edit file:<\/strong><\/p>\n<p><strong>Command: <\/strong> <code>nano librechat.yaml<\/code><\/p>\n<p>Remove all custom endpoint configurations except Ollama from the <code>endpoints.custom<\/code> section.<\/p>\n<p>Restart to apply the changes:<br \/>\n<strong>Command: <\/strong><code>docker compose restart api<\/code><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Phase_11_Access_and_First_Test\"><\/span>Phase 11: Access and First Test<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>Open in your browser: <code>http:\/\/&lt;IP-ADDRESS-ATOM&gt;:3080<\/code> (or <code>http:\/\/localhost:3080<\/code>). Create the first account (this will automatically be Admin). You will now find your Ollama instances under the models. 
A short test like &#8220;Why is the sky blue?&#8221; will immediately show you the power of local GPU acceleration.<\/p>\n<h3><span class=\"ez-toc-section\" id=\"Troubleshooting_Tips\"><\/span>Troubleshooting Tips<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<ul>\n<li><strong>Permission Denied:<\/strong> Check if the <code>data<\/code> and <code>logs<\/code> folders really belong to user 1000 (<code>sudo chown -R 1000:1000 data logs images uploads<\/code>).<\/li>\n<li><strong>MongoDB Loop:<\/strong> If MongoDB keeps restarting, ensure you are using version 6.0 in the YAML and that no old data residues remain in the <code>data<\/code> folder.<\/li>\n<li><strong>Ollama Endpoint does not appear:<\/strong> Check if the <code>librechat.yaml<\/code> is correctly mounted. Check logs with <code>docker compose logs api | grep -i ollama<\/code>.<\/li>\n<li><strong>Containers do not start:<\/strong> Check if all directories exist and permissions are correct. If necessary, delete old containers and volumes: <code>docker compose down -v<\/code> (Caution: deletes data!).<\/li>\n<\/ul>\n<h2><span class=\"ez-toc-section\" id=\"Summary_Conclusion\"><\/span>Summary &amp; Conclusion<span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p>LibreChat on the Gigabyte AI TOP ATOM is an extremely powerful combination. Thanks to the 128GB of unified memory of the Grace Blackwell platform (or the massive VRAM of other AI TOP setups), you can use even huge models with a professional interface in a team. 
The RAG functionality also makes the system the ideal local knowledge base for sensitive documents.<\/p>\n<p><strong>The most important success factors on ARM systems:<\/strong><\/p>\n<ol>\n<li>Use MongoDB Version 6.0 instead of 8.0<\/li>\n<li>Set directory permissions correctly before starting<\/li>\n<li>Add user tags in docker-compose.yml to the services<\/li>\n<li>Ensure all required directories exist with correct ownership rights<\/li>\n<\/ol>\n<p><strong>Changed\/Created Files:<\/strong><\/p>\n<ul>\n<li><code>librechat.yaml<\/code> &#8211; Added Ollama endpoint configuration, removed other providers<\/li>\n<li><code>docker-compose.override.yaml<\/code> &#8211; Created to mount the config file<\/li>\n<li><code>docker-compose.yml<\/code> &#8211; Changed MongoDB to version 6.0 and added user tags for ARM stability<\/li>\n<li><code>.env<\/code> &#8211; Security keys generated and UID\/GID added<\/li>\n<\/ul>\n<p><strong>Running Docker Containers:<\/strong><\/p>\n<ul>\n<li>LibreChat (api container) &#8211; Port 3080<\/li>\n<li>MongoDB 6.0 (chat-mongodb) &#8211; ARM-stable version<\/li>\n<li>Meilisearch (chat-meilisearch) &#8211; With correct user permissions<\/li>\n<li>PostgreSQL with pgvector (vectordb)<\/li>\n<li>RAG API (rag_api)<\/li>\n<\/ul>\n<p>Good luck experimenting! Have you used LibreChat for RAG with your own PDFs yet? 
Let me know in the comments!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>After showing in my previous posts how to install Ollama, Open WebUI, ComfyUI, LLaMA Factory, vLLM, and LM Studio on the Gigabyte AI TOP ATOM, here comes another interesting alternative for everyone looking for a professional chat interface with advanced features like RAG (Retrieval-Augmented Generation), multi-user support, and a plugin system: LibreChat is an open-source [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":2145,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_et_pb_use_builder":"","_et_pb_old_content":"","_et_gb_content_width":"","footnotes":""},"categories":[873,162,50],"tags":[920,973,978,786,972,977,975,974,306,976,307,794],"class_list":["post-2147","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-gigabyte-ai-top-atom","category-large-language-models-en","category-top-story-en","tag-ai-top-atom-tutorial","tag-arm64","tag-docker-compose","tag-gigabyte-ai-top-atom","tag-librechat","tag-local-llm-interface","tag-mongodb-6-0","tag-nvidia-grace","tag-ollama-en","tag-open-source-chatgpt-alternative","tag-rag-en","tag-self-hosted-ai","et-has-post-format-content","et_post_format-et-post-format-standard"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.3 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Install LibreChat on Gigabyte AI TOP ATOM: Professional Chat Interface for Local LLM Models - Exploring the Future: Inside the AI Box<\/title>\n<meta name=\"description\" content=\"Learn how to install LibreChat on the Gigabyte AI TOP ATOM. 
This guide covers ARM64 optimization, MongoDB fixes for Grace CPUs, and connecting local Ollama models.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Install LibreChat on Gigabyte AI TOP ATOM: Professional Chat Interface for Local LLM Models - Exploring the Future: Inside the AI Box\" \/>\n<meta property=\"og:description\" content=\"Learn how to install LibreChat on the Gigabyte AI TOP ATOM. This guide covers ARM64 optimization, MongoDB fixes for Grace CPUs, and connecting local Ollama models.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/\" \/>\n<meta property=\"og:site_name\" content=\"Exploring the Future: Inside the AI Box\" \/>\n<meta property=\"article:published_time\" content=\"2026-01-11T19:11:35+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-01-11T19:14:14+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"2553\" \/>\n\t<meta property=\"og:image:height\" content=\"1301\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Maker\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@Ingmar_Stapel\" \/>\n<meta name=\"twitter:site\" content=\"@Ingmar_Stapel\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Maker\" 
\/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"8 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/\"},\"author\":{\"name\":\"Maker\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/#\\\/schema\\\/person\\\/cc91d08618b3feeef6926591b465eab1\"},\"headline\":\"Install LibreChat on Gigabyte AI TOP ATOM: Professional Chat Interface for Local LLM Models\",\"datePublished\":\"2026-01-11T19:11:35+00:00\",\"dateModified\":\"2026-01-11T19:14:14+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/\"},\"wordCount\":1199,\"commentCount\":0,\"image\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/ai-box.eu\\\/wp-content\\\/uploads\\\/2026\\\/01\\\/LibreChat_Gigabyte_AI_TOP_ATOM.jpg\",\"keywords\":[\"AI TOP ATOM Tutorial\",\"ARM64\",\"Docker Compose\",\"Gigabyte AI TOP ATOM\",\"LibreChat\",\"local LLM interface\",\"MongoDB 6.0\",\"NVIDIA Grace\",\"Ollama\",\"open-source ChatGPT alternative\",\"RAG\",\"Self-hosted AI\"],\"articleSection\":[\"Gigabyte AI TOP ATOM\",\"Large Language Models\",\"Top 
story\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/\",\"url\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/\",\"name\":\"Install LibreChat on Gigabyte AI TOP ATOM: Professional Chat Interface for Local LLM Models - Exploring the Future: Inside the AI Box\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/ai-box.eu\\\/wp-content\\\/uploads\\\/2026\\\/01\\\/LibreChat_Gigabyte_AI_TOP_ATOM.jpg\",\"datePublished\":\"2026-01-11T19:11:35+00:00\",\"dateModified\":\"2026-01-11T19:14:14+00:00\",\"author\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/#\\\/schema\\\/person\\\/cc91d08618b3feeef6926591b465eab1\"},\"description\":\"Learn how to install LibreChat on the Gigabyte AI TOP ATOM. 
This guide covers ARM64 optimization, MongoDB fixes for Grace CPUs, and connecting local Ollama models.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/#primaryimage\",\"url\":\"https:\\\/\\\/ai-box.eu\\\/wp-content\\\/uploads\\\/2026\\\/01\\\/LibreChat_Gigabyte_AI_TOP_ATOM.jpg\",\"contentUrl\":\"https:\\\/\\\/ai-box.eu\\\/wp-content\\\/uploads\\\/2026\\\/01\\\/LibreChat_Gigabyte_AI_TOP_ATOM.jpg\",\"width\":2553,\"height\":1301,\"caption\":\"LibreChat Gigabyte AI TOP ATOM\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\\\/2147\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Start\",\"item\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Install LibreChat on Gigabyte AI TOP ATOM: Professional Chat Interface for Local LLM Models\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/#website\",\"url\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/\",\"name\":\"Exploring the Future: Inside the AI Box\",\"description\":\"Inside the AI Box, we share our experiences and discoveries in the world of artificial 
intelligence.\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/#\\\/schema\\\/person\\\/cc91d08618b3feeef6926591b465eab1\",\"name\":\"Maker\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/e96b93fc3c7e50c1f21c5c6b1f146dc4867936141360830b328947b32cacf93a?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/e96b93fc3c7e50c1f21c5c6b1f146dc4867936141360830b328947b32cacf93a?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/e96b93fc3c7e50c1f21c5c6b1f146dc4867936141360830b328947b32cacf93a?s=96&d=mm&r=g\",\"caption\":\"Maker\"},\"description\":\"I live in Bavaria near Munich. In my head I always have many topics and try out especially in the field of Internet new media much in my spare time. I write on the blog because it makes me fun to report about the things that inspire me. I am happy about every comment, about suggestion and very about questions.\",\"sameAs\":[\"https:\\\/\\\/ai-box.eu\"],\"url\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/author\\\/ingmars\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Install LibreChat on Gigabyte AI TOP ATOM: Professional Chat Interface for Local LLM Models - Exploring the Future: Inside the AI Box","description":"Learn how to install LibreChat on the Gigabyte AI TOP ATOM. 
This guide covers ARM64 optimization, MongoDB fixes for Grace CPUs, and connecting local Ollama models.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/","og_locale":"en_US","og_type":"article","og_title":"Install LibreChat on Gigabyte AI TOP ATOM: Professional Chat Interface for Local LLM Models - Exploring the Future: Inside the AI Box","og_description":"Learn how to install LibreChat on the Gigabyte AI TOP ATOM. This guide covers ARM64 optimization, MongoDB fixes for Grace CPUs, and connecting local Ollama models.","og_url":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/","og_site_name":"Exploring the Future: Inside the AI Box","article_published_time":"2026-01-11T19:11:35+00:00","article_modified_time":"2026-01-11T19:14:14+00:00","og_image":[{"width":2553,"height":1301,"url":"https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM.jpg","type":"image\/jpeg"}],"author":"Maker","twitter_card":"summary_large_image","twitter_creator":"@Ingmar_Stapel","twitter_site":"@Ingmar_Stapel","twitter_misc":{"Written by":"Maker","Est. 
reading time":"8 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#article","isPartOf":{"@id":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/"},"author":{"name":"Maker","@id":"https:\/\/ai-box.eu\/en\/#\/schema\/person\/cc91d08618b3feeef6926591b465eab1"},"headline":"Install LibreChat on Gigabyte AI TOP ATOM: Professional Chat Interface for Local LLM Models","datePublished":"2026-01-11T19:11:35+00:00","dateModified":"2026-01-11T19:14:14+00:00","mainEntityOfPage":{"@id":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/"},"wordCount":1199,"commentCount":0,"image":{"@id":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#primaryimage"},"thumbnailUrl":"https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM.jpg","keywords":["AI TOP ATOM Tutorial","ARM64","Docker Compose","Gigabyte AI TOP ATOM","LibreChat","local LLM interface","MongoDB 6.0","NVIDIA Grace","Ollama","open-source ChatGPT alternative","RAG","Self-hosted AI"],"articleSection":["Gigabyte AI TOP ATOM","Large Language Models","Top 
story"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/","url":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/","name":"Install LibreChat on Gigabyte AI TOP ATOM: Professional Chat Interface for Local LLM Models - Exploring the Future: Inside the AI Box","isPartOf":{"@id":"https:\/\/ai-box.eu\/en\/#website"},"primaryImageOfPage":{"@id":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#primaryimage"},"image":{"@id":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#primaryimage"},"thumbnailUrl":"https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM.jpg","datePublished":"2026-01-11T19:11:35+00:00","dateModified":"2026-01-11T19:14:14+00:00","author":{"@id":"https:\/\/ai-box.eu\/en\/#\/schema\/person\/cc91d08618b3feeef6926591b465eab1"},"description":"Learn how to install LibreChat on the Gigabyte AI TOP ATOM. 
This guide covers ARM64 optimization, MongoDB fixes for Grace CPUs, and connecting local Ollama models.","breadcrumb":{"@id":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#primaryimage","url":"https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM.jpg","contentUrl":"https:\/\/ai-box.eu\/wp-content\/uploads\/2026\/01\/LibreChat_Gigabyte_AI_TOP_ATOM.jpg","width":2553,"height":1301,"caption":"LibreChat Gigabyte AI TOP ATOM"},{"@type":"BreadcrumbList","@id":"https:\/\/ai-box.eu\/en\/top-story-en\/install-librechat-on-gigabyte-ai-top-atom-professional-chat-interface-for-local-llm-models\/2147\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Start","item":"https:\/\/ai-box.eu\/en\/"},{"@type":"ListItem","position":2,"name":"Install LibreChat on Gigabyte AI TOP ATOM: Professional Chat Interface for Local LLM Models"}]},{"@type":"WebSite","@id":"https:\/\/ai-box.eu\/en\/#website","url":"https:\/\/ai-box.eu\/en\/","name":"Exploring the Future: Inside the AI Box","description":"Inside the AI Box, we share our experiences and discoveries in the world of artificial 
intelligence.","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/ai-box.eu\/en\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/ai-box.eu\/en\/#\/schema\/person\/cc91d08618b3feeef6926591b465eab1","name":"Maker","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/e96b93fc3c7e50c1f21c5c6b1f146dc4867936141360830b328947b32cacf93a?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/e96b93fc3c7e50c1f21c5c6b1f146dc4867936141360830b328947b32cacf93a?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/e96b93fc3c7e50c1f21c5c6b1f146dc4867936141360830b328947b32cacf93a?s=96&d=mm&r=g","caption":"Maker"},"description":"I live in Bavaria near Munich. In my head I always have many topics and try out especially in the field of Internet new media much in my spare time. I write on the blog because it makes me fun to report about the things that inspire me. 
I am happy about every comment, about suggestion and very about questions.","sameAs":["https:\/\/ai-box.eu"],"url":"https:\/\/ai-box.eu\/en\/author\/ingmars\/"}]}},"_links":{"self":[{"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/posts\/2147","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/comments?post=2147"}],"version-history":[{"count":3,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/posts\/2147\/revisions"}],"predecessor-version":[{"id":2150,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/posts\/2147\/revisions\/2150"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/media\/2145"}],"wp:attachment":[{"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/media?parent=2147"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/categories?post=2147"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/tags?post=2147"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}