{"id":1292,"date":"2024-02-11T04:33:46","date_gmt":"2024-02-11T04:33:46","guid":{"rendered":"https:\/\/ai-box.eu\/?p=1292"},"modified":"2024-03-09T04:47:21","modified_gmt":"2024-03-09T04:47:21","slug":"ollama-ubuntu-installation-and-configuration","status":"publish","type":"post","link":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/","title":{"rendered":"Ollama Ubuntu installation and configuration"},"content":{"rendered":"<p>This short guide is made up of several articles and takes you step by step from the installation to the finished application. First of all, the instructions are structured so that Ollama is installed. Ollama is used as a server that provides the various language models. The big advantage of Ollama (<a href=\"https:\/\/github.com\/ollama\/ollama\" target=\"_blank\" rel=\"noopener\">https:\/\/github.com\/ollama\/ollama<\/a>) is that it can provide different language models that are addressed via an API from the actual Python application. This means that everything runs locally on your own computer and there are no additional costs such as those incurred when using OpenAI services, for example. 
I&#8217;m always a fan of running everything locally at home and I also think this solution is quite good from a data protection point of view.<\/p>\n<p style=\"padding-left: 40px;\"><strong>Note:<\/strong> I am installing everything on an Ubuntu system with an NVIDIA A6000.<\/p>\n<div id=\"attachment_76\" style=\"width: 310px\" class=\"wp-caption alignnone\"><a href=\"https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup-300x225.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-76\" class=\"wp-image-76 size-medium\" src=\"https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup-300x225.jpg\" alt=\"Deep Learning Computer NVIDIA RTX A6000\" width=\"300\" height=\"225\" srcset=\"https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup-300x225.jpg 300w, https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup-1024x768.jpg 1024w, https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup-768x576.jpg 768w, https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup-1536x1152.jpg 1536w, https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup-1080x810.jpg 1080w, https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup.jpg 1600w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/a><p id=\"caption-attachment-76\" class=\"wp-caption-text\">Deep Learning Computer NVIDIA RTX A6000<\/p><\/div>\n<div id=\"ez-toc-container\" class=\"ez-toc-v2_0_82_2 counter-hierarchy ez-toc-counter ez-toc-grey ez-toc-container-direction\">\n<div class=\"ez-toc-title-container\">\n<p class=\"ez-toc-title\" style=\"cursor:inherit\">Table of Contents<\/p>\n<span class=\"ez-toc-title-toggle\"><a href=\"#\" 
class=\"ez-toc-pull-right ez-toc-btn ez-toc-btn-xs ez-toc-btn-default ez-toc-toggle\" aria-label=\"Toggle Table of Content\"><span class=\"ez-toc-js-icon-con\"><span class=\"\"><span class=\"eztoc-hide\" style=\"display:none;\">Toggle<\/span><span class=\"ez-toc-icon-toggle-span\"><svg style=\"fill: #999;color:#999\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" class=\"list-377408\" width=\"20px\" height=\"20px\" viewBox=\"0 0 24 24\" fill=\"none\"><path d=\"M6 6H4v2h2V6zm14 0H8v2h12V6zM4 11h2v2H4v-2zm16 0H8v2h12v-2zM4 16h2v2H4v-2zm16 0H8v2h12v-2z\" fill=\"currentColor\"><\/path><\/svg><svg style=\"fill: #999;color:#999\" class=\"arrow-unsorted-368013\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"10px\" height=\"10px\" viewBox=\"0 0 24 24\" version=\"1.2\" baseProfile=\"tiny\"><path d=\"M18.2 9.3l-6.2-6.3-6.2 6.3c-.2.2-.3.4-.3.7s.1.5.3.7c.2.2.4.3.7.3h11c.3 0 .5-.1.7-.3.2-.2.3-.5.3-.7s-.1-.5-.3-.7zM5.8 14.7l6.2 6.3 6.2-6.3c.2-.2.3-.5.3-.7s-.1-.5-.3-.7c-.2-.2-.4-.3-.7-.3h-11c-.3 0-.5.1-.7.3-.2.2-.3.5-.3.7s.1.5.3.7z\"\/><\/svg><\/span><\/span><\/span><\/a><\/span><\/div>\n<nav><ul class='ez-toc-list ez-toc-list-level-1 ' ><li class='ez-toc-page-1 ez-toc-heading-level-2'><a class=\"ez-toc-link ez-toc-heading-1\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#Ollama_installation\" >Ollama installation<\/a><ul class='ez-toc-list-level-3' ><li class='ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-2\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#Ollama_language_models\" >Ollama language models<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-3\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#Ollama_TextEmbedding\" >Ollama TextEmbedding<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-4\" 
href=\"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#Ollama_start\" >Ollama start<\/a><\/li><\/ul><\/li><li class='ez-toc-page-1 ez-toc-heading-level-2'><a class=\"ez-toc-link ez-toc-heading-5\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#Ollama_update\" >Ollama update<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-2'><a class=\"ez-toc-link ez-toc-heading-6\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#Ollama_accessible_in_the_network\" >Ollama accessible in the network<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-2'><a class=\"ez-toc-link ez-toc-heading-7\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#Summary\" >Summary<\/a><\/li><\/ul><\/nav><\/div>\n<h2><span class=\"ez-toc-section\" id=\"Ollama_installation\"><\/span>Ollama installation<span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p>The first step is to install Ollama on the Ubuntu system. I have executed the following command in my user session.<\/p>\n<p style=\"padding-left: 40px;\"><strong>Command:<\/strong> <code>curl -fsSL https:\/\/ollama.com\/install.sh | sh<\/code><\/p>\n<p>After the installation, we download the large language model mistral, which does not have too high demands on the graphics card and should therefore run on an RTX3090 or similar for many. 
Please execute the following commands to install the models.<\/p>\n<h3><span class=\"ez-toc-section\" id=\"Ollama_language_models\"><\/span>Ollama language models<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>Here you will always find the current overview of the models available for Ollama: <a href=\"https:\/\/ollama.com\/library\" target=\"_blank\" rel=\"noopener\">https:\/\/ollama.com\/library<\/a><\/p>\n<p style=\"padding-left: 40px;\"><strong>Command:<\/strong> <code class=\" language-bash\">ollama pull mistral<\/code><\/p>\n<p style=\"padding-left: 40px;\"><strong>Command:<\/strong> <code>ollama pull gemma:7b<\/code><\/p>\n<p style=\"padding-left: 40px;\"><strong>Command:<\/strong> <code>ollama pull gemma:2b<\/code><\/p>\n<p>The models are then located under Linux or WSL in the following path: <code>\/usr\/share\/ollama\/.ollama\/models<\/code><\/p>\n<p>With the following command you can check whether mistral and the other models have been downloaded and are available in Ollama.<\/p>\n<p style=\"padding-left: 40px;\"><strong>Command:<\/strong> <code class=\" language-processing\">ollama list<\/code><\/p>\n<p>If you want to delete a model, the command is as follows.<\/p>\n<p style=\"padding-left: 40px;\"><strong>Command:<\/strong> <code>ollama rm &lt;Model-Name&gt;<\/code><\/p>\n<p>If a model is to be updated, this can be done with the familiar command for installing models.<\/p>\n<p style=\"padding-left: 40px;\"><strong>Command:<\/strong> <code>ollama pull &lt;Model-Name&gt;<\/code><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Ollama_TextEmbedding\"><\/span>Ollama TextEmbedding<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>The most important information about text embeddings can be found here: <a href=\"https:\/\/python.langchain.com\/docs\/integrations\/text_embedding\/ollama\" target=\"_blank\" rel=\"noopener\">https:\/\/python.langchain.com\/docs\/integrations\/text_embedding\/ollama<\/a><\/p>\n<h3><span 
class=\"ez-toc-section\" id=\"Ollama_start\"><\/span>Ollama start<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>Now simply start the Ollama server once with the following command. Our application will then communicate with this and call up the mistral LLM.<\/p>\n<p style=\"padding-left: 40px;\"><strong>Command:<\/strong> <code class=\" language-processing\">ollama serve<\/code><\/p>\n<p style=\"padding-left: 40px;\"><strong>Note:<\/strong> If the following message is displayed, the Ollama server is already running and you do not need to do anything else for the time being.<\/p>\n<p style=\"padding-left: 40px;\"><code>(ollama_rag) ingmar@A6000:~$ ollama serve<\/code><br \/>\n<code>Error: listen tcp 127.0.0.1:11434: bind: address already in use<\/code><\/p>\n<p>Now everything is set up and you can start writing the small RAG application. With this you will be able to search a PDF document with natural language.<\/p>\n<h2><span class=\"ez-toc-section\" id=\"Ollama_update\"><\/span>Ollama update<span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p>The installation of Ollama can be updated quite easily. 
The command to display the currently used version of Ollama is as follows.<\/p>\n<p style=\"padding-left: 40px;\"><strong>Command:<\/strong> <code>ollama --version<\/code><\/p>\n<p>At the time, I had the version &#8220;<code>ollama version is 0.1.25<\/code>&#8221; installed.<\/p>\n<p>The latest available version of Ollama can be found in the menu on the right of the GitHub page: <a href=\"https:\/\/github.com\/ollama\/ollama\" target=\"_blank\" rel=\"noopener\">https:\/\/github.com\/ollama\/ollama<\/a><\/p>\n<p>You can update Ollama under Ubuntu with the same command already known from the installation.<\/p>\n<p style=\"padding-left: 40px;\"><strong>Command:<\/strong> <code>curl -fsSL https:\/\/ollama.com\/install.sh | sh<\/code><\/p>\n<p>After running the command again, the current version <code>0.1.27<\/code> was installed.<\/p>\n<h2><span class=\"ez-toc-section\" id=\"Ollama_accessible_in_the_network\"><\/span>Ollama accessible in the network<span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p>To make the Ollama API reachable from other computers on the network, open the file <code>ollama.service<\/code> in the path <code>\/etc\/systemd\/system\/<\/code> and insert the following line in the <code>[Service]<\/code> section.<\/p>\n<p><code class=\"notranslate\">Environment=\"OLLAMA_HOST=0.0.0.0\"<\/code><\/p>\n<p>I then restarted the computer to restart Ollama, after which I was able to access the API via the network. Alternatively, <code>sudo systemctl daemon-reload<\/code> followed by <code>sudo systemctl restart ollama<\/code> applies the change without a reboot.<\/p>\n<h2><span class=\"ez-toc-section\" id=\"Summary\"><\/span>Summary<span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p>With the Ollama framework, it has become very easy to run different language models locally and make them available on the network via an API. The really great thing about this is that LangChain supports Ollama, making it straightforward to program the call to the endpoint in your own Python application. 
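To sketch how a client application can find a server that has been made reachable on the network, the following helper resolves the address via the <code>OLLAMA_HOST<\/code> environment variable, the same convention the Ollama CLI uses; the function name and the scheme handling are my own assumptions:

```python
import os

def ollama_base_url(default: str = "http://127.0.0.1:11434") -> str:
    """Resolve the base URL of the Ollama server.

    Honours the OLLAMA_HOST environment variable, e.g.
    OLLAMA_HOST=192.168.1.50:11434, falling back to the local default.
    """
    host = os.environ.get("OLLAMA_HOST", "").strip()
    if not host:
        return default
    if not host.startswith(("http://", "https://")):
        # OLLAMA_HOST may be given without a scheme
        host = "http://" + host
    return host.rstrip("/")
```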
In the rest of my article, I will discuss the Python program that brings this small RAG application to life.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>This short guide is made up of several articles and takes you step by step from the installation to the finished application. First of all, the instructions are structured so that Ollama is installed. Ollama is used as a server that provides the various language models. The big advantage of Ollama (https:\/\/github.com\/ollama\/ollama) is that it [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":77,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_et_pb_use_builder":"","_et_pb_old_content":"","_et_gb_content_width":"","footnotes":""},"categories":[162,50],"tags":[329,272,310,321,314,312,313,322,336,332,304,317,89,333,309,275,305,68,316,318,327,330,259,306,331,320,326,334,311,335,307,325,315,308,324,323,92,276,328,319],"class_list":["post-1292","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-large-language-models-en","category-top-story-en","tag-accessible-en","tag-anaconda-en","tag-api-en","tag-check-en","tag-computer-en","tag-data-protection-en","tag-deep-learning-en","tag-delete-en","tag-endpoint-en","tag-environment-en","tag-framework-en","tag-gemma-en","tag-installation-en","tag-langchain-en","tag-language-models-en","tag-linux-en","tag-llm-en","tag-local-en","tag-mistral-en","tag-models-en","tag-natural-language-en","tag-network-en","tag-nvidia-a6000-en","tag-ollama-en","tag-ollama-service-en","tag-path-en","tag-pdf-document-en","tag-programming-en","tag-python-application-en","tag-python-program-en","tag-rag-en","tag-rag-application-en","tag-rtx-a6000-en","tag-server-en","tag-start-en","tag-textembedding-en","tag-ubuntu-en","tag-update-en","tag-version-en","tag-wsl-en","et-has-post-format-content","et_post_format-et-post-format-standard"],"yoast_head":"<!-- This site is optimized with 
the Yoast SEO plugin v27.3 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Ollama Ubuntu installation and configuration - Exploring the Future: Inside the AI Box<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Ollama Ubuntu installation and configuration - Exploring the Future: Inside the AI Box\" \/>\n<meta property=\"og:description\" content=\"This short guide is made up of several articles and takes you step by step from the installation to the finished application. First of all, the instructions are structured so that Ollama is installed. Ollama is used as a server that provides the various language models. The big advantage of Ollama (https:\/\/github.com\/ollama\/ollama) is that it [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/\" \/>\n<meta property=\"og:site_name\" content=\"Exploring the Future: Inside the AI Box\" \/>\n<meta property=\"article:published_time\" content=\"2024-02-11T04:33:46+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2024-03-09T04:47:21+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1600\" \/>\n\t<meta property=\"og:image:height\" content=\"1200\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Maker\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@Ingmar_Stapel\" \/>\n<meta 
name=\"twitter:site\" content=\"@Ingmar_Stapel\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Maker\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/\"},\"author\":{\"name\":\"Maker\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/#\\\/schema\\\/person\\\/cc91d08618b3feeef6926591b465eab1\"},\"headline\":\"Ollama Ubuntu installation and configuration\",\"datePublished\":\"2024-02-11T04:33:46+00:00\",\"dateModified\":\"2024-03-09T04:47:21+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/\"},\"wordCount\":614,\"commentCount\":0,\"image\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/ai-box.eu\\\/wp-content\\\/uploads\\\/2022\\\/02\\\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup.jpg\",\"keywords\":[\"accessible\",\"Anaconda\",\"API\",\"check\",\"computer\",\"data protection\",\"deep learning\",\"delete\",\"endpoint.\",\"Environment\",\"Framework\",\"gemma\",\"Installation\",\"LangChain\",\"language models\",\"Linux\",\"LLM\",\"local\",\"mistral\",\"models\",\"natural language\",\"network\",\"NVIDIA A6000\",\"Ollama\",\"ollama.service\",\"path\",\"PDF document\",\"programming\",\"Python application\",\"Python program\",\"RAG\",\"RAG application\",\"RTX 
A6000\",\"Server\",\"start\",\"TextEmbedding\",\"Ubuntu\",\"Update\",\"version\",\"WSL\"],\"articleSection\":[\"Large Language Models\",\"Top story\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/\",\"url\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/\",\"name\":\"Ollama Ubuntu installation and configuration - Exploring the Future: Inside the AI Box\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/ai-box.eu\\\/wp-content\\\/uploads\\\/2022\\\/02\\\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup.jpg\",\"datePublished\":\"2024-02-11T04:33:46+00:00\",\"dateModified\":\"2024-03-09T04:47:21+00:00\",\"author\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/#\\\/schema\\\/person\\\/cc91d08618b3feeef6926591b465eab1\"},\"breadcrumb\":{\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/#primaryimage\",\"url\":\"http
s:\\\/\\\/ai-box.eu\\\/wp-content\\\/uploads\\\/2022\\\/02\\\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup.jpg\",\"contentUrl\":\"https:\\\/\\\/ai-box.eu\\\/wp-content\\\/uploads\\\/2022\\\/02\\\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup.jpg\",\"width\":1600,\"height\":1200,\"caption\":\"Deep Learning Computer NVIDIA RTX A6000\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/top-story-en\\\/ollama-ubuntu-installation-and-configuration\\\/1292\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Start\",\"item\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Ollama Ubuntu installation and configuration\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/#website\",\"url\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/\",\"name\":\"Exploring the Future: Inside the AI Box\",\"description\":\"Inside the AI Box, we share our experiences and discoveries in the world of artificial 
intelligence.\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/#\\\/schema\\\/person\\\/cc91d08618b3feeef6926591b465eab1\",\"name\":\"Maker\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/e96b93fc3c7e50c1f21c5c6b1f146dc4867936141360830b328947b32cacf93a?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/e96b93fc3c7e50c1f21c5c6b1f146dc4867936141360830b328947b32cacf93a?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/e96b93fc3c7e50c1f21c5c6b1f146dc4867936141360830b328947b32cacf93a?s=96&d=mm&r=g\",\"caption\":\"Maker\"},\"description\":\"I live in Bavaria near Munich. In my head I always have many topics and try out especially in the field of Internet new media much in my spare time. I write on the blog because it makes me fun to report about the things that inspire me. I am happy about every comment, about suggestion and very about questions.\",\"sameAs\":[\"https:\\\/\\\/ai-box.eu\"],\"url\":\"https:\\\/\\\/ai-box.eu\\\/en\\\/author\\\/ingmars\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Ollama Ubuntu installation and configuration - Exploring the Future: Inside the AI Box","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/","og_locale":"en_US","og_type":"article","og_title":"Ollama Ubuntu installation and configuration - Exploring the Future: Inside the AI Box","og_description":"This short guide is made up of several articles and takes you step by step from the installation to the finished application. First of all, the instructions are structured so that Ollama is installed. Ollama is used as a server that provides the various language models. The big advantage of Ollama (https:\/\/github.com\/ollama\/ollama) is that it [&hellip;]","og_url":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/","og_site_name":"Exploring the Future: Inside the AI Box","article_published_time":"2024-02-11T04:33:46+00:00","article_modified_time":"2024-03-09T04:47:21+00:00","og_image":[{"width":1600,"height":1200,"url":"https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup.jpg","type":"image\/jpeg"}],"author":"Maker","twitter_card":"summary_large_image","twitter_creator":"@Ingmar_Stapel","twitter_site":"@Ingmar_Stapel","twitter_misc":{"Written by":"Maker","Est. 
reading time":"3 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#article","isPartOf":{"@id":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/"},"author":{"name":"Maker","@id":"https:\/\/ai-box.eu\/en\/#\/schema\/person\/cc91d08618b3feeef6926591b465eab1"},"headline":"Ollama Ubuntu installation and configuration","datePublished":"2024-02-11T04:33:46+00:00","dateModified":"2024-03-09T04:47:21+00:00","mainEntityOfPage":{"@id":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/"},"wordCount":614,"commentCount":0,"image":{"@id":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#primaryimage"},"thumbnailUrl":"https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup.jpg","keywords":["accessible","Anaconda","API","check","computer","data protection","deep learning","delete","endpoint.","Environment","Framework","gemma","Installation","LangChain","language models","Linux","LLM","local","mistral","models","natural language","network","NVIDIA A6000","Ollama","ollama.service","path","PDF document","programming","Python application","Python program","RAG","RAG application","RTX A6000","Server","start","TextEmbedding","Ubuntu","Update","version","WSL"],"articleSection":["Large Language Models","Top story"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/","url":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/","name":"Ollama Ubuntu installation and configuration - Exploring 
the Future: Inside the AI Box","isPartOf":{"@id":"https:\/\/ai-box.eu\/en\/#website"},"primaryImageOfPage":{"@id":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#primaryimage"},"image":{"@id":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#primaryimage"},"thumbnailUrl":"https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup.jpg","datePublished":"2024-02-11T04:33:46+00:00","dateModified":"2024-03-09T04:47:21+00:00","author":{"@id":"https:\/\/ai-box.eu\/en\/#\/schema\/person\/cc91d08618b3feeef6926591b465eab1"},"breadcrumb":{"@id":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#primaryimage","url":"https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup.jpg","contentUrl":"https:\/\/ai-box.eu\/wp-content\/uploads\/2022\/02\/No_Code_AI_Pipeline_Deep_Learning_Computer_setup.jpg","width":1600,"height":1200,"caption":"Deep Learning Computer NVIDIA RTX A6000"},{"@type":"BreadcrumbList","@id":"https:\/\/ai-box.eu\/en\/top-story-en\/ollama-ubuntu-installation-and-configuration\/1292\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Start","item":"https:\/\/ai-box.eu\/en\/"},{"@type":"ListItem","position":2,"name":"Ollama Ubuntu installation and configuration"}]},{"@type":"WebSite","@id":"https:\/\/ai-box.eu\/en\/#website","url":"https:\/\/ai-box.eu\/en\/","name":"Exploring the Future: Inside the AI Box","description":"Inside the AI Box, we share our experiences and discoveries in the world of artificial 
intelligence.","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/ai-box.eu\/en\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/ai-box.eu\/en\/#\/schema\/person\/cc91d08618b3feeef6926591b465eab1","name":"Maker","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/e96b93fc3c7e50c1f21c5c6b1f146dc4867936141360830b328947b32cacf93a?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/e96b93fc3c7e50c1f21c5c6b1f146dc4867936141360830b328947b32cacf93a?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/e96b93fc3c7e50c1f21c5c6b1f146dc4867936141360830b328947b32cacf93a?s=96&d=mm&r=g","caption":"Maker"},"description":"I live in Bavaria near Munich. In my head I always have many topics and try out especially in the field of Internet new media much in my spare time. I write on the blog because it makes me fun to report about the things that inspire me. 
I am happy about every comment, about suggestion and very about questions.","sameAs":["https:\/\/ai-box.eu"],"url":"https:\/\/ai-box.eu\/en\/author\/ingmars\/"}]}},"_links":{"self":[{"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/posts\/1292","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/comments?post=1292"}],"version-history":[{"count":2,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/posts\/1292\/revisions"}],"predecessor-version":[{"id":1294,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/posts\/1292\/revisions\/1294"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/media\/77"}],"wp:attachment":[{"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/media?parent=1292"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/categories?post=1292"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ai-box.eu\/en\/wp-json\/wp\/v2\/tags?post=1292"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}