diff --git a/docs/ollama.md b/docs/ollama.md
index b805e11dcae60af6313f9526c7b9c4ca69543aa0..708e246d5deab0a1dcb5bb9530a4b1789e2e8d99 100644
--- a/docs/ollama.md
+++ b/docs/ollama.md
@@ -31,7 +31,7 @@ $ docker exec -it ollama ollama run mistral
 <img src="https://github.com/infiniflow/ragflow/assets/12318111/a9df198a-226d-4f30-b8d7-829f00256d46" width="1300"/>
 </div>
 
-> Base URL: Enter the base URL where the Ollama service is accessible, like, http://<your-ollama-endpoint-domain>:11434
+> Base URL: Enter the base URL where the Ollama service is accessible, e.g. `http://<your-ollama-endpoint-domain>:11434`.
 
 - Use Ollama Models.
 
diff --git a/docs/xinference.md b/docs/xinference.md
index f2b23f7f14437f59f66130d4fb4614d6acb3019b..e589300db0769cc9c7e00ed5d14c1338dd93e410 100644
--- a/docs/xinference.md
+++ b/docs/xinference.md
@@ -31,7 +31,7 @@ $ xinference launch -u mistral --model-name mistral-v0.1 --size-in-billions 7 --
 <img src="https://github.com/infiniflow/ragflow/assets/12318111/bcbf4d7a-ade6-44c7-ad5f-0a92c8a73789" width="1300"/>
 </div>
 
-> Base URL: Enter the base URL where the Xinference service is accessible, like, http://<your-xinference-endpoint-domain>:9997/v1
+> Base URL: Enter the base URL where the Xinference service is accessible, e.g. `http://<your-xinference-endpoint-domain>:9997/v1`.
 
 - Use Xinference Models.