Detailed Commands of Ollama

Ollama offers various commands to manage and run models effectively across different operating systems. Here is a detailed look at each command with examples:

  1. serve
    • Usage: ollama serve
    • Description: Starts the Ollama server.
    • Example: ollama serve (see the sketch after this entry)
    • Flags:
      • -h, --help: Help for serve.
    • Environment Variables:
      • OLLAMA_HOST: The host and port to bind to (default is 127.0.0.1:11434).
      • OLLAMA_ORIGINS: Allowed origins (comma-separated).
      • OLLAMA_MODELS: Path to the models directory (default is ~/.ollama/models).
      • OLLAMA_KEEP_ALIVE: Duration models stay loaded in memory (default is 5m).
      • OLLAMA_DEBUG: Set to 1 to enable debug logging.
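    For example, a minimal sketch of starting the server on all network interfaces with debug logging, then checking it from another terminal (the address, model, and prompt are illustrative; /api/generate is Ollama's standard generation endpoint):
        OLLAMA_HOST=0.0.0.0:11434 OLLAMA_DEBUG=1 ollama serve
        # In a second terminal, verify the server responds
        curl http://127.0.0.1:11434/api/generate -d '{"model": "llama3", "prompt": "Hello", "stream": false}'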
  2. create
    • Usage: ollama create MODEL
    • Description: Creates a model from a Modelfile (see the sketch after this entry).
    • Example: ollama create custom-model -f ./Modelfile
    • Flags:
      • -f, --file string: Name of the Modelfile (default is Modelfile).
      • -q, --quantize string: Quantize model to this level (e.g., q4_0).
    • Environment Variables:
      • OLLAMA_HOST: The host and port or base URL of the Ollama server.
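    As a sketch, a minimal Modelfile could look like the following (the base model, parameter value, and system prompt are illustrative):
        FROM llama3
        PARAMETER temperature 0.7
        SYSTEM "You are a concise technical assistant."
    It is then built into a local model with:
        ollama create custom-model -f ./Modelfile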
  3. show
    • Usage: ollama show MODEL
    • Description: Shows information for a model (see the sketch after this entry).
    • Example: ollama show llama3 --parameters
    • Flags:
      • --license: Show license of a model.
      • --modelfile: Show Modelfile of a model.
      • --parameters: Show parameters of a model.
      • --system: Show system message of a model.
      • --template: Show template of a model.
    • Environment Variables:
      • OLLAMA_HOST: The host and port or base URL of the Ollama server.
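    A common pattern, shown here as a sketch, is to dump an existing model's Modelfile as the starting point for a customized one (model and file names are illustrative):
        ollama show llama3 --modelfile > Modelfile
        # edit Modelfile, then rebuild it under a new name
        ollama create my-llama3 -f ./Modelfile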
  4. run
    • Usage: ollama run MODEL [PROMPT]
    • Description: Runs a model (see the sketch after this entry).
    • Example: ollama run llama3 "Explain quantum mechanics."
    • Flags:
      • --format string: Response format (e.g., json).
      • --keepalive string: Duration to keep a model loaded (e.g., 5m).
      • --nowordwrap: Don’t wrap words to the next line automatically.
      • --verbose: Show timings for the response.
    • Environment Variables:
      • OLLAMA_HOST: The host and port or base URL of the Ollama server.
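    A sketch combining the flags above (the prompts are illustrative):
        # Ask for JSON output and print timing statistics
        ollama run llama3 --format json --verbose "List three prime numbers as a JSON array."
        # Keep the model loaded in memory for an hour after the request
        ollama run llama3 --keepalive 1h "Explain quantum mechanics briefly."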
  5. pull
    • Usage: ollama pull MODEL
    • Description: Pulls a model from a registry (see the sketch after this entry).
    • Example: ollama pull llama3
    • Flags:
      • --insecure: Use an insecure registry.
    • Environment Variables:
      • OLLAMA_HOST: The host and port or base URL of the Ollama server.
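    Models can also be pulled by tag to select a specific size or variant, as in this sketch (the tag is illustrative):
        ollama pull llama3:8b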
  6. push
    • Usage: ollama push MODEL
    • Description: Pushes a model to a registry (see the sketch after this entry).
    • Example: ollama push custom-model
    • Flags:
      • --insecure: Use an insecure registry.
    • Environment Variables:
      • OLLAMA_HOST: The host and port or base URL of the Ollama server.
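    The public registry typically expects a namespaced model name (your ollama.com username). A sketch, where myuser is illustrative:
        ollama cp custom-model myuser/custom-model
        ollama push myuser/custom-model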
  7. list
    • Usage: ollama list
    • Aliases: list, ls
    • Description: Lists all models.
    • Example: ollama list
    • Flags:
      • -h, --help: Help for list.
    • Environment Variables:
      • OLLAMA_HOST: The host and port or base URL of the Ollama server.
  8. ps
    • Usage: ollama ps
    • Description: Lists running models.
    • Example: ollama ps
    • Flags:
      • -h, --help: Help for ps.
    • Environment Variables:
      • OLLAMA_HOST: The host and port or base URL of the Ollama server.
  9. cp
    • Usage: ollama cp SOURCE DESTINATION
    • Description: Copies a model.
    • Example: ollama cp llama3 backup-llama3
    • Flags:
      • -h, --help: Help for cp.
    • Environment Variables:
      • OLLAMA_HOST: The host and port or base URL of the Ollama server.
  10. rm
    • Usage: ollama rm MODEL [MODEL...]
    • Description: Removes one or more models (see the sketch after this entry).
    • Example: ollama rm llama3
    • Flags:
      • -h, --help: Help for rm.
    • Environment Variables:
      • OLLAMA_HOST: The host and port or base URL of the Ollama server.
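    Because rm accepts multiple names, several models can be removed in one call, as in this sketch (the names are illustrative):
        ollama rm backup-llama3 custom-model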
  11. help
    • Usage: ollama help [command]
    • Description: Provides help for any command in the application.
    • Example: ollama help run
    • Flags:
      • -h, --help: Help for help.
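    Putting the commands together, a typical end-to-end workflow might look like the following sketch (model and file names are illustrative):
        ollama pull llama3                          # fetch a base model
        ollama show llama3 --modelfile > Modelfile  # start from its Modelfile
        ollama create my-assistant -f ./Modelfile   # build a customized model
        ollama run my-assistant "Hello!"            # chat with it
        ollama list                                 # confirm it is installed
        ollama rm my-assistant                      # remove it when done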
