From e76a918dab62a94661a9a7ac79d775c4e62784ce Mon Sep 17 00:00:00 2001
From: Mergen Nachin
Date: Tue, 21 Oct 2025 14:53:52 -0700
Subject: [PATCH] Fix command for calling export_llm API

---
 docs/source/llm/export-llm.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/llm/export-llm.md b/docs/source/llm/export-llm.md
index 844fdc194b9..b2354305b21 100644
--- a/docs/source/llm/export-llm.md
+++ b/docs/source/llm/export-llm.md
@@ -26,7 +26,7 @@ The up-to-date list of supported LLMs can be found in the code [here](https://gi
 `export_llm` is ExecuTorch's high-level export API for LLMs. In this tutorial, we will focus on exporting Llama 3.2 1B using this API. `export_llm`'s arguments are specified either through CLI args or through a yaml configuration whose fields are defined in [`LlmConfig`](https://github.com/pytorch/executorch/blob/main/extension/llm/export/config/llm_config.py). To call `export_llm`:

 ```
-python -m executorch.examples.extension.llm.export.export_llm
+python -m executorch.extension.llm.export.export_llm
 --config
 +base.
 ```