diff --git a/src/oss/python/integrations/chat/index.mdx b/src/oss/python/integrations/chat/index.mdx
index fd2b69dd2..9cb25790b 100644
--- a/src/oss/python/integrations/chat/index.mdx
+++ b/src/oss/python/integrations/chat/index.mdx
@@ -618,21 +618,13 @@ Certain model providers offer endpoints that are compatible with OpenAI's [Chat
 />
-
-system<|end_header_id|>\n\nYou are a helpful assistant that translates English to French. Translate the user sentence.<|eot_id|><|start_header_id|>user<|end_header_id|>\n\nI love programming.<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n', 'stop_reason': 'end_of_text', 'tokens': ['J', "'", 'ad', 'ore', ' la', ' programm', 'ation', '.'], 'total_tokens_count': 43}, 'params': {}, 'status': None}, id='item0')
-```
-
-```python
-print(ai_msg.content)
-```
-
-```output
-J'adore la programmation.
-```
-
-## Streaming
-
-```python
-system = "You are a helpful assistant with pirate accent."
-human = "I want to learn more about this animal: {animal}"
-prompt = ChatPromptTemplate.from_messages([("system", system), ("human", human)])
-
-chain = prompt | llm
-
-for chunk in chain.stream({"animal": "owl"}):
-    print(chunk.content, end="", flush=True)
-```
-
-```output
-Arrr, ye landlubber! Ye be wantin' to learn about owls, eh? Well, matey, settle yerself down with a pint o' grog and listen close, for I be tellin' ye about these fascinatin' creatures o' the night!
-
-Owls be birds, but not just any birds, me hearty! They be nocturnal, meanin' they do their huntin' at night, when the rest o' the world be sleepin'. And they be experts at it, too! Their big, round eyes be designed for seein' in the dark, with a special reflective layer called the tapetum lucidum that helps 'em spot prey in the shadows. It's like havin' a built-in lantern, savvy?
-
-But that be not all, me matey! Owls also have acute hearin', which helps 'em pinpoint the slightest sounds in the dark. And their ears be asymmetrical, meanin' one ear be higher than the other, which gives 'em better depth perception. It's like havin' a built-in sonar system, arrr!
-
-Now, ye might be wonderin' how owls fly so silently, like ghosts in the night. Well, it be because o' their special feathers, me hearty! They have soft, fringed feathers on their wings that help reduce noise and turbulence, makin' 'em the sneakiest flyers on the seven seas... er, skies!
-
-Owls come in all shapes and sizes, from the tiny elf owl to the great grey owl, which be one o' the largest owl species in the world. And they be found on every continent, except Antarctica, o' course. They be solitary creatures, but some species be known to form long-term monogamous relationships, like the barn owl and its mate.
-
-So, there ye have it, me hearty! Owls be amazin' creatures, with their clever adaptations and stealthy ways. Now, go forth and spread the word about these magnificent birds o' the night! And remember, if ye ever encounter an owl in the wild, be sure to show respect and keep a weather eye open, or ye might just find yerself on the receivin' end o' a silent, flyin' tackle! Arrr!
-```
-
-## Async
-
-```python
-prompt = ChatPromptTemplate.from_messages(
-    [
-        (
-            "human",
-            "what is the capital of {country}?",
-        )
-    ]
-)
-
-chain = prompt | llm
-await chain.ainvoke({"country": "France"})
-```
-
-```output
-AIMessage(content='The capital of France is Paris.', response_metadata={'id': 'item0', 'partial': False, 'value': {'completion': 'The capital of France is Paris.', 'logprobs': {'text_offset': [], 'top_logprobs': []}, 'prompt': '<|start_header_id|>user<|end_header_id|>\n\nwhat is the capital of France?<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n', 'stop_reason': 'end_of_text', 'tokens': ['The', ' capital', ' of', ' France', ' is', ' Paris', '.'], 'total_tokens_count': 24}, 'params': {}, 'status': None}, id='item0')
-```
-
-## Async Streaming
-
-```python
-prompt = ChatPromptTemplate.from_messages(
-    [
-        (
-            "human",
-            "in less than {num_words} words explain me {topic} ",
-        )
-    ]
-)
-chain = prompt | llm
-
-async for chunk in chain.astream({"num_words": 30, "topic": "quantum computers"}):
-    print(chunk.content, end="", flush=True)
-```
-
-```output
-Quantum computers use quantum bits (qubits) to process multiple possibilities simultaneously, exponentially faster than classical computers, enabling breakthroughs in fields like cryptography, optimization, and simulation.
-```
-
-## Tool calling
-
-```python
-from datetime import datetime
-
-from langchain.messages import HumanMessage, ToolMessage
-from langchain.tools import tool
-
-
-@tool
-def get_time(kind: str = "both") -> str:
-    """Returns current date, current time or both.
-
-    Args:
-        kind: date, time or both
-    """
-    if kind == "date":
-        date = datetime.now().strftime("%m/%d/%Y")
-        return f"Current date: {date}"
-    elif kind == "time":
-        time = datetime.now().strftime("%H:%M:%S")
-        return f"Current time: {time}"
-    else:
-        date = datetime.now().strftime("%m/%d/%Y")
-        time = datetime.now().strftime("%H:%M:%S")
-        return f"Current date: {date}, Current time: {time}"
-
-
-tools = [get_time]
-
-
-def invoke_tools(tool_calls, messages):
-    available_functions = {tool.name: tool for tool in tools}
-    for tool_call in tool_calls:
-        selected_tool = available_functions[tool_call["name"]]
-        tool_output = selected_tool.invoke(tool_call["args"])
-        print(f"Tool output: {tool_output}")
-        messages.append(ToolMessage(tool_output, tool_call_id=tool_call["id"]))
-    return messages
-```
-
-```python
-llm_with_tools = llm.bind_tools(tools=tools)
-messages = [
-    HumanMessage(
-        content="I need to schedule a meeting for two weeks from today. "
-        "Can you tell me the exact date of the meeting?"
-    )
-]
-```
-
-```python
-response = llm_with_tools.invoke(messages)
-while len(response.tool_calls) > 0:
-    print(f"Intermediate model response: {response.tool_calls}")
-    messages.append(response)
-    messages = invoke_tools(response.tool_calls, messages)
-    response = llm_with_tools.invoke(messages)
-
-print(f"final response: {response.content}")
-```
-
-```output
-Intermediate model response: [{'name': 'get_time', 'args': {'kind': 'date'}, 'id': 'call_4092d5dd21cd4eb494', 'type': 'tool_call'}]
-Tool output: Current date: 11/07/2024
-final response: The meeting will be exactly two weeks from today, which would be 25/07/2024.
-```
-
-## Structured Outputs
-
-```python
-from pydantic import BaseModel, Field
-
-
-class Joke(BaseModel):
-    """Joke to tell user."""
-
-    setup: str = Field(description="The setup of the joke")
-    punchline: str = Field(description="The punchline to the joke")
-
-
-structured_llm = llm.with_structured_output(Joke)
-
-structured_llm.invoke("Tell me a joke about cats")
-```
-
-```output
-Joke(setup='Why did the cat join a band?', punchline='Because it wanted to be the purr-cussionist!')
-```
-
-## API reference
-
-For detailed documentation of all SambaStudio features and configurations head to the API reference: [docs.sambanova.ai/sambastudio/latest/api-ref-landing.html](https://docs.sambanova.ai/sambastudio/latest/api-ref-landing.html)
diff --git a/src/oss/python/integrations/llms/sambanovacloud.mdx b/src/oss/python/integrations/llms/sambanovacloud.mdx
index e541edb40..d931461c6 100644
--- a/src/oss/python/integrations/llms/sambanovacloud.mdx
+++ b/src/oss/python/integrations/llms/sambanovacloud.mdx
@@ -2,12 +2,12 @@ title: SambaNovaCloud
 ---
 
-**[SambaNova](https://sambanova.ai/)'s [SambaNova Cloud](https://cloud.sambanova.ai/)** is a platform for performing inference with open-source models
+**[SambaNova](https://sambanova.ai/)'s [SambaNovaCloud](https://cloud.sambanova.ai/)** is a platform for performing inference with open-source models.
 
 **You are currently on a page documenting the use of SambaNovaCloud models as text completion models. We recommend you to use the [chat completion models](/oss/langchain/models).**
 
-You may be looking for [SambaNovaCloud Chat Models](/oss/integrations/chat/sambanova/) .
+You may be looking for [SambaNova Chat Models](/oss/integrations/chat/sambanova/).
 ## Overview
diff --git a/src/oss/python/integrations/llms/sambastudio.mdx b/src/oss/python/integrations/llms/sambastudio.mdx
index ee0def83b..6d1a3134e 100644
--- a/src/oss/python/integrations/llms/sambastudio.mdx
+++ b/src/oss/python/integrations/llms/sambastudio.mdx
@@ -7,7 +7,7 @@ title: SambaStudio
 
 **You are currently on a page documenting the use of SambaStudio models as text completion models. We recommend you to use the [chat completion models](/oss/langchain/models).**
 
-You may be looking for [SambaStudio Chat Models](/oss/integrations/chat/sambastudio/) .
+You may be looking for [SambaNova Chat Models](/oss/integrations/chat/sambanova).
 
 ## Overview
diff --git a/src/oss/python/integrations/providers/sambanova.mdx b/src/oss/python/integrations/providers/sambanova.mdx
index 9a4ae89ce..e58121bbc 100644
--- a/src/oss/python/integrations/providers/sambanova.mdx
+++ b/src/oss/python/integrations/providers/sambanova.mdx
@@ -8,76 +8,54 @@ Designed for AI, the SambaNova RDU was built with a revolutionary dataflow archi
 On top of our architecture We have developed some platforms that allow companies and developers to get full advantage of the RDU processors and open source models.
 
-### SambaNovaCloud
-
-SambaNova's [SambaNova Cloud](https://cloud.sambanova.ai/) is a platform for performing inference with open-source models
-
-You can obtain a free SambaNovaCloud API key [here](https://cloud.sambanova.ai/)
-
-### SambaStudio
-
-SambaNova's [SambaStudio](https://docs.sambanova.ai/sambastudio/latest/sambastudio-intro.html) is a rich, GUI-based platform that provides the functionality to train, deploy, and manage models in SambaNova DataScale systems.
 ## Installation and Setup
 
 Install the integration package:
+
 ```bash
 pip install langchain-sambanova
 ```
-set your API key it as an environment variable:
+```bash uv
+uv add langchain-sambanova
+```
+
+
+## API Key
+Set your API key as an environment variable:
 
-If you are a SambaNovaCloud user:
+If you are a SambaCloud user, request an [API key](http://cloud.sambanova.ai/apis?utm_source=langchain&utm_medium=external&utm_campaign=cloud_signup) and set it as an environment variable:
 
 ```bash
-export SAMBANOVA_API_KEY="your-sambanova-cloud-api-key-here"
+export SAMBANOVA_API_KEY="your-sambacloud-api-key-here"
 ```
 
-or if you are SambaStudio User
+Or, if you are a SambaStack user, set your base URL and API key as environment variables:
 
 ```bash
-export SAMBASTUDIO_API_KEY="your-sambastudio-api-key-here"
+export SAMBANOVA_API_BASE="your-sambastack-environment-base-url-here"
+export SAMBANOVA_API_KEY="your-sambastack-api-key-here"
 ```
 
-## Chat models
-
-```python
-from langchain_sambanova import ChatSambaNovaCloud
-llm = ChatSambaNovaCloud(model="Meta-Llama-3.3-70B-Instruct", temperature=0.7)
-llm.invoke("Tell me a joke about artificial intelligence.")
-```
+## Chat models
 
-For a more detailed walkthrough of the ChatSambaNovaCloud component, see [this notebook](https://python.langchain.com/docs/integrations/chat/sambanova/)
+For a detailed walkthrough of the `ChatSambaNova` component, see the [usage example](/oss/integrations/chat/sambanova)
 
 ```python
-from langchain_sambanova import ChatSambaStudio
-
-llm = ChatSambaStudio(model="Meta-Llama-3.3-70B-Instruct", temperature=0.7)
-llm.invoke("Tell me a joke about artificial intelligence.")
+from langchain_sambanova import ChatSambaNova
 ```
 
-For a more detailed walkthrough of the ChatSambaStudio component, see [this notebook](https://python.langchain.com/docs/integrations/chat/sambastudio/)
 
 ## Embedding Models
 
-```python
-from langchain_sambanova import SambaNovaCloudEmbeddings
-
-embeddings = SambaNovaCloudEmbeddings(model="E5-Mistral-7B-Instruct")
-embeddings.embed_query("What is the meaning of life?")
-```
-
-For a more detailed walkthrough of the SambaNovaCloudEmbeddings component, see [this notebook](https://python.langchain.com/docs/integrations/text_embedding/sambanova/)
+For a detailed walkthrough of the `SambaNovaEmbeddings` component, see the [usage example](/oss/integrations/text_embedding/sambanova)
 
 ```python
-from langchain_sambanova import SambaStudioEmbeddings
-
-embeddings = SambaStudioEmbeddings(model="e5-mistral-7b-instruct")
-embeddings.embed_query("What is the meaning of life?")
+from langchain_sambanova import SambaNovaEmbeddings
 ```
 
-For a more detailed walkthrough of the SambaStudioEmbeddings component, see [this notebook](https://python.langchain.com/docs/integrations/text_embedding/sambastudio/)
-
-API Reference [langchain-sambanova](https://docs.sambanova.ai/cloud/api-reference)
+[SambaNova API Reference](https://docs.sambanova.ai/cloud/api-reference)
diff --git a/src/oss/python/integrations/text_embedding/index.mdx b/src/oss/python/integrations/text_embedding/index.mdx
index a66df3f7e..339fabf47 100644
--- a/src/oss/python/integrations/text_embedding/index.mdx
+++ b/src/oss/python/integrations/text_embedding/index.mdx
@@ -192,8 +192,7 @@ In production, you would typically use a more robust persistent store, such as a
-
-
+
diff --git a/src/oss/python/integrations/text_embedding/sambanova.mdx b/src/oss/python/integrations/text_embedding/sambanova.mdx
index 60bb09632..50ea20eed 100644
--- a/src/oss/python/integrations/text_embedding/sambanova.mdx
+++ b/src/oss/python/integrations/text_embedding/sambanova.mdx
@@ -1,10 +1,10 @@
 ---
-title: SambaNovaCloudEmbeddings
+title: SambaNovaEmbeddings
 ---
 
-This will help you get started with SambaNovaCloud embedding models using LangChain. For detailed documentation on `SambaNovaCloudEmbeddings` features and configuration options, please refer to the [API reference](https://docs.sambanova.ai/cloud/docs/get-started/overview).
+This will help you get started with SambaNova embedding models using LangChain. For detailed documentation on `SambaNovaEmbeddings` features and configuration options, please refer to the [API reference](https://docs.sambanova.ai/cloud/docs/get-started/overview).
 
-**[SambaNova](https://sambanova.ai/)'s** [SambaNova Cloud](https://cloud.sambanova.ai/) is a platform for performing inference with open-source models
+**[SambaNova](https://sambanova.ai/)'s** [SambaCloud](https://cloud.sambanova.ai/) is a platform for performing inference with open-source models.
 
 ## Overview
 
@@ -12,11 +12,11 @@ This will help you get started with SambaNovaCloud embedding models using LangCh
 
 | Provider | Package |
 |:--------:|:-------:|
-| [SambaNova](/oss/integrations/providers/sambanova/) | [langchain-sambanova](https://python.langchain.com/docs/integrations/providers/sambanova/) |
+| [SambaNova](/oss/integrations/providers/sambanova/) | [langchain-sambanova](/oss/integrations/providers/sambanova/) |
 
 ## Setup
 
-To access ChatSambaNovaCloud models you will need to create a [SambaNovaCloud](https://cloud.sambanova.ai/) account, get an API key, install the `langchain_sambanova` integration package.
+To access `SambaNovaEmbeddings` models you will need to create a [SambaCloud](http://cloud.sambanova.ai?utm_source=langchain&utm_medium=external&utm_campaign=cloud_signup) account, get an API key, and install the `langchain_sambanova` integration package.
 ```bash
 pip install langchain-sambanova
 ```
@@ -24,11 +24,8 @@ pip install langchain-sambanova
 
 ### Credentials
 
-Get an API Key from [cloud.sambanova.ai](https://cloud.sambanova.ai/apis) and add it to your environment variables:
+Get an API Key from [cloud.sambanova.ai](http://cloud.sambanova.ai/apis?utm_source=langchain&utm_medium=external&utm_campaign=cloud_signup). Once you've done this, set the `SAMBANOVA_API_KEY` environment variable:
 
-``` bash
-export SAMBANOVA_API_KEY="your-api-key-here"
-```
 
 ```python
 import getpass
@@ -38,7 +35,7 @@ if not os.getenv("SAMBANOVA_API_KEY"):
     os.environ["SAMBANOVA_API_KEY"] = getpass.getpass("Enter your SambaNova API key: ")
 ```
 
-If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:
+To enable automated tracing of your model calls, set your [LangSmith](https://docs.smith.langchain.com/) API key:
 
 ```python
 os.environ["LANGSMITH_TRACING"] = "true"
@@ -58,9 +55,9 @@ pip install -qU langchain-sambanova
 
 Now we can instantiate our model object and generate chat completions:
 
 ```python
-from langchain_sambanova import SambaNovaCloudEmbeddings
+from langchain_sambanova import SambaNovaEmbeddings
 
-embeddings = SambaNovaCloudEmbeddings(
+embeddings = SambaNovaEmbeddings(
     model="E5-Mistral-7B-Instruct",
 )
 ```
@@ -122,4 +119,4 @@ for vector in two_vectors:
 
 ## API reference
 
-For detailed documentation on `SambaNovaCloud` features and configuration options, please refer to the [API reference](https://docs.sambanova.ai/cloud/docs/get-started/overview).
+For detailed documentation on `SambaNovaEmbeddings` features and configuration options, please refer to the [SambaNova Developer Guide](https://docs.sambanova.ai/cloud/docs/get-started/overview).
diff --git a/src/oss/python/integrations/text_embedding/sambastudio.mdx b/src/oss/python/integrations/text_embedding/sambastudio.mdx
deleted file mode 100644
index 6de825001..000000000
--- a/src/oss/python/integrations/text_embedding/sambastudio.mdx
+++ /dev/null
@@ -1,133 +0,0 @@
----
-title: SambaStudioEmbeddings
----
-
-This will help you get started with SambaNova's SambaStudio embedding models using LangChain. For detailed documentation on `SambaStudioEmbeddings` features and configuration options, please refer to the [API reference](https://docs.sambanova.ai/sambastudio/latest/index.html).
-
-**[SambaNova](https://sambanova.ai/)'s** [SambaStudio](https://sambanova.ai/technology/full-stack-ai-platform) is a platform for running your own open-source models
-
-## Overview
-
-### Integration details
-
-| Provider | Package |
-|:--------:|:-------:|
-| [SambaNova](/oss/integrations/providers/sambanova/) | [langchain-sambanova](https://python.langchain.com/docs/integrations/providers/sambanova/) |
-
-## Setup
-
-To access SambaStudio models you will need to [deploy an endpoint](https://docs.sambanova.ai/sambastudio/latest/language-models.html) in your SambaStudio platform, install the `langchain_sambanova` integration package.
-
-```bash
-pip install langchain-sambanova
-```
-
-### Credentials
-
-Get the URL and API Key from your SambaStudio deployed endpoint and add them to your environment variables:
-
-``` bash
-export SAMBASTUDIO_URL="sambastudio-url-key-here"
-export SAMBASTUDIO_API_KEY="your-api-key-here"
-```
-
-```python
-import getpass
-import os
-
-if not os.getenv("SAMBASTUDIO_URL"):
-    os.environ["SAMBASTUDIO_URL"] = getpass.getpass(
-        "Enter your SambaStudio endpoint URL: "
-    )
-
-if not os.getenv("SAMBASTUDIO_API_KEY"):
-    os.environ["SAMBASTUDIO_API_KEY"] = getpass.getpass(
-        "Enter your SambaStudio API key: "
-    )
-```
-
-If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:
-
-```python
-# os.environ["LANGCHAIN_TRACING_V2"] = "true"
-# os.environ["LANGCHAIN_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")
-```
-
-### Installation
-
-The LangChain SambaNova integration lives in the `langchain-sambanova` package:
-
-```python
-pip install -qU langchain-sambanova
-```
-
-## Instantiation
-
-Now we can instantiate our model object and generate chat completions:
-
-```python
-from langchain_sambanova import SambaStudioEmbeddings
-
-embeddings = SambaStudioEmbeddings(
-    model="e5-mistral-7b-instruct",
-)
-```
-
-## Indexing and Retrieval
-
-Embedding models are often used in retrieval-augmented generation (RAG) flows, both as part of indexing data as well as later retrieving it. For more detailed instructions, please see our [RAG tutorials](/oss/langchain/rag).
-
-Below, see how to index and retrieve data using the `embeddings` object we initialized above. In this example, we will index and retrieve a sample document in the `InMemoryVectorStore`.
-
-```python
-# Create a vector store with a sample text
-from langchain_core.vectorstores import InMemoryVectorStore
-
-text = "LangChain is the framework for building context-aware reasoning applications"
-
-vectorstore = InMemoryVectorStore.from_texts(
-    [text],
-    embedding=embeddings,
-)
-
-# Use the vectorstore as a retriever
-retriever = vectorstore.as_retriever()
-
-# Retrieve the most similar text
-retrieved_documents = retriever.invoke("What is LangChain?")
-
-# show the retrieved document's content
-retrieved_documents[0].page_content
-```
-
-## Direct Usage
-
-Under the hood, the vectorstore and retriever implementations are calling `embeddings.embed_documents(...)` and `embeddings.embed_query(...)` to create embeddings for the text(s) used in `from_texts` and retrieval `invoke` operations, respectively.
-
-You can directly call these methods to get embeddings for your own use cases.
-
-### Embed single texts
-
-You can embed single texts or documents with `embed_query`:
-
-```python
-single_vector = embeddings.embed_query(text)
-print(str(single_vector)[:100])  # Show the first 100 characters of the vector
-```
-
-### Embed multiple texts
-
-You can embed multiple texts with `embed_documents`:
-
-```python
-text2 = (
-    "LangGraph is a library for building stateful, multi-actor applications with LLMs"
-)
-two_vectors = embeddings.embed_documents([text, text2])
-for vector in two_vectors:
-    print(str(vector)[:100])  # Show the first 100 characters of the vector
-```
-
-## API reference
-
-For detailed documentation on `SambaStudio` features and configuration options, please refer to the [API reference](https://docs.sambanova.ai/sambastudio/latest/api-ref-landing.html).
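Reviewer aside: the tool-calling walkthrough removed from `chat/index.mdx` in this patch centers on a name-to-function dispatch loop. That pattern is framework-agnostic, so for reference here is a minimal, dependency-free sketch. The tool-call dicts mirror LangChain's `{"name", "args", "id"}` shape, but every name and signature below is illustrative, not the `langchain-sambanova` API:

```python
# Dependency-free sketch of the tool-dispatch pattern from the removed
# docs example. A real agent loop would feed the (call_id, output) pairs
# back to the model; here we only show the dispatch step.
from datetime import datetime


def get_time(kind: str = "both") -> str:
    """Return the current date, the current time, or both."""
    now = datetime.now()
    if kind == "date":
        return f"Current date: {now.strftime('%m/%d/%Y')}"
    if kind == "time":
        return f"Current time: {now.strftime('%H:%M:%S')}"
    return (
        f"Current date: {now.strftime('%m/%d/%Y')}, "
        f"Current time: {now.strftime('%H:%M:%S')}"
    )


# Registry mapping tool names (as the model emits them) to callables.
TOOLS = {"get_time": get_time}


def invoke_tools(tool_calls):
    """Run each requested tool; return (call_id, output) pairs."""
    results = []
    for call in tool_calls:
        output = TOOLS[call["name"]](**call["args"])
        results.append((call["id"], output))
    return results


if __name__ == "__main__":
    # A tool call in the same shape the removed example printed.
    calls = [{"name": "get_time", "args": {"kind": "date"}, "id": "call_0"}]
    for call_id, output in invoke_tools(calls):
        print(call_id, output)
```

The registry keeps dispatch data-driven: adding a tool means adding one dict entry, with no change to the loop itself.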