TavilySearchResults
Tavily Search is a robust search API tailored specifically for LLM Agents. It seamlessly integrates with diverse data sources to ensure a superior, relevant search experience.
This guide provides a quick overview of getting started with the Tavily search results tool. For detailed documentation of all TavilySearchResults features and configurations, head to the API reference.
Overview
Integration details
| Class | Package | PY support | Package latest |
|---|---|---|---|
| TavilySearchResults | @langchain/community | ✅ |  |
Setup
The integration lives in the @langchain/community package, which you
can install as shown below:
- npm
- yarn
- pnpm
npm i @langchain/community
yarn add @langchain/community
pnpm add @langchain/community
Credentials
Set up a Tavily API key and set it as an environment variable named TAVILY_API_KEY.
process.env.TAVILY_API_KEY = "YOUR_API_KEY";
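If you keep your keys in a local .env file instead, you can load them at startup. A minimal sketch, assuming the third-party dotenv package is installed:

// Loads variables from a local .env file (including TAVILY_API_KEY) into process.env.
// Assumes `dotenv` has been installed separately, e.g. `npm i dotenv`.
import "dotenv/config";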
It's also helpful (though not needed) to set up LangSmith for best-in-class observability:
process.env.LANGCHAIN_TRACING_V2 = "true";
process.env.LANGCHAIN_API_KEY = "your-api-key";
Instantiation
You can import and instantiate the TavilySearchResults tool like this:
import { TavilySearchResults } from "@langchain/community/tools/tavily_search";
const tool = new TavilySearchResults({
  maxResults: 2,
  // ...
});
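If you prefer not to rely on the environment variable, the tool can also be given the key at construction time. The apiKey option below is our assumption about the field name; confirm it against the API reference:

const toolWithExplicitKey = new TavilySearchResults({
  maxResults: 2,
  // Assumed constructor option for passing the key directly instead of
  // reading TAVILY_API_KEY from the environment.
  apiKey: process.env.TAVILY_API_KEY,
});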
Invocation
Invoke directly with args
You can invoke the tool directly like this:
await tool.invoke({
  input: "what is the current weather in SF?",
});
[{"title":"San Francisco, CA Current Weather | AccuWeather","url":"https://www.accuweather.com/en/us/san-francisco/94103/current-weather/347629","content":"Current weather in San Francisco, CA. Check current conditions in San Francisco, CA with radar, hourly, and more.","score":0.9428234,"raw_content":null},{"title":"National Weather Service","url":"https://forecast.weather.gov/zipcity.php?inputstring=San+Francisco,CA","content":"NOAA National Weather Service. Current conditions at SAN FRANCISCO DOWNTOWN (SFOC1) Lat: 37.77056Β°NLon: 122.42694Β°WElev: 150.0ft.","score":0.94261247,"raw_content":null}]
Invoke with ToolCall
We can also invoke the tool with a model-generated ToolCall, in which
case a ToolMessage will be returned:
// This is usually generated by a model, but we'll create a tool call directly for demo purposes.
const modelGeneratedToolCall = {
  args: {
    input: "what is the current weather in SF?",
  },
  id: "1",
  name: tool.name,
  type: "tool_call",
};
await tool.invoke(modelGeneratedToolCall);
ToolMessage {
  "content": "[{\"title\":\"Weather in San Francisco\",\"url\":\"https://www.weatherapi.com/\",\"content\":\"{'location': {'name': 'San Francisco', 'region': 'California', 'country': 'United States of America', 'lat': 37.78, 'lon': -122.42, 'tz_id': 'America/Los_Angeles', 'localtime_epoch': 1722967498, 'localtime': '2024-08-06 11:04'}, 'current': {'last_updated_epoch': 1722967200, 'last_updated': '2024-08-06 11:00', 'temp_c': 18.4, 'temp_f': 65.2, 'is_day': 1, 'condition': {'text': 'Sunny', 'icon': '//cdn.weatherapi.com/weather/64x64/day/113.png', 'code': 1000}, 'wind_mph': 2.9, 'wind_kph': 4.7, 'wind_degree': 275, 'wind_dir': 'W', 'pressure_mb': 1015.0, 'pressure_in': 29.97, 'precip_mm': 0.0, 'precip_in': 0.0, 'humidity': 64, 'cloud': 2, 'feelslike_c': 18.5, 'feelslike_f': 65.2, 'windchill_c': 18.5, 'windchill_f': 65.2, 'heatindex_c': 18.4, 'heatindex_f': 65.2, 'dewpoint_c': 11.7, 'dewpoint_f': 53.1, 'vis_km': 10.0, 'vis_miles': 6.0, 'uv': 5.0, 'gust_mph': 4.3, 'gust_kph': 7.0}}\",\"score\":0.9983156,\"raw_content\":null},{\"title\":\"Weather in San Francisco in June 2024 - Detailed Forecast\",\"url\":\"https://www.easeweather.com/north-america/united-states/california/city-and-county-of-san-francisco/san-francisco/june\",\"content\":\"Until now, June 2024 in San Francisco is slightly cooler than the historical average by -0.6 Β° C.. The forecast for June 2024 in San Francisco predicts the temperature to closely align with the historical average at 17.7 Β° C. 17.7 Β° C.\",\"score\":0.9905143,\"raw_content\":null}]",
  "name": "tavily_search_results_json",
  "additional_kwargs": {},
  "response_metadata": {},
  "tool_call_id": "1"
}
Chaining
We can use our tool in a chain by first binding it to a tool-calling model and then calling it:
Pick your chat model:
- OpenAI
- Anthropic
- FireworksAI
- MistralAI
- Groq
- VertexAI
Install dependencies
- npm
- yarn
- pnpm
npm i @langchain/openai 
yarn add @langchain/openai 
pnpm add @langchain/openai 
Add environment variables
OPENAI_API_KEY=your-api-key
Instantiate the model
import { ChatOpenAI } from "@langchain/openai";
const llm = new ChatOpenAI({
  model: "gpt-4o-mini",
  temperature: 0
});
Install dependencies
- npm
- yarn
- pnpm
npm i @langchain/anthropic 
yarn add @langchain/anthropic 
pnpm add @langchain/anthropic 
Add environment variables
ANTHROPIC_API_KEY=your-api-key
Instantiate the model
import { ChatAnthropic } from "@langchain/anthropic";
const llm = new ChatAnthropic({
  model: "claude-3-5-sonnet-20240620",
  temperature: 0
});
Install dependencies
- npm
- yarn
- pnpm
npm i @langchain/community 
yarn add @langchain/community 
pnpm add @langchain/community 
Add environment variables
FIREWORKS_API_KEY=your-api-key
Instantiate the model
import { ChatFireworks } from "@langchain/community/chat_models/fireworks";
const llm = new ChatFireworks({
  model: "accounts/fireworks/models/llama-v3p1-70b-instruct",
  temperature: 0
});
Install dependencies
- npm
- yarn
- pnpm
npm i @langchain/mistralai 
yarn add @langchain/mistralai 
pnpm add @langchain/mistralai 
Add environment variables
MISTRAL_API_KEY=your-api-key
Instantiate the model
import { ChatMistralAI } from "@langchain/mistralai";
const llm = new ChatMistralAI({
  model: "mistral-large-latest",
  temperature: 0
});
Install dependencies
- npm
- yarn
- pnpm
npm i @langchain/groq 
yarn add @langchain/groq 
pnpm add @langchain/groq 
Add environment variables
GROQ_API_KEY=your-api-key
Instantiate the model
import { ChatGroq } from "@langchain/groq";
const llm = new ChatGroq({
  model: "mixtral-8x7b-32768",
  temperature: 0
});
Install dependencies
- npm
- yarn
- pnpm
npm i @langchain/google-vertexai 
yarn add @langchain/google-vertexai 
pnpm add @langchain/google-vertexai 
Add environment variables
GOOGLE_APPLICATION_CREDENTIALS=credentials.json
Instantiate the model
import { ChatVertexAI } from "@langchain/google-vertexai";
const llm = new ChatVertexAI({
  model: "gemini-1.5-flash",
  temperature: 0
});
import { HumanMessage } from "@langchain/core/messages";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { RunnableLambda } from "@langchain/core/runnables";
const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  ["placeholder", "{messages}"],
]);
const llmWithTools = llm.bindTools([tool]);
const chain = prompt.pipe(llmWithTools);
const toolChain = RunnableLambda.from(async (userInput: string, config) => {
  const humanMessage = new HumanMessage(userInput);
  const aiMsg = await chain.invoke(
    {
      messages: [humanMessage],
    },
    config
  );
  const toolMsgs = await tool.batch(aiMsg.tool_calls, config);
  return chain.invoke(
    {
      messages: [humanMessage, aiMsg, ...toolMsgs],
    },
    config
  );
});
const toolChainResult = await toolChain.invoke(
  "what is the current weather in sf?"
);
const { tool_calls, content } = toolChainResult;
console.log(
  "AIMessage",
  JSON.stringify(
    {
      tool_calls,
      content,
    },
    null,
    2
  )
);
AIMessage {
  "tool_calls": [],
  "content": "The current weather in San Francisco is as follows:\n\n- **Condition:** Sunny\n- **Temperature:** 18.4Β°C (65.2Β°F)\n- **Wind:** 2.9 mph (4.7 kph) from the west\n- **Humidity:** 64%\n- **Visibility:** 10 km (6 miles)\n- **UV Index:** 5\n\n\n\nFor more detailed information, you can visit [WeatherAPI](https://www.weatherapi.com/)."
}
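Note that toolChain above always forwards aiMsg.tool_calls to the tool; if the model answers directly, that list will be empty and the second chain.invoke is unnecessary. A minimal sketch of a guarded variant, reusing the same names as above:

const safeToolChain = RunnableLambda.from(async (userInput: string, config) => {
  const humanMessage = new HumanMessage(userInput);
  const aiMsg = await chain.invoke({ messages: [humanMessage] }, config);

  // If the model answered without requesting the tool, return its answer as-is.
  if (!aiMsg.tool_calls?.length) {
    return aiMsg;
  }

  const toolMsgs = await tool.batch(aiMsg.tool_calls, config);
  return chain.invoke(
    { messages: [humanMessage, aiMsg, ...toolMsgs] },
    config
  );
});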
Agents
For guides on how to use LangChain tools in agents, see the LangGraph.js docs.
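As a quick taste of what that looks like, the sketch below wires the tool into LangGraph's prebuilt ReAct agent. It assumes the @langchain/langgraph package is installed and that its createReactAgent helper is available; see the LangGraph.js docs for the authoritative version:

import { createReactAgent } from "@langchain/langgraph/prebuilt";

// `llm` and `tool` are the instances created earlier in this guide.
const agent = createReactAgent({ llm, tools: [tool] });

const agentResult = await agent.invoke({
  messages: [new HumanMessage("what is the current weather in sf?")],
});

console.log(agentResult.messages.at(-1)?.content);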
API reference
For detailed documentation of all TavilySearchResults features and configurations, head to the API reference:
https://api.js.langchain.com/classes/langchain_community_tools_tavily_search.TavilySearchResults.html
Related
- Tool conceptual guide
- Tool how-to guides