Copyright 2023 Google LLC
Licensed under the Apache License, Version 2.0 (the "License"); you
may not use this file except in compliance with the License.
You may obtain a copy of the License at
https://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied. See the License for the specific language governing
permissions and limitations under the License.
Understand Better What Happens When You Run a Langchain Chain¶
Author(s): Michael W. Sherman
Last updated: 2023-10-17 (cleanup); 2023-07-30 (initial version).
Last tested on Langchain version 0.0.316.
Langchain is a popular framework for building LLM-based systems. However, some Langchain execution details are not surfaced in Langchain's verbose mode, which can complicate debugging and understanding chains.
The code snippet below implements a Langchain callbacks class called AllChainDetails, which prints out details of what's happening in each step of a chain and has optional debugging breakpoints.
AllChainDetails is primarily for educational use.
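Langchain's callback mechanism is an observer pattern: the framework invokes named hook methods on your handler at each stage of execution. As a rough, framework-free sketch of the idea (the hook names mirror Langchain's, but the run_chain function here is a hypothetical stand-in, not Langchain's actual runner):

```python
import uuid


class PrintingHandler:
    """Minimal illustration of a callback handler: each hook just prints."""

    def on_chain_start(self, inputs, run_id):
        print(f"chain {run_id} started with {inputs}")

    def on_chain_end(self, outputs, run_id):
        print(f"chain {run_id} ended with {outputs}")


def run_chain(inputs, callbacks):
    """Stand-in for a chain execution that fires callbacks at each step."""
    run_id = uuid.uuid4()
    for cb in callbacks:
        cb.on_chain_start(inputs, run_id)
    # Placeholder for the real work (prompt formatting, LLM call, ...).
    outputs = {"text": inputs["food"].upper()}
    for cb in callbacks:
        cb.on_chain_end(outputs, run_id)
    return outputs


run_chain({"food": "chocolate"}, callbacks=[PrintingHandler()])
```

AllChainDetails below works the same way, just with many more hooks and richer formatting.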
Notebook Structure¶
- Part 1 is the AllChainDetails code snippet with some instructions for use and a basic example.
- Part 2 is a more complete walkthrough for users of AllChainDetails.
Make sure to run part 1 before running part 2.
This notebook was tested in Colab.
1 - AllChainDetails
Code Snippet and Usage¶
Code¶
Install Dependencies¶
Install the dependencies, and make sure to restart the runtime after installation completes.
!pip install --user langchain==0.0.316 google-cloud-aiplatform==1.35.0 prettyprinter
Successfully installed colorful-0.5.5 dataclasses-json-0.6.1 google-cloud-aiplatform-1.35.0 jsonpatch-1.33 jsonpointer-2.4 langchain-0.0.316 langsmith-0.0.44 marshmallow-3.20.1 mypy-extensions-1.0.0 prettyprinter-0.18.0 typing-inspect-0.9.0
MAKE SURE TO RESTART YOUR RUNTIME BEFORE GOING FURTHER
The Code Snippet¶
The code in the next three cells is what you need to copy to use AllChainDetails elsewhere.
# Import dependencies.
from langchain.callbacks.base import BaseCallbackHandler
from langchain.schema import AgentAction, AgentFinish, Document, LLMResult
from prettyprinter import cpprint
from typing import Any, Dict, List, Optional, Sequence, Type, Union
from uuid import UUID
# Two helper classes for pretty output.
class Color():
"""For easier understanding and faster manipulation of printed colors."""
PURPLE = "\033[95m"
CYAN = "\033[96m"
DARKCYAN = "\033[36m"
BLUE = "\033[94m"
GREEN = "\033[92m"
YELLOW = "\033[93m"
RED = "\033[91m"
BOLD = "\033[1m"
UNDERLINE = "\033[4m"
ITALICS = "\x1B[3m"
END = "\033[0m\x1B[0m"
class OutputFormatter:
""" Helper class to control the format of printed output from the callbacks.
If used in prod, consider reimplementing in a way that removes hardcoding
of where the output is written. Maybe use Python logging and then pass a
custom configuration?
"""
def heading(text: str) -> None:
print(f"{Color.BOLD}{text}{Color.END}")
def key_info(text: str) -> None:
print(f"{Color.BOLD}{Color.DARKCYAN}{text}{Color.END}")
def key_info_labeled(label: str,
contents: str,
contents_newlined: Optional[bool] = False
) -> None:
print(f"{Color.BOLD}{Color.DARKCYAN}{label}: {Color.END}{Color.DARKCYAN}",
end="")
if contents_newlined:
contents = contents.splitlines()
cpprint(f"{contents}")
print(f"{Color.END}", end="")
def debug_info(text: str) -> None:
print(f"{Color.BLUE}{text}{Color.END}")
def debug_info_labeled(label: str,
contents: str,
contents_newlined: Optional[bool] = False
) -> None:
print(f"{Color.BOLD}{Color.BLUE}{label}: {Color.END}{Color.BLUE}",
end="")
if contents_newlined:
contents = contents.splitlines()
cpprint(f"{contents}")
print(f"{Color.END}", end="")
def llm_call(text: str) -> None:
print(f"{Color.ITALICS}{text}{Color.END}")
def llm_output(text: str) -> None:
print(f"{Color.UNDERLINE}{text}{Color.END}")
def tool_call(text: str) -> None:
print(f"{Color.ITALICS}{Color.PURPLE}{text}{Color.END}")
def tool_output(text: str) -> None:
print(f"{Color.UNDERLINE}{Color.PURPLE}{text}{Color.END}")
def debug_error(text: str) -> None:
print(f"{Color.BOLD}{Color.RED}{text}{Color.END}")
# The actual Langchain callback handler; this produces status updates during
# a Langchain execution.
class AllChainDetails(BaseCallbackHandler):
"""Outputs details of chain progress and state.
Exposes details available at callback time to each executed step in a chain.
Method arguments in this class are based on (most of) the arguments
available to each callback method, though not all implementations in this
class use all the arguments.
Usage:
Pass as an argument to a langchain method or class that accepts a callback
handler. Note that not all langchain classes will invoke all callbacks
when the callback handler is provided at initialization time, so the
recommended usage is to provide the callback handler when executing a
chain.
Example:
from langchain import LLMChain, PromptTemplate
from langchain.llms import VertexAI
import vertexai # Comes from google-cloud-aiplatform package.
vertexai.init(project=PROJECT_ID, location=REGION)
llm = VertexAI(temperature=0) # Use any LLM.
prompt_template = "What food pairs well with {food}?"
handler = AllChainDetails()
llm_chain = LLMChain(
llm=llm,
prompt=PromptTemplate.from_template(prompt_template))
llm_chain("chocolate", callbacks=[handler])
Args:
debug_mode: If True, prints more details of each chain step and activates
breakpoints (using pdb) when unexpected behavior is detected. Note that
the breakpoints are in the callbacks, which limits the amount of
inspectable langchain state to what langchain surfaces to callbacks.
out: Class for managing output, only tested with the OutputFormatter
accompanying this class.
"""
def __init__(self,
debug_mode: Optional[bool] = False,
out: Type[OutputFormatter] = OutputFormatter,
) -> None:
self.debug_mode = debug_mode
self.out = out
def on_text(self,
text: str,
color: Optional[str] = None,
end: str = "",
**kwargs: Any,) -> None:
"""Run usually (not always) when langchain creates text for an LLM call.
This callback is only used when debug_mode == True, since it can be
confusing to see the blocks of text that come from this callback on top
of the text sent to the LLM--it's much easier to understand what's going
on by only looking at text sent to an LLM.
"""
if self.debug_mode:
self.out.heading(f"\n\n> Preparing text.")
self.out.debug_info_labeled(f"Chain ID", f"{kwargs['run_id']}")
self.out.debug_info_labeled("Parent chain ID",
f"{kwargs['parent_run_id']}")
self.out.debug_info_labeled("Arguments", f"{kwargs}")
print(text) # Langchain already aggressively formats this.
def on_llm_start(self,
serialized: Dict[str, Any],
prompts: List[str],
**kwargs: Any) -> None:
"""Run when langchain calls an LLM."""
self.out.heading(f"\n\n> Sending text to the LLM.")
self.out.key_info_labeled(f"Chain ID", f"{kwargs['run_id']}")
self.out.key_info_labeled("Parent chain ID", f"{kwargs['parent_run_id']}")
if len(prompts) > 1:
self.out.debug_error("prompts has multiple items.")
self.out.debug_error("Only outputting first item in prompts.")
if self.debug_mode:
self.out.debug_info_labeled("Prompts", f"{prompts}")
breakpoint()
self.out.key_info(f"Text sent to LLM:")
self.out.llm_call(prompts[0])
if self.debug_mode:
self.out.debug_info_labeled("Arguments", f"{kwargs}")
self.out.debug_info_labeled("serialized", f"{serialized}")
def on_llm_end(self, response: LLMResult, **kwargs: Any) -> None:
"""Run after LLM response is received by langchain."""
self.out.heading(f"\n\n> Received response from LLM.")
self.out.key_info_labeled(f"Chain ID", f"{kwargs['run_id']}")
self.out.key_info_labeled("Parent chain ID", f"{kwargs['parent_run_id']}")
if len(response.generations) > 1:
self.out.debug_error("response object has multiple generations.")
self.out.debug_error("Only outputting first generation in response.")
if self.debug_mode:
self.out.debug_info_labeled("response", f"{response}")
breakpoint()
self.out.key_info(f"Text received from LLM:")
self.out.llm_output(response.generations[0][0].text)
if self.debug_mode:
self.out.debug_info_labeled("Arguments", f"{kwargs}")
self.out.debug_info_labeled("response", f"{response}")
def on_chain_start(self,
serialized: Dict[str, Any],
inputs: Dict[str, Any],
**kwargs: Any) -> None:
"""Run when a new chain (or subchain) is started."""
self.out.heading(f"\n\n> Starting new chain.")
if 'id' not in serialized.keys():
self.out.debug_error("Missing serialized['id']")
class_name = "Unknown -- serialized['id'] is missing"
if self.debug_mode:
self.out.debug_info_labeled("serialized", f"{serialized}")
breakpoint()
else:
class_name = ".".join(serialized['id'])
self.out.key_info_labeled(f"Chain class", f"{class_name}")
self.out.key_info_labeled(f"Chain ID", f"{kwargs['run_id']}")
self.out.key_info_labeled("Parent chain ID", f"{kwargs['parent_run_id']}")
if len(inputs) < 1:
self.out.debug_error("Chain inputs are empty.")
if self.debug_mode:
self.out.debug_info_labeled("inputs", f"{inputs}")
breakpoint()
else:
self.out.key_info("Iterating through keys/values of chain inputs:")
for key, value in inputs.items():
# These keys contain mostly noise.
if key not in ["stop", "agent_scratchpad"]:
self.out.key_info_labeled(f" {key}", f"{value}")
if self.debug_mode:
self.out.debug_info_labeled("Arguments", f"{kwargs}")
self.out.debug_info_labeled("inputs", f"{inputs}")
self.out.debug_info_labeled("serialized", f"{serialized}")
def on_chain_end(self, outputs: Dict[str, Any], **kwargs: Any) -> None:
"""Run when a chain completes."""
self.out.heading(f"\n\n> Ending chain.")
self.out.key_info_labeled(f"Chain ID", f"{kwargs['run_id']}")
self.out.key_info_labeled("Parent chain ID", f"{kwargs['parent_run_id']}")
if len(outputs) == 0:
self.out.debug_error("No chain outputs.")
if self.debug_mode:
self.out.debug_info_labeled("outputs", f"{outputs}")
breakpoint()
else:
outputs_keys = [*outputs.keys()]
for key in outputs_keys:
self.out.key_info_labeled(f"Output {key}",
f"{outputs[key]}",
contents_newlined=True)
if self.debug_mode:
self.out.debug_info_labeled("Arguments", f"{kwargs}")
self.out.debug_info_labeled("outputs", f"{outputs}")
def on_tool_start(self,
serialized: Dict[str, Any],
input_str: str,
**kwargs: Any,) -> None:
"""Run when making a call to a tool."""
self.out.heading(f"\n\n> Using tool.")
self.out.key_info_labeled(f"Chain ID", f"{kwargs['run_id']}")
self.out.key_info_labeled("Parent chain ID", f"{kwargs['parent_run_id']}")
self.out.key_info_labeled(f"Tool name", f"{serialized['name']}")
self.out.key_info(f"Query sent to tool:")
self.out.tool_call(input_str)
if self.debug_mode:
self.out.debug_info_labeled("Arguments", f"{kwargs}")
self.out.debug_info_labeled("serialized", f"{serialized}")
def on_tool_end(
self,
output: str,
color: Optional[str] = None,
observation_prefix: Optional[str] = None,
llm_prefix: Optional[str] = None,
**kwargs: Any,) -> None:
"""Run on response from a tool."""
self.out.heading(f"\n\n> Received tool output.")
self.out.key_info_labeled(f"Chain ID", f"{kwargs['run_id']}")
self.out.key_info_labeled("Parent chain ID", f"{kwargs['parent_run_id']}")
self.out.key_info_labeled(f"Tool name", f"{kwargs['name']}")
if "output" not in locals():
self.out.debug_error("No tool output.")
if self.debug_mode:
breakpoint()
else:
self.out.key_info("Response from tool:")
self.out.tool_output(f"{output}")
if self.debug_mode:
self.out.debug_info_labeled("Arguments", f"{kwargs}")
self.out.debug_info_labeled("observation_prefix",
f"{observation_prefix}")
self.out.debug_info_labeled("llm_prefix",
f"{llm_prefix}")
def on_agent_action(self,
action: AgentAction,
color: Optional[str] = None,
**kwargs: Any) -> Any:
"""Run when agent performs an action."""
self.out.heading(f"\n\n> Agent taking an action.")
self.out.key_info_labeled(f"Chain ID", f"{kwargs['run_id']}")
self.out.key_info_labeled("Parent chain ID", f"{kwargs['parent_run_id']}")
if not hasattr(action, "log"):
self.out.debug_error("No log in action.")
if self.debug_mode:
self.out.debug_info_labeled("action", f"{action}")
breakpoint()
else:
self.out.key_info_labeled(f"Action log",
f"{action.log}",
contents_newlined=True)
if self.debug_mode:
self.out.debug_info_labeled("Arguments", f"{kwargs}")
self.out.debug_info_labeled("action", f"{action}")
def on_agent_finish(self,
finish: AgentFinish,
color: Optional[str] = None,
**kwargs: Any) -> None:
"""Run after agent completes."""
self.out.heading(f"\n\n> Agent has finished.")
self.out.key_info_labeled(f"Chain ID", f"{kwargs['run_id']}")
self.out.key_info_labeled("Parent chain ID", f"{kwargs['parent_run_id']}")
if not hasattr(finish, "log"):
self.out.debug_error("No log in action finish.")
if self.debug_mode:
breakpoint()
else:
self.out.key_info_labeled(f"Action finish log",
f"{finish.log}",
contents_newlined=True)
if self.debug_mode:
self.out.debug_info_labeled("Arguments", f"{kwargs}")
self.out.debug_info_labeled("finish",
f"{finish}")
def on_llm_error(self,
error: Union[Exception, KeyboardInterrupt],
**kwargs: Any) -> None:
self.out.debug_error("LLM Error")
self.out.debug_info_labeled("Error object", f"{error}")
if self.debug_mode:
breakpoint()
def on_chain_error(self,
error: Union[Exception, KeyboardInterrupt],
**kwargs: Any) -> None:
self.out.debug_error("Chain Error")
self.out.debug_info_labeled("Error object", f"{error}")
if self.debug_mode:
breakpoint()
def on_tool_error(self,
error: Union[Exception, KeyboardInterrupt],
**kwargs: Any) -> None:
self.out.debug_error("Chain Error")
self.out.debug_info_labeled("Error object", f"{error}")
if self.debug_mode:
breakpoint()
def on_retriever_start(self,
serialized: Dict[str, Any],
query: str,
*,
run_id: UUID,
parent_run_id: Optional[UUID] = None,
tags: Optional[List[str]] = None,
metadata: Optional[Dict[str, Any]] = None,
**kwargs: Any) -> Any:
"""Run when querying a retriever."""
self.out.heading(f"\n\n> Querying retriever.")
self.out.key_info_labeled(f"Chain ID", f"{run_id}")
self.out.key_info_labeled("Parent chain ID", f"{parent_run_id}")
self.out.key_info_labeled("Tags", f"{tags}")
if 'id' not in serialized.keys():
self.out.debug_error("Missing serialized['id']")
class_name = "Unknown -- serialized['id'] is missing"
if self.debug_mode:
self.out.debug_info_labeled("serialized", f"{serialized}")
breakpoint()
else:
class_name = ".".join(serialized['id'])
self.out.key_info_labeled(f"Retriever class", f"{class_name}")
self.out.key_info(f"Query sent to retriever:")
self.out.tool_call(query)
if self.debug_mode:
self.out.debug_info_labeled("Arguments", f"{kwargs}")
self.out.debug_info_labeled("metadata", f"{metadata}")
self.out.debug_info_labeled("serialized", f"{serialized}")
def on_retriever_end(self,
documents: Sequence[Document],
*,
run_id: UUID,
parent_run_id: Optional[UUID] = None,
**kwargs: Any) -> Any:
"""Run when retriever returns a response."""
self.out.heading(f"\n\n> Retriever finished.")
self.out.key_info_labeled(f"Chain ID", f"{run_id}")
self.out.key_info_labeled("Parent chain ID", f"{parent_run_id}")
self.out.key_info(f"Found {len(documents)} documents.")
if len(documents) == 0:
self.out.debug_error("No documents found.")
if self.debug_mode:
breakpoint()
else:
for doc_num, doc in enumerate(documents):
self.out.key_info("---------------------------------------------------")
self.out.key_info(f"Document number {doc_num} of {len(documents)}")
self.out.key_info_labeled("Metadata", f"{doc.metadata}")
self.out.key_info("Document contents:")
self.out.tool_output(doc.page_content)
AllChainDetails Usage¶
If you're using Colab, run the code in the next cell. Follow the popups and authenticate with an account that has access to a Google Cloud project, so you can use the Vertex AI LLMs.
If you're running this notebook somewhere besides Colab, make sure your environment has the right Google Cloud access. If that's a new concept to you, consider looking into Application Default Credentials for your local environment. More authentication options are discussed here.
If you're entirely new to Google Cloud, get started.
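If you're working locally, one common way to set up Application Default Credentials is with the gcloud CLI (a sketch — the exact steps depend on your environment, and YOUR_PROJECT_ID is a placeholder):

```shell
# Authenticate your user account and store Application Default Credentials
# where Google Cloud client libraries will find them.
gcloud auth application-default login

# Point subsequent API calls at your project.
gcloud config set project YOUR_PROJECT_ID
```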
from google.colab import auth
auth.authenticate_user()
PROJECT_ID = "YOUR_PROJECT_ID_HERE" # @param {type:"string"}
LOCATION = "us-central1" # @param {type:"string"}
# Code examples may misbehave if the model is changed.
MODEL_NAME = "text-bison@001"
# Dependencies for usage example.
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
from langchain.llms import VertexAI
import vertexai # Comes from google-cloud-aiplatform package.
# Initialize connection to the Vertex AI PaLM API LLM.
vertexai.init(project=PROJECT_ID, location=LOCATION)
llm = VertexAI(model_name=MODEL_NAME, temperature=0)
You can use the AllChainDetails callback handler either when executing a chain/agent/etc. or when initializing one.
You'll generally get more complete output of langchain internals when passing the AllChainDetails callback handler at chain execution time rather than at initialization.
# Callback handler specified at execution time, more information given.
prompt_template = "What food pairs well with {food}?"
handler = AllChainDetails()
llm_chain = LLMChain(
llm=llm,
prompt=PromptTemplate.from_template(prompt_template)
)
llm_chain("chocolate", callbacks=[handler])
> Starting new chain. Chain class: 'langchain.chains.llm.LLMChain' Chain ID: 'd2b274f7-b992-4b67-a352-8968bd9efa1f' Parent chain ID: 'None' Iterating through keys/values of chain inputs: food: 'chocolate' > Sending text to the LLM. Chain ID: 'bda6c996-7750-48fc-ba05-7afe5c18933b' Parent chain ID: 'd2b274f7-b992-4b67-a352-8968bd9efa1f' Text sent to LLM: What food pairs well with chocolate? > Received response from LLM. Chain ID: 'bda6c996-7750-48fc-ba05-7afe5c18933b' Parent chain ID: 'd2b274f7-b992-4b67-a352-8968bd9efa1f' Text received from LLM: Chocolate pairs well with many foods, including fruits, nuts, and dairy products. Some popular pairings include chocolate with strawberries, chocolate with bananas, chocolate with nuts, and chocolate with cheese. > Ending chain. Chain ID: 'd2b274f7-b992-4b67-a352-8968bd9efa1f' Parent chain ID: 'None' Output text: "['Chocolate pairs well with many foods, including fruits, nuts, and " "dairy products. Some popular pairings include chocolate with " "strawberries, chocolate with bananas, chocolate with nuts, and " "chocolate with cheese.']"
{'food': 'chocolate', 'text': 'Chocolate pairs well with many foods, including fruits, nuts, and dairy products. Some popular pairings include chocolate with strawberries, chocolate with bananas, chocolate with nuts, and chocolate with cheese.'}
# Callback handler specified at initialization, less information given.
prompt_template = "What food pairs well with {food}?"
handler = AllChainDetails()
llm_chain = LLMChain(
llm=llm,
prompt=PromptTemplate.from_template(prompt_template),
callbacks=[handler])
llm_chain("chocolate")
> Starting new chain. Chain class: 'langchain.chains.llm.LLMChain' Chain ID: '9b321f38-174a-4258-99a8-25c611585553' Parent chain ID: 'None' Iterating through keys/values of chain inputs: food: 'chocolate' > Ending chain. Chain ID: '9b321f38-174a-4258-99a8-25c611585553' Parent chain ID: 'None' Output text: "['Chocolate pairs well with many foods, including fruits, nuts, and " "dairy products. Some popular pairings include chocolate with " "strawberries, chocolate with bananas, chocolate with nuts, and " "chocolate with cheese.']"
{'food': 'chocolate', 'text': 'Chocolate pairs well with many foods, including fruits, nuts, and dairy products. Some popular pairings include chocolate with strawberries, chocolate with bananas, chocolate with nuts, and chocolate with cheese.'}
Debug Mode¶
AllChainDetails has a debug mode that provides more detailed output and engages breakpoints when a chain errors or something unexpected happens.
prompt_template = "What food pairs well with {food}?"
# Turn on debug mode.
handler = AllChainDetails(debug_mode=True)
llm_chain = LLMChain(
llm=llm,
prompt=PromptTemplate.from_template(prompt_template)
)
llm_chain("chocolate", callbacks=[handler])
> Starting new chain. Chain class: 'langchain.chains.llm.LLMChain' Chain ID: '941c6dd8-1474-465f-a34b-7a31ac30c04f' Parent chain ID: 'None' Iterating through keys/values of chain inputs: food: 'chocolate' Arguments: "{'run_id': UUID('941c6dd8-1474-465f-a34b-7a31ac30c04f'), " "'parent_run_id': None, 'tags': [], 'metadata': {}, 'name': None}" inputs: "{'food': 'chocolate'}" serialized: "{'lc': 1, 'type': 'constructor', 'id': ['langchain', 'chains', " "'llm', 'LLMChain'], 'kwargs': {'llm': {'lc': 1, 'type': " "'constructor', 'id': ['langchain', 'llms', 'vertexai', 'VertexAI'], " "'kwargs': {'model_name': 'text-bison@001', 'temperature': 0.0}}, " "'prompt': {'lc': 1, 'type': 'constructor', 'id': ['langchain', " "'prompts', 'prompt', 'PromptTemplate'], 'kwargs': " "{'input_variables': ['food'], 'template': 'What food pairs well with " "{food}?', 'template_format': 'f-string', 'partial_variables': {}}}}}" > Preparing text. Chain ID: '941c6dd8-1474-465f-a34b-7a31ac30c04f' Parent chain ID: 'None' Arguments: "{'run_id': UUID('941c6dd8-1474-465f-a34b-7a31ac30c04f'), " "'parent_run_id': None, 'tags': [], 'verbose': False}" Prompt after formatting: What food pairs well with chocolate? > Sending text to the LLM. Chain ID: '0df6aa10-efde-4cb3-807e-1bdc5f870d10' Parent chain ID: '941c6dd8-1474-465f-a34b-7a31ac30c04f' Text sent to LLM: What food pairs well with chocolate? Arguments: "{'run_id': UUID('0df6aa10-efde-4cb3-807e-1bdc5f870d10'), " "'parent_run_id': UUID('941c6dd8-1474-465f-a34b-7a31ac30c04f'), " "'tags': [], 'metadata': {}, 'invocation_params': {'model_name': " "'text-bison@001', 'temperature': 0.0, 'max_output_tokens': 128, " "'top_k': 40, 'top_p': 0.95, 'candidate_count': 1, '_type': " "'vertexai', 'stop': None}, 'options': {'stop': None}, 'name': None}" serialized: "{'lc': 1, 'type': 'constructor', 'id': ['langchain', 'llms', " "'vertexai', 'VertexAI'], 'kwargs': {'model_name': 'text-bison@001', " "'temperature': 0.0}}" > Received response from LLM. 
Chain ID: '0df6aa10-efde-4cb3-807e-1bdc5f870d10' Parent chain ID: '941c6dd8-1474-465f-a34b-7a31ac30c04f' Text received from LLM: Chocolate pairs well with many foods, including fruits, nuts, and dairy products. Some popular pairings include chocolate with strawberries, chocolate with bananas, chocolate with nuts, and chocolate with cheese. Arguments: "{'run_id': UUID('0df6aa10-efde-4cb3-807e-1bdc5f870d10'), " "'parent_run_id': UUID('941c6dd8-1474-465f-a34b-7a31ac30c04f'), " "'tags': []}" response: "generations=[[GenerationChunk(text='Chocolate pairs well with many " "foods, including fruits, nuts, and dairy products. Some popular " "pairings include chocolate with strawberries, chocolate with " "bananas, chocolate with nuts, and chocolate with cheese.', " "generation_info={'is_blocked': False, 'safety_attributes': " "{'Health': 0.9}})]] llm_output=None run=None" > Ending chain. Chain ID: '941c6dd8-1474-465f-a34b-7a31ac30c04f' Parent chain ID: 'None' Output text: "['Chocolate pairs well with many foods, including fruits, nuts, and " "dairy products. Some popular pairings include chocolate with " "strawberries, chocolate with bananas, chocolate with nuts, and " "chocolate with cheese.']" Arguments: "{'run_id': UUID('941c6dd8-1474-465f-a34b-7a31ac30c04f'), " "'parent_run_id': None, 'tags': []}" outputs: "{'text': 'Chocolate pairs well with many foods, including fruits, " "nuts, and dairy products. Some popular pairings include chocolate " "with strawberries, chocolate with bananas, chocolate with nuts, and " "chocolate with cheese.'}"
{'food': 'chocolate', 'text': 'Chocolate pairs well with many foods, including fruits, nuts, and dairy products. Some popular pairings include chocolate with strawberries, chocolate with bananas, chocolate with nuts, and chocolate with cheese.'}
Tip: New to Python debugging? When execution breaks, just type 'c' and press Enter in the text box that appears at the bottom of the cell's output to continue.
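The breakpoints in the next example come from a pattern like the following minimal sketch (an illustration of the pattern, not the actual `AllChainDetails` code): the callback prints its details, then optionally pauses in the debugger so you can inspect state before continuing.

```python
# Minimal sketch of the debug_mode pattern used by AllChainDetails:
# print the callback's details, then optionally drop into the debugger.
def on_llm_error(error: BaseException, debug_mode: bool = False) -> None:
    print("LLM Error")
    print("Error object:", error)
    if debug_mode:
        breakpoint()  # execution pauses here; 'c' + Enter resumes

# With debug_mode=False the details print and execution continues.
on_llm_error(ValueError("example error"))
```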
# Temperature > 1 causes the PaLM APIs to return an error.
llm = VertexAI(model_name=MODEL_NAME, temperature=10)
prompt_template = "What food pairs well with {food}?"
handler = AllChainDetails(debug_mode=True)
llm_chain = LLMChain(
llm=llm,
prompt=PromptTemplate.from_template(prompt_template)
)
llm_chain("testing", callbacks=[handler])
> Starting new chain. Chain class: 'langchain.chains.llm.LLMChain' Chain ID: 'fd0bb489-4a20-47ae-a542-db2e4a3eb508' Parent chain ID: 'None' Iterating through keys/values of chain inputs: food: 'testing' Arguments: "{'run_id': UUID('fd0bb489-4a20-47ae-a542-db2e4a3eb508'), " "'parent_run_id': None, 'tags': [], 'metadata': {}, 'name': None}" inputs: "{'food': 'testing'}" serialized: "{'lc': 1, 'type': 'constructor', 'id': ['langchain', 'chains', " "'llm', 'LLMChain'], 'kwargs': {'llm': {'lc': 1, 'type': " "'constructor', 'id': ['langchain', 'llms', 'vertexai', 'VertexAI'], " "'kwargs': {'model_name': 'text-bison@001', 'temperature': 10.0}}, " "'prompt': {'lc': 1, 'type': 'constructor', 'id': ['langchain', " "'prompts', 'prompt', 'PromptTemplate'], 'kwargs': " "{'input_variables': ['food'], 'template': 'What food pairs well with " "{food}?', 'template_format': 'f-string', 'partial_variables': {}}}}}" > Preparing text. Chain ID: 'fd0bb489-4a20-47ae-a542-db2e4a3eb508' Parent chain ID: 'None' Arguments: "{'run_id': UUID('fd0bb489-4a20-47ae-a542-db2e4a3eb508'), " "'parent_run_id': None, 'tags': [], 'verbose': False}" Prompt after formatting: What food pairs well with testing? > Sending text to the LLM. Chain ID: '648e1db4-cc4a-4892-9718-ed974da9068a' Parent chain ID: 'fd0bb489-4a20-47ae-a542-db2e4a3eb508' Text sent to LLM: What food pairs well with testing? Arguments: "{'run_id': UUID('648e1db4-cc4a-4892-9718-ed974da9068a'), " "'parent_run_id': UUID('fd0bb489-4a20-47ae-a542-db2e4a3eb508'), " "'tags': [], 'metadata': {}, 'invocation_params': {'model_name': " "'text-bison@001', 'temperature': 10.0, 'max_output_tokens': 128, " "'top_k': 40, 'top_p': 0.95, 'candidate_count': 1, '_type': " "'vertexai', 'stop': None}, 'options': {'stop': None}, 'name': None}" serialized: "{'lc': 1, 'type': 'constructor', 'id': ['langchain', 'llms', " "'vertexai', 'VertexAI'], 'kwargs': {'model_name': 'text-bison@001', " "'temperature': 10.0}}"
PYDEV DEBUGGER WARNING: sys.settrace() should not be used when the debugger is being used. This may cause the debugger to stop working correctly. If this is needed, please check: http://pydev.blogspot.com/2007/06/why-cant-pydev-debugger-work-with.html to see how to restore the debug tracing back correctly. Call Location: File "/usr/lib/python3.10/bdb.py", line 336, in set_trace sys.settrace(self.trace_dispatch)
LLM Error Error object: '400 10.000000 is out of supported range [0, 1]; for value of ' 'temperature.' --Return-- None > <ipython-input-4-aee70565b176>(267)on_llm_error() 265 self.out.debug_info_labeled("Error object", f"{error}") 266 if self.debug_mode: --> 267 breakpoint() 268 269 def on_chain_error(self, ipdb> c
PYDEV DEBUGGER WARNING: sys.settrace() should not be used when the debugger is being used. This may cause the debugger to stop working correctly. If this is needed, please check: http://pydev.blogspot.com/2007/06/why-cant-pydev-debugger-work-with.html to see how to restore the debug tracing back correctly. Call Location: File "/usr/lib/python3.10/bdb.py", line 347, in set_continue sys.settrace(None)
Chain Error Error object: '400 10.000000 is out of supported range [0, 1]; for value of ' 'temperature.' --Return-- None > <ipython-input-4-aee70565b176>(275)on_chain_error() 273 self.out.debug_info_labeled("Error object", f"{error}") 274 if self.debug_mode: --> 275 breakpoint() 276 277 def on_tool_error(self, ipdb> c
--------------------------------------------------------------------------- _InactiveRpcError Traceback (most recent call last) /usr/local/lib/python3.10/dist-packages/google/api_core/grpc_helpers.py in error_remapped_callable(*args, **kwargs) 71 try: ---> 72 return callable_(*args, **kwargs) 73 except grpc.RpcError as exc: /usr/local/lib/python3.10/dist-packages/grpc/_channel.py in __call__(self, request, timeout, metadata, credentials, wait_for_ready, compression) 1160 ) -> 1161 return _end_unary_response_blocking(state, call, False, None) 1162 /usr/local/lib/python3.10/dist-packages/grpc/_channel.py in _end_unary_response_blocking(state, call, with_call, deadline) 1003 else: -> 1004 raise _InactiveRpcError(state) # pytype: disable=not-instantiable 1005 _InactiveRpcError: <_InactiveRpcError of RPC that terminated with: status = StatusCode.INVALID_ARGUMENT details = "10.000000 is out of supported range [0, 1]; for value of temperature." debug_error_string = "UNKNOWN:Error received from peer ipv4:74.125.141.95:443 {created_time:"2023-10-18T02:08:34.597918378+00:00", grpc_status:3, grpc_message:"10.000000 is out of supported range [0, 1]; for value of temperature."}" > The above exception was the direct cause of the following exception: InvalidArgument Traceback (most recent call last) <ipython-input-11-fc3b819d999c> in <cell line: 9>() 7 prompt=PromptTemplate.from_template(prompt_template) 8 ) ----> 9 llm_chain("testing", callbacks=[handler]) ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in __call__(self, inputs, return_only_outputs, callbacks, tags, metadata, run_name, include_run_info) 306 except BaseException as e: 307 run_manager.on_chain_error(e) --> 308 raise e 309 run_manager.on_chain_end(outputs) 310 final_outputs: Dict[str, Any] = self.prep_outputs( ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in __call__(self, inputs, return_only_outputs, callbacks, tags, metadata, run_name, include_run_info) 300 try: 301 outputs = ( --> 
302 self._call(inputs, run_manager=run_manager) 303 if new_arg_supported 304 else self._call(inputs) ~/.local/lib/python3.10/site-packages/langchain/chains/llm.py in _call(self, inputs, run_manager) 91 run_manager: Optional[CallbackManagerForChainRun] = None, 92 ) -> Dict[str, str]: ---> 93 response = self.generate([inputs], run_manager=run_manager) 94 return self.create_outputs(response)[0] 95 ~/.local/lib/python3.10/site-packages/langchain/chains/llm.py in generate(self, input_list, run_manager) 101 """Generate LLM result from inputs.""" 102 prompts, stop = self.prep_prompts(input_list, run_manager=run_manager) --> 103 return self.llm.generate_prompt( 104 prompts, 105 stop, ~/.local/lib/python3.10/site-packages/langchain/llms/base.py in generate_prompt(self, prompts, stop, callbacks, **kwargs) 495 ) -> LLMResult: 496 prompt_strings = [p.to_string() for p in prompts] --> 497 return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs) 498 499 async def agenerate_prompt( ~/.local/lib/python3.10/site-packages/langchain/llms/base.py in generate(self, prompts, stop, callbacks, tags, metadata, run_name, **kwargs) 644 ) 645 ] --> 646 output = self._generate_helper( 647 prompts, stop, run_managers, bool(new_arg_supported), **kwargs 648 ) ~/.local/lib/python3.10/site-packages/langchain/llms/base.py in _generate_helper(self, prompts, stop, run_managers, new_arg_supported, **kwargs) 532 for run_manager in run_managers: 533 run_manager.on_llm_error(e) --> 534 raise e 535 flattened_outputs = output.flatten() 536 for manager, flattened_output in zip(run_managers, flattened_outputs): ~/.local/lib/python3.10/site-packages/langchain/llms/base.py in _generate_helper(self, prompts, stop, run_managers, new_arg_supported, **kwargs) 519 try: 520 output = ( --> 521 self._generate( 522 prompts, 523 stop=stop, ~/.local/lib/python3.10/site-packages/langchain/llms/vertexai.py in _generate(self, prompts, stop, run_manager, stream, **kwargs) 296 
generations.append([generation]) 297 else: --> 298 res = completion_with_retry( 299 self, prompt, run_manager=run_manager, **params 300 ) ~/.local/lib/python3.10/site-packages/langchain/llms/vertexai.py in completion_with_retry(llm, run_manager, *args, **kwargs) 100 return llm.client.predict(*args, **kwargs) 101 --> 102 return _completion_with_retry(*args, **kwargs) 103 104 /usr/local/lib/python3.10/dist-packages/tenacity/__init__.py in wrapped_f(*args, **kw) 287 @functools.wraps(f) 288 def wrapped_f(*args: t.Any, **kw: t.Any) -> t.Any: --> 289 return self(f, *args, **kw) 290 291 def retry_with(*args: t.Any, **kwargs: t.Any) -> WrappedFn: /usr/local/lib/python3.10/dist-packages/tenacity/__init__.py in __call__(self, fn, *args, **kwargs) 377 retry_state = RetryCallState(retry_object=self, fn=fn, args=args, kwargs=kwargs) 378 while True: --> 379 do = self.iter(retry_state=retry_state) 380 if isinstance(do, DoAttempt): 381 try: /usr/local/lib/python3.10/dist-packages/tenacity/__init__.py in iter(self, retry_state) 312 is_explicit_retry = fut.failed and isinstance(fut.exception(), TryAgain) 313 if not (is_explicit_retry or self.retry(retry_state)): --> 314 return fut.result() 315 316 if self.after is not None: /usr/lib/python3.10/concurrent/futures/_base.py in result(self, timeout) 449 raise CancelledError() 450 elif self._state == FINISHED: --> 451 return self.__get_result() 452 453 self._condition.wait(timeout) /usr/lib/python3.10/concurrent/futures/_base.py in __get_result(self) 401 if self._exception: 402 try: --> 403 raise self._exception 404 finally: 405 # Break a reference cycle with the exception in self._exception /usr/local/lib/python3.10/dist-packages/tenacity/__init__.py in __call__(self, fn, *args, **kwargs) 380 if isinstance(do, DoAttempt): 381 try: --> 382 result = fn(*args, **kwargs) 383 except BaseException: # noqa: B902 384 retry_state.set_exception(sys.exc_info()) # type: ignore[arg-type] 
~/.local/lib/python3.10/site-packages/langchain/llms/vertexai.py in _completion_with_retry(*args, **kwargs) 98 @retry_decorator 99 def _completion_with_retry(*args: Any, **kwargs: Any) -> Any: --> 100 return llm.client.predict(*args, **kwargs) 101 102 return _completion_with_retry(*args, **kwargs) ~/.local/lib/python3.10/site-packages/vertexai/language_models/_language_models.py in predict(self, prompt, max_output_tokens, temperature, top_k, top_p, stop_sequences, candidate_count) 772 ) 773 --> 774 prediction_response = self._endpoint.predict( 775 instances=[prediction_request.instance], 776 parameters=prediction_request.parameters, ~/.local/lib/python3.10/site-packages/google/cloud/aiplatform/models.py in predict(self, instances, parameters, timeout, use_raw_predict) 1594 ) 1595 else: -> 1596 prediction_response = self._prediction_client.predict( 1597 endpoint=self._gca_resource.name, 1598 instances=instances, ~/.local/lib/python3.10/site-packages/google/cloud/aiplatform_v1/services/prediction_service/client.py in predict(self, request, endpoint, instances, parameters, retry, timeout, metadata) 602 603 # Send the request. --> 604 response = rpc( 605 request, 606 retry=retry, /usr/local/lib/python3.10/dist-packages/google/api_core/gapic_v1/method.py in __call__(self, timeout, retry, *args, **kwargs) 111 kwargs["metadata"] = metadata 112 --> 113 return wrapped_func(*args, **kwargs) 114 115 /usr/local/lib/python3.10/dist-packages/google/api_core/grpc_helpers.py in error_remapped_callable(*args, **kwargs) 72 return callable_(*args, **kwargs) 73 except grpc.RpcError as exc: ---> 74 raise exceptions.from_grpc_error(exc) from exc 75 76 return error_remapped_callable InvalidArgument: 400 10.000000 is out of supported range [0, 1]; for value of temperature.
2 - Introductory Walkthrough¶
AllChainDetails
is most useful with Langchain agents, since much of what happens during agent execution isn't otherwise visible.
We'll create a ReAct agent that has access to Wikipedia search and calculator tools, then observe the steps that happen during agent execution.
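As background, a ReAct agent runs a loop: the LLM emits Thought/Action/Action Input text, the framework parses it, runs the named tool, appends the tool's result as an Observation, and repeats until the LLM emits a Final Answer. A simplified, hypothetical parsing sketch (not Langchain's actual parser):

```python
import re

def react_step(llm_output: str):
    # If the LLM declares a final answer, the loop ends.
    if "Final Answer:" in llm_output:
        return ("finish", llm_output.split("Final Answer:")[-1].strip())
    # Otherwise, extract the tool name and its input from the text.
    action = re.search(r"Action: (.*)", llm_output).group(1).strip()
    action_input = re.search(r"Action Input: (.*)", llm_output).group(1).strip()
    return ("act", action, action_input)

print(react_step("Thought: I need info\nAction: Wikipedia\nAction Input: bedtime for bonzo"))
print(react_step("I now know the final answer\nFinal Answer: Ronald Reagan"))
```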
# One more dependency, no need to restart.
!pip install --user wikipedia
Collecting wikipedia Downloading wikipedia-1.4.0.tar.gz (27 kB) Preparing metadata (setup.py) ... done Requirement already satisfied: beautifulsoup4 in /usr/local/lib/python3.10/dist-packages (from wikipedia) (4.11.2) Requirement already satisfied: requests<3.0.0,>=2.0.0 in /usr/local/lib/python3.10/dist-packages (from wikipedia) (2.31.0) Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests<3.0.0,>=2.0.0->wikipedia) (3.3.0) Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests<3.0.0,>=2.0.0->wikipedia) (3.4) Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests<3.0.0,>=2.0.0->wikipedia) (2.0.6) Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.10/dist-packages (from requests<3.0.0,>=2.0.0->wikipedia) (2023.7.22) Requirement already satisfied: soupsieve>1.2 in /usr/local/lib/python3.10/dist-packages (from beautifulsoup4->wikipedia) (2.5) Building wheels for collected packages: wikipedia Building wheel for wikipedia (setup.py) ... done Created wheel for wikipedia: filename=wikipedia-1.4.0-py3-none-any.whl size=11678 sha256=8c952630d294232439cf6da1235377b20a4647dc345a3308fff5af7836c887f2 Stored in directory: /root/.cache/pip/wheels/5e/b6/c5/93f3dec388ae76edc830cb42901bb0232504dfc0df02fc50de Successfully built wikipedia Installing collected packages: wikipedia Successfully installed wikipedia-1.4.0
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.tools import WikipediaQueryRun
from langchain.utilities import WikipediaAPIWrapper
import wikipedia
llm = VertexAI(model_name=MODEL_NAME, temperature=0)
# Initialize the Wikipedia tool.
_ = WikipediaQueryRun(api_wrapper=WikipediaAPIWrapper())
# The WikipediaQueryRun call above is what matters: load_tools resolves
# the "wikipedia" tool by name, not through the variable the call's
# result was assigned to.
tools = load_tools(["wikipedia", "llm-math"], llm=llm)
handler = AllChainDetails()
agent = initialize_agent(tools,
llm,
agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION)
agent.run("What US President costarred with a chimp in 'Bedtime for Bonzo'?",
callbacks=[handler])
> Starting new chain. Chain class: 'langchain.agents.agent.AgentExecutor' Chain ID: '38181f4e-4bbc-4ee4-811f-31ccebba88aa' Parent chain ID: 'None' Iterating through keys/values of chain inputs: input: "What US President costarred with a chimp in 'Bedtime for Bonzo'?" > Starting new chain. Chain class: 'langchain.chains.llm.LLMChain' Chain ID: '39282b60-b054-4f88-9e7f-22a95472f17f' Parent chain ID: '38181f4e-4bbc-4ee4-811f-31ccebba88aa' Iterating through keys/values of chain inputs: input: "What US President costarred with a chimp in 'Bedtime for Bonzo'?" > Sending text to the LLM. Chain ID: '25d07543-4963-4014-a637-1ae78ed76171' Parent chain ID: '39282b60-b054-4f88-9e7f-22a95472f17f' Text sent to LLM: Answer the following questions as best you can. You have access to the following tools: Wikipedia: A wrapper around Wikipedia. Useful for when you need to answer general questions about people, places, companies, facts, historical events, or other subjects. Input should be a search query. Calculator: Useful for when you need to answer questions about math. Use the following format: Question: the input question you must answer Thought: you should always think about what to do Action: the action to take, should be one of [Wikipedia, Calculator] Action Input: the input to the action Observation: the result of the action ... (this Thought/Action/Action Input/Observation can repeat N times) Thought: I now know the final answer Final Answer: the final answer to the original input question Begin! Question: What US President costarred with a chimp in 'Bedtime for Bonzo'? Thought: > Received response from LLM. Chain ID: '25d07543-4963-4014-a637-1ae78ed76171' Parent chain ID: '39282b60-b054-4f88-9e7f-22a95472f17f' Text received from LLM: I need to find out what US President costarred with a chimp in 'Bedtime for Bonzo' Action: Wikipedia Action Input: bedtime for bonzo > Ending chain. 
Chain ID: '39282b60-b054-4f88-9e7f-22a95472f17f' Parent chain ID: '38181f4e-4bbc-4ee4-811f-31ccebba88aa' Output text: "[\"I need to find out what US President costarred with a chimp in " "'Bedtime for Bonzo'\", 'Action: Wikipedia', 'Action Input: bedtime " "for bonzo']" > Agent taking an action. Chain ID: '38181f4e-4bbc-4ee4-811f-31ccebba88aa' Parent chain ID: 'None' Action log: "[\"I need to find out what US President costarred with a chimp in " "'Bedtime for Bonzo'\", 'Action: Wikipedia', 'Action Input: bedtime " "for bonzo']" > Using tool. Chain ID: 'ae13987c-d276-4610-b225-e3c85810e607' Parent chain ID: '38181f4e-4bbc-4ee4-811f-31ccebba88aa' Tool name: 'Wikipedia' Query sent to tool: bedtime for bonzo
/root/.local/lib/python3.10/site-packages/wikipedia/wikipedia.py:389: GuessedAtParserWarning: No parser was explicitly specified, so I'm using the best available HTML parser for this system ("lxml"). This usually isn't a problem, but if you run this code on another system, or in a different virtual environment, it may use a different parser and behave differently. The code that caused this warning is on line 389 of the file /root/.local/lib/python3.10/site-packages/wikipedia/wikipedia.py. To get rid of this warning, pass the additional argument 'features="lxml"' to the BeautifulSoup constructor. lis = BeautifulSoup(html).find_all('li')
> Received tool output. Chain ID: 'ae13987c-d276-4610-b225-e3c85810e607' Parent chain ID: '38181f4e-4bbc-4ee4-811f-31ccebba88aa' Tool name: 'Wikipedia' Response from tool: Page: Bedtime for Bonzo Summary: Bedtime for Bonzo is a 1951 American comedy film directed by Fred de Cordova and starring Ronald Reagan, Diana Lynn, and a chimpanzee named Peggy as Bonzo. Its central character, psychology professor Peter Boyd (Reagan), tries to teach human morals to a chimpanzee, hoping to solve the "nature versus nurture" question. Boyd hires Jane Linden (Lynn) to pose as the chimpanzee's mother while he plays father to it and uses 1950s-era child-rearing techniques.A sequel was released titled Bonzo Goes to College (1952), but it featured none of the three lead performers from the original film. Peggy, who had also appeared in My Friend Irma Goes West (1950), died in a fire on March 4, 1951, so another chimpanzee was hired for the second film. Reagan did not want to appear in the second film as he thought that the premise was unbelievable. Page: Bedtime for Democracy Summary: Bedtime for Democracy is the fourth and final studio album by American punk rock band Dead Kennedys. Released in 1986, songs on this album cover common punk subjects often found in punk rock lyrics of the era such as conformity, Reaganomics, the U.S. military, and critique of the hardcore punk movement. The album's title refers to the 1951 comedy film, Bedtime for Bonzo starring Ronald Reagan and also reflects the band's weary bitterness from the trial they were undergoing at the time over the controversial art included with their previous album. By the time recording of Bedtime for Democracy had begun, the Dead Kennedys had already played what would be their last concert with Jello Biafra and announced their breakup immediately after the release of the record, whose opening track is a cover of David Allan Coe's "Take This Job and Shove It." > Starting new chain. 
Chain class: 'langchain.chains.llm.LLMChain' Chain ID: '4cc5178b-452c-48ca-8c10-d23837e3191a' Parent chain ID: '38181f4e-4bbc-4ee4-811f-31ccebba88aa' Iterating through keys/values of chain inputs: input: "What US President costarred with a chimp in 'Bedtime for Bonzo'?" > Sending text to the LLM. Chain ID: 'd55439f2-f564-4080-8781-b6363bd739ce' Parent chain ID: '4cc5178b-452c-48ca-8c10-d23837e3191a' Text sent to LLM: Answer the following questions as best you can. You have access to the following tools: Wikipedia: A wrapper around Wikipedia. Useful for when you need to answer general questions about people, places, companies, facts, historical events, or other subjects. Input should be a search query. Calculator: Useful for when you need to answer questions about math. Use the following format: Question: the input question you must answer Thought: you should always think about what to do Action: the action to take, should be one of [Wikipedia, Calculator] Action Input: the input to the action Observation: the result of the action ... (this Thought/Action/Action Input/Observation can repeat N times) Thought: I now know the final answer Final Answer: the final answer to the original input question Begin! Question: What US President costarred with a chimp in 'Bedtime for Bonzo'? Thought:I need to find out what US President costarred with a chimp in 'Bedtime for Bonzo' Action: Wikipedia Action Input: bedtime for bonzo Observation: Page: Bedtime for Bonzo Summary: Bedtime for Bonzo is a 1951 American comedy film directed by Fred de Cordova and starring Ronald Reagan, Diana Lynn, and a chimpanzee named Peggy as Bonzo. Its central character, psychology professor Peter Boyd (Reagan), tries to teach human morals to a chimpanzee, hoping to solve the "nature versus nurture" question. 
Boyd hires Jane Linden (Lynn) to pose as the chimpanzee's mother while he plays father to it and uses 1950s-era child-rearing techniques.A sequel was released titled Bonzo Goes to College (1952), but it featured none of the three lead performers from the original film. Peggy, who had also appeared in My Friend Irma Goes West (1950), died in a fire on March 4, 1951, so another chimpanzee was hired for the second film. Reagan did not want to appear in the second film as he thought that the premise was unbelievable. Page: Bedtime for Democracy Summary: Bedtime for Democracy is the fourth and final studio album by American punk rock band Dead Kennedys. Released in 1986, songs on this album cover common punk subjects often found in punk rock lyrics of the era such as conformity, Reaganomics, the U.S. military, and critique of the hardcore punk movement. The album's title refers to the 1951 comedy film, Bedtime for Bonzo starring Ronald Reagan and also reflects the band's weary bitterness from the trial they were undergoing at the time over the controversial art included with their previous album. By the time recording of Bedtime for Democracy had begun, the Dead Kennedys had already played what would be their last concert with Jello Biafra and announced their breakup immediately after the release of the record, whose opening track is a cover of David Allan Coe's "Take This Job and Shove It." Thought: > Received response from LLM. Chain ID: 'd55439f2-f564-4080-8781-b6363bd739ce' Parent chain ID: '4cc5178b-452c-48ca-8c10-d23837e3191a' Text received from LLM: I now know the final answer Final Answer: Ronald Reagan > Ending chain. Chain ID: '4cc5178b-452c-48ca-8c10-d23837e3191a' Parent chain ID: '38181f4e-4bbc-4ee4-811f-31ccebba88aa' Output text: "['I now know the final answer', 'Final Answer: Ronald Reagan']" > Agent has finished. 
Chain ID: '38181f4e-4bbc-4ee4-811f-31ccebba88aa' Parent chain ID: 'None' Action finish log: "['I now know the final answer', 'Final Answer: Ronald Reagan']" > Ending chain. Chain ID: '38181f4e-4bbc-4ee4-811f-31ccebba88aa' Parent chain ID: 'None' Output output: "['Ronald Reagan']"
'Ronald Reagan'
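The Chain ID and Parent chain ID values printed above form a call tree. This small sketch reconstructs the hierarchy from the (run_id, parent_run_id) pairs in the output (IDs truncated for readability):

```python
# (run_id, parent_run_id, label) triples taken from the output above.
runs = [
    ("38181f4e", None, "AgentExecutor"),
    ("39282b60", "38181f4e", "LLMChain (first Thought/Action)"),
    ("25d07543", "39282b60", "LLM call"),
    ("ae13987c", "38181f4e", "Wikipedia tool"),
    ("4cc5178b", "38181f4e", "LLMChain (final answer)"),
    ("d55439f2", "4cc5178b", "LLM call"),
]

def children_of(parent):
    return [(rid, label) for rid, pid, label in runs if pid == parent]

def print_tree(parent=None, depth=0):
    # Print each run indented under its parent.
    for rid, label in children_of(parent):
        print("  " * depth + f"{label} [{rid}]")
        print_tree(rid, depth + 1)

print_tree()
```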
It's now possible to follow the complete set of calls to the LLM and track all the calls out to Wikipedia. Compare this to Langchain's built-in verbose mode, which doesn't print the complete LLM prompts.
agent = initialize_agent(tools,
llm,
agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
verbose=True)
agent.run("What US President costarred with a chimp in 'Bedtime for Bonzo'?")
> Entering new AgentExecutor chain... I need to find out what US President costarred with a chimp in 'Bedtime for Bonzo' Action: Wikipedia Action Input: bedtime for bonzo
/root/.local/lib/python3.10/site-packages/wikipedia/wikipedia.py:389: GuessedAtParserWarning: No parser was explicitly specified, so I'm using the best available HTML parser for this system ("lxml"). This usually isn't a problem, but if you run this code on another system, or in a different virtual environment, it may use a different parser and behave differently. The code that caused this warning is on line 389 of the file /root/.local/lib/python3.10/site-packages/wikipedia/wikipedia.py. To get rid of this warning, pass the additional argument 'features="lxml"' to the BeautifulSoup constructor. lis = BeautifulSoup(html).find_all('li')
Observation: Page: Bedtime for Bonzo Summary: Bedtime for Bonzo is a 1951 American comedy film directed by Fred de Cordova and starring Ronald Reagan, Diana Lynn, and a chimpanzee named Peggy as Bonzo. Its central character, psychology professor Peter Boyd (Reagan), tries to teach human morals to a chimpanzee, hoping to solve the "nature versus nurture" question. Boyd hires Jane Linden (Lynn) to pose as the chimpanzee's mother while he plays father to it and uses 1950s-era child-rearing techniques.A sequel was released titled Bonzo Goes to College (1952), but it featured none of the three lead performers from the original film. Peggy, who had also appeared in My Friend Irma Goes West (1950), died in a fire on March 4, 1951, so another chimpanzee was hired for the second film. Reagan did not want to appear in the second film as he thought that the premise was unbelievable. Page: Bedtime for Democracy Summary: Bedtime for Democracy is the fourth and final studio album by American punk rock band Dead Kennedys. Released in 1986, songs on this album cover common punk subjects often found in punk rock lyrics of the era such as conformity, Reaganomics, the U.S. military, and critique of the hardcore punk movement. The album's title refers to the 1951 comedy film, Bedtime for Bonzo starring Ronald Reagan and also reflects the band's weary bitterness from the trial they were undergoing at the time over the controversial art included with their previous album. By the time recording of Bedtime for Democracy had begun, the Dead Kennedys had already played what would be their last concert with Jello Biafra and announced their breakup immediately after the release of the record, whose opening track is a cover of David Allan Coe's "Take This Job and Shove It." Thought:I now know the final answer Final Answer: Ronald Reagan > Finished chain.
'Ronald Reagan'
Verbose mode can also hide important details. Can you tell the cause of this failure?
agent = initialize_agent(tools,
llm,
agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
verbose=True)
agent.run("What day of the week was September 1st, 2010?")
> Entering new AgentExecutor chain... I need to know what day of the week September 1st, 2010 was Action: Calculator Action Input: 1 September 2010
--------------------------------------------------------------------------- ValueError Traceback (most recent call last) ~/.local/lib/python3.10/site-packages/langchain/chains/llm_math/base.py in _evaluate_expression(self, expression) 87 output = str( ---> 88 numexpr.evaluate( 89 expression.strip(), /usr/local/lib/python3.10/dist-packages/numexpr/necompiler.py in evaluate(ex, local_dict, global_dict, out, order, casting, sanitize, _frame_depth, **kwargs) 974 else: --> 975 raise e 976 /usr/local/lib/python3.10/dist-packages/numexpr/necompiler.py in validate(ex, local_dict, global_dict, out, order, casting, _frame_depth, sanitize, **kwargs) 871 if expr_key not in _names_cache: --> 872 _names_cache[expr_key] = getExprNames(ex, context, sanitize=sanitize) 873 names, ex_uses_vml = _names_cache[expr_key] /usr/local/lib/python3.10/dist-packages/numexpr/necompiler.py in getExprNames(text, context, sanitize) 720 def getExprNames(text, context, sanitize: bool=True): --> 721 ex = stringToExpression(text, {}, context, sanitize) 722 ast = expressionToAST(ex) /usr/local/lib/python3.10/dist-packages/numexpr/necompiler.py in stringToExpression(s, types, context, sanitize) 280 if _blacklist_re.search(no_whitespace) is not None: --> 281 raise ValueError(f'Expression {s} has forbidden control characters.') 282 ValueError: Expression datetime.datetime(2010, 9, 1) has forbidden control characters. 
During handling of the above exception, another exception occurred: ValueError Traceback (most recent call last) <ipython-input-17-2d9482acf43f> in <cell line: 5>() 3 agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, 4 verbose=True) ----> 5 agent.run("What day of the week was September 1st, 2010?") ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in run(self, callbacks, tags, metadata, *args, **kwargs) 501 if len(args) != 1: 502 raise ValueError("`run` supports only one positional argument.") --> 503 return self(args[0], callbacks=callbacks, tags=tags, metadata=metadata)[ 504 _output_key 505 ] ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in __call__(self, inputs, return_only_outputs, callbacks, tags, metadata, run_name, include_run_info) 306 except BaseException as e: 307 run_manager.on_chain_error(e) --> 308 raise e 309 run_manager.on_chain_end(outputs) 310 final_outputs: Dict[str, Any] = self.prep_outputs( ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in __call__(self, inputs, return_only_outputs, callbacks, tags, metadata, run_name, include_run_info) 300 try: 301 outputs = ( --> 302 self._call(inputs, run_manager=run_manager) 303 if new_arg_supported 304 else self._call(inputs) ~/.local/lib/python3.10/site-packages/langchain/agents/agent.py in _call(self, inputs, run_manager) 1139 # We now enter the agent loop (until it returns something). 
1140 while self._should_continue(iterations, time_elapsed): -> 1141 next_step_output = self._take_next_step( 1142 name_to_tool_map, 1143 color_mapping, ~/.local/lib/python3.10/site-packages/langchain/agents/agent.py in _take_next_step(self, name_to_tool_map, color_mapping, inputs, intermediate_steps, run_manager) 989 tool_run_kwargs["llm_prefix"] = "" 990 # We then call the tool on the tool input to get an observation --> 991 observation = tool.run( 992 agent_action.tool_input, 993 verbose=self.verbose, ~/.local/lib/python3.10/site-packages/langchain/tools/base.py in run(self, tool_input, verbose, start_color, color, callbacks, tags, metadata, run_name, **kwargs) 362 except (Exception, KeyboardInterrupt) as e: 363 run_manager.on_tool_error(e) --> 364 raise e 365 else: 366 run_manager.on_tool_end( ~/.local/lib/python3.10/site-packages/langchain/tools/base.py in run(self, tool_input, verbose, start_color, color, callbacks, tags, metadata, run_name, **kwargs) 334 tool_args, tool_kwargs = self._to_args_and_kwargs(parsed_input) 335 observation = ( --> 336 self._run(*tool_args, run_manager=run_manager, **tool_kwargs) 337 if new_arg_supported 338 else self._run(*tool_args, **tool_kwargs) ~/.local/lib/python3.10/site-packages/langchain/tools/base.py in _run(self, run_manager, *args, **kwargs) 507 new_argument_supported = signature(self.func).parameters.get("callbacks") 508 return ( --> 509 self.func( 510 *args, 511 callbacks=run_manager.get_child() if run_manager else None, ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in run(self, callbacks, tags, metadata, *args, **kwargs) 501 if len(args) != 1: 502 raise ValueError("`run` supports only one positional argument.") --> 503 return self(args[0], callbacks=callbacks, tags=tags, metadata=metadata)[ 504 _output_key 505 ] ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in __call__(self, inputs, return_only_outputs, callbacks, tags, metadata, run_name, include_run_info) 306 except BaseException as 
e: 307 run_manager.on_chain_error(e) --> 308 raise e 309 run_manager.on_chain_end(outputs) 310 final_outputs: Dict[str, Any] = self.prep_outputs( ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in __call__(self, inputs, return_only_outputs, callbacks, tags, metadata, run_name, include_run_info) 300 try: 301 outputs = ( --> 302 self._call(inputs, run_manager=run_manager) 303 if new_arg_supported 304 else self._call(inputs) ~/.local/lib/python3.10/site-packages/langchain/chains/llm_math/base.py in _call(self, inputs, run_manager) 155 callbacks=_run_manager.get_child(), 156 ) --> 157 return self._process_llm_result(llm_output, _run_manager) 158 159 async def _acall( ~/.local/lib/python3.10/site-packages/langchain/chains/llm_math/base.py in _process_llm_result(self, llm_output, run_manager) 109 if text_match: 110 expression = text_match.group(1) --> 111 output = self._evaluate_expression(expression) 112 run_manager.on_text("\nAnswer: ", verbose=self.verbose) 113 run_manager.on_text(output, color="yellow", verbose=self.verbose) ~/.local/lib/python3.10/site-packages/langchain/chains/llm_math/base.py in _evaluate_expression(self, expression) 93 ) 94 except Exception as e: ---> 95 raise ValueError( 96 f'LLMMathChain._evaluate("{expression}") raised error: {e}.' 97 " Please try again with a valid numerical expression" ValueError: LLMMathChain._evaluate(" datetime.datetime(2010, 9, 1) ") raised error: Expression datetime.datetime(2010, 9, 1) has forbidden control characters.. Please try again with a valid numerical expression
When you can see the full call Langchain makes to the LLM for the math tool, it's easier to see that the failure happens because the LLM returns a response to the math-tool prompt that `numexpr` can't parse:
agent = initialize_agent(tools,
                         llm,
                         agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
                         verbose=False)
agent.run("What day of the week was September 1st, 2010?",
          callbacks=[handler])
> Starting new chain. Chain class: 'langchain.agents.agent.AgentExecutor' Chain ID: '7aa29d2a-f2d3-49bb-ba4e-934115d996a8' Parent chain ID: 'None' Iterating through keys/values of chain inputs: input: 'What day of the week was September 1st, 2010?' > Starting new chain. Chain class: 'langchain.chains.llm.LLMChain' Chain ID: '23739bfa-fd3b-40b2-a499-64a06d9e7959' Parent chain ID: '7aa29d2a-f2d3-49bb-ba4e-934115d996a8' Iterating through keys/values of chain inputs: input: 'What day of the week was September 1st, 2010?' > Sending text to the LLM. Chain ID: '7cb565f7-481f-40c1-9b2c-d1562aa71aac' Parent chain ID: '23739bfa-fd3b-40b2-a499-64a06d9e7959' Text sent to LLM: Answer the following questions as best you can. You have access to the following tools: Wikipedia: A wrapper around Wikipedia. Useful for when you need to answer general questions about people, places, companies, facts, historical events, or other subjects. Input should be a search query. Calculator: Useful for when you need to answer questions about math. Use the following format: Question: the input question you must answer Thought: you should always think about what to do Action: the action to take, should be one of [Wikipedia, Calculator] Action Input: the input to the action Observation: the result of the action ... (this Thought/Action/Action Input/Observation can repeat N times) Thought: I now know the final answer Final Answer: the final answer to the original input question Begin! Question: What day of the week was September 1st, 2010? Thought: > Received response from LLM. Chain ID: '7cb565f7-481f-40c1-9b2c-d1562aa71aac' Parent chain ID: '23739bfa-fd3b-40b2-a499-64a06d9e7959' Text received from LLM: I need to know what day of the week September 1st, 2010 was Action: Calculator Action Input: 1 September 2010 > Ending chain. 
Chain ID: '23739bfa-fd3b-40b2-a499-64a06d9e7959' Parent chain ID: '7aa29d2a-f2d3-49bb-ba4e-934115d996a8' Output text: "['I need to know what day of the week September 1st, 2010 was', " "'Action: Calculator', 'Action Input: 1 September 2010']" > Agent taking an action. Chain ID: '7aa29d2a-f2d3-49bb-ba4e-934115d996a8' Parent chain ID: 'None' Action log: "['I need to know what day of the week September 1st, 2010 was', " "'Action: Calculator', 'Action Input: 1 September 2010']" > Using tool. Chain ID: '18c858d3-34fc-43d6-9209-acf8d2eea920' Parent chain ID: '7aa29d2a-f2d3-49bb-ba4e-934115d996a8' Tool name: 'Calculator' Query sent to tool: 1 September 2010 > Starting new chain. Chain class: 'langchain.chains.llm_math.base.LLMMathChain' Chain ID: 'b13dea64-ee04-472a-abe1-dcc7f889e1cf' Parent chain ID: '18c858d3-34fc-43d6-9209-acf8d2eea920' Iterating through keys/values of chain inputs: question: '1 September 2010' > Starting new chain. Chain class: 'langchain.chains.llm.LLMChain' Chain ID: 'e0bbab40-9571-4a76-a45b-c7d84079b9fb' Parent chain ID: 'b13dea64-ee04-472a-abe1-dcc7f889e1cf' Iterating through keys/values of chain inputs: question: '1 September 2010' > Sending text to the LLM. Chain ID: 'c512f093-0fdf-4661-8e6e-9ef08995a13e' Parent chain ID: 'e0bbab40-9571-4a76-a45b-c7d84079b9fb' Text sent to LLM: Translate a math problem into a expression that can be executed using Python's numexpr library. Use the output of running this code to answer the question. Question: ${Question with math problem.} ```text ${single line mathematical expression that solves the problem} ``` ...numexpr.evaluate(text)... ```output ${Output of running the code} ``` Answer: ${Answer} Begin. Question: What is 37593 * 67? ```text 37593 * 67 ``` ...numexpr.evaluate("37593 * 67")... ```output 2518731 ``` Answer: 2518731 Question: 37593^(1/5) ```text 37593**(1/5) ``` ...numexpr.evaluate("37593**(1/5)")... 
```output 8.222831614237718 ``` Answer: 8.222831614237718 Question: 1 September 2010 > Received response from LLM. Chain ID: 'c512f093-0fdf-4661-8e6e-9ef08995a13e' Parent chain ID: 'e0bbab40-9571-4a76-a45b-c7d84079b9fb' Text received from LLM: ```text datetime.datetime(2010, 9, 1) ``` ...numexpr.evaluate("datetime.datetime(2010, 9, 1)")... > Ending chain. Chain ID: 'e0bbab40-9571-4a76-a45b-c7d84079b9fb' Parent chain ID: 'b13dea64-ee04-472a-abe1-dcc7f889e1cf' Output text: "['```text', 'datetime.datetime(2010, 9, 1)', '```', " "'...numexpr.evaluate(\"datetime.datetime(2010, 9, 1)\")...']" Chain Error Error object: 'LLMMathChain._evaluate("\ndatetime.datetime(2010, 9, 1)\n") raised ' 'error: Expression datetime.datetime(2010, 9, 1) has forbidden ' 'control characters.. Please try again with a valid numerical ' 'expression' Chain Error Error object: 'LLMMathChain._evaluate("\ndatetime.datetime(2010, 9, 1)\n") raised ' 'error: Expression datetime.datetime(2010, 9, 1) has forbidden ' 'control characters.. Please try again with a valid numerical ' 'expression' Chain Error Error object: 'LLMMathChain._evaluate("\ndatetime.datetime(2010, 9, 1)\n") raised ' 'error: Expression datetime.datetime(2010, 9, 1) has forbidden ' 'control characters.. Please try again with a valid numerical ' 'expression'
--------------------------------------------------------------------------- ValueError Traceback (most recent call last) ~/.local/lib/python3.10/site-packages/langchain/chains/llm_math/base.py in _evaluate_expression(self, expression) 87 output = str( ---> 88 numexpr.evaluate( 89 expression.strip(), /usr/local/lib/python3.10/dist-packages/numexpr/necompiler.py in evaluate(ex, local_dict, global_dict, out, order, casting, sanitize, _frame_depth, **kwargs) 974 else: --> 975 raise e 976 /usr/local/lib/python3.10/dist-packages/numexpr/necompiler.py in validate(ex, local_dict, global_dict, out, order, casting, _frame_depth, sanitize, **kwargs) 871 if expr_key not in _names_cache: --> 872 _names_cache[expr_key] = getExprNames(ex, context, sanitize=sanitize) 873 names, ex_uses_vml = _names_cache[expr_key] /usr/local/lib/python3.10/dist-packages/numexpr/necompiler.py in getExprNames(text, context, sanitize) 720 def getExprNames(text, context, sanitize: bool=True): --> 721 ex = stringToExpression(text, {}, context, sanitize) 722 ast = expressionToAST(ex) /usr/local/lib/python3.10/dist-packages/numexpr/necompiler.py in stringToExpression(s, types, context, sanitize) 280 if _blacklist_re.search(no_whitespace) is not None: --> 281 raise ValueError(f'Expression {s} has forbidden control characters.') 282 ValueError: Expression datetime.datetime(2010, 9, 1) has forbidden control characters. 
During handling of the above exception, another exception occurred: ValueError Traceback (most recent call last) <ipython-input-18-dab86b905e64> in <cell line: 5>() 3 agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, 4 verbose=False) ----> 5 agent.run("What day of the week was September 1st, 2010?", 6 callbacks=[handler]) ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in run(self, callbacks, tags, metadata, *args, **kwargs) 501 if len(args) != 1: 502 raise ValueError("`run` supports only one positional argument.") --> 503 return self(args[0], callbacks=callbacks, tags=tags, metadata=metadata)[ 504 _output_key 505 ] ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in __call__(self, inputs, return_only_outputs, callbacks, tags, metadata, run_name, include_run_info) 306 except BaseException as e: 307 run_manager.on_chain_error(e) --> 308 raise e 309 run_manager.on_chain_end(outputs) 310 final_outputs: Dict[str, Any] = self.prep_outputs( ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in __call__(self, inputs, return_only_outputs, callbacks, tags, metadata, run_name, include_run_info) 300 try: 301 outputs = ( --> 302 self._call(inputs, run_manager=run_manager) 303 if new_arg_supported 304 else self._call(inputs) ~/.local/lib/python3.10/site-packages/langchain/agents/agent.py in _call(self, inputs, run_manager) 1139 # We now enter the agent loop (until it returns something). 
1140 while self._should_continue(iterations, time_elapsed): -> 1141 next_step_output = self._take_next_step( 1142 name_to_tool_map, 1143 color_mapping, ~/.local/lib/python3.10/site-packages/langchain/agents/agent.py in _take_next_step(self, name_to_tool_map, color_mapping, inputs, intermediate_steps, run_manager) 989 tool_run_kwargs["llm_prefix"] = "" 990 # We then call the tool on the tool input to get an observation --> 991 observation = tool.run( 992 agent_action.tool_input, 993 verbose=self.verbose, ~/.local/lib/python3.10/site-packages/langchain/tools/base.py in run(self, tool_input, verbose, start_color, color, callbacks, tags, metadata, run_name, **kwargs) 362 except (Exception, KeyboardInterrupt) as e: 363 run_manager.on_tool_error(e) --> 364 raise e 365 else: 366 run_manager.on_tool_end( ~/.local/lib/python3.10/site-packages/langchain/tools/base.py in run(self, tool_input, verbose, start_color, color, callbacks, tags, metadata, run_name, **kwargs) 334 tool_args, tool_kwargs = self._to_args_and_kwargs(parsed_input) 335 observation = ( --> 336 self._run(*tool_args, run_manager=run_manager, **tool_kwargs) 337 if new_arg_supported 338 else self._run(*tool_args, **tool_kwargs) ~/.local/lib/python3.10/site-packages/langchain/tools/base.py in _run(self, run_manager, *args, **kwargs) 507 new_argument_supported = signature(self.func).parameters.get("callbacks") 508 return ( --> 509 self.func( 510 *args, 511 callbacks=run_manager.get_child() if run_manager else None, ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in run(self, callbacks, tags, metadata, *args, **kwargs) 501 if len(args) != 1: 502 raise ValueError("`run` supports only one positional argument.") --> 503 return self(args[0], callbacks=callbacks, tags=tags, metadata=metadata)[ 504 _output_key 505 ] ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in __call__(self, inputs, return_only_outputs, callbacks, tags, metadata, run_name, include_run_info) 306 except BaseException as 
e: 307 run_manager.on_chain_error(e) --> 308 raise e 309 run_manager.on_chain_end(outputs) 310 final_outputs: Dict[str, Any] = self.prep_outputs( ~/.local/lib/python3.10/site-packages/langchain/chains/base.py in __call__(self, inputs, return_only_outputs, callbacks, tags, metadata, run_name, include_run_info) 300 try: 301 outputs = ( --> 302 self._call(inputs, run_manager=run_manager) 303 if new_arg_supported 304 else self._call(inputs) ~/.local/lib/python3.10/site-packages/langchain/chains/llm_math/base.py in _call(self, inputs, run_manager) 155 callbacks=_run_manager.get_child(), 156 ) --> 157 return self._process_llm_result(llm_output, _run_manager) 158 159 async def _acall( ~/.local/lib/python3.10/site-packages/langchain/chains/llm_math/base.py in _process_llm_result(self, llm_output, run_manager) 109 if text_match: 110 expression = text_match.group(1) --> 111 output = self._evaluate_expression(expression) 112 run_manager.on_text("\nAnswer: ", verbose=self.verbose) 113 run_manager.on_text(output, color="yellow", verbose=self.verbose) ~/.local/lib/python3.10/site-packages/langchain/chains/llm_math/base.py in _evaluate_expression(self, expression) 93 ) 94 except Exception as e: ---> 95 raise ValueError( 96 f'LLMMathChain._evaluate("{expression}") raised error: {e}.' 97 " Please try again with a valid numerical expression" ValueError: LLMMathChain._evaluate(" datetime.datetime(2010, 9, 1) ") raised error: Expression datetime.datetime(2010, 9, 1) has forbidden control characters.. Please try again with a valid numerical expression
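The handler output makes the root cause clear: `numexpr` evaluates only numerical expressions, so the `datetime.datetime(...)` call the LLM produced can never succeed. One way to sidestep this class of failure (a sketch, not part of this notebook's code) is to answer date questions with a purpose-built function backed by Python's standard library, which a custom Langchain `Tool` could wrap, rather than routing them through the numexpr-based Calculator:

```python
import datetime

def day_of_week(date_string: str) -> str:
    """Return the weekday name for an ISO-formatted date (YYYY-MM-DD).

    A plain-Python computation that a custom Langchain Tool could wrap,
    avoiding the numexpr-based Calculator entirely for date questions.
    """
    date = datetime.date.fromisoformat(date_string)
    return date.strftime("%A")

print(day_of_week("2010-09-01"))  # Wednesday
```

Wrapped in a `Tool(...)` like the docstore tools later in this notebook, a function like this would let the agent answer the question directly instead of asking the Calculator to evaluate an expression numexpr forbids.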
It can be difficult to see how different built-in Langchain agent types prompt the LLM differently, even with `verbose=True`.
The first example here uses the same agent setup as above, with Wikipedia and Calculator tools:
question = "What TV show inspired the saying 'Jumping the Shark'?"
agent = initialize_agent(tools,
                         llm,
                         agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
                         verbose=True)
agent.run(question)
> Entering new AgentExecutor chain... I need to know what TV show inspired the saying 'Jumping the Shark' Action: Wikipedia Action Input: jumping the shark Observation: Page: Jumping the shark Summary: The idiom "jumping the shark" or "jump the shark" is a pejorative that is used to argue that a creative work or entity has reached a point in which it has exhausted its core intent and is introducing new ideas that are discordant with, or an overexaggeration of, its original purpose. The phrase was coined in 1985 by radio personality Jon Hein in response to a 1977 episode from the fifth season of the American sitcom Happy Days, in which the character of Fonzie (Henry Winkler) jumps over a live shark while on water-skis. Page: Jump the Shark (The X-Files) Summary: "Jump the Shark" is the fifteenth episode of the ninth season of the American science fiction television series The X-Files. The episode first aired in the United States on April 21, 2002 on the Fox network. It was written by executive producers Vince Gilligan, John Shiban and Frank Spotnitz, and directed by Cliff Bole. The episode is a "monster-of-the-week" episode—unconnected to the series' wider mythology—and was created to give closure for The Lone Gunmen television series, which was a spin-off of The X-Files. The episode earned a Nielsen rating of 5.1 and was viewed by 8.6 million viewers. The episode received mixed to negative reviews from television critics. The show centers on FBI special agents who work on cases linked to the paranormal, called X-Files; this season focuses on the investigations of John Doggett (Robert Patrick), Monica Reyes (Annabeth Gish), and Dana Scully (Gillian Anderson). In this episode, Doggett and Reyes attempt to locate a female friend of The Lone Gunmen after former Area 51 Man-in-Black Morris Fletcher appears and claims that she is actually a super-soldier. 
What Doggett and Reyes soon discover is a bizarre plot to unleash a biological weapon via the use of grafted shark organs. "Jump the Shark" features the death of The Lone Gunmen—popular recurring characters who first appeared in the first season episode "E.B.E.", although this plot was later retconned in the comic book series The X-Files Season 10. The episode proved difficult to make because, after the cancellation of The Lone Gunmen television series, Fox was adamant that the characters not have a featured role back on The X-Files. (The characters did appear in four previous season 9 episodes, but always very briefly.) The choice to kill off the trio was controversial. Writers Spotnitz and Gilligan later revealed some regret with the way the episode was handled. However, actors Bruce Harwood and Dean Haglund were happy with the way the episode ended. The episode title is a humorous reference to the phrase "jumping the shark", which is used to describe shows that are in decline and therefore try a gimmick to get attention. Page: Ted McGinley Summary: Ted McGinley (born May 30, 1958) is an American actor. He is known for his roles as Jefferson D'Arcy on the television sitcom Married... with Children and as Charley Shanowski on the ABC sitcom Hope & Faith. He was a late regular on Happy Days, Dynasty and The Love Boat and is known for playing the villainous role of Stan Gable in the film Revenge of the Nerds and several made-for-television sequels. Thought:I now know the final answer Final Answer: Happy Days > Finished chain.
'Happy Days'
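Under the hood, the ZERO_SHOT_REACT_DESCRIPTION agent parses the LLM's free-text response (visible in the verbose output above) to decide which tool to call. A rough sketch of that parsing step, using a hypothetical regex that simplifies what Langchain's agent output parser actually does:

```python
import re

def parse_react_step(llm_text: str):
    """Extract the tool name and tool input from a ReAct-style response.

    A simplified sketch of the agent's output parsing; the real
    Langchain parser handles many more edge cases.
    """
    match = re.search(r"Action:\s*(.*?)\nAction Input:\s*(.*)", llm_text)
    if match is None:
        return None  # e.g. the response contains a Final Answer instead
    return match.group(1).strip(), match.group(2).strip()

response = ("I need to know what TV show inspired the saying "
            "'Jumping the Shark'\n"
            "Action: Wikipedia\n"
            "Action Input: jumping the shark")
print(parse_react_step(response))  # ('Wikipedia', 'jumping the shark')
```

This is why the exact prompt format matters so much: if the LLM's response drifts from the Thought/Action/Action Input template, the agent can't extract a tool call.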
The same query, sent to a docstore agent:
from langchain.agents import Tool
from langchain.agents.react.base import DocstoreExplorer
from langchain import Wikipedia

docstore = DocstoreExplorer(Wikipedia())
doc_tools = [
    Tool(name="Search",
         func=docstore.search,
         description="useful for when you need to ask with search"),
    Tool(name="Lookup",
         func=docstore.lookup,
         description="useful for when you need to ask with lookup"),
]
doc_agent = initialize_agent(doc_tools,
                             llm,
                             agent=AgentType.REACT_DOCSTORE,
                             verbose=True)
doc_agent({"input": question})
> Entering new AgentExecutor chain... Thought: I need to search "Jumping the Shark" and find the TV show that inspired the saying. Action: Search[Jumping the Shark] Observation: The idiom "jumping the shark" or "jump the shark" is a pejorative that is used to argue that a creative work or entity has reached a point in which it has exhausted its core intent and is introducing new ideas that are discordant with, or an overexaggeration of, its original purpose. The phrase was coined in 1985 by radio personality Jon Hein in response to a 1977 episode from the fifth season of the American sitcom Happy Days, in which the character of Fonzie (Henry Winkler) jumps over a live shark while on water-skis. Thought:The idiom "jumping the shark" was coined in 1985 by radio personality Jon Hein in response to a 1977 episode from the fifth season of the American sitcom Happy Days, in which the character of Fonzie (Henry Winkler) jumps over a live shark while on water-skis. So the TV show that inspired the saying is Happy Days. Action: Finish[Happy Days] > Finished chain.
{'input': "What TV show inspired the saying 'Jumping the Shark'?", 'output': 'Happy Days'}
Running these with the `AllChainDetails` callback handler shows much more clearly how the two agent types prompt the LLM differently.
agent.run(question, callbacks=[handler])
> Starting new chain. Chain class: 'langchain.agents.agent.AgentExecutor' Chain ID: '11f5945c-1cb0-4dae-87b7-5faee4d7f554' Parent chain ID: 'None' Iterating through keys/values of chain inputs: input: "What TV show inspired the saying 'Jumping the Shark'?" > Entering new AgentExecutor chain... > Starting new chain. Chain class: 'langchain.chains.llm.LLMChain' Chain ID: 'e186a09b-6d92-4ff8-ae38-53fb0fdcdcbd' Parent chain ID: '11f5945c-1cb0-4dae-87b7-5faee4d7f554' Iterating through keys/values of chain inputs: input: "What TV show inspired the saying 'Jumping the Shark'?" > Sending text to the LLM. Chain ID: 'ce757510-a537-446f-9c53-b492e273d406' Parent chain ID: 'e186a09b-6d92-4ff8-ae38-53fb0fdcdcbd' Text sent to LLM: Answer the following questions as best you can. You have access to the following tools: Wikipedia: A wrapper around Wikipedia. Useful for when you need to answer general questions about people, places, companies, facts, historical events, or other subjects. Input should be a search query. Calculator: Useful for when you need to answer questions about math. Use the following format: Question: the input question you must answer Thought: you should always think about what to do Action: the action to take, should be one of [Wikipedia, Calculator] Action Input: the input to the action Observation: the result of the action ... (this Thought/Action/Action Input/Observation can repeat N times) Thought: I now know the final answer Final Answer: the final answer to the original input question Begin! Question: What TV show inspired the saying 'Jumping the Shark'? Thought: > Received response from LLM. Chain ID: 'ce757510-a537-446f-9c53-b492e273d406' Parent chain ID: 'e186a09b-6d92-4ff8-ae38-53fb0fdcdcbd' Text received from LLM: I need to know what TV show inspired the saying 'Jumping the Shark' Action: Wikipedia Action Input: jumping the shark > Ending chain. 
Chain ID: 'e186a09b-6d92-4ff8-ae38-53fb0fdcdcbd' Parent chain ID: '11f5945c-1cb0-4dae-87b7-5faee4d7f554' Output text: "[\"I need to know what TV show inspired the saying 'Jumping the " "Shark'\", 'Action: Wikipedia', 'Action Input: jumping the shark']" > Agent taking an action. Chain ID: '11f5945c-1cb0-4dae-87b7-5faee4d7f554' Parent chain ID: 'None' Action log: "[\"I need to know what TV show inspired the saying 'Jumping the " "Shark'\", 'Action: Wikipedia', 'Action Input: jumping the shark']" I need to know what TV show inspired the saying 'Jumping the Shark' Action: Wikipedia Action Input: jumping the shark > Using tool. Chain ID: '594b7d5a-6460-4904-8871-6a04841d7b8c' Parent chain ID: '11f5945c-1cb0-4dae-87b7-5faee4d7f554' Tool name: 'Wikipedia' Query sent to tool: jumping the shark > Received tool output. Chain ID: '594b7d5a-6460-4904-8871-6a04841d7b8c' Parent chain ID: '11f5945c-1cb0-4dae-87b7-5faee4d7f554' Tool name: 'Wikipedia' Response from tool: Page: Jumping the shark Summary: The idiom "jumping the shark" or "jump the shark" is a pejorative that is used to argue that a creative work or entity has reached a point in which it has exhausted its core intent and is introducing new ideas that are discordant with, or an overexaggeration of, its original purpose. The phrase was coined in 1985 by radio personality Jon Hein in response to a 1977 episode from the fifth season of the American sitcom Happy Days, in which the character of Fonzie (Henry Winkler) jumps over a live shark while on water-skis. Page: Jump the Shark (The X-Files) Summary: "Jump the Shark" is the fifteenth episode of the ninth season of the American science fiction television series The X-Files. The episode first aired in the United States on April 21, 2002 on the Fox network. It was written by executive producers Vince Gilligan, John Shiban and Frank Spotnitz, and directed by Cliff Bole. 
The episode is a "monster-of-the-week" episode—unconnected to the series' wider mythology—and was created to give closure for The Lone Gunmen television series, which was a spin-off of The X-Files. The episode earned a Nielsen rating of 5.1 and was viewed by 8.6 million viewers. The episode received mixed to negative reviews from television critics. The show centers on FBI special agents who work on cases linked to the paranormal, called X-Files; this season focuses on the investigations of John Doggett (Robert Patrick), Monica Reyes (Annabeth Gish), and Dana Scully (Gillian Anderson). In this episode, Doggett and Reyes attempt to locate a female friend of The Lone Gunmen after former Area 51 Man-in-Black Morris Fletcher appears and claims that she is actually a super-soldier. What Doggett and Reyes soon discover is a bizarre plot to unleash a biological weapon via the use of grafted shark organs. "Jump the Shark" features the death of The Lone Gunmen—popular recurring characters who first appeared in the first season episode "E.B.E.", although this plot was later retconned in the comic book series The X-Files Season 10. The episode proved difficult to make because, after the cancellation of The Lone Gunmen television series, Fox was adamant that the characters not have a featured role back on The X-Files. (The characters did appear in four previous season 9 episodes, but always very briefly.) The choice to kill off the trio was controversial. Writers Spotnitz and Gilligan later revealed some regret with the way the episode was handled. However, actors Bruce Harwood and Dean Haglund were happy with the way the episode ended. The episode title is a humorous reference to the phrase "jumping the shark", which is used to describe shows that are in decline and therefore try a gimmick to get attention. Page: Ted McGinley Summary: Ted McGinley (born May 30, 1958) is an American actor. He is known for his roles as Jefferson D'Arcy on the television sitcom Married... 
with Children and as Charley Shanowski on the ABC sitcom Hope & Faith. He was a late regular on Happy Days, Dynasty and The Love Boat and is known for playing the villainous role of Stan Gable in the film Revenge of the Nerds and several made-for-television sequels. Observation: Page: Jumping the shark Summary: The idiom "jumping the shark" or "jump the shark" is a pejorative that is used to argue that a creative work or entity has reached a point in which it has exhausted its core intent and is introducing new ideas that are discordant with, or an overexaggeration of, its original purpose. The phrase was coined in 1985 by radio personality Jon Hein in response to a 1977 episode from the fifth season of the American sitcom Happy Days, in which the character of Fonzie (Henry Winkler) jumps over a live shark while on water-skis. Page: Jump the Shark (The X-Files) Summary: "Jump the Shark" is the fifteenth episode of the ninth season of the American science fiction television series The X-Files. The episode first aired in the United States on April 21, 2002 on the Fox network. It was written by executive producers Vince Gilligan, John Shiban and Frank Spotnitz, and directed by Cliff Bole. The episode is a "monster-of-the-week" episode—unconnected to the series' wider mythology—and was created to give closure for The Lone Gunmen television series, which was a spin-off of The X-Files. The episode earned a Nielsen rating of 5.1 and was viewed by 8.6 million viewers. The episode received mixed to negative reviews from television critics. The show centers on FBI special agents who work on cases linked to the paranormal, called X-Files; this season focuses on the investigations of John Doggett (Robert Patrick), Monica Reyes (Annabeth Gish), and Dana Scully (Gillian Anderson). In this episode, Doggett and Reyes attempt to locate a female friend of The Lone Gunmen after former Area 51 Man-in-Black Morris Fletcher appears and claims that she is actually a super-soldier. 
What Doggett and Reyes soon discover is a bizarre plot to unleash a biological weapon via the use of grafted shark organs. "Jump the Shark" features the death of The Lone Gunmen—popular recurring characters who first appeared in the first season episode "E.B.E.", although this plot was later retconned in the comic book series The X-Files Season 10. The episode proved difficult to make because, after the cancellation of The Lone Gunmen television series, Fox was adamant that the characters not have a featured role back on The X-Files. (The characters did appear in four previous season 9 episodes, but always very briefly.) The choice to kill off the trio was controversial. Writers Spotnitz and Gilligan later revealed some regret with the way the episode was handled. However, actors Bruce Harwood and Dean Haglund were happy with the way the episode ended. The episode title is a humorous reference to the phrase "jumping the shark", which is used to describe shows that are in decline and therefore try a gimmick to get attention. Page: Ted McGinley Summary: Ted McGinley (born May 30, 1958) is an American actor. He is known for his roles as Jefferson D'Arcy on the television sitcom Married... with Children and as Charley Shanowski on the ABC sitcom Hope & Faith. He was a late regular on Happy Days, Dynasty and The Love Boat and is known for playing the villainous role of Stan Gable in the film Revenge of the Nerds and several made-for-television sequels. Thought: > Starting new chain. Chain class: 'langchain.chains.llm.LLMChain' Chain ID: '34efa8d4-2c48-4a41-9aae-3115426c90b2' Parent chain ID: '11f5945c-1cb0-4dae-87b7-5faee4d7f554' Iterating through keys/values of chain inputs: input: "What TV show inspired the saying 'Jumping the Shark'?" > Sending text to the LLM. Chain ID: '0a50f0e5-b267-4e36-b18a-60555ce78181' Parent chain ID: '34efa8d4-2c48-4a41-9aae-3115426c90b2' Text sent to LLM: Answer the following questions as best you can. 
You have access to the following tools: Wikipedia: A wrapper around Wikipedia. Useful for when you need to answer general questions about people, places, companies, facts, historical events, or other subjects. Input should be a search query. Calculator: Useful for when you need to answer questions about math. Use the following format: Question: the input question you must answer Thought: you should always think about what to do Action: the action to take, should be one of [Wikipedia, Calculator] Action Input: the input to the action Observation: the result of the action ... (this Thought/Action/Action Input/Observation can repeat N times) Thought: I now know the final answer Final Answer: the final answer to the original input question Begin! Question: What TV show inspired the saying 'Jumping the Shark'? Thought:I need to know what TV show inspired the saying 'Jumping the Shark' Action: Wikipedia Action Input: jumping the shark Observation: Page: Jumping the shark Summary: The idiom "jumping the shark" or "jump the shark" is a pejorative that is used to argue that a creative work or entity has reached a point in which it has exhausted its core intent and is introducing new ideas that are discordant with, or an overexaggeration of, its original purpose. The phrase was coined in 1985 by radio personality Jon Hein in response to a 1977 episode from the fifth season of the American sitcom Happy Days, in which the character of Fonzie (Henry Winkler) jumps over a live shark while on water-skis. Page: Jump the Shark (The X-Files) Summary: "Jump the Shark" is the fifteenth episode of the ninth season of the American science fiction television series The X-Files. The episode first aired in the United States on April 21, 2002 on the Fox network. It was written by executive producers Vince Gilligan, John Shiban and Frank Spotnitz, and directed by Cliff Bole. 
The episode is a "monster-of-the-week" episode—unconnected to the series' wider mythology—and was created to give closure for The Lone Gunmen television series, which was a spin-off of The X-Files. The episode earned a Nielsen rating of 5.1 and was viewed by 8.6 million viewers. The episode received mixed to negative reviews from television critics. The show centers on FBI special agents who work on cases linked to the paranormal, called X-Files; this season focuses on the investigations of John Doggett (Robert Patrick), Monica Reyes (Annabeth Gish), and Dana Scully (Gillian Anderson). In this episode, Doggett and Reyes attempt to locate a female friend of The Lone Gunmen after former Area 51 Man-in-Black Morris Fletcher appears and claims that she is actually a super-soldier. What Doggett and Reyes soon discover is a bizarre plot to unleash a biological weapon via the use of grafted shark organs. "Jump the Shark" features the death of The Lone Gunmen—popular recurring characters who first appeared in the first season episode "E.B.E.", although this plot was later retconned in the comic book series The X-Files Season 10. The episode proved difficult to make because, after the cancellation of The Lone Gunmen television series, Fox was adamant that the characters not have a featured role back on The X-Files. (The characters did appear in four previous season 9 episodes, but always very briefly.) The choice to kill off the trio was controversial. Writers Spotnitz and Gilligan later revealed some regret with the way the episode was handled. However, actors Bruce Harwood and Dean Haglund were happy with the way the episode ended. The episode title is a humorous reference to the phrase "jumping the shark", which is used to describe shows that are in decline and therefore try a gimmick to get attention. Page: Ted McGinley Summary: Ted McGinley (born May 30, 1958) is an American actor. He is known for his roles as Jefferson D'Arcy on the television sitcom Married... 
with Children and as Charley Shanowski on the ABC sitcom Hope & Faith. He was a late regular on Happy Days, Dynasty and The Love Boat and is known for playing the villainous role of Stan Gable in the film Revenge of the Nerds and several made-for-television sequels. Thought: > Received response from LLM. Chain ID: '0a50f0e5-b267-4e36-b18a-60555ce78181' Parent chain ID: '34efa8d4-2c48-4a41-9aae-3115426c90b2' Text received from LLM: I now know the final answer Final Answer: Happy Days > Ending chain. Chain ID: '34efa8d4-2c48-4a41-9aae-3115426c90b2' Parent chain ID: '11f5945c-1cb0-4dae-87b7-5faee4d7f554' Output text: "['I now know the final answer', 'Final Answer: Happy Days']" > Agent has finished. Chain ID: '11f5945c-1cb0-4dae-87b7-5faee4d7f554' Parent chain ID: 'None' Action finish log: "['I now know the final answer', 'Final Answer: Happy Days']" I now know the final answer Final Answer: Happy Days > Ending chain. Chain ID: '11f5945c-1cb0-4dae-87b7-5faee4d7f554' Parent chain ID: 'None' Output output: "['Happy Days']" > Finished chain.
'Happy Days'
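Notice how every event in the trace above carries a Chain ID and a Parent chain ID: nested chains form a tree rooted at the top-level run (whose parent is `None`), which is how `AllChainDetails` can tell which LLM call belongs to which chain. A minimal, hypothetical sketch of that bookkeeping in plain Python (the `RunTracker` class and its `start`/`end` methods are illustrative, not LangChain APIs):

```python
import uuid

class RunTracker:
    """Tracks nested runs the way a callbacks handler sees them:
    every event carries a run ID and the ID of its parent run."""

    def __init__(self):
        self.parents = {}  # run_id -> parent run_id (None at the root)
        self.stack = []    # currently open runs, outermost first

    def start(self, name):
        run_id = str(uuid.uuid4())
        parent = self.stack[-1] if self.stack else None
        self.parents[run_id] = parent
        self.stack.append(run_id)
        print(f"> Starting '{name}'. Chain ID: {run_id!r} "
              f"Parent chain ID: {parent!r}")
        return run_id

    def end(self):
        run_id = self.stack.pop()
        print(f"> Ending chain. Chain ID: {run_id!r}")
        return run_id

tracker = RunTracker()
agent = tracker.start("AgentExecutor")  # top-level run, parent is None
llm = tracker.start("LLMChain")         # nested run, parent is the agent
tracker.end()                           # the LLMChain finishes first...
tracker.end()                           # ...then the AgentExecutor
```

In the real trace the same pattern appears at three levels: the `AgentExecutor` run is the root, each `LLMChain` run is its child, and each individual LLM call is a child of an `LLMChain` run.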
doc_agent({"input": question}, callbacks=[handler])
> Starting new chain. Chain class: 'langchain.agents.agent.AgentExecutor' Chain ID: 'faa0c084-047f-4b9e-8cfe-7dedff07ee6e' Parent chain ID: 'None' Iterating through keys/values of chain inputs: input: "What TV show inspired the saying 'Jumping the Shark'?" > Entering new AgentExecutor chain... > Starting new chain. Chain class: 'langchain.chains.llm.LLMChain' Chain ID: '99d20ccf-685c-49db-8264-4a5a29ca8281' Parent chain ID: 'faa0c084-047f-4b9e-8cfe-7dedff07ee6e' Iterating through keys/values of chain inputs: input: "What TV show inspired the saying 'Jumping the Shark'?" > Sending text to the LLM. Chain ID: 'a6bfe585-eadc-41ab-9259-0e3c172aac43' Parent chain ID: '99d20ccf-685c-49db-8264-4a5a29ca8281' Text sent to LLM: Question: What is the elevation range for the area that the eastern sector of the Colorado orogeny extends into? Thought: I need to search Colorado orogeny, find the area that the eastern sector of the Colorado orogeny extends into, then find the elevation range of the area. Action: Search[Colorado orogeny] Observation: The Colorado orogeny was an episode of mountain building (an orogeny) in Colorado and surrounding areas. Thought: It does not mention the eastern sector. So I need to look up eastern sector. Action: Lookup[eastern sector] Observation: (Result 1 / 1) The eastern sector extends into the High Plains and is called the Central Plains orogeny. Thought: The eastern sector of Colorado orogeny extends into the High Plains. So I need to search High Plains and find its elevation range. Action: Search[High Plains] Observation: High Plains refers to one of two distinct land regions Thought: I need to instead search High Plains (United States). Action: Search[High Plains (United States)] Observation: The High Plains are a subregion of the Great Plains. 
From east to west, the High Plains rise in elevation from around 1,800 to 7,000 ft (550 to 2,130 m).[3] Thought: High Plains rise in elevation from around 1,800 to 7,000 ft, so the answer is 1,800 to 7,000 ft. Action: Finish[1,800 to 7,000 ft] Question: Musician and satirist Allie Goertz wrote a song about the "The Simpsons" character Milhouse, who Matt Groening named after who? Thought: The question simplifies to "The Simpsons" character Milhouse is named after who. I only need to search Milhouse and find who it is named after. Action: Search[Milhouse] Observation: Milhouse Mussolini Van Houten is a recurring character in the Fox animated television series The Simpsons voiced by Pamela Hayden and created by Matt Groening. Thought: The paragraph does not tell who Milhouse is named after, maybe I can look up "named after". Action: Lookup[named after] Observation: (Result 1 / 1) Milhouse was named after U.S. president Richard Nixon, whose middle name was Milhous. Thought: Milhouse was named after U.S. president Richard Nixon, so the answer is Richard Nixon. Action: Finish[Richard Nixon] Question: Which documentary is about Finnish rock groups, Adam Clayton Powell or The Saimaa Gesture? Thought: I need to search Adam Clayton Powell and The Saimaa Gesture, and find which documentary is about Finnish rock groups. Action: Search[Adam Clayton Powell] Observation: Could not find [Adam Clayton Powell]. Similar: [’Adam Clayton Powell III’, ’Seventh Avenue (Manhattan)’, ’Adam Clayton Powell Jr. State Office Building’, ’Isabel Washington Powell’, ’Adam Powell’, ’Adam Clayton Powell (film)’, ’Giancarlo Esposito’]. Thought: To find the documentary, I can search Adam Clayton Powell (film). Action: Search[Adam Clayton Powell (film)] Observation: Adam Clayton Powell is a 1989 American documentary film directed by Richard Kilberg. 
The film is about the rise and fall of influential African-American politician Adam Clayton Powell Jr.[3][4] It was later aired as part of the PBS series The American Experience. Thought: Adam Clayton Powell (film) is a documentary about an African-American politician, not Finnish rock groups. So the documentary about Finnish rock groups must instead be The Saimaa Gesture. Action: Finish[The Saimaa Gesture] Question: What profession does Nicholas Ray and Elia Kazan have in common? Thought: I need to search Nicholas Ray and Elia Kazan, find their professions, then find the profession they have in common. Action: Search[Nicholas Ray] Observation: Nicholas Ray (born Raymond Nicholas Kienzle Jr., August 7, 1911 - June 16, 1979) was an American film director, screenwriter, and actor best known for the 1955 film Rebel Without a Cause. Thought: Professions of Nicholas Ray are director, screenwriter, and actor. I need to search Elia Kazan next and find his professions. Action: Search[Elia Kazan] Observation: Elia Kazan was an American film and theatre director, producer, screenwriter and actor. Thought: Professions of Elia Kazan are director, producer, screenwriter, and actor. So profession Nicholas Ray and Elia Kazan have in common is director, screenwriter, and actor. Action: Finish[director, screenwriter, actor] Question: Which magazine was started first Arthur’s Magazine or First for Women? Thought: I need to search Arthur’s Magazine and First for Women, and find which was started first. Action: Search[Arthur’s Magazine] Observation: Arthur’s Magazine (1844-1846) was an American literary periodical published in Philadelphia in the 19th century. Thought: Arthur’s Magazine was started in 1844. I need to search First for Women next. Action: Search[First for Women] Observation: First for Women is a woman’s magazine published by Bauer Media Group in the USA.[1] The magazine was started in 1989. Thought: First for Women was started in 1989. 
1844 (Arthur’s Magazine) < 1989 (First for Women), so Arthur’s Magazine was started first. Action: Finish[Arthur’s Magazine] Question: Were Pavel Urysohn and Leonid Levin known for the same type of work? Thought: I need to search Pavel Urysohn and Leonid Levin, find their types of work, then find if they are the same. Action: Search[Pavel Urysohn] Observation: Pavel Samuilovich Urysohn (February 3, 1898 - August 17, 1924) was a Soviet mathematician who is best known for his contributions in dimension theory. Thought: Pavel Urysohn is a mathematician. I need to search Leonid Levin next and find its type of work. Action: Search[Leonid Levin] Observation: Leonid Anatolievich Levin is a Soviet-American mathematician and computer scientist. Thought: Leonid Levin is a mathematician and computer scientist. So Pavel Urysohn and Leonid Levin have the same type of work. Action: Finish[yes] Question: What TV show inspired the saying 'Jumping the Shark'? > Received response from LLM. Chain ID: 'a6bfe585-eadc-41ab-9259-0e3c172aac43' Parent chain ID: '99d20ccf-685c-49db-8264-4a5a29ca8281' Text received from LLM: Thought: I need to search "Jumping the Shark" and find the TV show that inspired the saying. Action: Search[Jumping the Shark] > Ending chain. Chain ID: '99d20ccf-685c-49db-8264-4a5a29ca8281' Parent chain ID: 'faa0c084-047f-4b9e-8cfe-7dedff07ee6e' Output text: "['Thought: I need to search \"Jumping the Shark\" and find the TV " "show that inspired the saying.', 'Action: Search[Jumping the " "Shark]']" > Agent taking an action. Chain ID: 'faa0c084-047f-4b9e-8cfe-7dedff07ee6e' Parent chain ID: 'None' Action log: "['Thought: I need to search \"Jumping the Shark\" and find the TV " "show that inspired the saying.', 'Action: Search[Jumping the " "Shark]']" Thought: I need to search "Jumping the Shark" and find the TV show that inspired the saying. Action: Search[Jumping the Shark] > Using tool. 
Chain ID: '54d2a1a9-2e19-4b5e-8d5a-10f1f0baa30f' Parent chain ID: 'faa0c084-047f-4b9e-8cfe-7dedff07ee6e' Tool name: 'Search' Query sent to tool: Jumping the Shark > Received tool output. Chain ID: '54d2a1a9-2e19-4b5e-8d5a-10f1f0baa30f' Parent chain ID: 'faa0c084-047f-4b9e-8cfe-7dedff07ee6e' Tool name: 'Search' Response from tool: The idiom "jumping the shark" or "jump the shark" is a pejorative that is used to argue that a creative work or entity has reached a point in which it has exhausted its core intent and is introducing new ideas that are discordant with, or an overexaggeration of, its original purpose. The phrase was coined in 1985 by radio personality Jon Hein in response to a 1977 episode from the fifth season of the American sitcom Happy Days, in which the character of Fonzie (Henry Winkler) jumps over a live shark while on water-skis. Observation: The idiom "jumping the shark" or "jump the shark" is a pejorative that is used to argue that a creative work or entity has reached a point in which it has exhausted its core intent and is introducing new ideas that are discordant with, or an overexaggeration of, its original purpose. The phrase was coined in 1985 by radio personality Jon Hein in response to a 1977 episode from the fifth season of the American sitcom Happy Days, in which the character of Fonzie (Henry Winkler) jumps over a live shark while on water-skis. Thought: > Starting new chain. Chain class: 'langchain.chains.llm.LLMChain' Chain ID: '900cd209-1caa-49bb-87a7-4d1a1317290e' Parent chain ID: 'faa0c084-047f-4b9e-8cfe-7dedff07ee6e' Iterating through keys/values of chain inputs: input: "What TV show inspired the saying 'Jumping the Shark'?" > Sending text to the LLM. Chain ID: 'b4876461-98de-4a95-bf8d-3340932f421e' Parent chain ID: '900cd209-1caa-49bb-87a7-4d1a1317290e' Text sent to LLM: Question: What is the elevation range for the area that the eastern sector of the Colorado orogeny extends into? 
Thought: I need to search Colorado orogeny, find the area that the eastern sector of the Colorado orogeny extends into, then find the elevation range of the area. Action: Search[Colorado orogeny] Observation: The Colorado orogeny was an episode of mountain building (an orogeny) in Colorado and surrounding areas. Thought: It does not mention the eastern sector. So I need to look up eastern sector. Action: Lookup[eastern sector] Observation: (Result 1 / 1) The eastern sector extends into the High Plains and is called the Central Plains orogeny. Thought: The eastern sector of Colorado orogeny extends into the High Plains. So I need to search High Plains and find its elevation range. Action: Search[High Plains] Observation: High Plains refers to one of two distinct land regions Thought: I need to instead search High Plains (United States). Action: Search[High Plains (United States)] Observation: The High Plains are a subregion of the Great Plains. From east to west, the High Plains rise in elevation from around 1,800 to 7,000 ft (550 to 2,130 m).[3] Thought: High Plains rise in elevation from around 1,800 to 7,000 ft, so the answer is 1,800 to 7,000 ft. Action: Finish[1,800 to 7,000 ft] Question: Musician and satirist Allie Goertz wrote a song about the "The Simpsons" character Milhouse, who Matt Groening named after who? Thought: The question simplifies to "The Simpsons" character Milhouse is named after who. I only need to search Milhouse and find who it is named after. Action: Search[Milhouse] Observation: Milhouse Mussolini Van Houten is a recurring character in the Fox animated television series The Simpsons voiced by Pamela Hayden and created by Matt Groening. Thought: The paragraph does not tell who Milhouse is named after, maybe I can look up "named after". Action: Lookup[named after] Observation: (Result 1 / 1) Milhouse was named after U.S. president Richard Nixon, whose middle name was Milhous. Thought: Milhouse was named after U.S. 
president Richard Nixon, so the answer is Richard Nixon. Action: Finish[Richard Nixon] Question: Which documentary is about Finnish rock groups, Adam Clayton Powell or The Saimaa Gesture? Thought: I need to search Adam Clayton Powell and The Saimaa Gesture, and find which documentary is about Finnish rock groups. Action: Search[Adam Clayton Powell] Observation: Could not find [Adam Clayton Powell]. Similar: [’Adam Clayton Powell III’, ’Seventh Avenue (Manhattan)’, ’Adam Clayton Powell Jr. State Office Building’, ’Isabel Washington Powell’, ’Adam Powell’, ’Adam Clayton Powell (film)’, ’Giancarlo Esposito’]. Thought: To find the documentary, I can search Adam Clayton Powell (film). Action: Search[Adam Clayton Powell (film)] Observation: Adam Clayton Powell is a 1989 American documentary film directed by Richard Kilberg. The film is about the rise and fall of influential African-American politician Adam Clayton Powell Jr.[3][4] It was later aired as part of the PBS series The American Experience. Thought: Adam Clayton Powell (film) is a documentary about an African-American politician, not Finnish rock groups. So the documentary about Finnish rock groups must instead be The Saimaa Gesture. Action: Finish[The Saimaa Gesture] Question: What profession does Nicholas Ray and Elia Kazan have in common? Thought: I need to search Nicholas Ray and Elia Kazan, find their professions, then find the profession they have in common. Action: Search[Nicholas Ray] Observation: Nicholas Ray (born Raymond Nicholas Kienzle Jr., August 7, 1911 - June 16, 1979) was an American film director, screenwriter, and actor best known for the 1955 film Rebel Without a Cause. Thought: Professions of Nicholas Ray are director, screenwriter, and actor. I need to search Elia Kazan next and find his professions. Action: Search[Elia Kazan] Observation: Elia Kazan was an American film and theatre director, producer, screenwriter and actor. 
Thought: Professions of Elia Kazan are director, producer, screenwriter, and actor. So profession Nicholas Ray and Elia Kazan have in common is director, screenwriter, and actor. Action: Finish[director, screenwriter, actor] Question: Which magazine was started first Arthur’s Magazine or First for Women? Thought: I need to search Arthur’s Magazine and First for Women, and find which was started first. Action: Search[Arthur’s Magazine] Observation: Arthur’s Magazine (1844-1846) was an American literary periodical published in Philadelphia in the 19th century. Thought: Arthur’s Magazine was started in 1844. I need to search First for Women next. Action: Search[First for Women] Observation: First for Women is a woman’s magazine published by Bauer Media Group in the USA.[1] The magazine was started in 1989. Thought: First for Women was started in 1989. 1844 (Arthur’s Magazine) < 1989 (First for Women), so Arthur’s Magazine was started first. Action: Finish[Arthur’s Magazine] Question: Were Pavel Urysohn and Leonid Levin known for the same type of work? Thought: I need to search Pavel Urysohn and Leonid Levin, find their types of work, then find if they are the same. Action: Search[Pavel Urysohn] Observation: Pavel Samuilovich Urysohn (February 3, 1898 - August 17, 1924) was a Soviet mathematician who is best known for his contributions in dimension theory. Thought: Pavel Urysohn is a mathematician. I need to search Leonid Levin next and find its type of work. Action: Search[Leonid Levin] Observation: Leonid Anatolievich Levin is a Soviet-American mathematician and computer scientist. Thought: Leonid Levin is a mathematician and computer scientist. So Pavel Urysohn and Leonid Levin have the same type of work. Action: Finish[yes] Question: What TV show inspired the saying 'Jumping the Shark'? Thought: I need to search "Jumping the Shark" and find the TV show that inspired the saying. 
Action: Search[Jumping the Shark] Observation: The idiom "jumping the shark" or "jump the shark" is a pejorative that is used to argue that a creative work or entity has reached a point in which it has exhausted its core intent and is introducing new ideas that are discordant with, or an overexaggeration of, its original purpose. The phrase was coined in 1985 by radio personality Jon Hein in response to a 1977 episode from the fifth season of the American sitcom Happy Days, in which the character of Fonzie (Henry Winkler) jumps over a live shark while on water-skis. Thought: > Received response from LLM. Chain ID: 'b4876461-98de-4a95-bf8d-3340932f421e' Parent chain ID: '900cd209-1caa-49bb-87a7-4d1a1317290e' Text received from LLM: The idiom "jumping the shark" was coined in 1985 by radio personality Jon Hein in response to a 1977 episode from the fifth season of the American sitcom Happy Days, in which the character of Fonzie (Henry Winkler) jumps over a live shark while on water-skis. So the TV show that inspired the saying is Happy Days. Action: Finish[Happy Days] > Ending chain. Chain ID: '900cd209-1caa-49bb-87a7-4d1a1317290e' Parent chain ID: 'faa0c084-047f-4b9e-8cfe-7dedff07ee6e' Output text: "['The idiom \"jumping the shark\" was coined in 1985 by radio " "personality Jon Hein in response to a 1977 episode from the fifth " "season of the American sitcom Happy Days, in which the character of " "Fonzie (Henry Winkler) jumps over a live shark while on water-skis. " "So the TV show that inspired the saying is Happy Days.', 'Action: " "Finish[Happy Days]']" > Agent has finished. Chain ID: 'faa0c084-047f-4b9e-8cfe-7dedff07ee6e' Parent chain ID: 'None' Action finish log: "['The idiom \"jumping the shark\" was coined in 1985 by radio " "personality Jon Hein in response to a 1977 episode from the fifth " "season of the American sitcom Happy Days, in which the character of " "Fonzie (Henry Winkler) jumps over a live shark while on water-skis. 
" "So the TV show that inspired the saying is Happy Days.', 'Action: " "Finish[Happy Days]']" The idiom "jumping the shark" was coined in 1985 by radio personality Jon Hein in response to a 1977 episode from the fifth season of the American sitcom Happy Days, in which the character of Fonzie (Henry Winkler) jumps over a live shark while on water-skis. So the TV show that inspired the saying is Happy Days. Action: Finish[Happy Days] > Ending chain. Chain ID: 'faa0c084-047f-4b9e-8cfe-7dedff07ee6e' Parent chain ID: 'None' Output output: "['Happy Days']" > Finished chain.
{'input': "What TV show inspired the saying 'Jumping the Shark'?", 'output': 'Happy Days'}
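Reading the trace end to end shows why the long few-shot prompt appears twice: on every agent iteration, the full prompt is re-sent to the LLM together with a growing "scratchpad" of Thought/Action/Observation lines, so the second "Text sent to LLM" block is the first block plus the new observation from the Wikipedia tool. A toy reproduction of that loop, with a hard-coded fake LLM and fake tool standing in for the real model and Wikipedia (all names here are illustrative, not LangChain internals):

```python
import re

# Hard-coded stand-ins for the two LLM replies seen in the trace above.
FAKE_LLM_REPLIES = [
    'Thought: I need to search "Jumping the Shark".\n'
    'Action: Search[Jumping the Shark]',
    'Thought: The phrase was coined in response to a Happy Days episode.\n'
    'Action: Finish[Happy Days]',
]

def fake_tool(query):
    """Stand-in for the Wikipedia tool."""
    return 'The phrase was coined in 1985 ... sitcom Happy Days ...'

def react_loop(question, replies):
    scratchpad = f"Question: {question}\n"
    for reply in replies:                  # one LLM call per iteration
        scratchpad += reply + "\n"
        match = re.search(r"Action: (\w+)\[(.*)\]", reply)
        action, arg = match.group(1), match.group(2)
        if action == "Finish":             # agent is done
            return arg
        observation = fake_tool(arg)       # run the tool...
        scratchpad += f"Observation: {observation}\n"  # ...and grow the prompt

answer = react_loop("What TV show inspired the saying 'Jumping the Shark'?",
                    FAKE_LLM_REPLIES)
print(answer)  # Happy Days
```

The real `AgentExecutor` does the same thing with an actual LLM and tools, which is why each "Sending text to the LLM" block in the trace is longer than the last.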