LangChain: Models, Prompts and Output Parsers

Outline
Direct API calls to OpenAI
API calls through LangChain:
Prompts
Models
Output parsers

Get your OpenAI API Key


In [1]: #!pip install python-dotenv
#!pip install openai

In [2]: import os
import openai

from dotenv import load_dotenv, find_dotenv
_ = load_dotenv(find_dotenv()) # read local .env file
openai.api_key = os.environ['OPENAI_API_KEY']
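
As a quick sanity check (not part of the original notebook), you can confirm that the key was actually loaded from the .env file before making any API calls:

# Optional sanity check: fail early if the key is missing from the environment.
assert os.environ.get("OPENAI_API_KEY"), "OPENAI_API_KEY not set; check your .env file"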

Note: LLMs do not always produce the same results. When executing the code in your notebook, you may get
slightly different answers than those in the video.

In [3]: # account for deprecation of LLM model

import datetime
# Get the current date
current_date = datetime.datetime.now().date()

# Define the date after which the model should be set to "gpt-3.5-turbo"
target_date = datetime.date(2024, 6, 12)

# Set the model variable based on the current date
if current_date > target_date:
    llm_model = "gpt-3.5-turbo"
else:
    llm_model = "gpt-3.5-turbo-0301"

Chat API : OpenAI


Let's start with a direct API call to OpenAI.
In [4]: def get_completion(prompt, model=llm_model):
    messages = [{"role": "user", "content": prompt}]
    response = openai.ChatCompletion.create(
        model=model,
        messages=messages,
        temperature=0,
    )
    return response.choices[0].message["content"]

In [5]: get_completion("What is 1+1?")

'As an AI language model, I can tell you that the answer to 1+1 is 2.'

In [6]: customer_email = """
Arrr, I be fuming that me blender lid \
flew off and splattered me kitchen walls \
with smoothie! And to make matters worse,\
the warranty don't cover the cost of \
cleaning up me kitchen. I need yer help \
right now, matey!
"""

In [7]: style = """American English \
in a calm and respectful tone
"""

In [8]: prompt = f"""Translate the text \
that is delimited by triple backticks
into a style that is {style}.
text: ```{customer_email}```
"""

print(prompt)

Translate the text that is delimited by triple backticks
into a style that is American English in a calm and respectful tone
.
text: ```
Arrr, I be fuming that me blender lid flew off and splattered me kitchen walls
with smoothie! And to make matters worse,the warranty don't cover the cost of
cleaning up me kitchen. I need yer help right now, matey!
```

In [9]: response = get_completion(prompt)


In [10]: response

'I am quite upset that my blender lid came off and caused my smoothie to splatter
all over my kitchen walls. Additionally, the warranty does not cover the cost of
cleaning up the mess. Would you be able to assist me at this time, my friend?'

Chat API : LangChain


Let's see how we can do the same using LangChain.

In [11]: #!pip install --upgrade langchain

Model

In [12]: from langchain.chat_models import ChatOpenAI

In [13]: # To control the randomness and creativity of the generated
# text by an LLM, use temperature = 0.0
chat = ChatOpenAI(temperature=0.0, model=llm_model)
chat

ChatOpenAI(verbose=False, callbacks=None, callback_manager=None,
client=<class 'openai.api_resources.chat_completion.ChatCompletion'>,
model_name='gpt-3.5-turbo-0301', temperature=0.0, model_kwargs={},
openai_api_key=None, openai_api_base=None, openai_organization=None,
request_timeout=None, max_retries=6, streaming=False, n=1, max_tokens=None)
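
As a quick check (an illustrative sketch, not part of the original notebook), you can send a single hand-built message to the LangChain chat model; HumanMessage is the message type that the prompt template will produce for you in the next section:

from langchain.schema import HumanMessage

# Send a single hand-built message to the chat model (illustrative only;
# the rest of the notebook builds messages from a prompt template instead).
print(chat([HumanMessage(content="What is 1+1?")]).content)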

Prompt template

In [14]: template_string = """Translate the text \
that is delimited by triple backticks \
into a style that is {style}. \
text: ```{text}```
"""

In [15]: from langchain.prompts import ChatPromptTemplate

prompt_template = ChatPromptTemplate.from_template(template_string)

In [16]: prompt_template.messages[0].prompt

PromptTemplate(input_variables=['style', 'text'], output_parser=None,
partial_variables={}, template='Translate the text that is delimited by triple
backticks into a style that is {style}. text: ```{text}```\n',
template_format='f-string', validate_template=True)

In [17]: prompt_template.messages[0].prompt.input_variables

['style', 'text']

In [18]: customer_style = """American English \
in a calm and respectful tone
"""

In [19]: customer_email = """
Arrr, I be fuming that me blender lid \
flew off and splattered me kitchen walls \
with smoothie! And to make matters worse, \
the warranty don't cover the cost of \
cleaning up me kitchen. I need yer help \
right now, matey!
"""

In [20]: customer_messages = prompt_template.format_messages(
    style=customer_style,
    text=customer_email)

In [21]: print(type(customer_messages))
print(type(customer_messages[0]))

<class 'list'>
<class 'langchain.schema.HumanMessage'>

In [22]: print(customer_messages[0])

content="Translate the text that is delimited by triple backticks into a styl


e that is American English in a calm and respectful tone\n. text: ```\nArrr,
I be fuming that me blender lid flew off and splattered me kitchen walls with
smoothie! And to make matters worse, the warranty don't cover the cost of cle
aning up me kitchen. I need yer help right now, matey!\n```\n" additional_kwa
rgs={} example=False

In [23]: # Call the LLM to translate to the style of the customer message
customer_response = chat(customer_messages)

In [24]: print(customer_response.content)

I'm really frustrated that my blender lid flew off and made a mess of my kitchen
walls with smoothie. To add to my frustration, the warranty doesn't cover the
cost of cleaning up my kitchen. Can you please help me out, friend?

In [25]: service_reply = """Hey there customer, \
the warranty does not cover \
cleaning expenses for your kitchen \
because it's your fault that \
you misused your blender \
by forgetting to put the lid on before \
starting the blender. \
Tough luck! See ya!
"""

In [26]: service_style_pirate = """\
a polite tone \
that speaks in English Pirate\
"""

In [27]: service_messages = prompt_template.format_messages(
    style=service_style_pirate,
    text=service_reply)

print(service_messages[0].content)

Translate the text that is delimited by triple backticks into a style that is
a polite tone that speaks in English Pirate. text: ```Hey there customer, the
warranty does not cover cleaning expenses for your kitchen because it's your
fault that you misused your blender by forgetting to put the lid on before
starting the blender. Tough luck! See ya!
```

In [28]: service_response = chat(service_messages)

print(service_response.content)

Ahoy there, me hearty customer! I be sorry to inform ye that the warranty be not
coverin' the expenses o' cleaning yer galley, as 'tis yer own fault fer misusin'
yer blender by forgettin' to put the lid on afore startin' it. Aye, tough luck!
Farewell and may the winds be in yer favor!

Output Parsers
Let's start by defining how we would like the LLM output to look:

In [29]: {
    "gift": False,
    "delivery_days": 5,
    "price_value": "pretty affordable!"
}

{'gift': False, 'delivery_days': 5, 'price_value': 'pretty affordable!'}


In [30]: customer_review = """\
This leaf blower is pretty amazing. It has four settings:\
candle blower, gentle breeze, windy city, and tornado. \
It arrived in two days, just in time for my wife's \
anniversary present. \
I think my wife liked it so much she was speechless. \
So far I've been the only one using it, and I've been \
using it every other morning to clear the leaves on our lawn. \
It's slightly more expensive than the other leaf blowers \
out there, but I think it's worth it for the extra features.
"""

review_template = """\
For the following text, extract the following information:

gift: Was the item purchased as a gift for someone else? \
Answer True if yes, False if not or unknown.

delivery_days: How many days did it take for the product \
to arrive? If this information is not found, output -1.

price_value: Extract any sentences about the value or price,\
and output them as a comma separated Python list.

Format the output as JSON with the following keys:
gift
delivery_days
price_value

text: {text}
"""

In [31]: from langchain.prompts import ChatPromptTemplate

prompt_template = ChatPromptTemplate.from_template(review_template)
print(prompt_template)

input_variables=['text'] output_parser=None partial_variables={}
messages=[HumanMessagePromptTemplate(prompt=PromptTemplate(input_variables=['text'],
output_parser=None, partial_variables={}, template='For the following text,
extract the following information:\n\ngift: Was the item purchased as a gift for
someone else? Answer True if yes, False if not or unknown.\n\ndelivery_days: How
many days did it take for the product to arrive? If this information is not
found, output -1.\n\nprice_value: Extract any sentences about the value or
price,and output them as a comma separated Python list.\n\nFormat the output as
JSON with the following keys:\ngift\ndelivery_days\nprice_value\n\ntext:
{text}\n', template_format='f-string', validate_template=True),
additional_kwargs={})]

In [32]: messages = prompt_template.format_messages(text=customer_review)
chat = ChatOpenAI(temperature=0.0, model=llm_model)
response = chat(messages)
print(response.content)

{
    "gift": true,
    "delivery_days": 2,
    "price_value": ["It's slightly more expensive than the other leaf blowers
out there, but I think it's worth it for the extra features."]
}

In [33]: type(response.content)

str

In [34]: # You will get an error by running this line of code
# because response.content is not a dictionary
# response.content is a string
response.content.get('gift')

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[34], line 4
      1 # You will get an error by running this line of code
      2 # because response.content is not a dictionary
      3 # response.content is a string
----> 4 response.content.get('gift')

AttributeError: 'str' object has no attribute 'get'
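
Since response.content is just a JSON-formatted string at this point, one workaround (a minimal sketch, not part of the original lesson) is to parse it yourself with Python's json module; the LangChain output parser shown next is the more robust approach because it also generates the format instructions for you:

import json

# Parse the raw JSON string returned by the model (works only if the model
# actually returned valid JSON, which is not guaranteed).
parsed = json.loads(response.content)
parsed.get('gift')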

Parse the LLM output string into a Python dictionary

In [35]: from langchain.output_parsers import ResponseSchema
from langchain.output_parsers import StructuredOutputParser

In [36]: gift_schema = ResponseSchema(name="gift",
description="Was the item purchased\
as a gift for someone else? \
Answer True if yes,\
False if not or unknown.")
delivery_days_schema = ResponseSchema(name="delivery_days",
description="How many days\
did it take for the product\
to arrive? If this \
information is not found,\
output -1.")
price_value_schema = ResponseSchema(name="price_value",
description="Extract any\
sentences about the value or \
price, and output them as a \
comma separated Python list.")

response_schemas = [gift_schema,
delivery_days_schema,
price_value_schema]

In [37]: output_parser = StructuredOutputParser.from_response_schemas(response_schemas)

In [38]: format_instructions = output_parser.get_format_instructions()

In [39]: print(format_instructions)

The output should be a markdown code snippet formatted in the following schema,
including the leading and trailing "```json" and "```":

```json
{
    "gift": string  // Was the item purchased as a gift for someone else? Answer True if yes, False if not or unknown.
    "delivery_days": string  // How many days did it take for the product to arrive? If this information is not found, output -1.
    "price_value": string  // Extract any sentences about the value or price, and output them as a comma separated Python list.
}
```

In [40]: review_template_2 = """\
For the following text, extract the following information:

gift: Was the item purchased as a gift for someone else? \
Answer True if yes, False if not or unknown.

delivery_days: How many days did it take for the product\
to arrive? If this information is not found, output -1.

price_value: Extract any sentences about the value or price,\
and output them as a comma separated Python list.

text: {text}

{format_instructions}
"""

prompt = ChatPromptTemplate.from_template(template=review_template_2)

messages = prompt.format_messages(text=customer_review,
                                  format_instructions=format_instructions)

In [41]: print(messages[0].content)

For the following text, extract the following information:

gift: Was the item purchased as a gift for someone else? Answer True if yes,
False if not or unknown.

delivery_days: How many days did it take for the productto arrive? If this
information is not found, output -1.

price_value: Extract any sentences about the value or price,and output them as
a comma separated Python list.

text: This leaf blower is pretty amazing. It has four settings:candle blower,
gentle breeze, windy city, and tornado. It arrived in two days, just in time
for my wife's anniversary present. I think my wife liked it so much she was
speechless. So far I've been the only one using it, and I've been using it
every other morning to clear the leaves on our lawn. It's slightly more
expensive than the other leaf blowers out there, but I think it's worth it for
the extra features.

The output should be a markdown code snippet formatted in the following schema,
including the leading and trailing "```json" and "```":

```json
{
    "gift": string  // Was the item purchased as a gift for someone else? Answer True if yes, False if not or unknown.
    "delivery_days": string  // How many days did it take for the product to arrive? If this information is not found, output -1.
    "price_value": string  // Extract any sentences about the value or price, and output them as a comma separated Python list.
}
```

In [42]: response = chat(messages)

In [43]: print(response.content)

```json
{
    "gift": true,
    "delivery_days": "2",
    "price_value": ["It's slightly more expensive than the other leaf blowers
out there, but I think it's worth it for the extra features."]
}
```

In [44]: output_dict = output_parser.parse(response.content)

In [45]: output_dict

{'gift': True,
 'delivery_days': '2',
 'price_value': ["It's slightly more expensive than the other leaf blowers out
there, but I think it's worth it for the extra features."]}

In [46]: type(output_dict)

dict

In [47]: output_dict.get('delivery_days')

'2'
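
The parsed dictionary stores each field as whatever type the model emitted, so delivery_days comes back as the string '2'. A small follow-up sketch (not in the original notebook) shows how you might coerce the fields into proper Python types before using them downstream:

# Coerce the parsed fields to the types downstream code expects
# (illustrative only; field names come from the response schemas above).
delivery_days = int(output_dict.get('delivery_days', -1))
is_gift = output_dict.get('gift') is True
print(delivery_days, is_gift)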

Reminder: Download your notebook to your local computer to save your work.
