Migrating from the PaLM API to the Gemini API

This guide shows how to migrate your Python code from the PaLM API to the Gemini API. You can generate both text and multi-turn conversations (chat) with Gemini, but be sure to check your responses, since they may differ from PaLM outputs.

Summary of API differences

  1. Method names have changed. Instead of separate methods for generating text and chat, there is a single method, generate_content, that does both.
  2. Chat has a helper method, start_chat, that makes chatting simpler.
  3. Instead of standalone functions, the new APIs are methods of the GenerativeModel class.
  4. The structure of the output response has changed (see the sketch after this list).
  5. The safety settings categories have changed. Refer to the safety settings guide for details.
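
The following is a minimal sketch of difference 4 only; it is not part of the original comparison tables, and the prompt is just a placeholder. PaLM's generate_text puts the generated text on the completion's result field, while Gemini's generate_content exposes it through the text property and keeps the structured candidates under candidates[...].content.parts.

import os
import google.generativeai as palm   # the PaLM and Gemini clients ship in the same package
import google.generativeai as genai

palm.configure(api_key=os.environ['API_KEY'])
genai.configure(api_key=os.environ['API_KEY'])
model = genai.GenerativeModel(model_name='gemini-pro')

# PaLM: the generated text is on the `result` field.
palm_response = palm.generate_text(prompt='The opposite of hot is')
print(palm_response.result)

# Gemini: the generated text is on the `text` property; the structured
# candidates are available under `candidates[...].content.parts`.
gemini_response = model.generate_content('The opposite of hot is')
print(gemini_response.text)
print(gemini_response.candidates[0].content.parts[0].text)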

Text generation: Basic

PaLM
pip install google-generativeai

import google.generativeai as palm
import os

palm.configure(
    api_key=os.environ['API_KEY'])

response = palm.generate_text(
    prompt="The opposite of hot is")
print(response.result) #  'cold.'

Gemini

pip install google-generativeai

import google.generativeai as genai
import os

genai.configure(
    api_key=os.environ['API_KEY'])
model = genai.GenerativeModel(
    model_name='gemini-pro')

response = model.generate_content(
          'The opposite of hot is')
print(response.text)
    # 'The opposite of hot is cold.'
        
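One behavioral difference worth planning for when migrating (a hedged sketch, not part of the original tables): if Gemini blocks a prompt or returns no text, reading response.text raises an exception, so it can help to check the candidates and prompt_feedback first.

import os
import google.generativeai as genai

genai.configure(api_key=os.environ['API_KEY'])
model = genai.GenerativeModel(model_name='gemini-pro')

response = model.generate_content('The opposite of hot is')

# `response.text` raises if no text candidate came back, for example when
# the prompt or the response was blocked by safety filters.
if response.candidates and response.candidates[0].content.parts:
    print(response.text)
else:
    # `prompt_feedback` describes why the prompt itself was blocked.
    print(response.prompt_feedback)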

Text generation: Optional parameters

PaLM
pip install google-generativeai

import google.generativeai as palm
import os

palm.configure(
    api_key=os.environ['API_KEY'])

prompt = """
You are an expert at solving word
problems.

Solve the following problem:

I have three houses, each with three
cats. Each cat owns 4 mittens, and a hat.
Each mitten was knit from 7m of yarn,
each hat from 4m. How much yarn was
needed to make all the items?

Think about it step by step, and show
your work.
"""

completion = palm.generate_text(
    model='models/text-bison-001',  # the default PaLM text model
    prompt=prompt,
    temperature=0,
    # The maximum length of response
    max_output_tokens=800,
)

print(completion.result)

Gemini

pip install google-generativeai

import google.generativeai as genai
import os

genai.configure(
    api_key=os.environ['API_KEY'])
model = genai.GenerativeModel(
    model_name='gemini-pro')

prompt = """
You are an expert at solving word
problems.

Solve the following problem:

I have three houses, each with three
cats. Each cat owns 4 mittens, and a hat.
Each mitten was knit from 7m of yarn,
each hat from 4m. How much yarn was
needed to make all the items?

Think about it step by step, and show
your work.
"""

completion = model.generate_content(
    prompt,
    generation_config={
        'temperature': 0,
        'max_output_tokens': 800
    }
)

print(completion.text)
        
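The dictionary above is one way to pass generation parameters; the package also provides a typed genai.types.GenerationConfig helper. A sketch with a shortened placeholder prompt (the parameter values mirror the table above):

import os
import google.generativeai as genai

genai.configure(api_key=os.environ['API_KEY'])
model = genai.GenerativeModel(model_name='gemini-pro')

# Equivalent call using the typed GenerationConfig helper instead of a dict.
completion = model.generate_content(
    'Solve the word problem step by step, and show your work.',
    generation_config=genai.types.GenerationConfig(
        temperature=0,
        max_output_tokens=800,
    ),
)
print(completion.text)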

Chat: Basic

PaLM
pip install google-generativeai

import google.generativeai as palm
import os

palm.configure(
    api_key=os.environ['API_KEY'])

chat = palm.chat(
    messages=["Hello."])
print(chat.last)
    #  'Hello! What can I help you with?'
chat = chat.reply(
    "Just chillin'")
print(chat.last)
    #  'That's great! ...'

Gemini

pip install google-generativeai

import google.generativeai as genai
import os

genai.configure(
    api_key=os.environ['API_KEY'])
model = genai.GenerativeModel(
    model_name='gemini-pro')
chat = model.start_chat()

response = chat.send_message(
          "Hello.")
print(response.text)
response = chat.send_message(
          "Just chillin'")
print(response.text)
        
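Unlike palm.chat, which takes the whole messages list up front, start_chat opens an empty session by default; earlier turns can be supplied through its history argument. A minimal sketch (the example turns are invented):

import os
import google.generativeai as genai

genai.configure(api_key=os.environ['API_KEY'])
model = genai.GenerativeModel(model_name='gemini-pro')

# Pre-seed the chat session with earlier turns; Gemini uses the
# roles "user" and "model".
chat = model.start_chat(history=[
    {'role': 'user', 'parts': ["Hello."]},
    {'role': 'model', 'parts': ["Hello! What can I help you with?"]},
])
response = chat.send_message("Just chillin'")
print(response.text)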

Chat: Conversation history

PaLM
chat.messages

[{'author': '0', 'content': 'Hello'},
 {'author': '1', 'content': 'Hello! How can I help you today?'},
 {'author': '0', 'content': "Just chillin'"},
 {'author': '1',
  'content': "That's great! I'm glad you're able to relax and
      take some time for yourself. What are you up to today?"}]

Gemini

chat.history

[parts {
   text: "Hello."
 }
 role: "user",
 parts {
   text: "Greetings! How may I assist you today?"
 }
 role: "model",
 parts {
   text: "Just chillin\'"
 }
 role: "user",
 parts {
   text: "That\'s great! I\'m glad to hear
   you\'re having a relaxing time.
   May I offer you any virtual entertainment
   or assistance? I can provide
   you with music recommendations, play
   games with you, or engage in a
   friendly conversation.\n\nAdditionally,
   I\'m capable of generating
   creative content, such as poems, stories,
   or even song lyrics.
   If you\'d like, I can surprise you with
   something unique.\n\nJust
   let me know what you\'re in the mood for,
   and I\'ll be happy to oblige."
 }
 role: "model"]
        
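Because each entry in chat.history is a structured Content object rather than a plain dict, a loop like the following can rebuild a PaLM-style transcript. A sketch (the single "Hello." turn is just an example):

import os
import google.generativeai as genai

genai.configure(api_key=os.environ['API_KEY'])
model = genai.GenerativeModel(model_name='gemini-pro')
chat = model.start_chat()
chat.send_message("Hello.")

# Each history entry has a `role` ("user" or "model") and a list of parts.
for message in chat.history:
    print(f"{message.role}: {message.parts[0].text}")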

Chat: Temperature

PaLM
# Setting temperature=1 usually produces more zany responses!
chat = palm.chat(messages="What should I eat for dinner tonight? List a few options", temperature=1)
chat.last

'Here are a few ideas ...

Gemini

model = genai.GenerativeModel(model_name='gemini-pro')
chat = model.start_chat()

# Setting temperature=1 usually produces more zany responses!
response = chat.send_message(
        "What should I eat for dinner tonight? List a few options",
          generation_config={
          'temperature': 1.0
        })

print(response.text)

'1. Grilled Salmon with Roasted Vegetables: ...'
        
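If every message in a chat should use the same settings, the configuration can also be attached to the model itself rather than to each send_message call; individual calls can still override it. A sketch (the temperature value simply mirrors the example above):

import os
import google.generativeai as genai

genai.configure(api_key=os.environ['API_KEY'])

# A default generation config set on the model applies to every call.
model = genai.GenerativeModel(
    model_name='gemini-pro',
    generation_config={'temperature': 1.0},
)
chat = model.start_chat()
response = chat.send_message(
    "What should I eat for dinner tonight? List a few options")
print(response.text)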

Next steps