Migrating from the PaLM API to the Gemini API

This guide shows you how to migrate your Python code from the PaLM API to the Gemini API. You can generate both text and multi-turn conversations (chat) with Gemini, but check your responses, as they may differ from PaLM outputs.

Summary of API differences

  1. The method names have changed. Instead of separate methods for generating text and sending chat messages, there is a single generate_content method that can do both.
  2. For chat, the start_chat helper method makes chatting even easier.
  3. Instead of standalone functions, the new APIs are methods of the GenerativeModel class.
  4. The structure of the output response has changed.
  5. The safety setting categories have changed. See the safety settings guide for more information; a short sketch of the new categories follows below.
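
The last two points change call sites the most. As a hedged illustration of the new safety categories (the HarmCategory and HarmBlockThreshold enums and the category names below come from the google-generativeai SDK and are not shown elsewhere in this guide), a call might look like this:

import google.generativeai as genai
import os

from google.generativeai.types import HarmCategory, HarmBlockThreshold

genai.configure(api_key=os.environ['API_KEY'])
model = genai.GenerativeModel(model_name='gemini-pro')

# Gemini groups safety settings by harm category and block threshold;
# the category names differ from the PaLM API.
response = model.generate_content(
    'The opposite of hot is',
    safety_settings={
        HarmCategory.HARM_CATEGORY_HARASSMENT: HarmBlockThreshold.BLOCK_ONLY_HIGH,
        HarmCategory.HARM_CATEGORY_HATE_SPEECH: HarmBlockThreshold.BLOCK_ONLY_HIGH,
    },
)
print(response.text)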

Text generation: Basic

PaLM
pip install google-generativeai

import google.generativeai as palm
import os

palm.configure(
    api_key=os.environ['API_KEY'])

response = palm.generate_text(
    prompt="The opposite of hot is")
print(response.result) #  'cold.'

Gemini
pip install google-generativeai

import google.generativeai as genai
import os

genai.configure(
    api_key=os.environ['API_KEY'])
model = genai.GenerativeModel(
    model_name='gemini-pro')

response = model.generate_content(
          'The opposite of hot is')
print(response.text)
    #  'The opposite of hot is cold.'
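
The shape of the response object is also different from PaLM's (difference 4 above). A small sketch, assuming the candidates, parts, and prompt_feedback attributes exposed by the SDK's response type and reusing the model created above:

# Reuses `model` from the Gemini example above.
response = model.generate_content('The opposite of hot is')

print(response.text)                                 # convenience accessor
print(response.candidates[0].content.parts[0].text)  # full candidate structure
print(response.prompt_feedback)                      # safety feedback for the prompt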
        

Text generation: Optional parameters

PaLM
pip install google-generativeai

import google.generativeai as palm
import os

palm.configure(
    api_key=os.environ['API_KEY'])

prompt = """
You are an expert at solving word
problems.

Solve the following problem:

I have three houses, each with three
cats. Each cat owns 4 mittens, and a hat.
Each mitten was knit from 7m of yarn,
each hat from 4m. How much yarn was
needed to make all the items?

Think about it step by step, and show
your work.
"""

model = 'models/text-bison-001'  # PaLM text generation model

completion = palm.generate_text(
    model=model,
    prompt=prompt,
    temperature=0,
    # The maximum length of the response
    max_output_tokens=800,
)

print(completion.result)

Gemini
pip install google-generativeai

import google.generativeai as genai
import os

genai.configure(
    api_key=os.environ['API_KEY'])
model = genai.GenerativeModel(
    model_name='gemini-pro')

prompt = """
You are an expert at solving word
problems.

Solve the following problem:

I have three houses, each with three
cats. Each cat owns 4 mittens, and a hat.
Each mitten was knit from 7m of yarn,
each hat from 4m. How much yarn was
needed to make all the items?

Think about it step by step, and show
your work.
"""

completion = model.generate_content(
    prompt,
    generation_config={
        'temperature': 0,
        'max_output_tokens': 800
    }
)

print(completion.text)
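
If you prefer typed configuration, or want the same settings applied to every request, the SDK also offers a GenerationConfig class, and GenerativeModel accepts a generation_config argument in its constructor. A sketch under those assumptions, reusing the prompt defined above:

# Equivalent to the dict above, but typed and attached to the model,
# so every generate_content call inherits these defaults.
config = genai.types.GenerationConfig(
    temperature=0,
    max_output_tokens=800,
)

model = genai.GenerativeModel(
    model_name='gemini-pro',
    generation_config=config,
)

completion = model.generate_content(prompt)  # `prompt` as defined above
print(completion.text)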
        

Chat: Basic

PaLM
pip install google-generativeai

import google.generativeai as palm
import os

palm.configure(
    api_key=os.environ['API_KEY'])

chat = palm.chat(
    messages=["Hello."])
print(chat.last)
    #  'Hello! What can I help you with?'
chat = chat.reply(
    "Just chillin'")
print(chat.last)
    #  'That's great! ...'

Gemini
pip install google-generativeai

import google.generativeai as genai
import os

genai.configure(
    api_key=os.environ['API_KEY'])
model = genai.GenerativeModel(
    model_name='gemini-pro')
chat = model.start_chat()

response = chat.send_message(
          "Hello.")
print(response.text)
response = chat.send_message(
          "Just chillin'")
print(response.text)
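
Unlike palm.chat, a Gemini chat can also stream partial replies. A sketch, assuming send_message supports a stream=True flag (as in current SDK versions) and reusing the chat object from above:

# Reuses `chat` from the Gemini example above.
response = chat.send_message(
    "Tell me a short story.", stream=True)

# Chunks arrive while the model is still generating.
for chunk in response:
    print(chunk.text, end='')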
        

Chat: Conversation history

PaLM
chat.messages

[{'author': '0', 'content': 'Hello'},
 {'author': '1', 'content': 'Hello! How can I help you today?'},
 {'author': '0', 'content': "Just chillin'"},
 {'author': '1',
  'content': "That's great! I'm glad you're able to relax and
      take some time for yourself. What are you up to today?"}]

Gemini
chat.history

[parts {
   text: "Hello."
 }
 role: "user",
 parts {
   text: "Greetings! How may I assist you today?"
 }
 role: "assistant",
 parts {
   text: "Just chillin\'"
 }
 role: "user",
 parts {
   text: "That\'s great! I\'m glad to hear
   you\'re having a relaxing time.
   May I offer you any virtual entertainment
   or assistance? I can provide
   you with music recommendations, play
   games with you, or engage in a
   friendly conversation.\n\nAdditionally,
   I\'m capable of generating
   creative content, such as poems, stories,
   or even song lyrics.
   If you\'d like, I can surprise you with
   something unique.\n\nJust
   let me know what you\'re in the mood for,
   and I\'ll be happy to oblige."
 }
 role: "assistant"]
        

Chat: Temperature

PaLM
# Setting temperature=1 usually produces more zany responses!
chat = palm.chat(messages="What should I eat for dinner tonight? List a few options", temperature=1)
chat.last

'Here are a few ideas ...

Gemini
model = genai.GenerativeModel(model_name='gemini-pro')
chat = model.start_chat()

# Setting temperature=1 usually produces more zany responses!
response = chat.send_message(
        "What should I eat for dinner tonight? List a few options",
          generation_config={
          'temperature': 1.0
        })

print(response.text)

'1. Grilled Salmon with Roasted Vegetables: ...'
        

Next steps