A fine-tuned model created using ModelService.CreateTunedModel.
This message has oneof fields (mutually exclusive fields).
For each oneof, at most one member field can be set at the
same time. Setting any member of the oneof automatically
clears all other members.
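The oneof behavior described above can be illustrated with a small stand-in class. This is a minimal sketch of the semantics only, not the generated proto class: setting either member of the source_model oneof clears the other.

```python
class SourceModelOneof:
    """Illustrative stand-in for the source_model oneof (not the real type)."""

    _MEMBERS = ("tuned_model_source", "base_model")

    def __init__(self):
        self._values = {}

    def __setattr__(self, name, value):
        if name in self._MEMBERS:
            # Setting any member clears all other members of the oneof.
            self._values = {name: value}
        else:
            super().__setattr__(name, value)

    def __getattr__(self, name):
        if name in self._MEMBERS:
            return self._values.get(name)
        raise AttributeError(name)


oneof = SourceModelOneof()
oneof.base_model = "models/text-bison-001"
oneof.tuned_model_source = "tunedModels/az2mb0bpw6i"
# base_model was cleared when tuned_model_source was set:
assert oneof.base_model is None
```

The generated protobuf class enforces the same invariant: at most one of `tuned_model_source` and `base_model` carries a value at any time.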
Attributes:
    tuned_model_source (google.ai.generativelanguage.TunedModelSource):
        Optional. TunedModel to use as the starting point for
        training the new model.
        This field is a member of oneof source_model.
    base_model (str):
        Immutable. The name of the Model to tune. Example:
        models/text-bison-001.
        This field is a member of oneof source_model.
    name (str):
        Output only. The tuned model name. A unique name will be
        generated on create. Example: tunedModels/az2mb0bpw6i.
        If display_name is set on create, the id portion of the name
        will be set by concatenating the words of the display_name
        with hyphens and adding a random portion for uniqueness.
        Example: display_name = "Sentence Translator", name =
        "tunedModels/sentence-translator-u3b7m".
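The name derivation described above can be sketched in a few lines. The exact server-side algorithm is not documented; this hypothetical helper only mirrors the stated example ("Sentence Translator" becoming "tunedModels/sentence-translator-u3b7m"), and `derive_tuned_model_name` is an illustrative name, not part of the API.

```python
import random
import string


def derive_tuned_model_name(display_name: str) -> str:
    # Join the lowercased words of display_name with hyphens...
    slug = "-".join(word.lower() for word in display_name.split())
    # ...and append a random portion for uniqueness (length assumed from
    # the example suffix "u3b7m").
    suffix = "".join(random.choices(string.ascii_lowercase + string.digits, k=5))
    return f"tunedModels/{slug}-{suffix}"


name = derive_tuned_model_name("Sentence Translator")
# e.g. "tunedModels/sentence-translator-u3b7m"
```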
    display_name (str):
        Optional. The name to display for this model in user
        interfaces. The display name must be up to 40 characters,
        including spaces.
    description (str):
        Optional. A short description of this model.
    temperature (float):
        Optional. Controls the randomness of the output. Values can
        range over [0.0, 1.0], inclusive. A value closer to 1.0
        produces more varied responses, while a value closer to 0.0
        typically results in less surprising responses from the
        model.
        If unset, this value defaults to the one used by the base
        model when the model was created.
    top_p (float):
        Optional. For nucleus sampling. Nucleus sampling considers
        the smallest set of tokens whose probability sum is at least
        top_p.
        If unset, this value defaults to the one used by the base
        model when the model was created.
    top_k (int):
        Optional. For top-k sampling. Top-k sampling considers the
        set of the top_k most probable tokens.
        If unset, this value defaults to the one used by the base
        model when the model was created.
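The three sampling parameters above can be illustrated on a toy next-token distribution. This is a minimal sketch of how temperature scaling, top-k filtering, and nucleus (top-p) filtering behave in general; the function names are illustrative and not part of the google.ai.generativelanguage API.

```python
import math


def apply_temperature(logits, temperature):
    # Lower temperature sharpens the distribution; higher flattens it.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    return [e / total for e in exps]


def top_k_filter(probs, k):
    # Keep only the k most probable tokens, then renormalize.
    keep = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    total = sum(probs[i] for i in keep)
    return {i: probs[i] / total for i in keep}


def top_p_filter(probs, top_p):
    # Nucleus sampling: the smallest set of tokens whose probability
    # sum is at least top_p, then renormalize.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    keep, running = [], 0.0
    for i in order:
        keep.append(i)
        running += probs[i]
        if running >= top_p:
            break
    total = sum(probs[i] for i in keep)
    return {i: probs[i] / total for i in keep}


probs = apply_temperature([2.0, 1.0, 0.5, 0.1], temperature=1.0)
filtered_k = top_k_filter(probs, k=2)   # the two most probable tokens
filtered_p = top_p_filter(probs, top_p=0.8)
```

Raising the temperature toward 1.0 spreads probability mass more evenly across tokens before the top-k or top-p cutoff is applied, which is why higher values yield more varied output.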
    state (google.ai.generativelanguage.TunedModel.State):
        Output only. The state of the tuned model.
    create_time (google.protobuf.timestamp_pb2.Timestamp):
        Output only. The timestamp when this model was created.
    update_time (google.protobuf.timestamp_pb2.Timestamp):
        Output only. The timestamp when this model was updated.
    tuning_task (google.ai.generativelanguage.TuningTask):
        Required. The tuning task that creates the tuned model.
Child Classes
class State