Imaginary Embeddings use Curved Contrastive Learning (see the paper "Imagination Is All You Need!", ACL 2023) on top of Sentence Transformers for long short-term dialogue planning and efficient abstract sequence modeling.

This model does not use speaker tokens and was evaluated in the long-term planning and sequence modeling experiments.
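Because the embeddings come from a standard Sentence Transformers model, the checkpoint can also be probed directly. The snippet below is a minimal sketch, not part of the imaginaryNLP API: it assumes the repository loads with sentence-transformers, and the utterances and the similarity interpretation are purely illustrative (the classes shown further down implement the scoring actually used in the paper).

from sentence_transformers import SentenceTransformer, util

# load the checkpoint as a plain Sentence Transformers model
model = SentenceTransformer('Justus-Jonas/Imaginary-Embeddings-Classic')

# encode a context utterance and two candidate follow-ups
embeddings = model.encode([
    "Hey, how are you?",
    "I'm fine, thanks. How are you?",
    "ACL is an interesting conference",
])

# pairwise cosine similarities between the context and the candidates
print(util.cos_sim(embeddings[:1], embeddings[1:]))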

Setup

python -m pip install imaginaryNLP

Usage: Sequence Modeling

from imaginaryNLP.ImaginaryEmbeddingsForSequenceModeling import EvalImaginaryEmbeddingsForSequenceModeling

# Load the model
seq = EvalImaginaryEmbeddingsForSequenceModeling('Justus-Jonas/Imaginary-Embeddings-Classic', speaker_token=False)

# add candidate utterances
seq.load_candidates_from_strings(["I'm fine, thanks. How are you?", "Where did you go?", "ACL is an interesting conference"])

# create the context, pre-compute, and keep 80% of the candidate utterances
seq.create_context(["Hi!",'Hey, how are you?'], precompute_top_p=0.8)

# rank the loaded candidates given the pre-computed context and this new utterance
seq.sequence_modeling_with_precompute("I am doing good. Today I went for a walk. ")
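Capture the return value of this call to inspect how the loaded candidates are ranked; the exact structure of the returned object is defined by imaginaryNLP.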

Usage: Long-Term Planning

from imaginaryNLP.ImaginaryEmbeddingsForLTP import ImaginaryEmbeddingsForLTP

ltp = ImaginaryEmbeddingsForLTP('Justus-Jonas/Imaginary-Embeddings-Classic', speaker_token=False)

# add a context
ltp.create_context([' Hello', 'Hi , great to meet you ! '])

# add goals
ltp.add_goal(" great to hear that ! ")
ltp.add_goal(" Want to go for a walk ? ")
ltp.add_goal(" Bye !")

# greedy curving
ltp.greedy_curving()

# imaginary embedding chains
ltp.imaginary_embedding_chains()

# imaginary embedding chains with curving
ltp.imaginary_embedding_chains_with_curving()
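These three calls correspond to the planning strategies explored in the paper: greedy curving, imaginary embedding chains, and the combination of both. All of them evaluate the added goals against the current context.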