To generate creative and coherent poetry with pretrained transformers, fine-tune a transformer language model (e.g., GPT-3.5/GPT-4, T5, or LLaMA) on poetic datasets, apply temperature and top-k/top-p sampling at decode time, and use prompt engineering with stylistic constraints to keep the output consistent.
Here is a code snippet you can refer to:

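A minimal sketch of such a snippet, assuming the OpenAI Python client (v1.x) and an OPENAI_API_KEY environment variable; the model name, helper names, and prompt wording are illustrative, not fixed:

```python
# Hypothetical sketch: poetry generation via a GPT-3.5/GPT-4 chat model,
# with temperature and top-p sampling for creative but coherent output.

def build_poetry_messages(theme: str, style: str) -> list:
    """Build a chat prompt: a system message pins the model's role as a
    poet, and a user message supplies the theme and style constraints."""
    return [
        {"role": "system",
         "content": "You are a skilled poet. Write vivid, coherent verse."},
        {"role": "user",
         "content": f"Write a {style} poem about {theme}."},
    ]

def generate_poem(theme: str, style: str = "free verse") -> str:
    # Imported here so the prompt helper is usable without the package.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4",            # or "gpt-3.5-turbo"
        messages=build_poetry_messages(theme, style),
        temperature=0.8,          # more randomness -> more creative phrasing
        top_p=0.9,                # nucleus sampling keeps output coherent
        max_tokens=200,
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(generate_poem("the sea at dusk", style="surrealist"))
```

The system message and the thematic user prompt implement the prompt-engineering step described below, while `temperature` and `top_p` implement the sampling controls.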
The code above relies on the following key approaches:
- Uses a pretrained transformer (GPT-4/GPT-3.5):
  - Generates poetry by leveraging pretrained knowledge of poetic structures, metaphors, and literary styles.
- Creative control with sampling techniques:
  - temperature=0.8 encourages creativity by increasing randomness.
  - top_p=0.9 enables controlled diversity while maintaining coherence.
- Prompt engineering for stylistic coherence:
  - A thematic prompt (e.g., surrealism, classical poetry, free verse) guides the model.
  - A system message defines the AI's role as a poet.
- Scalability and customization:
  - The model can be fine-tuned further on poetry datasets using Hugging Face's Transformers library with GPT-2/T5 models for offline generation.
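The offline route mentioned above can be sketched with Hugging Face Transformers and a GPT-2 checkpoint. This is an assumed setup, not code from the answer: the checkpoint name "gpt2" is the small base model, and top_k=50 is a typical default rather than a value given in the text:

```python
# Hypothetical sketch: offline poetry generation with Hugging Face
# Transformers and GPT-2, using the same sampling controls as above.

def sampling_config(temperature: float = 0.8, top_k: int = 50,
                    top_p: float = 0.9) -> dict:
    """Decoding settings: sampling with temperature, top-k, and top-p."""
    return {
        "do_sample": True,        # sample instead of greedy decoding
        "temperature": temperature,
        "top_k": top_k,           # keep only the k most likely tokens
        "top_p": top_p,           # nucleus sampling for controlled diversity
    }

def generate_poem(prompt: str, max_new_tokens: int = 80) -> str:
    # Imported here so the config helper works without the package installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        pad_token_id=tokenizer.eos_token_id,
        **sampling_config(),
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate_poem("Ode to the autumn moon:\n"))
```

The same `sampling_config` values would also apply after fine-tuning the model on a poetry dataset, since fine-tuning changes the weights but not the decoding strategy.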
Hence, combining pretrained transformers with well-crafted prompts, temperature tuning, and top-p sampling yields poetry generation that is both highly creative and coherent.