Experiments with foundational models
We continue to work at the forefront of applying AI to drive seamless integrations across complex systems. Research is a cornerstone of our mission, guiding us as we search for new ways to apply AI to real-world challenges and deliver transformative solutions.
At iTheta, we’ve also been deeply engaged in pushing the boundaries of foundational models, combining rigorous research with practical applications. Our latest paper, now published, examines how different embedding strategies affect the generation of Shakespearean-style text. By experimenting with real-valued, complex-valued, and hybrid embeddings, we sought to understand not just what these representations encode, but how they shape the creative expressiveness of a transformer-based architecture enhanced with multi-scale retention mechanisms.
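To give a feel for the multi-scale retention component, here is a minimal, hypothetical sketch of the idea: each head maintains a recurrent state that decays by its own per-head rate, so different heads integrate context over different time scales. The function name, shapes, and decay values are our own illustrative choices, not the exact formulation from the paper.

```python
import numpy as np

def multiscale_retention(q, k, v, gammas):
    """Recurrent sketch of multi-scale retention: each head keeps a
    state S that decays by its own gamma per time step, so heads
    remember over different horizons.
    Shapes: q, k -> (heads, seq, d); v -> (heads, seq, d_v)."""
    heads, seq, d = q.shape
    d_v = v.shape[-1]
    out = np.zeros((heads, seq, d_v))
    for h in range(heads):
        S = np.zeros((d, d_v))  # per-head recurrent state
        for t in range(seq):
            # Decay the old state, then add the current key-value pair.
            S = gammas[h] * S + np.outer(k[h, t], v[h, t])
            out[h, t] = q[h, t] @ S
    return out

rng = np.random.default_rng(1)
q, k = rng.normal(size=(2, 5, 3)), rng.normal(size=(2, 5, 3))
v = rng.normal(size=(2, 5, 4))
out = multiscale_retention(q, k, v, gammas=[0.9, 0.5])
print(out.shape)  # (2, 5, 4)
```

A head with gamma near 1 retains information over long spans, while a head with a small gamma focuses on recent tokens, which is what "multi-scale" refers to here.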
In our study, we trained models under each embedding regime and evaluated the outputs across three key dimensions: expressiveness, continuity, and clarity. We discovered that complex-valued embeddings notably boost text coherence, yielding passages that flow more naturally and maintain thematic consistency. Meanwhile, the combination of real and complex embeddings produced the richest narratives—capturing emotional nuance and stylistic flourishes that mirror Shakespeare’s own dramatic flair. Real-valued embeddings provided a strong baseline, but it was the hybrid approach that truly unlocked the most expressive storytelling.
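The hybrid regime described above can be sketched in a few lines: each token receives a real-valued vector plus a complex-valued vector (parameterised here as magnitude and phase), and the two are concatenated into a single real representation for the downstream model. All names and dimensions below are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, real_dim, complex_dim = 100, 8, 4

# Hypothetical parameter tables (names are ours, not from the paper):
real_table = rng.normal(size=(vocab_size, real_dim))
# The complex embedding is stored as per-token magnitude and phase.
magnitude = rng.normal(size=(vocab_size, complex_dim))
phase = rng.uniform(0, 2 * np.pi, size=(vocab_size, complex_dim))

def hybrid_embed(token_ids):
    """Concatenate a real embedding with the real and imaginary parts
    of a complex embedding, yielding one real vector per token."""
    real_part = real_table[token_ids]
    z = magnitude[token_ids] * np.exp(1j * phase[token_ids])
    complex_part = np.concatenate([z.real, z.imag], axis=-1)
    return np.concatenate([real_part, complex_part], axis=-1)

vecs = hybrid_embed(np.array([1, 2, 3]))
print(vecs.shape)  # (3, 16): 8 real dims + 2 * 4 complex components
```

The phase term is what a purely real-valued embedding lacks: it gives each dimension a rotational degree of freedom, which is one intuition for why the complex and hybrid variants behaved differently in our experiments.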
These findings not only highlight the potential of complex and combined embeddings for creative language generation, but also open new avenues for tailoring model architectures to specific stylistic goals. As we look ahead, we’re planning to scale these experiments across other literary genres, explore multilingual settings, and release our code and datasets for the community. If you’re passionate about generative AI, we invite you to read the full paper and join us as we continue to explore how advanced embedding techniques can elevate the art of machine-generated language.
Read the full paper here: Complex-Valued Embeddings for Transformer-Based Language Models with Multi-Scale Retention