CopyPress is a Florida-based fractional content marketing agency that helps brands build authority by scaling high-impact content. With clients across industries like healthcare, tech, finance, and retail, CopyPress blends human creativity with AI-powered efficiency to drive measurable growth. At the heart of its innovation is Thematical, a proprietary content intelligence platform designed to streamline topic discovery, content planning, and production at scale. Built with advanced machine learning and LLM prompting capabilities, Thematical empowers content teams to move faster, produce smarter, and create impact.
As a content-focused company, CopyPress built Thematical to support a sophisticated, multi-stage content creation workflow. This covered everything from SEO keyword research to topic ideation, AI-assisted research, brief generation, humanization, and QA. Behind the scenes, this process relied on a network of LLM prompts, each tailored for specific stages and running across multiple models from Anthropic, OpenAI, and other LLM providers. Coordinating these prompt chains introduced significant operational complexity and risk.
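The staged, multi-model workflow described above can be sketched roughly as a pipeline that routes each stage to a designated provider. This is a hypothetical illustration only: the stage names echo the workflow in this case study, but the model choices, prompt templates, and the `call_model` stub are assumptions, not CopyPress's actual implementation.

```python
# Hypothetical sketch of a multi-stage prompt chain across providers.
# Stage names mirror the workflow above; models and prompts are illustrative.

STAGES = [
    ("keyword_research", "openai:gpt-4o", "List SEO keywords for: {topic}"),
    ("topic_ideation", "anthropic:claude-3-5-sonnet", "Propose article topics from: {input}"),
    ("brief_generation", "openai:gpt-4o", "Write a content brief based on: {input}"),
]

def call_model(model: str, prompt: str) -> str:
    """Stub for a provider call; a real pipeline would dispatch to the
    SDK named in the `model` string."""
    return f"[{model}] output for: {prompt[:40]}"

def run_pipeline(topic: str) -> dict:
    """Feed each stage's output into the next, recording every handoff."""
    results, current = {}, topic
    for stage, model, template in STAGES:
        prompt = template.format(topic=topic, input=current)
        current = call_model(model, prompt)
        results[stage] = current
    return results
```

Even in this toy form, the coordination burden is visible: every stage-to-stage handoff couples a prompt template to a specific model, which is exactly the complexity the team describes below.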
To manage their LLM workflows, the team initially stored prompts in a custom database. While this allowed some versioning and referencing, the setup became increasingly difficult to maintain as the product scaled. Once a prompt was live in production, even minor changes carried risk and demanded careful, manual review.
The team needed a reliable way to compare prompts and model behavior across tasks and providers. While they considered building internal tooling to support this, the scope and engineering effort quickly became a barrier to iteration at scale.
Thematical’s architecture relies on multiple LLMs collaborating across stages, each with a specific role in the process. Managing this orchestration manually across different models added operational overhead and increased the complexity of maintaining seamless handoffs between LLM calls.
Scaling LLMs with Orq.ai
CopyPress knew they needed better infrastructure to support the complexity of Thematical’s agentic LLM workflows. While they considered building internal tools for prompt management, evaluation, and model experimentation, the effort required to create and maintain such systems quickly became a blocker. After evaluating their options, the team adopted Orq.ai to streamline experimentation, decouple prompts from production code, and provide visibility into LLM behavior and usage. Orq.ai now plays a critical role in CopyPress’s workflow, enabling their team to iterate faster, scale more confidently, and maintain the accuracy and quality their content demands.
Josh Kunzler
Product Designer @ CopyPress
“Now, we don’t have to manage prompts or LLMs in our codebase. We can do all of that and more in Orq.ai.”
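The shift the quote describes, moving prompts out of the codebase and resolving them at runtime, can be sketched in miniature. The `PromptStore` class below is a hypothetical stand-in for a managed prompt platform, not the Orq.ai SDK; it only illustrates the pattern of publishing versioned prompts and resolving them by key.

```python
# Illustrative sketch of decoupling prompts from code: the application
# resolves a prompt by key at runtime instead of hardcoding the string.
# `PromptStore` is a hypothetical stand-in, not the Orq.ai SDK.

class PromptStore:
    def __init__(self):
        self._prompts = {}  # key -> (version, template)

    def publish(self, key: str, template: str) -> int:
        """Register a new version of a prompt; returns the version number."""
        version = self._prompts.get(key, (0, ""))[0] + 1
        self._prompts[key] = (version, template)
        return version

    def get(self, key: str) -> str:
        """Fetch the latest template for a key."""
        return self._prompts[key][1]

store = PromptStore()
store.publish("qa_check", "Review this draft for accuracy: {draft}")
store.publish("qa_check", "Review this draft for accuracy and tone: {draft}")
prompt = store.get("qa_check").format(draft="...")
```

The point of the pattern is that updating `qa_check` requires no code change or redeploy, which is what removes the "caution around minor changes" problem described earlier.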
As Thematical continues to evolve, the team at CopyPress is focused on deepening their visibility into LLM performance, especially around token usage and pricing. These insights will help them make more strategic decisions about model selection and align costs with their commercial goals.
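The kind of cost insight mentioned above reduces to simple per-token arithmetic. The sketch below shows the calculation shape only; the prices in `PRICE_PER_1K` are illustrative placeholders, not actual provider pricing.

```python
# Back-of-envelope cost tracking sketch. Per-token prices are
# hypothetical placeholders, not real provider rates.

PRICE_PER_1K = {  # USD per 1,000 tokens: (input, output)
    "gpt-4o": (0.005, 0.015),
    "claude-3-5-sonnet": (0.003, 0.015),
}

def call_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one LLM call from its token counts."""
    pin, pout = PRICE_PER_1K[model]
    return input_tokens / 1000 * pin + output_tokens / 1000 * pout

# e.g. a 2,000-token prompt with a 500-token completion
cost = call_cost("gpt-4o", 2000, 500)  # 0.0175
```

Aggregating this per call and per stage is what lets a team compare models on cost as well as quality when choosing where each stage of a pipeline should run.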
Looking ahead, CopyPress plans to keep scaling their LLM-powered workflows while maintaining the quality and reliability their customers expect. With Orq.ai as a key partner in their stack, they’re confident in their ability to innovate faster, experiment more freely, and push the boundaries of AI-assisted content creation.
Josh Kunzler
Product Designer @ CopyPress
“We’ve been using Orq.ai for over a year now and are happy with how the platform’s developed. Looking forward to all the new features the team is cooking up.”