What 3 Major Reports Reveal About the State of Marketing AI
Marketers who operationalize AI with intentional programs produce better results.
As we enter 2025, artificial intelligence has moved from experimental curiosity to an operational necessity for marketers. Three major reports have emerged offering critical insights into this transformation: The State of Martech 2025 by Scott Brinker and Frans Riemersma, The 2025 State of Marketing AI Report by Marketing AI Institute (MAII) and SmarterX, and The State of AI in Marketing 2025 by Jasper.
These comprehensive reports surveyed over 4,000 marketing professionals, offering the clearest picture yet of where AI adoption actually stands. Of course, all three reflect on the current state of the inevitable AI "revolution," but the data reveals a more nuanced story, one that's both more promising and more complicated than the industry narratives suggest.
Here's what emerges when you look past the surface metrics.
Widespread Adoption Meets Implementation Challenges
Source: MAII
The numbers tell a consistent story across all three studies (Martech 2025, Marketing AI Institute, and Jasper): AI adoption has genuinely crossed into mainstream territory, with 63-87% of organizations now using some form of AI.
However, "use" can be defined loosely, and the definition depends on the surveyor's point of view, or bias, if you will. Whether use means experimenting with ChatGPT, simply having meeting summaries integrated into your Zoom calls, or attempting to integrate more complex agentic AI, there's an important gap worth understanding.
While adoption is widespread, systematic implementation lags significantly behind: 68% of adopters receive no formal AI training from their companies, and only 43% have structured programs in place.
What this really means is that the competitive landscape isn't as AI-advanced as it appears. Most organizations are in a similar experimental phase, creating opportunities for those who move with strategic intent and systematic implementation. Those who claim to have deployed operational AI and to be moving into the agentic phase may be defining their efforts loosely.
What separates organizations seeing genuine results from those still experimenting?
Three Consistent Patterns Worth Understanding
1. Efficiency Gains Are Real, But Narrowly Focused
All three reports identify the same primary value driver: Operational efficiency and time savings. The applications are remarkably consistent:
82% use AI to reduce time on repetitive tasks (Brinker/Riemersma)
78% focus on content creation (Brinker/Riemersma)
37% cite efficiency as their top AI benefit (MAII)
This pattern suggests AI's current sweet spot is well-defined, tactical work rather than strategic transformation. Content creation, meeting summaries, customer service responses, and data analysis tasks deliver measurable value because they're narrow problems with clear success criteria.
Optimizing existing workflows rather than reimagining business processes provides only a short-term advantage: every business and every marketing department that adopts AI well will experience similar benefits. That's not necessarily wrong (operational efficiency has genuine value), but it's worth understanding the scope of the current impact, which is executing work faster and more efficiently rather than innovating and improving it.
There's a gap here, and Scott Brinker and Frans Riemersma's Martech report calls it out well: failing to reinvest efficiency gains and savings into new marketing programs is a missed opportunity. Some will choose to use AI to fuel growth, and those who do so creatively, innovating their marketing, will eventually emerge as winners.
2. Organizational Maturity Creates Significant Performance Gaps
One characteristic of the AI era is intent. Some companies and organizations procure AI solutions for their marketing teams to use ad hoc or simply experiment with. Others move with intent, mining their processes and operationalizing AI into their daily and departmental workflows. The most striking finding across all reports is how dramatically structured approaches outperform ad hoc adoption:
Organizations with AI roadmaps are twice as likely to have training, governance, and measurement systems (Brinker/Riemersma)
"Advanced" organizations are 4.4x more likely to measure ROI successfully (Jasper)
Companies with specialized tools show 37% better measurement capabilities than those using general-purpose solutions (Jasper)
This suggests the technology itself isn't the primary barrier—organizational readiness is. Companies and organizations seeing stronger results aren't necessarily using better AI; they're using AI better. This speaks to the larger discussion about AI success requiring a workforce transformation and capable humans who can deploy AI tools intelligently.
It also speaks to a historic pattern of shiny-object-syndrome failures. Too often, in every technology boom, we see a rush to buy technology as the panacea for business problems, and it is happening again with AI. Whether it was ChatGPT three years ago or agentic AI in 2025, too much hype and promise with scant results leads to the poorly founded assumption that AI will instantly solve business challenges.
3. The Tool Landscape Is Rapidly Evolving
Source: AI Supremacy
The data shows an interesting transition happening. While ChatGPT dominates current usage (69% adoption), other tools are starting to win out. Anthropic's Claude is becoming the preferred LLM among developers. On the front lines of marketing, specialized tools are gaining traction among more mature adopters:
71% of "very advanced" teams use domain-specific AI tools (Jasper)
92% of specialized tool users plan to expand usage vs. 74% of general-purpose users (Jasper)
The martech landscape now includes 15,384 solutions, with AI driving much of the growth (Brinker/Riemersma)
This evolution makes sense. As organizations move beyond experimentation, they need tools that integrate with existing workflows and provide marketing-specific capabilities. Professionals who perform domain-specific jobs will seek the best tools to accomplish their work.
By now, many are coming to understand that ChatGPT is the Swiss Army knife of AI. While capable in many instances, it does not surpass domain-specific tools in most individual functions. The only area where I find it superior to any other solution is its ability to create GPTs, the best low-code/no-code agentic AI option out there. In every other personal use case, I have found that another tool does the job better.
Three Areas of Genuine Disagreement
Source: Jasper
1. Employment Impact Concerns Vary Dramatically
Job displacement is dominating AI headlines this year, in large part because big tech companies and leading AI startup CEOs keep claiming that AI will take jobs, and in some cases they have implemented workforce reduction programs. However, widely aggregated stories do not necessarily translate into an industry-wide movement.
The marketing AI reports present notably different perspectives on job displacement:
Marketing AI Institute: 53% see more jobs eliminated than created (neutral to very concerned)
Martech report: Only 21% express significant concern about job security
Jasper: Frames impact as "role evolution" rather than elimination
This variance likely reflects different survey audiences, biases, and methodologies, but it highlights real uncertainty in the field. It may also reflect the Marketing AI Institute's propensity to aggregate dramatic AI headlines for its readers and social media followers. Candidly, nobody knows exactly how employment patterns will evolve, and different organizations are experiencing different impacts.
Most pragmatic interpretations suggest a shift in roles rather than widespread displacement. Marketers are most likely to experience a 20-30% transformation in their roles, with increased expectations for AI-enabled productivity. The jobs that will be eliminated are those that are extraordinarily rote and do not require significant creativity or strategic thinking. For example, I would not want to be an SDR today.
2. General vs. Specialized Tools Show Mixed Results
As noted above, the tool-preference data tells conflicting stories about AI adoption across survey cohorts. It may also reflect each survey's base: Jasper users are more likely to have operationalized AI and to prefer domain-specific AI tools, while Marketing AI Institute respondents, who may not have significant budgets and work at smaller companies, use ChatGPT for writing and a wide variety of tasks.
Some more data points:
ChatGPT shows massive adoption but declining usage at larger companies (MAII)
Specialized tools correlate with better ROI measurement (Jasper)
Custom AI solutions are proliferating rapidly (Brinker/Riemersma)
This suggests we're in a transition period where different approaches work for different organizational contexts. The "right" tool strategy depends heavily on factors such as company size, technical resources, AI maturity, budget constraints, security concerns, and specific use cases. Anecdotally speaking, some users employ Copilot because that is what they have been provisioned with.
3. Training Priorities Emphasize Different Skills
Part of operationalizing AI is training staff to use the tools, not just on a tactical basis but also within the larger context of their department's and the enterprise's workflows. While all reports identify training as crucial, they focus on different aspects:
Marketing AI Institute emphasizes prompt engineering skills
Martech highlights organizational change management
Jasper focuses on formalized programs and governance
Each perspective has merit and probably reflects the bias of its survey base. For example, MAII likely surveyed many smaller companies, while Jasper tends to serve larger marketing departments. It would make sense that different marketing populations would not share a unified view on training. This suggests successful AI adoption requires multiple types of capability building rather than a single training approach.
Practical Implications for Marketing Leaders
Source: Jasper
The Measurement Opportunity Is Significant
Marketing is ultimately in the business of generating passive sales through digital and other means, or of supporting sales organizations by building brand awareness and generating leads. Such marketing programs fuel revenue growth and are measured by ROI. Yet only 49% of organizations can effectively measure AI ROI (Jasper and MAII).
This creates a clear competitive advantage for those who develop robust measurement frameworks, especially those that can demonstrate ROI for their AI programs. Bonus miles for those who use AI to fuel revenue growth and can tie their efforts to the larger bottom line. This isn't just about proving value—it's about creating feedback loops that enable continuous improvement and investment.
So far, the AI ROI story is a tale told by those with formalized programs:
Very advanced" organizations: 96% measure AI ROI (Jasper)
"Beginning" stage organizations: Only 22% measure AI ROI (Jasper)
Organizations succeeding here focus on outcome-based metrics (revenue impact, conversion improvements) rather than just efficiency measures. They also implement baseline metrics before expanding the use of AI.
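To make the outcome-oriented approach concrete, here is a minimal, hypothetical Python sketch that contrasts an efficiency-only view (the value of hours saved) with an outcome-based ROI calculation against a pre-AI baseline. The function names and figures are illustrative assumptions, not data from any of the three reports.

```python
# Minimal sketch: outcome-based AI ROI vs. an efficiency-only view.
# All numbers below are hypothetical placeholders, not survey data.

def efficiency_value(hours_saved_per_month: float, loaded_hourly_cost: float) -> float:
    """Dollar value of time saved, the metric most teams stop at."""
    return hours_saved_per_month * loaded_hourly_cost

def outcome_based_roi(
    baseline_monthly_revenue: float,   # measured BEFORE expanding AI use
    current_monthly_revenue: float,    # measured after the AI program is running
    ai_program_monthly_cost: float,    # tooling + training + people time
) -> float:
    """ROI tied to revenue impact against the pre-AI baseline."""
    incremental_revenue = current_monthly_revenue - baseline_monthly_revenue
    return (incremental_revenue - ai_program_monthly_cost) / ai_program_monthly_cost

if __name__ == "__main__":
    # Hypothetical example: 120 hours saved at $75/hour, and revenue lift
    # from $500K to $540K per month against a $15K monthly program cost.
    print(f"Efficiency value: ${efficiency_value(120, 75):,.0f}/month")
    print(f"Outcome-based ROI: {outcome_based_roi(500_000, 540_000, 15_000):.0%}")
```

The point of the sketch is the ordering: the baseline is captured before scaling adoption, and the ROI figure is anchored to a business outcome rather than to hours saved alone.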
Skills Development Requires Strategic Focus
The training gap is real but addressable. For example, 62% of companies don't provide prompt engineering training (MAII), despite evidence that better AI literacy directly correlates with better outcomes. Beyond AI councils (which 33% of organizations have, according to Jasper, and 49% according to MAII), there are no notable statistics on more strategic training and governance programs.
However, Brinker and Riemersma note that "Martec's Law" illustrates that technology changes exponentially while organizations change logarithmically over time. This indicates that evolving culture and process to meet AI will be an evolution rather than a revolution.
The opportunity lies in developing comprehensive AI education programs that combine technical skills (prompting, tool selection) with strategic thinking (use case identification, measurement design, process mapping). Companies that lean too heavily towards tactics or strategy and fail to strike a balance will experience mediocre training.
Tool Evolution Favors Informed Decision-Making
The trend toward specialized AI tools creates both opportunities and complexities. Organizations need clear criteria for tool selection based on business requirements rather than following competitor announcements or consultant recommendations.
Successful approaches involve evaluating when to use general-purpose tools, when to adopt specialized solutions, and when to build custom implementations. Further, there is likely an advantage in using monthly or short-term contracts to procure SaaS AI platforms: AI is advancing so fast that, in many cases, tools have become disposable.
Formal Implementation Programs Matter
Source: Jasper.
The three reports indicate that the marketing AI landscape is maturing, albeit unevenly. One key insight from all three is that success correlates more strongly with implementation quality than with technology sophistication. The organizations achieving transformational results aren't necessarily using the most advanced AI; they're using AI most effectively and systematically.
This suggests a practical path forward focused on:
Starting with specific business problems rather than AI capabilities
Building measurement frameworks before scaling adoption
Investing in organizational capabilities alongside technology
Choosing tools based on integration requirements rather than feature lists
The marketing AI revolution is real, but it's happening more gradually and practically than current headlines suggest. For organizations taking thoughtful, systematic approaches, that's actually good news.
Still, the gap between ad hoc experimenters and systematic adopters is widening, creating opportunities for organizations willing to invest in structured approaches. It’s also illustrating a potential AI divide between those who have adapted and those who are lagging behind.
What patterns are you seeing in your own AI initiatives? Are the efficiency gains translating to broader business impact, or are you still in the optimization phase?
This analysis is based on data from The State of Martech 2025 (1,882 respondents), The 2025 State of Marketing AI Report (1,882 respondents), and The State of AI in Marketing 2025 by Jasper (503 respondents).