Blog

  • People Also Ask SEO: Why Human Insight is Your Key to Dominating SERPs

    Summary: This article moves beyond basic data scraping for ‘People Also Ask’ (PAA). It provides a strategic framework for experienced SEO professionals to leverage PAA by understanding user intent, building topical authority, and measuring true business impact. The core argument is that human expertise, not AI alone, is the critical element for turning PAA boxes into a sustainable source of organic growth.

    Beyond the Scraper: A Strategic Guide to People Also Ask SEO

    Are you treating Google’s ‘People Also Ask’ section as a content checklist? If so, you’re leaving traffic, authority, and conversions on the table. Many SEOs see PAA as a simple question-and-answer game, but this tactical view misses the strategic goldmine. True mastery of People Also Ask SEO comes not from automating data collection, but from the uniquely human ability to decode intent, build narrative connections, and establish genuine authority.

    While AI tools can generate endless lists of questions, they can’t replicate the strategic oversight needed to win. True success in dominating these SERP features comes from understanding user intent, crafting nuanced answers, and integrating PAA content into a broader, more ambitious SEO strategy. Let’s move past the basics and explore how to make PAA a cornerstone of your organic growth.

    From Q&A to Authority: Building Content Hubs with PAA

    The most common mistake in PAA optimization is creating isolated pages for every question. This approach is inefficient and fails to signal deep expertise to search engines. Instead, view PAA questions as the building blocks for comprehensive content hubs that establish your topical authority.

    Think of a core topic, like “technical SEO audit.” PAA will reveal dozens of related queries: “what is included in a technical SEO audit?”, “how long does a technical SEO audit take?”, “is a technical SEO audit necessary?”. Instead of separate blog posts, these questions should become H2s or H3s within a single, definitive guide. This structure shows Google that you cover the topic from every angle, making your page the most satisfying result for a spectrum of related searches.

    • Group related questions: Identify thematic clusters within your PAA research.
    • Structure content logically: Use a primary PAA question for your H1 and related questions for subheadings.
    • Internally link: Connect your hub page to other relevant content on your site to reinforce the topical cluster.

    This method transforms your content from a scattered collection of answers into a cohesive library of knowledge, directly improving your chances of owning multiple SERP features for your main keywords.
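    If you want to see what this grouping looks like in practice, here is a minimal Python sketch that sorts a handful of scraped PAA questions into hub sub-headings using simple trigger words. The questions, sub-topic labels, and trigger words are placeholders for illustration, not output from any specific tool, and real clustering still needs a human editor's judgment.

    ```python
    # Hypothetical sketch: group scraped PAA questions into hub themes by
    # matching each question against a set of seed sub-topics. The question
    # list, sub-topic labels, and trigger words are illustrative placeholders.
    from collections import defaultdict

    paa_questions = [
        "what is included in a technical seo audit?",
        "how long does a technical seo audit take?",
        "is a technical seo audit necessary?",
        "how much does a technical seo audit cost?",
        "what tools are used for a technical seo audit?",
    ]

    sub_topics = {
        "scope": ["included", "checklist", "cover"],
        "effort": ["how long", "take", "cost"],
        "value": ["necessary", "worth", "why"],
        "tooling": ["tools", "software"],
    }

    hub_outline = defaultdict(list)
    for question in paa_questions:
        # Assign the question to the first sub-topic whose trigger words appear in it.
        for heading, triggers in sub_topics.items():
            if any(trigger in question for trigger in triggers):
                hub_outline[heading].append(question)
                break

    for heading, questions in hub_outline.items():
        print(f"H2: {heading}")
        for q in questions:
            print(f"  H3: {q}")
    ```

    Each sub-topic then becomes an H2 in your definitive guide, with the individual questions as H3s underneath it.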

    Decoding Intent: Guiding Users Through the Funnel

    Every question in a PAA box has an underlying intent. A user asking “what is PAA optimization?” is in a different stage of their journey than one asking “best tools for PAA research.” The first is informational; the second is investigational, bordering on transactional. Your answers must reflect this.

    Analyzing the intent behind the query allows you to craft answers that not only satisfy the user’s immediate need but also guide them to the next logical step. For an informational query, your answer should be clear and direct, perhaps linking to a glossary definition. For a query with commercial intent, the answer can be more detailed, subtly introducing your service or product as a solution.

    This requires a human understanding of the customer journey. AI can scrape the question, but it can’t grasp the subtle context of why the user is asking. By mapping PAA questions to funnel stages, you can create a content experience that nurtures users from awareness to conversion.
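    As a rough illustration of that mapping, the sketch below tags questions with a funnel stage using crude keyword rules. The rules and example questions are assumptions for demonstration only; they are a starting point for human review, not a replacement for it.

    ```python
    # Illustrative sketch: tag PAA questions with a funnel stage using crude
    # keyword heuristics. Real intent analysis needs human review; these
    # rules and example questions are assumptions for demonstration only.

    FUNNEL_RULES = [
        ("transactional", ("price", "cost", "buy", "hire", "agency")),
        ("investigational", ("best", "tools", "vs", "compare", "alternatives")),
        ("informational", ("what is", "how does", "why", "definition")),
    ]

    def funnel_stage(question: str) -> str:
        q = question.lower()
        for stage, keywords in FUNNEL_RULES:
            if any(keyword in q for keyword in keywords):
                return stage
        return "informational"  # default to top-of-funnel when nothing matches

    questions = [
        "What is PAA optimization?",
        "Best tools for PAA research",
        "How much does a PAA audit cost?",
    ]

    for q in questions:
        print(f"{q!r} -> {funnel_stage(q)}")
    ```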

    How Your People Also Ask SEO Strategy Signals Expertise to Google

    Google aims to reward E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness). Systematically answering a wide range of questions related to your core topics is a powerful way to demonstrate your expertise. When your domain consistently provides clear, concise, and helpful answers that appear in PAA boxes, it sends strong signals to Google.

    You are effectively telling the search engine that you are a reliable source of information for an entire topic. This goes beyond a single keyword. By comprehensively addressing the universe of questions around your niche, you build a reputation with both users and algorithms.

    This is where consistency is key. Don’t just target a few high-volume questions. Develop a process to continually find and answer new and related queries. This ongoing effort solidifies your position as the go-to expert, making it more likely for Google to feature your content not just in PAA, but in other prominent SERP features as well.

    The Human Edge: Why AI Scrapers Are Only the Starting Point

    Let’s address the role of automation. AI-powered tools are incredibly efficient at scraping thousands of PAA questions and identifying potential opportunities. They save countless hours of manual research and are an essential part of the modern SEO toolkit. To ignore them would be foolish.

    These tools, however, lack the human understanding of context, nuance, and brand voice necessary to create answers that truly connect with users. An AI can tell you what people are asking; it cannot tell you how your brand should answer. It cannot interpret the emotional driver behind a question or craft a response that builds a lasting connection.

    Your competitive advantage lies in the layer of strategy you apply on top of the data. Use AI for discovery, but rely on human expertise for creation and implementation. This combination of machine efficiency and human intellect is what separates a basic PAA strategy from a dominant one.

    Is your website’s technical foundation ready for this level of content strategy?

    Before you build out extensive content hubs, you need to be certain your site is technically sound. A slow site, poor internal linking, or indexing issues can undermine even the best content. OnDigital’s comprehensive Technical SEO Audits provide the clarity you need to build your authority on a rock-solid foundation.

    Measuring What Matters: Beyond Rankings and Snippets

    Tracking your appearance in PAA boxes is a start, but it’s a vanity metric if it doesn’t translate to business results. A mature People Also Ask SEO strategy focuses on measuring the true impact on organic traffic, conversions, and overall brand visibility.

    You need to move beyond simple rank tracking. Use your analytics to answer more important questions:

    • What is the click-through rate? Are users who see your PAA answer actually clicking through to your site?
    • What is the post-click engagement? Once they land on your page, do they stay? Do they consume more content?
    • Does this traffic convert? Are users who arrive via a PAA snippet taking a desired action, like signing up for a newsletter or filling out a contact form?

    By connecting your PAA efforts to these core business metrics, you can demonstrate the real value of your work. This level of analysis allows you to refine your approach, focusing on the questions and topics that drive not just visibility, but tangible growth for your London-based business or your clients.
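    If your property is verified in Search Console, much of this data can be pulled programmatically. The sketch below is a hedged example using the Search Console API via google-api-python-client; the property URL, date range, and question-matching regex are placeholders, and you will need valid OAuth credentials in `creds`.

    ```python
    # Hedged sketch (not a drop-in script): pull click-through data for
    # question-style queries from the Search Console API so PAA-driven pages
    # can be judged on clicks, not just appearances. Assumes
    # google-api-python-client is installed and `creds` holds valid OAuth
    # credentials for the property.
    from googleapiclient.discovery import build

    SITE_URL = "https://www.example.com/"   # placeholder property

    def question_query_report(creds, start_date: str, end_date: str):
        service = build("searchconsole", "v1", credentials=creds)
        body = {
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": ["query", "page"],
            # Filter to queries that look like questions; adjust the regex to taste.
            "dimensionFilterGroups": [{
                "filters": [{
                    "dimension": "query",
                    "operator": "includingRegex",
                    "expression": "^(what|how|why|is|can|does)\\b",
                }]
            }],
            "rowLimit": 250,
        }
        response = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
        for row in response.get("rows", []):
            query, page = row["keys"]
            print(f"{query} -> {page}: {row['clicks']} clicks, "
                  f"{row['impressions']} impressions, CTR {row['ctr']:.1%}")
    ```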

    Conclusion: Start Answering, Stop Chasing

    The ‘People Also Ask’ box is more than a SERP feature; it’s a direct line to your audience’s needs. By treating it with the strategic respect it deserves, you can transform it from a minor tactic into a powerful engine for building authority and driving growth. The future of PAA optimization belongs to those who combine the efficiency of technology with the irreplaceable insight of human expertise.

  • The Traffic Is Gone. The Dragon Is Awake. Your New Reddit Marketing Strategy Is Here.

    Summary: For digital marketers and publishers in Canada, the familiar wells of referral traffic from Google and Meta are drying up. Reddit, long considered a high-risk, unmanageable platform, has emerged as the most powerful, untapped source of audience engagement. The launch of Reddit Pro, a new suite of tools for publishers, provides the first-ever safe and effective way to engage these communities. This article outlines a new Reddit Marketing Strategy, proving that early adopters are already seeing massive returns and explaining how you can “wake the dragon” without getting burned.

    Your referral traffic is in a nosedive. Let’s not pretend it isn’t.

    If you’re a digital marketer, publisher, or brand manager in Toronto, you’ve been watching the analytics. You’ve seen the impact of Google’s AI-driven SGE, which answers questions before users can click. You’ve felt Meta’s decisive pivot away from news and outbound links. The old, reliable faucets are being turned off, one by one.

    This has sent our entire industry into a desperate search for new, scalable sources of engagement. We’re all looking for the next growth engine.

    Here’s the uncomfortable truth: it’s been staring us in the face for over a decade.

    We’ve all been ignoring the “sleeping dragon” of the internet: Reddit. It’s a platform we’ve written off as a high-risk, low-reward minefield, full of anonymous users notoriously hostile to any form of marketing.

    That fear was once a shield; it is now our single greatest liability. Reddit is “waking up.” It has just launched Reddit Pro, a set of tools specifically for publishers. This launch isn’t just an update; it’s a formal invitation. Reddit is now the most critical and untapped growth engine for your brand, and these tools are the key to finally, and safely, turning it on.

    The High-Risk Minefield We All Avoided

    For over a decade, the prevailing wisdom on Marketing on Reddit was simple: don’t. Brands that tried were often publicly shamed. Users, grouped in hyper-niche communities called subreddits, possess a near-supernatural ability to detect a “shill” or an inauthentic post.

    The fear of this backlash was palpable. We were terrified of our content being torn apart, of our brand accounts being downvoted into oblivion, of becoming the main topic on r/HailCorporate. This fear caused us to ignore one of the largest, most engaged collections of communities on the planet.

    Think about the scale. Reddit isn’t one monolithic site; it’s a constellation of over 100,000 active subreddits. These are not just broad categories. They are hyper-focused forums for every conceivable interest, problem, and identity.

    For a Toronto-based audience, there isn’t just r/toronto. There is r/askTO, r/TorontoRealEstate, r/TorontoBlueJays, r/PersonalFinanceCanada, and thousands more. These are places where your target audience is already gathered, actively discussing the exact topics you write about. They are asking questions, sharing frustrations, and seeking recommendations with a level of candor unseen on any other platform.

    While we were busy buying ads on platforms where users passively scroll, we ignored the platform where users actively congregate around specific passions. We avoided the “minefield” and, in doing so, missed the goldmine.

    The Referral Traffic Faucet Is Running Dry

    The digital landscape that allowed us to ignore Reddit is gone. The ground has fundamentally shifted beneath our feet.

    Referral traffic from the two giants, Google and Meta, is in sharp decline. This isn’t a temporary dip; it’s a permanent realignment.

    Google’s Search Generative Experience (SGE) and AI-powered overviews are designed to provide direct answers on the search results page. This “zero-click” search means that even if you rank number one for a query, the user may get their answer from your content without ever visiting your site. For publishers who built an entire business model on answering questions, this is an existential threat.

    At the same time, Meta (Facebook and Instagram) has been clear about its priorities. It is moving away from news, politics, and, in many cases, outbound links altogether. The algorithm now favors creator-driven “entertainment” over publisher-driven “information.” The days of a viral Facebook post sending tens of thousands of new visitors to your site are effectively over.

    This has created a massive traffic vacuum. Publishers and brands are now in a desperate scramble for new, reliable traffic sources. We are being forced to re-evaluate every assumption we’ve ever had. This new reality is precisely why overlooking a platform with over 1.6 billion monthly active users is no longer a viable strategy. It’s a business-ending mistake.

    Reddit Is Waking the Dragon: A New Reddit Marketing Strategy

    Here is the most important development: Reddit itself is actively “waking the dragon.” The platform’s leadership understands its unique value and is now building the bridge for content creators to cross.

    The launch of Reddit Pro is the clearest signal yet. This is not some minor feature update. It is a purpose-built suite of tools designed to solve the very problems that kept marketers away. It is a formal invitation to publishers, offering a safe-conduct pass into its communities.

    What does Reddit Pro actually do? It provides the “map” to the dragon’s lair.

    • Article Insights: This is the most powerful feature. Reddit Pro allows you to see where your website’s content is already being shared organically across the platform, even if you never posted it. You can see which communities are sharing it, what they’re saying about it, and how much discussion it’s generating. This is market research gold. It’s your audience telling you exactly what they find valuable and where they live.
    • AI-Powered Community Recommendations: The biggest question for marketers has always been, “Where do I even post this?” This tool answers that. It analyzes your content and suggests relevant, high-engagement subreddits where your article would be a good fit. It moves you from “guessing” to “data-driven placement.”
    • RSS Auto-Import & Post Creation Tools: These tools make it simple to get your content onto the platform and format it in a way that is native to Reddit. It lowers the barrier to entry, allowing you to focus on engagement rather than on manual posting.

    These features are not about broadcasting ads. They are about targeted, value-driven participation. This is the foundation of a modern Reddit Marketing Strategy. Reddit is no longer a black box; it’s an open ecosystem with a new set of keys.
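    You do not have to wait for Pro access to start listening. As a rough approximation of what Article Insights surfaces, the hedged sketch below queries Reddit's public search endpoint for posts linking to your domain. It is not the Reddit Pro tool, the domain and User-Agent are placeholders, and Reddit's API terms and rate limits apply before you run anything like this at scale.

    ```python
    # Hedged sketch, not the Reddit Pro product: approximate the "where is our
    # content already being shared?" question using Reddit's public search JSON
    # endpoint. The domain, User-Agent string, and result handling are assumptions.
    import requests

    DOMAIN = "example.com"  # placeholder: your publication's domain

    def find_domain_mentions(domain: str, limit: int = 25):
        response = requests.get(
            "https://www.reddit.com/search.json",
            params={"q": f"url:{domain}", "sort": "new", "limit": limit},
            headers={"User-Agent": "content-research-sketch/0.1 (contact: you@example.com)"},
            timeout=10,
        )
        response.raise_for_status()
        posts = response.json()["data"]["children"]
        for post in posts:
            data = post["data"]
            print(f"r/{data['subreddit']}: {data['title']} "
                  f"({data['num_comments']} comments, score {data['score']})")

    find_domain_mentions(DOMAIN)
    ```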

    Proof from the Pioneers: The Dragon Can Be Tamed

    This is not a theoretical exercise. The thesis is already being proven by some of the world’s largest publishers.

    Major news organizations like The Atlantic, NBC News, and Vox Media were part of the early beta for these Pro tools. Their findings are staggering. Multiple outlets have reported that Reddit has, in a very short time, become a “top referral source.”

    Let that sink in.

    A platform that was considered “too risky” for a decade is now outperforming other channels for some of the most respected brands in publishing.

    This is tangible evidence that the dragon can be engaged, and the rewards are immediate. These publishers aren’t spamming links. They are using the Pro tools to identify relevant communities and share their best, most relevant journalism with people who are actively seeking it. They are finding that when you approach a community with respect and genuine value, that community responds with clicks, comments, and engagement.

    This early success from industry leaders is the final piece of the puzzle. It removes the “what if” and replaces it with “what is.” The model works. The traffic is real. The only remaining question is why your brand isn’t doing it yet.

    How to Engage the Dragon (Without Getting Burned)

    At this point, the old fears will creep back in. “This is all great,” many will argue, “but Reddit’s core value is its authenticity. The moment we show up, won’t we be met with that same fierce, brand-damaging backlash from users who can spot a ‘shill’ a mile away?”

    This is a valid concern. It is also the most important point to understand.

    This is precisely why the “sleeping dragon” metaphor is so fitting. You don’t fight the dragon. You don’t try to trick it or overwhelm it. You respect it.

    The old way—”broadcast-first” marketing—is dead on Reddit. That method involved barging into a community, shouting your promotional message, and expecting a result. It never worked.

    The new approach, facilitated by Reddit Pro, is a “value-first” approach. This is the core mindset shift. It’s about participation, not promotion.

    Your goal is not to “market” to a subreddit. Your goal is to become a part of the subreddit. Your goal is to use the Article Insights tool to find a community like r/PersonalFinanceCanada that is already debating a topic you’ve covered. Then, you join that conversation, not by saying “Read our new post,” but by saying, “This is a great discussion. Our team actually just published a deep-dive on this exact problem, and it might help answer some of the questions here.”

    The Pro tools are the map. They show you where the conversation is and what the community values. They help you find the places where your content is not an interruption, but a solution.

    This is the difference between a (hated) advertiser and a (valued) community expert. Reddit users don’t hate content; they hate inauthentic, low-value, self-serving promotion. Be the expert. Be the source of value. The tools now make that possible at scale.

    Your New Growth Engine: The First Steps

    The marketing landscape has been fractured. The old giants of traffic are faltering. We are all searching for what’s next.

    Reddit is what’s next. It’s the “sleeping dragon” that has awakened, and it’s looking right at us. The risk is no longer engaging with Reddit. The new, unacceptable risk is ignoring it.

    The platform is actively inviting you in. It has provided the tools. Your competitors, especially the major publishers, are already inside and reaping the rewards.

    It’s time to stop treating Reddit as an untouchable risk. The dragon is awake, and it’s offering a map.

    Here is your plan:

    1. Go to Reddit’s publisher page and sign up for the Reddit Pro beta today.
    2. Use the new tools to listen first. Run your domain through the Article Insights tool. See where you are already being mentioned. You will be surprised.
    3. Analyze what your target audience is actually talking about in these niche communities.
    4. Begin, slowly and authentically, to participate. Share your most relevant, valuable content in the right places, at the right time.
    5. Start building your new Reddit Marketing Strategy now, before your competitors corner the conversation.
  • Why Your GSC Impressions Dropped: Google’s War on Scraping Kills &num=100

    Summary: Google has permanently disabled the &num=100 search parameter. This change ends the ability for SEO tools to request 100 search results in a single query. The immediate impact is a 10x increase in operational costs for rank trackers and SERP scrapers, which must now paginate through 10 separate pages. This is causing tools to break or return errors. A secondary effect is a sudden, sharp drop in desktop impressions reported in Google Search Console. This drop is not a loss of human traffic but the removal of bot impressions from these tools, revealing that historical impression data has been inflated for years. This is a deliberate, strategic move by Google to combat large-scale data scraping, and SEO professionals must immediately audit their tools and re-baseline their impression data.

    Google Kills &num=100 Parameter: Why Your SEO Tools Are Breaking and Your Data Is Wrong

    Have your SEO tools started failing in the last few days? Are your rank-tracking reports full of errors or incomplete data?

    Perhaps you logged into Google Search Console and saw a sudden, terrifying drop in your desktop impressions.

    Your first thought might be a penalty or a massive algorithm update. The truth is simpler and has more profound consequences for our industry.

    Google has effectively killed the Google &num=100 parameter.

    This isn’t a temporary bug. It’s a deliberate, permanent change. It represents a defensive move by Google to combat data scraping. The SEO industry, which has relied on this function for over a decade, is collateral damage.

    This single change fundamentally breaks the economics of most SEO tools. It also forces us to accept a hard truth: our historical impression data has been wrong for years.

    What Was the Google &num=100 Parameter (And Why Did It Matter)?

    For as long as most SEOs can remember, you could add a simple string to a Google search URL to change the number of results.

    That string was &num=100.

    A typical search for “London digital agency” would return 10 results.

    A search using google.com/search?q=london+digital+agency&num=100 would return 100 results on a single page.

    This was the backbone of the entire SEO rank tracking industry.

    Think about how your rank tracker works. It needs to check where your site ranks for a keyword, from position 1 to position 100.

    Using the Google &num=100 parameter, it could do this with one single request. One query. One page load. One proxy IP address.

    It was fantastically efficient. It allowed tool providers like Semrush, Ahrefs, Moz, and hundreds of others to collect massive amounts of data at a relatively low and predictable cost.

    That entire model of efficiency is now gone.

    The New Reality: A 10x Cost for SEO Tool Data

    As of this week, the parameter is dead. Sending a request with &num=100 now simply returns the default 10 results.

    What does this mean for a tool that needs to check the top 100 positions?

    It means it must now make ten separate requests.

    1. It requests page 1 (results 1-10).
    2. It requests page 2 (results 11-20).
    3. It requests page 3 (results 21-30).
    4. …all the way to page 10 (results 91-100).
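    To see the multiplier in concrete terms, here is an illustrative Python sketch of the request pattern only. It builds the URLs rather than fetching them; the keyword is a placeholder, and real trackers add proxy rotation, delays, and result parsing on top of this.

    ```python
    # Illustrative sketch of the request pattern only (no fetching): the old
    # single-request model versus the paginated model tools are forced into now.
    from urllib.parse import urlencode

    BASE = "https://www.google.com/search"
    keyword = "london digital agency"

    # Before: one request covered positions 1-100.
    old_request = f"{BASE}?{urlencode({'q': keyword, 'num': 100})}"

    # After: ten requests, one per page of ten results, using the start offset.
    new_requests = [
        f"{BASE}?{urlencode({'q': keyword, 'start': offset})}"
        for offset in range(0, 100, 10)
    ]

    print("Old model:", 1, "request")
    print("New model:", len(new_requests), "requests")
    for url in new_requests[:3]:
        print(" ", url)
    ```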

    This is a catastrophic shift in operational mechanics.

    The cost of Google SERP scraping has just multiplied by ten overnight.

    Every tool provider now faces a 10x increase in the number of requests they must make. This means 10x the proxy IPs, 10x the bandwidth, 10x the infrastructure load, and 10x the risk of being blocked or served CAPTCHAs by Google.

    This is why your tools are breaking. They are failing to get data, timing out, or returning incomplete reports. Their entire scraping infrastructure is being re-engineered in a panic.

    This 10x cost will not be absorbed by the tool providers. It cannot be.

    Prepare for a new wave of price increases across the entire SEO tool industry. The era of cheap, daily, top-100 rank tracking is over.

    The ‘Desktop Impressions Drop’ Mystery Explained

    This tool crisis is running parallel to a second, confusing trend: the widespread desktop impressions drop in Google Search Console.

    At our London agency, we’ve seen clients’ GSC reports showing a sharp decline in desktop impressions, starting on the exact same day the parameter was disabled.

    The immediate fear is a loss of visibility or human traffic.

    This is not what is happening.

    You are not losing human visitors. You are losing bot impressions.

    For years, every time an SEO tool used the &num=100 parameter to scrape a keyword, it registered as a “desktop impression” for every single one of the 100 sites it found.

    Your site, ranking at position 78 for a high-volume keyword, might have been getting thousands of “impressions” per day. These were not humans. These were bots from Ahrefs, Semrush, and countless other rank trackers.

    Now that the &num=100 parameter is dead, that scraping has become 10x harder. The scraping volume has fallen off a cliff as tools scramble to adapt.

    The bot impressions have vanished from your GSC reports.

    The “drop” you are seeing is the removal of this long-standing data inflation. What you are left with is a much more accurate, clean baseline of actual human impressions.

    This is, in a way, a good thing. Our data is finally cleaner.

    The bad news? All your historical desktop impression data is inflated. Any report, any chart, any year-on-year comparison you’ve ever made using that data is based on a contaminated metric.

    This Is Not a Bug. This Is Google’s War on Scraping.

    Some in the SEO community have suggested this is a temporary test or a bug that will be reversed.

    This is wishful thinking.

    The widespread, uniform, and global nature of this change points to a deliberate policy decision. The 10x cost implication is not an accident; it is the entire point.

    This Google search parameter change is a strategic, defensive move.

    Google is in a data war, primarily against AI companies. Large language models (LLMs) are being trained by aggressively scraping Google’s search results. This taxes Google’s infrastructure and threatens its core business.

    By killing the &num=100 parameter, Google makes large-scale scraping 10 times more expensive and 10 times easier to detect. It’s harder for a scraper to hide when it has to make 10 distinct page requests per keyword instead of one.

    The SEO industry is not the target. We are simply the collateral damage in this larger conflict. Google is protecting its data asset, and it is willing to break our tools to do it.

    What SEO Professionals and Agencies Must Do Now

    We must accept this new reality and adapt immediately. Waiting for a fix is not an option.

    Here are the four actions you need to take this week.

    1. Audit Your SEO Tools. Contact your SEO rank tracking provider. Ask them directly how they are handling the removal of the Google &num=100 parameter. Are they stable? Are they paginating? Are they limiting tracking depth? Will their prices be increasing? You need these answers to trust your data.
    2. Re-baseline Your Impression Data. Go into Google Search Console immediately. Add an annotation for the date this change occurred. This is your new “Day Zero” for desktop impressions. You must explain to your clients, bosses, and stakeholders that the desktop impressions drop is a data correction, not a performance loss. All future reports must be benchmarked against this new, lower, more accurate baseline.
    3. Re-evaluate Your Tracking Needs. Do you really need to track 100 positions for every single keyword, every single day? For many keywords, tracking the top 20 or top 30 is more than enough. Reducing your tracking depth will be the primary way to manage the new costs that will be passed down from tool providers.
    4. Prepare for a More Expensive Future. The 10x infrastructure cost for SEO tool data is real. It will be passed on to us, the end-users. Budget for increased subscription fees for all your SERP-dependent tools.
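    For step 2, the before-and-after numbers are straightforward to pull once your “Day Zero” is set. The sketch below is a hedged example using the Search Console API; the property URL, dates, and credential handling are placeholders you would replace with your own.

    ```python
    # Hedged sketch for the re-baselining step: compare average daily desktop
    # impressions before and after the change date via the Search Console API.
    # Assumes google-api-python-client is installed and `creds` holds valid
    # OAuth credentials; the property URL and dates are placeholders.
    from googleapiclient.discovery import build

    SITE_URL = "https://www.example.com/"
    DAY_ZERO = "2025-09-15"  # placeholder: the date you annotated in GSC

    def desktop_impressions(creds, start_date: str, end_date: str) -> float:
        service = build("searchconsole", "v1", credentials=creds)
        body = {
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": ["date"],
            "dimensionFilterGroups": [{
                "filters": [{"dimension": "device", "operator": "equals", "expression": "DESKTOP"}]
            }],
        }
        rows = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute().get("rows", [])
        if not rows:
            return 0.0
        return sum(row["impressions"] for row in rows) / len(rows)

    # before = desktop_impressions(creds, "2025-08-15", "2025-09-14")
    # after = desktop_impressions(creds, DAY_ZERO, "2025-10-14")
    # print(f"Avg daily desktop impressions: {before:.0f} before vs {after:.0f} after")
    ```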

    The end of the &num=100 parameter marks a new chapter in our industry. It’s one with more restricted, more expensive, but ultimately more accurate data. The sooner we adapt, the better.

  • The Unified Mandate: Why CWV, GEO, and AEO are Non-Negotiable for LLM Optimization

    Your current, siloed SEO strategy is obsolete. Relying on separate teams for technical SEO, content, and local optimization is a failing model in an AI-driven search world. Google’s Search Generative Experience (SGE) and other LLM-driven models do not just “rank” your content; they “ingest” and “synthesize” it to form direct answers. Winning in this new era requires a single, unified framework. This new, holistic SEO model merges technical performance (Core Web Vitals), local context (GEO), and answer-first content (Answer Engine Optimization) into a cohesive LLM Optimization strategy. This article explains why this pivot from keyword optimization to intent fulfillment is essential for survival and how to begin implementing it.

    Your technical SEO team just spent a month shaving 200ms off your Largest Contentful Paint (LCP). Your content team published five “keyword-optimized” articles. Your local agency is busy managing Google Business Profile reviews across your London offices.

    And yet, your visibility in AI-generated answers is zero.

    Why? Because these efforts are completely disconnected. You are meticulously optimizing for a search engine that is rapidly being replaced. The age of “10 blue links” is ending. The new battleground is the AI-generated answer box, and it plays by an entirely different set of rules.

    Surviving this shift demands a radical pivot. We must stop chasing keywords and start mastering “intent fulfillment.” This requires a holistic strategy where technical performance (CWV), local context (GEO), and answer-first content (AEO) are all optimized for ingestion and validation by Large Language Models (LLMs).

     LLMs Don’t “Crawl,” They “Ingest”: Your New Content Mandate

    For two decades, SEO has been about “crawling.” We built sites for Googlebot. We used keywords to help it index and rank a document.

    That process is now secondary.

    LLMs and generative AI experiences like SGE operate on a different principle: ingestion. They do not want to just list your page; they want to consume it. They extract its information, validate its authority, and synthesize its facts into a new, combined answer.

    If your content is not built for this ingestion process, it will be ignored.

    AI-driven search values your content differently. Success is no longer about keyword density. It is about:

    • Structured Data: Schema (like FAQPage, Article, LocalBusiness, Product) is no longer a “nice to have.” It is the instruction manual you give the LLM. It explicitly tells the AI what your content is, what your business does, and how to use your information correctly. Without it, the AI has to guess. It will not guess. It will use a competitor’s content that is structured.
    • Clear E-E-A-T Signals: Experience, Expertise, Authoritativeness, and Trustworthiness are the primary validation signals for an LLM. An AI model is trained to identify and prefer sources that demonstrate authority. This means clear, detailed author biographies, a robust “About Us” page, external citations from reputable sources, and transparent contact information. A page with “By Admin” is a page that an LLM will rightly judge as untrustworthy.
    • Answer Engine Optimization (AEO): This is the “AEO” pillar. You must stop writing “articles” and start providing “answers.” Your content must be formatted for synthesis. This means using clear, descriptive headings (H2s, H3s) that map to user questions. It means using concise paragraphs, bulleted lists, and tables. If a user asks a question, your page must provide the most direct, well-supported, and easily-extracted answer to that question.

    LLM Optimization begins here. You are no longer writing for a user; you are writing to be the source for an AI that is serving the user.
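    To make the structured-data point concrete, here is a minimal sketch that emits FAQPage JSON-LD for an answer-first page. The questions and answers are invented placeholders; validate any real markup with Google's Rich Results Test before shipping it.

    ```python
    # Minimal sketch of the "instruction manual" idea: emit FAQPage JSON-LD for
    # an answer-first page. The questions and answers are placeholders.
    import json

    faq_schema = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": "What is included in a technical SEO audit?",
                "acceptedAnswer": {
                    "@type": "Answer",
                    "text": "A technical SEO audit covers crawlability, indexation, "
                            "site speed, structured data, and internal linking.",
                },
            },
            {
                "@type": "Question",
                "name": "How long does a technical SEO audit take?",
                "acceptedAnswer": {
                    "@type": "Answer",
                    "text": "Most audits take one to three weeks depending on site size.",
                },
            },
        ],
    }

    print('<script type="application/ld+json">')
    print(json.dumps(faq_schema, indent=2))
    print("</script>")
    ```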

    Core Web Vitals and AI: Why Technical Performance is Now a Trust Signal

    For years, many marketing directors have viewed Core Web Vitals (CWV) as a separate, technical chore. A box to be ticked by the IT department to keep Google happy.

    This is a critical, and now dangerous, misunderstanding.

    A slow, janky site (poor LCP, high Cumulative Layout Shift) is, first and foremost, a bad user experience. AI models are trained on massive datasets to associate poor user experience with low-quality, untrustworthy content.

    Think of it from the AI’s perspective. Its primary goal is user satisfaction. If it synthesizes an answer and provides a link to your site for more information, and that page takes five seconds to load or shifts around as ads pop in, the user is frustrated. This frustration reflects poorly on the AI, not just your brand.

    The AI model infers this. It understands that a site that invests in a stable, fast, and secure user experience (good CWV, HTTPS) is more likely to be a legitimate, authoritative operation. A site that cannot be bothered to fix its technical foundation is probably not a reliable source of information.

    Core Web Vitals are no longer just a “Google” metric. They are a foundational trust signal.

    A technically sound site is the price of entry to be considered a trusted source for LLMs. A poor CWV score is a high-friction signal. The LLM will simply get its information from a lower-friction, higher-quality source. Your excellent, well-researched content will never even be ingested because your technical foundation failed the first test.
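    You can monitor the field data that matters here without waiting for a full audit. The hedged sketch below queries the public PageSpeed Insights v5 API for real-user LCP and CLS; the URL is a placeholder, an API key is only needed at volume, and the metrics may be absent for pages without Chrome UX Report data.

    ```python
    # Hedged sketch: check real-user Core Web Vitals for a URL via the public
    # PageSpeed Insights v5 API. Field names may be missing if Chrome UX Report
    # has no data for the page; CLS percentiles are typically reported x100.
    import requests

    PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

    def field_cwv(url: str, strategy: str = "mobile") -> dict:
        response = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60)
        response.raise_for_status()
        metrics = response.json().get("loadingExperience", {}).get("metrics", {})
        return {
            "LCP_ms": metrics.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("percentile"),
            "LCP_rating": metrics.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("category"),
            "CLS_x100": metrics.get("CUMULATIVE_LAYOUT_SHIFT_SCORE", {}).get("percentile"),
        }

    print(field_cwv("https://www.example.com/"))
    ```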

    Context is King: How GEO and AEO Create Relevance for LLMs

    LLMs thrive on context. A query like “best Sunday roast” or “compliance software” is functionally meaningless on its own.

    In the old model, the user would have to refine their search. In the new model, the AI does it for them.

    AI models are integrating user data by default. The most important contextual signal is location (GEO). That “best Sunday roast” query, coming from a user in London, is instantly understood as “best Sunday roast near me” or “best Sunday roast in Islington.”

    A query for “compliance software” from a device located in the City of London is understood as “MiFID II compliance software for UK-based financial firms.”

    Your content must be explicitly optimized for this contextual intent. This is where GEO (Local SEO) and AEO (Answer Engine Optimization) converge into a single, powerful tool for LLM Optimization.

    Look at your current content.

    • Bad Content: A blog post titled “Our 10 Favorite Sunday Roasts.”
    • Good Content: A local landing page titled “The Best Sunday Roast in Islington, London.” This page is structured with clear AEO-driven Q&As (“What time is Sunday roast served?”, “Is it kid-friendly?”, “What is the average price?”, “What are the vegetarian options?”). It is marked up with LocalBusiness and Restaurant schema, has an embedded map, and lists opening hours.

    This “Good Content” example is now the perfect, ingestible source for an AI. When a user asks their phone, “Where can I get a good Sunday roast near Angel station that’s good for kids?”, the AI can confidently synthesize an answer directly from your page.

    You are no longer just optimizing for a user searching on Google Maps. You are optimizing to be the definitive source that the AI uses to answer that user’s specific, location-aware, and high-intent query.
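    For the machine-readable layer of that “Good Content” page, a Restaurant JSON-LD block carries the location, hours, and price details an answer engine needs. Every business detail in the sketch below is an invented placeholder.

    ```python
    # Illustrative sketch of the local landing page's structured-data layer:
    # Restaurant JSON-LD with location and opening-hours details. All business
    # details here are invented placeholders.
    import json

    restaurant_schema = {
        "@context": "https://schema.org",
        "@type": "Restaurant",
        "name": "Example Roast House",
        "servesCuisine": "British",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "1 Example Street",
            "addressLocality": "Islington, London",
            "postalCode": "N1 0XX",
            "addressCountry": "GB",
        },
        "openingHoursSpecification": [{
            "@type": "OpeningHoursSpecification",
            "dayOfWeek": "Sunday",
            "opens": "12:00",
            "closes": "21:00",
        }],
        "priceRange": "££",
    }

    print(json.dumps(restaurant_schema, indent=2))
    ```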

    This Isn’t More Work, It’s Smarter Work: The Compounding Returns of a Holistic SEO

    I speak to marketing directors and in-house SEO managers in London every week. The immediate pushback is predictable: “My teams are already at capacity. We cannot manage another ‘optimization’ trend. We are stretched thin managing our current SEO, content, and technical backlogs.”

    This reaction is based on a false premise. This is not another trend to add to the pile. It is the unification of your existing, scattered, and inefficient efforts.

    Right now, you have three different teams (or agencies) running on three separate treadmills, producing three separate, low-impact assets:

    1. Tech Team: Fixes a sitewide CLS issue. (Impact: Marginal)
    2. Content Team: Writes a 1,500-word blog post on a broad keyword. (Impact: Low)
    3. Local Team: Updates holiday hours on GMB. (Impact: Minimal)

    This is a massive waste of resources.

    The new, unified model creates a single, high-impact asset. Imagine your team is building a new page for a key commercial service.

    • The Process: The Content Strategist, GEO Specialist, and Technical SEO work together from the start.
    • The Asset:
      • The page structure is pure AEO. It is built as a series of direct answers to the most common user questions (“What is [service]?”, “Who needs [service]?”, “How much does [service] cost in London?”, “What is the process?”).
      • The content is enriched with GEO signals. It explicitly mentions the London boroughs or industries it serves. It includes LocalBusiness schema, client testimonials with locations, and embedded maps.
      • The page is validated by CWV. The technical team ensures this specific page loads instantly, is perfectly stable, and is flawless on mobile.

    The Payoff: This single asset now creates compounding returns. It serves all search masters simultaneously.

    • It ranks in traditional search for its target keywords.
    • It appears in local search and map packs for its GEO-specific terms.
    • It is now the perfect, ingestible, validated source for an LLM to use in an AI-generated answer.

    You have stopped bailing water with three different buckets. You have unified your team to build a single, faster boat. This isn’t more work; it’s smarter, more focused work.

    Stop Auditing in Silos: Your First Step to a Real AI Search Strategy

    Your current reports are lying to you.

    A “green” Core Web Vitals score means nothing if your content is unstructured mush that an AI cannot ingest. A high-ranking blog post is a vanity metric if an AI bypasses it entirely by providing a direct answer sourced from a competitor.

    The fundamental problem is that you are measuring the components, not the system. You are admiring the individual bricks while your house is being redesigned by someone else.

    The first step is to get an honest baseline. You must stop commissioning a “Technical Audit,” a “Content Audit,” and a “Local SEO Audit” as if they are unrelated. You must see how these elements perform together, in the context of your main competitors.

    Stop auditing your site in silos. At OnDigital, we have moved beyond these fragmented, outdated reports. It is time for a unified “AI Readiness Audit” that benchmarks your CWV, GEO, AEO, and LLM signals against your top competitors.

    This is the only way to see the real gaps and build a strategy that works for the next decade of search, not the last. The AI search era is here. You can either be the source it quotes or the link it forgets.
