FAQ by Customer Type

Real questions people ask, organized by who's asking.

New Startup

Just launched or about to launch

I just built my site with an AI builder. Is it too early for Scaup?

The misconception: People think they need traffic before they need SEO. It's the opposite. Google's index takes weeks to months to pick up a new site. Every day you wait is a day your competitors are building a lead.

The reality: AI builders produce great-looking sites with zero search visibility. The HTML is clean, but the titles are generic ("Home", "About"), there's no keyword strategy, and the content doesn't match what people actually search for. That's the gap between "having a site" and "being found."

What we do: We connect to your Google Search Console day one, analyze who your competitors are and what they rank for, then start rewriting your content to target real searches. The earlier you start, the faster Google picks you up.

Sales tip: "Your AI built the site. Ours gets it found." This is our strongest pitch for this audience.

My site only has 3-4 pages. Is there even enough to work with?

The misconception: "I need more content before I invest in SEO." Actually, having few pages is an advantage. On a 200-page site, improvements get diluted. On a 4-page site, every single change has a massive relative impact.

Why it's actually ideal: Most small sites rank for exactly zero keywords - not by accident, but because their pages aren't targeting anything specific. We take those 4 pages, figure out the highest-value searches they should target, and optimize them. That alone can start generating traffic.

Then we grow: Our competitor analysis finds searches your competitors rank for that you don't have a page for at all. We create those pages. A 4-page site might become a 10-page site in the first month, each page targeting a specific search opportunity.

How long until I see results?

Why this question matters: Anyone who promises fast results is either lying or doing something spammy. Google intentionally makes ranking a slow, trust-based process to prevent manipulation. There's no shortcut.

Honest timeline: 2-4 weeks for initial signals (pages getting indexed, first impressions in Google Search Console). 2-3 months for meaningful traffic on a new site. Existing sites with some authority see faster results because Google already trusts them.

What you see right away: Your growth plan, the changes we're making each week, keyword movement in your weekly email. You'll know exactly what's happening even before the traffic shows up. There's never a "black box" period.

Sales tip: set expectations early. Under-promise, over-deliver. "2-3 months" is honest and builds trust. Anyone promising page-one in 2 weeks is setting up for a chargeback.

Do I need to know anything about SEO?

Why they're asking: They've probably looked at SEO tools before and bounced off the jargon. "Crawl budget," "domain rating," "SERP features" - it feels like a different language. They're worried Scaup will be the same.

The answer: No. Zero. That's literally why we exist. You'll never see a score, an audit, or a list of "issues." You connect your site, we generate a growth plan in plain English, and every week you get an email saying "we improved 3 pages this week" - not "we optimized your H1 tag density."

If you can read an email, you can use Scaup.

Existing Site

Already has a site with some traffic

Will Scaup break anything on my existing site?

Why they're worried: They've invested in their site. Maybe they had a bad experience with a tool or contractor who broke something. This is a trust question, not a technical one.

The architecture that makes this safe: We never touch code, design, layout, or structure. We only edit content - text, titles, descriptions, headings. Every change runs through 3 layers of validation: (1) code-level parse check to make sure the file is still valid, (2) AI review to catch content problems like accidentally deleting important info, (3) a full build test on GitHub sites. If any layer fails, the change is rejected before it reaches the site.

And you're always in control: Preview mode is the default. You see every change as a before/after diff and approve or reject individually. Nothing goes live without your say-so. And every change is one-click reversible.

I already have some Google rankings. Will you mess those up?

Why this fear exists: They've heard horror stories about SEO "optimizations" tanking existing rankings. It happens - usually when someone blindly rewrites pages without understanding what's already working. That's the opposite of how we work.

How we protect what's working: Before touching anything, we pull your Google Search Console data. We know exactly which pages rank, for what keywords, at what position. Pages that are performing well get flagged as "protect" - we don't rewrite content that's already winning.

Where we focus instead: Underperforming pages (ranking position 8-20 where small improvements can push you onto page one), pages with impressions but low clicks (title/description aren't compelling enough), and keyword gaps where you have no page at all. We grow what's weak, not mess with what's strong.

And if anything does dip unexpectedly, every change is instantly reversible.

What if I already did some SEO work manually?

Why this is actually great news: A site with some manual SEO work is our ideal starting point. It means there's existing authority, some keyword traction, and a foundation to build on. Results come faster on sites that already have something working.

We don't start from zero: We analyze your current state: which titles are already good, which pages rank, where the gaps are. If you spent time crafting a great title for your homepage, we leave it alone and focus on the 15 other pages that don't have one. We're picking up where you left off, not overwriting your work.

Think of it as going from "I did SEO once" to "SEO happens every week automatically."

Do you handle backlinks?

First, let's talk about what backlinks actually do for a site your size. Backlinks are the #1 factor for competitive, high-volume keywords - think "best CRM software" or "cheap flights." But that's not where small sites and startups win. You win on long-tail, specific searches: "yoga studio downtown Austin," "invoice tool for freelancers," "Astro portfolio template dark mode." For those searches, content relevance and quality are what decide the ranking, not who has more backlinks.

The math: A site with 50 well-targeted pages beating the intent of each search will outperform a site with 5 generic pages and 200 backlinks. Backlinks open doors, but content is what makes people (and Google) stay.

What we do about it: We analyze your niche and provide a curated list of the best-fit websites where you can add a link to your site - relevant directories, communities, industry listings, publications that accept contributions. Not a generic "go build links" suggestion, but specific places with instructions. You spend 30 minutes going through the list, not 30 hours researching.

And: Better content naturally attracts more backlinks over time. People link to pages that are genuinely helpful. That's what we build.

Sales tip: don't dismiss backlinks - the customer is right that they matter. The argument is that they're not the bottleneck for a site this size. Content is. And we handle content + give them the backlink list.

Can I see what you changed and undo it?

Why this matters: Handing over control of your website content to an automated system is a big trust decision. Reversibility is what makes it a low-risk one.

Full transparency: Every change shows a clear before/after diff - the exact text that was there, and the exact text we replaced it with. No hidden changes, no side effects.

Full reversibility: For GitHub sites, every change is a git commit you can revert. For WordPress, we store content snapshots so you can restore with one click. There is no change we make that you can't undo in under 10 seconds.
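The snapshot mechanism is simple enough to sketch in a few lines. This is an illustrative toy (an in-memory dict with made-up names), not Scaup's actual storage:

```python
# Illustrative sketch of content snapshots for one-click restore.
# A trivial in-memory store; a real system would persist snapshots.

class SnapshotStore:
    def __init__(self):
        # change_id -> (page_id, original_text)
        self._snapshots = {}

    def apply_change(self, change_id, page_id, pages, new_text):
        # Save the original text BEFORE overwriting it.
        self._snapshots[change_id] = (page_id, pages[page_id])
        pages[page_id] = new_text

    def revert(self, change_id, pages):
        # One-click undo: restore the text captured before the change.
        page_id, original = self._snapshots.pop(change_id)
        pages[page_id] = original

pages = {"about": "We are a company."}
store = SnapshotStore()
store.apply_change("chg-1", "about", pages, "We help freelancers invoice faster.")
store.revert("chg-1", pages)
# pages["about"] is back to "We are a company."
```

Because the original text is captured before every write, a revert never depends on the current state of the page.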

In preview mode (the default), none of this is even needed because nothing goes live without your explicit approval.

WordPress

WordPress-specific questions

I use Elementor / Divi / WPBakery. Does Scaup work with page builders?

The real problem with page builders: Elementor, Divi, WPBakery, and similar tools store content in their own proprietary format - encoded blocks, shortcodes, nested JSON. Editing that from the outside is dangerous. One wrong character and your entire page layout breaks. We tried, and the risk-to-reward ratio isn't there.

What we do instead: We detect page builder content and skip it. We don't pretend we can safely edit it. That's an honest answer, not a limitation we're hiding.

What we can still do (and this is a lot):

- Page titles - the single most impactful ranking factor, and page builders don't control these
- Meta descriptions via Yoast/Rank Math - also outside the page builder
- New blog posts - WordPress native editor, no page builder involved
- Technical files - robots.txt, sitemap, structured data
- Competitor analysis and keyword strategy - this doesn't depend on how your site is built

For the page builder content itself, we draft improvement suggestions and deliver them as downloadable files you can apply manually. You still get the strategy and the writing; you just paste it in yourself.

Sales tip: be 100% upfront. "We can't safely auto-edit Elementor content, but here's what we can do" builds more trust than overpromising. If most of their site is page-builder pages with no blog, set expectations that our impact will be more limited.

Do I need Yoast or Rank Math?

Why it helps: Meta descriptions are one of the biggest levers for click-through rate. When someone sees your page in Google results, the meta description is what convinces them to click. Yoast and Rank Math expose an API that lets us write those descriptions automatically. Without an SEO plugin, WordPress doesn't have a clean way to set them.

Without a plugin: We can still update page titles, rewrite post content, create new blog posts, and manage technical files (robots.txt, sitemap). That's still the majority of what moves the needle.

Bottom line: Not required, but recommended. If they don't have one, Rank Math free takes 2 minutes to install and gives us full access. It's worth suggesting during onboarding.

We support Yoast SEO and Rank Math. All in One SEO support is coming.

How does Scaup connect to my WordPress site?

Why they're asking: They want to know if this is safe and if it requires installing sketchy plugins. Fair concern.

How it works: WordPress has a built-in system called Application Passwords (since version 5.6, 2020). You go to your WordPress admin, create a dedicated password for Scaup, and we use the standard WordPress REST API. It's the same official method that all reputable WordPress tools use. No sketchy plugins, no FTP access, no giving us your admin password.

For advanced features (automated meta description writes, static file management), we have a lightweight open-source plugin you install. It adds a few API endpoints - that's all it does.
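Under the hood, Application Passwords are plain HTTP Basic auth against the standard REST API. A hedged sketch with invented credentials (the username, password, and endpoint below are examples, not real values):

```python
# Example: how a tool authenticates to the WordPress REST API with an
# Application Password. Credentials here are made up for illustration.
import base64

def basic_auth_header(username: str, app_password: str) -> dict:
    # Application Passwords use standard HTTP Basic auth. The spaces
    # WordPress displays in the password are cosmetic and may be kept.
    token = base64.b64encode(f"{username}:{app_password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

headers = basic_auth_header("editor", "abcd efgh ijkl mnop qrst uvwx")
# A title update is then an ordinary REST call, e.g.:
# POST https://example.com/wp-json/wp/v2/pages/<id>
# with JSON body {"title": "New title"} and these headers.
```

No plugin is needed for this part; it's the same mechanism any REST client would use.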

Known edge case: Some managed WordPress hosts (WP Engine, Kinsta, Flywheel) disable Application Passwords for security. We detect this automatically and show a clear error with workaround instructions. It's fixable but worth mentioning upfront if they use one of these hosts.

My site is behind a firewall / Cloudflare. Will it work?

Yes, with a 2-minute setup. Our worker connects to your WordPress API from a fixed set of IP addresses. If your site is behind Cloudflare or another firewall, you just add those IPs to your allowlist. We show the exact IPs in your Scaup site settings.

Why this comes up: Many WordPress sites use Cloudflare or host-level firewalls that block unknown API requests. This is good security practice - it just means you need to tell the firewall "Scaup is allowed." One-time setup, takes 2 minutes.

Sales tip: if they seem overwhelmed by this, offer to walk them through it during onboarding. It's literally adding 2 IP addresses to a list.

Will it create draft posts or publish directly?

The concern behind this: They're worried they'll wake up to published content they didn't approve. That won't happen.

New content is always a draft. We create blog posts and pages as WordPress drafts. You review them in your WordPress editor, edit if you want, and publish when you're ready. We never auto-publish new content.

Existing content edits (titles, descriptions, body text improvements) default to preview mode - you see the change, approve or reject, then we apply it. Once you trust the system, you can opt into auto-apply where changes go live immediately but are always one-click reversible.

You choose the level of control you're comfortable with.

Developer

Technical questions from devs

What frameworks do you support?

Why framework matters: Different frameworks store content in different ways. A Next.js page looks nothing like an Astro page. We need to understand the file structure to make safe, targeted edits without breaking anything.

GitHub-connected sites: Astro, Next.js, Gatsby, Nuxt, SvelteKit, Remix, and static HTML. Astro gets the deepest support (we can create new pages, add components, insert structured data through our recipe system). All others get full content editing and new page creation.

WordPress: Standard WordPress.com and self-hosted. SEO plugin support for Yoast and Rank Math.

If they're on something else: If it outputs standard HTML files, there's a decent chance we can work with it under our static HTML support. Worth checking their repo structure.

How do you make changes to my codebase? Do you just overwrite files?

Why devs ask this: They've seen AI tools that dump entire files. That's terrifying when your site is in production. Rightfully so.

How we actually work: We parse each file into an AST (abstract syntax tree) and identify specific editable zones - a frontmatter title field, an HTML paragraph, a markdown section, a meta tag attribute. We edit only those zones, leaving everything else byte-for-byte identical. We never rewrite an entire file.

Three safety gates before any change reaches your repo:

1. Parse check - we re-parse the edited file to verify it's still valid syntax
2. Diff size guard - if a change is suspiciously large (e.g. 2000+ chars for a heading), it's rejected
3. Build check - we actually run your build command and verify it passes

If any gate fails, the change is rejected. It never touches your repo.
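The three gates can be sketched roughly as follows. The function names, the 2000-character threshold, and the error handling are illustrative assumptions, not the production code:

```python
# Simplified sketch of the three safety gates described above.
import subprocess

MAX_DIFF_CHARS = 2000  # illustrative threshold for a suspiciously large edit

def parse_gate(edited_source: str, reparse) -> bool:
    """Gate 1: re-parse the edited file; any syntax error rejects it."""
    try:
        reparse(edited_source)
        return True
    except Exception:
        return False

def diff_size_gate(before: str, after: str) -> bool:
    """Gate 2: reject changes that grow or shrink a zone too much."""
    return abs(len(after) - len(before)) <= MAX_DIFF_CHARS

def build_gate(build_cmd: list) -> bool:
    """Gate 3: run the site's own build command; non-zero exit rejects."""
    return subprocess.run(build_cmd, capture_output=True).returncode == 0
```

A change is committed only if every gate returns True; the first failure short-circuits the whole change set for that file.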

Sales tip: this is a strong trust-builder for developers. They've been burned by dumb AI tools. This is the opposite.

Do you commit directly to main?

The worry: Unreviewed commits landing on production. Understandable.

What happens: We commit to whatever branch your site deploys from. Each change set is one clean commit with a clear, descriptive message. We store the commit SHA so you can revert any individual change at any time.

Why not PRs? For most small sites, the friction of reviewing PRs weekly would mean the work never gets done. The system is designed for "set and forget." But the preview mode gives you the review gate if you want it - you approve changes before we commit, not after.

Does it touch my components, CSS, or layout files?

This is a hard technical boundary, not a soft guideline. The system literally cannot emit imports, JSX expressions, CSS properties, className attributes, React hooks, or component tags. It's not that we "try not to" - the code rejects it at the output level.

What we can edit: Plain text content, meta tag attributes, frontmatter string fields, markdown body, and HTML text nodes (headings, paragraphs, list items). That's the entire surface area. Everything else is off-limits by design.

This is a feature, not a limitation. Staying content-only is what makes it safe to run automatically.

Can I see the changes before they go live?

Yes, and it's the default. Preview mode shows every proposed change as a diff: the exact current text and the exact proposed replacement. You approve or reject each change individually.

The progression: Most people start in preview mode, review the first 2-3 batches of changes, realize the quality is good, and then flip to auto-apply. Auto-apply means changes go live immediately but are still one-click reversible. You choose the level of trust you're comfortable with.

Think of it like a new hire: you review their work closely at first, then give them more autonomy once they've proven themselves.

What about my CI/CD pipeline?

We work with your pipeline, not around it. Our commits land in your repo like any other commit. Your existing CI/CD triggers normally - tests run, deploys fire, whatever your workflow is.

But we also catch problems first: Before committing, we run your build command ourselves. If our change would break the build, we reject it before it ever hits your repo. Your CI pipeline never sees a broken commit from us.

If your pipeline has linting, type checking, or other checks, our content-only changes shouldn't trigger any issues since we never touch code. But if they do, the change gets caught at our build-check gate.

Hard Questions

Objections and tough questions

Isn't this just AI-generated content? Google penalizes that.

The misconception: "AI content = spam = Google penalty." This was true for the old-school article spinners. It's not true for modern AI writing, and Google has said so explicitly.

Google's actual position (official, published 2023): "Appropriate use of AI or automation is not against our guidelines." They evaluate content on helpfulness, not on how it was produced. A human-written page that doesn't answer the search query will rank below an AI-written page that does.

Why our content doesn't read like AI: We run every piece of output through anti-AI-writing filters. 50+ banned phrases and patterns: no "furthermore," no "in today's digital landscape," no "it's important to note," no em dashes. We strip the fingerprints that make AI content obvious. The output reads like a human wrote it because we actively prevent it from reading like a machine did.
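As a rough illustration of how such a filter works (the four patterns below are a tiny sample for the sketch, not the real 50+ list):

```python
# Illustrative sketch of an anti-AI-writing filter.
import re

BANNED_PATTERNS = [
    r"\bfurthermore\b",
    r"\bin today's digital landscape\b",
    r"\bit's important to note\b",
    "\u2014",  # em dash
]

def flag_ai_tells(text: str) -> list:
    """Return the banned patterns found in a draft so it can be rewritten."""
    lowered = text.lower()
    return [p for p in BANNED_PATTERNS if re.search(p, lowered)]

draft = "Furthermore, in today's digital landscape, titles matter."
flag_ai_tells(draft)  # flags 'furthermore' and the 'digital landscape' phrase
```

Any draft that trips the filter gets rewritten before it ever reaches the site.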

What we never do: Fabricate facts, invent statistics, create fake testimonials, make up awards or certifications. Everything we write is grounded in the actual business and its real content.

Sales tip: if they push back, ask them to Google any competitive keyword in their niche and look at the top 5 results. Most of that content is AI-assisted already. The winners aren't the ones avoiding AI - they're the ones using it well. The question isn't "AI or human" - it's "helpful or not."

Why would I pay for this when I can just use ChatGPT?

The assumption: SEO = writing content. If ChatGPT writes content, SEO is solved. It's not. Writing is about 10% of the job.

The other 90% that ChatGPT can't do:

- Knowing what to write - ChatGPT doesn't have your Google Search Console data. It doesn't know which keywords you're close to ranking for, which pages have impressions but no clicks, or what your competitors rank for that you don't. We pull that data every week and use it to decide what to work on.
- Safely applying changes - ChatGPT gives you text. You still need to edit your codebase, update frontmatter, validate the build doesn't break, preserve your OG tags and canonical URLs. We do all of that automatically with safety validation.
- Doing it consistently - You'll use ChatGPT once, maybe twice. Then you'll get busy. SEO works through consistency - weekly improvements, fresh content, reactive optimization when keywords move. We run every single week whether you're paying attention or not.
- Strategy - ChatGPT writes whatever you ask it to. It doesn't tell you that your homepage title is wrong, that you're missing 12 keyword opportunities your competitors own, or that your new page should target "yoga classes Austin" not "yoga services."

The short version: ChatGPT is a pen. Scaup is the whole marketing department.

What if I don't like the changes you make?

Why this is a non-issue: You're never stuck with anything. The system is designed around reversibility because that's what makes automation safe.

Before it goes live: Preview mode (the default) shows you every change as a before/after diff. You approve or reject individually. Don't like a headline? Reject it. Like the description but not the body text? Approve one, reject the other.

After it goes live: Every change is one-click reversible. GitHub sites: we store the commit SHA and can revert it. WordPress: we store content snapshots and can restore the original. There's no scenario where a change is permanent and you're stuck.

The real question is the opposite: What if you love the changes? Then flip to auto-apply and let us handle it. Most people start cautious and move to auto within a few weeks.

You can't do backlinks, and that's the most important ranking factor. So what's the point?

Let's break down when backlinks actually matter. Backlinks are decisive for high-volume, competitive keywords - "best project management tool," "cheap flights to London," "CRM software." These are keywords where 10+ well-funded companies are fighting for position. For those battles, yes, backlinks are king.

But that's not where small sites and startups win. You win on long-tail, specific searches: "project management for 2-person teams," "Astro portfolio template with dark mode," "dog groomer near Bushwick." For these searches - which make up the vast majority of all Google queries - content relevance is what decides the ranking. The page that best answers the specific question wins, regardless of backlink count.

The math for a small site: 50 pages each ranking for a specific long-tail keyword will drive more traffic than 5 generic pages with 200 backlinks. Long-tail keywords have less competition, higher intent (the searcher knows exactly what they want), and convert better.

We don't ignore backlinks though. We analyze your niche and deliver a curated list of the best-fit websites where you should add a link to your site: niche directories, community forums, industry listings, relevant publications. These are specific places with high relevance to your business, not a generic "go build links" suggestion. You go through the list in 30 minutes.

And there's a compounding effect: Better content naturally attracts more backlinks over time. People link to pages that are genuinely useful. The content work we do today becomes the backlink magnet of tomorrow.

Sales tip: never dismiss the backlinks concern - the customer is right that they matter. The argument is: (1) they're not the bottleneck for YOUR site at YOUR size, (2) content is the thing you can control that makes the biggest difference right now, (3) we still give you a backlink action list. That's the complete answer.

How is this different from hiring a freelance SEO person?

What a freelancer actually delivers for $500-2000/month: One monthly report (a PDF you skim), a few title/description rewrites, maybe a blog post, and a call where they explain what you should do next. Between check-ins, nothing happens. And most of their recommendations require you or a developer to implement them.

What Scaup delivers: Every week - not once a month - we pull fresh Google data, identify opportunities, execute improvements directly on your site, and send you a summary of what changed. No meetings, no implementation gap, no waiting for the next monthly cycle.

Where a freelancer genuinely wins: Creative strategy, link-building relationships, hands-on outreach, and judgment calls on brand voice. A good SEO freelancer brings things automation can't replicate.

The honest comparison: 80% of what a freelancer bills for is the on-page content work - title rewrites, meta descriptions, content improvements, keyword targeting. We do that faster, cheaper, and 4x more frequently. The other 20% (relationships, creative strategy, outreach) is where humans still win. The question is whether that 20% is worth $500+/month to you right now, or if the automated 80% is enough to move the needle.

Sales tip: don't trash freelancers - the customer might know one or have worked with one. Respect the value, then show the math on consistency and cost.

What about page speed? My site is slow and that's hurting my rankings.

Let's put page speed in perspective. Yes, Core Web Vitals (page speed metrics) are a Google ranking factor. But they're a tiebreaker, not a dealmaker. Google has said this explicitly: speed matters when two pages are equally relevant. Content relevance still decides the ranking 95% of the time.

And here's the thing: Page speed is almost always a one-time fix, not an ongoing problem. Compress your images, enable lazy loading, let your CDN do its job. Vercel, Netlify, Cloudflare - they all handle this automatically. If your site is on a modern host, your speed is probably already fine. Run a PageSpeed Insights test - you might be surprised.

Why it's not what we do: Speed is a code and infrastructure problem, not a content problem. It requires image pipeline changes, JavaScript optimization, server configuration. That's a different discipline entirely. And unlike content, it doesn't need ongoing weekly attention - fix it once and it stays fixed.

The play: Fix your speed once (it's usually a few hours of work), then let Scaup handle the ongoing content optimization that actually moves your rankings week after week.

Sales tip: don't get pulled into debugging their Lighthouse score. Acknowledge speed matters, suggest a one-time fix, and redirect: "Speed is a one-time fix. Content is an every-week job. We handle the every-week part."

Can you guarantee first page rankings?

Red flag check: Anyone who guarantees rankings is either lying or doing something shady (buying links, cloaking, keyword stuffing). Google's algorithm uses 200+ signals and changes thousands of times a year. No one controls it.

What we can guarantee: Your content will be optimized for the right keywords based on real data. Your site will be updated consistently every week. Your competitor gaps will be identified and addressed. Your technical files will be in order. That's the work - and the work is what drives results.

What we've observed: Sites that consistently publish well-targeted, quality content almost always improve their rankings over time. It's not magic - it's math. More pages targeting specific searches = more chances to rank = more traffic. We just make that happen automatically.

The reframe: Instead of "will I rank #1," the better question is "will my site be doing the right things consistently?" Because the sites that do the right things consistently are the ones that win over time. And consistency is exactly what automation is good at.

Sales tip: this is a trust-building moment, not a weakness. "We don't make guarantees because we're honest about how search works. The companies that guarantee rankings are the ones that'll get your site penalized. What we guarantee is the work gets done, every week, based on real data."

General

Common questions from everyone

How does the setup work?

Why they're asking: They've set up marketing tools before and it took hours. They want to know the time commitment before they say yes.

Three steps, under 5 minutes:

1. Sign up with Google - one click, same Google account they already use
2. Connect their site - pick a GitHub repo or enter WordPress credentials
3. Connect Google Search Console - one click (same Google account, so permissions carry over)

That's it. We generate a growth plan automatically - they'll see it within minutes. First batch of improvements starts the same week.

The sell: "It takes less time to set up Scaup than to read a single Semrush audit."

What does the weekly email look like?

Why this matters: The weekly email is the entire customer experience for most users. They don't log into a dashboard. They read this email and feel good that their site is being taken care of. If the email is bad, the product feels bad.

What it says: Plain English summary. "This week we improved 3 pages, created 1 new page, and your site moved up for 5 keywords." Links to see exactly what changed. Links to revert anything they don't like. Takes 30 seconds to read.

What it doesn't say: No jargon, no scores, no audit results, no "you have 47 issues." No number that can go down while they're paying. Only progress and what we did.

The feeling we're going for: "Oh cool, Scaup did some stuff this week. My site is getting better." Then they close the email and get back to their actual work.

What does it cost?

The framing: Don't lead with the price. Lead with what they get and what the alternatives cost. A freelance SEO is $500-2000/month. Semrush is $130/month and you still do all the work. A content writer is $100-500 per article. We do all of it - strategy, execution, monitoring, content creation - for one flat price.

Structure: One plan, one price, everything included. Content optimization, new page creation, competitor analysis, technical files, weekly execution, backlink suggestions. No tiers, no per-page charges, no hidden AI usage fees.

Update this with actual pricing when finalized. The comparison framing ("less than a freelancer, more than a tool, actually does the work") is the strongest angle.

Can I cancel anytime?

Why they're asking: They're calculating risk. If it doesn't work, can they walk away? Yes.

Cancel anytime, keep everything. When you cancel, we stop running weekly executions. That's all that changes. Every improvement we've already made stays on your site - they're your pages, in your repo or CMS. We don't remove content, we don't revert changes, we don't lock anything down.

The implication: Even if you only use Scaup for 2 months, you keep all the optimized content, new pages, and improved titles permanently. The work compounds even after you cancel.

Sales tip: this is a strong close. "Even if you cancel after a month, you keep everything we built. There's no downside."

Do you support multiple sites?

Who asks this: Freelancers managing client sites, founders with multiple projects, or agencies exploring the tool.

Yes. Each site gets its own growth plan, its own competitor analysis, its own weekly execution cycle. They're completely independent - optimizing one site doesn't affect another.

The pitch for freelancers: "Connect your client sites. Scaup does the ongoing SEO work. You get the credit, they get the results, and you're not spending 5 hours per site per month on content updates."

What about AI search (ChatGPT, Perplexity, Google AI Overviews)?

Why this question is getting more common: People are noticing that when they ask ChatGPT or Perplexity a question, it cites specific websites. They want to be one of those cited websites. This is a real and growing distribution channel.

How AI search actually works: AI tools don't have a separate index. They primarily rely on Google's ranking signals plus their own crawling. The same things that make you rank on Google (relevant content, clear structure, authoritative information) make AI tools cite you. So everything we do for traditional search also helps with AI visibility.

What we do specifically for AI: We create and maintain an llms.txt file - a machine-readable summary of your site that helps AI crawlers understand what you do and what pages to cite. Think of it as a robots.txt but for AI tools.
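For reference, a hypothetical llms.txt, loosely following the community llms.txt convention (site name as an H1, a one-line summary as a blockquote, then links AI crawlers should prioritize). Every name and URL below is invented for illustration:

```markdown
# Acme Invoicing

> Invoicing software for freelancers and 2-person teams.

## Pages

- [Pricing](https://example.com/pricing): Plans and what's included
- [Guides](https://example.com/guides): How-to articles on freelance invoicing
- [About](https://example.com/about): Who we are and who we serve
```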

What's coming: We're building direct tracking that shows which AI tools cite your competitors but not you, and surfaces the specific content gaps. This lets us create pages that directly target AI citation opportunities.

The bottom line: Google still drives 90%+ of search traffic. AI search is additive - it's a bonus channel, not a replacement. We optimize for both, and the work is mostly the same.