Getting Started with Internals
Internals is one of our favorite toolsets. As SEOs, we’ve spent hundreds of hours internally linking pages, and we know that a lot gets missed in this process.
Internals helps you find those missed internal link opportunities in existing pages and in new content as you’re drafting it, with Internals - Auto and Internals - Manual.
Internals - Auto
Internals - Auto is our toolset for finding internal link opportunities within your currently published webpages.
Note: You need to have a working sitemap in order to use Internals. We pull your sitemaps from standard locations and from your robots.txt file. If for some reason you don’t have a robots.txt file and your sitemaps are in nonstandard locations, please contact Nate Matherson at [email protected].
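For reference, the standard location for a sitemap is /sitemap.xml at your domain root, and robots.txt can point to sitemaps anywhere using the Sitemap directive. A minimal example, where the URLs are placeholders for your own site:

```
# https://www.example.com/robots.txt
User-agent: *
Allow: /

# Tells crawlers, including ours, where to find your sitemap
Sitemap: https://www.example.com/sitemap.xml
```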
To get started, we recommend integrating with Google Search Console via Positional’s Integrations tab:
Click on Import to get started. Next, you’ll be prompted to sign in with your Google Search Console account, and you’ll be able to choose the domain(s) you’d like to import. During the import process, you’ll also be able to choose the country that your website is primarily serving.
After importing your domain via Google Search Console, head over to the Internals - Auto tab in Positional.
In the Domains dropdown, you’ll see your imported domain with the Google Search Console logo next to it:
You can add domains without using the Google Search Console integration, but this will mean that Internals is unable to access Google Search Console query data — so you’ll generally get fewer internal linking suggestions.
After import, it typically takes 30 to 60 minutes for Internals to complete the crawl. While this is happening, you’ll see a “Crawling in progress…” status below Recrawl Domain:
While the crawl is running, you do not need to hit Recrawl Domain again. After the crawl has finished, the table will populate with internal link suggestions:
Note: If nothing appears in the table, there could be an issue with your sitemap, or our crawler could be getting blocked. While this is not common, please reach out to Nate Matherson at [email protected] for additional assistance and troubleshooting. If necessary, we can provide our IPs for you to whitelist, which would allow our crawler to work properly. It’s also worth mentioning that, by default, we crawl up to 10,000 URLs from your sitemap. If you need to crawl more than 10,000 URLs, please reach out to us.
We have two different approaches: the Keyword Approach and the NLP Approach.
We generally recommend using the Keyword Approach first. The Keyword Approach uses the keywords that you’re currently ranking for to make matches. It also uses Google Search Console query data alongside keyword data from third-party APIs.
In the table, the Source pages are the pages to link from, and the Target pages are the pages to link to. The Keyword is a keyword that the Target page ranks for and that appears within the Source page, even though the Source page does not currently link to the Target page anywhere.
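To make that rule concrete, here is a simplified sketch of the matching logic described above. It is an illustration only, not Positional’s actual implementation, and all of the data structures in it are hypothetical:

```python
def find_keyword_opportunities(pages, ranking_keywords, existing_links):
    """Illustrative sketch of the Keyword Approach matching rule.

    pages: dict mapping URL -> page body text
    ranking_keywords: dict mapping URL -> set of keywords that page ranks for
    existing_links: dict mapping URL -> set of URLs the page already links to
    """
    suggestions = []
    for source_url, source_text in pages.items():
        for target_url, keywords in ranking_keywords.items():
            if source_url == target_url:
                continue
            # Skip targets the source page already links to anywhere
            if target_url in existing_links.get(source_url, set()):
                continue
            for keyword in keywords:
                # Suggest a link when a keyword the target ranks for
                # appears in the source page's text
                if keyword.lower() in source_text.lower():
                    suggestions.append((source_url, target_url, keyword))
    return suggestions
```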
The NLP Approach works differently: it uses your headers and URLs to predict which keywords your pages would rank for, and then makes matches based on those predictions.
For large and established websites, the Keyword Approach will generally be the most helpful. But if you’ve recently published a large number of new pages that aren’t ranking for many keywords just yet, the NLP Approach may surface some of those opportunities sooner. The NLP Approach is often more helpful for new and small websites, too.
You can add filters to remove noise and find more targeted opportunities. For example, you could add a /blog filter on the Source Page list and on the Target Page list, to see only internal linking opportunities between blog pages (assuming that your blog is located on the /blog path):
You could, for example, add filters to and from specific URLs, and filter specific keyword phrases in or out, too. You can add up to three filters at a time. If you want to do more advanced filtering, we suggest exporting the data to a spreadsheet using the export feature:
By default, you can export up to 40,000 suggestions at once. However, most websites won’t have that many suggestions; a more typical number is a few hundred to a few thousand, depending on the size of the website. If you need more than 40,000 suggestions exported, please reach out to us.
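If you do export, a short script can apply more filters than the three allowed in the app. Below is a minimal sketch assuming the export is saved as a CSV with Source Page, Target Page, and Keyword columns; the file name and exact column headers are assumptions and may differ from your export:

```python
import pandas as pd

# Hypothetical file name; use whatever name your export was saved under
df = pd.read_csv("internals_export.csv")

# Keep only blog-to-blog opportunities and drop a specific keyword phrase
filtered = df[
    df["Source Page"].str.contains("/blog/")
    & df["Target Page"].str.contains("/blog/")
    & ~df["Keyword"].str.contains("free trial", case=False)
]

filtered.to_csv("internals_filtered.csv", index=False)
```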
You can recrawl your domain as often as you like. You can also add multiple domains; the number allowed depends on your account tier. To remove a domain, click the X button next to the domain in the Domains dropdown.
Internals - Manual
Use Internals - Manual to get internal linking suggestions for new content as you’re drafting it.
Simply copy and paste the content into the text editor and click on Check For Missing Internal Links:
After the scan finishes, you’ll be presented with suggestions for internal links to pages on your website, generated using the Keyword Approach:
Under Opportunities, you can review the internal link suggestions. Internals also provides the target page slug and suggests anchor text for the link. If there are multiple opportunities for links to the same target, the toolset provides anchor text for each suggestion.
Clicking the button under Opportunities adds the link to the text in the editor and highlights it in green:
If you then copy the text from the editor, the internal links that have been added will be included in the text.
Internals FAQs:
Can I import multiple domains at once?
Yes. The number of domains that you can import at once depends on the Positional account tier you’ve selected: Growth tier users can import up to three domains, and Scale tier users can import up to ten.
Nothing is loading in the table. Why is that?
It could be an issue with your sitemap, or our crawler might be getting blocked. Please reach out to Nate Matherson at [email protected] for help with troubleshooting.
Is there a limit on the number of URLs that can be crawled?
Yes, the default limit is 10,000 URLs. Please reach out to us if you need to crawl more than that.
I’ve recently added some of the internal links that were suggested in the Internals - Auto table. If I recrawl the domain, will those suggestions be removed from the table?
Yes.
If I recrawl the domain for one user in the account, will that update the data for all users within the same account?
Yes.