If you have followed along with this guide so far, your website is optimized both onsite and off. The effort you’ve put in so far alone should yield substantial results, if not now then steadily in the future.
Keep in mind that altering any aspect of your SEO campaigns may not produce immediate results. Changes may not reveal themselves tomorrow, or even in three or five days.
The fact is, Google’s algorithm works when it wants to and you’ll have to be patient. Google will crawl your newly improved web profile in due time.
In the meantime, you can go to Google Search Console and determine if there are any crawl errors as you wait for higher rankings.
All of the audit steps explained so far have delved into the non-technical aspects of search engine optimization. In most cases, anyone can do them, regardless of technical ability.
To be thorough, it becomes important to crank open the hood and take a look at the engine. We’re referring to those aspects of your website that keep it online and consistently live for your customers to visit.
These include other back-end site details, coding issues, server malfunctions, and all those other behind-the-scenes factors that could hold your rankings back or cause them to drop entirely.
Every effort will be made to ensure that these steps are easy to follow for anyone, even if you’re not a coder or otherwise not technically proficient. You’ll also learn about a few tools that will make analyzing the more technical aspects of your web presence a simple and easy affair.
Oh, boy! An entire book could be written about the SEO technique collectively known as “backlinking” and how much turmoil the practice has caused in the world of SEO. Backlinking used to be the go-to way to gain more prominent rankings in the SERPs. That is, until black-hat scammers began to game the system and thus ruined backlinking for the rest of us.
Instead of gaining links slowly and organically over time, as Google intended, black-hatters bought links by the thousands, for pennies on the dollar, and thus raked in the immense monetary rewards.
So many links were purchased and so many rankings gained fraudulently that Google’s SERPs became a cesspool of garbage. Top rankings were mostly scams, and Google was helpless to stop it. Until the day they released Penguin, the moniker given to the search engine’s most infamous algorithm update.
The update made manipulative backlinks largely obsolete overnight. The day after it rolled out, many webmasters who relied on backlinks found their rankings in the toilet, or found their sites purged from the search engine rankings entirely (also referred to as being “Sandboxed”).
By now you know my infamous history with backlinking and how the practice nearly caused me to fail at the very business I love.
You are also aware that I now treat failure as a lesson and, therefore, in my business, we don’t do any type of backlinking.
Some SEO professionals may decide to engage in processes like guest posting, which are currently seen as white-hat techniques. But I don’t recommend backlinking of any kind. I’ve been burned too many times and I’d hate to see the same thing happen to you.
We see too many impressive results from other, non-risky SEO methods to take the chance. Backlinks are too often faked, and Google is all too wary of them, so it’s best to disavow the practice of backlinking altogether. That’s just my opinion and, for my clients and business, the advice works well.
You can determine which sites are linking back to yours by accessing your dashboards at both Google Search Console and Moz Link Explorer. I recommend using both because each one misses some of your site’s backlinks.
On Google Search Console, click on “Links” and you can view internal links, external links, and all other types of links that may affect your search rankings.
Note: It’s recommended that you conduct a backlink examination every few weeks. That way, you can systematically prune any backlinks that may be hurting your website’s online profile and reputation. You can also catch any negative SEO that someone may have maliciously aimed at your site to deliberately hurt your rankings (by building thousands of spammy backlinks that point to you).
If you do find suspicious or downright dirty backlinks from nefarious sources, you’ll want to disavow those links promptly to avoid any de-rankings or penalties.
Google offers a free tool that will allow you to disavow any backlinks that may be doing your rankings harm.
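For reference, the disavow file that Google’s tool accepts is a plain text file. Lines beginning with # are comments, lines beginning with domain: disavow every link from that domain, and bare URLs disavow individual pages. The domains and URLs below are placeholders, not real offenders:

```text
# Spammy link networks discovered in the last audit
domain:cheap-links-example.com
domain:spam-directory-example.net
# A single bad page rather than a whole domain
http://blog-example.org/paid-links-page.html
```

Prefer domain: lines when an entire site is spamming you; listing individual URLs misses any new pages that site creates later.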
You can also use a tool like Link Detox (DTOX). The service isn’t cheap, but it may be worth the cost if yours is a big organization with hundreds or thousands of offending backlinks.
On the other hand, if your site is infected with tons of backlinks that aren’t doing you any good, it may be time to secure a different domain and start anew, unfortunately.
If you don’t own the server that powers your website, you might find that the server’s location is affecting your rankings. Ideally, the server will be located in close proximity to your customer base. Customers in Seattle, Washington may experience slower load times visiting your website if your server is in New York City, for example. This matters most if your business relies on traffic and customers close to its physical location.
To test for server location, use a tool like www.site24x7.com. After entering your domain name, you’ll be able to see the website’s city, state, and other critical details, such as an additional analysis of your site’s load time.
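If you’d rather spot-check load times yourself, here’s a minimal Python sketch that times how long a URL takes to start responding. The example.com URL is a placeholder:

```python
import time
import urllib.request

def measure_load_time(url: str, timeout: float = 10.0) -> float:
    """Return the seconds taken to fetch the first bytes of a URL."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read(1024)  # read only the first chunk
    return time.perf_counter() - start

# Example (requires network access):
# print(f"Load time: {measure_load_time('https://www.example.com/'):.2f}s")
```

Run it a few times from locations near your customers; a consistently slow first byte is a hint that the server is too far away or underpowered.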
If you decide that your server is too far away for your liking, contact your host provider and request a closer one, or change hosts entirely. You can also secure your own server so that you control everything, including uptime. Keep in mind, though, that you’ll also be responsible for maintenance and technical support. Weigh those factors when deciding how and where to keep the server that enables your site to go live.
If you are experiencing downtime, “Internal Server Error” messages, or database connection failures, it may be wise to switch hosting providers altogether. If the website utilizes shared hosting, consider VPS hosting instead, as the site may be sharing server space with black-hat spammers, and that won’t bode well for your site’s credibility or prominence in the search rankings.
All pages should employ the HTTPS protocol to ensure security compliance. This requires an SSL (Secure Sockets Layer) certificate, which you can obtain from your web host provider. Employing SSL tells visitors to your site that you are serious about security. Otherwise, your website will be flagged as not secure right in the address bar of your visitors’ browsers.
What you want is the padlock in the address bar, which indicates that your website is secure. That can only come with SSL certification. Incidentally, your level of website security matters for SEO. Google only wants to deliver web listings where its users’ information will be safe. Make that happen with SSL.
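As a quick sanity check, this Python sketch verifies that a URL uses the HTTPS scheme and, optionally, fetches a site’s certificate expiry date. The hostname and URLs are placeholders:

```python
import ssl
import socket
from urllib.parse import urlparse

def uses_https(url: str) -> bool:
    """Return True if the URL uses the HTTPS scheme."""
    return urlparse(url).scheme == "https"

def cert_expiry(hostname: str, port: int = 443) -> str:
    """Fetch a site's SSL certificate and return its expiry date string.
    Requires network access."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()["notAfter"]

# Example (requires network access):
# print(cert_expiry("example.com"))
```

Checking the expiry date periodically helps you renew the certificate before browsers start showing warnings.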
There are two sitemap versions. The HTML kind, which visitors might see as a web page on your site, tells prospective customers where each page of your site is located and acts as a nice “map” for navigating your website. An HTML sitemap is good for SEO because it can help make your site more user-friendly.
If you are using WordPress, you can create a simple HTML sitemap using the plugin aptly named WP SEO HTML Sitemap (https://wordpress.org/plugins/wp-seo-html-sitemap/).
If you’re not using WordPress, creating a page with links to all of your pages and then including a link to that page with hypertext “Sitemap” is a simple enough fix. You should be able to do that easily enough in most website builders.
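If you’d rather generate that page programmatically, here’s a small Python sketch that renders a list of pages as a simple HTML sitemap. The titles and URLs are placeholders:

```python
from html import escape

def build_html_sitemap(pages: dict) -> str:
    """Render a simple HTML sitemap fragment from {title: url} pairs."""
    items = "\n".join(
        f'  <li><a href="{escape(url, quote=True)}">{escape(title)}</a></li>'
        for title, url in pages.items()
    )
    return f"<h1>Sitemap</h1>\n<ul>\n{items}\n</ul>"

sitemap_html = build_html_sitemap({
    "Home": "https://www.example.com/",
    "About Us": "https://www.example.com/about/",
    "Contact": "https://www.example.com/contact/",
})
```

Paste the resulting fragment into a “Sitemap” page and link to it from your footer so both visitors and crawlers can find it.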
Then, there is the other type of sitemap, known as XML. If you are using the Yoast SEO plugin for WordPress, this functionality is built in.
If not, there is a handy platform that will do all the legwork for you. It can be found at XML-Sitemaps.com (https://www.xml-sitemaps.com/).
Once the map is generated by XML-Sitemaps, the resulting page looks like a bunch of code. That’s what it’s supposed to look like. Now, venture over to Google Search Console and click the Sitemaps tab on the left view pane.
From there, you simply input the XML sitemap URL that you were provided with by XML-Sitemaps, hit submit, and you’re golden. You’ll get a necessary boost from having both an HTML and XML sitemap. Well done!
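For the curious, an XML sitemap is simple enough to generate yourself. This Python sketch builds a minimal sitemap that follows the sitemaps.org protocol; the URLs are placeholders:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_xml_sitemap(urls: list) -> str:
    """Build a minimal XML sitemap conforming to the sitemaps.org protocol."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    body = ET.tostring(urlset, encoding="unicode")
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + body

sitemap_xml = build_xml_sitemap([
    "https://www.example.com/",
    "https://www.example.com/about/",
])
```

Save the output as sitemap.xml at your site’s root, then submit its URL in Search Console exactly as described above.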
The code within the robots.txt file tells the search engine bots how you prefer to have your site crawled. Improperly configuring robots.txt can make or break your search performance. If robots.txt is set to “Disallow: /”, you are essentially telling Google that you don’t want the search engine to index your site at all. The slash (/) represents the root directory, so you’re disallowing your entire website! If you don’t see your website or any of its individual pages listed in the SERPs, this could be why.
If you don’t know how to check for robots.txt errors or you have no idea what that means, use a tool like Sitechecker Pro (https://sitechecker.pro/), which can check for these types of errors, as well as give you suggestions on how to fix them.
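You can also test robots.txt rules locally with Python’s standard library. This sketch shows how a single Disallow line blocks one directory while leaving the rest of the site crawlable; the rules and URLs are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks one directory but allows the rest.
# "Disallow: /" would have blocked the entire site; here only /private/ is off-limits.
robots_lines = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(robots_lines)

blocked = parser.can_fetch("*", "https://www.example.com/private/page.html")
allowed = parser.can_fetch("*", "https://www.example.com/blog/post.html")
```

Feeding your live robots.txt into the same parser lets you confirm a rule change before crawlers ever see it.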
URL canonicalization is the process of telling Google which version of a webpage (usually the homepage) is the correct one.
This comes in handy when the same page can be reached through several different addresses. For instance, someone might type any of the following into their browser:
http://example.com
http://www.example.com
https://example.com
https://www.example.com
While the URLs above may seem to lead to the same location, Google sees each one as a separate page. Your job is to tell Google which one is the best representative of the set.
You can ensure Google selects the right URL by using the proper URL form throughout your website.
If you notice errors with a particular page of your website, go into the code and find the <head> tag. Underneath it, type <link rel="canonical" href="https://www.example.com/preferred-page/" />, where the href points to the version of the page you want Google to treat as the master copy.
Using a canonical tag (rel="canonical") on your homepage tells Google that a particular version of the URL is the master. Even if users type your URL incorrectly or link to it incorrectly, Google will know where to send visitors.
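If you want to verify which canonical URL a page actually declares, this Python sketch extracts it using only the standard library. The sample page and URL are placeholders:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

page = """
<html><head>
  <link rel="canonical" href="https://www.example.com/preferred-page/" />
</head><body></body></html>
"""

finder = CanonicalFinder()
finder.feed(page)
```

Run it against the HTML of a few key pages to confirm each one points at the URL version you intended.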
If you get stuck, SEOMator offers a free Canonicalization tool, which should help to take care of any canonicalization problems you may be experiencing.
Each page of the website should possess a URL that is unique and user-friendly. Even categories and sub-categories should be easily discernible. Think of your visitors and how they’d like to read the URL. Under-optimized and over-optimized URL structures can both be problematic. It’s best to go for simplicity, with important words separated by hyphens, such as /your-category/sub-category/this-important-page/.
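As an illustration of the hyphen-separated style, this small Python sketch turns a page title into a clean URL slug:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a simple, hyphen-separated URL slug."""
    slug = title.lower()
    # Collapse every run of non-alphanumeric characters into a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")
```

For example, slugify("This Important Page!") yields "this-important-page", which is short, readable, and free of the keyword stuffing described below.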
Analyze your site using Screaming Frog and look down the list of your URLs. Determine if any could be simplified or made to be more descriptive for SEO purposes.
Over-optimized URLs, such as www.awesomewebsitenews.com/awesome-website-news/website-news-awesomeness/ imply that something shady is occurring. If Google notices something like this, it could spell doom for that page’s rankings.
NOTE: If the site is performing well and you notice URLs that could be optimized a bit better, it might be best to just leave them alone, as you’ll have to set up a 301 redirect from each old URL to its new counterpart. Why go through the trouble if nothing’s wrong?
Your site should now be performing well in the SERPs, as you’ve completed a general audit. While this particular audit may have seemed to take a while, the more experienced you get the faster the audits will go.
From here, it’s time to do an oil check on your site, just to see if everything is performing well. For this, we’ll turn to Google Analytics, which can provide you with insight into who’s visiting your site, for how long, and for what reason.
Knowing how to utilize Google Analytics effectively is the key to maximizing your SEO. After all, you can only fix what’s wrong after you’ve identified one or more errors, and GA is just the tool to keep you informed.