Professional SEO Tips & Tricks

The search engines must be able to find, crawl and index your website properly. In this lesson, we've assembled a list of technical SEO tips you need to be aware of, including how to avoid mistakes that could sink your SEO ship.

SEO Tips - Checking Your Instrument Panel

Before we cast off and start talking technical, let's make sure your instruments are working. To do SEO well, you must have analytics installed on your website. Analytics data is the driving force of online marketing, helping you better understand how users are interacting with your site.

SEO Tips - We recommend you install this free software: Google Analytics and possibly Bing Analytics (or a third-party tool). Set up goals in your analytics account to track activities that count as conversions on your site. Your analytics instrument panels will show you: which pages are visited most; what types of people come to the site; where visitors come from; traffic patterns over time; and much more. Getting analytics set up is one of the most important technical SEO tips, since it will help you keep your search engine optimization on course.
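The exact tracking snippet comes from your own Google Analytics account; the sketch below only shows the general form of the current gtag.js tag, with a placeholder measurement ID (G-XXXXXXXXXX), pasted just inside the <head> of every page you want to track:

    <!-- Google tag (gtag.js) - G-XXXXXXXXXX is a placeholder measurement ID -->
    <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
    <script>
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', 'G-XXXXXXXXXX');
    </script>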

SEO Tips - Casting Off ... Technical Issues to Watch Out for

1. Avoid Cloaking

Keep your site free from cloaking (i.e., showing one version of a page to users, but a different version to search engines). Search engines want to see exactly what users see, and they are very suspicious of sites that serve them something different. Any hidden text, hidden links or cloaking should be avoided; these types of deceptive web practices frequently result in penalties.

2. Use Redirects Properly

When you need to move a web page to a different URL, make sure you’re using the right type of redirect, and that you’re also redirecting users to the most appropriate page. As a professional SEO tip, we recommend always using a 301 (permanent) redirect. A 301 tells the search engine to drop the old page from its index and replace it with the new URL. Search engines transfer most of the link equity from the old page to the new one, so you won’t suffer a loss in rankings.
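For example, on an Apache server a single .htaccess line (the paths and domain here are placeholders, not your real URLs) can handle the move:

    # .htaccess (Apache) - permanently redirect the old URL to its replacement
    Redirect 301 /old-page.html https://www.example.com/new-page.html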

Mistakes are common with redirects. A webmaster, for example, might delete a web page but neglect to set up a redirect; this causes users to get a "Page Not Found" 404 error. Furthermore, sneaky redirects in any form, whether they are user agent/IP-based or redirects through JavaScript or meta refreshes, frequently cause ranking penalties. In addition, we recommend avoiding 302 redirects altogether. A 302 (temporary) redirect signals that the move will be short-lived, and therefore search engines do not transfer link equity to the new page. Both the lack of link equity and the potential filtering of the duplicated page can hurt your rankings.

3. Prevent Duplicate Content

You need to fix and prevent duplicate content issues within your site. Search engines get confused about which version of a page to index and rank if the same content appears on multiple pages. Ideally, you should only have one URL for one piece of content. When you have duplicated pages, search engines pick the version they think is best and filter out all the rest. You lose out on having more of your content ranked, and also risk having "thin or duplicated" content, something Google's Panda algorithm penalizes.

If your duplicate content is internal, such as multiple URLs leading to the same content, then you can decide for the search engines by deleting and 301-redirecting the duplicate page to the original page. Alternatively, you can use a canonical link element (commonly referred to as a canonical tag) to communicate which is the primary URL. Either solution should be used with care.
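If you go the canonical route, each duplicate page carries a link element in its <head> pointing at the preferred version. The URL below is just a placeholder:

    <link rel="canonical" href="https://www.example.com/preferred-page/">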

4. Create a Custom 404 Error Page

When someone clicks a bad link or types in a wrong address on your website, what experience do they have? SEO Tips - Let's find out: Try going to a nonexistent page on your site by typing http://www.[yourdomain].com/bogus into the address bar of your browser. What do you get? If you see an ugly, standard "Page Not Found" HTML Error 404 message (such as the one shown below), then this technical SEO tip is for you!

SEO Tips - The default HTML 404 error is a bad user experience.

Since your 404 page may be accessed from anywhere on your website, be sure to make all links fully qualified (starting with http).
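For example (example.com is a placeholder domain), a fully qualified link keeps working no matter which broken URL triggered the error page, whereas a relative link resolves against that broken URL and may point to nothing:

    <!-- Fully qualified: safe on a 404 page served at any path -->
    <a href="https://www.example.com/contact/">Contact us</a>
    <img src="https://www.example.com/images/logo.png" alt="Logo">

    <!-- Relative: resolves against whatever bad address the visitor typed -->
    <img src="images/logo.png" alt="Logo">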

Next, tell your server. Once you’ve created a helpful, customized error page, the next step is to set up this pretty new page to work as your 404 error message. The setup instructions differ depending on what type of website server you use. For Apache servers, you modify the .htaccess file to specify the page’s location. If your site runs on a Microsoft IIS server, you set up your custom 404 page using the Internet Information Services (IIS) Manager. WordPress sites have yet another procedure.
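On an Apache server, for instance, the .htaccess entry might look like the sketch below; the file name /404.html is a placeholder for wherever your custom error page actually lives:

    # .htaccess (Apache) - serve the custom page for "Not Found" errors
    ErrorDocument 404 /404.html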

We should note that some smaller website hosts do not permit custom error 404 pages. But if yours does, it’s worth the effort to create a page you’ve carefully worded and designed to serve your site visitors’ needs. You’ll minimize the number of misdirected travelers who go overboard, and help them remain happily on your site.

5. Watch Out for Plagiarism (There are pirates in these waters ...)

Face it; there are unscrupulous people out there who don't think twice about stealing and republishing your valuable content as their own. These villains can create many duplicates of your web pages that search engines have to sort through. Search engines can usually tell which version in their index is the original. But if your site is scraped by a prominent site, it could cause your page to be filtered out of search engine results pages (SERPs).

We suggest two methods to detect plagiarism (content theft):

Exact-match search: Copy a long text snippet from your page and search for it within quotation marks in Google. The results will reveal all web pages indexed with that exact text.

Copyscape: This free plagiarism detection service can help you identify instances of content theft. Just paste the URL of your original content, and Copyscape will take care of the rest.

Try to remedy the plagiarism issue before it results in having your pages mistakenly filtered out of SERPs as duplicate content. Ask the site owner to remove your stolen content from their website. You could also consider revising your content so that it’s no longer duplicated. (SEO Tips: If you can't locate contact information on a website, look up the domain on Whois.net to find out the registrant's name and contact info.)

6. Protect Site Performance

How long does it take your website to display a page? Your website’s server speed and page loading time (collectively called "site performance") affect the user experience and impact SEO, as well. It's a site accessibility issue for users and spiders. The longer the web server response time, the longer it takes for your web pages to load. Slow page-loading times can reduce conversion rates (because your site visitors get bored and leave), slow down search engine spiders so less of your site gets indexed, and hurt your rankings.

You need a fast, high-performance server that allows search engine spiders to crawl more pages per visit and that satisfies your human visitors, as well. Web design issues can also sink your site performance, so if page-loading speed is a problem, talk to your webmaster. (SEO Tips: Use Google's free tool PageSpeed Insights to analyze a site's performance.)

7. Use robots.txt Appropriately

What's the first thing a search engine looks for upon arriving at your site? It's robots.txt, a text file kept in the root directory of a website that instructs spiders which directories can and cannot be crawled. With simple "disallow" commands, a robots.txt file is where you can block crawling of:

Private directories you don't want the public to find
Temporary or auto-generated pages (such as search results pages)
Advertisements you may host (such as AdSense ads)
Under-construction sections of your site

Every site should put a robots.txt file in its root directory, even if it's blank, since that's the first thing on the spiders' checklist. But handle your robots.txt with great care, like a small rudder capable of steering a huge ship. A single disallow command applied to the root directory can stop all crawling, which is very useful, for instance, for a staging site or a brand new version of your site that isn't ready for prime time yet. However, we've seen entire websites inadvertently sink without a trace in the SERPs simply because the webmaster forgot to remove that disallow command when the site went live.
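As a sketch, a robots.txt file might look like the example below. The directory names are placeholders, not recommendations for your particular site:

    # Hypothetical robots.txt - directory names are placeholders
    User-agent: *
    Disallow: /private/
    Disallow: /search-results/
    Disallow: /ads/
    Disallow: /under-construction/

    # Blocking the ENTIRE site (staging only - remove before launch!):
    # User-agent: *
    # Disallow: /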

(SEO Tips: Some content management systems (e.g., WordPress) come with a prefabricated robots.txt file. Make sure that you update it to meet your site's needs.)

8. Be on the Lookout for Hacked Content & User-Generated Spam

SEO Tips - Websites can attract hacked content like a ship's hull attracts barnacles — and the bigger the site, the more it may attract.

Hacked content is any content that's placed on your website without your permission. Hackers work through vulnerabilities in your site's security to try to place their own content on your URLs. The injected content may or may not be malicious, but you don't want it regardless. Some of the worst cases happen when a hacker gains access to a server and redirects URLs to a spammy site. Other cases involve bogus pages being added to a site's blog, or hidden text being inserted on a page.

SEO Tips - Google recommends that webmasters look out for hacked content and remove it ASAP.

9. Use Structured Data

Structured data markup can be a web marketer's best mate. It works like this. You mark up your website content with additional bits of HTML code, and the search engines read these notes to learn what's what on your site. The markup code gives search engines the type of context only a human would normally understand. The biggest SEO benefit is that search results may display more relevant information from your site (those extra "rich snippets" of information that sometimes appear below the title and description), which increases your click-through rates. Structured data markup is available for many categories of content (based on Schema.org standards), so don't miss this opportunity to improve your site’s visibility in search results by helping your SERP listings stand out.
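For example, a page could describe itself to search engines with a JSON-LD block like the sketch below, based on Schema.org's Article type; every value shown is a hypothetical placeholder:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Professional SEO Tips & Tricks",
      "author": { "@type": "Organization", "name": "Example SEO Co." },
      "datePublished": "2024-01-15"
    }
    </script>

You can then run the marked-up page through Google's Rich Results Test to confirm the search engines can read it.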