Digital Trends

Tips For A Google-Friendly Website: Google Standards For Websites

Ariane Alyerunga

May 25, 2020

The Google webmaster guidelines provide general design, quality, and technical recommendations to help webmasters develop websites that are attractive and user-friendly. Below are some tips for building a Google-friendly website.



Give visitors the information they seek

You need to provide top-quality content for your web pages, particularly your homepage. This is one of the most important things you can do to ensure your website's success. If your pages contain useful information, their content will draw many visitors and convince other website administrators to link to your site.

To create a useful, information-rich website, develop pages that clearly and concisely describe your topic. Think of the words your users would type into a search engine to find your pages, and incorporate those words into your site.


Ensure that other websites link to yours

Website links help crawlers discover your site and give it more visibility in Google search results. When compiling the results for a search, Google uses complex text-matching mechanisms to find pages that are both important and relevant to the query.

Google perceives a link from page Y to page Z as a vote by page Y for page Z. Votes cast by pages that are themselves popular carry more weight and help make other pages important. Remember, though, that Google's algorithms can tell the difference between natural and unnatural links.

Natural links to your website occur as part of the dynamic nature of the internet: other websites discover your content, believe it's valuable, and conclude that it would help their visitors.

Unnatural links, on the other hand, are created specifically to make your website appear popular to search engines. Some of these links, like the ones from link schemes and doorway pages, are discussed in Google's webmaster guidelines. Only natural links will help index and rank your website.


Your site should be accessible

You should develop your website with a logical link structure. Use a text browser like Lynx to evaluate your site; most web crawlers see your site much as Lynx does. If features such as JavaScript, session IDs, cookies, dynamic HTML, or Flash prevent you from viewing the entire site in a text browser, then crawlers may find it hard to process.


Mistakes you should avoid

Keyword stuffing

Do not stuff your pages with keywords, attempt to cloak pages, or publish crawler-only pages. If your site has pages, text, or links that you hide from visitors, Google may consider it untrustworthy and could ignore your website completely.


Image-based text

Do not use images to display vital information such as names, links, or content; crawlers cannot interpret text embedded in an image.

You may use alt attributes if the primary content and keywords for your page can't be displayed in conventional HTML. And don't create multiple versions of a single page under different addresses.
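As a minimal sketch, an image that carries important text should repeat that text in its alt attribute so crawlers can read it (the file name and wording below are placeholder examples):

```html
<!-- Hypothetical example: the crawler can't read text baked into the
     image file, but it can read the alt attribute. -->
<img src="company-logo.png" alt="Acme Web Design - custom websites in Kampala">
```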


Duplicate content

Many websites offer text-only or printer-ready versions of pages that contain similar content to matching graphic-rich pages. If your website has copied content that can be accessed via different addresses, there are several ways to indicate the preferred version of the page. This will help you avoid penalties for having duplicate content.
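One common way to indicate the preferred version is a canonical link element placed in the head of each duplicate page, pointing at the version you want indexed (the URL below is a placeholder):

```html
<!-- Placed in the <head> of the printer-ready or text-only duplicate -->
<link rel="canonical" href="https://www.example.com/article">
```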


Link schemes

Any links purposefully created to influence PageRank or a website's position in Google's search results may be considered part of a link scheme and therefore a breach of Google's webmaster guidelines. This also includes any actions that manipulate the incoming or outgoing links to your site.

The following are examples of link schemes that may negatively affect your website's ranking in search results:

  • Buying or selling links that pass PageRank. This includes exchanging money for backlinks or for content that contains backlinks, and sending people free products in exchange for reviews and links.
  • Excessive link exchanges ("link to my site and I'll link to yours") or partner pages created specifically for cross-linking.
  • Commercial article writing or guest writing campaigns with keyword-stuffed anchor text links.
  • Using automated services or programs to create links to your website.
  • Requiring backlinks as part of a Terms of Service or contractual agreement without allowing the signee to choose whether to approve the outbound link.

Furthermore, creating links that weren't editorially placed or vouched for by a website administrator, otherwise known as unnatural links, can also violate the guidelines.

Here are a few examples of unnatural links that may breach Google's guidelines:

  • Text advertisements that pass PageRank.
  • Native advertising where payment is accepted for articles that include links that pass PageRank.
  • Links with over-optimized anchor text in press releases or articles distributed on other websites.
  • Low-quality (spammy) bookmark or directory site links.
  • Widely distributed links in the templates or footers of various websites.

Please note that pay-per-click advertising links that do not pass PageRank to the buyer of the advertisement do not violate Google's guidelines.


How to fix or avoid damage from link schemes


There are several ways to keep PageRank from passing to other websites and to disavow bad backlinks. You could:


Tell Google the nature of a link by adding a qualifying rel attribute to the <a> tag.

  1. rel="sponsored": marks links that are advertisements or paid placements (also known as paid links). Though the nofollow attribute was previously recommended for such links and is still an acceptable way to mark them, "sponsored" is preferred.
  2. rel="ugc": Google recommends marking links in user-generated content, such as forum posts and comments, with "ugc." To recognize trusted contributors, however, you may remove the attribute. Remember to guard against comment spam, though.
  3. rel="nofollow": use the nofollow value when the designations above do not apply and you do not want Google to associate your site with, or crawl, a particular page you've linked to. For internal links, use a robots.txt file instead.

Links marked with the rel attributes above are generally ignored by Google's index. Remember, however, that the linked pages may be discovered by other means, such as sitemaps or links from other websites, so they could still be crawled. The rel attributes belong only in <a> tags, since Google can only follow links marked with the <a> tag. Adding rel="nofollow" to a link may not block PageRank the way it once did, due to changes in Google's algorithm, but it is still essential for links that might dilute the relevance of your subject and for links to untrustworthy pages.
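As a sketch, the three qualifying attributes look like this in markup (the URLs below are placeholders):

```html
<!-- Paid placement or advertisement -->
<a rel="sponsored" href="https://example.com/product">Sponsored product</a>

<!-- Link inside user-generated content, e.g. a blog comment -->
<a rel="ugc" href="https://example.com/commenter-site">Commenter's site</a>

<!-- Catch-all: don't associate my site with this page -->
<a rel="nofollow" href="https://example.com/unvetted">Unvetted page</a>
```

Note that multiple values can be combined in one attribute, for example rel="ugc nofollow".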

As mentioned earlier, paid links and ads must carry a qualifying attribute (such as "nofollow" or "sponsored"). If you leave paid links followed, search engines might think you are trying to gain an unfair ranking advantage and penalize your website. Google's algorithm is good at finding unscrupulous paid links, so using these attributes will help you avoid punishment.

The nofollow attribute will also help you handle links to off-topic pages, whether they're external or internal. You should keep search engines from misinterpreting what your pages are about: linking related pages reinforces the relevance of your topic, so to keep your topics clear, use the "nofollow" attribute when linking unrelated pages to each other.


How to use robots.txt

A robots.txt file determines which parts of your website search engines will crawl. It is a plain text file composed of directives that tell search engines which pages they should or shouldn't add to their index. Keep in mind, though, that wrong instructions here can hurt your search engine rankings, since they may prevent search engines from crawling important pages.

Robots are the applications that crawl through websites, recording the information they discover. In the robots.txt file, they are identified as user agents. You may also hear them referred to as spiders, web crawlers, and bots, but these aren't official search engine crawler names. For the file, you need the official names of the crawlers you wish to target (Google's crawler, for instance, is called "Googlebot").

Using a robots.txt file is fairly straightforward: you tell the bots which paths to "allow" (crawl and index) and which to "disallow" (ignore).

Create a file named robots.txt in your server's root directory, then specify the user agents and which pages should or shouldn't be crawled. Here's an example:

User-agent: *                      # applies to all crawlers
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
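Rules can also target a specific crawler by naming its user agent. A hypothetical sketch that hides one directory from Googlebot while leaving other crawlers unrestricted (the directory name is a placeholder):

```
# Applies only to Google's crawler
User-agent: Googlebot
Disallow: /private/

# All other crawlers: no restrictions
User-agent: *
Disallow:
```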


Use the link disavow tool

Google launched the highly anticipated "disavow links" tool in 2012. After several months of beta testing, it was added to the Google Search Console suite.

Website administrators are advised, however, to use it only as a last resort, after attempts to contact the owners of the sites where the offending links originate have failed. The list of links to be disavowed goes in a plain text file. You can disavow individual links or all the links associated with a particular domain. Each entry appears on its own line, in one of two formats (the domains below are placeholders):

http://spam.example.com/stuff/bad-links.html
domain:spam.example.com

You may also mix both formats, along with comments, in a single file, as shown in the example below:

# Contacted owner of spam.example.com on 8/3/2020 to
# request link removal but didn't get a response.
domain:spam.example.com

# Owner of shady.example.com took down most links, but missed these:
http://shady.example.com/page1.html
http://shady.example.com/page2.html
http://shady.example.com/page3.html

To explain, lines that start with a hash symbol (#) are interpreted as comments, so Google ignores them.

The "domain:" keyword tells Google to disavow links from every page on the specified website, while a full URL disavows links from that specific page (in this example, three specific pages on shady.example.com). Once you have made your file, open the disavow links tool from the Google Search Console dashboard, select your website, go through the precautionary notifications, then submit your disavow file.

It will take a while

The disavowing process is not immediate; it may be a few weeks before the changes take effect. Google also reserves the right to ignore submissions it feels cannot be trusted. Once you submit the file, there will be an option to download it and resubmit it with changes. There's a file size limit of 2 MB. Because of Google's processing delay, a mistake may take weeks to undo while you "reavow" the links you feel are important, so take care.

To modify the links Google should ignore, download the current disavow file, edit it so it lists only the links you want Google to ignore, then re-upload it. You will have to wait for the indexing system to process the new file. The disavow tool is similar to the "nofollow" attribute, which lets websites link to others without passing on ranking credit.

Right now, quality is valued more than quantity. Google penalties have led many website administrators not only to stop link building but to start pruning instead. Low-quality links can kill your search engine rankings; only links that come from high-quality websites and pages related to your own will appear natural and help you avoid penalties. So avoid buying or soliciting links. Earn them the right way or not at all.

The best strategy for getting other websites to create relevant, high-quality links to yours is to develop distinctive, useful content that naturally gains attention on the internet. Writing good content comes with several rewards: links are like editorial votes, given by choice, and the more useful your content is, the higher the chances that someone else will like it and link to it.

Website optimization can be difficult in the beginning, but once you learn to follow Google's policies, your website will rank in no time. Just keep up with announcements of changes to Google's ranking algorithms and you will improve your performance.

Unfortunately, however, certain parts of the site optimization process like technical SEO and troubleshooting may prove to be complicated, and it may prove necessary to seek professional advice. There are several web design companies in Uganda that can help you. Some of these include Sam Web designs, Inovware Inc., and Sadja WebSolutions.