
SEO ToS: Developing a Website for SEO-based promotion

Gone are the days when everybody made a website for their business just because it was the “in” thing to do. Nowadays, a business website is an important tool that helps your organization in a multitude of ways. Thus, it must be developed with website promotion in mind from the ground up.

Introduction

There is no getting around it: you need to bring in a search engine optimization (SEO) consultant as early as your website’s development, design, and layout planning stages.

The old way of creating a website:

Design —> Development and layout coding —> SEO

The approach of today:

Design + SEO —> Layout coding + SEO —> Development + SEO

The proper way to create a website is to follow all three development pipelines at the same time, which will result in an up-to-date, tightly programmed, and SEO-friendly website. Obviously, your website’s development roadmap needs detailed terms of reference and technical tasks.

In this article, we will cover the main things to be aware of in terms of technical SEO optimization when developing and promoting your website online.

We will also cover cases where webmasters miss or skip certain steps, along with the positive and negative outcomes this can lead to.

Content management systems

Content management systems (CMSes) are an important factor to consider when creating a website. There are many different options out there, including custom-made ones. Not every CMS is easy to manage, though, and some of them can be quite limited in terms of features. This might render you unable to add certain things to your website, since they might not be supported. If this happens to you, there are two possible ways out of the situation:

  • Finding a roundabout solution, such as your programmer creating and implementing additional code on top of an existing CMS, which can be expensive and time-consuming.
  • Migrating your website to a different CMS that supports the features you need. This may end up being difficult and expensive as well.

Because changing your CMS after the fact is a difficult process, you should choose a proper CMS for your needs from the get-go and stick with it.

Different CMSes are tailored to different needs. CMSes used for online shops, blogs, or corporate websites will have obvious differences. There are also general CMS solutions that fit most websites regardless of their goals. An extensive choice of plugins will let you tackle many different situations, so it’s important to find the one CMS that fits your specific goals.

Below is an overview of some quality CMSes we recommend:

  1. Joomla. The second most popular CMS. It’s not very easy to handle as it requires some coding skills related to developing or configuring components, modules, and plugins. Lets you use multiple templates within the same website. The backend is neatly organized: you can easily add new menus and articles or restrict content on specific pages. It also supports multiple languages.
  2. Craft. A content-first CMS tailored for ease of use and flexibility. An ideal choice for web designers thanks to its customizability and the full control it gives you over the system, including the HTML output.
  3. Umbraco. A great open-source CMS powered by the .NET framework. A good choice for corporate websites, including those run by large transnational corporations: it is used by companies such as Microsoft, McDonald's, and Mercedes-Benz. One of its notable features is integrated website development, which lets you load content from other platforms. The stack runs on Microsoft technologies with servers powered by Microsoft Windows, which makes the CMS easy to integrate with external systems and equips it with an integrated security solution. Requires no HTML knowledge.

The robots.txt file

The robots.txt file is located in your website’s root folder (site.com/robots.txt) and is used by the search engine bots. The file includes a list of rules that permit bots to access specific locations on your website or forbid them from doing so.

You should hide your website from indexing before you’re ready to launch it, as it’s not a good idea to let search engines “see” your website until it’s in prime condition and ready for the influx of users. Letting users access your website in an unfinished state may also negatively affect your reputation.

Here is what your robots.txt file should look like if you want to restrict any indexing:

User-agent: *
Disallow: /

This will let all search engine bots know that they shouldn’t index any part of the website.

Common mistakes

Sometimes, programmers enable a robots.txt file that contains the Disallow: / rule but no User-agent parameter. This is a huge mistake: without a User-agent line, search engine bots ignore the robots.txt file and index the website anyway. This often means that an unfinished version of the website, or a copy hosted on a different domain, gets indexed instead of the desired version. Always make sure you format your robots.txt file properly.

After your website is ready, update the rules listed in your robots.txt file and prepare the open-for-indexing version while the website is still hosted on the test domain. Only launch your website after you have the proper version of robots.txt set up and ready to go.

Here is what should be included in your robots.txt file:

  • A dedicated set of rules for each search engine in case they differ. If all search engines share the same rules, you can group them under a single User-agent: * section.
  • If you wish to apply different rules to different search engines (e.g., making certain pages indexable only by Google, whereas other pages would be accessible only to Bing), then you have to use the corresponding User-agent names (User-agent: Googlebot, User-agent: Bingbot, and so on).
  • You should restrict access to system folders and settings you don’t want to index.
  • You must set up a correct sitemap link.
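
To illustrate these points, here is a minimal sketch of a production-ready robots.txt (the /admin/ and /search/ paths are placeholders for your own system folders, and the sitemap URL is hypothetical):

User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: https://site.com/sitemap.xml

If Google and Bing need different rules, add separate User-agent: Googlebot and User-agent: Bingbot sections above the general one.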

Template code (code validity)

All page templates must follow HTML rules and employ proper syntax. All your code must remain valid, which means it must comply with the current HTML standards.

Here are the rules for creating page templates:

  • All opening tags must be paired with closing tags.
  • Nesting should follow logical rules: tags must be closed in the reverse order they were opened, without overlapping.
  • If you miss important opening or closing tags, then this might ruin your page layout (e.g., code validation issues, microdata not displaying properly, and so on). Make sure that all the crucial tags such as html, head, or body follow proper syntax and are included in every template.
  • Remove all redundant HTML code blocks if they don’t add anything to the page.
  • Load scripts from standalone JS files.
  • Don’t leave huge chunks of comments in your layout code. Ideally, you should remove all comments by the time HTML code is displayed.
  • Don’t use tags that make the text stand out in a visual or structural manner (strong, b, i, em) in your block design.
  • Don’t use the H1–H6 header tags for your design.
  • Don’t use рopups, clickups, bodyclicks, or popunders.
  • All CSS styles must be taken out of the main page body and contained in their own separate files.
  • Page size: the total size of all the content, CSS style files, graphics, images, and JavaScript files on the page must not exceed 10 MB; otherwise, the page might not get indexed.
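
For reference, here is a bare-bones template skeleton that follows these rules (the style.css and script.js file names are placeholders):

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Page title</title>
  <!-- all styles live in a separate CSS file -->
  <link rel="stylesheet" href="/css/style.css">
</head>
<body>
  <h1>Used once for the page heading, never for design</h1>
  <p>Content goes here; every opening tag has a closing pair.</p>
  <!-- scripts load from standalone JS files -->
  <script src="/js/script.js"></script>
</body>
</html>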

Looking for broken tags

You can open a web page’s source code in your browser (such as Google Chrome). All of the unpaired tags will be highlighted in red.

The <base> tag

The <base> tag is placed within the <head> container and tells your browser the correct full base address for your website. This tag is most commonly used for websites with relative addresses (i.e., no domain name at the start, such as /catalog/tables/). The browser looks for the tag, uses it to discern the website's full address, and loads it correctly. For instance, if a document's address is specified as <base href="http://www.site/">, then adding an image requires only the relative address <img src="images/picture.jpg">. The picture's full destination resolves to http://www.site/images/picture.jpg, ensuring that the browser always fetches the correct file.


When setting up the <base> tag, always specify the full domain name, the correct protocol, and the correct WWW or no-WWW form. For example, if your website is indexed without the WWW and uses the HTTPS protocol, simply type https://site.com/ in your <base> tag.
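
In markup, that declaration looks like this:

<base href="https://site.com/">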

Generating friendly URLs

You must always make sure your URLs are SEO friendly.

A friendly URL is a unique website or webpage address that lets the user know the contents just by looking at the address.

Here are the important things to consider when generating URL addresses:

  • Page addresses shouldn’t be too long.
  • The URL must contain only Latin letters, numbers, and permitted symbols; ideally, it should consist of meaningful words.
  • Separate words with a hyphen. Don’t use spaces or underscores (_), as search engines may have issues processing them. Google specifically might ignore the underscores.
  • Don’t use the ! % ( ) " ' ; characters.
  • Use key queries in your address.
  • Use URL masking when creating a series of similar addresses with different parameters: a parameter like "290×95×82" can be rendered in the URL as 290-95-82.

General URL requirements:

  • Must use a single format for all addresses within the website.
  • Must support transliteration.

Proper site merging

Domain merging

By default, users can open both the WWW and the no-WWW version of your pages. Only one of the two should remain: pick the main version and merge the other into it using a Code 301 redirect.

Internal page merging

Internal pages should have only one true address: either with or without a / at the end of the address, although the version with the / is preferable. Merge the two variants using a Code 301 redirect.

Sometimes a page might also have multiple versions with different extensions (e.g., PHP, HTML, HTM). If this is the case, then you should also merge these versions with the preferred version that has a / at the end.

HTTP-to-HTTPS merging

For new websites, you should use HTTPS from the very start, as this is necessary for website promotion: an SSL certificate tells search engines that your website is secure. If you aren't using HTTPS yet, set up a redirect that takes users from the HTTP version of your website to the HTTPS version. If your main domain doesn't use the WWW in its name, also use a Code 301 redirect to send users from the http://www variant to the HTTPS link. Finally, don't forget to set up redirects from all the versions of your domain to the one domain you're going to use. You can verify all of this using various third-party online services.
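
As a sketch, here is how these merges might look on an Apache server with mod_rewrite enabled (assuming site.com as the main no-WWW domain; nginx and other servers have equivalent directives):

RewriteEngine On

# HTTP -> HTTPS, Code 301
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://site.com/$1 [R=301,L]

# www -> no-www, Code 301
RewriteCond %{HTTP_HOST} ^www\.site\.com$ [NC]
RewriteRule ^(.*)$ https://site.com/$1 [R=301,L]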

Setting up error codes

We use smallseotools.com to check server responses for our web pages.

Code 404 errors

Code 404 errors are a great help for proper site indexing, so make sure to set them up. A Code 404 tells both the user and the search engine that the desired page could not be found. If you remove a page from your website and make it return a Code 404 error, then it will eventually be excluded from indexing.

Code 500 errors

A 500 Internal Server Error is an umbrella HTTP status code that indicates some sort of server-side issue rendering a requested page unavailable. Some programmers make unavailable pages return a Code 500 error instead of a Code 404 error. Make sure to check whether your website returns a 500 Internal Server Error on any of its pages, as these pages will often get indexed. To solve the issue and exclude such pages from indexing, change the 500 error to a 404. However, the 404 error should be reserved for pages that don't actually exist: if a page is unavailable due to temporary server-side issues but is still hosted on your website, a server-side error code (500, or more precisely 503 Service Unavailable) is appropriate, as it tells search engines the page will become accessible later and must stay indexed.
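
Besides online checkers, you can verify any page's response code from the command line with curl (the page path is a placeholder):

curl -I https://site.com/removed-page

The first line of the output shows the status code: a deleted page should report 404 Not Found, not 500.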

Putting correct links to internal pages in your code

All links to internal site pages in your code should be generated using the same ruleset and format:

  • If your links end with a / internally, then make sure that’s how they’re written in the code.
  • If you’re using full addresses for your links, then all your links should follow that format.
  • If you’re using relative addresses (i.e., there is no main domain specified), they should start with a / if you aren’t using the <base> tag for your templates.

Sticking with the same style and format is important for keeping all your internal links direct, as doing otherwise will produce unnecessary steps in the redirect chain, which should be avoided.

Sitemaps

A sitemap is a map of your website’s layout stored in the XML format. It tells search engines which pages are currently available and should be indexed. You shouldn’t include non-content pages in your sitemap so that they don’t get indexed. The sitemap must be located in the website’s root folder, and its path must be specified in the robots.txt file.

Your sitemap must:

  • include the list of your website's URLs in a nested XML structure
  • specify page priority
  • include information on the latest updates to documents and their update schedule
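
Here is a minimal sketch of such a sitemap (the URL, date, and values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://site.com/catalog/tables/</loc>
    <lastmod>2020-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>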

Your sitemap should update automatically so that it displays only relevant pages, adds new pages once they appear on your website, and deletes the pages you removed from the website. The sitemap should also exclude filtered pages (the ones you don’t want to index) and paginated pages.

If your website’s content includes a lot of images, it’s recommended that you make standalone sitemaps for images and include the paths to them into the robots.txt file — especially if you host a lot of unique images.

External links

Most resources include links to external websites, such as links to developers' and promoters' websites, links to social media, and others. Just as with internal links, these external links can be anchored or unanchored and are marked with the A tag. The issue is that these links share their link juice with other websites. So, what should you do with them?

  • Links to your social media should stay indexed.
  • Unindex links to the developer and promoter websites via nofollow, although you can leave them indexed depending on your goals and agreements.
  • Links to established websites like Google or cyclowiki.org can stay indexed as they add to the content and are not sitewide. Linking to websites with a good reputation might also positively affect your website’s ranking.
  • Unindex other external links via noindex nofollow: <a href="http://site.com" rel="nofollow"><noindex>Anchor</noindex></a> 
  • All external links should open in a new window.

Important note: When unindexing links via rel=nofollow, the donor still loses some link juice, even though it’s not transferred to the external website.

Mobile site versions and adaptive web design

Each new year sees rapid growth of the smartphone user base, so the mobile version of your website should be as neatly designed and user-friendly as the desktop version. Your website should either have a standalone mobile version or use adaptive web design to suit all types of devices. Keep in mind that desktop and mobile searches return different results, so the two versions don't have to be identical: a mobile version should essentially be a standalone website tailored for mobile devices.

  • Adaptive web design lets you forgo the development of a dedicated mobile website, as you need only to edit some CSS parameters of the desktop version.
  • Having a standalone mobile version is a more costly but more flexible solution. A separate version lets you edit the mobile website without affecting the desktop one; mobile websites can load faster and be made more lightweight by shedding redundant code, and they can include dedicated features not present on the desktop version. However, you should make sure both versions share the same web address.

Whatever solution you choose, your website must be ready for desktop and mobile browsing, which is why you should think this through during the development stage. Create adaptive solutions tailored for both smartphones and tablets.
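
At a minimum, adaptive design needs the viewport meta tag plus media queries in your CSS. A bare-bones sketch (the breakpoint value and the .sidebar class are illustrative):

<meta name="viewport" content="width=device-width, initial-scale=1">

/* in the CSS file */
@media (max-width: 768px) {
  .sidebar { display: none; } /* hide secondary blocks on phones */
}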

The mobile website should be hosted on the same domain and does not need its own subdomain.

Google tracking tags

Your website must contain tracking tags for all major search engines. This lets you:

  • Collect statistics on traffic and user actions.
  • Understand which pages are more interesting and useful and which are not.
  • Tell search engines when you update your website and when new users arrive. This keeps page indexing up-to-date, which in turn positively affects your site's ranking and search results.
  • Find low-quality pages and other useful information.

If you’re making a new website, create a new Google tracking tag for your website and place it at the top of your website’s template in the <head> block.

If you’re hosting your new website on an old domain (e.g., you’re redesigning it), simply transfer the old tag to that domain. This enables you to retain all your data and track your traffic once you launch the new website.

Hiding your pages from indexing

Larger websites often include complex filters that let you search for products in a catalog, which generates a lot of URLs with specific parameters. These parameters either cannot or should not be optimized. However, these URLs often return the HTTP 200 OK code, which means they can still be indexed. Depending on the website’s setup, it can also generate redundant URLs with redundant parameters for other pages, which might be duplicates of important pages. Because your goal is not letting useless pages get indexed, you should unindex these kinds of pages.

Here is how you can do so:

  • using the rel="canonical" tag
  • using the Disallow directive in the robots.txt file
  • using the noindex meta tag in the page's <head> section
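
Here are minimal sketches of all three approaches (the /catalog/tables/ path and the filter parameter are placeholder examples):

<!-- 1. A canonical tag in the duplicate page's <head>, pointing to the main page -->
<link rel="canonical" href="https://site.com/catalog/tables/">

# 2. A Disallow rule in robots.txt that blocks all filtered URLs
User-agent: *
Disallow: /*?filter=

<!-- 3. A robots meta tag in the unwanted page's <head> -->
<meta name="robots" content="noindex, follow">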

Setting up pagination

Large websites hosting extensive product or article catalogs often make use of pagination. Pagination breaks your catalog down into multiple pages and gives each its own page number. Each page lists unique products and articles, usually with no repeats. Because paginated pages are basically all the same, they don't really need indexing. The strategy here is to unindex the pages themselves while keeping the products and items they host indexable; otherwise, those items might get ignored by search engines. To do this, you need to let search bots visit the paginated pages without indexing them or collecting their link juice. You can use the Canonical tag for this.

Here is another way to set up pagination:

  • Configure meta tags on paginated pages.
  • Don’t index text taken from the main catalog page.
  • Don’t create Page 1, as it’d just be a duplicate. Instead, set up a Code 301 redirect to take users from the page with the page=1 parameter to the main section.
  • Set up the Canonical tag on paginated pages in favor of the corresponding main section of the catalog.
  • If paginated pages get indexed anyway, then add the noindex, follow meta tag to those pages.
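
For example, a second paginated page might carry the following in its <head> (the catalog path is a placeholder):

<!-- On https://site.com/catalog/tables/?page=2 -->
<link rel="canonical" href="https://site.com/catalog/tables/">
<!-- only if paginated pages keep getting indexed anyway -->
<meta name="robots" content="noindex, follow">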

Tags for editing content

Typically, websites update their content fairly often. To make adding new information quick and easy, you need to set up the main content editing tags without adding styles inside the tags themselves, as styles should go into their own separate file.

For content editing, use simple tags (p, ul, ol, h2, h3, and so on) with no styles. Including classes in your tags makes editing and adding text much more cumbersome, because you'd have to keep track of classes or tag styles and insert them manually, all of which takes up time whenever you update your content.
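
For example (the class and style values in the first variant are hypothetical):

<!-- Hard to maintain: styles baked into the content -->
<p class="text-block" style="font-size: 14px; color: #333;">New paragraph</p>

<!-- Easy to maintain: a plain tag, with styling kept in the CSS file -->
<p>New paragraph</p>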

Page load time

One thing that makes your website stand out is how fast it loads. Load time affects your search engine ranking and sales conversion. You should always strive to make your websites load quickly, sometimes improving your load times using dedicated third-party services. Make sure your website loads as quickly and smoothly as possible before you launch it.

Obviously, load speeds might differ for desktops and mobile devices, with mobile websites usually loading much slower than their desktop counterparts. Make sure you bring them as close to parity as possible before launching your website. Ideally, the page load time should not exceed 2 or 3 seconds.

Page load times are an important factor that both users and search engines take note of. If your website takes too long to load, the user might just leave, whereas search engines directly measure your website’s load time, giving higher rankings to faster websites.

Search engines also track user behavior, so if users stop visiting your website or close the pages before they load, then your ranking is going to decrease. You can check your website's load times with dedicated third-party services such as Google PageSpeed Insights.

Last-Modified

If your resource includes a lot of pages, we recommend using the Last-Modified header. Search bots have a limited timeframe for parsing your website and might not reach all the pages you need, which is why this header matters.

In simpler terms, if a robot has already visited your page once, then on its next visit it checks the Last-Modified value and compares it against the date of its previous visit. If the page hasn't changed since then, the robot skips it and moves on to the next one. This means the search engine doesn't waste time re-indexing pages that haven't been updated and takes less time to index the rest of the website.

The header also notifies the client about the date of the web page's last update. According to the RFC 2616 specification, the client can ask the web server whether the page was updated after a specified date by sending the If-Modified-Since header. If the page wasn't updated, the server returns only the 304 Not Modified header. Otherwise, the server returns the 200 OK header and the page body. The browser doesn't need to load the same page over and over, and the web server doesn't have to transmit as much data.
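
The exchange looks roughly like this (dates and paths are illustrative):

# Client request:
GET /catalog/tables/ HTTP/1.1
Host: site.com
If-Modified-Since: Wed, 15 Jan 2020 10:00:00 GMT

# Server response if the page is unchanged:
HTTP/1.1 304 Not Modified

# Server response if the page was updated:
HTTP/1.1 200 OK
Last-Modified: Mon, 03 Feb 2020 08:30:00 GMT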

Breadcrumbs

Breadcrumbs are sequences of links that start from the main page and help with site navigation. They are used to specify the path of the page you’re currently on. Breadcrumbs are placed at the top of the page, before the main content.

Use cases:

  • Improving the website's UX.
  • Displaying the site's structure.
  • Page cross-linking.
  • Possibly affecting the displayed website snippet.

Keep these things in mind when creating breadcrumbs:

  • You don’t really need them if your website’s not too large.
  • Breadcrumb names shouldn't fully match the H1 headers or the website's section names.
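
If you want breadcrumbs to influence your snippet, mark them up with structured data. Here is a minimal sketch using schema.org microdata (URLs and names are placeholders):

<ol itemscope itemtype="https://schema.org/BreadcrumbList">
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="https://site.com/catalog/"><span itemprop="name">Catalog</span></a>
    <meta itemprop="position" content="1">
  </li>
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <span itemprop="name">Tables</span>
    <meta itemprop="position" content="2">
  </li>
</ol>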

Feedback forms

Feedback forms are website elements that help you interact with your user base. They let users carry out various actions instead of passively consuming content: putting products in their cart, placing orders, asking questions, chatting, writing comments, or making calls.

You should include feedback forms when possible, but do so in a way that does not annoy users. Make them easily discernible, user-friendly, and easy to use. Use as few required fields as possible. In short, your goal is to make users actually want to engage with them. Create checks for certain fields, like email and phone number, so that you receive all the necessary information from users.

You also need to set up goals in your tracking tags for these forms. This lets you track various user actions: whether users start typing, put a product in their cart, send an order request, or abandon the form midway.
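
HTML5 input types give you basic field checks for free. A minimal sketch (the form action URL and the phone pattern are placeholders):

<form action="/feedback" method="post">
  <input type="text" name="name" placeholder="Name">
  <!-- the browser validates the email format before submitting -->
  <input type="email" name="email" required>
  <!-- a simple length/character check for phone numbers -->
  <input type="tel" name="phone" pattern="[0-9+ ()-]{7,}">
  <button type="submit">Send</button>
</form>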

Feedback forms help you:

  • Boost user loyalty
  • Improve your product or service
  • Receive feedback from users
  • Boost sales

SEO fields

When developing a new website, you should try to accommodate fields for your website's main meta parameters, although not every field applies to every page. You should also keep in mind that field values must not overlap (e.g., the H1 field shouldn't match the title field, and the title field shouldn't match the description field).

Here are some admin dashboard fields you need to include:

  • Page name for the admin dashboard
  • Title field
  • Description field
  • Keywords field
  • H1 field
  • Texts field (for online stores, it should display texts before and after product listings)
  • SEO-friendly URL field
  • Code 301 redirect field
  • Breadcrumb name field
  • Menu item field

Doing this will make it easy to manually change any desired parameter at any time without having to do any programming. Moreover, filling out all the SEO fields should be set up by default for all the various page types. Keep in mind that all the fields must be unique, with no full overlap.
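
When the templates render, these fields map onto the page roughly like this (the values are placeholders):

<head>
  <title>Title field</title>
  <meta name="description" content="Description field">
  <meta name="keywords" content="Keywords field">
</head>
<body>
  <h1>H1 field</h1>
</body>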

Cross-browser compatibility

Cross-browser compatibility implies that your resource is displayed in the same manner across all major web browsers, with no crashes, layout errors, or compatibility issues. The content must be displayed correctly and legibly across all browsers. To make sure you have no compatibility issues, try to forgo CSS hacks, thoroughly plan out your website's layout design, and use vendor prefixes.

Conclusion

Please keep these terms of reference and technical tasks in mind during development, as they are vital for SEO and website promotion. Using this guide will equip you with the basic know-how on how to promote your resource across all major search engines. Now the only thing left to do is make sure your website boasts a user-friendly interface and some engaging content. Good luck!

Are you looking for efficient SEO?

Contact us for professional service and support