After six months of hard work, you finally launch a new site. And then you realize that from an SEO point of view the site is lame: indexing is limping, duplicates are multiplying, and the number of recommendations from search engines keeps growing. As a result, the site loses visibility in search.

Urgently fixing the site structure, cleaning it of duplicates, and optimizing page load speed is quite a laborious task. And reaction times in SEO are often measured in months.

How do you develop a site that is right from an SEO standpoint? What needs to go into the technical specification for the developer? That is what I will talk about in this article.

What needs to be done at the start


First of all, we select key queries and group them into a semantic core. If you do not know how to collect semantics, read the excellent post by my colleague Ekaterina Chikulaeva.

With the clustered semantics in front of us, we think through the site structure in advance and distribute the keywords across sections. That way you will not have to optimize the finished site later on the principle of “these queries go here, and whatever does not fit goes on the blog”.

Based on the thought-out structure, we write a specification for the copywriter. I probably do not need to remind you that all texts must be unique.

And only after all this do we start prototyping. At this stage, pay attention to the H1, H2, and H3 headings and to how the text is distributed across the pages. Do not forget about the correct heading structure – for example, the hierarchy sketched below.

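A minimal sketch of such a hierarchy for a service page (the section names are placeholders):

    <h1>Coffee table restoration</h1>
      <h2>Our services</h2>
        <h3>Polishing</h3>
        <h3>Reupholstering</h3>
      <h2>Prices</h2>
      <h2>Contacts</h2>

One H1 per page, with H2 and H3 nested under it in order, without skipping levels.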

So, we have a structure and an understanding of how to distribute the text across pages. Let's find out what else needs to be built into the future site.

Meta tags

Meta tags are special HTML tags located in the head container that transmit page information to search engines and browsers. Without them, developing an SEO-competent website is unthinkable.  

The title tag is the page title: it is what forms the headline in the search snippet. Since it is the title, it should sit in the head above the meta tags that describe the page content.


Every page must have a description meta tag. When I audit sites, I sometimes notice that it gets forgotten. Adding it is not enough: fill it with unique content that includes a key query. The description meta tag matters because it participates in the formation of the search snippet.
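Together in the head, this pair might look like so (the texts and site name are placeholders):

    <head>
      <title>Handmade coffee tables to order | Site Name</title>
      <meta name="description" content="Handmade oak coffee tables. Free delivery, assembly and a two-year warranty.">
    </head>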

For social networks to understand your pages, add Open Graph markup. I advise you to make the output conditional: if a markup field has no value, that field is not output.

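A sketch of such conditional output in a PHP template ($page and its fields are hypothetical – substitute whatever your CMS provides):

    <meta property="og:title" content="<?= htmlspecialchars($page->title) ?>">
    <meta property="og:type" content="article">
    <?php if (!empty($page->image)): ?>
    <meta property="og:image" content="<?= htmlspecialchars($page->image) ?>">
    <?php endif; ?>
    <?php if (!empty($page->description)): ?>
    <meta property="og:description" content="<?= htmlspecialchars($page->description) ?>">
    <?php endif; ?>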

The viewport meta tag informs search engines that your site is optimized for mobile devices. However, if the site's layout is not responsive, viewport will not save you.
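The standard form of the tag:

    <meta name="viewport" content="width=device-width, initial-scale=1">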

Remember the canonical tag. This is an important element of the site, because it is what tells the search engine the true address of a page. Canonical gets rid of problems with duplicate pages.
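It is implemented as a link element in the head; for example, on https://site.ru/catalog/tables/ (an illustrative URL):

    <link rel="canonical" href="https://site.ru/catalog/tables/">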

Finally, each page should have functionality for managing indexing from the admin panel, through a construction of the form <meta name="robots" content="index, follow" />. This value allows search bots to index the page and follow its links.


URL


First of all, let me remind you of the obvious: the new site should work over the HTTPS protocol. This removes a whole class of security problems and increases search engine loyalty.

If you are creating a new site to replace an old one, try to make the URLs of the new pages and sections match the old site's URLs as much as possible. If this is not possible, configure 301 redirects from the old pages to the new ones.

The site must have one address, and for the mirrors you also need to configure 301 redirects. That is, if your main address is https://site.ru, then the site should not be accessible at addresses like https://www.site.ru, http://site.ru and http://www.site.ru.

The same applies to sections of the site: there should not be two variants of the same address, for example www.yoursite.com/contacts/ and www.yoursite.com/contacts. The URL must match the address specified in the canonical tag; from the wrong variant, configure a 301 redirect to the right one.
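A sketch of how this might look in nginx (certificate directives are omitted for brevity; Apache users can do the same with mod_rewrite in .htaccess):

    # All mirrors redirect to the single main address
    server {
        listen 80;
        server_name site.ru www.site.ru;
        return 301 https://site.ru$request_uri;
    }

    server {
        listen 443 ssl;
        server_name www.site.ru;
        return 301 https://site.ru$request_uri;
    }

    server {
        listen 443 ssl;
        server_name site.ru;
        # Normalize section URLs that lack a trailing slash (paths without a dot)
        rewrite ^([^.]*[^/])$ $1/ permanent;
        # ... the rest of the site configuration
    }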

Finally, the last tip is for those who plan to run a blog or news feed on the site: in the canonical tag of pagination pages, output the address of the main listing page.
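For example, on the second page of the blog (an illustrative URL) the head would contain:

    <!-- on https://site.ru/blog/page/2/ -->
    <link rel="canonical" href="https://site.ru/blog/">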


Indexing


Add the rel="next" and rel="prev" attributes to the pagination links of posts and news to indicate the relationship between several URLs. This will help Google determine that the content of these pages is connected in a logical sequence.
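These are typically placed as link elements in the head; for the second page of a paginated feed (URLs are placeholders):

    <link rel="prev" href="https://site.ru/blog/page/1/">
    <link rel="next" href="https://site.ru/blog/page/3/">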

Breadcrumbs are needed and important! In addition to being an important element of usability, breadcrumbs help robots better understand the structure of the site and can participate in the formation of a search snippet.
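A breadcrumb trail with schema.org markup might be sketched like this (names and URLs are placeholders):

    <ol itemscope itemtype="https://schema.org/BreadcrumbList">
      <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
        <a itemprop="item" href="https://site.ru/"><span itemprop="name">Home</span></a>
        <meta itemprop="position" content="1">
      </li>
      <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
        <a itemprop="item" href="https://site.ru/blog/"><span itemprop="name">Blog</span></a>
        <meta itemprop="position" content="2">
      </li>
    </ol>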

The sitemap.xml file (the site map) passes the list of the site's pages to search engines. The contents of this file should not contradict what is open for indexing; accordingly, pages closed from indexing should not end up in it.
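A minimal example of the file (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://site.ru/</loc>
        <lastmod>2020-01-15</lastmod>
      </url>
    </urlset>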

Also, sitemap.xml should report the date each page was last modified. If for some reason it does not, that is not a disaster: let the web server report the modification date instead. To do this, implement the Last-Modified response header on the server and handle the If-Modified-Since request header. This saves the search robot a lot of crawling time – and therefore the crawl budget that search engines allocate to sites.
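A minimal PHP sketch of that handshake (where $lastModified comes from is up to your CMS):

    <?php
    // Timestamp of the page's last change – e.g. from the CMS database (hypothetical)
    $lastModified = strtotime('2020-01-15 10:00:00');

    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');

    // If the robot already has a fresh copy, answer 304 and skip rendering
    $since = $_SERVER['HTTP_IF_MODIFIED_SINCE'] ?? null;
    if ($since !== null && strtotime($since) >= $lastModified) {
        http_response_code(304);
        exit;
    }
    // ...otherwise render the page as usual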

And of course, configure the robots.txt file in advance – each CMS has its own peculiarities, so you will have to study the topic separately. The file will most likely need to be supplemented later, but the basic settings are already a great start.
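A generic starting point (the disallowed paths are illustrative – adjust them to your CMS):

    User-agent: *
    Disallow: /admin/
    Disallow: /search/
    Disallow: /*?sort=

    Sitemap: https://site.ru/sitemap.xml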

Close the static weight leak

A reminder for newcomers to the SEO world: static weight is the level of authority of a page in the eyes of a search engine. It is passed from page to page via links – both internal and external.

Links to a phone number or email address – <a href="mailto:t.azizov@leadmachine.ru"> and <a href="tel:+7123456789"> – should have the rel="nofollow" attribute. After all, these are still links, so weight leaks away from every page on which they appear. The attribute helps to avoid this.

Pages should not link to themselves. For example, when a page is active, the menu should not contain a link to it. The same applies to a floating navigation menu with links like <a href="http://yourpage.ru#razdel">section</a>: search engines do not read what comes after the hash, so in their eyes this is equivalent to a page linking to itself. I recommend adding rel="nofollow" to such links as well – both cases are sketched below.
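A sketch of both cases (addresses and anchors are placeholders):

    <!-- Contact links are still links – close off the weight leak -->
    <a href="mailto:info@site.ru" rel="nofollow">Email us</a>
    <a href="tel:+7123456789" rel="nofollow">Call us</a>

    <!-- An in-page anchor looks like a self-link to robots -->
    <a href="#razdel" rel="nofollow">Section</a>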

Do not forget about the logo on the main page: it should not be wrapped in a link either.

Loading speed


The site should work quickly on desktops, on phones, and on tablets. Search engines have published data suggesting that a page should open within 3 seconds, but to me even that seems too long. Therefore, I recommend instructing the site's developers to configure server caching. The topic is big, so I will run through it in brief.

The server on which the site is located can run on the Apache + NGINX bundle: the first is responsible for the dynamics (the output of scripts), the second serves the statics. Another option is the NGINX + PHP-FPM bundle: NGINX is responsible for the statics, and PHP-FPM for processing the scripts. This is currently one of the fastest and most productive environments for PHP sites.
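A sketch of the NGINX + PHP-FPM wiring (the root path and socket path vary by server and PHP version):

    server {
        listen 443 ssl;
        server_name site.ru;
        root /var/www/site;
        index index.php;

        # Static files are served by NGINX directly
        location / {
            try_files $uri $uri/ /index.php?$args;
        }

        # Scripts are handed off to PHP-FPM
        location ~ \.php$ {
            include fastcgi_params;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            fastcgi_pass unix:/run/php/php-fpm.sock;
        }
    }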

Cache everything you can: database queries, image thumbnails, script results. The exception is an online store: do not cache the session there, otherwise the shopping cart will stop working.

Do not forget about browser caching. Be careful with it and do not set too long a lifetime, otherwise a visitor may not see the changes on the site.
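In nginx, browser caching for static assets might look like this (a week is an illustrative compromise):

    location ~* \.(css|js|png|jpg|jpeg|gif|svg|webp)$ {
        expires 7d;
        add_header Cache-Control "public";
    }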

Site styles and scripts (CSS and JS) need to be minified. Output CSS in the head container and plug JS in at the end of the body. If you output CSS after the HTML has loaded, at some point visitors will see your site as bare HTML – it looks untidy and is annoying.
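Schematically (file names are placeholders):

    <head>
      <link rel="stylesheet" href="/css/styles.min.css">
    </head>
    <body>
      <!-- page content -->
      <script src="/js/app.min.js"></script>
    </body>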

Do not forget to compress the HTML as well. Yes, after that the code will turn into a mess. But do not be alarmed – this is normal!

And of course, optimize all the images on the site for the web. It is often because of them that pages take a long time to load.

Last few tips

Ideally, give the site a permanent (static) IP address rather than a dynamic one. This earns you a small plus to loyalty karma with the search engines. Also remember that the site should not be accessible directly by its IP address; if it is, write to the server administrator.
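One way to close direct IP access in nginx is a catch-all default server (a sketch; certificate directives omitted):

    # Requests arriving by bare IP or an unknown Host are dropped
    server {
        listen 80 default_server;
        listen 443 ssl default_server;
        server_name _;
        return 444;  # nginx-specific: close the connection without a response
    }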

The admin panel should let you set alt and title for images. Do not use headings like <H1 class="xxx" id="xxx">: neither classes nor ids belong there. Only this: <H1>text</H1>. Keywords should be distributed evenly throughout the page.
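In the templates this boils down to (values are placeholders filled in from the admin panel):

    <h1>Handmade coffee tables</h1>
    <img src="/img/table.jpg" alt="Oak coffee table" title="Oak coffee table">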

Try to avoid large walls of text at the bottom of the page – search engines value such pages less. For those who do not know how to distribute text on a page, I recommend reading the article by my colleague Elena Chulanova on optimizing a blog post.

Add HTML micro-markup (aka semantic markup) to the site: with the help of special markers it informs the search engine about particular elements on the pages. Micro-markup helps improve how the site is presented in search results. By the way, according to Google, the mere presence of micro-markup does not directly affect ranking.
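A small schema.org microdata sketch for an article page (all values are placeholders):

    <article itemscope itemtype="https://schema.org/Article">
      <h1 itemprop="headline">How to develop an SEO-friendly site</h1>
      <span itemprop="author">Author Name</span>
      <time itemprop="datePublished" datetime="2020-01-15">15 January 2020</time>
      <div itemprop="articleBody">Article text…</div>
    </article>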
