A – attention, I – interest, D – desire, A – action.
AIDA is a copywriting formula, but, as I understand it, it is a scheme suitable for any type of advertising. The essence of the formula is that advertising is built around four clearly defined stages, which give the method its name.
At the first stage (attention), you need to catch the client's attention and draw him into the ad.
At the second stage (interest), you need to hold the potential client's attention and entice him with new, interesting information. For example, it may be a fact or a life story that describes the advantages of the product or draws the client's attention to some area of its use.
At the third stage (desire), it is important to describe the advantages of the unique selling proposition correctly – the benefits the customer receives when making the purchase, reviews from those who have already bought, possibly a guarantee. This stage nudges the customer toward the purchase.
And the last stage (action) should create an atmosphere in which the client understands that the purchase must be made right now. At this stage you need techniques that push the client to act – a limited quantity of goods, a time limit, the chance to receive an additional bonus for buying as quickly as possible, and so on.
2. Visual text formatting
Text can be formatted in many ways. Firstly, when writing HTML code, you can use different tags to create different effects – size, format, font color, background, etc. But most often HTML is used to create the page structure, while CSS handles its visual presentation. There are also editors for formatting text in HTML.
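As a minimal sketch of that division of labor (the class name and colors are invented for illustration), the HTML carries only the structure while a CSS rule supplies the visual effect:

```html
<!-- structure: the HTML only says *what* this text is -->
<p class="warning">Only 3 items left in stock!</p>
```

```css
/* presentation: the CSS says *how* it should look */
.warning {
  color: #c0392b;
  font-weight: bold;
}
```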
3. SEO metadata filling
There are several tags in HTML that are called meta tags – title, description and keywords; the h1 and h2 heading tags are also important.
Title is the name of the page, and it is usually displayed as the link text in search engine results. The title tag must include keywords (sorry for the tautology) and should briefly but comprehensively convey the content of the page.
Description is a meta tag that carries a brief description of the page; it is used by search engines for indexing, and sometimes for building the snippets under links in SERPs.
h1 is the tag in which we write the most significant heading on the page – the title of the page, a table, anything. There should be only one on the whole page, and it should use high-frequency queries.
h2 is the second most important heading on the page, and it targets secondary queries. Most often it is used for the main subsections.
Keywords contains a summary of the page content. That is, the content of the page is condensed into the few keywords of this tag – for example, the names of the page's sections and subsections, etc.
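Putting the tags above together, a page head and headings might look like this (the shop name and phrases are, of course, invented for illustration):

```html
<head>
  <title>Garden Furniture – Buy Benches and Tables | Example Shop</title>
  <meta name="description" content="Wooden garden benches and tables with delivery. Reviews, guarantees and current discounts.">
  <meta name="keywords" content="garden furniture, wooden bench, garden table">
</head>
<body>
  <h1>Garden Furniture</h1>   <!-- the single, high-frequency heading -->
  <h2>Wooden benches</h2>     <!-- a subsection with a secondary query -->
</body>
```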
Need some help?
Here at SEO Basics, we’ve got a great team of experts who are passionate about helping you get better results.
Why struggle, and lose time and money over SEO, when you can have someone do it for you?
Skyrocket your online business with optimised SEO 🚀
4. Weight distribution on the pages of the site
A tool for distributing weight across the pages of a site is linking.
Promoting just the main page of a site is fundamentally wrong. The fact is that the main page often does not showcase the features that could be used in promotion. For example, subsection pages may fit a particular query better, and it is better to use them to promote that query. Also, a well-promoted main page will not change the situation if all the other pages rank poorly. Moreover, PageRank is calculated separately for each page, so promotion should be comprehensive.
During linking, links are exchanged between the pages of the site; in this way, pages with higher indicators pass their weight to other pages. The methods of linking vary, and no single system exists. Experts describe the “star” and “ring” schemes, but I have not yet had to design such schemes for specific sites and use them in practice.
5. Types of internal links
An internal link connects one page of a site to another; that is, internal links are what internal linking is implemented with. They are convenient when, while writing an article or any other content, it would be appropriate to refer to another article, which may be a source or may simply interest the reader.
Internal links perform three main tasks:
- Helping users navigate the site;
- Defining the architecture and hierarchy of the site;
- Helping rank the pages of the site.
In addition, internal links improve the search engine optimization of the site – they help search engines, create a connected network of posts on the site, and are also the reason why readers stay longer on the site.
Now, about the types of internal links.
The following types of internal links are distinguished:
- Links of the main menu (including the submenu, drop-down tabs, etc.);
- Breadcrumb navigation bar (the path to the current page within the site);
- Anchor links inside the page (move to a given section of the page);
- Links from the right, left, upper, lower blocks of the site (advertising, logos, news, slideshows, etc.);
- Links within the body of the page (articles, material, descriptions, etc.).
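To make the list concrete, here is roughly what each kind of link looks like in markup (all URLs and ids are invented for illustration):

```html
<!-- Main menu link -->
<nav><a href="/blog/">Blog</a></nav>

<!-- Breadcrumb navigation bar -->
<a href="/">Home</a> » <a href="/blog/">Blog</a> » Current article

<!-- Anchor link that jumps to a section of the same page -->
<a href="#results">Skip to results</a>
<h2 id="results">Results</h2>

<!-- Link inside the body of an article -->
<p>We covered this in <a href="/blog/seo-basics/">SEO basics</a>.</p>
```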
6. Human-readable URLs
A human-readable URL is a page address that has been converted from a dynamically generated one into something a person can understand. Such addresses have their advantages. First, they are easier to remember. If a visitor is interested in certain content on the site, he can easily associate the content of the page with its name and address. Well, I think so.
Also, the way addresses are composed shows the user how navigation on the site is arranged.
The way the page address is filled in, and the presence of keywords in it, also affect the position of the site in the search results. If the page address contains keywords, the link is highlighted as relevant in the results and the number of clicks on it increases.
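For illustration (the paths and parameters are invented), the same article might exist at a dynamic address and a human-readable one, with a rewrite rule on an Apache server mapping one onto the other:

```apacheconf
# Dynamic address:        /index.php?section=blog&id=857
# Human-readable address: /blog/seo-basics/
RewriteEngine On
RewriteRule ^blog/([a-z0-9-]+)/?$ /index.php?section=blog&slug=$1 [L,QSA]
```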
Breadcrumbs show the position of the current page in the site hierarchy; that is, they help the user navigate the site. More than one chain of breadcrumbs can lead to the same article or page, since the same material may topically belong to several sections of the site.
Breadcrumbs were created to simplify the user's work with the site, to let him navigate articles and sections, and to make moving around the site convenient and understandable. The end goal, apparently, is to create such conditions that the visitor will want to keep working with the site in the future.
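A breadcrumb trail is ordinarily just a chain of links; a minimal sketch (the section names are invented):

```html
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> »
  <a href="/blog/">Blog</a> »
  <span>SEO Basics</span>   <!-- current page, not a link -->
</nav>
```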
7. Harmony of usability and behavioral factors
Behavioral factors are metrics that characterize user behavior in search results and directly on the site.
Behavioral factors are also ranking factors. They characterize the resource by how users behave on it – for example, how often people follow the link to the site from the results and how long they stay after clicking through.
The task of behavioral factors is to improve the quality of search results. To achieve this, search engines analyze user behavior on sites, their queries, etc.
Usability is how easy the site is to use. It depends on many factors – the quality of the content (the uniqueness of the articles, the quality of the graphics and video) as well as the organization of navigation on the site.
In general, there is the rule of 3 clicks: if a visitor landing on an unfamiliar site does not find the required information within three clicks, he leaves. Accordingly, the site structure should be designed so that a new visitor can easily find what he wants without wasting time.
As I understand it, the harmony of usability and behavioral factors means building a site with high convenience indicators so that all the above conditions are met. A well-designed site with high-quality content will attract users, and this will be reflected in their behavior – time spent on the site will grow, the number of visits will grow, and therefore the site's position in search results will rise. Did I manage to answer this question? 🙂
8. Compliance with clean code and CSS styles
Good, clean code is primarily hand-written code, not code generated by a visual editor. When writing HTML markup, some rules should be observed – the content should be neatly formatted, contain comments, and be arranged in logical blocks according to its functions.
Proper use of CSS is also important. Firstly, implementing the design in CSS comes out much more compact than in pure HTML, which reduces the load on the server. Secondly, CSS was designed for this task from the start, so making changes and adjustments in a CSS file is much easier, more comfortable and less laborious. If you observe all these nuances, indexing of the site by search robots will be simpler.
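As a small, hypothetical before-and-after: repeating the presentation inline bloats every page, while a single CSS rule states it once:

```html
<!-- bloated: the same styling repeated on every paragraph -->
<p style="font-family: Arial; color: #333; font-size: 14px;">First paragraph…</p>
<p style="font-family: Arial; color: #333; font-size: 14px;">Second paragraph…</p>

<!-- clean: structure only -->
<p>First paragraph…</p>
<p>Second paragraph…</p>
```

```css
/* typography block: applied to every paragraph at once */
p {
  font-family: Arial, sans-serif;
  color: #333;
  font-size: 14px;
}
```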
There are ways to check HTML and CSS code for validity. The main standard against which the audit is carried out is the recommendations of the W3C. There is a set of common errors and shortcomings that cause code to be recognized as invalid. But the main problem – one that is, however, gradually being solved – is cross-browser compatibility. Because different elements are rendered differently in different browsers, and in some not at all, developers sometimes have to write deliberately incorrect code that will not pass validation but will display in all browsers.
9. 301 redirect, 404 page and server responses
A 301 redirect forwards the visitor to a new address at the server level.
Such redirection is used when a page's address (its URL) has changed and the stream of users must be redirected from the old address to the new one; to transfer TIC and PageRank indicators to a new domain; to provide access to the same site through several addresses; when a page has moved elsewhere; and to eliminate mirror domains – merging the site address with www and the address without it. A 301 redirect passes roughly 90–99% of the weight from the old resource to the new one.
A 301 redirect is a way to maintain a site's position in search engines when moving it to a new domain or changing the content management system. Forwarding can be configured in several ways, depending on the installed software.
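On an Apache server, for example, the redirects could be set up in an .htaccess file (the domains and paths are invented for illustration):

```apacheconf
# Redirect a single moved page
Redirect 301 /old-page.html /new-page.html

# Merge the www mirror into the bare domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```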
A 404 error occurs when there is no content at the requested address. When a user wants to open a specific page on a site, he uses its address; the browser contacts the server at that address and downloads the page content. If the site works but there is no information at the specified address, the server responds with a 404 error – page not found.
This error can occur for several reasons. On the client side, the reasons are a poor connection or a mistyped address.
The reasons a page may be missing from the site are that the page was deleted, the link to it is out of date, or its address was changed without setting up the 301 redirect I wrote about above.
As smart people teach on Habrahabr, the 404 error page can also be put to work. Firstly, this page must exist – users need to be notified of the problem they have run into and, if possible, offered alternatives. In addition to a link to the main page, you can add an explanation of why the page is missing. It is also advised to give the 404 page the same look as the rest of the site. In fact, there are many options for designing 404 pages, and all of them aim not only to avoid losing the user, but to create an atmosphere in which he will want to keep using the site.
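On Apache, pointing the server at a custom 404 page is a one-line directive (the file name is, again, an assumption):

```apacheconf
# Serve our own styled page whenever a URL is not found
ErrorDocument 404 /404.html
```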
10. Robots.txt and sitemap.xml files
The robots.txt file is a utility file that lets the site developer restrict search engines' access to the content of web documents. It is needed so that search robots index the pages that should be indexed, while pages with non-unique content, duplicates, service folders and documents of no use to the visitor are left out of the index. robots.txt is a plain text file. In addition to it, there is also the sitemap, which performs the opposite function – it makes it easier for robots to reach the contents of the site.
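A typical robots.txt might look like this (the disallowed paths are invented service sections):

```
User-agent: *
Disallow: /admin/     # service folder of no use to visitors
Disallow: /search/    # duplicate, non-unique content
Sitemap: https://example.com/sitemap.xml
```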
A sitemap is an XML file with information for search engines (such as Google, Yahoo, Ask.com, MSN, Yandex) about the website pages that should be indexed. It helps search engines locate the site's pages, see when they were last updated, how often they change, and how important they are relative to other pages, so that the search engine can index the site more intelligently.
The file is not mandatory and is by no means a strict instruction for indexing the site. It only gives recommendations – which, however, a search engine may ignore entirely.
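The sitemap format itself is simple XML, per the sitemaps.org protocol (the URL and date are invented):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/seo-basics/</loc>
    <lastmod>2023-01-15</lastmod>     <!-- when the page last changed -->
    <changefreq>monthly</changefreq>  <!-- expected frequency of updates -->
    <priority>0.8</priority>          <!-- importance relative to other pages -->
  </url>
</urlset>
```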
Do you want to get more leads?
Optimising your SEO strategy means optimising your lead generation. We offer support packages for all types of organisations.
Find out how we can help you skyrocket your online business.