SEO and JavaScript

Creating websites requires the use of many technologies and solutions. The rapid development of the industry and the growing expectations of users make it necessary to find new ways of presenting content. Unfortunately, the appearance of websites and the way their content is rendered do not always match the expectations of Google, which for a long time has had trouble rendering pages based on JavaScript. To this day, positioning JS-based websites requires a special approach.

Theory, or what exactly is JavaScript?

A website is code that browsers read and render as instructed. Successive elements of a page are added with HTML tags. The page's appearance requires an additional file written in CSS; most often the styles live in a style.css file.

Websites built on HTML and CSS alone are relatively simple. Creating complex websites requires more technically advanced solutions that allow for dynamic content delivery, HTML rendering at the server level, and processing information from users and saving it in a database. Additional technologies also make the way information is presented on the website more attractive.

Building dynamic applications is possible thanks to a wide range of programming languages that extend the functionality of websites. One of the most popular programming languages of recent years is JavaScript. This is confirmed, among others, by The 2020 State of the Octoverse report summarizing developer activity on the GitHub platform.

JavaScript was created in late 1995. Initially it was used primarily to validate information entered by the user (e.g. the correctness of an e-mail address in a form); today it allows you to build dynamic websites. Thanks to scripts, the website "lives" and reacts to the user's actions – the image gallery becomes interactive, pop-ups can be closed, and Google Analytics collects information about traffic on the site. JS is also responsible for other elements without which websites do not seem to meet modern standards:
– infinite scroll (i.e. loading subsequent elements, e.g. products without reloading the page),
– comments and ratings,
– internal linking,
– “the best” lists of products or articles.

Choosing JavaScript as the technology the website is based on also lets you embed elements fetched from external sources, for example the Google Maps API or APIs from social networks.

JavaScript code lives in a file saved as .js (usually script.js) – a link to it is inserted in the head section of the page or, as is recommended, immediately before the closing body tag. Some code fragments are also placed directly between HTML tags (e.g. the script responsible for Google Analytics), which allows the code to be executed (i.e. a specific action to occur on the page) before the entire HTML and CSS structure is loaded. Unfortunately, such inline scripts usually have a negative impact on the rendering speed of the website, so it is worth using this option sparingly.
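For illustration, a minimal sketch of the two placements described above (the inline snippet merely stands in for something like an analytics tag; the file names follow the ones used in this article):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example page</title>
  <link rel="stylesheet" href="style.css">
  <!-- Inline script in <head>: runs before the rest of the document is parsed,
       so keep it small (e.g. an analytics bootstrap snippet). -->
  <script>
    console.log('inline script executed early');
  </script>
</head>
<body>
  <h1>Content rendered from HTML</h1>
  <!-- External script loaded just before </body> (the recommended spot):
       the HTML above is already parsed when script.js runs. -->
  <script src="script.js"></script>
</body>
</html>
```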

On websites built this way, the HTML structure is loaded first and then supplemented with CSS. Finally, the JS code is executed in the order in which it appears in the file – top to bottom.

Client-side rendering or server-side rendering?

JavaScript is a specific language – it can be executed on the server side (server-side) and on the browser side (client-side). Both options allow you to build a modern web application that is attractive to users and web robots.

Server-side rendering (SSR) is the most popular method of displaying web pages. The browser sends a request to the server, which responds with rendered HTML that is displayed on our screen. The response speed depends on, among others:
– internet connection,
– server location (distance from the computer from which the query was sent),
– traffic on a given page (how many inquiries are sent at the same time),
– website optimization (e.g. caching and the possibility of storing some files in the browser cache).

Each subsequent click forces the page to reload – the server sends a response containing HTML with the same elements as the previous subpage.

Client-side rendering (CSR) renders the response on the client side – most often in the web browser. In this case, in response to the browser's request, the server sends a JavaScript file, which is responsible for creating the HTML structure and adding content to it. When moving to the next subpage, the website does not have to be reloaded – JS downloads the necessary content from the server and fills in the previously rendered HTML skeleton. The server responds faster and the Internet connection is under less load.
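A minimal sketch of this model – an almost empty HTML skeleton filled in by a script that fetches JSON; the /api/products endpoint and the #app container are assumptions, not a specific implementation:

```js
// Minimal client-side rendering sketch: the server returns an almost empty
// HTML skeleton, and this script fills it with data fetched as JSON.
async function renderProducts() {
  const response = await fetch('/api/products');   // JS asks the server for data only
  const products = await response.json();

  const app = document.querySelector('#app');      // empty container in index.html
  app.innerHTML = products
    .map(p => `<article><h2>${p.name}</h2><p>${p.description}</p></article>`)
    .join('');
}

renderProducts();
```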

Client-side rendering is usually faster overall, but because all the JS code has to be executed, the first load of the website may take longer than with SSR. More importantly, rendering the view only in the browser may cause problems with effective website positioning.

Google vs. JS – a bumpy road to success

The possibilities offered by JavaScript code have had a significant impact on its popularity. JS allows, among other things, the creation of dynamic pages based on a template written in HTML + CSS and supplemented with data downloaded, for example, from databases. Moreover, the language makes it possible to manipulate templates, create additional elements and render the page "on the fly" – while the program is running. The Wix site builder worked on this principle in the first years of its existence, rendering websites from JS code.

Unfortunately, as mentioned above, this solution is not conducive to ranking high in the Google search engine. For many years the Mountain View company's crawlers were not able to analyze JavaScript pages well, which in turn made it impossible to compete in the SERPs. In recent years Google has declared improvements in this area, but the effectiveness of reading JS files is still not always satisfactory.

SEO for JavaScript pages requires, first of all, knowing how Googlebot processes JS code. For static pages, the robot's flowchart is quite simple.

Google checks whether it may enter a given address (access for robots can be blocked in the robots.txt file or by a robots meta tag in <head>), then downloads the page data – the HTML structure – and at the same time extracts all the links found in the code. Then the CSS files are read and the page is sent for indexing. This process looks different for JavaScript pages.

As you can see, the path that Googlebot has to travel is much more complicated. After downloading the HTML file, it downloads the CSS and JS code which is necessary to render the page. It then supplements it with resources from external sources (if any) and renders the page. The appearance of the page and the necessary elements of its structure are contained in the JS code, which allows you to manipulate individual fragments and adapt them to the user’s needs.

Rendering the code before it is indexed may take a long time, and Google does not guarantee that it will pick up all the information we wanted to include on the page. This has to do with the number of URLs the robot scans during the day – the so-called crawl budget. Taking Googlebot's needs into account allows it to crawl the page effectively, which translates into the site's visibility in search results.

Sketching Googlebot's path gives a clear picture of how complex positioning JavaScript-based pages is. Unfortunately, it says nothing about the risks and problems that may arise along the way.

JS pages – what does Google see?

Websites based largely on JavaScript code have always been a big challenge for Google's robots. Their level of indexation is improving, but it is still not as high as we might wish.

Google renders pages differently than the average browser and uses the page differently than a user. Its algorithm focuses primarily on the elements that are necessary to render the website. It may omit those it deems less important and, as a result, ignore them during indexing. This is problematic especially when those fragments contain the content that was meant to be the ticket to high positions in search results. Data dependent on cookies is particularly at risk of remaining invisible – if content is served on the basis of cookies, Google will probably not reach it. Another issue is the speed at which the code is executed – poor optimization may prolong the whole process and cause the robot to abandon it.

The second problem is Googlebot's passivity when visiting the website – Google does not scroll the page, does not click the way a user does, and blocks automatic video playback. Unlike other visitors, it may not reach all the prepared content and may not get a complete picture of the site.

Information on how Google renders our site can be obtained using the Mobile-Friendly Test. After the address is entered, the test downloads the indicated subpage and returns information about the rendering process, including messages about problems. A preview of the rendered page also appears on the right.

You can also check the page in Google Search Console using the URL Inspection tool. Both forms of render control provide the data necessary to introduce changes and improve the indexation of a JS-based site.

Single Page Apps – React, Vue.js and other frameworks

A strong emphasis on the speed of data delivery and the popularity of technologies used in mobile applications have resulted in the growing popularity of Single Page Applications (SPAs). This type of website has a single HTML file, which is used to render subsequent subpages. The content is downloaded dynamically from the server through JS requests.

SPAs are fast and well received by users. During the visit, the browser downloads all the static elements, and the remaining content is fetched as it is needed. Transitions between successive fragments of the page are dynamic – we avoid the moment of reloading the page. From the user's perspective it looks the same as on more traditional pages – e.g. links in the menu have a familiar form, but after clicking them the browser does not download another HTML file: it effectively stays in the basic index.html, and the content is downloaded from the database and rendered with JS.

Single Page Apps are built on the AJAX model, which allows asynchronous communication with the server and does not require the document to be refreshed with every user interaction. Pages are built with what Wikipedia calls an "application framework". The framework is responsible for the application structure, mechanism of operation, components and necessary libraries. The most popular frameworks include React, which is used to build application interfaces. Vue.js and Ember.js are also frequently used, while the framework developed and promoted by Google is Angular. The most popular frameworks allow for server-side rendering of websites (which is recommended from the perspective of easy crawling by Googlebot) and take into account the requirements of mobile browsers.

As mentioned earlier, Google cannot always cope with these types of sites, which means that positioning some JavaScript-based Single Page Apps without proper optimization may be impossible. A good example is a website that provides a lot of historical information (taken from Wikipedia) yet whose potential Google does not see.

While for a website of this type the absence from Google is not that significant, for a store it can significantly affect customer interest. Pages created with popular frameworks – React, Vue.js or Angular – allow you to introduce the elements necessary to appear in the Google index and to serve content in a way that makes it possible to compete for positions.

SEO and optimization of JS pages

The solutions enabled by JavaScript code have a positive impact on the speed of a website and how users receive it. However, getting the page in front of potential customers is only half the battle – most of them reach our URL only when they find it in Google search results.

Optimizing web pages with a lot of JavaScript code requires taking care of things that seem obvious for static pages.

Access for network robots
As mentioned earlier, Googlebot will first check whether the URL it encounters is accessible to it. This applies to all resources within it – including JS and CSS files. Google's rendering of JS pages requires full access to the code, so avoid blocking these resources in your robots.txt file.
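As a rough illustration (the paths are hypothetical), a robots.txt should allow rather than block the script and style resources needed for rendering:

```
# robots.txt – example only; the paths are hypothetical
User-agent: *
# Do NOT block the resources Googlebot needs to render the page:
# Disallow: /js/      <- avoid rules like these
# Disallow: /css/
Allow: /js/
Allow: /css/
Disallow: /admin/     # blocking genuinely private sections is still fine
```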

URLs
One of the problems encountered when optimizing a site based largely on JS is the lack of "traditional" links to the site's other pages. Googlebot only follows links placed in the href attribute of an <a> tag.
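A minimal illustration of the difference (hypothetical URL and function name):

```html
<!-- Crawlable: Googlebot follows the href of a normal <a> element -->
<a href="/category/shoes">Shoes</a>

<!-- Not crawlable: the URL only exists inside JavaScript, so the bot
     has nothing to follow (illustrative anti-pattern) -->
<span onclick="loadCategory('shoes')">Shoes</span>
```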

Another URL problem typical of sites with lots of JS is the use of URLs containing "#". The hash lets the user jump straight to a selected fragment of the document without scrolling, but Google generally ignores everything after it, so content served only under fragment URLs may never be indexed as separate pages.

One solution to these problems is the HTML5 History API. It allows you to manipulate the browser's history – e.g. change the address in the address bar and swap the content without reloading the subpage. The API is built into most frameworks (including React Router), which makes it much easier to build websites that can be positioned in Google. Note – such solutions will not work in very old browsers.
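A minimal sketch of such navigation, assuming links marked with a data-route attribute and a hypothetical renderRoute() function that swaps the content:

```js
// Sketch of SPA navigation built on the History API.
document.addEventListener('click', event => {
  const link = event.target.closest('a[data-route]');
  if (!link) return;
  event.preventDefault();

  const url = link.getAttribute('href');
  history.pushState({ url }, '', url);   // change the address bar without reloading
  renderRoute(url);                      // hypothetical function that swaps the content
});

// Handle the browser's Back/Forward buttons.
window.addEventListener('popstate', event => {
  if (event.state) renderRoute(event.state.url);
});
```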

Among other URL problems it is also worth noting the considerable potential for duplicate pages with addresses that differ only in letter case or a trailing slash. Placing a canonical link in the <head> section fixes this problem – the address it contains is unambiguous information for Google.
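For example (the address is a placeholder):

```html
<!-- In <head>: one unambiguous address for Google, regardless of
     letter case or a trailing slash in the requested URL -->
<link rel="canonical" href="https://www.example.com/blog/seo-and-javascript/">
```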

Sitemap.xml
Files containing links to all addresses within the domain – sitemaps – make indexing the site much easier. The sitemap is a list that should include every address to which we want to invite the bot.
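A minimal sitemap in the standard sitemaps.org format might look like this (the addresses and date are examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-04-12</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/seo-and-javascript/</loc>
  </url>
</urlset>
```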

Redirects and the problem of apparent 404 errors
One of the important elements of the website optimization process is catching broken URLs and redirecting them to new, correct ones. 301 and 302 redirects are performed at the server level. In the case of a Single Page App, however, this solution cannot be applied. As the Google Help Center suggests, and as Search Engine Land's testing has shown, proper use of JS code works in a similar way.

A JS redirect will send the visitor to the next subpage, but it will not return the response code typical of server-side redirects. From the point of view of presence in search results, however, this is not a problem: the redirect effectively replaces the old address with a new one that can move up in the ranking.

window.location.replace() replaces the current document with the one at the given address; the address and contents of the original document are not kept in the session history or cache. Combined with the HTML5 History API techniques mentioned earlier, it lets a SPA imitate classic redirects.
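A sketch of such a JS redirect (the addresses are examples only):

```js
// JS "redirect" inside a SPA: the old address is replaced in the session
// history, so the Back button will not return to it.
if (window.location.pathname === '/old-offer') {
  window.location.replace('/new-offer');
}
```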

Another server response code important for web robots is 404, "Not Found". On JS pages (although the problem also occurs on badly configured static pages), the absence of a document at a given address is often answered by the server with a "correct" 200 status. As a result, non-existent, empty addresses can end up in the search engine index. To overcome this problem and tell robots to look at other subpages, it is worth supplementing the code with a fragment that produces the desired response.
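One possible sketch of handling this in a SPA, assuming hypothetical fetchProduct() and renderProduct() helpers; the two variants shown (redirecting to a URL the server answers with a real 404, or injecting a noindex meta tag) follow Google's general guidance for soft 404s:

```js
// Sketch: avoiding "soft 404" responses in a SPA.
async function showProduct(id) {
  const product = await fetchProduct(id);   // hypothetical data helper
  if (!product) {
    // Option 1: go to a page the server serves with a real 404 status code
    window.location.href = '/not-found';
    // Option 2 (alternative): keep the URL but tell robots not to index it
    // const meta = document.createElement('meta');
    // meta.name = 'robots';
    // meta.content = 'noindex';
    // document.head.appendChild(meta);
    return;
  }
  renderProduct(product);                   // hypothetical rendering function
}
```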

Loading delay
Page speed is one of the factors contributing to good website visibility. Optimal delivery of content and other page elements allows for comfortable use of the website without overloading the user's internet connection. One of the most effective techniques is lazy loading – delaying the loading of certain website elements. Note – incorrect implementation may block access to important website resources and, as a result, prevent Googlebot from reaching content that is key for positioning.

When loading, priority should be given to the HTML structure that "builds the page" and to its content. Next in the queue are the graphics, which usually have the greatest impact on the amount of data downloaded. Lazy loading renders first the elements visible in the initial viewport and defers loading the fragments that only become visible after scrolling down.
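A sketch of lazy loading images with the IntersectionObserver API (the data-src convention is an assumption; modern browsers also offer the simpler loading="lazy" attribute on img tags):

```js
// Images declare their real source in data-src and are only loaded
// when they approach the viewport.
const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach(entry => {
    if (!entry.isIntersecting) return;
    const img = entry.target;
    img.src = img.dataset.src;   // start the actual download
    obs.unobserve(img);          // each image only needs to be handled once
  });
});

document.querySelectorAll('img[data-src]').forEach(img => observer.observe(img));
```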

Titles, descriptions …
You can't do effective SEO without optimizing your titles and descriptions. For Single Page Apps based on frameworks such as React, Ember or Angular, it is worth adding a module or library that allows these tags to be modified freely. For applications built on React, the most frequently chosen libraries are React Router and React Helmet. The React-based Gatsby framework is also becoming more and more popular.
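A minimal sketch of per-view title and description tags in React, assuming the react-helmet library; the component and field names are examples:

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

// Each view sets its own <title> and meta description in the document head.
function ProductPage({ product }) {
  return (
    <>
      <Helmet>
        <title>{`${product.name} – Example Store`}</title>
        <meta name="description" content={product.shortDescription} />
      </Helmet>
      <h1>{product.name}</h1>
    </>
  );
}

export default ProductPage;
```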

Testing and troubleshooting JavaScript application SEO

For many years, positioning JavaScript-based websites was barely possible. Methods of delivering content to web crawlers have developed alongside improvements in Google's ability to render and read JS sites. However, there is still a significant risk of errors when the content of our website is indexed – the solution provided by Google is not perfect.

To appear in the SERPs, you must allow Google's robots access to the site and control what content they actually see. For this purpose it is worth testing the website regularly, especially with the Google tools indicated earlier.

Dynamic rendering – ready for Googlebot

As part of improving the relationship between Googlebot and JavaScript, Google suggests in its documentation several techniques that allow JS code to be processed better. One of them is dynamic rendering.

Dynamic rendering is based on identifying the client (e.g. a browser or a web robot) and providing it with a response tailored to its technical capabilities. In practice, when the query comes from a user (a web browser), the page is rendered in the normal way – the HTML file is downloaded and the desired content is fetched from the database by a JS script. When Googlebot asks for a given URL, the server sends a pre-rendered, static HTML version of the page, which enables faster indexing of its content.

Dynamic rendering can be implemented with Rendertron, which runs as a standalone HTTP server and renders URLs into a form readable by bots that do not execute JavaScript correctly. Rendertron can cache the rendered pages, which significantly speeds up responses to the bot; the cached data is refreshed automatically at intervals we specify.
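A rough sketch of how such user-agent-based switching could look on an Express server that proxies bots to a separately running Rendertron instance; the bot list, addresses and port are assumptions for illustration, not a recipe from Google's documentation:

```js
const express = require('express');
const fetch = require('node-fetch');

const app = express();
const RENDERTRON_URL = 'http://localhost:3000/render';        // hypothetical instance
const BOT_AGENTS = /googlebot|bingbot|yandex|baiduspider/i;   // simplified bot list

app.use(async (req, res, next) => {
  const userAgent = req.headers['user-agent'] || '';
  if (!BOT_AGENTS.test(userAgent)) return next();             // humans get the normal SPA

  // Bots receive pre-rendered static HTML fetched from the rendering service.
  const target = `${RENDERTRON_URL}/https://www.example.com${req.originalUrl}`;
  const rendered = await fetch(target);
  res.send(await rendered.text());
});

app.use(express.static('dist'));                              // the regular SPA build
app.listen(8080);
```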

Pre-rendered documents are also useful for other clients – for example, they provide content better suited to the screen readers used by visually impaired users.

SEO and JavaScript – summary

The growing emphasis on the speed of serving content will certainly result in a further increase in the popularity of the JavaScript language. Google also takes this into account and is constantly working to improve the indexing of the content accessible with the help of JS. Appropriate optimization and the use of bot-friendly solutions are the key to high positions in search results, even for Single Page App and other JS-based sites.

SEO for the company website

The process of improving the visibility of your site in the search engine is called page positioning. Companies from almost every industry compete to have their pages appear in the highest positions in Google's organic search results. What does the whole process involve and what are the benefits of positioning a company website?

Several years ago, any mention of a business on the Internet counted as innovative. Even underdeveloped websites looked modern, but few companies and consumers appreciated the value of the web. Now having a website is essential. For several years a race has been going on for the highest positions in the TOP10 results of the most popular search engine – Google. Among the billions of websites on the web, everyone wants to stand out and appear on the first page of results when a user asks a question.

SEO for a company website consists of many factors, stages and optimization activities performed both on the website and beyond it. The process requires an individual strategy, constant analysis of variables and adjusting optimization activities to the current guidelines.

How does the Google search engine work?

When you enter a query in the Google search engine, you get from thousands to millions of websites that potentially contain information relevant to you. It is Google that decides what is displayed in the search results, but everyone can help it along by positioning their company page.

Indexing
Google uses crawlers to organize information from websites. They analyze websites, open the links they contain, and collect and send data about these websites to Google's servers. The systems scan the content of the pages, and the information obtained is registered in the search index.

Algorithms
Google’s ranking systems analyze billions of websites thanks to algorithms. After entering a query into the search engine window, you get certain results – however, many factors affect the display of certain pages, including:
– content of the inquiry,
– the relevance of the page,
– source value,
– user location,
– as well as about 200 other ranking factors.

Google’s mission is to organize the world’s information in such a way that it becomes widely available and useful for everyone. Hundreds of new websites are published on the web every second. Google handles billions of queries annually. In the maze of this information, the company, as it emphasizes, tries to protect the search results from dishonest companies and spammers who try to bypass individual systems and algorithms. On the other hand, however, they try to help business owners succeed by providing a variety of SEO tools and tips.

Goals of website SEO in the Google search engine

For many years Google has ranked first among the world's most popular search engines. Business owners therefore decide to position their company website on Google when they want to increase its visibility in search results and are aware of how many benefits this can bring:
– increasing traffic on the website,
– more conversions (e.g. purchases or registrations),
– brand recognition,
– building awareness among consumers,
– creating the image of an expert and market leader.

The website positioning process therefore pursues both sales and image goals. It is worth emphasizing that online activities also translate into life outside the Internet. For example: a properly positioned restaurant website may appear first in Google for the query "Boston restaurant". A user, even without visiting the website, decides to trust the search engine and goes to that restaurant, turning from a user into a real customer. Such positioning effects, although hard to measure, underline the essence of the whole process and confirm that it is worth investing in SEO and promoting the company on the Internet.

A multi-stage process of website SEO

The activities related to optimizing a website for search results (called Search Engine Optimization – SEO) include optimization of the content, structure and code of the website. Positioning, apart from the aforementioned SEO, also includes content marketing, link profile building and web analytics.

The process consists of a series of works on the website, which are used to prepare the website for cooperation with the previously mentioned Google robots.

Website audit
The first step in the SEO process should be a website SEO audit. The verification and analysis should apply to both the technical side and the content available on the website. Information relating to organic traffic on individual subpages can be collected using Google Search Console and Google Analytics.

Based on the audit, a website positioning strategy is created. The most important thing is to keep the company's business and marketing goals in mind throughout the cooperation, without forgetting the rules that prevail in the SEO world.

Key words or phrases
Another of the initial stages of positioning a website is the selection of appropriate keywords and phrases. We select the phrases for which the positioned website should appear in the results when a user types them into the search engine. The words or phrases must be consistent with the nature of the company and with the expectations of Internet users who are looking for information about products, services or issues that interest them.

Company website optimization

One of the key elements of positioning is website optimization, i.e. SEO. At this stage, the following occurs:
– domain link profile verification,
– developing content to be completed on the website,
– indexation of the page in the search engine,
– optimization of headers and metadata or execution of appropriate redirects,
– improvement of technical elements within the website (such as finding and removing broken subpages, using the 404 error in a positive way, optimizing the size of graphic files loading the page, etc.).

Verification and development of content for a website is an extremely important process – it is necessary to identify whether the website has internal duplicates and whether such duplicates do not appear on other websites. You should also verify that the content available on the website is unique and properly saturated with key phrases. It is therefore important to create or edit the content on the home page, product and category descriptions, and build valuable content. It is not only texts, but also graphics and photos. In cooperation with marketing specialists, you can optimize a website in such a way that it is valuable not only for Google robots, but above all for users and potential customers of a given company. Both Google’s guidelines and the usability and appearance of the website are taken into account.

An equally important element is the aforementioned optimization of headers and meta data (title and description) both on the main page and on individual subpages. Their proper use allows Google robots to find a given subpage and analyze the data contained therein. Meta title and meta description are displayed in the search results, showing users the topic and content of the page.

However, many technical aspects, such as the optimization of ALT attributes or HTTP headers, are invisible from the user's perspective. Optimization may also cover, among others:
– page loading speed,
– responsiveness,
– redirects,
– SSL certificate,
– website code.

The work then focuses on improving the internal linking structure, which is one of the highly regarded SEO practices. This properly performed optimization stage helps to direct the attention of Google robots to the most important content on the page.

SEO for the company website – a one-off effort or a marathon?

Looking at the components of website positioning listed above, SEO cannot and should not be a one-time action. If the work is to bring the intended results, it requires thorough analysis, diligently carried-out activities and constant monitoring of the effects. Google's guidelines, algorithms, systems and even the market change so dynamically that in order to keep a position you have earned, you have to keep working on it. Time is needed for proper indexing, for planning and creating content, and for building a link profile.

Google offers business owners many opportunities to grow their businesses. It provides tools that make promoting your own business even more effective.

Google My Business

For example, a company profile in Google Maps supports local positioning, improves the website's visibility in search results and confirms the credibility of the company. It is a free tool that lets you inform users about the company's opening days and hours, as well as its physical address and website.

Online advertising is a huge support for marketing activities. Positioning the company website is one of the most powerful advertising tools which, used properly, can bring great success to almost any company.

Website positioning

Is website positioning necessary for a site to be found on the Internet? It is not – as long as you allow web crawlers onto your site, you will be drawn into the gears of the huge machine known as Google. However, simply being in the search engine's database does not bring traffic, and those interested in your content or services will not quickly find it where you would like to invite them.

Hey Google, I’m here!

Once Google notices our website, we need to take steps to highlight the elements that its users may find useful. Tips on how to encourage web crawlers to visit our site can mostly be found in Google's guides and webmaster manuals.

The basic activities focus on making your website available and allowing traffic on it – both from machines and from humans. Then it is worth indicating the area the search engine should associate our site with – e.g. by including keywords related to the industry. It is also important to set up access to the place where Google stores everything about our domain – Google Search Console. And then… a more advanced part begins, which is best discussed with a specialist!

But first, let’s focus on the basics and start positioning our site on Google!

Hey Googlebot, you’ve got Allow here

Positioning your site should start with admitting web robots to it. For longtime SEO enthusiasts this is obvious, but for people who have not dealt with robots before, this issue may be interesting.

Internet crawler / indexer – a program whose task is to collect information about everything that is available on the Internet. There are different types of robots that specialize in collecting various data.

Googlebot performs a crawl – it follows the links on pages and indexes their content. The frequency of Googlebot's visits varies – they may be repeated every few hours or every few days. Without crawling, a page cannot be displayed in search results.

Most web robots obey the rules a website imposes on them. This information is contained in the robots.txt file and in the "robots" meta tag located in the HEAD section of the page's code.
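For illustration, this is roughly what both mechanisms look like when a site is closed to robots (example directives only):

```
# robots.txt – blocking all robots from an unfinished site
User-agent: *
Disallow: /
```

The same intent can be expressed in the page's <head>:

```html
<meta name="robots" content="noindex, nofollow">
```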

Why is this information so important? When a page is being built, the ability to index the site is usually disabled, preventing robots from entering it. Such action makes sense – an unfinished page, full of "lorem ipsum" or pictures of cute kittens serving as placeholders, should not turn up in search results by accident and mislead a potential customer. When the website goes live, the guidelines for web robots should be changed back, but this is not always remembered.

These types of mistakes are not uncommon and do not only happen to small players. In 2009, a Skoda promotional campaign in which the Internet search engine played one of the key roles was widely discussed. The TV commercial encouraged viewers to search for "Yeti kidnaps". The problem? Access to the content on the page was blocked for robots in the robots.txt file.

The result description was replaced by the automatic message that appears when crawlers are not allowed to access the content. This situation is not conducive to website positioning, and the popularity of the slogan was exploited by the brand's competitors, who quickly outranked the TV-advertised website in Google.

Titles and descriptions

Page unblocked? Time for a title and description! The <head> elements – title and description – are the first fragment of your website that users encounter.

Both elements are important, although the description (meta description) is assumed to matter less from the SEO point of view. The title not only appears in search results, it also appears at the top of the browser tab.

The <title> element should contain information relevant to web robots and users. Keywords assigned to a given subpage have the highest priority here; other information may follow them – for example, for industries such as road assistance it will be a contact phone number.

The title should be roughly 65-100 characters long. Both values are approximate – what really matters is the width of the title in pixels. Google displays titles up to about 460-600 pixels (depending on the device) and cuts off anything longer, leaving an ellipsis. To make sure that the most important part of your title – your keyword – fits in the result, place it right at the beginning.

With <meta name="description" content="…"> things are a bit different – its main goal is to encourage the search engine user to visit the site. When preparing a description, it is worth focusing on the aspects that make the website stand out from the competition and using the right selling points – for online stores, e.g. information about free or immediate shipping. A call to action should also be built into the description – a phrase that encourages the user to act: "Check now!", "Find out more!" and so on…

The description is limited to approx. 160 characters. When it is too long, Google trims it to the appropriate size, leaving three dots at the end. The content of the description is available in the code of the website but is not displayed directly on it – its main purpose is to present the website in search results.
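An illustrative <head> fragment putting the above rules together (the business and wording are made up, following the road-assistance example used earlier):

```html
<head>
  <!-- Keyword first, then supplementary information -->
  <title>Road assistance Boston – 24/7 towing | Example Company</title>
  <!-- Roughly 160 characters, with a call to action at the end -->
  <meta name="description" content="Fast road assistance in Boston and the whole metro area. Free arrival within 30 minutes. Check now!">
</head>
```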

Analyzing titles and descriptions, and changing them if necessary, is one of the first steps a positioner takes when positioning a website. These changes do not require technical skills, and in popular CMSs they come down to placing the appropriate content in dedicated fields, usually labelled "title, description for search engines".

Introduce yourself… or else Google will do it

In the absence of a title and description, Google decides what will be displayed instead. Unfortunately, Google sometimes takes over even when both tags are properly filled in. In such cases, fragments of text from the page containing the words and phrases searched for by the user appear in the SERPs.

The lack of properly prepared title and description prevents the search engine from showing the most important keywords and finding the site in the right places in the search results. This hinders the procedure of positioning a website and associating it with a specific topic.

The presentation of topics and phrases important to our website and business also takes place through appropriate content. Content is as important to robots as it is to the more human users of the site, and each group has its own preferences as to how it is presented. Robots like a well-described structure – texts with different headings: h1 as the title and h2, h3, … as subheadings; subsequent paragraphs placed in <p> tags, and the most important elements (keywords!) highlighted with <strong>. Describing the content additionally with structured data makes it more likely that the robot will parse the page quickly and correctly. The web user, on the other hand, will appreciate the structure of the text – its length, font size, division into paragraphs, clarity of graphics and all the other elements that help speed up the consumption and assimilation of the content.
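A short illustration of such a structure (the texts are placeholders):

```html
<article>
  <h1>Website positioning – where to start?</h1>
  <p>Introductory paragraph with the most important <strong>keyword</strong>.</p>

  <h2>Titles and descriptions</h2>
  <p>Subsequent paragraphs sit in their own &lt;p&gt; tags...</p>

  <h3>Meta description</h3>
  <p>...and lower-level subheadings break long sections into digestible parts.</p>
</article>
```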

Both aspects can be neatly combined, which usually has a positive effect on the visibility of the site in search results. Well-written and well-marked-up text is popular among website visitors, who sometimes even pass it on. This also stimulates its popularity among web robots, which translates directly into more frequent visits to the website and higher positions.

What is website positioning?

Positioning is primarily about emphasizing the advantages of a website while eliminating its disadvantages, in order to achieve the highest possible position in the search engine. Website optimization is nothing more than adapting to the rules that prevail in the Google search engine environment. It is these rules that determine the order of the results displayed for a query.

Sounds simple and logical? Unfortunately, it is not – the basic steps described above, which get you into the Google index, can be compared to writing your first letters while learning to write: the title of calligraphy master is still a long way off. After the website is published and made available, the following actions should be taken:

– optimization of its operation in terms of speed and efficiency,
– improvement of functionality, elimination of technical errors,
– appropriate description of the content of the positioned website – also using structured data,
– content analysis, creation and modification,
– activities aimed at extending the database of links pointing to our website,
– analysis of the effectiveness of these activities, and in the case of low effectiveness – changing the approach and building an improved strategy.

Positioning a website is a process that depends on many factors, including not only changes in the search engine algorithm, but also the specifics of the industry in which the promoted business operates and the budget. Reaching the top of the SERPs requires consistency and constant control – the search engine is an extremely lively organism that is constantly being modified. This creates uncertainty about the stability of positions, which – in the absence of appropriate and prudent actions – may result in spectacular slips.

In the case of positioning, it is worth relying on current knowledge and experience that allows you to develop an effective and safe strategy.

Google: new link attributes rel="sponsored" and rel="ugc"

If anyone ever thought that a positioner's work is boring… our whole Paraphrase-Online team can assure you that this person is very wrong. On September 10, 2019, we were reminded of it once again – with the appearance of a Twitter post:

"Hundreds of hours of consideration, months of preparation, we finally arrived. https://t.co/dhJJmdal4J"

and a post on the official Google blog.

New Link Attributes

After years of the reign of rel="nofollow", which until now allowed us to limit the transfer of our domain's authority, Google is introducing new ways of marking outbound links: rel="sponsored" and rel="ugc".

rel="sponsored" – an attribute for links related to broadly understood advertising (sponsored and paid placements),

rel="ugc" – an attribute for "user generated content" links, i.e. links created by users on forums, in comments and in other such places,

rel="nofollow" – an attribute that can be used in cases not covered by the two above, when you do not want to vouch for the linked page or pass ranking signals to it.
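For illustration, this is how the attributes might be applied in practice (the addresses are placeholders; Google also allows combining several values):

```html
<!-- Paid / sponsored placement -->
<a href="https://advertiser.example.com" rel="sponsored">Partner offer</a>

<!-- Link added by users, e.g. in a comment or forum post -->
<a href="https://userblog.example.com" rel="ugc">My blog</a>

<!-- Any other link you do not want to vouch for -->
<a href="https://unverified.example.com" rel="nofollow">Some page</a>

<!-- Attributes can also be combined -->
<a href="https://advertiser.example.com" rel="sponsored nofollow">Ad</a>
```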

This last point also signals a change in how links with the rel="nofollow" attribute are perceived. Until now, according to Google, nofollow links were not taken into account by the search engine algorithm. With the introduction of the new attributes this is about to change: nofollow, sponsored and ugc are to be treated as hints which, along with other factors, help determine a page's ranking.

As Google emphasizes – and we have long thought the same, suspecting that "nofollow" has always worked this way – every link to a page carries information and testifies to its value.

Practical changes

Google answers the most important doubts later in the post – introducing the new attributes does not require modifying existing "nofollow" links at this time. However, the post emphasizes that every link that is an ad should carry the "sponsored" attribute. There is a risk that the changes may cause turbulence among websites offering links – depending on how the websites that provide space for links start marking them and, of course, how Google uses the attribute hints in practice.

Breaking the types of links down into smaller subgroups will definitely allow Google to analyze links more accurately and determine their value (and legitimacy). The main role here is played by "sponsored", which will indicate whether, for example, the site provides links for a fee.

A further-reaching analysis suggests that "user generated content" links can be much more beneficial for a site – especially if they appear in places where a group of enthusiasts and experts discuss and exchange links to sites they recommend.

It is worth paying attention to one more element of the Google blog entry – in the last paragraphs it reminds us that we can indicate to robots which links pointing to our site should be ignored. With "nofollow" now perceived as a hint rather than a strict directive, there is a risk from spammy links. With this in mind, it is worth scanning your links and considering refreshing your disavow file. Google gives us time until March 1, 2020 – when "nofollow" officially becomes only (or at most?) a hint.

Positioning of websites without limits of phrases – possible?

Is it possible to sign a contract for website positioning that has no limitations on the number of phrases, the frequency of changing them, its duration or the notice period? Is such a "no limits" SEO contract model achievable?

Positioning of websites without any limit

It is worth taking a closer look at those elements of a positioning contract that most often come up in questions about the lack of limits. We are fully aware that other agencies and people providing positioning services may start from different assumptions than ours and have different observations. We therefore want to point out at the outset that this text presents our point of view and the practices used at Paraphrase-Online.com. The subject itself is also a natural continuation of an earlier entry on our blog, which explained one of the most important issues related to the contract. We will therefore not deal with the contract period again, focusing instead on the questions related to phrases.

How many phrases will fit in the positioning agreement?

In this respect we are very flexible – depending on the chosen contract package, we can position from 15 to … 1,300 phrases for a given website. When we write about positioning a specific number of phrases, we mean the phrases whose positions are monitored daily, with the results appearing in the client's panel. This number of daily monitored phrases is recorded in the contract. In positioning as such there are no limits – Google determines how many phrases a given site appears for. That is why we also check how many keywords for a given website appear in the top 10 and top 50 overall.

Our account managers advise on putting together an optimal set of phrases – in terms of both quantity and quality. We know from experience that the days of small (in terms of the number of phrases) SEO contracts are passing. Customers who started with a modest variant of 5 keywords often expand their positioning contract by adding new phrases. We will not violate any confidentiality if we write that one of our clients started positioning with 5 phrases and today has more than ten times as many in the contract. This step-by-step decision to expand the set of keywords came from the conviction that a small set of keywords brings fewer benefits, because it presents only a portion of the offer.

Specifying in the positioning contract the number of phrases checked daily for position, and the number of subpages optimized and monitored for security, is the result of everyday contact with our clients, for whom such a solution makes it much easier to evaluate the positioner's work.

Why have 15 phrases become our quantitative minimum for positioning? Observations and daily contact with our clients decided. A set of 15 phrases allows for a varied presentation of the offer, contains both more general and more specific phrases, and something with a long tail can fit in here too. Just 15 phrases let you feel that positioning works and brings a specific benefit to the company that decided to reach for this form of promotion.

Of course, a set of 15 phrases will not qualify for every company as "a reasonable minimum that allows you to feel the benefits of SEO." In the case of online stores, this threshold will be much higher.

Website positioning – how many phrases can fit on the website?

In the case of a contract for positioning websites, the selection of key phrases must be correlated with the content of the website we want to promote. On the one hand, this is because you can effectively position only what has its own representation on the website (available to Google's web robots). In practice this means there is no positioning "in advance" – for something that will only appear in the offer later and is not yet mentioned anywhere on the site. Google simply will not display, for a given query, a site that says nothing about it.

The second thing is the size of the site itself. It is no secret that Google prefers expanded websites (in terms of the number of subpages). It treats a multitude of subpages as a signal that it is dealing with a content-rich site. For this reason, one-page sites are not a good starting point for the positioning process. Fortunately, the remedy is quite simple and easily accessible – extending the site by adding new subpages and content.

The number of keywords and subpages must be in balance. It is not good for the effectiveness of SEO to, for example, direct all the phrases from the contract to the home page. If the positioning specialist notices that the site has a problem accommodating all the selected phrases, they will suggest extending the website. At Paraphrase-Online.com we assume that if our specialists suggest expanding the website for the sake of SEO effectiveness, this does not entail any additional costs for the client.

Change or not change – that is the question!

Positioning is a school of patience. The effects rarely appear overnight; the results of work on keywords usually come gradually.

Working on specific keywords is a process divided into stages and counted in months. Changing the keywords means stopping that process and starting over. Frequent changes of keywords mean that the waiting time for positioning results can grow significantly. A change of the keyword set should therefore be reserved for justified circumstances, such as a change in the offer or in the content of the website. It is also worth consulting such plans with your account manager and the SEO specialist who looks after the website.

There is also no advantage in a strategy in which keywords that have already been positioned are swapped for new ones – in the hope that those already in the top 10 will stay there permanently and the positioner is no longer needed, so why overpay… Such temptations sometimes appear with pay-for-performance contracts and are meant to be a "smart" way of reducing the cost of the positioning service.

No limits or … with limits?

Let's be honest, the slogan "SEO without limits" sounds very encouraging. Since there are no limits, the competition cannot offer more. But how does it look in practice?

The natural limit for SEO is the relationship between the number of phrases and the size of the website and the offer it presents. A small site with a limited offer (not everyone has to run a big business) gives us only a limited number of phrases for which it can appear. It is not possible to optimize one subpage for an unlimited number of phrases – the length of the meta title alone imposes a limit. We therefore state this clearly in our contract and inform our clients about the limitations of the positioning service. We are convinced that such a realistic approach is worth more than a bare "no limits" slogan. A targeted optimization of a 10-subpage website for 1,000 phrases is simply not possible. We specify in the offer and in the contract how many phrases we monitor daily and how many subpages we specially optimize for keywords. At the same time we strengthen the entire website, which is reflected in the statistics – the total number of phrases in the top 10 and top 50 associated with the site.
