JavaScript and SEO

Ever since its introduction in 1995, JavaScript has been transforming the way people interact with the web. It has come to be regarded as the go-to tool for engaging users in the most interactive fashion. This is a massive advantage for people who arrive at a website by typing its address directly into the browser, or through referral traffic from sources like social media. However, when it comes to performance on search engines, JavaScript has always taken a back seat.

It is indeed a thing of wonder, because one of the most prominent JavaScript frameworks, AngularJS, is the brainchild of Google, and the search engine giant has been instrumental in creating such a robust and flexible framework for the development of single-page applications. Still, there is no point in pondering why something has not happened. A better approach is to find ways to circumvent the problem and make JavaScript applications friendly to search queries.

To understand how JavaScript search engine optimisation works, it is important to first understand what exactly search engine optimisation is and how the entire process happens.

What is SEO?

Search engine optimisation is the art and science of ranking your web page at the top of the search results for relevant searches on search engines like Google and Bing. All search engines use a combination of techniques to determine which web page should be listed first for a particular search query. Many parameters come into play, including but not limited to the content, the code, and the number of links from external sites that point to a particular site. These external links act as democratic votes in determining the value and popularity of a page.

How Does It Happen?

When a new page is created, the search engine spiders crawl it and take note of the URL and the code. The code is then processed and stored as metadata, which contains information about the theme of the content on the web page. Once all the information is gathered, it is indexed on servers, ready to be presented as search results.
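The crawl → process → index pipeline described above can be sketched in a few lines of JavaScript. This is a deliberately naive toy: the regex-based extraction and the word-to-URL index are illustrative assumptions, nothing like the full HTML parsing and ranking signals real search engines use.

```javascript
// Toy sketch of the crawl -> process -> index pipeline.
// Extraction is regex-based and illustrative only.
function processPage(url, html) {
  const title = (html.match(/<title>(.*?)<\/title>/i) || [])[1] || '';
  const description =
    (html.match(/<meta\s+name="description"\s+content="(.*?)"/i) || [])[1] || '';
  return { url, title, description };
}

function indexPages(pages) {
  // Map each word found in the page metadata to the URLs that mention it.
  const index = new Map();
  for (const page of pages) {
    const words = (page.title + ' ' + page.description)
      .toLowerCase()
      .match(/[a-z]+/g) || [];
    for (const word of new Set(words)) {
      if (!index.has(word)) index.set(word, []);
      index.get(word).push(page.url);
    }
  }
  return index;
}

// Usage: "crawl" one hypothetical page, then look a keyword up.
const page = processPage(
  'https://example.com/',
  '<html><head><title>Coffee Guide</title>' +
  '<meta name="description" content="Brewing great coffee at home"></head></html>'
);
const index = indexPages([page]);
console.log(index.get('coffee')); // -> [ 'https://example.com/' ]
```

Notice that the spider only ever sees what is in the served HTML, which is exactly where JavaScript-rendered pages run into trouble, as the next section explains.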

The JavaScript Jinx

There is one challenge with JavaScript in this entire process. As we all know, most JavaScript frameworks are designed to render on the client side. This means the page might not contain much relevant information when it first loads; the relevant information appears only when the user interacts with the page. Therefore, when search engine spiders crawl the page, they might not be able to fully grasp what is actually on it. The challenge is to ensure that Google gets the information about what the page is about while, at the same time, not compromising the quality of interaction for the user. This challenge can be addressed by server-side rendering or pre-rendering.
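To make the problem concrete, here is a hypothetical HTML shell of the kind a client-rendered SPA typically serves, together with a crude tag-stripper that approximates what a non-JavaScript crawler can extract from it:

```javascript
// The HTML shell a typical client-rendered SPA serves: the markup below is a
// made-up example, not taken from any specific framework.
const spaShell = `
<!DOCTYPE html>
<html>
  <head><title>My App</title></head>
  <body>
    <div id="root"></div>
    <script src="/bundle.js"></script>
  </body>
</html>`;

// A crawler that does not execute JavaScript sees only the text already
// present in the HTML -- strip the tags and see what is left.
function visibleText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, '')
    .replace(/<!--[\s\S]*?-->/g, '')
    .replace(/<[^>]+>/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();
}

console.log(visibleText(spaShell)); // -> "My App" (nothing from the body)
```

All the real content would have been injected into `#root` by `bundle.js`, so a crawler that skips script execution indexes little more than the title.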

The Way Out

Pre-rendering basically tells search engines, "this is the content that will be displayed to the user after a certain pattern of interaction takes place." In essence, you are revealing to the search engine what the user is likely to experience. This does not affect the user in any way: you retain the way a user interacts with your page while, at the same time, informing search engines about what the page contains.
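One common way to wire this up is to serve a static snapshot to known crawlers and the normal SPA shell to everyone else. The sketch below assumes a hypothetical snapshot store and a small, illustrative list of bot user-agent patterns; real deployments usually delegate both to a prerendering service.

```javascript
// Minimal sketch of pre-rendering: crawlers get a static HTML snapshot,
// regular browsers get the usual SPA shell. Bot patterns are illustrative.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /baiduspider/i, /yandex/i];

function isCrawler(userAgent) {
  return BOT_PATTERNS.some((re) => re.test(userAgent || ''));
}

// Hypothetical store of prerendered snapshots, keyed by URL path.
const snapshots = {
  '/products': '<html><body><h1>Products</h1><p>Full product list</p></body></html>',
};

function selectResponse(path, userAgent, spaShell) {
  if (isCrawler(userAgent) && snapshots[path]) {
    return { body: snapshots[path], prerendered: true };
  }
  return { body: spaShell, prerendered: false };
}

// Usage: Googlebot gets the snapshot, a regular browser gets the shell.
const shell = '<html><body><div id="root"></div></body></html>';
console.log(selectResponse('/products', 'Mozilla/5.0 (compatible; Googlebot/2.1)', shell).prerendered); // true
console.log(selectResponse('/products', 'Mozilla/5.0 (Windows NT 10.0)', shell).prerendered);           // false
```

The user's experience is untouched, because the branch only fires for crawler user agents.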

The basics

For all the JavaScript involved, it cannot be denied that HTML integrates seamlessly with any JavaScript framework. Therefore, if you write the right titles and meta descriptions, they stand a good chance of being picked up by the search engine spiders, which greatly helps the rankings of your web page. In addition, using unique snippets also helps in bringing quality visits to the web page.
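Whatever framework renders the rest of the page, the title and meta description can be emitted as plain HTML per page. A minimal sketch, with made-up page data:

```javascript
// Build the SEO-relevant part of a page's <head> from per-page data.
// The field names here are illustrative assumptions.
function renderHead({ title, description }) {
  // Escape characters that would break the markup or the attribute value.
  const esc = (s) =>
    s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/"/g, '&quot;');
  return [
    `<title>${esc(title)}</title>`,
    `<meta name="description" content="${esc(description)}">`,
  ].join('\n');
}

console.log(renderHead({
  title: 'Handmade Mugs | Example Shop',
  description: 'Browse our range of handmade ceramic mugs.',
}));
```

Keeping these tags unique per page is what lets crawlers distinguish one URL from another, even when the body is rendered by JavaScript.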

It is well known that the weight of a page also counts against it when it comes to performance on search engines. Therefore, it is important to keep your web page as light as possible, irrespective of the technology you have used to build it.

Conclusion

To add a splash of pessimism: an innovation is at its crudest when something cannot be accommodated, and that is currently the case with JavaScript and its frameworks. It may simply be that search engines have not yet evolved enough to work out what JavaScript applications intend to present. Quite possibly, in the near future, JavaScript will be the order of the day and, for all you know, the technology most preferred by search engines.

SEO For React

Search Engine Optimization (SEO) is a process of structuring and organizing your website to bring a better quality and quantity of traffic to your website by ranking higher in the search engine results, focusing on the specific keywords related to your business. The main aim of performing the SEO process is to gain more visibility on the internet and drive more traffic to your website.

Search engines today rely on crawling the content that is put on your websites. Since this process is automated, it is extremely important that the content is structured and formatted in a manner that is understandable by machines. The SEO process involves optimizing the website performance and also curating the content to provide hints to the search engine crawlers to easily understand your website.

This may sound fairly straightforward, but for websites built on React, this may not always be the case. Let’s have a look at the reasons why.

Challenges with SEO for React Websites

a. The use of React-based Single Page Applications

Since the start of the World Wide Web, websites have worked in such a way that the browser needs to request each page in full. To put it simply, the server generates the HTML and sends it back on every request. Even if the next page shares many of the same elements (menu, sidebar, footer, …), the browser still asks for the entire new page and re-renders everything from the ground up. This causes redundant information flow and additional load on the server, which must fully render each page instead of supplying only the required information.

With the advancement of technology, the demand for faster loading times increased. To resolve the issue of full page loads, developer communities came up with JS-based Single Page Applications, or SPAs. These websites do not reload the whole content of the page; instead, they refresh only the content that differs from the current page. Such an application improves the performance of the website drastically, as the amount of data being transferred is reduced. One good example of a technology that can be used to create SPAs is ReactJS, which also optimizes the way content is rendered in the user's browser.

b. Main SEO issues with SPA

Although SPAs improve website performance to a considerable extent, there are certain inherent issues in this type of setup when it comes to SEO.

Lack of dynamic SEO tags

An SPA loads data dynamically into selected segments of the web page. Hence, when a crawler follows a particular link, it does not detect a complete page-load cycle, and the metadata that is in place for the search engine does not get refreshed. As a result, the search engine crawler cannot see your single-page app's content and will index it as an empty page, which is not good for SEO purposes.
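The fix on the client side is to update the SEO tags explicitly on every route change. The sketch below passes the document object in as a parameter so the idea can be demonstrated (and tested) outside a browser; in a real SPA you would pass `window.document`, or reach for a library such as react-helmet.

```javascript
// Keep title and meta description in sync on client-side navigation.
// `doc` is injected so this runs outside a browser too.
function applySeoTags(doc, { title, description }) {
  doc.title = title;
  let meta = doc.querySelector('meta[name="description"]');
  if (!meta) {
    meta = doc.createElement('meta');
    meta.setAttribute('name', 'description');
    doc.head.appendChild(meta);
  }
  meta.setAttribute('content', description);
}

// Minimal stand-in for the browser document, for demonstration only.
function makeFakeDocument() {
  const head = { children: [], appendChild(el) { this.children.push(el); } };
  return {
    title: '',
    head,
    querySelector(sel) {
      return head.children.find((el) => sel === `meta[name="${el.attrs.name}"]`) || null;
    },
    createElement(tag) {
      return {
        tag, attrs: {},
        setAttribute(k, v) { this.attrs[k] = v; },
        getAttribute(k) { return this.attrs[k]; },
      };
    },
  };
}

// Usage: each simulated route change refreshes the tags in place.
const doc = makeFakeDocument();
applySeoTags(doc, { title: 'Checkout', description: 'Secure checkout page.' });
console.log(doc.title); // -> "Checkout"
```

Without a step like this, every route in the SPA presents the same stale metadata to whatever the crawler manages to capture.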

Programmers can solve this problem by creating separate pages (most often plain HTML pages) for the search bots, and at the same time working with webmaster tools to get the corresponding content indexed. However, this increases business expenses, owing to the cost of developing the additional pages, and also makes it harder to rank the website higher on search engines.

Search Engines may or may not run your JavaScript

Every Single Page Application relies on JavaScript to dynamically load content into the different modules of a webpage. A search engine crawler or bot might avoid executing that JavaScript: it fetches only the content that is available without running JavaScript, and indexes your website based on that content alone.

Google made an announcement in October 2015 mentioning that they would crawl the JS and CSS on websites as long as the crawlers are allowed to access them. This announcement sounds positive, but relying on it is risky. Although Google's crawlers are smarter today and do execute JavaScript, one cannot make decisions based solely on a single search engine. Other search engines, such as Yahoo, Bing, and Baidu, have crawlers that see these sites as blank pages without JavaScript. To resolve this, one needs a workaround that renders the content server-side, giving the crawlers something to read.

Solutions for SEO of React-based SPA websites

There are two major ways to resolve the SEO issues faced by React-based SPA websites.

1. Isomorphic React

2. Prerendering

a. How does Isomorphic React help in SEO?

A React website built on isomorphic JavaScript can automatically detect whether JavaScript is disabled on the client side. In a scenario where JavaScript is disabled, the isomorphic JavaScript runs on the server side and sends the final content to the client, so all the necessary attributes and content are available when the page loads. If JavaScript is enabled, the site performs as a dynamic application with multiple components. This provides faster loading than traditional websites, wider compatibility with older browsers and different crawlers, a smoother user experience, and the features of a Single Page Application as well.
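The core idea can be shown with a toy sketch: the same render function runs on the server to produce crawlable HTML up front, and the initial state is embedded so client-side code could later take over interactivity. This mimics what `ReactDOMServer.renderToString` does in a real React app, but deliberately avoids depending on React itself so it stays self-contained; the component and page names are made up.

```javascript
// A "component" in miniature: a pure function from props to markup,
// standing in for a React component rendered with renderToString.
function ProductPage(props) {
  return `<h1>${props.name}</h1><p>Price: $${props.price}</p>`;
}

function renderFullPage(component, props) {
  // The server embeds both the rendered HTML and the initial state, so
  // client-side code can pick the page up without re-fetching the data.
  return [
    '<!DOCTYPE html><html><body>',
    `<div id="root">${component(props)}</div>`,
    `<script>window.__INITIAL_STATE__ = ${JSON.stringify(props)}</script>`,
    '</body></html>',
  ].join('');
}

const html = renderFullPage(ProductPage, { name: 'Ceramic Mug', price: 18 });
console.log(html.includes('<h1>Ceramic Mug</h1>')); // true: crawlers see real content
```

Because the meaningful content is present in the very first response, even a crawler that never executes JavaScript has something substantial to index.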