What is SEO?
In simple words, SEO (Search Engine Optimization) is the process of improving your website's visibility in search engines by optimizing for keywords that have good ranking potential and search volume (by search traffic). Keywords are the specific search terms that people type in when they search the web. Pages are ranked for these terms according to their content quality (blogs, websites, etc.) and UX (user experience).
Google is the most popular search engine, and it gives marketers and bloggers a common way to analyse keywords and their rankings, and to prepare content for their websites accordingly.
However, keywords are slowly becoming less relevant, since search engine algorithms are now capable of filtering quality content from bad content. This is why your website's code matters a lot.
Algorithms today rank websites based on content quality and search intent as well, making these signals more relevant than keywords alone (which are now comparatively less relevant). Search intent is the underlying goal behind a user's search — whether they want information, a specific site, or to make a purchase — and matching your content to that intent matters more than simply repeating keywords.
Why is SEO Important For A Web Developer?
HTML is the basic building block of any web page on the internet, and writing semantically correct, clean HTML is a great way to ensure great SEO.
Heading tags (h1 - h6), alt attributes on images, canonical link tags, and the responsive viewport meta tag (with attributes like initial-scale) are all highly important SEO-focused elements in HTML that will set up a foundational site optimized for search engines. This is called on-page SEO and is very important for establishing website visibility.
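As a quick illustration, a minimal page using these on-page SEO elements might look like the sketch below (the URL, titles, and file names are placeholders, not from a real site):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <title>React and SEO — Example Page</title>
  <!-- Per-page description shown in search result snippets -->
  <meta name="description" content="A short, unique summary of this page.">
  <!-- Responsive viewport tag -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <!-- Canonical link tag: tells crawlers which URL is the "real" one -->
  <link rel="canonical" href="https://example.com/react-seo">
</head>
<body>
  <h1>One main heading per page</h1>
  <h2>Subheadings give crawlers the content outline</h2>
  <!-- alt text lets crawlers (and screen readers) understand images -->
  <img src="traffic-chart.png" alt="Chart of monthly organic search traffic">
</body>
</html>
```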
But using a frontend library like React, out of the box, can be bad for SEO.
Why React Is Bad for SEO
It is important to understand how search engines work here. Search engines have crawlers that continuously crawl the web and index websites and their pages based on content quality, semantically proper HTML, and website usability.
Since traditional websites have their entire content in the HTML page, it is very easy for crawlers to index this landing page and its content, check for SEO-focused tags, and then start ranking the site and its pages for visibility.
SPAs, or Single Page Applications, have been on the rise over the last few years (which is also the reason for the popularity of libraries and frameworks like React, Angular, Vue, etc.). But the majority of websites in existence are still traditionally architected.
In short, this is how crawlers generally work -
🤖 A crawler bot has a queue of website URLs that it needs to crawl and index.
🤖 At idle times, the crawler hits up each URL in the queue, makes a request to its server, and fetches the HTML (to read the content).
- If the page needs JavaScript to render its content, the URL is moved to a "render" queue.
- This queue is visited much later on.
🤖 Once the page has been processed and its canonical link tags and other links extracted, the content is finally added to the search engine's index.
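The two-queue flow above can be sketched as a toy model. All names here are illustrative — this is not a real crawler API, just the shape of the process:

```javascript
// Toy model of the crawl queue / render queue pipeline described above.
function crawl(pages) {
  const crawlQueue = Object.keys(pages); // URLs waiting to be crawled
  const renderQueue = [];
  const index = {};

  // First pass: fetch the raw HTML for every URL in the crawl queue.
  for (const url of crawlQueue) {
    const page = pages[url];
    if (page.needsJs) {
      // Content only appears after JavaScript runs: defer to render queue.
      renderQueue.push(url);
    } else {
      // Content is right there in the HTML: index it immediately.
      index[url] = page.html;
    }
  }

  // Second pass (much later, in reality): render JS-dependent pages.
  for (const url of renderQueue) {
    index[url] = pages[url].rendered;
  }
  return index;
}

const index = crawl({
  '/blog': { needsJs: false, html: '<h1>Blog</h1>' },
  '/spa':  { needsJs: true, rendered: '<h1>App</h1>' },
});
console.log(index['/blog']); // '<h1>Blog</h1>' — indexed on the first pass
```

The point of the model: a traditional page is indexed on the first pass, while a JS-rendered page has to wait for the second one.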
Problems faced with React for SEO
1. Empty Content
Since React renders everything on the client side, the initial HTML a crawler fetches is little more than an empty app shell — the actual content only appears after the JavaScript bundle executes.
2. Load Time & UX
Downloading, parsing, and executing that bundle takes time, and content that users (and crawlers) have to wait for hurts both UX metrics and ranking.
3. Lack of Metadata
<meta> tags for individual pages are highly important for visibility and content indexing. Since React renders all the content, including meta tags, on the client side within a single app shell, adapting this metadata for individual pages is very difficult.
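To see the problem, consider the single HTML shell a typical client-rendered React build serves for every route — the file below is a representative sketch, not copied from any specific project. Whichever page the crawler requests, it initially gets the same title and description:

```html
<!-- The same index.html is served for /, /about, and /blog/my-post -->
<!DOCTYPE html>
<html lang="en">
<head>
  <title>My App</title>
  <meta name="description" content="One generic description for every page">
</head>
<body>
  <div id="root"></div> <!-- all content is injected here by JavaScript -->
  <script src="/static/js/bundle.js"></script>
</body>
</html>
```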
4. No Built-In Sitemap
Sitemap files describe your website's structure — the pages, media, and their interlinked relationships. This helps crawlers index your site intelligently and hence improves ranking.
However, React does not have a built-in way to generate sitemaps. You will typically handle routing with React Router and then reach for a separate tool to generate the sitemap from your routes, which again takes extra effort.
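Since there is no official React sitemap API, a hand-rolled sketch might look like this — the domain and route list are placeholders, and a real tool would also emit fields like lastmod and priority:

```javascript
// Minimal hand-rolled sitemap generator (illustrative, not a library API).
function buildSitemap(baseUrl, routes) {
  const urls = routes
    .map((route) => `  <url><loc>${baseUrl}${route}</loc></url>`)
    .join('\n');
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    urls + '\n</urlset>';
}

const xml = buildSitemap('https://example.com', ['/', '/about', '/blog']);
console.log(xml.includes('<loc>https://example.com/about</loc>')); // true
```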
So should you stop using React? Absolutely not. React is a highly performant, in-demand technology and a great tool for developers to create rich UI/UX in websites and web applications.
The answer is server-side rendering (SSR): rendering the React app to full HTML on the server before it reaches the browser. There are two primary advantages to this -
- Page load times are a lot faster
- Crawlers receive fully rendered HTML, so content can be indexed without waiting for JavaScript
Achieving SSR from scratch can be challenging. This is why SSR frameworks built on React, like GatsbyJS and NextJS, have made developers' lives easier.
While Gatsby is great for static content rendering (no real-time content changes), NextJS goes a step further to give a hybrid solution for both static and dynamic sites.
NextJS follows a lean development practice with stricter file-name conventions: files placed in its pages directory map directly to routes. These are small adaptive measures that developers can take to achieve great SEO and user experience. It also supports dynamic routing and custom server routing to segment a website into easily readable, well-structured paths.
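As a sketch of that file-name convention (using the classic pages router; the file names are illustrative):

```
pages/
  index.js        →  /
  about.js        →  /about
  blog/[slug].js  →  /blog/:slug   (dynamic route, one page per slug)
```

Each file becomes its own server-rendered page with its own URL, which is exactly what crawlers want to see.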
The best part is that all the great things about React like components, component properties, and component states, are not traded off.
The company behind NextJS, Vercel, offers free hosting with HTTPS included as well. That is another bonus point for SEO, since serving over HTTPS (with a valid SSL certificate) is now a ranking signal for search engines.
While traditional web development methodologies involve trade-offs between SEO, UI/UX, performance, and even Developer Experience (DX), React with NextJS addresses all of these without having to trade off on anything.
Stay tuned for more such articles on web development ⏳
Meanwhile, sign up for our full stack web development course and become a web developer today 🚀