WHAT IS TECHNICAL SEO?
Technical SEO is the optimization of web pages and servers that helps search engine spiders crawl and index your website more efficiently and effectively.
What comes under Technical SEO?
- Website Architecture
- Crawl Budget
- Page Speed
- Robots.txt
- Duplicate Content
1. WEBSITE ARCHITECTURE
Website architecture is how a website's pages are structured and linked. The structure and linking should make it easy for both users and crawlers to find and navigate pages.
- Use Flat Architecture: a user should be able to reach any page on your website in four clicks or fewer. Avoid deep architectures.
- Keep Things Simple
- Use Category Pages: category pages make large sites much easier to navigate.
- URL structure is very important: keep URLs short and user-friendly, and follow a consistent structure throughout the site.
- Internal linking: navigational links should be plain HTML, not JavaScript or Flash, so crawlers can follow them.
- Sitemap: a sitemap is a great way to increase the crawlability of your website, because it gives spiders a direct list of your URLs.
- Sitelinks: sitelinks are a bonus benefit of strong site architecture. There's no structured data markup for sitelinks; Google generates them automatically when your site is authoritative and well interlinked.
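As a concrete illustration of the sitemap point above, a minimal XML sitemap might look like this (the URLs and dates are placeholders, not from any real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/category/widgets/</loc>
  </url>
</urlset>
```

The file is usually placed at the site root (e.g. https://example.com/sitemap.xml) and submitted in Google Search Console.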
2. CRAWL BUDGET
Crawl budget is the number of pages Googlebot crawls and indexes on a website within a given timeframe.
Why is it important?
If Google doesn’t index a page, it is not going to rank for anything.
If the number of pages on your site exceeds its crawl budget, some pages will remain unindexed.
When to take care of crawl budget?
You run a big site: If you have a website (like an e-commerce site) with 10k+ pages, Google can have trouble finding them all.
You just added a bunch of pages: If you recently added a new section to your site with hundreds of pages, you want to make sure that you have the crawl budget to get them all indexed quickly.
Lots of redirects: Lots of redirects and redirect chains eat up your crawl budget.
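The redirect point above can be made concrete with a small sketch. This is a hypothetical audit helper (the URLs and the `chain_length` function are illustrative, not part of any real tool): given a known map of redirects, it counts how many hops a URL goes through before resolving. Chains longer than one hop waste crawl budget and are worth flattening into a single redirect.

```python
# Hypothetical redirect map: old URL -> URL it redirects to.
redirects = {
    "/old-page": "/new-page",
    "/new-page": "/final-page",
    "/legacy": "/old-page",   # /legacy -> /old-page -> /new-page -> /final-page
}

def chain_length(url, redirects, limit=10):
    """Count how many redirect hops a URL goes through before resolving.

    The limit guards against redirect loops.
    """
    hops = 0
    while url in redirects and hops < limit:
        url = redirects[url]
        hops += 1
    return hops

print(chain_length("/legacy", redirects))      # 3 hops: worth flattening
print(chain_length("/final-page", redirects))  # 0 hops: resolves directly
```

Flattening here would mean pointing /legacy and /old-page straight at /final-page, so each crawl request spends at most one hop.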
3. PAGE SPEED
Page speed is the amount of time a webpage takes to fully load. The main factors affecting it are server response time, page file size, and image compression, among others.
- Compress images as much as possible without visible quality loss
- The cleaner your code, the faster the page loads
- Compress your code with Gzip, a server-side compression method
- Set Up Browser Caching in your .htaccess file.
- Set Up a CDN
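The Gzip and browser-caching steps above can be sketched as .htaccess directives. This is a minimal example assuming an Apache server with mod_deflate and mod_expires enabled; the content types and cache lifetimes are illustrative and should be tuned to your site:

```apache
# Gzip-compress text-based assets before sending them (mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Tell browsers how long to cache static assets (mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
  ExpiresByType text/css "access plus 1 month"
</IfModule>
```

Nginx and other servers achieve the same with their own configuration syntax.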
4. ROBOTS.TXT
Robots.txt is a file that tells search engine spiders not to crawl certain pages or sections of a website. Most major search engines (including Google, Bing and Yahoo) recognize and honor robots.txt requests.
When do you need a Robots.txt file?
- When you want to block a non-public page, such as a login page or a page still in staging
- Make the most of your crawl budget by blocking pages that don't need to be crawled
- Prevent Indexing of Resources: Using meta directives can work just as well as Robots.txt for preventing pages from getting indexed. However, meta directives don’t work well for multimedia resources, like PDFs and images. That’s where robots.txt comes into play.
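A minimal robots.txt covering the cases above might look like this (the paths are illustrative placeholders, not recommendations for any specific site):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /staging/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```

The file must live at the root of the domain (https://example.com/robots.txt) for crawlers to find it.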
The bottom line? Robots.txt tells search engine spiders not to crawl specific pages on your website. You can check how many pages you have indexed in Google Search Console.
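Before deploying a robots.txt file, it is worth sanity-checking what it actually blocks. Python's standard library ships a parser for this; the rules below are illustrative (in practice you would fetch your site's live /robots.txt):

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules -- in practice, fetch these from your live /robots.txt.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /staging/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Crawlers honoring these rules skip staging pages but fetch public ones.
print(rp.can_fetch("*", "https://example.com/staging/page"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))     # True
```

Running a check like this against a list of your important URLs catches the common mistake of accidentally blocking pages you want indexed.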