
Presentation Transcript


  1. SEO

  2. What is SEO? Search engine optimization is a set of techniques that improves a site's visibility in search results through coding and content. It covers four areas:
     • Analytics and web intelligence
     • Keyword research and content
     • On-page SEO and site architecture (technical)
     • Link development, or off-page SEO (mostly non-technical)

  3. Technical SEO
     • Site architecture
     • Providing readable content to search engines
     • Avoiding duplicate-content issues
     • Delivering content faster to users and search engines
     • Improving user experience

  4. Site Architecture
     • The site needs to be designed for its intended target audience.
     • For example, if the site is for kids, the colors, theme and font sizes should be chosen to attract kids, so that kids feel comfortable spending time on the site.
     • Business analysts, project managers and designers usually take care of this part.

  5. How to provide content to search engines
     • Unique, SEO-friendly URLs
     • Meta data that provides details about the page
     • XML sitemaps for regular pages and for videos
     • Updating the sitemap whenever the site changes (a minimal sitemap sketch follows below)
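
     A minimal XML sitemap sketch (the URL and date are placeholders):

       <?xml version="1.0" encoding="UTF-8"?>
       <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
         <url>
           <loc>http://www.egrovesys.com/</loc>
           <lastmod>2012-06-01</lastmod>  <!-- update when the page changes -->
           <changefreq>weekly</changefreq>
           <priority>0.8</priority>
         </url>
       </urlset>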

  6. The URL is important for good ranking. Best practices to follow for URL structure:
     • Keep it relevant, compelling and accurate.
     • Use hyphens to separate words; don't use underscores.
     • Include the primary keyword in the URL. Adding secondary keywords is an additional value.
     • Limit the URL slug to 3-5 words.

  7. Here, the following URL has 2 slugs: http://www.egrovesys.com/application-development/prestashop-development.html

  8. Whereas the following URL has 7 slugs, because the slug is generated by default from the page's H1.
     • Programmers should not rely on such default functionality and need to provide a solution for this.
     • If your SRS doesn't say anything about this, clarify with Xavier.
     • Try to limit URLs to 3 levels deep unless the client is specific about this.

  9. Title and Meta description are very important and need to follow SEO guidelines

  10. The title should not exceed 65 characters and the Meta description should not exceed 160 characters; otherwise the content provided in the Meta will be truncated when it is used by search engines.
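
     A minimal head section respecting these limits (the title and description texts are placeholders drawn from the services page mentioned in slide 7):

       <head>
         <!-- keep under ~65 characters -->
         <title>PrestaShop Development Services | eGrove Systems</title>
         <!-- keep under ~160 characters -->
         <meta name="description" content="Hire PrestaShop developers for migration, customization, integration, and theme and module development." />
       </head>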

  11. The Open Graph protocol (OG) helps control how a link appears on Facebook when a page is shared or liked there. So OG tags have value on Facebook, but none in search engines.
     • Since Facebook is growing, more and more clients are interested in having OG implemented on their sites. I recommend raising the question if the SRS makes no mention of it.
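
     A sketch of the four required OG properties, placed in the head section (the image path is hypothetical):

       <meta property="og:title" content="PrestaShop Development Services" />
       <meta property="og:type" content="website" />
       <meta property="og:url" content="http://www.egrovesys.com/application-development/prestashop-development.html" />
       <meta property="og:image" content="http://www.egrovesys.com/images/prestashop-services.png" />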

  12. Avoiding duplicate content, titles and descriptions
     Duplicate content is one of the most serious issues that programmers need to avoid. Some example sources:
     • http://www.example.com and http://example.com
     • www.example.com and www.example.com/index.html
     • www.example.com and www.example.com?session-id=1234
     • www.example.com/1 and www.example.com/1/

  13. 301 redirect
     A 301 redirect helps when a page is no longer required and can be permanently redirected. The advantage of this practice is that there is no loss of link value.
     • For example, egrovesys.com and egrovesys.com/index.php render the same page.
     • See the screenshots in the next slide.

  14. The solution is to 301-redirect one of them to the other URL.
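
     A sketch of one way to do this with an .htaccess rewrite (assuming the site runs on Apache with mod_rewrite enabled):

       RewriteEngine On
       # Permanently redirect direct requests for /index.php to the root URL
       RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php
       RewriteRule ^index\.php$ http://www.egrovesys.com/ [R=301,L]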

  15. Canonical element
     The canonical element helps avoid duplicate issues arising from URLs generated with session ids, query parameters and tracking codes. Example: pages with tracking parameters generate duplicate-title issues, as in this pair:
     • http://www.thisoldhouse.com/toh/article/0,,1147475,00.html
     • http://www.thisoldhouse.com/toh/article/0,,1147475,00.html?xid=hinewsletter081908-47-skills
     To resolve this, implement a canonical back to the preferred URL by adding the canonical element within the head section:
       <link rel="canonical" href="http://www.thisoldhouse.com/toh/article/0,,1147475,00.html" />
     * The business analyst and programmer can add this feature as additional scope in the project development.

  16. Pagination handling
     Pages in a series or galleries normally generate duplicate title and description issues, which can be avoided by using rel="next" and rel="prev". Let us see this in detail in the following screenshots.

  17. http://www.realsimple.com/food-recipes/tools-products/14-surprising-uses-for-your-microwave-10000001035388/index.html

  18. http://www.thisoldhouse.com/toh/article/0,,451111,00.html

  19. Egrovesys.com Portfolio Page

  20. Pagination Screenshots Explanation
     • The first page only contains rel="next" and no rel="prev" markup.
     • Pages two through the second-to-last page should be doubly linked with both rel="next" and rel="prev" markup.
     • The last page only contains markup for rel="prev", not rel="next".
     • rel="next" and rel="prev" values can be either relative or absolute URLs (as allowed by the <link> tag). And, if you include a <base> link in your document, relative paths will resolve according to the base URL.
     • rel="next" and rel="prev" only need to be declared within the <head> section, not within the document <body> (see the sketch after this list).
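
     A sketch of the head markup for page 2 of a three-page series (the URLs are placeholders):

       <head>
         <!-- page 2 is doubly linked to its neighbours -->
         <link rel="prev" href="http://www.example.com/article?page=1" />
         <link rel="next" href="http://www.example.com/article?page=3" />
       </head>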

  21. Noindex meta and robots.txt
     • The "noindex" meta tag is useful if we don't want a page to be indexed. robots.txt can be used to block any particular section of a site from being crawled.
     • If the page is already indexed, robots.txt will not remove it from the index. So wherever possible use "noindex". (A sketch of both follows below.)
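
     A sketch of both mechanisms (the section path is a placeholder):

       <!-- in the page's <head>: crawlers may fetch the page, but it stays out of the index -->
       <meta name="robots" content="noindex" />

       # robots.txt at the site root: block crawling of a whole section
       User-agent: *
       Disallow: /internal-reports/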

  22. Page load time
     Page load time is one of the factors that influence whether users stay and transact on a site. Some areas where programmers can use their intelligence:
     Avoiding excessive CSS in the head: placing CSS inside the head should be avoided so that spiders can reach the text quickly. Example: http://www.health.com/health/static/buzz/contests_and_giveaways.htm
     An external CSS file is recommended to handle such issues, for example:
       <link rel="stylesheet" type="text/css" href="externalcss.css" />
     We can see it in the screenshot in the next slide.

  23. Avoiding excessive JS in the head
     • Pages that contain excessive JavaScript need attention from the development team, to find the possibility of moving it either to the bottom of the page or to an external file.
     • Google and other search engine spiders are more advanced nowadays and are able to detect page text even when there is excessive JavaScript; still, the time required to reach the text is the important factor.
     • JavaScript is of no use to spiders, and excessive JavaScript in the head consumes spider time for no reason. So delivering the required text to spiders quickly, by eliminating lengthy JavaScript ahead of the body text, will improve ranking.
     Example: http://www.health.com/health/anxiety (see the screenshot in the next slide)

  24. Avoiding excessive JS in the body
     • It is recommended to reduce the JS in the body to help spiders crawl the page quickly.
     • Page load time also improves if JavaScript is handled properly. In order to load a page, the browser must parse the contents of all <script> tags, which adds to the page load time. By minimizing the amount of JavaScript needed to render the page, and deferring parsing of JavaScript until it needs to be executed, we can reduce the initial load time of the page (see the sketch below).
     Example: http://www.health.com/health/appendicitis (see the screenshot in the next slide)
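
     A sketch of both techniques (the script file names are hypothetical):

       <body>
         <p>Visible content that spiders should reach first.</p>
         <!-- defer: downloaded in parallel, executed only after the document is parsed -->
         <script src="analytics.js" defer></script>
         <!-- or simply place non-critical scripts at the end of the body -->
         <script src="widgets.js"></script>
       </body>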

  25. Avoiding excessive whitespace
     Minifying code is recommended, which refers to eliminating unnecessary spaces, newline characters, comments, etc. Example: http://www.health.com/health/library/mdp/0,,d04537t1,00.html
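
     A small sketch of the same CSS rule before and after minification:

       /* before */
       .header {
           color: #333333;
           margin: 0 auto;  /* center the block */
       }

       /* after */
       .header{color:#333;margin:0 auto}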

  26. Following heading rules
     • H1 should come first in the source code and should be the first header tag parsed by any search engine crawler. Do not precede the H1 with any other header tag.
     • You should have only one (1) H1 tag per page. Thereafter, you can have as many H2-H6 tags as necessary to lay out the page and its content, but use a logical sequence and do not "style" your text via header tags in your CMS. A logical sequence looks like this:
       H1
         H2
           H3
           H3
             H4
         H2
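
     The same hierarchy as an HTML sketch (the heading texts are placeholders):

       <h1>PrestaShop Development Services</h1>
         <h2>Module Development</h2>
           <h3>Payment modules</h3>
           <h3>Shipping modules</h3>
             <h4>Carrier integrations</h4>
         <h2>Theme Development</h2>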

  27. Custom 404 error page
     • HTTP requests are expensive, so making an HTTP request and getting a 404 "not found" slows down the user experience. Some sites have helpful and creative 404 error pages to cover the bad user experience; still, such pages waste server resources (database, etc.). Particularly bad is when the link to an external JavaScript file is wrong and the result is a 404.
     • It is good practice to keep 404 errors to a minimum through other means, like blocking unnecessary URL generation. As a final resort, 301 redirects can be used, but such redirects should go to the main page or another related page.
     • Google maintains that 404 errors won't impact a site's search performance and can be ignored if we're certain that the URLs should not exist on our site. It's important to make sure that these and other invalid URLs return a proper 404 HTTP response code, and that they are not blocked by the site's robots.txt file.
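
     A sketch of wiring up a custom 404 page on Apache (assuming a /404.html page exists; the path is a placeholder):

       # .htaccess: serve the custom page while still returning a 404 status code
       ErrorDocument 404 /404.html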

  28. Bad design: taking advantage of their brand value

  29. Redirected to the home page: OK

  30. Custom 404 page: Good

  31. Better: designed with search option

  32. Ajax implementation
     An Ajax implementation needs to follow Google's Ajax crawling guidelines for Ajax URLs to be displayed in search results. For example: www.egrovesys.com/portfolio#1 should become: www.egrovesys.com/portfolio#!1
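
     Under Google's Ajax crawling scheme, the "hash-bang" (#!) signals that a crawlable HTML snapshot is available; the crawler rewrites the URL and requests the snapshot from the server:

       User sees:        www.egrovesys.com/portfolio#!1
       Crawler requests: www.egrovesys.com/portfolio?_escaped_fragment_=1
       Server returns:   a full HTML snapshot of the page state for #!1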

  33. Q and A

  34. Thank You
     Part II will cover:
     • Combining images
     • Browser caching
     • Lossless compression of images
     • Inline JavaScript
     • Rich snippets for ratings and reviews
     • Moving a site to a new host
     • Ajax implementation
