Search Engine Optimization (SEO)

From Library Success: A Best Practices Wiki


Search engine optimization is a methodology of strategies, techniques, and tactics used to increase the number of visitors to a website by obtaining a high-ranking placement on a search engine results page (SERP) in Google, Bing, Yahoo, and other search engines. (From "What is SEO")


Why not take 3 minutes to watch an introductory video about search engine optimization (SEO)? [1]

Other resources

  • Grappone, Jennifer, and Gradiva Couzin. Search Engine Optimization: An Hour a Day. San Francisco, Calif: Sybex, 2006. Print. [2]
  • Arlitsch, Kenning, and Patrick O'Brien. Improving the Visibility and Use of Digital Repositories through SEO: A LITA Guide. ALA TechSource, 2013. Print. [3]

It is also helpful to go through Google's Webmaster Academy to view the videos and learn how Google works, make sure Google knows about your site, and influence your site's listing in search.

SEO Keywords and Page Content Guidelines

When search engine spiders crawl a webpage they look at the frequency of keywords (how often a keyword appears on a page), the weight of a keyword (e.g. "beagle" may "weigh" more than "dog"), and the proximity of keywords to each other (e.g. "used books" or "scholarly articles").
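The frequency part of this can be sketched in a few lines of Python. This is illustrative only (the `sample` text and the `keyword_frequency` helper are invented for the example); real engines also apply keyword weight and proximity, which a plain counter does not capture.

```python
import re
from collections import Counter

def keyword_frequency(page_text, top_n=3):
    """Count how often each word appears in a page's visible text."""
    words = re.findall(r"[a-z']+", page_text.lower())
    return Counter(words).most_common(top_n)

sample = ("Used books and scholarly articles: our library lends used books, "
          "rare books, and scholarly articles to all borrowers.")
print(keyword_frequency(sample))  # 'books' ranks highest for this text
```

Running a counter like this over your own pages is a quick way to see whether the terms you hope to rank for actually dominate the visible text.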

Content producers should have a good understanding of what users are looking for (i.e. what keywords they are searching) and of how they are looking for this information. Relevant keywords need to be incorporated in the webpage title, URL, text, and metadata. (From Davis, Harold. 2006. Search Engine Optimization. [Calif.]: O'Reilly. [4])

  • Provide quality content in a format friendly to webbots.
  • Important keywords should go in:
       1. Title: putting relevant keywords in the HTML title tag for your page is probably the single most important thing you can do in terms of SEO.
       2. Headers: keyword placement within HTML header styles, particularly h1 headers towards the top of a page, is extremely important.
       3. Links: use your keywords as much as possible in the text enclosed by <a href="">...</a> hyperlink tags on your site, in both outbound and internal links. Ask webmasters who provide inbound links to your site to use your keywords whenever possible.
       4. Images: include your keywords in the alt attribute of your HTML <img> tags.
       5. Text in bold: if there is any reasonable excuse for doing so, include your keywords within HTML bold (<b>...</b>) tags.
       6. Meta: meta tags are not as important to search engines as the actual content of the page. However, the meta description tag can improve click-through rates, as the description text does display to users and can help them understand the relevance of your page.
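The placements above can be illustrated with a small parser built on Python's standard html.parser. The sample page, class name, and field labels are all hypothetical; the point is to show exactly which fields a spider reads keywords from (title, h1, link text, img alt, bold, meta description).

```python
from html.parser import HTMLParser

class KeywordFieldParser(HTMLParser):
    """Collect text from the keyword-bearing fields listed above."""
    def __init__(self):
        super().__init__()
        self.fields = {"title": [], "h1": [], "a": [], "b": [],
                       "img_alt": [], "meta_desc": []}
        self._stack = []  # tracked open tags whose text we collect

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1", "a", "b"):
            self._stack.append(tag)
        elif tag == "img" and "alt" in attrs:
            self.fields["img_alt"].append(attrs["alt"])
        elif tag == "meta" and attrs.get("name") == "description":
            self.fields["meta_desc"].append(attrs.get("content", ""))

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if self._stack and data.strip():
            self.fields[self._stack[-1]].append(data.strip())

page = """<html><head><title>Used Books | Example Library</title>
<meta name="description" content="Borrow used books and scholarly articles.">
</head><body><h1>Used Books</h1>
<a href="/catalog">Browse used books</a>
<img src="shelf.jpg" alt="shelf of used books">
<p>Our <b>used books</b> collection is open daily.</p></body></html>"""

parser = KeywordFieldParser()
parser.feed(page)
for field, texts in parser.fields.items():
    print(field, texts)
```

Note that the target keyword ("used books") appears in every collected field, which is the pattern the checklist above recommends.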

SEO URL Guidelines

From "Optimizing Your Online Shingle: On-Page and Off-Page Best Practices," published in Law Firm Marketing Updates, Law Practice Products & Services Updates [5]

  • The fewer the parameters in your dynamic URL, the better. One or two parameters are much better than seven or eight. Avoid superfluous/nonessential parameters such as tracking codes.
  • A static-looking URL (one containing no ampersands, equal signs, or question marks) is more search-optimal than a dynamic one. (Dynamic pages opened by scripts that are passed values are too useful to avoid; most search engines can traverse dynamic URLs provided they are not too complicated.)
  • Having keywords in the URL is better than no keywords.
  • A keyword in the filename portion of the URL is more beneficial than in a directory/subdirectory name.
  • Hyphens are the preferred word separator. Underscores are not, and have never been, considered word separators by Google (this according to renowned Google engineer Matt Cutts). So if you have multiple-word keyword phrases in your URLs, I'd strongly recommend using hyphens to separate them.
  • Stuffing too many keywords into the URL looks spammy. Three, four, or five words in a URL looks perfectly normal; much longer and it starts to look worse to Google, according to Cutts.
  • The domain name is not a good place for multiple hyphens, as they can make your URL look spammy. That said, sometimes a domain name should have a hyphen, as the domain faux pas "arsecommersce.com" demonstrates.
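These URL rules can be checked mechanically with Python's standard urllib. This is a sketch; the helper names (`url_report`, `slugify`) and example URLs are made up for illustration.

```python
import re
from urllib.parse import urlparse, parse_qs

def url_report(url):
    """Apply the URL guidelines above: parameter count, separators, static look."""
    parts = urlparse(url)
    slug = parts.path.rsplit("/", 1)[-1]          # filename portion of the path
    return {
        "n_params": len(parse_qs(parts.query)),    # fewer is better
        "looks_static": not any(c in url for c in "&=?"),
        "uses_hyphens": "-" in slug,               # preferred separator
        "uses_underscores": "_" in slug,           # not treated as a separator
    }

def slugify(phrase):
    """Turn a keyword phrase into a hyphen-separated URL slug."""
    return re.sub(r"[^a-z0-9]+", "-", phrase.lower()).strip("-")

print(url_report("http://example.org/catalog/used-books"))
print(url_report("http://example.org/search?cat=7&id=42&track=abc"))
print(slugify("Scholarly Articles & Used Books"))  # scholarly-articles-used-books
```

A slugify step like this, applied when pages are created, keeps keyword phrases in URLs hyphen-separated and consistent.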

From the Beginner's Guide to SEO [6]

  • Make sure URLs are appropriately descriptive
  • The shorter the URL, the better

SEO Link Guidelines

Search engines measure who is linking to a site and what they are saying about the site. Links are viewed as a vote of popularity, and as a trust and relevancy metric. Spiders use links to navigate through webpages, so you need a good link structure. Where you place links on a page can also impact SEO.

SEO Tools & examples of how libraries are using them

  • Google Webmaster Tools: Submit your sitemap and check out your search queries. There is a whole section dedicated to Optimization. UCLA Library used it with Drupal. [7]
  • Browseo: A web app that lets you input a URL and obtain a report focused on the pure HTML, similar to what search engines see when they visit a page. You can quickly determine a page's structure as well as its relevance for specific search terms by toggling options such as: the server response code; whether the URL is redirected and, if so, what kind of redirect is used; the number of words on the page; headings (H1-H6); the number of internal links on the page (links to pages on the same domain); the number of external links on the page; and META information such as the title tag, meta description, meta robots tag, and any other tags that are present.
  • DuckDuckGo: A search engine that does not employ cookies, so you can use it when you want to look at a SERP (search engine results page) without seeing personalization from cookies or going through the hassle of going incognito in other search engines.
  • Soovle: A keyword research tool that provides the suggestion services of all the major providers in one place. [www.soovle.com]
  • SEO Analyzer: Turn off JavaScript, CSS, and cookies in your web browser and view your website. This is most likely how search engines see your website. If you can successfully view your content and navigate your website, your site is mostly search engine friendly. The only other thing to check is your URLs: not using a session ID or 'id=' in your query strings is also very helpful.
  • SEO Doctor: An extension for Firefox that allows for SEO diagnosis and problem solving. [8]
  • Screaming Frog: Available as a free program you can download and install on your computer; it spiders websites' links, images, CSS, scripts, and apps from an SEO perspective. It gives you data about the site in tabbed Excel format. Watch the video to see what it can do. UCLA Library used this to quickly identify client errors in links (404 and 403 pages). [9]
  • Open Site Explorer: The search engine for links. Watch the video on the site to see how it works. It tells you who is linking to your site, and it can tell you who is linking to other sites. [www.opensiteexplorer.org]
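A rough, Browseo-style report can be approximated with the standard library alone. This sketch (the class name, sample page, and domain are invented) counts headings, internal versus external links, and words on a page, which are several of the report items described above.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class PageReport(HTMLParser):
    """A rough Browseo-style report: headings, internal/external links, words."""
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.headings = 0
        self.internal = 0
        self.external = 0
        self.words = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.headings += 1
        elif tag == "a":
            host = urlparse(dict(attrs).get("href", "")).netloc
            if host and host != self.domain:
                self.external += 1   # link points off this domain
            else:
                self.internal += 1   # relative or same-domain link

    def handle_data(self, data):
        self.words += len(data.split())

page = """<html><body><h1>Hours</h1><h2>Weekdays</h2>
<a href="/contact">Contact us</a>
<a href="http://other.example.com/">Partner site</a>
<p>Open nine to five.</p></body></html>"""

report = PageReport("library.example.edu")
report.feed(page)
print(report.headings, report.internal, report.external, report.words)
```

This omits things a real tool checks (response codes, redirects, meta tags), but it is enough to spot pages with no headings or with far more external than internal links.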

Search Log Data (varies): UCLA Library obtains search log data from multiple sources. We see the search queries in Google Webmaster Tools and Google Analytics. Because the data can be rather limited when just looking at the terms folks type into the search engines, we also study the search log data from the Drupal Search pulldown options: Site, UCLA Library Catalog, and Melvyl. We also obtain data from Webservices statistics, Voyager Analyzer Reports, Serials Solutions COUNTER, and Springshare LibGuides.

SEO Tools for Drupal

Google Analytics

PathAuto (UCLA Library employs)

Metatag (UCLA Library is testing)

XML Sitemap

SEO Spider from Screaming Frog (DePaul Univ is testing)

SEO Community Forums & outreach & Conferences

SearchEngineLand

SEOMoz

SMX West

SMX Advanced

SEO Rubric Case Studies: "Not Search Engine Friendly" web site examples as SEO teaching models

SEO elements of site: URL notes

http://www.ucla-cir.org/ : The URL is static, but the content behind it changes because the site is framed. This gives me great pause in terms of search friendliness. How do you accurately track use of a specific page? (Tracking is more difficult for framed sites; see Set Up Tracking Framed Sites.) How does a user cite a specific page? How do other sites link to a specific page (thus improving page rank in the search algorithm)?

Conclusion: The framed site URL structure does not bode well for search friendliness, and it makes tracking difficult to set up.

Suggestion: Consider a URL taxonomy which follows the SEO URL Guidelines

http://www.semel.ucla.edu/cir/index.html : Both http://www.semel.ucla.edu/cir/ and http://www.semel.ucla.edu/cir/index.htm give "Forbidden"

Suggestion: Consider appropriate redirects

http://www.semel.ucla.edu/cir/investigators.html#

Consider change to http://www.semel.ucla.edu/cir/integrative-focus-team/

Suggestion: URLs follow a taxonomy using important keywords

http://www.ucla-cir.org/

and http://www.semel.ucla.edu/cir/index.html

Create pages that are friendly to webbots. Google puts the "simple is best" idea this way: "If fancy features such as Javascript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site." (Davis, 2006) The only way to know for sure whether a bot will be able to crawl your site is to check the site in an all-text browser. Try the site in a text-only browser such as Lynx.
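The text-browser check can be approximated in Python: strip tags, skip script and style, and see what text survives. In a purely framed page nothing survives, which is exactly the crawling problem Davis describes. The sample pages here are invented for illustration.

```python
from html.parser import HTMLParser

class TextOnlyView(HTMLParser):
    """Approximate what a text-only browser sees: tags, scripts, styles vanish."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

framed = '<frameset><frame src="menu.html"><frame src="body.html"></frameset>'
normal = ("<html><body><script>init();</script>"
          "<h1>Library Hours</h1><p>Open daily.</p></body></html>")

for html in (framed, normal):
    view = TextOnlyView()
    view.feed(html)
    print(view.chunks)  # the framed page yields no visible text at all
```

A real check should still use an actual text browser (e.g. Lynx), since this sketch does not follow frame sources or execute anything.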

Conclusion: Frames and Flash are not (in general) good for SEO

Suggestion: Try http://www.delorie.com/web/lynxview.html and fix so text only browser can see content

http://www.ucla-cir.org/

and

http://www.semel.ucla.edu/cir/index.html

Same content, but on two domains. One is .org and the other is .edu. Bad for SEO.


Avoid SEO pitfalls like the creation of multiple similar pages: Google frowns on the creation of pages, domains, and subdomains that duplicate content (although obviously there are places where it is legitimate to duplicate content).

Suggestion: Evaluate whether it is necessary to create multiple domains that duplicate content

http://www.ucla-cir.org/sitemap.xml : Did not find a site map. Navigability: pages should be structured with a clear hierarchy, major parts of the site should be easy to access using a site map, and every page in your site should be accessible via a static text link.

Suggestion: Put an HTML site map page on the site, and use an XML Sitemap file to help ensure that search engines discover the pages on your site.
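A minimal XML Sitemap file in the sitemaps.org format can be generated with the standard library. This is a sketch; the URLs are placeholders, and real sitemaps often add optional fields such as lastmod.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML Sitemap (sitemaps.org protocol) for the given URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        # each page gets a <url><loc>…</loc></url> entry
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["http://www.example.org/",
                     "http://www.example.org/about/"]))
```

The resulting file is what you would place at /sitemap.xml and submit through Google Webmaster Tools.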

http://www.ucla-cir.org/

and

http://www.semel.ucla.edu/cir/index.html

Splash page. I'm not fond of splash pages: the keywords are contained in <meta> tags, but it is more beneficial to have keywords contained in the content of the page.


Suggestion: Consider creation of page content that contains the identified keywords, to match the goals of the site.


UCLA Center for Image Research

Science Technology Images

Science and Technology of Images

UCLA CIR

UCLA STI


Names of affiliated researchers

The researchers might consider creating ORCID records, which can then be used as links


Suggestion: Consider addition of actual page content. Perhaps community content (e.g. a blog, updated often and containing relevant keyword phrases).


http://www.ucla-cir.org/

SEO Rubric Case Studies: "Search Engine Friendly" web site examples as SEO teaching models. Identify exemplary model sites (e.g. best library web site with optimized microdata).

References and Journal Articles

  • Black, E. L. (2009). Web Analytics: A Picture of the Academic Library Web Site User. Journal of Web Librarianship, 3(1), 3-14. doi:10.1080/19322900802660292
  • Cahill, K., & Chalut, R. (2009). Optimal Results: What Libraries Need to Know About Google and Search Engine Optimization. Reference Librarian, 50(3), 234. doi:10.1080/02763870902961969
  • Houghton-Jan, S. (2007). Twenty Steps to Marketing Your Library Online. Journal of Web Librarianship, 1(4), 89-90. doi:10.1080/19322900802111445
  • Rushton, E., & Funke, S. (2011). The Goodness in the Evil of SEO. Searcher, 19(9)
  • Rushton, E. E., Kelehan, M. D., & Strong, M. A. (2008). Searching for a New Way to Reach Patrons: A Search Engine Optimization Pilot Project at Binghamton University Libraries. Journal of Web Librarianship, 2(4), 525-547. doi:10.1080/19322900802484248
  • Whang, M. (2007). Measuring the Success of the Academic Library Website Using Banner Advertisements and Web Conversion Rates: A Case Study. Journal of Web Librarianship, 1(1), 93. doi:10.1300/J502v01n01_07
