Web Design & Development Services

Do not assume that all web developers understand the SEO implications of what they develop. Learning about SEO is not a requirement for earning a software engineering degree or becoming a web developer (in fact, few if any college courses address SEO).

SEO, however, is heavily impacted by the major technology choices you make. These decisions can make or break your online strategy, and getting them wrong can be disastrous for your website.

Some of the technical areas we can look at together that may be impacting your SEO strategy include:

  • Some CMS platforms do not even allow you to have titles and meta descriptions that vary from one web page to the next, creating hundreds (or thousands) of pages of duplicate content (see the first sketch after this list).
  • Links in submission-required forms: Search spiders will not attempt to “submit” forms, and thus, any content or links that are accessible only via a form are invisible to the engines. This even applies to simple forms such as user logins, search boxes, or some types of pull-down lists.
  • If you use JavaScript for links, you may find that search engines either do not crawl them or give very little weight to the links embedded within them (see the second sketch after this list).
  • Links embedded inside Java and plug-ins are invisible to the engines. In theory, the search engines are making progress in detecting links within Flash, but don’t rely too heavily on this.
  • PowerPoint and PDF files are no different from Flash, Java, and plug-ins. Search engines sometimes report links seen in PowerPoint files or PDFs, but how much they count for is not easily known.
  • The robots.txt file provides a very simple means of preventing web spiders from crawling pages on your site. Use of the NoFollow attribute on a link, or placement of the meta Robots tag on the page containing the link, instructs the search engine not to pass link juice via the link (see the third sketch after this list).
  • Google has suggested a guideline of roughly 100 links per page, beyond which it may stop spidering additional links from that page. This “limit” is somewhat flexible, and particularly important pages may have upward of 150 or even 200 links followed. In general, however, it is wise to limit the number of links on any given page to 100 or risk losing the ability to have additional pages crawled.
  • Technically, links in both frames and iframes can be crawled, but both present structural issues for the engines in terms of organization and following.
  • CSS is commonly cited as a best practice for general web design and development, and it provides some indirect SEO benefits as well. Moving presentation into stylesheets keeps page file sizes low, which means faster load times, lower abandonment rates, and a higher probability of pages being fully read and more frequently linked to.
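
To illustrate the duplicate-content point above, here is a minimal sketch of what a CMS should allow: a unique title and meta description on every page. The page paths and wording are hypothetical.

    <!-- Page: /services/web-design -->
    <head>
      <title>Web Design Services | Example Company</title>
      <meta name="description" content="Custom web design tailored to your business goals.">
    </head>

    <!-- Page: /services/seo -->
    <head>
      <title>SEO Services | Example Company</title>
      <meta name="description" content="Search engine optimization audits and ongoing consulting.">
    </head>

If every page on a site shares one boilerplate title and description, the engines see hundreds of near-identical pages competing with one another.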
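The second sketch contrasts a plain HTML link, which spiders follow reliably, with a JavaScript-driven link whose destination is hidden inside script. The URLs are hypothetical.

    <!-- Crawlable: a standard anchor with the destination in href -->
    <a href="/products/widgets.html">Widgets</a>

    <!-- Risky: the destination exists only inside JavaScript,
         so engines may not crawl it or may give it little weight -->
    <a href="#" onclick="window.location='/products/widgets.html'; return false;">Widgets</a>

When a JavaScript effect is genuinely needed, keeping a real URL in the href attribute still gives the engines something to follow.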
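The third sketch shows the three blocking mechanisms mentioned above side by side; the directory path and URL are hypothetical.

    # robots.txt -- keep all spiders out of an entire directory
    User-agent: *
    Disallow: /admin/

    <!-- NoFollow on a single link: no link juice passes via this link -->
    <a href="http://www.example.com/" rel="nofollow">Example</a>

    <!-- Meta Robots on the page: applies to every link on that page -->
    <meta name="robots" content="nofollow">

Note the difference in scope: robots.txt blocks crawling of whole paths, rel="nofollow" affects one link, and the meta Robots tag affects every link on the page that carries it.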

My Web Development Skills:

  • .NET Framework 1.1, 2.0, 3.5; C# 3.0
  • ASP.NET
  • JavaScript
  • HTML
  • CSS
  • SQL Server 2000 – 2008
  • ADO.NET Entity Framework
  • Transactional Replication
  • Full-text Search
  • Visual Studio 6 to 2008
  • Classic ASP
  • VBScript
  • JScript
  • IIS
  • SMTP Server
  • Telerik