14
Nov
John McCarthy asked:

Recently I’ve run into situations where business owners seem surprised by my statement that the SEO campaign is going to be delayed by at least 2-3 months because their website is not SEO-ready. “What do you mean my site is not SEO-ready?” the business owner often replies, “I’ve just paid good money to have my site built.” And so begins one of my heart-to-heart conversations explaining that the adage “if you build it, they will come” does not really apply to Internet and search engine marketing.

Site Architecture Issues

An SEO-ready website is one that is ready to start an SEO campaign: it will not require considerable re-programming, development or removal of barriers. Today most websites we encounter are dynamically generated rather than static. As recently as 2005 search engines like Google did struggle to index dynamically generated websites, but that is no longer the case; search engines can easily index both static and dynamic websites provided best practices are followed. Where dynamically generated sites can create problems for SEO campaigns is in their database or content management system (CMS). Sometimes these systems produce URLs with so many query string parameters, or parameters so long, that the search engine spiders are inhibited or, worse yet, simply cannot crawl the site.
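As a rough illustration (a hypothetical audit sketch, not part of any particular CMS or crawler), the snippet below flags URLs whose query strings are parameter-heavy or long enough to be a likely crawl barrier; the thresholds are assumptions, not published spider limits.

    # A minimal sketch: flag dynamically generated URLs whose query strings
    # may inhibit search engine spiders. The threshold values are assumptions,
    # not published crawler limits.
    from urllib.parse import urlparse, parse_qs

    def flag_crawl_barriers(urls, max_params=3, max_length=120):
        flagged = []
        for url in urls:
            parsed = urlparse(url)
            params = parse_qs(parsed.query)
            if len(params) > max_params or len(url) > max_length:
                flagged.append((url, len(params), len(url)))
        return flagged

    sample = [
        "http://www.example.com/products/blue-widget",  # clean, static-looking URL
        "http://www.example.com/catalog?cat=12&sub=44&sess=9f3a2c&sort=price&page=7&ref=aff1023",
    ]
    for url, param_count, length in flag_crawl_barriers(sample):
        print(f"{param_count} parameters, {length} chars: {url}")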

In other situations the content management system is the culprit. We’ve seen situations where the CMS forces the website to have identical Title and Meta tags on every page in the site. When that happens the search engines often drop the pages from their index because they are perceived as duplicate content. Other CMS systems force the page name to duplicate the Title tag, insert underscores into URLs, and introduce other SEO-unfriendly deficiencies.
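As an illustration of how this shows up in an audit, the sketch below groups a set of saved pages by their Title tag and reports any Title shared by more than one page; the file paths and regex-based extraction are simplifying assumptions, since a real audit would use crawl data and a proper HTML parser.

    # A minimal sketch of a duplicate Title tag check across a set of saved
    # HTML pages.
    import re
    from collections import defaultdict

    def find_duplicate_titles(html_by_url):
        titles = defaultdict(list)
        for url, html in html_by_url.items():
            match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
            title = match.group(1).strip() if match else "(missing)"
            titles[title].append(url)
        # Report any Title shared by more than one page -- the duplicate
        # content signal described above.
        return {t: urls for t, urls in titles.items() if len(urls) > 1}

    pages = {
        "/index.html": "<html><head><title>Acme Widgets</title></head></html>",
        "/about.html": "<html><head><title>Acme Widgets</title></head></html>",
        "/blue.html":  "<html><head><title>Blue Widgets | Acme</title></head></html>",
    }
    print(find_duplicate_titles(pages))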

Page Construction

An SEO-ready website should have full support for standard on-page elements such as Titles, Meta tags, Headings, and image Alt attributes. In at least 65% of the SEO audits we perform, the sites are missing Meta tags or Headings. While it is true that these elements are not as important as they were in the past, a site without them is clearly deficient compared to the top ranked sites, which makes it that much more difficult to rank well. Ecommerce sites, while they have improved significantly, usually have the most page construction deficiencies.
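As a simple illustration of what such an audit looks for, the sketch below checks a page’s HTML for the elements listed above; the regular expressions and sample markup are illustrative assumptions, not a complete audit.

    # A minimal on-page element check, roughly in the spirit of the audits
    # described above.
    import re

    CHECKS = {
        "Title":            r"<title>\s*\S",
        "Meta description": r'<meta\s+name=["\']description["\']',
        "Heading (h1)":     r"<h1[\s>]",
        "Image alt text":   r'<img\b[^>]*\balt=["\'][^"\']+["\']',
    }

    def audit_page(html):
        return {name: bool(re.search(pattern, html, re.IGNORECASE))
                for name, pattern in CHECKS.items()}

    sample_html = ("<html><head><title>Blue Widgets</title></head>"
                   "<body><h1>Blue Widgets</h1><img src='w.jpg'></body></html>")
    for element, present in audit_page(sample_html).items():
        print(f"{element}: {'present' if present else 'MISSING'}")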

Content

Content and link popularity continue to be major ranking factors in terms of keyword relevancy. An SEO-ready website should have enough content and link popularity compared to its peers to be considered “relevant”.

Recently we performed an SEO audit and found the client’s site had 200 pages of content. As expected the content was heavily product centric, with little non-product oriented content. We also discovered the client’s content was duplicated elsewhere: the client had allowed its affiliates to repurpose the product content for their own use. While this helped the affiliates, it hurt the client, which now lacked unique content to differentiate its site from its 1,200 affiliates.

A competitive analysis of the top ranked sites on Google, Yahoo and MSN found that the top 10 averaged 2,800 pages of content, or 14 times more content than the client. Most of the content on these sites was non-product centric. In fact 70% of the sites had blogs or forums, and a few had both.

Link Popularity

In terms of link popularity the same client site had 5,000 inbound links and a PageRank of 3, which on the surface sounds impressive. A link analysis found that 69% of those 5,000 links had anchor text using some version of the company name, and another 12% were image links. Less than 5% of the anchor text from the inbound links contained any keywords from the SEO campaign.
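For readers curious how such a breakdown is produced, here is a minimal sketch that classifies exported anchor text into brand, image and keyword buckets; the anchor list, brand terms and target keywords are hypothetical examples.

    # A minimal sketch of the anchor text breakdown behind a link analysis
    # like the one above.
    def classify_anchors(anchors, brand_terms, keywords):
        counts = {"brand": 0, "image": 0, "keyword": 0, "other": 0}
        for anchor in anchors:
            text = anchor.lower().strip()
            if text == "":                      # image links often export with empty anchor text
                counts["image"] += 1
            elif any(term in text for term in brand_terms):
                counts["brand"] += 1
            elif any(kw in text for kw in keywords):
                counts["keyword"] += 1
            else:
                counts["other"] += 1
        total = len(anchors) or 1
        return {label: round(100.0 * n / total, 1) for label, n in counts.items()}

    anchors = ["Acme Widgets", "acme widgets inc", "", "click here", "blue widgets"]
    print(classify_anchors(anchors, brand_terms=["acme"], keywords=["blue widgets"]))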

In contrast, the competitive analysis found the top ranked sites averaged 254,000 inbound links and a PageRank of 6. Yes, 254 thousand. These companies have been in business for years and as a result have built an amazing number of links from product reviews, public relations, and trial promotions.

Web Server Configuration

An SEO-ready website does not require major time or resources to fix the web server. “What do you mean, fix the web server?” questions a business owner. Whether you use Internet Explorer, Firefox or Safari, the browser masks a lot of errors that originate from the web server. For example, we recently performed an SEO audit and found the site had implemented a URL rewrite incorrectly. As a result every page performed a 302 redirect and would likely be interpreted by the search engines as a potentially fraudulent website. A 302 redirect is a temporary redirect, usually indicating that a page has moved temporarily. While this can occur from time to time, it is not a best practice for every single page in a site to be temporarily redirected. Today we are monitoring over 20 issues that originate from web server configuration.
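To see how such a problem surfaces in an audit, here is a minimal sketch using only Python’s standard library: it issues a HEAD request without following redirects and reports the raw status code, so a site-wide 302 problem shows up immediately. The URLs are hypothetical examples.

    # Check the raw HTTP status of a URL without following redirects.
    import http.client
    from urllib.parse import urlparse

    def redirect_status(url):
        parsed = urlparse(url)
        conn = http.client.HTTPConnection(parsed.netloc, timeout=10)
        conn.request("HEAD", parsed.path or "/")
        response = conn.getresponse()
        location = response.getheader("Location")
        conn.close()
        return response.status, location

    for url in ["http://www.example.com/", "http://www.example.com/old-page"]:
        status, location = redirect_status(url)
        if status == 302:
            print(f"{url} -> temporary (302) redirect to {location}: review this")
        elif status == 301:
            print(f"{url} -> permanent (301) redirect to {location}")
        else:
            print(f"{url} -> {status}")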

If your website is not SEO-ready, our first goal is to make the site ready for a successful SEO campaign. It is not uncommon for a website to have problems in all the above areas: Site Architecture, Page Construction, Content, Link Popularity and Web Server Configuration.

Making site architecture and page construction changes within static sites is often much easier than within dynamically generated sites. Nonetheless, site architecture issues often take one to two months to implement. Page construction changes range from as little as a few hours to several weeks depending on the size of the site.

Web server issues may take as little as 5 minutes or as long as 5 months. Yes, 5 months. We worked with a major company and it took 5 months to get a web server problem addressed: no less than 4 weeks for the client to confirm the problem internally, 6 weeks to determine whether the problem could be resolved, and the balance to get the fix implemented and into the queue for a site release.

For most clients, content and link deficiencies are the critical path to success. Usually developing content and links becomes an ongoing process.

For the average small to medium sized business, addressing Site Architecture, Page Construction, and Web Server Configuration issues takes about 1-2 months. Content and links typically add another 2-4 months. Ideally these items would be implemented in parallel, but clients often cannot dedicate enough internal staffing resources.

So make sure your site is SEO-ready. If you are not sure, invest in an SEO Audit. It often proves to be an excellent investment before starting an SEO campaign.


Comments Off on Is Your Website SEO-Ready?
30
Mar
Christopher M Rogers asked:

Building external links is one of the vital things SEO technicians do to get their clients to the top of their search engine listings. These links are important to promote SEO for a site, but it’s also important to look at the internal links – those links that are already located within your website.

Internal links sometimes get overlooked, but they can help your SEO immensely. You really should be taking care of your internal links before working on external ones. There are times when a site with a better internal structure gets ranked higher than one with a poor internal structure but a number of external links. The great thing about internal links is that they come from an authoritative site: your own. Authority is something that external links sometimes lack.

Internal links are used by search engine spiders to see what is on all of your pages, so you’ll want to build the content on your website using the keywords you’re targeting throughout it. Increasing the PageRank of your internal pages increases your site’s ranking overall.

If your website has been on the internet for a good amount of time, it tends to get better SEO than others because it has a history of internal links that search engines use. If your website is newer, it will take time for the results to take effect, but it can definitely be ranked higher through internal links if done correctly.

Also, remember that keywords should go not only into your content but also into file names. SEO keywords should cover code, tags, anchors, picture files, folder names, etc.; it isn’t just limited to the content of your website.

If you place an image on your website named “1023940.jpg”, the search engines aren’t going to get anything out of it. But if you name it “onlinebirthdaycards.jpg” for your online birthday cards website, the search engines will be able to pick that up and rank you better.
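A quick way to spot that problem in bulk is to scan your image folders for purely numeric file names; the sketch below does exactly that, with the directory path as a hypothetical example.

    # Flag image files with non-descriptive (numeric-only) names, in the
    # spirit of the "1023940.jpg" example above.
    import os
    import re

    IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif"}

    def find_nondescriptive_images(root):
        flagged = []
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                stem, ext = os.path.splitext(name)
                if ext.lower() in IMAGE_EXTENSIONS and re.fullmatch(r"[\d_-]+", stem):
                    flagged.append(os.path.join(dirpath, name))
        return flagged

    for path in find_nondescriptive_images("./images"):
        print("Consider a keyword-rich name for:", path)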

Internal linking is important to get all of your pages found by search engine spiders. It’s useless to have 40 pages throughout your website if only six of them are being found. Keep internal linking in mind in your SEO efforts and internet marketing techniques.
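As a rough sketch of how you might measure that, the snippet below follows internal links from the home page across a hypothetical set of pages and reports how many are actually reachable; a real audit would crawl the live site and parse the HTML properly.

    # A minimal internal link crawl over a set of saved pages, showing how
    # many pages are reachable by following links from the home page.
    import re

    def reachable_pages(pages, start):
        seen, queue = set(), [start]
        while queue:
            url = queue.pop()
            if url in seen or url not in pages:
                continue
            seen.add(url)
            # Collect href targets that point at pages on the same site.
            queue.extend(re.findall(r'href="(/[^"]*)"', pages[url]))
        return seen

    site = {
        "/":                     '<a href="/products">Products</a> <a href="/about">About</a>',
        "/products":             '<a href="/products/blue-widget">Blue widget</a>',
        "/products/blue-widget": "No further links",
        "/about":                "No further links",
        "/orphan":               "This page is never linked to internally",
    }
    found = reachable_pages(site, "/")
    print(f"{len(found)} of {len(site)} pages reachable:", sorted(found))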


Comments Off on Build Stronger SEO With Internal Links
22
Dec
Matt Garrett asked:

“Arachnophobia” was a U.S. summer blockbuster back in 1990, starring Jeff Daniels and John Goodman, as well as an improbably large special-effects generated spider, which had hitched a ride in the coffin of one of its victims, from the Venezuelan rainforest to a small Californian farming town.

Arachnophobia is also the clinical term for the fear of spiders. Arachnophobics can also dread getting close to areas which might hide spiders.

One would assume that Arachnophilia would be the clinical opposite of “arachnophobia”, referring to those who collect spiders or raise them as pets. Or perhaps it could be used by the media to refer to the groupies of Tobey Maguire, from his role in the Spiderman movies.

But for the purposes of this article we will use it to refer to those owners of websites who are forever searching for ways to get the search engine spiders to visit their websites.

Many people who have websites do not build them themselves and do not fully understand how they are constructed. It’s really fairly simple though.

Everything that is visible in a web browser when you visit a website, including the font size, colors and styles (underline, bold, italic etc.) appears as it does because of “coded” instructions given to it by the site designer.

These standard “codes” are enclosed by pairs of “tags” which tell the web browser displaying the site how it should appear to the visitor. These “tags” consist of angle brackets, < and >, placed at the beginning and end of each section of text.

The World Wide Web was the brainchild of Tim Berners-Lee, of the European Laboratory for Particle Physics, and its creation had nothing to do with particle physics; it was instead designed as a medium to easily store, access, update and ultimately share vast amounts of data.

Tim Berners-Lee was building on the concept of hypertext. Hypertext, when it was originally created, referred to any text that contains links which allow you to move wherever you want within its parameters, without having to do so in strict sequence.

In 1990 Berners-Lee wrote the first version of the “HyperText Markup Language”, now known simply as “HTML”, which is the code from which all present day text based webpages are made.

So What’s W3C got to do with this?

The links in a web page would not always take the user to the page or data they wished to access, because different formats of hypertext were being used. In other words there was more than one protocol, and not all of them matched up.

So in 1994, to help establish true World Wide Web intercommunication, Berners-Lee and other WWW pioneers established the World Wide Web Consortium, or W3C.

In the past thirteen years W3C has set many voluntary standards for HTML used to build webpages, enabling those website designers who choose to adopt them to build websites which will be accessible by any computer operating system. It’s because of the overwhelming acceptance of these W3C standards that we now have such a reliable and universally useable Internet.

This is why you can view the same web page in many different internet browsers and still make sense of it, whether you are using Firefox, IE, Netscape, Mozilla, or Opera.

W3C HTML has since been enhanced with CSS, and will eventually be surpassed by W3C’s XHTML.

So just how does W3C compliance help to get your website noticed and better indexed for SEO by the search engines?

Well, that’s where we come back to the spiders. If you want to do well in SEO terms, you need to be a good arachnophiliac and make your website prime spider-attracting real estate.

Google keeps tabs on over eight billion web pages and does so with several different “bots”, aka “crawlers”, aka (to maintain the integrity of our metaphor) “spiders”.

These include DeepBot, FreshBot, MediaBot, AdsBot, ImageBot, GoogleBot-Mobile, and Feedfetcher-Google, which for some reason has been excluded from the “Bot” club. That means Google has eight different types of spider scuttling around the World Wide Web, deciding what is worth adding to its 8 billion pages and what’s not.

Now if you were one of the itsy-bitsy spiders assigned to crawl over, examine and make decisions about all those web pages, you might just feel a bit overwhelmed. If you could rule out some of those pages as not up to scratch for any reason, you might just be tempted to do so, right?

Well, if you couldn’t read the content of a page easily, that would probably do it. The spiders have been trained to read W3C-compliant HTML (or CSS or XML) code, and if a site is coded in something else, or contains errors in its code, the spiders aren’t going to like it quite so much and may go looking elsewhere, which you don’t really want to happen.

So making your website code W3C compliant keeps those spiders happy and encourages them to come back to your site again and again. Hence SEO and W3C go hand in glove; just check the glove for spiders before putting it on. 🙂

Just remember that spiders do not see what a human visitor sees when they look at your website. Web browsers can make allowances for badly written code and still bring up a page that looks pretty much the way the designer intended. But the spiders see the code in the webpage and will know whether it’s W3C compliant or not.

In other words, if you want to maximize the search engine optimization of your site, take the time to verify your website’s compliance with W3C standards.

You can start by submitting your URL to http://validator.w3.org for a check. If your site scores what you think is an unreasonable number of errors, have your website source code gone over and brought into W3C compliance by an expert.
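If you want to script that check, the sketch below queries the validator and reads the summary headers it has historically returned (X-W3C-Validator-Status and X-W3C-Validator-Errors); treat the exact endpoint and header names as assumptions and confirm them against the validator’s current documentation.

    # A minimal sketch of an automated W3C markup validation check.
    from urllib.parse import quote
    from urllib.request import urlopen

    def w3c_check(page_url):
        # Endpoint and header names are assumptions based on the validator's
        # historical API; verify against current documentation.
        check_url = "http://validator.w3.org/check?uri=" + quote(page_url, safe="")
        with urlopen(check_url, timeout=30) as response:
            status = response.headers.get("X-W3C-Validator-Status", "Unknown")
            errors = response.headers.get("X-W3C-Validator-Errors", "?")
        return status, errors

    status, errors = w3c_check("http://www.example.com/")
    print(f"Validator status: {status}, errors: {errors}")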

The first, and non-paying, visitors to your website will normally be the search engine spiders; make sure they want to come again. Making your site W3C compliant will give you the best chance of that.

Making W3C compliance part of your SEO strategy will mean your human visitors will not be far behind!


Comments Off on SEO and W3C: Spider Tracks