
SEO for WordPress

Posted 25th November 2014 at 03:52 PM by imarkedy

As we continue our SEO for WordPress website series, we arrive at site structure. Once you have the basic structure of your web site in place, you can implement several advanced structural considerations to optimize your site for search engines. For example, beyond setting up your web site so it is indexed, you may want to instruct search engines not to index certain pages. A robots.txt file lets you tell search engine spiders what they may and may not do when they arrive at your domain. A robots.txt file also helps you discourage potential copyright infringement and keep search engine spiders from consuming excessive amounts of bandwidth on your server(s).
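As a minimal sketch, a robots.txt file placed in your site's root might look like the following (the directory names and the "BadBot" user agent are hypothetical examples, not required WordPress settings):

```
# Rules for all spiders: stay out of the admin area
User-agent: *
Disallow: /wp-admin/

# Block one specific misbehaving spider from the whole site
User-agent: BadBot
Disallow: /
```

Well-behaved spiders read this file before crawling anything else on your domain, though it is a request rather than an enforcement mechanism.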

One primary example of advanced web site structuring is the use of the nofollow attribute. The nofollow attribute tells search engines not to follow a particular link or treat it as significant when determining rankings. Because search engines count a link from your web site to another web site as a vote for ranking purposes, you can add nofollow when you do not want the search engine spider to credit the link.
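In the page's HTML, the difference is a single rel attribute on the link (the URL here is just a placeholder):

```html
<!-- A normal link passes ranking credit to the target site: -->
<a href="https://example.com/">A site I endorse</a>

<!-- A nofollow link asks search engines not to credit it: -->
<a href="https://example.com/" rel="nofollow">A paid or untrusted link</a>
```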

A second advanced web site structural consideration is the way you structure URLs (short for uniform resource locators). URLs should be structured so that they are easily spidered and create a user-friendly navigation system. For example, search engines, like people, prefer URLs that are simple and that include keywords describing the page content.
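To illustrate with a hypothetical domain and post, compare WordPress's default query-string URL with a keyword-rich permalink (in WordPress you switch to the latter under Settings, Permalinks, using a structure such as /%postname%/):

```
Hard to read and spider:
  http://www.example.com/?p=123

Keyword-rich and user friendly:
  http://www.example.com/wordpress-seo-site-structure/
```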

A third structural consideration is the use of an .htaccess file. This is a per-directory configuration file for the Apache web server. It is a straightforward yet powerful text file that can accomplish a wide variety of functions, including protecting your web site from content-stealing robots. Moreover, .htaccess is useful because it lets you dynamically rewrite the poorly formed URLs that shopping cart and blog software often generate.
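A brief .htaccess sketch of both uses, built on Apache's mod_rewrite module (the "BadBot" user agent, the post ID, and the clean permalink are hypothetical examples):

```apache
RewriteEngine On

# Refuse requests from a known content-scraping bot (403 Forbidden)
RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
RewriteRule .* - [F,L]

# Permanently redirect an ugly query-string URL to a clean permalink
RewriteCond %{QUERY_STRING} ^p=123$
RewriteRule ^$ /wordpress-seo-site-structure/? [R=301,L]
```

The 301 redirect matters for SEO because it tells spiders the clean URL is the permanent home of the content, consolidating ranking credit there.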

Content Creation
Creating well-written, original content is absolutely critical to your long-term search engine optimization success. Content is what visitors use to determine value and one of the primary factors that search engines use to rank your web site. Whether your web site ends up on the first page or the one-hundredth page of Google largely depends on the quality and relevance of your content.

Although you should keep SEO principles in mind when you create content, the key to building long-term rankings is writing content for people, not search engines. Original and naturally flowing content gives your readers a positive, enjoyable user experience and greatly improves your chances of top search engine rankings. Avoid creating content solely to please search engines and you'll greatly increase your chances of long-term ranking success.

When you write content, you must avoid duplicate content, which occurs when your web site contains content that already exists elsewhere on the Internet. Duplicate content has a detrimental effect on your SEO success and should be avoided at all costs. Writing original content is the most obvious way to avoid this pitfall. If you don't have the time to build large amounts of unique content, you can add features to your web site, such as user reviews, that generate user-created content; best of all, these reviews are free. User-generated content keeps your content fresh, which is one of the factors search engines use to rank one web site over another.

Writing original content and adding user-generated content doesn't protect you from duplicate content issues entirely. You must protect yourself from others stealing your content because Google can't, thus far anyway, reliably detect who owns content. One way to establish your ownership of content is to post it and make sure it is indexed by spiders before others can copy it. Fortunately, tools such as Copyscape help you detect and prevent duplicate content issues.

Although you should write content for people and not search engine spiders, you should also use topic-descriptive keywords throughout your web page content. First, optimize each page of your web site for one or two target keywords, while making sure you do not inadvertently repeat non-target keywords. By using keyword research tools, you can incorporate a substantial number of keywords throughout your content without compromising its natural flow.

You should keep a few important content creation principles in mind as you build your web site. First, search engine spiders cannot read text embedded in images. Therefore, always describe an image, including any target keywords, with the image's alt attribute (commonly called the 'alt tag'). Second, when drafting your content, you can apply a powerful content creation principle called latent semantic indexing, which involves using keywords closely related to your keyword targets to reinforce the theme and relevance of your web page.
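The alt attribute sits directly on the image tag; the filename and description below are hypothetical examples:

```html
<!-- Spiders can't read the text drawn inside this image,
     so the alt attribute describes it with a target keyword: -->
<img src="wordpress-seo-diagram.jpg" alt="WordPress SEO site structure diagram" />
```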

Brian Dale
Keyword Research Coach


