It is a well-known fact in web designer circles, and even among the wider web user community, that Internet Explorer 6 (or rather its continued endemic use, especially by large corporations) is one of the main reasons, if not the main reason, that the web is being held back from reaching its full potential on the desktop. It’s fair to say that IE6 has accounted for a truly vast amount of ‘wasted’ web designer hours spent figuring out its many quirks and bugs. This is well documented elsewhere, so I won’t go into detail here.

I thought that the latest Internet Explorer release, IE8, was a huge step in the right direction from Microsoft, and indeed it might be, at least in comparison to its predecessors. However, a quick experiment using the CSS3 Selectors Test on the css3.info site reveals that IE8 itself leaves a lot to be desired in terms of compliance with the latest W3C standardisation efforts. Whilst no part of CSS3 has reached W3C Recommendation status at the time of writing, there is considerable effort by most browser vendors to implement it and a fair amount of excitement amongst developers and designers about what it offers.

So, running the aforementioned tests using Firefox 3.6 on Windows tells me:

From the 43 selectors 43 have passed, 0 are buggy and 0 are unsupported (Passed 578 out of 578 tests)

Using Google Chrome 4.0.249.89 again on Windows, I get:

From the 43 selectors 43 have passed, 0 are buggy and 0 are unsupported (Passed 578 out of 578 tests)

Internet Explorer 8 gives me:

From the 43 selectors 22 have passed, 1 are buggy and 20 are unsupported (Passed 349 out of 578 tests)

Oh well.

Whilst this is hardly a complete or particularly scientific study, it gives another reason why IE should be no one’s browser of choice. Microsoft push updates to my PC all the time via Windows Update, so why not use that mechanism to keep IE in line with what other browsers are doing in terms of standards compliance?

I’ve come to the conclusion that giving IE users a slightly degraded experience (i.e. the site might not look quite as good but is still fully functional) as and when required is the best approach to take. This is quite easy to achieve and might have the side effect of making IE users realise that it’s time they changed their browser.

For information on hashtags on Twitter, please read this introductory guide to Twitter hashtags.

An interesting thing about Twitter hashtags is that they aren’t officially supported by Twitter. Hashtags were developed and introduced by Twitter users, independently of Twitter itself.

Currently you can’t actually follow Twitter hashtags as such through the main official Twitter website. You cannot sign up to receive all tweets that contain a particular Twitter #hashtag. The only way you can currently be sent tweets automatically is by choosing to follow another Twitter account. Twitter #hashtags are not Twitter accounts, so they cannot be followed, and there is no follow button available for #hashtags (although it would be good if there were the option!). A Twitter hashtag is merely a tag or label for a chosen keyword or abbreviation, and it can currently only be tracked by searches through Twitter or through third-party sites such as www.hashtags.org.

If you just want to check once in a while, this can be achieved easily with manual #hashtag searches. You could use the www.hashtags.org site, or you can use the official Twitter site by going to Twitter Search. You can revisit the page, or leave it open and keep refreshing it.
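
If you prefer, you can build the search web address yourself. Below is a minimal Python sketch that URL-encodes a hashtag (the # becomes %23 in a web address) and opens the corresponding Twitter search page in your browser. The exact search URL pattern is an assumption here, so compare it with what you see in the address bar when you run a search manually.

    import urllib.parse
    import webbrowser

    def hashtag_search_url(tag: str) -> str:
        # Encode the hashtag so the # survives as %23 in the query string.
        query = urllib.parse.quote("#" + tag.lstrip("#"))
        # Search URL pattern assumed for illustration; verify against a manual search.
        return "https://twitter.com/search?q=" + query

    print(hashtag_search_url("rugby"))           # e.g. https://twitter.com/search?q=%23rugby
    webbrowser.open(hashtag_search_url("Rdg"))   # open the live search; refresh it to see new tweets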

If you want or need to track a #hashtag for a longer time, be it a few hours, days or longer, there are other tools available. Searches on Google or Bing will find lots of independent third-party tools for using and monitoring Twitter. These are not just for PCs or Macs, but also for handheld devices such as the BlackBerry or iPhone.

Some of these facilitate ‘column-based tracking’, which allows you to set up columns to track tweets based on Twitter #hashtag search terms. An example of this is TweetDeck, which runs on your computer.

With TweetDeck you can configure up to 10 columns to follow tweets according to your specified criteria. This could be all those you follow on Twitter, a subset or group of those you follow, or the ongoing results of a Twitter hashtag search. So if you search for #rugby via TweetDeck, a column will appear showing you all the latest tweets using that Twitter #hashtag, and it will automatically update for you. You can add, delete or reconfigure columns at any time.

There are also configurable web-based Twitter tracking tools that offer similar capabilities (just search on Google or Bing). We will add an article in the future about web-based Twitter tracking tools, so make sure that you regularly visit the Gatorweb Gator Web Design UK blog.

Please follow Gator Web Design on Twitter; we automatically tweet our new posts and articles.

Written by Gator Web Design – Web design in Reading Berkshire UK

Twitter Hashtags are a very useful and powerful tool that can help spread your tweets and gain followers.

Essentially, in a tweet you put a # immediately before a keyword or abbreviated code to create a Twitter hashtag. For example, if a tweet is about something relating to rugby, a Twitter hashtag of #rugby could be used. Similarly, #Rdg is often used by those discussing something about the English town of Reading.

Users of Twitter or of sites such as www.hashtags.org can then conduct a real-time search that picks up all tweets containing the Twitter hashtag used in the search. (Please note you should include the # in the search, for example #rugby.) This can be very useful for finding tweets about a subject of your choice that have been produced by someone you are not yet following. Conversely, it can help you get found, and then followed, by other Twitter users.

Please follow Gatorweb and Gator Web Design on Twitter for UK SEO web design and search engine optimisation tips and help.

Article written by Drobs from Gator Web Design in Reading Berkshire UK

Tweet or twit? Understanding Twitter – the basics

Is there any point to Twitter? Is it worth using? Can it help my web business? These are all common questions about Twitter asked by clients of Gator Web Design, an SEO company in Reading, UK.

I think it’s worth providing some brief information about Twitter and how it works. Essentially, you set up a Twitter account and then send your tweets, which are messages of up to 140 characters.

These can be viewed by anyone who clicks on your Twitter profile, and they are automatically sent to anyone who has chosen to follow you. Therefore, in order for your message to reach people, you are reliant on people either choosing to click your profile or deciding to follow you. A third way is for someone to receive your message as a retweet: one of your followers resends your message to their own followers. A retweet is a good achievement because your message is spread wider, and hopefully some of the recipients will subsequently choose to follow you. You can hope people like what you have tweeted and want to share it via a retweet, or you can actually ask; the Twitter slang for this is ‘pls RT’.

Why would people choose to follow you? Either they know you, or they are likely to be drawn to you by your Twitter ID/name. They could have found you via a search for your name or a word contained within it, they might have seen your name when looking through who is following, or being followed by, another user, or they may have received a retweet. It is worth emphasising that anyone can view who is following or being followed by another user. Therefore, if you are using an account for business purposes, it is probably sensible not to follow accounts that could have an adverse effect on your business image; leave those for your personal account. Followers matter less in this respect, because anyone can choose to follow anyone they want, so I would consider all followers to be good followers.

Therefore it is recommended that you choose a catchy name that reflects what you’re about and the content of your tweets. Take for example the tweets in respect of a site we run: www.caversham.info. The Twitter ID, cavershaminfo, is virtually identical and tells people who see it that the tweets are likely to include information about Caversham, a town in Berkshire, England. This will attract anyone who comes across us among another user’s followers or followed accounts, or who searches for Caversham.

A quick way of expanding your follower base is to follow other accounts that are closely related to your own. If, for example, your tweets are about rugby, it is a good idea to follow other rugby accounts; people might then find you by looking at who is following those accounts. It might be slow to start with, but things will hopefully speed up. A fair bit of time and effort will be required at first, but then you should start to gain followers organically.

When you send out tweets it is good to include a link back to your site, to draw visitors in and so that they don’t have to exert much effort finding your website. Website URLs can quickly and easily be shortened using a service such as bit.ly or TinyURL.com so that you minimise the number of characters you use from your 140 limit.
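
As an illustration, here is a short Python sketch that shortens a long address using TinyURL’s simple creation endpoint, which returns the shortened URL as plain text. The endpoint address and the example page are assumptions for illustration, so do check them before relying on this.

    import urllib.parse
    import urllib.request

    def shorten(long_url: str) -> str:
        # TinyURL's basic creation endpoint (assumed here) returns the short URL as plain text.
        api = "https://tinyurl.com/api-create.php?url=" + urllib.parse.quote(long_url, safe="")
        with urllib.request.urlopen(api) as response:
            return response.read().decode("utf-8").strip()

    # Example page address is illustrative.
    print(shorten("http://www.caversham.info/"))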

You can also create tags of keywords using hashtags, by placing a # in front of the keyword. If someone then searches for the keyword via Twitter, the search is likely to bring up your tweet. For example, if discussing Reading (the town in Berkshire), you might use #Rdg, and if talking about Caversham you might use #Caversham.

If you are running a blog, it is possible to install a plugin, such as WP to Twitter for WordPress, that automatically tweets new posts. This is very useful as it automates tweets and keeps followers updated about changes to your site.

You can follow Gator Web Design on Twitter. Gator Web Design undertakes ongoing SEO campaigns for its clients in Reading and across the UK.

Drobs - Gator Web Design – Web Design and SEO services in Reading Berkshire, UK

If you have been running and updating your website consistently for a couple of years, it’s likely that you’ve secured some reasonable traffic from the search engines and obtained a decent Google page rank.

An important thing to remember is that just because you’ve got traffic, it doesn’t mean that it will increase or even continue. There are various factors that can cause your site to fall down, or drop out of, the search engine rankings completely.

So what should you do in order to maintain or improve your search engine rankings? Some legitimate sites are being mistaken for spam sites by searchbots, so care needs to be taken to avoid this. Below are a couple of good, simple tips.

1) Be careful who you link your website to.

While backlinking is one of the most important things in terms of search engine optimisation (SEO), you need to be careful who you are linking to. If you link to sites that are considered spam sites or bad neighborhoods, you are likely to see a drop in your search engine results/positions.

This is because the search engines’ searchbots assume that you are linking just to increase your search engine ranking (which may be true, but the links need to at least appear useful and relevant). Avoid linking to sites and pages that are mostly a list of links. Many of these sites are clearly spammy and also link to many other questionable sites.

If you want to put your link in a link directory, go for high-quality link directories with a Google PageRank of at least 2, especially if those directories require a reciprocal link back to their site. Personally I object to paying for links, but some consider it worthwhile because they believe attaching a cost to a link filters out most of the spammers.

The penalty attributed to linking to dodgy sites can be severe so be selective. If you aren’t sure about a site, you can check it out by using online tools that tell you if your website is linking to a bad neighborhood, such as http://www.bad-neighborhood.com/text-link-tool.htm

2) Make sure your keyword density isn’t too high

Your keyword density is an important factor that affects your search engine ranking positions. Search engines determine the relevance of your website to a particular subject by the use of keywords and phrases in your web page text. The higher the density of a keyword, the more relevant the web page appears for that keyword’s subject. However, if the keyword density is too high, it can work against you: the searchbots will deem that you are trying to spam the search engines and will penalise your site for keyword spamming or keyword stuffing.

Content written in a conversational tone tends to produce a good keyword density and leave you with good-quality content. A rough rule to follow is that your keyword should never appear in more than half of the sentences on the page; a better approach is to have it appear in no more than a third of your sentences. You should also use synonyms and other related keywords or phrases within the pages to add relevance (others may search using keywords that are close or similar to the ones you have selected as important). Essentially, I would not recommend a keyword density above 3%. Sites like http://www.keyworddensity.com can be useful for checking keyword density.
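
If you want to check these rules of thumb yourself, they are easy to script. Here is a minimal Python sketch (the file name and keyword are illustrative) that reports a keyword’s density as a percentage of all words on the page, plus the share of sentences containing it, which you can compare against the ‘under 3%’ and ‘no more than a third of sentences’ guidelines above.

    import re

    def keyword_density(text: str, keyword: str) -> float:
        # Share of all words on the page that are the keyword, as a percentage.
        words = re.findall(r"[a-z0-9']+", text.lower())
        hits = sum(1 for w in words if w == keyword.lower())
        return 100.0 * hits / len(words) if words else 0.0

    def sentence_share(text: str, keyword: str) -> float:
        # Fraction of sentences that mention the keyword at least once.
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        hits = sum(1 for s in sentences if keyword.lower() in s.lower())
        return hits / len(sentences) if sentences else 0.0

    page_text = open("page.txt", encoding="utf-8").read()   # plain text copied from your page
    print(f"keyword density: {keyword_density(page_text, 'caversham'):.1f}%")          # aim for under 3%
    print(f"sentences containing keyword: {sentence_share(page_text, 'caversham'):.0%}")  # aim for under a third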

There are many simple things you can do to ensure your site is assessed as legitimate, and most are common sense. The above two are among the easiest and quickest ways to protect, maintain or improve your search engine positions. Keep visiting the Gatorweb Blog for more search engine optimisation and SEO tips.

Gator Web Design Reading Berkshire

In addition to optimising ‘normal’ web pages, it is also important to optimise other page types.

For example, lots of sites include documents such as Word documents (.doc), Adobe PDF documents (.pdf), spreadsheets (.xls), video (.avi or .mpeg), etc. If these are also optimised, your overall SEO can be improved or speeded up, and pages can be ranked faster.

Below are some simple techniques that should be employed:

1. Use a text-based PDF creator. There are lots of free tools available online; Adobe Acrobat is probably the best text-based PDF creator. If a PDF document is instead created in an image-based program, the search engines will completely ignore it. If the PDF is created using a text-based creator like Adobe Acrobat, the search engine robots will read and index the text like any other web page;

2. Change the document title. The title of the PDF file is as important as the title of a normal web page. The PDF title property tells the search engine robots about the type of content. The most important aspect of the title is that Google uses the text in the title field as the link in the search engine result pages. Therefore, the title field should be keyword-rich and relevant, and should not contain random text;

3. Alter the document properties. A PDF file contains many document properties in addition to the title field. These include keywords, a description, author information and copyright information. All the fields should be completed with relevant information; however, the keyword field should not just be stuffed with keywords, nor left empty. It has not been proved that search engines give weight to the keyword field in document properties, but if this changes in the future, your PDF file will have an advantage over web pages that have not completed it (a short example of setting these properties follows this list);

4. Link to PDF files from other pages within your site. The search engine searchbots will not discover and index a PDF file if it is placed too deep within your website. To ensure that the PDF file gets crawled by the search engines, it should be visibly linked from the home page or any other page which is known to be indexed and gets crawled regularly. If your aim is to get the PDF into the top search engine result pages, then you have to lead the searchbots to it;

5. Optimise the content in the PDF file. The content of a text-based PDF file is similar to the content of a normal web page, which makes content optimisation important for PDF files too. As always, the content should be relevant to the subject matter. Keywords should be placed in the first few lines of the content, and important text can be highlighted by increasing the font size and using the bold and italic features of PDF files;

6. Place links in PDF files. There should be a provision in the file to link back to its original website. This reduces the effort required from the visitor to find the main website, and a link from the PDF file can be considered a backlink by the search engine.
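
As a brief illustration of points 2 and 3 above, here is a minimal Python sketch that sets the title, author, description and keywords properties of a PDF. It assumes the third-party pypdf library, and the file names and property values are purely illustrative.

    from pypdf import PdfReader, PdfWriter   # third-party library, assumed installed

    reader = PdfReader("caversham-guide.pdf")   # illustrative input file
    writer = PdfWriter()
    for page in reader.pages:
        writer.add_page(page)

    # Document properties: title, author, description (subject) and keywords.
    writer.add_metadata({
        "/Title": "Caversham local information guide",
        "/Author": "Gator Web Design",
        "/Subject": "Local information for Caversham, Reading, Berkshire",
        "/Keywords": "Caversham, Reading, Berkshire, local information",
    })

    with open("caversham-guide-optimised.pdf", "wb") as f:
        writer.write(f)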

To summarise, PDF files are very similar to ‘normal’ web pages in many ways. The same level of attention should be given to their optimisation in order to achieve good search engine results.

An example can be found on the Caversham.Info local website.

Gator Web Design, Reading, UK

11 Feb, 2010

Local community websites designed

Posted by drobs in Web Design

We have created a new community website for an area of Reading, Berkshire called Caversham.

Please get in touch if you would like a similar local community website.

An important way to get your website ranked high on Google for your specific keywords is to get “one-way links” from high Google PR sites that are relevant to yours. Google interprets a one-way link as a positive “vote” for your site for that keyword.

It is important to create a variety of links with as many websites as possible. A lot of the websites with high PRs will not publish your articles or links. 

However, there are hundreds of article submission websites, but probably fewer than 50 of them have high Google PRs, and those are the ones you want to target.

Recommended sites are Ezinearticles, Goarticles and Sitepronews. 

Commenting on blogs can help too; just find a relevant blog with a decent Google PR and make a comment! You should get a link back to your site from the comment. However, make sure the comment is relevant, meaningful and well written, and not likely to be deemed junk or spam.

This can be a slow process, but keep at it and progress will be seen!

Blog visitors are just as likely to find a particular article via one of its images as via the article text itself.

Images don’t just add to a site’s appearance, they can also provide additional elements for SEO. Blog owners should name all images using keywords.

This is demonstrated on the blog website for Caversham and Emmer Green.

Search engines are probably the most important tool for internet marketing, so you need to know how they work in order to do SEO successfully and bring the all-important traffic to your website.

How do they work? Every search engine works slightly differently, but they have a basic structure in common. They maintain a large amount of information on all the websites that they find; this is called indexing. They store copies of the pages themselves, and they classify each site in terms of the type of information it contains and a measure of how useful the page would be for someone searching for information on a certain topic.

The index created by the search engine is fed information from the internet using a software tool which has several names. You will see references to a web crawler, spider, or webbot and they all mean essentially the same thing. This is a tool that moves across the web, visiting web pages and reading their content in order to populate the search engine index. If you have a site that is not in a search engine index there are several ways of getting the search engine to visit and index it. You can go to the search engine and manually enter your web address and request a crawl. This is not always an effective way of getting a visit. The other way is to build links with existing sites. When the bot from the search engine next visits the existing site, it will follow the link to your site and discover it.
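
To make the link-following idea concrete, here is a minimal Python sketch of the mechanism described above: it fetches one page that is already known and collects the addresses that page links to, which is how a bot discovers new pages. It is only a sketch of the principle (the starting address is illustrative), not how any particular search engine’s bot actually works.

    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        # Collects the href value of every anchor tag, much as a crawler does.
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def discover_links(page_url: str) -> list:
        with urlopen(page_url) as response:
            html = response.read().decode("utf-8", errors="replace")
        collector = LinkCollector()
        collector.feed(html)
        # Turn relative links into full addresses the bot could visit next.
        return [urljoin(page_url, link) for link in collector.links]

    # Starting from a page that is already indexed, the bot finds the pages it links to.
    for link in discover_links("http://www.caversham.info/"):   # illustrative starting page
        print(link)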

There are in fact several parts to a search engine’s searchbot. The Googlebot has two main components: Freshbot is the part of the crawler that finds new sites, and another component, Deepbot, will then revisit the new site and do a more in-depth analysis of it. It is common for only a small part of your site to be listed in the index initially. A deeper crawl will find the rest of your pages, as long as you have structured your site correctly.

If you have information on your site that you do not wish the search engines to index, it is easy to indicate to the searchbots where you do not want them to look, using ‘noindex’ and ‘nofollow’ directives in the robots meta tags of the page. Passwords are another way of keeping searchbots out.

When you are building your site, you must consider how your pages look to searchbots in the same way that you do for your human visitors. Searchbots cannot read images, for example, so any information in an image will be missed by them; alternative text should therefore be provided for images, and that text will be picked up. Many sites use Java to generate menus, and the searchbot can miss all the links in such a menu because, to the bot, it appears as an image rather than text. Similarly, drop-down menus are very useful for humans but can be invisible to bots. The crawler will navigate your site using the links that it finds, so having a clear structure is an important part of web design for this reason. If you really want to avoid any linking problems, you can present the search engines with an XML file called a sitemap, which will guide them around your site.
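
If you do decide to provide a sitemap, the format is very simple. Below is a minimal Python sketch that writes a sitemap.xml in the standard sitemaps.org format; the page addresses are illustrative, so list your own site’s pages instead.

    from xml.sax.saxutils import escape

    # Illustrative list of pages; replace with the pages of your own site.
    pages = [
        "http://www.caversham.info/",
        "http://www.caversham.info/news/",
        "http://www.caversham.info/events/",
    ]

    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in pages:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")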

Searchbots have none of the human abilities to read a page and quickly determine what it is about, and whether it is useful. Search Engine Optimisation (SEO) is concerned with presenting pages so that it is clear to the searchbot what each page is about and that it is worth ranking well in the index. This requires a thorough understanding of how searchbots read pages, and an up-to-date picture of what specific search engines are looking for when they analyse the quality of the content on a site. As with any new and rapidly changing technology where there are fortunes to be made and lost, SEO attracts some of the best and some of the worst people involved in the internet. Be wary of anyone who cannot show you a clear track record, and avoid anyone who offers you a guarantee of a top ranking in a short time.

Good SEO companies gain their business through their own ability, i.e. they get the best search engine results for their own SEO. They do not need to cold call, so beware of emails offering the earth through their SEO services.
