I then got the note below. I need someone to help me make sure my site complies with the points it lists.
I reviewed your account XXXXX and can see that it has been
suspended. Also I saw the correspondence between you and Rey. So I can
confirm that as Rey mentioned the site tweetbuddy.com is violating our
Webmaster Guideline Policy under Unacceptable Business Practices, and was
consequently disabled for being a low quality site. In an effort to offer
our users a high level of quality sites, it is important that all sites
within our network offer significant value by providing unique and …
The web designer apparently didn't put any text in the HTML, etc.
Things to check/fix
- Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
- Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.
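As a sketch of the last part of that point, a long list of sitemap links can be split into fixed-size pages before the HTML is generated. This is a hypothetical Python sketch; the URLs and the page size are made up, not taken from the site:

```python
# Hypothetical sketch: split a long list of sitemap links into
# multiple sitemap pages of at most `per_page` links each.
def paginate_sitemap(urls, per_page=100):
    """Return a list of pages, each holding at most per_page URLs."""
    return [urls[i:i + per_page] for i in range(0, len(urls), per_page)]

# Example: 250 links become three sitemap pages (100 + 100 + 50).
links = [f"/article/{n}" for n in range(250)]
pages = paginate_sitemap(links)
```

Each inner list would then be rendered as one sitemap page, with "previous/next" text links between the pages.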
- Keep the links on a given page to a reasonable number.
- Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
- Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
- Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the "ALT" attribute to include a few words of descriptive text.
- Make sure that your <title> elements and ALT attributes are descriptive and accurate.
- Check for broken links and correct HTML.
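One way to audit a page against the two points above is to parse its HTML and flag `img` tags that have no ALT text. A minimal sketch using only the Python standard library; the sample markup is invented for illustration:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collects the src of every <img> tag missing a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "?"))

audit = AltAudit()
audit.feed('<img src="logo.png"><img src="chart.png" alt="Monthly traffic chart">')
# audit.missing_alt now lists only "logo.png"
```

The same parser pattern extends naturally to collecting `href` values for a broken-link check.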
- If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
- Review our image guidelines for best practices on publishing images.
- Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.
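The point above can be sketched as URL normalization: strip session and tracking parameters before links are emitted for (or requested by) crawlers, so equivalent URLs look identical. A hypothetical Python sketch; the parameter names are assumptions, not something the site necessarily uses:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking parameters; adjust to whatever your site actually uses.
TRACKING_PARAMS = {"sessionid", "sid", "ref"}

def canonical_url(url):
    """Drop session/tracking parameters so equivalent URLs look identical."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

# canonical_url("http://example.com/page?id=7&sessionid=abc")
# → "http://example.com/page?id=7"
```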
- Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.
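The decision the server makes for that header can be sketched in a few lines: compare the client's If-Modified-Since date against the resource's last-modified date and answer 304 Not Modified when nothing has changed. A standard-library sketch, assuming both values are well-formed HTTP-date strings:

```python
from email.utils import parsedate_to_datetime

def status_for(if_modified_since, last_modified):
    """Return 304 if the content is unchanged since the client's copy, else 200.

    Both arguments are HTTP-date strings, e.g. "Sat, 29 Oct 1994 19:43:31 GMT";
    if_modified_since may be None when the client sent no such header.
    """
    if if_modified_since is None:
        return 200
    client_copy = parsedate_to_datetime(if_modified_since)
    server_copy = parsedate_to_datetime(last_modified)
    return 304 if server_copy <= client_copy else 200
```

A 304 response carries no body, which is where the bandwidth saving mentioned above comes from.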
- Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit The Web Robots Pages to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools.
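Besides the Webmaster Tools analysis tool mentioned above, a robots.txt can also be sanity-checked offline with Python's standard-library parser before it goes live. The rules below are only an illustration; the disallowed paths are assumptions, not this site's real structure:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content; the paths are hypothetical.
ROBOTS_TXT = """\
User-agent: *
Disallow: /search
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

parser.can_fetch("Googlebot", "http://example.com/about.html")   # allowed
parser.can_fetch("Googlebot", "http://example.com/search?q=x")   # blocked
```

Running a check like this for your most important URLs guards against accidentally blocking Googlebot from pages you want indexed.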
- Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.
- If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.
- Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
- Test your site to make sure that it appears correctly in different browsers.
- Monitor your site's performance and optimize load times. Google's goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve.
- Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest, or other tools. For more information, tools, and resources, see Let's Make The Web Faster. In addition, the Site Performance tool in Webmaster Tools shows the speed of your website as experienced by users around the world.