White Hat vs Black Hat SEO. August 24, 2008. Posted by simarprit in Blogging, Internet, Search Engines, SEO, SES 2008, websites.
Tags: Black Box SEO, Black Hat SEO, Search Engines, SEO, SEO Best Practices, SEO Tips, SES, SES 2008, White Box SEO, White Hat SEO
Which hat do you wear, Black, White, Grey, Red or None?
The question should rather be:
Which kind of head do you carry over your shoulder – Black, White, Grey or Red?
It looks like everyone has been changing or exchanging hats. If it were the color of the head we talked about, it would be more exciting and entertaining.
Continuing with my series on SES 2008, San Jose, this post incorporates the knowledge acquired during the session – Black Hat, White Hat: Playing Dirty with SEO.
Let us define:
Black Hat: You do whatever it takes to be on top of SERPs. Good. Bad. Ugly. Whatever. Period.
White Hat: You do whatever the search engines' (read: Google's) webmaster guidelines tell you to; you don't try to outsmart search engines.
Grey Hat: You are the border guy. You know the stretch, and you think you know what you can get away with; you live white, but you know black and use it on the side.
Red Hat: You know what it takes to hurt your competitor, and you know how to get away with it.
No Hat: If you don't wear any and you are still an SEO, maybe you are an incompetent SEO. Don't worry, you are not alone: 95% of the SEOs in the industry are plain incompetent.
The session was meant to debate Black Hat vs. White Hat, but the debate didn't happen. It looked like the whole panel was wearing Grey Hats; they tended to stay on the border and just didn't want to be drawn into the controversy. Still, important observations were made:
- Established sites face far more restrictions; they can't risk exclusion from SERPs.
- Made-for-AdSense content is crap. Content needs to be good. Nobody wants trash, and certainly not the search engines.
- Searchers need good results, and most of the top results are very close to each other; if search engines intermixed the top 5 results, it wouldn't matter to the surfer.
- Buying excessive links can hurt; it is Red Hat.
- Good SEO, to me, is knowledge about search engines and about the subject.
- Meta Keyword stuffing hurts.
- Everything in the title hurts too.
- Hidden content is black.
- Short and focused titles are good.
- Paid unrelated blogging can hurt.
- Vague, self-promoting descriptions with repeated words hurt.
- Paid link buying from link farms hurts badly.
- Search engines hate doorway or gateway pages; they hate being misled.
- Search engines hate crass optimization: they are bound to hate something like this – www.example.com/HOTEL/hotel/googleCityNameHotel.html – which may go with the title <title>CityName Hotels | Hotels in CityName | Budget CityName Hotels | Luxury CityName Hotel | Cheap CityName Hotel | Discount CityName Hotels</title>. This is a live example of crass black hat: "We will do anything, even put Google's name in our URLs, hundreds of them, but we need to get to the top; we don't care what happens to our site if Google doesn't like it." I picked this example because it was succeeding in ranking high. I have kept the URL intact except for removing the site name (substituting it with example.com) and replacing the actual city name with CityName.
- A great generic domain name helps.
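To make the "short and focused titles" point concrete, here is a sketch of the contrast; the site name "ExampleTravel" and both titles are illustrative, not taken from any real site:

```html
<!-- Stuffed, crass title: repeats the keyword in every permutation and reads like spam -->
<title>CityName Hotels | Hotels in CityName | Budget CityName Hotels | Luxury CityName Hotel</title>

<!-- Short, focused alternative: one clear phrase plus the brand -->
<title>CityName Hotels - Rates, Reviews and Booking | ExampleTravel</title>
<meta name="description" content="Compare CityName hotels by price and location, read guest reviews, and book online.">
```

The second version covers the same topic once, cleanly, and leaves room for a unique description.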
More to come…
SEO Best Practices – Content Issues. August 24, 2008. Posted by simarprit in Blogging, Content, Internet, Search Engines, SEO, SES 2008, Spamming, Uncategorized, websites.
Tags: Content, Content Duplication, Search Engines, SEO, SEO Best Practices, SEO Tips, SES, SES 2008, Simarprit
Content Duplication Issues and SEO Best Practices
Continuing my series on SES 2008 San Jose, this white paper is again a hybrid of what was shared and what I have learnt over a decade of working with search engines.
If I give you 10 pages to read, you would scan through, start reading, and if what you are reading is "new to you", maybe you would read all ten of them in one go.
Now, if I give you 10 pages to read and, when you scan through, you find that "you've read it before" or "only one page is unique", you may not even read my one unique page and trash them all. Worse, you would remember me as the guy who tricked you by giving you 10 pages to read when he had information for just one. You would make a note: "not a nice man to know."
To me this is content duplication, and so it is to search engines. So here we go:
- Search engines' job is to satisfy the searcher; they want to grow and be seen as credible.
- Search engines have no favorites.
- They trust you unless you betray them; they work on the basic premise that what you are feeding them is your own and unique.
- So when you feed search engines anything, they "scan"; if you are "new" they may read the whole of it.
- If you are not "new", they'll trash you and "remember" you as "not a good site to know".
So what are your choices? The simple choice is to always provide new content, but this choice is expensive and restrictive for many. So what do these many do?
- Put the same content on many pages of the same site, as is.
- Put the same content on many pages of the same site with minor modifications, disguising it as new content.
- Put the same content on many different sites under the same ownership.
- Put the same content on many different sites with minor modifications, the sites being under the same ownership.
- Put the same content on many different sites under many different ownerships, on many different servers, in many different data centers, with or without minor modifications.
They all presume they will be able to manipulate their way around, and some do succeed, but the issue is how hard you are working to do something that is wrong anyway. Search engines are becoming smarter with every passing day: they are scanning better, storing better, and recalling better. The best course is simple: don't duplicate your content, and don't manipulate others' content and put it on your site. Remember, sooner or later you will be caught and become "not a good site to know", and search engines will drop you, as we all would.
This leaves us with the issue of what if someone does this to me. Yes, this is the issue!
So if you are the original source of the content, your worry is: how does the search engine know that I am the original? Search engines are working very hard to identify the original source; in case they don't, make them aware.
Do what you would do with any of your assets: protect them, be vigilant, and act if someone breaches your copyright. A related issue is syndicating your original content; I will cover that subsequently.
Some common inadvertent content duplication mistakes and issues:
- When spiders read your content four times: http://example.com, http://www.example.com, http://www.example.com/index.html and http://example.com/index.html. Most spiders know how to work around this, but it helps to put 301 redirects in place and route everything to http://www.example.com
- When you change platform
- When you change URL structures, remove the old one and deploy 301 redirects
- When you create test folders, remove your test folders
- When you shift to a sub domain, clear the content permanently from your servers
- Disclaimers, privacy policies, and copyright statements running across sites: put them in non-crawlable JS functions or serve them from one central place.
- Check your landing pages, if you have multiple landing pages make them unique
- Check your meta titles, and meta descriptions, they need to be unique
- Be careful on mirrored sites
- Content in multiple languages with common attributes or language strings is a no-no
- Use the exclusion protocol in robots.txt wherever you need to share the same content, within the same site or across different domains
- Check for and remove any hidden links.
- Use password protection where you need to carry duplicate content
- Permanent deletion of duplicate content is better than redirection
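The four-URLs-one-page problem in the first bullet above is usually fixed at the server. A minimal sketch, assuming an Apache server with mod_rewrite enabled, placed in an .htaccess file (the host name is a placeholder; substitute your own):

```apache
RewriteEngine On

# Send bare example.com to the canonical www host with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Collapse explicit /index.html requests onto the bare directory URL
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]
```

With both rules in place, all four variants end up at http://www.example.com/ via permanent redirects, so spiders see one URL for one page.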
The above can form some of the best practices SEOs can follow.
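Where duplicate content must legitimately exist (say a print-friendly folder or a test area), the robots.txt exclusion mentioned above might look like the sketch below; the folder names are hypothetical:

```text
User-agent: *
Disallow: /print/
Disallow: /test/
Disallow: /mirror/
```

The file lives at the site root (http://www.example.com/robots.txt), and well-behaved spiders will then skip those paths instead of indexing the duplicates.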
more to come…
SES 2008 Site Clinic. August 22, 2008. Posted by simarprit in Blogging, Internet, Search Engines, SEO, SES 2008, websites.
Tags: SEO, SEO Best Practices, SES, SES 2008, websites
So what is a website clinic? As per the Search Engine Strategies 2008, San Jose organizers, a website clinic is a one-to-two-hour session where a panel of "website gurus" evaluates your site publicly.
They let you briefly explain what your site does. Each site review takes about 15 minutes. I attended three sessions on this subject today.
My random thoughts:
- Try to stick to one coder or a style of coding
- Ensure your site supports the resolution your customer group generally works on
- Maybe it is a good idea to give relative links from your homepage to internal pages
- Seeing what you are looking for in the URL makes sense
- Dynamic menus can create a problem; they may confuse the spiders, robots and whatever
- JS is best avoided in all sorts of navigation.
- JS also adds to overheads
- It may be a good idea to tackle the multiple-coder issue by rewriting the code for your top pages (top pages, not all pages) in one style from time to time.
- Hyphens are better than underscores in the path and on the page.
- Wisely use SEO products and be open to adopting them.
- All upper case usage is not good
- Ensure your sitemap is there and update it regularly
- Define your 20 second elevator pitch and put it prominently
- 404s hurt; avoid Not Found errors
- Avoid denial-of-service errors; they hurt
- Bring up Page 2 and Page 3 words
- Try MSN funnel tool
- Use KML
- Provide data directly to search engines
- Try layering image over text, but don’t go overboard
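One of the bullets above says to ensure your sitemap is there; for search engines that usually means an XML sitemap following the sitemaps.org protocol, placed at the site root. A minimal sketch with a placeholder URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-08-22</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Save it as sitemap.xml, add one <url> entry per page, and keep <lastmod> current when pages change, which covers the "update it regularly" part of the advice.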
More to come..
SES 2008 SEO. August 21, 2008. Posted by simarprit in Blogging, Internet, Search Engines, SEO, SES 2008, websites.
Tags: Blogging, SEO, SEO Best Practices, SES, SES 2008
SEO Through Blogs and Feeds
Continuing my blogging of SES 2008 at San Jose: I must admit I am not the fastest blogger, and neither do I do "live blogging". My posts are meant to be educative executive summaries, for myself and for all those who care. They focus on the sessions I have attended and express my opinion alongside that of the knowledge givers.
Rebecca Lieb did a good job of moderating the session; her knowledge of the subject and her apt comments made it lively. Chris Boggs, Lee Odden, Amanda Watlington and Daron Babin formed the speaker panel.
It is important to self-qualify your blog and understand your objectives for blogging. The questions you must answer are:
- Is your blog official or personal?
- Is your blog residing on a sub-domain or is it a separate domain in itself?
- Does your blog carry your corporate signature, or does it have a personal touch about it?
- Are you the lone blogger, or are there multiple bloggers working on the blog?
Choosing your platform is very important. WordPress looked like the preferred platform for most around.
Some quick takeaways (includes my own interpretations and learning)
- When writing a blog remember Stephen Covey: begin with the end in mind.
- You should know precisely what you want to write on and what you want to gain by that writing.
- You should make a note of the keywords you want that post/blog to cover and ensure that they find a place in your title.
- The panel also recommended that you should do some research on the tags you would like your post to be covered under.
- Socializing and creating a community around the blog helps
- Customize and optimize your blog and feeds
- Use blog widgets, or "blidgets", wisely; don't be obsessed
- Acknowledge and link to others liberally, develop credibility for your blog
- Links are currency, work towards them
- Try to become a top referrer on the topic of your choice
- Twitter, Flickr, Stumbleupon you need all of them at some stage or another
- Goals should drive your content on the blog
- Refine your blog
- Use the auto-discovery tag for your blog feeds
- Use trust rank inside your blog to increase credibility
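The auto-discovery tag mentioned above is a <link> element in the page <head> that lets browsers and crawlers find your feed automatically; the feed URL and title here are placeholders:

```html
<link rel="alternate" type="application/rss+xml"
      title="My Blog Feed" href="http://www.example.com/feed/">
```

WordPress and most other platforms emit this tag for you; the point is to check it is present and points at the feed you actually want subscribed to.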
More to come
SES 2008 Domain Auction. August 21, 2008. Posted by simarprit in Domain Names, Internet, SES 2008, websites.
Tags: Domain Auction, Domain Names, Domainer, SES, SES 2008
Yesterday the domainer Simarprit Singh saw himself in action once again. The auction had fewer than 100 people in attendance, most of them freeloaders who had come for the free drinks and food. I think there were about 5–10 committed buyers who were there to pick up whatever they liked at reasonable prices.
I picked up 6 domains in the auction: two on SEO (reflecting my firm acceptance of, and current stance on, the SEO industry), one on vehicles, two generic .org domains, one on baseball and one on jobs.
The experience was good; most of the domains below $1000.00 got sold, and there were hardly any takers for the big domains. The auctioneer, with due respect, was not very good, as he had no clue about the subject. There was no passion and no involvement in it; the auctioneer at Domain Roundtable in SFO was far better. Good for me: had he been good, I would have ended up spending far more than I did.
Why am I investing in domains? Simple: I see a huge upside. I am picking up these domain names to complete my portfolio and be in a position to release my list, so that anyone serious finds something worthwhile.
Do I have four letter domains? Yes! Do I have generic domains? Yes! Do I have two words logical domains? Yes! My story is getting complete.
Do I recommend domain name investments? Yes! I think it is good to have a domain portfolio as an investment option. Like all other portfolios you should build it up part by part.