August 2006


Neat Tricks and Hacks | 29 Aug 2006 06:30 pm

I’ve been getting a few emails lately about my post on Proper RSSGM Use, so I thought I’d quickly revisit the post to help out a few people.

The gist of the emails boils down to long-winded speeches about their hatred for installing Article Dashboard and how much bullshit is involved with Zend Optimizer. All I can say is, I completely agree, but don’t forget to use your thinking brains and shoot for some originality when faced with problems like this. Perhaps a WordPress install? Or you could just get mad and frustrated and end up kicking your poor dog named Wiki out the backdoor. Oops, I didn’t just say Wiki out loud, did I? You didn’t just hear that.

Lie back and relax while I hypnotize you. This will only take a moment.

You didn’t hear aaaaaaaaaaaaaanything.

When I snap my fingers, you will wake up.
*snap*
SEO Tools | 28 Aug 2006 08:08 pm

Last Friday Scott Trimble from Blog Solution released another product called RSS Evolution. I got to play around with it a little bit before it came out. It’s basically an executable RSS-to-static-HTML converter. Those of you who have had the displeasure of playing around with previous RSS-to-HTML converters such as HyperVRE already understand that, for how incredibly useful, quick, and easy they should be, not a single person has gotten them right. For some reason no one can manage to make one that freakin works! The few that manage to actually install correctly make sure to quickly crash, and the small percentage remaining somehow manages to skew the templates into an incomprehensible mess of table cells.

It’s too bad, too. I’m a firm believer that executable content scrapers are the future of content generators. PHP and Perl with a snazzy little database are nice, but anything that takes me more than half an hour to set up naturally forces my engineering prowess to kick in, and I end up spending the rest of the day coding an automated setup for it. Frankly… fuck all that. I’d rather just have an exe. I want to open a program on my nice little WinXP computer, import a keyword list, type in a few RSS feed URLs, grab a template from Open Source Templates, and hit go. RSS Evolution is exactly all this. It runs very well and very efficiently, but more importantly it is extremely versatile. It can pull almost any content and turn 10 keywords into 1,000 in a matter of minutes. If you want to create 1,000 AdSense domains in one day, I wouldn’t consider any other method.

Since I’m not a salesman and this is starting to sound like a sales letter, I won’t hesitate to mention its major downside. Hey, nothing is perfect :) . I know much of the text formatting is done in the template, but the program itself does fail on several counts to format multiple feeds on a single page in a readable manner. This isn’t something that will pass a human check by any stretch of the imagination, but buy it for what it’s meant for: an extremely fast and versatile method of creating AdSense pages. I’m not going to mention the price, because knowing Scott, he tends to raise prices the more the product sells. Good strategy for both the buyers and the sellers if you ask me, but at the time of writing this the price is definitely worth it. People are already emailing me asking about it since I did put up a testimonial (I’m so cheesy sometimes :) ), but so far everyone who asked me about it and ended up buying is liking it.

Click Here To Check It Out

Program Screenshot

Output Screenshot

Blue Hat Techniques | 25 Aug 2006 07:09 pm

The Google patent talks explicitly about the freshness factor and its importance. More importantly, it talks about staleness being a factor in the rankings. This concept is understood to be true not only for Google but for Yahoo and MSN as well. It mentions that not all sites in a niche need to be updated consistently; some niches require more freshness while others require less. This naturally raises the question: how often should my site be updated for my niche? Too much and you could potentially hurt your rankings; too little and you will slowly lose the rank you worked so hard for.

One particular site of mine raised this question for me. It’s a site that is 100% static and virtually never gets updated, and it would exhibit some strange qualities in the organics that none of my other sites would. It would rank in the top 10 for all of its terms, then slowly, as the weeks rolled by, it would eventually drop into the bottom 30. I naturally considered the freshness factor. So I made a slight change to the title and added one page of content, linked to on the main page. Within 48 hours the site dropped out of sight in the rankings (100+ in Google, 70+ in Yahoo, and out of the top 300 in MSN). This frustrated me, but instead of changing it back I stuck with it. A week later it rose back up to the top 10. I was like, COOL. So I let it be, and about 4 months down the road it started slowly dropping again. So I once again made a slight change to the title and added one page of content. The same exact thing happened. This forced me to further examine the pattern being displayed in an attempt to mimic it.

Obviously the search engines must get their data from somewhere to determine how fresh your site should be, so I took a close look at the competition in the niche. For a few weeks I studied their cached dates in the organics to see how often they were updated in the engines. There was a slight pattern between the frequency of the cached pages and the frequency of their updates, which spurred me to investigate that in a whole new light; I documented it in my #10 Blue Hat Technique, Teaching the Crawlers To Run. Overall, however, I couldn’t tell exactly how often the sites would update. So I turned to Archive.org’s lovable Wayback Machine. Alexa has a wonderful feature that puts a star next to each archived crawl where the site itself was also updated, as opposed to a crawl with no site change. This comes in very useful for determining how often sites in your niche should be updated. For instance, certain news sites require much more freshness (take a look at CNN) than a local government site like Oregon.gov.

So if you’ve got a site that holds high rankings for a term you don’t want to lose, but you see it slowly start to trickle down the SERPs, you may want to take a closer look at the freshness factor:

1) Determine which of your competitors are holding the steadiest rankings. Add them all to a list.
2) Take a look at the sites themselves. Many of them may date their newly added content. See if you can determine how often they update.
3) Make up an average for the sites in the top 5 and an average for the sites in the bottom 10 (a rough sketch of this math follows below).
4) Then look at the sites in the consistent bottom 30s. More often than not, those rankings are held by sites with quality SEO that are determined too stale to rank in the top 10.
5) Attempt to make a prediction on how often your site should be updated. Then make a prediction on how large your update should be. If you’re unsure how large it should be, play it safe and make it very small.

Remember, any title change forces the SEs to reevaluate your site’s topic. This is a good thing and a bad thing. Typically the engines will drop you down in the rankings while they make the new evaluations, but you are sure to come back up if they determine it’s of the same topic. This will cause your staleness factor to drop to 0 and your freshness to be high again. Make sure to note your rank at this point. This will be your target on every update.
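If you want to put rough numbers on step 3, here’s a minimal sketch of the math. The competitor domains and update dates are made up for illustration; plug in whatever you actually observe from cached pages and the Wayback Machine.

```html
<!-- freshness.html: rough average-update-interval calculator (sketch, made-up data) -->
<html>
<body>
<pre id="out"></pre>
<script type="text/javascript">
// Observed update dates per competitor, oldest first (hypothetical examples).
var top5 = {
    "competitor-a.com": ["2006/05/01", "2006/06/02", "2006/07/01", "2006/08/01"],
    "competitor-b.com": ["2006/04/15", "2006/06/10", "2006/08/05"]
};

// Average number of days between consecutive updates for one site.
function avgIntervalDays(dates) {
    var totalDays = 0;
    for (var i = 1; i < dates.length; i++) {
        var ms = new Date(dates[i]) - new Date(dates[i - 1]);
        totalDays += ms / (1000 * 60 * 60 * 24); // milliseconds -> days
    }
    return totalDays / (dates.length - 1);
}

var lines = [], sum = 0, count = 0;
for (var site in top5) {
    var avg = avgIntervalDays(top5[site]);
    lines.push(site + ": one update every " + Math.round(avg) + " days on average");
    sum += avg;
    count++;
}
lines.push("Niche average: one update every " + Math.round(sum / count) + " days");
document.getElementById("out").innerHTML = lines.join("\n");
</script>
</body>
</html>
```

Do the same for the bottom 10, and the gap between the two averages tells you how much room you have to play with.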
I know this behavior in the SERPs causes many webmasters to bang their heads furiously, but if you just sit back and watch carefully what is happening with your site, you are much more able to make an intelligent decision. Most newbie webmasters, when they see this effect happening on their site, panic and redo the entire site. This is definitely the WRONG MOVE. Even if you think you’ve been sandboxed, remember: your site deserved top 10 rankings at one point, and there is no reason why it shouldn’t deserve them again. Be patient, be smart, and use your knowledge of the freshness factors to maintain your rank.

Site Reviews & Commentary | 24 Aug 2006 03:21 pm

My friend Danimal from The Danimal Report pointed out to me the other day that some people have finally developed search engines based on the released AOL Search data.

Here are some of the better ones I’ve found.

AOL Search Database - Allows you to search the entire database by keyword, user ID, and website URL. Also allows sorting by user ID, keyword, date, and website URL.

SEO Sleuth - Very clean setup. Allows you to search by keyword or domain name. Shows search referrals, incoming keywords, and diversity ratio by domain name. It even breaks it down by hours of the day.

Neat Tricks and Hacks | 23 Aug 2006 07:14 pm

Since cloaking and IP delivery (delivering specific content only to the IPs of SE bots) are pretty much the epitome of Black Hat SEO, and none of us are blackhat :) , I thought I’d start a small chaptered tutorial on sneaky alternatives. Understand: ANY form of displaying different content to the engines than what you display to regular users is clearly against the rules (with the exception of Flash, of course). So these definitely still classify as Black Hat and should never be considered Blue Hat. However, since a large portion of you are going to be experimenting with IP delivery and various other cloaking methods, I might as well teach you some safe methods of doing it, because obviously one wrong move or one missing IP could be the one that gets you banned very quickly. My experience with people who perform IP delivery in particular is that they are VERY cocky about it. Even though it’s a very smart, well-thought-out technology, in all reality they don’t deserve to be cocky about it at all. In fact, it’s in their best interest to be as paranoid as possible. The more paranoid they are, the better their sites perform. I suggest you take the same mental approach.
Method 1 - The Wayward Advertiser
I call this method the Wayward Advertiser because it tricks the engine into thinking you’re just displaying an auto-rotating banner advertisement. No big deal; SEs have been dealing with those since the early 90s. This “advertiser,” however, will force any non-bots onto another page through a redirect. This is very simple.

Step 1
Create your botsview page. This is the page you want keyword-stuffed and SEO’d to its max, since it is all the search engine will see. Then put in the frame code. Make sure the frame is sized something like 468×60 pixels; this is the standard advertisement size. The frame will pull a page named something like adspace.html. Of course, be creative with it, but give away nothing that hints this page leads to a redirect. Do everything you can to make it look like a legitimate rotating advertisement. That includes the filenames.
View the Botsview source here
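If you want something concrete to start from, here’s a bare-bones sketch of a botsview page along these lines; the title, keywords, and filenames are all placeholders, and an iframe is just one way to do the frame. Swap in your own.

```html
<!-- botsview.html: the keyword-stuffed page the engines see (sketch, placeholder content) -->
<html>
<head>
<title>Blue Widgets - Cheap Blue Widgets, Blue Widget Reviews</title>
<meta name="description" content="Blue widgets, cheap blue widgets, blue widget reviews.">
</head>
<body>
<h1>Blue Widgets</h1>
<p>All your keyword-rich, SEO'd-to-the-max content goes here...</p>

<!-- The innocent-looking "rotating banner": a standard 468x60 slot pulling adspace.html -->
<iframe src="adspace.html" width="468" height="60" frameborder="0" scrolling="no"></iframe>
</body>
</html>
```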

Step 2
Create the adspace page. This page will contain three different pieces of code. First, a framebuster snippet. This will cause the page to bust out of the frame and instead display another page that only the users will see; we’ll dub that page the userview page. Second, a meta refresh tag that redirects to the userview page as well. This is a failsafe; I like to set it to 3 seconds just in case the user doesn’t have JavaScript enabled. Third, either a 468×60 pixel image or some form of link that looks like an advertisement, in case the search engine decides to follow the frame. Everything will look completely legitimate as an advertisement, and it will leave.
View the adspace source here
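Here’s a rough sketch of an adspace page built that way; again, the filenames and the banner are placeholders. Note that without JavaScript the meta refresh fires inside the frame itself, which is why it’s only the failsafe.

```html
<!-- adspace.html: loaded inside the 468x60 frame (sketch, placeholder filenames) -->
<html>
<head>
<!-- Failsafe: plain meta refresh to the userview page after 3 seconds -->
<meta http-equiv="refresh" content="3;url=userview.html">
<script type="text/javascript">
// Framebuster: if we're inside a frame (a real visitor's browser executing JS),
// replace the whole window with the userview page.
if (top != self) {
    top.location.href = "userview.html";
}
</script>
</head>
<body>
<!-- What a bot that follows the frame sees: a plain, legitimate-looking ad -->
<a href="http://www.example.com/"><img src="banner-468x60.gif" width="468" height="60" alt="Advertisement"></a>
</body>
</html>
```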

Step 3
Create the userview page. This will be the real content you want the visitor to see. Of course, this isn’t totally necessary; you could just redirect the users directly to your affiliate link or whatever you’d like.
View my userview source here
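The userview page can be anything at all; a minimal sketch:

```html
<!-- userview.html: what real visitors end up on (sketch; could just as easily be an affiliate redirect) -->
<html>
<head><title>Welcome</title></head>
<body>
<h1>The page you actually want humans to see</h1>
<p>Real content, sales copy, affiliate links, whatever you'd like.</p>
</body>
</html>
```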

Would you like to see an example?

I will periodically post the rest of the chapters of this over the next couple of months. Obviously there are many different ways to accomplish a good cloaking effect. Gimme some time and I will slowly work my way through a large portion of the quality ones.

Site Reviews & Commentary | 22 Aug 2006 10:28 pm

SEO BlackHat released a private blackhat forum this month. This will supposedly be a forum where the top black hatters post their deepest, darkest SEO secrets for the members, and where people can discuss new techniques openly. Sounds like a decent idea. In fact, I think I’ve had one of those kinds of ideas before. I’m personally debating whether or not to join.

The Advantages
1) I may learn something new that I can implement myself.
2) I may meet someone there who can help me with one of my upcoming projects.
3) It may be an entertaining forum.

The Disadvantages

1) I understand there is some kind of disclosure thing. Everything on BlueHatSEO is original and of my design. If I see something in that forum, whether or not I already knew it, I could never post it up on here, because they would just claim I stole it from the forum and shared it.
2) $100/month. I’m not sure if it’s the fact that I grew up in a very poor family, but something about paying money for words has never sat right with me. $100/month? I’m not quite sure where I stand on that. To this day I’ve still never bought an e-book in my life. I’m not sure if this would fall under the same principle.

The Biggest Disadvantage of All!
3) My own AWStats are currently making the best case for why I should not join the SEO BlackHat private forums:

- http://www.seoblackhat.com/forum/showthread.php 140 140

How much of my own content would I actually be paying for? What kind of blackhat forum is this? Are they just reposting Blue Hat Techniques and then being like, “Oh wow guys this forum is so damn useful! Check out all this neat stuff Eli wrote!”

I think I’m just going to save my hundred bucks and give SEOBlackHat a big fucking flaming finger instead.

Neat Tricks and Hacks | 22 Aug 2006 05:26 pm

Akismet has protected your site from 796 spam comments.

This was from a 24-hour period on Blue Hat SEO. So once again the topic of blog spam has come up. Blog spam is extremely popular, and just because I’m now a blogger myself doesn’t mean I have an excuse to suddenly become naive and pretend it’s not an EXTREMELY powerful and effective way to market. Search engines have been trying their little hearts out for the last couple of years to make blog spam useless, but the facts are: they still love every single last link. As it stands now, I still can’t come up with a faster way to get indexed than to put up links on 10,000 PR3+ blogs.

Like all highly public link bombing techniques, such as comment spamming and trackback spamming, it’s a dying technology. Since I’m a very outspoken advocate of not following the rest of the marketing pack, I’ve tried to give out some unique blog spamming tips, like how to spam .gov and .edu sites, to help separate you from the rest of the crowd.

So, as boring as this topic is, I’m going to come back with another one and say: hey, spam these guys. Yes, they accept trackbacks :)
http://blogs.msdn.com/

Just for the record: since I know for a fact that at least two of the major people who spam my site are also regular readers of it, I wanted to state that spamming me only slows down how many posts I do a month; it doesn’t inspire them. So don’t think you’re teaching me something new or impressing me. Don’t get me wrong, I don’t mind Blue Hat getting spammed or link bombed. If I did, I’d be a hypocrite. Just don’t get the wrong impression.

Update 8/25/06 - The spam has stopped completely. Thanks, guys. I appreciate it :)
I’ll give you one more for being so nice: http://blogs.technet.com/

Neat Tricks and Hacks | 14 Aug 2006 08:18 pm

For those of you who are familiar with the popular RSSGM scraper, I wanted to point out a method I found to properly utilize it. Back in the day I created a couple of RSSGM sites, and they got banned pretty quickly (2-5 months). For the sake of cheapness I also put it up on a couple of white hat sites (article directories), and surprisingly THEY ARE STILL THERE! It was actually kind of shocking. I put the RSSGM installs in subdirectories of Article Dashboard installs and linked to them on the main page. Then boom, for some reason the article directory gave the RSSGM install quick indexing and some trust credits. So the RSSGM sites are still there, still rank, and are fully indexed.

For shits and giggles I later added multiple RSSGM installs on the same domain, and they surprisingly also stuck. Kinda funny if you ask me. So if you’re getting frustrated with your RSSGM installs getting banned so quickly, try what I did. They’re both free scripts, and it works surprisingly well.

Just thought I’d point that out, since I’m sure there are a few readers here who would really find use for that information.

Random Thoughts | 03 Aug 2006 03:03 am

So Matt Cutts has made the leap into the video blogging world recently. I personally think this is a huge step in the right direction. Basically, he is taking the time to create short 8-minute videos to answer questions from the people in the SEO world. The best part about it is that he is actually answering REAL QUESTIONS.

Questions like:
Does Google Analytics play a part in the SERPs?
What are some SEO myths?

These are some really good questions that he is answering, which naturally shocks me quite a bit. Every time I’ve seen Matt in an interview, the interviewers are always kissing his ass and spend the entire time asking retarded questions like, “How does Google feel about spam?” “Well Jimmy, Google doesn’t like spam very much.” Then of course after the interview all the listeners start kissing the asses of the interviewers: “Oh my god, that was such a profound interview. I learned so much.” Instead of saying what everyone is thinking: Are you fuckin kidding me?! You had a Google engineer sitting there with an open mic and you didn’t ask a single real question!

All I can say is, way to go, Matt. Way to step up to the plate and swing. He didn’t have to do this. He could have just picked all the lame questions, but he chose to do it anyway. You’ve really got to respect him for that.

General Articles | 02 Aug 2006 10:24 pm

As the site: command in search engines gets less and less accurate, and as the search engines have failed to ever bring their APIs out of beta-like reliability, I have finally made the complete switch to “Squeeshy Words.”

Since I completely made up the term Squeeshy Words, I am almost positive none of you have ever heard of them. The term came from a dream I had before I started my company that involved a hamster and a line of ants that needed to get squished. And we shall call him Squeesh! Basically, a Squeeshy Word is an abstract, made-up word that doesn’t exist, kind of like what they do with SEO contests. You invent a weird term that no one will ever possibly use. You hide it in all the templates of a current project. Then, when you want to track how many pages of that project are indexed, you just search for that term. It surprisingly shows you a much more accurate count of how many pages you have indexed in each engine. It also gives you the added benefit of determining which pages hold higher weight, which helps with certain interlinking quarrels, since technically all the pages compete against each other for that particular Squeeshy Word.
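To make it concrete, here’s a minimal sketch of baking a Squeeshy Word into a shared template footer; the word zubblewumpet is made up on the spot (that’s the whole point), and the markup is just one way to tuck it in unobtrusively. Keep it in visible text, since the engines won’t index an HTML comment.

```html
<!-- footer.html: included in every page template of the project (sketch, made-up word) -->
<div id="footer">
<p>Copyright 2006. All rights reserved.</p>
<!-- The Squeeshy Word rides along in small visible text so it gets indexed with every page. -->
<p><small>zubblewumpet</small></p>
</div>
```

Then a search for zubblewumpet in each engine returns roughly every indexed page of the project, in weight order.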

For those of you who don’t use the Squeeshy Word system (you don’t have to call it that), I actually suggest you get in the habit of it. It gives you much more accurate analytics, and it saves you a ton of time staring at API-based tools.

*Also note: I’ve found that the API results often vary from even the most common datacenter’s. I really don’t enjoy trusting them.