Aug. 15, 2012, 10:14 a.m., SAN FRANCISCO, CA – Matt Cutts, Google's well-known "spam fighting" engineer, addressed attendees of the SES Conference and Expo and took questions, including several about authorship.
Cutts said authorship is a hard problem and that Google doesn't have as much access to social media as it would like. He also tackled Danny Sullivan's question about the weight given to Google's +1 button and the use of Google+, saying Google has responded to feedback about Google+ and that it isn't as big a ranking factor as you might imagine. "It's sort of un-intuitive," Cutts said.
Going into an extended session, Cutts answered further questions and said that because users can't buy higher rankings, they should be able to trust Google's results.
"The primary part of what we are building the Knowledge Graph off of is Freebase," Cutts said. He encouraged attendees to research Freebase and the data it holds on people and other entities, and to look into what's under the hood at Google.
When asked why webmasters can't get answers about what's going wrong with their sites, Cutts said Google is moving in that direction, pointing to unnatural-link warnings and webspam notifications as part of that effort. He said he wants to offer actionable, transparent steps webmasters can take to fix issues on sites affected by Google's algorithm updates.
A final question on Penguin and Panda asked: "The mood in the SEO world is sheer panic…is Google that stupid?"
Cutts said that after Penguin and Panda the environment has shifted a bit, and that Google tried to make the shifts gradual and incremental. Content farms had gotten out of hand, he said, and while it was a big change, Google felt it was best for users. He added that everyone needs to understand the problem with link spam: blatant advertisements on SEO forums were selling packages of 20,000 links to novices, and those buyers were wasting their money. He said he hopes next year will bring a different attitude toward SEO and link spam.
“Duplicate content, well, I think we handle that very well,” Cutts said. “I don’t think you need to worry about dup content on your own site.”
Two or three duplicated paragraphs of text might not be penalized as spam, though they might not count for much, Cutts said. But if every single page has the same content, that is effectively a doorway page, and that's the kind of thing Google wants to get away from.