Ever since Google started cracking down on blogs that had duplicate content, I have wondered how syndication played into it. In legacy media, being a syndicated columnist was a huge feat. The columnist reached greater audiences with the same content.
Then there are the news stories that run over and over on every television station and in every newspaper. Even as legacy media moved online, the practice continued. Online news sites like Huffington Post even repost content from other sites.
I’ve asked around in various groups about what makes syndication okay for sites like a newspaper but not for a general blog like mine. Early answers were that such sites were somehow exempt from Google’s algorithm. Not fully understanding, I searched the web and Google’s own documentation, and all I got was technical mumbo jumbo. The only thing I took away was that syndication was okay but duplicate content was not.
I got into another discussion yesterday with an author who read an article about the benefits of syndication, but wasn’t sure how it differed from duplicate content. It got me thinking and again I went searching for answers. I ran across the most helpful article to date on the subject at Search Engine Journal.
To bottom-line the article: it’s all about quality. You really should read it for yourself, but I’ll highlight a few things I took away from it, along with some thoughts on how syndication applies in the book-blogging world.