Follow some of the discussions by writers about writing on LinkedIn or on writing forums and you’ll soon realize there’s a feeling among at least a few professional writers that creating content for the web is not real writing. Even Harper’s Magazine got into the act a while back, promising their magazine would always be content free.
The perception that web content writing is bad writing is grounded in some fact. The problem stems from the multitude of websites publishing all sorts of poorly written and often inaccurate information designed to drive search engine traffic and not much else. Freelance writers recognize the people who publish such writing as the ones who pay a dollar or two for a 500-word article. Small wonder that type of content writing is, with some exceptions, poor.
History of ‘Content’
Some history may help you understand how the word “content” came to be used. The web wasn’t originally designed for writing, exactly. The internet was created to let scientists manage information and share it openly with each other. The first browsers were aimed at making academic writing easier to read and at allowing simple graphics.
I suspect the term developed because getting words to display in browsers required some programming; it was programmers who first began developing websites. They saw the code as the important thing, not the writing. In fact, I remember great discussions about the quality of the code behind web pages.
That code, of course, provides the framework that holds what we see on the web, but in the beginning no one worried much about what went into a web page – it just got filled up. Hence the awful term “content.”
Although the web was designed with openness in mind, it didn’t occur to anyone at the time that the whole world would want to get in on the act and try to make a profit in the process.
But we did, and it happened in a hurry.
Of Search Engines and Content Farms
Then came search engines. Although there were several early attempts to create search engines with human editors, the sheer number of websites, most with multiple articles, made keeping up with human eyes impossible. That meant the chore needed to be handled by computers. Since computers can’t read, at least not in the sense that humans do, the solution was to program them to look for key words and phrases, allowing them to find pages that matched the search terms people were likely to use.
Content farms were developed to take advantage of the dumbness of computers. Articles were stuffed with key words and phrases. When Google offered AdSense to web publishers, it was probably only moments before marketers realized that if they could trick search engines into presenting their pages first, they’d earn money simply because a larger audience clicked on the ads, probably in hope of real information. What had been a trickle of bad “content” became a flood.
It’s the very openness of the web that allows the bad writing. If you want to write for Harper’s, you’ve got to be good, and your piece will go through rigorous editing and fact checking. With few exceptions, those checks and balances are missing from the web.
It’s worth noting that Harper’s ran that ad in 2009, and even then you could find much of their content online – today subscribers can access archives going back to 1859. In fact, most respected magazines and newspapers are online now.
Some of the worst kinds of freelance content writing got pushed pages lower in search engine rankings when Google changed its algorithm in February 2011 with the Panda update. The update targeted low-quality sites stuffed with key words and inundated with advertising. Most legitimate publishers found themselves ranking higher, and web users found it easier to locate what they were looking for.
We can only hope that this kind of improvement continues.