Say you are writing an essay or a story for some newspaper or even a blog network where the requirement is that the story should be approximately 350 words in length.
In that case, you can use Microsoft Word or any other word processor for calculating the word count and even the character count of your text document.
But what if you want a more detailed analysis of that text document - like the frequency of individual words, the number of unique words, the average number of words per sentence, or how easy or difficult the text is to read (using Lexical Density and the Fog Index)?
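If you're curious how metrics like these are computed, here's a rough Python sketch. It is not the actual method used by any particular tool - the syllable counter in particular is a naive vowel-group heuristic, and the Gunning Fog formula shown (0.4 × (average sentence length + percentage of 3+ syllable words)) is the standard published one:

```python
import re
from collections import Counter

def count_syllables(word):
    # Naive heuristic: count runs of consecutive vowels.
    # Real syllable counting needs a pronunciation dictionary.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def analyze(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    complex_words = [w for w in words if count_syllables(w) >= 3]
    avg_words = len(words) / len(sentences)
    # Gunning Fog Index: 0.4 * (avg sentence length + % complex words)
    fog = 0.4 * (avg_words + 100 * len(complex_words) / len(words))
    freq = Counter(w.lower() for w in words)
    return {
        "words": len(words),
        "unique_words": len(set(w.lower() for w in words)),
        "avg_words_per_sentence": round(avg_words, 1),
        "fog_index": round(fog, 1),
        "top_words": freq.most_common(5),
    }

sample = "This is a simple sentence. Readability formulas estimate comprehension difficulty."
print(analyze(sample))
```

Because of the crude syllable heuristic, the Fog score here will differ from what a dedicated tool reports, but the word count, unique-word count, and frequency figures match the obvious definitions.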
There's a simple online solution from UsingEnglish.com called Text Content Analyzer - copy-paste the text block in the form and this useful tool will generate an interesting analysis of your text.
As an illustration, here's the word frequency cloud of the recent story on CNet vs Gawker.
In fact, Todd Bishop used this very tool to generate word frequency clouds of Steve Jobs' keynote and Bill Gates' speech at Macworld and CES, respectively.