Twitter, Facebook and Big Data

By Carole Bersillon, Intern

 

At school, I was never a fan of math and analytics. I must also admit I am sometimes apprehensive about using social media, both for fear of being consumed by it and because of my tendency to disparage digital trends and "over-networking." However, I fully accept that social media has become an essential communication and brand-management tool for PR, and I acknowledge the benefits it provides, especially for e-reputation and monitoring.

According to an academic essay entitled "Six Provocations for Big Data," we have entered the "Era of Big Data." Twitter, Google, Facebook, Wikipedia, WordPress and others offer a massive quantity of information produced by and about people. These free, accessible informational platforms have transformed the jobs of public opinion researchers and pollsters, as noted by Carl Bialik in the Wall Street Journal article "Tweets as Poll Data." These researchers have developed tools that turn people's thoughts, tweets and statuses into quantifiable public sentiment, using keywords and positive or negative connotations.
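To make the idea concrete, here is a minimal sketch of what such keyword-based sentiment scoring might look like. The word lists and sample posts are invented for illustration; real polling tools use far larger lexicons and more sophisticated methods.

```python
# Illustrative sketch of keyword-based sentiment scoring.
# Each post is scored by counting words from hand-picked positive and
# negative word lists -- a simplified version of the approach described above.

POSITIVE = {"great", "love", "happy", "excellent", "good"}  # invented examples
NEGATIVE = {"bad", "hate", "awful", "terrible", "delay"}    # invented examples

def score(text: str) -> int:
    """Return positive-minus-negative keyword count for one post."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def aggregate(posts: list) -> str:
    """Summarize a batch of posts as overall public sentiment."""
    total = sum(score(p) for p in posts)
    return "positive" if total > 0 else "negative" if total < 0 else "neutral"

posts = [
    "I love the new service, great experience",
    "Another awful delay today",
    "Flight on time, happy with the crew",
]
print(aggregate(posts))  # prints "positive" for this sample
```

Even this toy version shows both the appeal and the limits of the approach: it is fast and quantifiable, but a sarcastic tweet or an unlisted keyword slips straight through.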

For PR practitioners, monitoring the web and social media is a daily activity. Agencies use Google Alerts, news aggregators, apps and all sorts of other tools to create coverage reports that reflect their clients' online presence. Sometimes the coverage goes into greater detail, and professionals want to know what the public is saying about the company or brand.

Even though using social media as qualitative and quantitative data can backfire or yield biased results (you only get the opinions of people who share them publicly, and those opinions are not necessarily a reflection of their deepest thoughts), it is a fantastic way to access information and produce quick, smart analysis of a brand's e-reputation, image or strategy.

Producing analysis through social media has the drawback of being a time-consuming, tedious task. Defining the scope of a content analysis, gathering the data, creating categories and criteria, then analyzing and interpreting the results takes time and energy. While writing my Master's thesis on crisis communication in the aviation industry, I used press releases from both Boeing and Airbus to understand how the companies handled announcements of major delivery delays. It took no fewer than 200 press releases and three weeks to aggregate and then analyze the data (against eight predetermined categories and keywords), but the result was accurate and my hypothesis solid.

The problem is that PR and communication professionals don't always have the time for such a drawn-out job. Fortunately, there are companies and useful computing tools that can do it for them. Storify recently appeared as a new social media storytelling platform. It aggregates content from Twitter, Facebook, YouTube, Flickr, Instagram, Google and any other URL you give it. The user just creates a headline and description, then selects content from those platforms using keywords, and the tool builds a "story" about any topic. For a better understanding, check the story I created about Data and Social Media. Storify positions itself as a storyteller; I am not sure "story" is the right word for the output, but it is at least a good tool for gathering content from social media, organizing it and getting a sense of what is going on around a specific topic.

However, the study of social media and big data should not overshadow the objective of any information analysis and curation: accurate content and meaningful results. This brings us back to the old debate of quality versus quantity.
