
Ubiquitous data tells real story

10 April 2017 by Anthony Hilton


Seemingly harmless information can be valuable, says Anthony Hilton

More data crosses the internet in one second today than was stored in its entirety 20 years ago. According to a recent investment conference speaker, research firms are beginning to gather this data, often from unconventional sources such as weather patterns and satellite imagery, and to use it to find out what is really happening in specific businesses. It is not unusual for them to understand a business better than its own management does.

The implications are huge, not just for the investment community the speaker was addressing, but for business in general. Companies need to wake up to what this means.

One key new factor is that much of the information does not come from what would be thought of as conventional sources. This has been happening in the background now for some time. The larger hedge funds, those with the big budgets, have long been developing algorithms which search for key words on Facebook, Twitter and other branches of social media. They are looking for nuggets which alert them to what may be happening in a business before the company has actually announced anything.
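To give a flavour of the idea, here is a minimal sketch in Python of keyword-based alerting over a feed of posts. The posts, keywords and alert logic are invented purely for illustration; real systems ingest live data from the platforms and apply far more sophisticated language analysis.

```python
# Minimal sketch: flag social media posts that mention watch-list terms.
# Posts and keywords are invented for illustration; a real system would
# pull live feeds and score the language statistically.
import re

WATCH_TERMS = {"drilling results", "oil find", "dry well"}

def flag_posts(posts, terms=WATCH_TERMS):
    """Return posts containing any watch-list term (case-insensitive)."""
    pattern = re.compile("|".join(re.escape(t) for t in terms), re.IGNORECASE)
    return [p for p in posts if pattern.search(p["text"])]

sample_posts = [
    {"user": "rig_worker_88", "text": "Long shift, but the drilling results look unbelievable"},
    {"user": "commuter_42", "text": "Traffic terrible again this morning"},
]

for hit in flag_posts(sample_posts):
    print(f"ALERT @{hit['user']}: {hit['text']}")
```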


It seems to work. Though the funds are coy about their success rate, stories abound of funds being alerted to new oil finds by workers gossiping about drilling results on Twitter, days before any formal announcement from the company.

With continued advances in technology and software analytics, research houses are rapidly getting much better at handling unstructured data from unconventional and novel sources – weather reports, aggregated credit card reports, retail order flows across the web, satellite location tools and so on – and applying these to specific businesses. And, as with research generally, the value of what is produced increases over time as a history is developed and today’s results can be compared with last month’s and last year’s.

Some real life examples illustrate how it works. A fast food chain announced a significant promotion designed to increase customer traffic through its stores. A research group used satellite imagery of the chain’s car parks to see how much busier they were compared with an equivalent period before the promotion. This not only told them how successful the promotion had been, but also which fast food competitors had suffered the biggest corresponding declines.
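The hard part of that exercise is the computer vision needed to count cars in satellite images. Once those counts exist, the comparison itself is simple arithmetic, as this purely illustrative sketch with invented numbers shows.

```python
# Sketch of the comparison step only: given daily car counts per outlet
# (assumed already extracted from satellite images by a separate vision
# pipeline), compare the promotion period against a baseline period.
from statistics import mean

def traffic_change(baseline_counts, promo_counts):
    """Percentage change in average daily car-park occupancy."""
    base, promo = mean(baseline_counts), mean(promo_counts)
    return 100.0 * (promo - base) / base

# Invented numbers for illustration.
outlets = {
    "Outlet A": ([54, 61, 58, 60], [71, 75, 69, 73]),
    "Outlet B": ([40, 38, 44, 41], [42, 40, 45, 43]),
}

for name, (baseline, promo) in outlets.items():
    print(f"{name}: {traffic_change(baseline, promo):+.1f}% vs baseline")
```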


Such knowledge can be enhanced in this and other similar industries by an analysis of social media data emanating from people on zero-hours contracts. The feeds give an indication of whether the individuals are working more, or fewer, hours than they anticipated, which again can contribute to a picture of the health of the business.

The travel industry is also fertile ground because so much of its business is online. Everyone knows that hotels discount rooms, but the scale of these discounts and the number of rooms offered varies with the time of year. Researchers have developed ways of plotting all this data and using it to work out traffic patterns on a local, regional and global basis.

By comparing what is happening now in terms of charges and room availability against last month, or the same season last year, across thousands of hotels worldwide, they can get a significant insight into the current health of the business, and what is happening to volumes and margins.
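The underlying comparison can be sketched quite simply. Assuming rate observations have already been collected for many hotels, the analysis boils down to grouping them by region and period and comparing like with like; the figures below are invented for illustration only.

```python
# Sketch: compare average advertised room rates for a region against the
# same period a year earlier. Observations are invented; a real data set
# would come from scraped listings across thousands of hotels.
from collections import defaultdict
from statistics import mean

# Each observation: (region, year, advertised_rate)
observations = [
    ("London", 2016, 145.0), ("London", 2016, 150.0),
    ("London", 2017, 129.0), ("London", 2017, 135.0),
    ("Paris", 2016, 160.0), ("Paris", 2017, 162.0),
]

rates = defaultdict(list)
for region, year, rate in observations:
    rates[(region, year)].append(rate)

for region in sorted({r for r, _ in rates}):
    prev, curr = mean(rates[(region, 2016)]), mean(rates[(region, 2017)])
    change = 100.0 * (curr - prev) / prev
    print(f"{region}: {curr:.0f} vs {prev:.0f} a year earlier ({change:+.1f}%)")
```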

If an individual wants to try to replicate such research, the relevant software for what is known as ‘web scraping’ is available online. It would, however, be a bit of a waste of time because the hard part is not collecting the raw data, but making sense of it once it has been collected. It has to be organised, cleaned up, made consistent, and then related to specific business situations and strategies.
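As an illustration of how low the barrier to collection is, the sketch below fetches a page and pulls out price elements using the widely available requests and BeautifulSoup libraries. The URL and selector are placeholders, and everything after this step – the cleaning, organising and interpretation – is where the real work lies.

```python
# Minimal web-scraping sketch using the requests and BeautifulSoup
# libraries. The URL and CSS selector are placeholders for illustration;
# collecting the raw data is the easy part.
import requests
from bs4 import BeautifulSoup

def scrape_prices(url, selector="span.price"):
    """Fetch a page and return the text of every element matching selector."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    return [el.get_text(strip=True) for el in soup.select(selector)]

if __name__ == "__main__":
    prices = scrape_prices("https://example.com/hotel-listings")
    print(prices)  # raw strings; cleaning and normalisation come next
```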

This is currently very difficult and requires vast computing power. Nevertheless companies need to understand the implications. As a matter of policy, many are putting more and more information on their websites and encouraging employees, suppliers and customers to interact with them. They now need to be aware that seemingly harmless data can have considerable commercial significance when matched with information from novel sources.

Anthony Hilton is Financial Editor of the London Evening Standard
