That we are flooded with data is stating the obvious. Nevertheless, some of the figures presented are noteworthy. The amount of data generated per hour by:
- Retail store: 10 GB
- Gas turbine: 400 GB
- Automated factory: 1 TB
- Boeing 787: 40 TB
- Mining operation: 144 TB
It is likewise obvious that some of those data are valuable per se, and that some can improve the operational effectiveness of companies.
A panelist at the Big Data session went beyond the obvious, claiming that any company can, and needs to, exploit data.
Examples of today's companies monetizing data abound: Uber leverages data on where its clients have been taken to generate targeted ads and coupons; Foursquare helps determine consumer preferences; farmers make better use of insecticides and fertilizers; GE plans airplane engine maintenance based on the 1 GB of data per hour per engine it receives at its support center.
You do not necessarily need dedicated sensors to pick up data. Data spontaneously generated for other purposes can be used as virtual sensors through data analytics. An example is the use of cell phone location data by the Singapore municipal transport authority. By analyzing location data they know the flow of passengers, where and when demand is highest, and can dynamically readjust schedules and routes.
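The "virtual sensor" idea above can be sketched in a few lines: aggregate anonymized location pings by zone and hour to expose demand peaks. The zone names, hours, and counts below are hypothetical illustrations, not actual Singapore data.

```python
from collections import Counter

# Hypothetical anonymized location pings: (zone, hour of day).
# In practice these would come from cell-tower handover logs.
pings = [
    ("downtown", 8), ("downtown", 8), ("harbor", 8),
    ("downtown", 9), ("harbor", 18), ("harbor", 18), ("harbor", 18),
]

# Count pings per (zone, hour) pair to see where and when demand peaks.
demand = Counter(pings)

# The busiest pair suggests where and when to add capacity.
peak_zone_hour, peak_count = demand.most_common(1)[0]
print(peak_zone_hour, peak_count)  # ('harbor', 18) 3
```

A real system would of course work on streaming data and finer spatial grids, but the principle is the same: repurposed data becomes a sensor once it is aggregated.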
Wind farms become more efficient by analyzing data generated by wind turbines: a tiny adjustment of the blades, based on data received from other turbines, can significantly improve efficiency.
Clustering of data in areas like health care can provide insight and support more efficient health care, including proactive action (e.g. to contain epidemics). Smartphones, smart scales and wearables can provide data useful to the owner, as well as aggregated data that generate insight on a community.
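One simple form the proactive, epidemic-containment analytics mentioned above could take is anomaly detection on aggregated device readings: flag a district when today's count of fever readings departs sharply from its baseline. The figures are invented for illustration.

```python
from statistics import mean, stdev

# Hypothetical daily counts of fever readings reported by connected
# thermometers in one district (illustrative numbers, not real data).
baseline = [12, 9, 11, 10, 13, 8, 12, 11, 10, 12]
today = 21

# Flag the day if it exceeds the baseline mean by three standard
# deviations: a crude but common trigger for proactive follow-up.
threshold = mean(baseline) + 3 * stdev(baseline)
alert = today > threshold
print(alert)  # True
```

Aggregation matters twice here: it protects individuals (only counts leave the district) and it is what makes the community-level signal visible at all.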
Data are also affecting the way we develop, market and use products: the line between product and service is increasingly blurred. Kindle, DisneyWorld, Nike+ and Tesla cars are examples, in different market areas, of service and product overlapping.
More and more, each of us, as a person, is defined in terms of the data we generate and consume. Netflix, as an example, is constantly improving its customer relationship by analyzing usage data to detect:
- Context (including collaborative searching)
- External data
and it is looking into a person's social community to further improve recommendations.
A growing number of insurance companies are offering insurance packages based on actual driver behavior.
An interesting example of the use of data analytics can be found in the area of orphan drugs: drugs targeting diseases so rare that they do not create a market worth pursuing.
By applying healthcare analytics to 150 million US patient records from insurance claims, pharma companies have been able to identify people potentially affected by a rare disease and solicit a focused diagnosis. By looking at those already diagnosed, it was possible to extract the features they have in common and the features that distinguish them from others. This cuts costs for insurance companies (unnecessary exams leading nowhere) and expands the market for orphan drugs.
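The feature-extraction step described above can be sketched as a frequency comparison: codes that appear far more often in the claims of diagnosed patients than in everyone else's become candidate markers for flagging undiagnosed patients. The claim codes and patients below are entirely hypothetical.

```python
from collections import Counter

# Illustrative claim-code sets for patients diagnosed with a rare
# disease, and for a comparison group of undiagnosed patients.
diagnosed = [
    {"anemia", "fatigue", "neuropathy"},
    {"anemia", "neuropathy", "rash"},
    {"anemia", "fatigue", "neuropathy"},
]
others = [
    {"fatigue", "rash"},
    {"fatigue"},
    {"rash", "anemia"},
]

def freq(group):
    # Fraction of patients in a group whose claims contain each code.
    counts = Counter(code for patient in group for code in patient)
    return {code: n / len(group) for code, n in counts.items()}

f_diag, f_other = freq(diagnosed), freq(others)

# Codes much more frequent among the diagnosed are candidate markers.
markers = {c for c, f in f_diag.items() if f - f_other.get(c, 0.0) > 0.5}
print(markers)  # {'anemia', 'neuropathy'}
```

At the scale of 150 million records one would use proper statistical tests and correct for confounders, but the underlying contrast-of-frequencies idea is this simple.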
It was noted that in the next decade a growing use of data will be stimulated by augmented reality; new consumer devices are already appearing on the market, although they are in their infancy. In ten years' time AR is likely to become a common experience in the mass market, seamlessly bridging the world of atoms with the world of bits.
The issue of privacy remains, and it is not going away. In spite of better techniques to anonymize data, there are always workarounds. This is an area where technology plays on both sides, anonymization and de-anonymization: any improvement on one side improves the other side as well, with no net gain.