The amount of data generated by our tech-savvy world has exploded in recent years. Digital video (300 hours of which are uploaded to YouTube every minute), audio (in 2013, more than 4.5 billion hours of music were streamed via Spotify) and photographic data (over 670 million public photos were uploaded to Flickr in 2014) is now easily accessible to anyone with a smartphone, stored automatically in a cloud platform at the touch of a button.
In 2010, Google's then CEO Eric Schmidt suggested that the amount of data created every two days was about the same as was made “from the dawn of man to 2003”. And in data storage terms, 2010 was a long time ago.
Since then, the technology has advanced even further to allow ever greater amounts of digital storage, with helium-filled hard disk drives now available with 10TB capacity, enough for 170,000 hours of music or 3.2 million high-resolution images. Whilst ‘user generated content’ occupies the thoughts of Google, Facebook and Twitter, others have questioned how best to use the information gathered and whether building an advanced analytics capability to take advantage of the data is really worth the investment.
A 2014 study by management consulting firm Bain & Company found that early adopters of Big Data analytics have gained a significant lead over the rest of the corporate world. Examining more than 400 large companies, they found that those with the most advanced analytics capabilities outperform competitors by wide margins. According to the study, those leaders are five times as likely to make decisions faster than market peers and three times as likely to execute decisions as intended.
Making use of that vital business data, however, remains elusive for many. Finding ways to exploit the phenomenal amount of data at our fingertips is now the Holy Grail for many enterprises. The more we can find relationships and understand our systems, the better we are able to find and understand patterns. We can then use this learning to find optimizations and improvements that, even a few years ago, were considered unknowable.
The Five Day Forecast
Take the weather. Not so long ago, weather forecasts were considered extremely unreliable. Today, weather services have access to enormous volumes of data and statistical models that allow them to produce much more accurate forecasts, even five days ahead.
Such Big Data approaches can now be applied successfully to warehouse and distribution logistics and their supply chain concepts. State-of-the-art logistics systems, with their sensors and actuators, as well as their warehouse management and control systems, produce several megabytes of historical data every day, just like weather stations.
By using Big Data, for example, we can ensure our e-commerce systems offer extremely high availability during periods of extremely high demand; prior knowledge and understanding of the behaviour of the system and its machines is critical to achieving this.
Predicting The Future
Swisslog’s Condition Monitoring service already identifies system bottlenecks from which potential optimization measures can be derived. Not only is it possible to continuously monitor the condition of all parts and components of a logistics system, but immediate notification can be made when a potentially critical condition is developing.
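In essence, a condition-monitoring service of this kind watches a stream of sensor readings and raises a notification as a component trends towards a critical state. The sketch below illustrates the idea with a rolling-average threshold check; the class, parameter names and levels are illustrative assumptions, not Swisslog's actual API.

```python
from collections import deque
from statistics import mean

class ConditionMonitor:
    """Illustrative sketch of threshold-based condition monitoring:
    smooth recent sensor readings with a rolling average and report
    when the trend approaches a critical level."""

    def __init__(self, warn_level, critical_level, window=4):
        self.warn_level = warn_level          # start notifying here
        self.critical_level = critical_level  # component at risk here
        self.readings = deque(maxlen=window)  # rolling window of readings

    def ingest(self, value):
        """Record one reading and return the current status."""
        self.readings.append(value)
        avg = mean(self.readings)
        if avg >= self.critical_level:
            return "CRITICAL"
        if avg >= self.warn_level:
            return "WARNING"
        return "OK"

# Example: a motor temperature channel drifting upwards over time.
monitor = ConditionMonitor(warn_level=70.0, critical_level=85.0)
statuses = [monitor.ingest(t) for t in [60, 62, 65, 71, 76, 83, 88, 95]]
```

Because the check runs on every reading, a developing problem is flagged while the rolling average is still in the warning band, before the critical level is reached.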
The ability to predict errors long before they actually occur is the next step, and one that will soon be implemented. Since not all components of an intralogistics system are subject to the same stresses, it will also be possible to detect and replace at-risk components early on. The ultimate goal is to create a data-based life cycle management system for products that gives a complete picture of the future of any automation system.
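One simple way to picture error prediction from historical data is to fit a trend line to a component's wear measurements and extrapolate it to the failure threshold. The sketch below uses ordinary least squares for this; the function name, thresholds and wear figures are hypothetical, and a production system would use far richer models and per-component stress profiles.

```python
def predict_failure_cycle(wear_history, failure_threshold):
    """Fit a straight line (least squares) to successive wear
    measurements and estimate the cycle at which the trend crosses
    the failure threshold. Returns None if wear is not increasing."""
    n = len(wear_history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(wear_history) / n
    # Least-squares slope and intercept of wear vs. cycle number.
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, wear_history))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    if slope <= 0:
        return None  # no upward wear trend detected
    return (failure_threshold - intercept) / slope

# Hypothetical data: wear grows by ~0.5 units per cycle, failure at 100.
wear = [10.0 + 0.5 * i for i in range(20)]
cycle = predict_failure_cycle(wear, failure_threshold=100.0)  # ≈ cycle 180
```

Knowing the estimated failure cycle lets maintenance be scheduled, and the at-risk component replaced, well before the error actually occurs.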
So today, thanks to advanced storage capacities, efficient software and intelligent system designs, there is no longer an excuse not to use data, however big, to the benefit of your intralogistics operation.