Every day, 2.5 quintillion bytes are created. Even without understanding what that number actually means, it’s not hard to conclude there’s a great deal of useless data out there. Thus, with big data, it’s as important to know what information should be discarded as it is to understand what’s of value. That’s why these use cases are so powerful.
Smart cities not only mean better transport and more connected citizens, but also increased safety. New York City implemented risk-based fire inspections during Mayor Michael Bloomberg’s administration and recently rolled out an updated solution.
The digital program uses data from multiple sources, including information from other city agency databases, to assess and prioritize 350,000 buildings firefighters inspect annually. In the past, inspections were conducted on a cyclical basis and recorded manually.
The risk-based system accesses a data warehouse of organized and processed building and inspection information and uses a model that tracks, scores, prioritizes and then automatically schedules a building for inspection.
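In spirit, that score-then-schedule pipeline can be sketched in a few lines. This is a toy illustration only: the field names, weights, and scoring formula below are invented for the example, not the city's actual model.

```python
# Hypothetical sketch of a risk-based inspection queue.
# Risk factors and weights are illustrative assumptions,
# not the actual model used by the city.

def risk_score(building):
    """Weighted sum of risk factors (invented weights)."""
    return (
        3.0 * building["past_violations"]
        + 2.0 * building["years_since_inspection"]
        + 1.5 * building["complaints"]
    )

def prioritize(buildings, capacity):
    """Score every building, then schedule the highest-risk
    ones first, up to the available inspection capacity."""
    ranked = sorted(buildings, key=risk_score, reverse=True)
    return [b["id"] for b in ranked[:capacity]]

buildings = [
    {"id": "A", "past_violations": 4, "years_since_inspection": 2, "complaints": 1},
    {"id": "B", "past_violations": 0, "years_since_inspection": 5, "complaints": 0},
    {"id": "C", "past_violations": 7, "years_since_inspection": 1, "complaints": 3},
]
print(prioritize(buildings, 2))  # highest-risk buildings first
```

The key shift from the old cyclical approach is visible even in this sketch: inspection order is driven by the data, not the calendar.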
Tiger poaching in India was at its height in 2016, driven by a demand for parts in China. Fortunately, Koushtubh Wright, senior regional ecologist with Snow Leopard Trust and scientist with the Nature Conservation Foundation, and his team created an algorithm, based on 25,000 data points collected over 43 years, that helps identify where the cunning criminals are likely to strike.
The models and projections for finding poaching “hot spots” can be updated regularly and are expected to help improve enforcement and, ultimately, prevent the killings. Poachers travel in groups, carry weapons, set traps for tigers and “know every trick in the book.” It’s difficult for officials and conservationists to keep up.
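At its simplest, a hot-spot model aggregates historical incidents over a map and flags the densest cells. The sketch below is a minimal, assumed illustration of that idea; the team's actual algorithm and its 25,000 data points are not reproduced here.

```python
from collections import Counter

# Hypothetical sketch: flag poaching "hot spots" by counting
# historical incidents per map grid cell. The real model is
# far richer; this only shows the aggregation idea.

def hot_spots(incidents, top_n=2):
    """incidents: list of (grid_x, grid_y) cells where past
    incidents occurred. Returns the most incident-dense cells."""
    counts = Counter(incidents)
    return [cell for cell, _ in counts.most_common(top_n)]

history = [(1, 1), (1, 1), (2, 3), (1, 1), (2, 3), (4, 0)]
print(hot_spots(history))  # cells with the most recorded incidents
```

Because the inputs are just accumulated records, the output can be refreshed as new data arrives, which is what lets the projections be "updated regularly."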
In his final State of the Union address, President Barack Obama called upon Vice President Joe Biden to head up Cancer Moonshot, a task force charged with making 10 years’ worth of advances in the diagnosis, prevention and treatment of the over 100 types of cancer in half that time.
Among the recommended actions from the task force’s Blue Ribbon Panel: build a national cancer data ecosystem. A massive amount of patient data exists, but it is housed in proprietary databases or is widely inaccessible, limiting its value as a research resource and all but eliminating its potential to help survivors.
Linking many of the nation’s largest data repositories would enable one-stop, unrestricted access for researchers, doctors and diverse patient populations to share data on cancer and fuel faster progress.
In the private sector, companies like Alphabet, IBM, Apple and Microsoft have been working on ways to collect, aggregate and analyze scientific and health information in an effort to apply this big data to curing cancer.
For example, Watson for Oncology, a cognitive computing system, is a partnership between IBM and Memorial Sloan Kettering in which physicians “train” the system to help interpret cancer patients’ clinical information.
Know of any other powerful big data use cases? Share them in the comments below.