The Internet of Everywhere — How The Weather Company Scales

Weather is both massive and intensely local. Building a weather service involves capturing lots of data for many, many places over long timeframes, applying intensive analysis to those datasets to predict what will happen next, and then delivering those insights to the millions of mobile users and other endpoints that request them, all within a fraction of a second.

Here we share the slides from Robbie Strickland's Spark Summit talk.

In the slides, Robbie offers the full end-to-end view of how The Weather Company uses Apache Spark™ to manage this large-scale challenge.
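To make the capture-analyze-serve pattern concrete before you dive into the deck, here is a minimal, hypothetical Spark sketch (not taken from the slides) that rolls raw station observations up into pre-computed hourly aggregates a serving layer could return quickly. The input path, output path, and column names (stationId, observedAt, tempC) are illustrative assumptions only.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object StationRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("station-temperature-rollup")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical observation feed: one row per (stationId, observedAt, tempC).
    val observations = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/data/observations/*.csv")   // placeholder path

    // Roll observations up to an hourly mean per station -- the kind of
    // pre-computed view that can be served in a fraction of a second.
    val hourlyMeans = observations
      .withColumn("hour", date_trunc("hour", $"observedAt"))
      .groupBy($"stationId", $"hour")
      .agg(avg($"tempC").as("meanTempC"))

    // Persist the aggregate for low-latency lookups downstream.
    hourlyMeans.write
      .mode("overwrite")
      .parquet("/serving/hourly-means")   // placeholder path

    spark.stop()
  }
}
```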

Robbie Strickland is Vice President of Software Engineering at The Weather Company, where he leads the team that builds data analysis services for the digital business, serving the likes of weather.com, Weather Underground, and the TWC mobile apps. He has been involved in the Cassandra project since 2010 and has contributed in a variety of ways over the years, including work on the Scala and C# drivers and the Hadoop and Spark integrations, heading up the Atlanta Cassandra Users Group, and answering lots of Stack Overflow questions. He was selected as a Cassandra MVP for 2015 and 2016.

See the slides on SlideShare or below...
