Testing and Validating Apache Spark Programs

In her talk at Strata+Hadoop World in San Jose in March 2016, Holden Karau detailed reasonable validation rules for Apache Spark production jobs, best practices for creating effective tests, and options for generating test data. See the full talk below.
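To make the testing side concrete, here is a minimal sketch of unit-testing a Spark transformation in local mode. This is not code from the talk: it assumes Spark 2.x and ScalaTest 3.1+ on the classpath, and the WordStats object and its countWords function are hypothetical stand-ins for real job logic.

```scala
// A minimal sketch (not code from the talk), assuming Spark 2.x and
// ScalaTest 3.1+; WordStats and countWords are hypothetical stand-ins
// for job logic factored out so it can be tested in isolation.
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite

object WordStats {
  // The transformation under test: case-insensitive word counts.
  def countWords(lines: RDD[String]): RDD[(String, Int)] =
    lines.flatMap(_.split("\\s+"))
      .filter(_.nonEmpty)
      .map(word => (word.toLowerCase, 1))
      .reduceByKey(_ + _)
}

class WordStatsSuite extends AnyFunSuite {
  test("counts words across lines, ignoring case") {
    // local[2] runs Spark inside the test JVM; no cluster is required.
    val spark = SparkSession.builder()
      .master("local[2]")
      .appName("word-stats-test")
      .getOrCreate()
    try {
      val input  = spark.sparkContext.parallelize(Seq("Spark spark", "testing Spark"))
      val counts = WordStats.countWords(input).collect().toMap
      assert(counts == Map("spark" -> 3, "testing" -> 1))
    } finally {
      spark.stop() // release the local context between suites
    }
  }
}
```

Factoring transformations into plain functions like countWords is what makes tests this cheap to write; Holden's spark-testing-base project wraps session boilerplate like the above into reusable test traits.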

Also be sure to catch two more upcoming talks from Holden:

London: Thursday June 2nd at Strata+Hadoop World in the UK:
Beyond shuffling: Tips and tricks for scaling Spark jobs.

San Francisco: Wednesday June 8th at Spark Summit 2016:
Getting the Best Performance with PySpark. (Video of this talk is now available.)

Book signing: Get your copy of High Performance Spark signed by Holden at 10:45 am Wednesday June 8th at the O'Reilly booth at Spark Summit.

Apache Spark is a fast, general engine for big data processing. As Spark jobs are used for more mission-critical tasks, it is important to have effective tools for testing and validation.
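On the validation side, one common pattern is to count suspect records with an accumulator and abort before publishing results if the count crosses a threshold. The sketch below illustrates the idea under stated assumptions: ValidateJob, the sample data, and the 5% cutoff are illustrative rather than rules from the talk, and it assumes Spark 2.x's longAccumulator API.

```scala
// A hedged sketch of threshold-based validation; ValidateJob, the sample
// data, and the 5% cutoff are illustrative assumptions, not rules from
// the talk. Assumes Spark 2.x (SparkContext.longAccumulator).
import org.apache.spark.sql.SparkSession

object ValidateJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[2]")
      .appName("validation-sketch")
      .getOrCreate()
    val sc = spark.sparkContext

    // Counter for records that fail to parse.
    val invalid = sc.longAccumulator("invalid-records")
    val raw = sc.parallelize((1 to 99).map(_.toString) :+ "oops")

    val parsed = raw.flatMap { s =>
      try Some(s.trim.toInt)
      catch {
        case _: NumberFormatException =>
          invalid.add(1) // updated as a side effect of the transformation
          None
      }
    }

    // Accumulators only reflect reality after an action runs; caching
    // reduces the risk of double-counting if `parsed` is recomputed.
    parsed.cache()
    val validCount = parsed.count()
    val total      = validCount + invalid.value

    // Fail fast before publishing results if too much input was rejected.
    require(invalid.value.toDouble / total <= 0.05,
      s"Aborting: ${invalid.value} of $total records failed to parse")

    spark.stop()
  }
}
```

Note the caveat in the comments: accumulator updates made inside transformations can be double-counted if a stage is recomputed, so production validation rules should treat such counters as approximate or guard against recomputation, for example by caching as above.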

Video: Testing and Validating Spark Programs – Strata+Hadoop World, San Jose 2016. Slides from Holden's talks are available on SlideShare.
