This document provides an introduction to Apache Spark, including an overview of its components and capabilities. It discusses Spark's history and its development at UC Berkeley and the Apache Software Foundation. The document explains the Spark stack and Spark's core abstraction, the Resilient Distributed Dataset (RDD). It provides examples of creating and transforming RDDs in Scala and Java. Finally, it lists some resources for learning more about Spark.
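The document's own RDD examples are not reproduced in this summary; as a rough sketch of the kind of Scala code such an introduction typically shows (the object name and sample values below are illustrative, not taken from the document), creating an RDD from a local collection and applying lazy transformations might look like this:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RDDExample {
  def main(args: Array[String]): Unit = {
    // Local SparkContext for illustration; a real cluster deployment would use different settings.
    val conf = new SparkConf().setAppName("RDDExample").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Create an RDD from an in-memory collection.
    val numbers = sc.parallelize(1 to 10)

    // Transformations are lazy: nothing executes until an action is called.
    val squaresOfEvens = numbers
      .filter(_ % 2 == 0) // keep even numbers
      .map(n => n * n)    // square each one

    // collect() is an action: it triggers the computation and returns the results to the driver.
    println(squaresOfEvens.collect().mkString(", ")) // 4, 16, 36, 64, 100

    sc.stop()
  }
}
```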