This document discusses intelligent agents and their environments. An agent is anything that perceives its environment through sensors and acts upon that environment through actuators. A rational agent selects, for each percept sequence, the action expected to maximize its performance measure given the evidence its percepts provide. An agent's task environment is specified by its Performance measure, Environment, Actuators, and Sensors (PEAS). Task environments are classified along several dimensions: fully vs. partially observable, deterministic vs. stochastic, episodic vs. sequential, static vs. dynamic, discrete vs. continuous, and single-agent vs. multi-agent. Basic agent architectures, in rough order of sophistication, include simple reflex agents, model-based reflex agents, goal-based agents, and utility-based agents.
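
The simple reflex architecture mentioned above can be illustrated with a small sketch. The two-square vacuum world used here is a hypothetical example environment, and the percept format, locations, and condition-action rules are illustrative assumptions, not part of the original text:

```python
# A minimal sketch of a simple reflex agent in a hypothetical
# two-square vacuum world. The agent maps the CURRENT percept
# directly to an action via condition-action rules, with no
# internal state or model of the environment.

def simple_reflex_vacuum_agent(percept):
    """Select an action from the current percept alone."""
    location, status = percept   # e.g. ("A", "Dirty") -- assumed percept format
    if status == "Dirty":        # rule: dirty square -> clean it
        return "Suck"
    if location == "A":          # rule: clean square A -> move right
        return "Right"
    return "Left"                # rule: clean square B -> move left

print(simple_reflex_vacuum_agent(("A", "Dirty")))  # Suck
print(simple_reflex_vacuum_agent(("B", "Clean")))  # Left
```

Because the agent ignores percept history, it works only when the correct action is always decidable from the current percept, which is exactly the limitation that motivates the model-based, goal-based, and utility-based designs.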