Vinod Nayal presented on options for ingesting data into Hadoop. For batch loading from relational databases, options include Sqoop or vendor-specific tools. File-based data can be FTP'd to edge nodes and loaded using ETL tools such as Informatica or Talend. For real-time data, Flume provides transport with light in-flight enrichment, while Storm, with Kafka as a queue in front of it, enables low-latency continuous ingestion with heavier in-flight processing. The choice between Flume and Storm therefore depends largely on how much in-flight processing is required.
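As an illustrative sketch of the two ingestion styles above, the commands below show a Sqoop batch import from a relational database and the launch of a Flume agent for streaming transport. All names here (the JDBC URL, table, user, paths, and the `agent.conf` file) are hypothetical placeholders, not values from the presentation:

```shell
# Batch ingestion: pull a relational table into HDFS with Sqoop.
# Connection details and table name are illustrative placeholders.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4

# Real-time ingestion: start a Flume agent that tails a source and
# writes to an HDFS sink, as defined in a (hypothetical) agent.conf.
flume-ng agent \
  --name agent1 \
  --conf ./conf \
  --conf-file ./conf/agent.conf
```

A corresponding minimal `agent.conf` would wire a source, a channel, and an HDFS sink together; for the Storm-plus-Kafka path, events would instead be published to a Kafka topic and consumed by a Storm topology that performs the heavier in-flight processing before writing to Hadoop.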