The document introduces SPIMBENCH, a scalable benchmark for instance matching in the semantic publishing domain, designed to evaluate matching systems and algorithms. Its test cases are generated by applying value-based, structure-based, and semantics-aware transformations to source data, and the benchmark supplies performance metrics together with a weighted gold standard for evaluation. Future work includes deploying SPIMBENCH in the Ontology Alignment Evaluation Initiative and developing more sophisticated evaluation metrics.
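To give a sense of what a value-based transformation looks like, here is a minimal, illustrative Python sketch that corrupts characters in a literal to simulate typographical variation between otherwise matching instances. The function name, parameters, and corruption strategy are hypothetical and are not taken from SPIMBENCH itself.

```python
import random

def value_transform(literal, error_rate=0.2, seed=42):
    """Hypothetical value-based transformation: randomly replace alphabetic
    characters in a literal to simulate typographical variation.
    This only illustrates the idea; SPIMBENCH's own transformations differ."""
    rng = random.Random(seed)
    alphabet = "abcdefghijklmnopqrstuvwxyz"
    chars = list(literal)
    for i, c in enumerate(chars):
        # Only corrupt letters, and only at the given error rate.
        if c.isalpha() and rng.random() < error_rate:
            # Pick a replacement guaranteed to differ from the original.
            chars[i] = rng.choice([x for x in alphabet if x != c.lower()])
    return "".join(chars)

original = "Semantic Publishing Benchmark"
variant = value_transform(original)
print(original)
print(variant)
```

A matching system under test would then be asked to recognize `original` and `variant` as descriptions of the same instance; structure-based and semantics-aware transformations alter the RDF structure and the ontology-level semantics, respectively, rather than the literal values.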