Crowdsourcing systems are designed to elicit help from humans to accomplish tasks that are still difficult for computers. How to motivate workers to stay longer and/or perform better in crowdsourcing systems is a critical question for designers. Previous work has explored different motivational frameworks, both extrinsic and intrinsic. In this work, we examine the potential of curiosity as a new type of intrinsic motivational driver to incentivize crowd workers. We design crowdsourcing task interfaces that explicitly incorporate mechanisms to induce curiosity and conduct a set of experiments on Amazon's Mechanical Turk. Our experimental results show that curiosity interventions improve worker retention without degrading performance, and that the magnitude of the effects is influenced by both the personal characteristics of the worker and the nature of the task.