So on that note, I'm happy to link to this absolutely outstanding work by the Open Science Framework: project leads Rick Klein, Kate Ratliff, and Michelangelo Vianello, and all co-authors, for their new paper, The "Many Labs" Replication Project. Labs from America, Brazil, the Czech Republic, Malaysia, Turkey, the U.K., and elsewhere cooperated to replicate 13 major effects from the psychology literature across 36 samples, totaling 6,344 participants. Figure 1 below is a tremendous step towards putting some of the recent psychology literature on firmer footing. The x-axis shows the standardized mean difference between the control and treatment groups, so the further rightward a dot sits, the stronger the effect. The two anchoring studies proved extremely robust, with an average effect of over 2 standard deviations (interestingly, quite a bit stronger than in the original studies). The priming studies, by contrast, found no evidence of a priming effect.
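For readers unfamiliar with the x-axis of Figure 1, here is a minimal sketch of how a standardized mean difference (Cohen's d) is computed: the gap between group means divided by the pooled standard deviation. The function name and the sample data below are my own invention for illustration; the values are chosen so the result lands near 2, roughly the size of the anchoring effects in the paper.

```python
def cohens_d(treatment, control):
    """Standardized mean difference: (mean_t - mean_c) / pooled SD."""
    n_t, n_c = len(treatment), len(control)
    mean_t = sum(treatment) / n_t
    mean_c = sum(control) / n_c
    # Sample variances (Bessel-corrected), then the pooled standard deviation.
    var_t = sum((x - mean_t) ** 2 for x in treatment) / (n_t - 1)
    var_c = sum((x - mean_c) ** 2 for x in control) / (n_c - 1)
    pooled_sd = (((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2)) ** 0.5
    return (mean_t - mean_c) / pooled_sd

# Made-up example data, not from the paper.
treatment = [5.1, 6.2, 5.8, 6.5, 5.9]
control = [4.8, 5.2, 4.6, 5.0, 5.4]
print(round(cohens_d(treatment, control), 2))  # → 2.08
```

A dot at d = 2 on Figure 1 therefore means the treatment group's mean response sat two pooled standard deviations above the control group's.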
This paper should be mandatory reading for any social psychology course. You can download the PDF here. It gives succinct summaries of each of the studies being replicated and details the practices used to test them.
*By "no individual incentive" I mean that if you're interested in getting published, replications are not the way to go about it. That is obviously a major issue. The point of science is not to confirm hypotheses but to investigate them; disconfirming a well-reasoned hypothesis is just as valuable as confirming one.