Your first study is from the Heritage Foundation, a group that is routinely criticized for flawed studies that are ideologically motivated, and your second one has numerous methodological flaws:
https://www.washingtonpost.com/blog...allenges-affect-study-of-non-citizens-voting/
"A number of academics and commentators have already expressed skepticism about the paper’s assumptions and conclusions, though. In a series of tweets, New York Times columnist Nate Cohn focused his criticism on Richman et al’s use of Cooperative Congressional Election Study data to make inferences about the non-citizen voting population. That critique has some merit, too. The 2008 and 2010 CCES surveyed large opt-in Internet samples constructed by the polling firm YouGov to be nationally representative of the adult citizen population. Consequently, the assumption that non-citizens, who volunteered to take online surveys administered in English about American politics, would somehow be representative of the entire non-citizen population seems tenuous at best.
Perhaps a bigger problem with utilizing CCES data to make claims about the non-citizen voting in the United States is that some respondents might have mistakenly misreported their citizenship status on this survey (e.g. response error). For, as Richman et al. state in their Electoral Studies article, “If most or all of the ‘non-citizens’ who indicated that they voted were in fact citizens who accidentally misstated their citizenship status, then the data would have nothing to contribute concerning the frequency of non-citizen voting.” In fact, any response error in self-reported citizenship status could have substantially altered the authors’ conclusions because they were only able to validate the votes of five respondents who claimed to be non-citizen voters in the 2008 CCES.
It turns out that such response error was common for self-reported non-citizens in the 2010-2012 CCES Panel Study — a survey that re-interviewed 19,533 respondents in 2012 who had originally participated in the 2010 CCES. The first table below, for instance, shows that nearly one-fifth of CCES panelists who said that they were not American citizens in 2012 actually reported being American citizens when they were originally surveyed for the 2010 CCES. Since it’s illogical for non-citizens in 2012 to have been American citizens back in 2010, it appears that a substantial number of self-reported non-citizens inaccurately reported their (non)citizenship status in the CCES surveys.
Even more problematic, misreported citizenship status was most common among respondents who claimed to be non-citizen voters. The second table below shows that 41 percent of self-reported non-citizen voters in the 2012 CCES reported being citizens back in 2010. The table goes on to show that 71 percent of respondents who said that they were both 2012 non-citizens and 2010 voters had previously reported being citizens of the United States in the 2010 CCES. Given that the authors’ extrapolations of the non-citizen voting population were based on a small number of validated votes from self-reported non-citizens (N = 5), this high frequency of response error in non-citizenship status raises important doubts about their conclusions.
To be sure, my quick analysis does not at all disprove Richman et al’s conclusion that a large enough number of non-citizens are voting in elections to tip the balance for Democrats in very close races. It does, however, suggest that the CCES is probably not an appropriate data source for testing such claims."
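The panel-consistency check the quoted analysis describes is simple to reproduce in principle: cross-tabulate each panelist's 2010 and 2012 self-reported citizenship, then flag 2012 "non-citizens" who said "citizen" in 2010, since that combination is logically impossible and implies a misreport. The sketch below uses hypothetical counts (chosen only to roughly echo the "nearly one-fifth" figure), not actual CCES tabulations:

```python
# Sketch of the panel-consistency check described in the quoted analysis.
# Counts are HYPOTHETICAL illustrations, not real CCES data; only the
# logic (flagging 2012 self-reported non-citizens who reported being
# citizens in 2010) mirrors the article's approach.

# (status_2010, status_2012) -> number of panelists
panel = {
    ("citizen", "citizen"): 19000,        # hypothetical
    ("citizen", "non-citizen"): 85,       # inconsistent: implied misreport
    ("non-citizen", "non-citizen"): 350,  # hypothetical
    ("non-citizen", "citizen"): 98,       # plausible (e.g. naturalization)
}

# All panelists who self-reported as non-citizens in 2012.
noncit_2012 = sum(n for (s10, s12), n in panel.items() if s12 == "non-citizen")

# A 2012 non-citizen cannot have been a citizen in 2010, so this
# cell can only be response error in one of the two waves.
misreports = panel[("citizen", "non-citizen")]

misreport_rate = misreports / noncit_2012
print(f"{misreport_rate:.1%} of 2012 self-reported non-citizens "
      "reported being citizens in 2010")
```

Note the asymmetry: the reverse transition (non-citizen in 2010, citizen in 2012) is legitimate, via naturalization, which is why only the one cell is treated as an inconsistency.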