DURHAM, N.C. -- If you live in one of the battleground states in this year’s races for U.S. Senate, you have probably been inundated with political ads, many of which talk about a candidate’s willingness to toe the party line or vote across the aisle.
Now, analyzing such claims for accuracy is about to get easier, thanks to a new website developed at Duke University. The site, iCheck (icheckuclaim.org), lets visitors evaluate claims about congressional voting records more critically by taking a closer look at the data behind them.
Take an attack ad released Aug. 8 against Democratic Senate candidate Evan Bayh of Indiana. The ad challenges Bayh’s reputation as a moderate, saying that “Bayh toed the Obama party line 96 percent of the time.”
But a closer look at Bayh’s voting record gives a different picture. iCheck reveals that Bayh did support the president’s position 96 percent of the time in 2010, the final year of his 12 years in office. But merely shifting the comparison period from 2010 to 2009 causes the number to drop to 77 percent -- not very high, considering that every other Senate Democrat, and even some Senate Republicans, voted with Obama more often than that during the same period.
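The calculation behind this kind of claim is simple, and the example shows why the time window matters so much. The sketch below is purely illustrative -- it is not iCheck's actual code, and the vote records are invented toy data -- but it demonstrates how the same legislator can score 96 percent or 77 percent depending solely on the years included:

```python
# Illustrative sketch (not iCheck's code): an "agreement with the president"
# percentage depends entirely on the time window chosen. All data is toy data.
from dataclasses import dataclass

@dataclass
class Vote:
    year: int
    legislator_vote: str      # "yea" or "nay"
    president_position: str   # "yea" or "nay"

def agreement_pct(votes, start_year, end_year):
    """Percent of votes in [start_year, end_year] where the legislator
    sided with the president's stated position."""
    window = [v for v in votes if start_year <= v.year <= end_year]
    if not window:
        return 0.0
    agree = sum(v.legislator_vote == v.president_position for v in window)
    return round(100 * agree / len(window), 1)

# Toy records mimicking the Bayh example: high agreement in 2010, lower in 2009.
votes = (
    [Vote(2009, "yea", "yea")] * 77 + [Vote(2009, "nay", "yea")] * 23 +
    [Vote(2010, "yea", "yea")] * 96 + [Vote(2010, "nay", "yea")] * 4
)

print(agreement_pct(votes, 2010, 2010))  # 96.0 -- the ad's framing
print(agreement_pct(votes, 2009, 2009))  # 77.0 -- one year earlier
```

Both numbers are "true," which is exactly why seeing the parameters behind a claim matters.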
“iCheck is a great tool for fact-checkers and voters because it puts the facts in context,” said Bill Adair, founder of the website PolitiFact and a professor of journalism and public policy at Duke.
The iCheck database contains votes for every member of the House and Senate since 2009, spanning more than 2.5 million votes and tens of thousands of bills. The site integrates data from multiple sources, including GovTrack.us and a legislative tracking service called Congressional Quarterly.
Visitors to the iCheck site can look up a specific senator or representative from their state and see how often their legislator voted with the president’s position or the majority votes for each party, as well as how those alignments compare with other members of Congress. They can also explore how these measures change over time, or over different sets of key votes identified by special interest groups such as Americans for Democratic Action or the American Conservative Union.
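Restricting a comparison to a subset of "key votes" is one of those parameters a visitor can change. The hypothetical sketch below, with invented identifiers and data, shows the basic shape of such a query -- score a member's agreement with their party's majority over whichever set of votes an interest group has flagged:

```python
# Hypothetical sketch of the kind of parameterized query described above.
# All vote IDs and positions are invented for illustration.
def agreement_on_key_votes(member_votes, party_majority, key_vote_ids):
    """Share of key votes where a member voted with their party's majority.
    member_votes / party_majority: dicts mapping vote id -> "yea"/"nay"."""
    relevant = [v for v in key_vote_ids
                if v in member_votes and v in party_majority]
    if not relevant:
        return None
    agree = sum(member_votes[v] == party_majority[v] for v in relevant)
    return round(100 * agree / len(relevant), 1)

member = {"hr1": "yea", "hr2": "nay", "s10": "yea", "s11": "yea"}
party  = {"hr1": "yea", "hr2": "yea", "s10": "yea", "s11": "nay"}

# The same member scores differently depending on which votes a group flags.
print(agreement_on_key_votes(member, party, ["hr1", "hr2"]))         # 50.0
print(agreement_on_key_votes(member, party, ["hr1", "s10", "s11"]))  # 66.7
```

Different interest groups flag different votes, so the same member can earn very different scores -- another reason to check which parameters produced a claimed number.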
“iCheck is unique in that you can examine the data from multiple angles and change the parameters to see how a claim holds up,” said Brett Walenz, a co-author of the study describing the site.
Harnessing the power of computers to do some of this work is especially important in the context of today’s newsroom, Adair said. “With newsrooms shrinking, it’s become really important to create tools like iCheck that relieve some of the workload for journalists so they can focus on more important tasks.”
The speed of the Internet news cycle means that half-truths and exaggerations are often shared and repeated more rapidly than human fact-checkers can expose them.
But analyses that might take hours for a human fact-checker can be explored in a matter of seconds with iCheck. With one click the site also lets visitors tweet their findings or share them on Facebook to help correct misinformation before it can spread.
iCheck is only a first step toward using computers to analyze congressional voting claims, the researchers say. It’s good for verifying statements about how often legislators align with other groups and spotting trends over time. But the site doesn’t yet have the data to evaluate claims about how often someone voted for or against a particular issue such as national security or gun control, especially in the case of massive omnibus bills that lump multiple measures together to be passed in a single vote.
iCheck is part of a surge in fact-checking initiatives, which now number more than 100 worldwide, according to tallies kept by the Duke Reporters’ Lab.
“Yes, some candidates have a poor record for accuracy, but the reason we know that is because there are now so many fact-checkers examining what the politicians say,” Adair said.
This research was supported by the National Science Foundation (IIS-1408846, IIS-1320357), a Google Faculty Research Award and Google Research Cloud Credit.
CITATION: "Fact Checking Congressional Voting Claims," Brett Walenz, Junyang Gao, Emre Sonmez, Yubo Tian, Yuhao Wen, Charles Xu, Bill Adair and Jun Yang. Proceedings of the 2016 Computation+Journalism Symposium, Sept. 2016, Stanford, California.