Politifact's ratings don't involve intent:
http://www.politifact.com/about/
They clearly involve bias. There's bias in each individual rating: in how they determine the veracity of the underlying claims and in how they decide on the rating itself. Then there's the whole big ball of wax of what they even select to rate in the first place. Adding all of that together leads to enormous bias.
We can look at individual ratings to help assess this bias. Let's see how it holds up to RF debate and how we handle each of these elements, though the last one (the totality of what they choose to rate at all) isn't one we could likely tackle in under a year, even if that were all we worked on for 40 hours a week. And yet it's that selection which produces a chart like the one in the OP.
But if you really think they aren't biased, then I suggest we each choose an item (rated true or false), bring it to the table, and debate it. Let's see if we all agree on the rating. If we don't, that would be an indication of their bias, rather than some bizarre notion that they're simply going with "accurate" as if that were objective.
I just looked at one before writing this post that struck me as "half true" and deals with stuff that is years or decades old (about Trump). They decided to rate it "true." I'd love to have a debate that really explores what is being stated in the assertions, and that isn't isolated strictly to that one claim but also looks at how it compares to all the other claims on the site. IOW, I am explicitly stating that they cherry-pick what gets rated "true" for Clinton and what gets rated "true" for Trump. Plus I'm saying there are many layers of bias involved, all of which is discernible if you spend more than, say, 30 minutes on the site.