The challenge is that you need a babysitter right away. And you are rightfully concerned about bringing a stranger into your home and entrusting her with your baby. What you need is the perfect babysitter. What you are left with is the reality that there is no such thing. So you have to make the best decision you can on very little information.
Jessie Battaglia knew that she couldn’t make an accurate assessment with a simple face-to-face interview. So she did what any person raised in the AI generation would do: She turned to AI to screen candidates for her. The Washington Post tells the story in, “Wanted: The ‘perfect babysitter.’ Must pass AI scan for respect and attitude.”
“So she turned to Predictim, an online service that uses ‘advanced artificial intelligence’ to assess a babysitter’s personality, and aimed its scanners at one candidate’s thousands of Facebook, Twitter and Instagram posts.
“The system offered an automated ‘risk rating’ of the 24-year-old woman, saying she was at a ‘very low risk’ of being a drug abuser. But it gave a slightly higher risk assessment — a 2 out of 5 — for bullying, harassment, being ‘disrespectful’ and having a ‘bad attitude.'”
Would you hire a person if you knew she had a 40 percent chance (2 out of 5, read as a probability) of bullying, harassing, and being disrespectful? What if there were a 20 percent chance she was also a drug abuser? How does the system even make these determinations? According to the Post, there is no way to know.
“The systems depend on black-box algorithms that give little detail about how they reduced the complexities of a person’s inner life into a calculation of virtue or harm.”
Part of what the system has to work with is photos and text. What happens if a person is commenting on a particular song with objectionable lyrics? The person might well be objecting to the lyrics. But the fact that the words are there to be analyzed might cause the system to flag that person as violent, disrespectful, or dangerous in some other way.
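To see why quoting lyrics can backfire, here is a minimal sketch of naive keyword flagging. This is not Predictim's actual method (the article stresses the algorithms are black boxes); the watchlist and function names are hypothetical, chosen only to illustrate how context-free scanning misreads a post that criticizes a lyric.

```python
# Hypothetical watchlist; any real system's vocabulary is unknown to us.
FLAGGED_WORDS = {"fight", "hate", "hurt"}

def risk_flags(post: str) -> set[str]:
    """Return watchlist words appearing in the post; context is ignored."""
    words = {w.strip(".,:!?\"'").lower() for w in post.split()}
    return words & FLAGGED_WORDS

# The poster is condemning the lyric, but the scanner only sees the words.
post = 'These lyrics are awful: "I hate you and I want to fight."'
print(sorted(risk_flags(post)))  # → ['fight', 'hate']
```

The post objects to the lyric, yet the flagged words are identical to those in a genuinely hostile message, so a scorer built this way cannot tell the two apart.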
Predictim does little to allay those concerns. Its ambition is to gain even more insight by mining social media data. It wants psychometric data on babysitters by "running their histories through personality tests, such as Myers Briggs, and offering to sell parents the results."
This type of AI/social media profiling does not end with babysitters. Many companies are already using or preparing to use these types of systems to comb through resumes and make hiring decisions. When you apply for a job, your Disqus profile could be mined to determine if you are a good fit.