For a while now, facial recognition software has struggled with darker skin tones. Depending on who you ask, that can be a good or a bad thing.

That becomes a real hassle as facial recognition technology is adopted throughout much of the travel experience. British Airways, for instance, uses facial recognition at its boarding gates in New York, Los Angeles, London, and Orlando; Delta Air Lines has already introduced the nation's first biometric terminal in Atlanta; TSA PreCheck has plans to introduce facial recognition; and China has been using the technology at its airports in various ways. But what happens when the software is biased against people of color and traveling suddenly becomes an inconvenience?

Google has funded a facial recognition project targeting people of color in an attempt to build a far more diverse database, so that its products using biometric technology won't fall victim to racial bias. The company hired temps through a third-party agency to collect face scans from people on the street, offering participants a $5 gift card in return. However, the practice itself seems questionable, especially since teams were reportedly dispatched to target homeless people in Atlanta, unsuspecting students on college campuses around the U.S., and attendees of the BET Awards festivities in Los Angeles, among other places.

One of the temps who was dispatched says they were even told to disguise the data collection as a selfie game and to say things like, “Just play with the phone for a couple of minutes and get a gift card,” and, “We have a new app, try it and get $5.”

“We were told not to tell (people) that it was video, even though it would say on the screen that a video was taken,” a source said, adding that video of each user was stored under each TVC’s profile and periodically reviewed in performance meetings.

“If the person were to look at that screen after the task had been completed, and say, ‘Oh, was it taking a video?’… we were instructed to say, ‘Oh it’s not really,’” the source said.

The source also says participants were rushed through the consent agreements.

So, with facial recognition becoming such a major part of our lives and a real convenience when it comes to traveling, is it worth being part of a database, or does something seem fishy when you basically have to be tricked into giving your identity away?

Would you willingly participate in Google’s study? Share your thoughts.