
The Protectors of Our Data
Over the past few weeks, a new app has captured the attention of millions of people across the US and around the world. FaceApp, a gimmicky app, uses predictive technology to take a pass at what you might look like when you are old. Even celebrities were in on the fun, with Carrie Underwood, Russell Wilson, Drake, Ryan Lochte, Gordon Ramsay and even LeBron James all posting pictures of their future selves. The app eventually went viral, surpassing 100 million downloads from Google’s Play store alone and becoming the top downloaded app on Apple’s App Store.
Soon, however, the fun turned to concern and regret. News reports began highlighting that the owners of the app were based in St. Petersburg, Russia, raising fears that FaceApp might be sharing anything uploaded, including pictures, with the Russian government and its intelligence agencies. Further concerns arose when FaceApp’s privacy statement revealed that users were granting FaceApp “a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license” to use any uploaded images.
The backlash, once the news broke, was swift. Senator Chuck Schumer wrote to FBI Director Christopher Wray that “FaceApp’s location in Russia raises questions regarding how and when the company provides access to the data of U.S. citizens to third parties, including foreign governments.” Senator Rick Scott introduced the Promoting Responsibility In Markets and E-Retailers (PRIME) Act of 2019, which would require digital apps, among other products, to disclose their country of origin. And Democratic National Committee security chief Bob Lord sent a note to all Democratic campaigns urging them to have all staffers delete the app immediately.
Yet, in the aftermath of the escalating concern about FaceApp, one thing also became clear: while FaceApp’s location in Russia may raise concerns, the company’s practices are neither unique nor isolated. In fact, many of the most prominent apps likely on your phone use similar techniques. Facebook has a similar policy governing the use and storage of uploaded photos: “you grant us a non-exclusive, transferable, sub-licensable, royalty-free, and worldwide license to host, use…and create derivative works of your content.” Facebook was also recently fined $5 billion for allowing third parties to improperly collect information on tens of millions of consenting and non-consenting Facebook users. TikTok, a China-based company, recently agreed to pay the Federal Trade Commission a $5.7 million settlement over allegations that it had been illegally collecting the private information of children using the app. And Instagram allowed a vetted advertising partner to misappropriate vast amounts of public user data, creating detailed records of users’ physical whereabouts, personal bios, and photos that were intended to vanish after 24 hours.
The frightening thing is that each of these apps went through a review process to be listed on both the App Store and the Google Play store. Consumers, believing that Apple and Google were protecting their privacy interests, soon found out that this isn’t always the case. Both stores do have privacy policies that apps must comply with in order to be listed. Apple’s app guidelines require that “all apps must include a link to their privacy policy in the App Store Connect metadata field and within the app in an easily accessible manner,” and warn that, if you steal user data, “your apps will be removed from the store and you will be expelled from the Developer Program.” Google Play‘s guidelines likewise require that “[Y]our app must provide an in-app disclosure of your data collection and use” and that app developers be “transparent in how you handle user data (e.g., information collected from or about a user, including device information). That means disclosing the collection, use, and sharing of the data, and limiting the use of the data to the purposes disclosed, and the consent provided by the user.”
Yet neither store flagged the concerns later raised about FaceApp by Senators Schumer and Scott. And, by allowing the app into their stores, both gave tacit approval to the app’s privacy practices.
Which raises the question: why are these companies allowed to capriciously, and even carelessly, determine the requirements for consumer protection, especially given possible national security concerns about the origin and intent of certain apps? Given the implications for consumer privacy and foreign interference, it makes little sense to grant these companies, which may often have conflicting incentives when approving apps, a wide mandate to unilaterally determine the viability and accessibility of apps.
To fix this situation, Congress should pass a bill empowering the Federal Communications Commission to set the standards and protocols governing app behavior. This would provide a framework for companies such as Google and Apple to approve or reject apps from their respective app stores, and would give app developers consistency and stability in the review process. A framework such as the one I am proposing here has historical precedent at the FCC. In fact, if you look at most of your current electronic devices, you’ll see an FCC logo indicating compliance with the Declaration of Conformity and Certification procedures established in 1998. These regulations, aimed primarily at limiting electromagnetic interference, trace back to Part 15 of the FCC rules, adopted in 1975. The FCC’s DoC certification required electronic devices such as computers, TVs, monitors, smartphones, tablets and any other device emitting RF radiation to comply with the FCC’s technical requirements before they could be marketed or operated within the United States. Changes in 2017 further streamlined the approval process, allowing manufacturers to self-approve certain devices under a Supplier’s Declaration of Conformity.
Extending the Declaration of Conformity and Certification framework would provide a mechanism for the FCC to set forth guidelines for acceptable disclosure and behavior by apps. This proposal would go beyond the requirements set forth in Senator Scott’s PRIME Act, by granting authority to the FCC to develop and enforce guidelines around informed consent, data collection and duration, location of data storage, notice requirements around any disclosure of consumer data to third parties and any other consumer and national security protection the FCC deems necessary. Apps would then have a clear framework to review and comply with, allowing them the ability to certify via self-approval their conformity with the framework.
Implementing a framework for companies to comply with won’t be an easy task. Finding the right balance between protecting consumers and national security on one hand, and fostering innovation on the other, will be challenging. Ensuring the framework addresses risks while eliminating uncertainty will be difficult for the FCC, and the rules will likely need to go through several rounds of public comment. It should also be noted that this proposal should avoid any subjective measures, including any consideration of the type of content, focusing instead on creating a checklist that ensures protection and security. As the pace of electronic device innovation under the Declaration of Conformity and Certification has shown, it is possible. For too long, for-profit corporations have been in charge of how US consumers’ data is protected. The urgency and importance of the national security stakes necessitate a change in the way we view the approval of these apps.
Empowering the FCC to set forth and enforce guidelines governing the collection, use, storage and sharing of consumer data would be a significant step toward protecting not just consumers but our nation as well.