Mobile App Privacy Archives | TrustArc https://trustarc.com/topic-resource/mobile-app-privacy/

The Importance of Consent Management Platforms on Mobile Apps https://trustarc.com/resource/mobile-consent-management/ Fri, 19 Jul 2024 18:14:12 +0000 https://trustarc.com/?post_type=resource&p=5032
Whitepaper

The Importance of Consent Management Platforms on Mobile Apps

Unlocking mobile app privacy

Discover how Consent Management Platforms (CMPs) help mobile apps comply with privacy regulations, build user trust, and maintain data transparency. Learn about integrating with Apple ATT, Google Consent Mode, and IAB TCF v2 in our latest whitepaper.

 

Key takeaways:
  • Regulatory compliance: CMPs ensure mobile apps comply with GDPR, CCPA, and the EU Digital Markets Act.

  • User trust: CMPs enhance transparency and give users control over their data.

  • Seamless integration: CMPs integrate with Apple ATT, Google Consent Mode, and IAB TCF v2 for robust data privacy.
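The gating logic a CMP applies across these three signals can be sketched in a few lines. This is a minimal illustration under stated assumptions, not any vendor's API: the class, its field names, and the choice of TCF purposes 1 and 4 are all invented for the example.

```python
from dataclasses import dataclass

# Hypothetical consent snapshot a CMP might assemble from the three
# frameworks named above. Field names are illustrative, not vendor APIs.
@dataclass
class ConsentState:
    att_authorized: bool          # Apple ATT: user allowed tracking in the system prompt
    gcm_ad_personalization: bool  # Google Consent Mode: ad_personalization granted
    tcf_purpose_consents: set     # IAB TCF v2: purpose IDs the user consented to

def personalized_ads_allowed(state: ConsentState) -> bool:
    """Serve personalized ads only when every applicable signal grants consent.
    TCF purposes 1 (storage/access) and 4 (personalized ads) are checked as an
    example subset of the framework's purposes."""
    return (
        state.att_authorized
        and state.gcm_ad_personalization
        and {1, 4} <= state.tcf_purpose_consents
    )

# ATT denied: fall back to non-personalized ads regardless of other signals.
print(personalized_ads_allowed(ConsentState(False, True, {1, 4})))  # False
print(personalized_ads_allowed(ConsentState(True, True, {1, 4})))   # True
```

The design point is that the strictest applicable signal wins: a denial under any one framework is enough to disable personalized ads.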

 
How App Stores Privacy Updates will Impact Your Business https://trustarc.com/resource/how-app-stores-privacy-updates-will-impact-your-business/ Mon, 24 Jan 2022 16:29:00 +0000 https://trustarc.com/?post_type=resource&p=2696
Articles

How App Stores Privacy Updates will Impact Your Business

In June 2021, Apple announced that all developers on the App Store must allow users to delete their accounts directly within the app.

While this wasn't a hard requirement when first announced, it will be mandatory from June 30, 2022 onward.

Why is this happening now?

Many of the biggest tech companies in the world (think: Apple and Microsoft) have been focusing on consumer trust and privacy in recent years, and this trend will continue into 2022 and beyond. The Cookiepocalypse is proof of that.

This change may seem like a tiny piece of your complete privacy and trust ecosystem, but it's a very important one.

What can you do to ensure you’re compliant with this App Store change?

This is a nuanced question: compliance is a long, process-driven journey that requires input from the entire business, external vendors, and key stakeholders to get right.

If you're in a position where you need to make this change, there are a few actions you can take to keep your place in the Apple App Store active. These actions include:

  • Adopt a consent management platform that makes it easy to manage your data subject requests
  • Review all the privacy laws that apply to your business
  • Make sure your app privacy notice clearly explains what data you collect, how it is collected, and how you might use it in the future
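As a concrete illustration of the first action, here is a sketch of how an in-app deletion request might be received and fulfilled against a simple in-memory user store. All names are hypothetical; this is not TrustArc's or Apple's API, just the shape of the workflow.

```python
from datetime import datetime, timezone

class DeletionRequestHandler:
    """Toy data-subject-request handler: receives an in-app account
    deletion request, removes the user record, and keeps a minimal
    audit entry so the deletion can later be demonstrated."""

    def __init__(self, user_store: dict):
        self.user_store = user_store  # user_id -> profile data
        self.audit_log = []           # records of completed deletions

    def delete_account(self, user_id: str) -> bool:
        if user_id not in self.user_store:
            return False
        del self.user_store[user_id]
        self.audit_log.append({
            "user_id": user_id,
            "deleted_at": datetime.now(timezone.utc).isoformat(),
        })
        return True

store = {"u1": {"email": "a@example.com"}}
handler = DeletionRequestHandler(store)
print(handler.delete_account("u1"))  # True
print("u1" in store)                 # False
```

A production system would also propagate the deletion to backups and third-party processors; the audit trail here stands in for the proof-of-deletion record regulators increasingly expect.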

This is a lot of information to digest in a relatively short timeframe.

The good news is that there are solutions available that can help your business with this exact use case – and the even better news is that you can contact TrustArc to learn more about our own solutions (such as Individual Rights Manager) that will help you today and beyond.

Emerging Markets: A ride-sharing app in Colombia fails to certify the deletion of personal data https://trustarc.com/resource/emerging-markets-a-ride-sharing-app-in-colombia-fails-to-certify-the-deletion-of-personal-data/ Fri, 23 Jul 2021 16:24:00 +0000 https://trustarc.com/?post_type=resource&p=2879
Article

Emerging Markets: A ride-sharing app in Colombia fails to certify the deletion of personal data

Annie Greenley-Giudici

Drinking and driving is never a good idea and, depending on the circumstances, it is illegal.

This is particularly true in a city like Bogota, where a population of close to 10 million, an elevation of 8,660 feet above sea level, and occasional foggy nights can make things confusing for drivers who indulge in a few drinks after work.

Perhaps this is why many Colombians embraced the convenience of an app that hails a designated driver to pick them up at their favorite bars and ensure that users and cars arrive home safely.

But not all consumers want their personal data to reside in the app forever.

After all, it may pertain to their residence, vehicle information, favorite bars, drinking habits, and credit card number. Fortunately, Colombia has a Data Protection Law to protect consumers in these situations.

Recently, Colombia’s Superintendence of Industry and Commerce (SIC) issued an enforcement action that illustrates the need to manage requests for the deletion of personal data in emerging markets.

What happened in this enforcement action?

A consumer could not remove their credit card information from the app and submitted a request to delete their data.

The ride-share company agreed and offered to issue a deletion certificate.

However, a month later, the data still had not been deleted.

So, naturally, the consumer filed a complaint with the SIC, and a full investigation ensued.

The SIC found that failing to demonstrate the deletion of the consumer’s personal information is a breach of two sections of the Data Protection Law: §§ 8(e) (right to deletion) and 17(a) (duty to guarantee the exercise of the habeas data right).

The ride-share company was penalized with a fine of COP 44,658,840 (US$11,711), along with orders to:

  • Document its data processing procedures
  • Develop a procedure for consumers to exercise their habeas data rights
  • Implement a plan for supervision and periodic review to ensure compliance within two months
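The last of those orders, supervision and periodic review, is easy to picture as code: scan open deletion requests and flag any that have exceeded a compliance deadline. The 30-day window below is an assumption for illustration, not a figure from the SIC decision, and the data shape is invented.

```python
from datetime import date, timedelta

def overdue_requests(requests: list, today: date, deadline_days: int = 30) -> list:
    """Return the IDs of deletion requests still open past the deadline.
    Each request is a dict: {"id": ..., "received": date, "completed": bool}."""
    cutoff = today - timedelta(days=deadline_days)
    return [
        r["id"]
        for r in requests
        if not r["completed"] and r["received"] <= cutoff
    ]

reqs = [
    {"id": "r1", "received": date(2021, 5, 1), "completed": False},
    {"id": "r2", "received": date(2021, 6, 25), "completed": False},
    {"id": "r3", "received": date(2021, 5, 1), "completed": True},
]
print(overdue_requests(reqs, today=date(2021, 7, 1)))  # ['r1']
```

Running a check like this on a schedule, and escalating anything it returns, is the essence of a periodic review plan: the enforcement action arose precisely because a request slipped through for a month unnoticed.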

A framework-based analysis is available here. The entire decision (in Spanish) is here.

Emerging Market Impact

Colombia has a GDP per capita of USD 5,332 and internet penetration of 65%.

Enforcement actions involving monetary fines and comprehensive revisions of privacy practices are not rare.

As Colombia's 50 million people continue to embrace electronic communications to navigate the intricacies of its geography, more frequent scrutiny of privacy compliance should be expected.

EDAA Launches New Mobile Principles for Online Behavioral Advertising https://trustarc.com/resource/mobile-principles-online-behavioral-advertising/ Wed, 09 Mar 2016 14:57:00 +0000 https://trustarc.com/?post_type=resource&p=3014
Articles

EDAA Launches New Mobile Principles for Online Behavioral Advertising

EDAA Self Regulatory Program for Online Behavioral Advertising Extends to Mobile Devices

At the EDAA Summit in Brussels, the European Digital Advertising Alliance announced new Mobile Principles to extend the EDAA Self Regulatory Program for Online Behavioral Advertising to the mobile environment.

Differences in the EU and US Principles

Broadly, this move aligns the EDAA with its partner organization in the U.S., the Digital Advertising Alliance (DAA), which released Mobile Guidelines to amend its principles in mid-2013. There are, however, two notable differences between the EU and the U.S. framework.

Use of the Icon

The EDAA Mobile Principles require that the enhanced notice mechanism inside a mobile ad be the Icon or Icon & AdMarker specifically, rather than allowing any conspicuous mark embedded in the ad creative that links to a notice page.

Use of Device Data

In the EDAA Mobile Principles, there is a slight difference in how information on a mobile device is classified. In the U.S. DAA guidelines, there is a reference to “Personal Directory Data” being used for interest-based advertising requiring enhanced notice and choice (i.e., requiring the Icon).

In the EDAA Mobile Principles, that data is redefined more broadly as "Personal Device Data," which likewise requires enhanced notice and choice. This small change in verbiage means that any ad targeted to a user based on information gathered from other applications on their device is, according to the EDAA, an Interest-Based Ad that requires enhanced notice and choice (i.e., requires the Icon).

Impact of the Changes

Point one above closes a small loophole that allowed a company in the U.S. to forgo licensing the Icon for mobile usage and instead use a different icon to notify consumers. The main goal of this change is to have the industry standardize on a single symbol for managing consumer privacy so that consumers are not confused.

Point two above is a much broader change, affecting most CPI- and CPC-focused companies in the ad-serving chain. It means that companies gathering information about other apps on a user's device will need to serve the icon in ads.

Since understanding a user based on the types of applications they download is common practice, this may be a major change for the performance advertising side of the industry.

How Increasing Opt-Out Awareness of Smartphone Tracking Can Boost Trust https://trustarc.com/resource/opt-out-awareness-smartphone-tracking/ Fri, 20 Feb 2015 16:03:00 +0000 https://trustarc.com/?post_type=resource&p=3037
Articles

How Increasing Opt-Out Awareness of Smartphone Tracking Can Boost Trust

Smartphone Users Wary: 68% Concerned About Tracking for Targeted Ads

Smartphone users don't like the idea of being served targeted ads on their smartphones – at least for now. New survey results from Ipsos on behalf of TRUSTe show that 68% of US smartphone users are concerned about the possibility of their activity being tracked to serve targeted ads.

AdChoices Awareness Rising: 37% Informed, Impacts Consumer Trust

Study after study has shown that smart device users, as well as the majority of people connected to the Internet in some way, don’t like being tracked without their knowledge or consent and have concerns about privacy. But this could change in the near future.

This survey also showed that an increasing number of people are aware of the AdChoices icon, which is part of the Digital Advertising Alliance (DAA) Self-Regulatory Program for Online Behavioral Advertising (OBA). Now, 37% of people are aware of this icon – a notable increase from 21% in the previous year.

User Control Matters: 33% More Positive with AdChoices Opt-Out

As more consumers become aware of the AdChoices icon and realize that ads with this symbol let them opt out of tracking, consumer trust in ads may increase. This also underscores the importance for advertisers to be transparent and allow user control and consent when sharing information.

The survey also showed that one in three (33%) said the information available on AdChoices and the OBA opt-out option would make them feel more positive about the concept of targeted ads.

TRUSTed Ads Empowers Users: Opt-Out for Enhanced Control

TRUSTed Ads gives consumers more control over their online ad experience by allowing them to opt out of targeted ads via the DAA AdChoices icon.

The Upsides and Downsides of Private Messaging Apps https://trustarc.com/resource/private-messaging-apps/ Mon, 26 Jan 2015 16:13:00 +0000 https://trustarc.com/?post_type=resource&p=3043
Articles

The Upsides and Downsides of Private Messaging Apps

Private Messaging Apps Are Quickly Growing

Have we reached the end of the “age of oversharing”? Private messaging apps are the fastest-growing category of apps, according to mobile analytics firm Flurry. Recent stats show downloads of private social messaging apps increased by 200 percent in 2013 over 2012.

From the basic urge to “say Yo” or share a few emojis to the distribution of self-destructing content to select audiences, the desire for greater control over privacy seems to drive the private messaging boom.

The Guardian recently reviewed its picks for the 10 best messaging apps. One omission from this list is Wickr, regarded as one of the most secure options.

The allure of private messaging technology is undeniable. But there are upsides and downsides to these apps and tools.

The Upsides of Private Messaging Apps

We all have a nuanced understanding of our relationships with others and the contexts in which we communicate. In traditional social media, this has been limited by platforms that may lack adequate sharing options. Moreover, the business objectives of social media companies (increasing user base and driving user engagement) have a bias for public sharing and openness.

Private messaging offers a range of benefits:

  1. Apps offer a way to curate content for a more intimate group of followers.
  2. Certain apps offer the ability to share anonymously.
  3. Apps can offer a degree of impermanence to what you share, meaning content may self-destruct once viewed or after a pre-determined period of time.
  4. Having a messaging option separate from your more public accounts can help prevent unintentional sharing.
  5. Most apps are convenient, free, and optimized for mobile use.

Potential Downsides:

Before you install the latest and greatest round of sharing apps, ask yourself:

  1. How can you be sure “ephemeral” apps delete what you share? As users of some apps have learned, shared content may still exist on devices. There are few guarantees that recipients won’t take screenshots which can then be distributed.
  2. Are anonymous apps truly anonymous? How much information are you asked to provide when you create an account? What non-personal information is collected by the app, and can that information be associated with your account? As this Danish Consumer Council’s hidden video experiment shows, we’d be shocked if our local bakery asked for as much information as the average mobile app.
  3. How secure is the app? Many startups may not have robust cybersecurity processes, and even those with solid protocols may be subject to security glitches (as the recent Instagram bug revealed.) Are our messages encrypted? Is address book data stored on a user’s device or the app provider’s servers? What are the app’s policies regarding sharing, selling, or trading user data?
  4. Could the app expose you to cyberbullying or harassment? With anonymity or secrecy comes the potential for abuse. It can also be difficult to report users for inappropriate content. Recently the anonymous app After School was banned from the Apple store because of these concerns.

There are plenty of resources to help you make informed decisions about private messaging apps:

Appthority’s Reputation Report (PDF): This report analyzes the behaviors of the top 400 mobile apps, including the top 100 free apps and 100 paid apps for both iOS and Android. It identifies apps’ risky behaviors and can help you understand the risks posed by those behaviors.

PrivacyGrade: PrivacyGrade provides detailed information about an app's privacy-related behaviors. The ratings are summarized in the form of a grade ranging from A (most privacy sensitive) to D (least privacy sensitive). Currently it rates only Android apps.

EFF’s Secure Messaging Scorecard: As part of a new EFF Campaign for Secure & Usable Crypto, this site offers a scorecard of certain apps and tools and their adoption of security best practices.

SafeSmartSocial.com: Though intended primarily as a guide for parents curious about the apps their kids may be using, this site provides clear, concise guides to many private messaging apps.

Private messaging apps can be a fun, useful way to engage, but as always, proceed cautiously. Pause before you post. Regardless of how private an app claims to be, continue to share mindfully.

Why Are Social Media Experiments Considered An Invasion of Privacy? https://trustarc.com/resource/social-media-experiments-invasion-privacy/ Mon, 25 Aug 2014 16:23:00 +0000 https://trustarc.com/?post_type=resource&p=3052
Articles

Why Are Social Media Experiments Considered An Invasion of Privacy?

Social Media is Very Personal

We all use it differently, which reflects the real world: we all socialize in different ways. But when news broke of social media experiments by popular channels, users were outraged. Why is our expectation of privacy so high on the very channels where we share the most?

Facebook's 2012 experiment tested nearly 700,000 users' emotional responses to their news feeds to vet a theory on the transferability of mood. Facebook manipulated users' news feeds to show them content that was either predominantly negative or positive, analyzing users' emotional responses by examining the verbiage and frequency of their own status updates.

Soon after, OKCupid admitted it had also experimented on users. To test users’ response to its match algorithm, OKCupid falsified its “match” data—pairs who were a low match (30%) were shown as a strong match (90%), and vice versa.

It’s no secret that Americans are becoming increasingly concerned about online privacy.

The TRUSTe Consumer Confidence Index 2014 showed that 90% of Americans are concerned about privacy in social media. Never has this been more evident than through the public’s response to the Facebook experiment—84% of users said they had lost trust in Facebook, and 66% considered deleting their Facebook accounts because of the experiment. Users said the experiment was using them as “lab rats.”

The responses show that users felt betrayed, that they felt used as pawns in psychological experiments drawn up to test the efficacy of the social media products themselves, that they had been lied to or given false information, and that the practices were unethical. The response revealed that users find this kind of experimentation unsettling and a serious breach of privacy.

The fury over the experiments is interesting because social media apps revolve around users voluntarily sharing information online. Many would argue that, by its nature, a social media platform is one where users should have the weakest expectation of privacy. Moreover, advertising companies have been using psychological studies to improve the efficacy and relevance of advertising for generations.

Lastly, the emotionally charged responses to the experiments do not typify what users say about privacy generally.

What makes social media experiments different, and what is responsible for the outrage?

A lot of discussion has centered on the lack of user consent and transparency, questioning whether the experiments were ethical. Sen. Mark Warner called for an FTC investigation, saying the experiment “invites questions about whether procedures should be in place for this type of research.” Several researchers, academics, lawyers, and media outlets have questioned whether the study complies with the APA’s ethical principles of psychological research.

These are all valid questions. Notice and consent are pillars of privacy — but I think that examining the public’s response shows that the issue is deeper. The outcry in response to the experiments indicates that users have two unique expectations of social media: heightened expectations of privacy and higher levels of trust.

These unique expectations can be traced to the nature of social networks

Social media is where we share personal details, thoughts, and images with people we know (or, in OKCupid’s case, would like to know). It’s where we go to catch up with friends and family. It’s where we share personal milestones. These are personal, sometimes intimate, details. Despite the semi-public nature of information on social media, users have a different, higher expectation of privacy when they are present.

The response to the experiments also suggests that users have a higher level of trust in social media. This makes sense because the environment is personalized, it is curated, and it is our own. We go to social media to interface with “familiar faces”—we choose who we share this information with. Social media is a sort of online “home.” Law and society have long recognized the home as a sacred place, and experiments that manipulate our “online homes” may feel like the most serious transgression.

This article was first published in MediaPost on 8/20/14

DAA Releases Technical Guidelines for Implementing AdChoices Icon in Mobile https://trustarc.com/resource/daa-releases-technical-guidelines-for-implementing-adchoices-icon-in-mobile/ Tue, 08 Apr 2014 16:31:00 +0000 https://trustarc.com/?post_type=resource&p=3057
Articles

DAA Releases Technical Guidelines for Implementing AdChoices Icon in Mobile

This week, the Digital Advertising Alliance (DAA) announced the first version of its DAA Ad Marker Guidelines for Mobile, covering how to comply with the enhanced notice requirements of the DAA Mobile Principles – The Application of Self-Regulatory Principles to the Mobile Environment. TrustArc played an active role in drafting the guidelines, working with other companies in the DAA Mobile Technical Working Group, and is credited as an author. We shared insights from real-world experience running TrustArc's implementation of the solution for our clients.

The Ad Marker Guidelines provide guidance to app developers, mobile web publishers, and third party ad networks on how to implement the AdChoices icon (Ad Marker) in both the mobile application and web environments.

Key Highlights from the DAA AdChoices Mobile Icon Guidelines Include:

  • Ad Marker (the DAA AdChoices icon) must include an invisible touchpad area of between 20×20 and 40×40 so that consumers can easily press the icon, access the enhanced notice, and exercise a preference.
  • A non-prescriptive corner default for the in-ad display of the Ad Marker. Companies will need to pay attention to any close event prescribed for the top right corner, such as in a video ad. Guidance around close events can be found in the IAB MRAID and video guidelines.
  • In-ad experience options giving companies multiple choices for the consumer experience when the consumer interacts with the AdChoices icon: 1) opening an interstitial that lets the consumer return to the ad (in case of mistakenly pressing the icon) or access a preference mechanism; 2) expanding the icon to display the full AdChoices text; or 3) taking the consumer directly to a preference mechanism or instructions for device-specific controls.
  • App developer implementation guidance illustrating how the Ad Marker should be included in an app's Settings menu.
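The touch-area requirement in the first bullet is easy to check mechanically. A minimal sketch, with one assumption flagged: the guidelines summary above gives no explicit unit for the 20×20 to 40×40 range, so pixels are assumed here for illustration.

```python
def touch_area_compliant(width: int, height: int) -> bool:
    """True if the invisible touch area falls within the 20x20 to 40x40
    range the guidelines describe (units assumed to be pixels here)."""
    return 20 <= width <= 40 and 20 <= height <= 40

print(touch_area_compliant(30, 30))  # True
print(touch_area_compliant(15, 30))  # False: too small to press reliably
print(touch_area_compliant(50, 50))  # False: exceeds the maximum
```

The lower bound guards usability (a target too small to tap), while the upper bound keeps the invisible area from swallowing taps meant for the ad or a close button.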

This first release of the guidelines is a big step towards ensuring consistency and standardization of the consumer experience when interacting with the AdChoices icon in both the desktop and mobile environments. At the same time, the guidelines address issues specific to the mobile environment to enable consumers to easily access and interact with the AdChoices icon and exercise their preference.

TrustArc Study Reveals Mobile Privacy is #2 Concern for Smartphone Users https://trustarc.com/resource/trustarc-study-mobile-privacy-concern/ Thu, 05 Sep 2013 16:40:00 +0000 https://trustarc.com/?post_type=resource&p=3065
Articles

TrustArc Study Reveals Mobile Privacy is #2 Concern for Smartphone Users

TrustArc Study Reveals Smartphone Users More Concerned About Mobile Privacy Than Brand or Screen Size

The smartphone and apps markets experienced explosive growth last year, to the extent that there are now more smartphones on the planet than people.

207 million were purchased worldwide in the final quarter of 2012 alone.

The complexity of the current mobile ecosystem raises new consumer privacy concerns

Mobile device users share information about their daily lives with many third parties – sometimes willingly, sometimes not.

While regulators in the US and Europe have moved to keep up with these issues and address consumer concerns, the question remains: How much do users understand who can access their information and how those third parties use it?

What personal information do consumers feel comfortable sharing, and how do they control their privacy?

The answers to these questions – and many more – are revealed in the latest TrustArc 2013 Consumer Data Privacy Study: Mobile Edition, which offers a detailed insight into current consumer opinion, business implications, and market trends.

Conducted by Harris Interactive among smartphone users in the US and UK between June 12 and June 19, 2013, the survey is part of an established research series by TrustArc.

The findings provide a valuable barometer on current consumer perceptions and mobile privacy trends by examining issues such as data collection, geo-location tracking, mobile advertising, and privacy management responsibility.

And, although the research findings in the US and UK were similar in many instances, they also reveal a number of significant differences of opinion.

Mobile Privacy Concerns

Privacy is, and remains, a concern among smartphone users on both sides of the Atlantic.

Despite the considerable investment in product and brand development made by mobile phone companies and app developers, smartphone users are more concerned about their privacy than the brand, camera, weight or screen size.

For 22% of US and 20% of UK users, privacy is their biggest concern when using mobile apps after battery life, with 78% in the US and 76% in Great Britain refusing to download an app they don't trust.

Smartphone users in the US and the UK are equally concerned about privacy issues when banking online – in the US 63% worry frequently or always and in the UK the figure is 54%.

Reluctance to Share Personal Information

The study reveals 43% of smartphone users in the US and 47% in the UK are not prepared to share any information about themselves with a company in exchange for a free or lower cost mobile app.

Unlike in the US, where 38% (up from 31% in 2012) are willing to share at least some information, in the UK the trend is reversed with the figure at 35% (down from 40% in 2012).

The numbers of US users prepared to share their age (44%), full name (31%), date of birth (19%), and web-surfing behavior (12%) have all increased.

But the figures remain static from last year for those in the UK willing to reveal their age (38%), full name (34%), and date of birth (19%), and they express a decreasing willingness to share web-surfing behavior (9%).

Interestingly, consumers in both countries are more protective of their contacts and photos than their home address, phone number or current location.

Low Awareness of Mobile Tracking

When it comes to tracking, 31% of US smartphone users are not aware that tracking takes place on a mobile device, a figure that rises dramatically across the pond to 46% in the UK.

Users in both countries dislike the idea of being tracked (69% in the US and 70% in the UK), considerably higher than on desktop, where 52% in the US and 47% in the UK express concerns about online behavioral advertising.

Smartphone users across both countries are actively involved in managing their mobile privacy concerns with 76% in the US and 69% in the UK stating they are ultimately responsible.

In addition, 40% of US and 37% of UK smartphone users check for an app privacy policy, which is then read by 35% of US users but only 27% of those in the UK.

In addition, more smartphone users in the US (29%) check to see if an app has a trust mark or seal than in the UK (17%).

With mobile privacy concerns running higher than ever, the business implications simply can't be ignored. If a user won't download an app or share location data, mobile commerce and technology innovation feel the impact.

It’s clear companies must address mobile privacy concerns by giving users what they want – more transparency and control over their privacy choices.

What’s Next for the NTIA Mobile App Transparency Code? https://trustarc.com/resource/whats-next-for-the-ntia-mobile-app-transparency-code/ Thu, 01 Aug 2013 22:13:00 +0000 https://trustarc.com/?post_type=resource&p=2118
Articles

What’s Next for the NTIA Mobile App Transparency Code?

The Evolution of Mobile App Transparency: NTIA’s Multistakeholder Journey

On July 12, 2012, the Department of Commerce’s NTIA division kicked off a Multistakeholder proceeding focused on deciding a standard for mobile app transparency – the format and elements of a mobile app privacy notice (or as we’ll refer to it, the NTIA code).

Sitting with the many other attendees in the cavernous Herbert Hoover Auditorium that day and observing the wide range of interests represented in the room, I was admittedly skeptical about whether this group could reach consensus on anything that would provide meaningful guidance to app developers.

Even for the most Pollyannaish of privacy heads, the possibility that representatives from government, industry and the advocacy community could actually sit down together (let alone decide on a mobile privacy standard together) seemed remote.

Navigating the NTIA Code: A Crucial Step Towards Privacy and Transparency

Fast forward a little over a year to July 25, 2013. At its 16th (and for now final) meeting, a majority of stakeholders voted to "freeze" a draft NTIA code and start testing it in the marketplace before finalizing it later this year. Issues remain with some of the draft code's provisions, around user comprehension of terms used in the code and how those terms should be laid out in a mobile notice.

For the majority of stakeholders however, the draft NTIA code is a win.

It's worth stepping back and thinking about what has been decided and agreed upon by the NTIA Multistakeholder group. For the first time, a broad coalition representing consumers and industry has agreed on some basic data elements that should be disclosed in mobile app notices (for the full story, the current version of the draft code is posted on the NTIA's site).

Mobile app developers who want to comply with the NTIA’s self-regulatory standard must notify users about whether they collect and share personal information – defined broadly to include data generated from a user’s activity on that device (browser and phone history), user uploaded files (contacts, photos) and sensitive data (health, financial, location).

Providing this type of information to consumers is important; TRUSTe’s research shows that 72% of smartphone users are more concerned about privacy than they were a year ago.

Having participated in and attended the NTIA meetings, it is clear that there are critical issues around implementation that remain open – but I also believe that these issues can be resolved by test driving different versions of an NTIA compliant format in the marketplace.

For instance, an outstanding issue that is key for many stakeholders, including TRUSTe, is whether an app developer should list all data elements (the "nutrition label" approach) or just the ones collected/shared by the app (the "ingredient" approach).

Clearly this particular issue can be resolved through usability testing – are users confused by a mobile app’s privacy notice that informs them about the entire universe of data collection that could be happening on their device?

In this regard, TRUSTe is working with ACT, the Innovators Network and companies like AT&T, Apple, Facebook, Microsoft and Verizon, to conduct a program of consumer and developer testing that determines the answers to the remaining open issues and ensures that an NTIA compliant notice effectively communicates with consumers.

In fact, ACT is already testing this version of an NTIA compliant notice with a few of its developers. The Future of Privacy Forum also worked on some UI mockups of an NTIA compliant notice, and you can view these here.

In the next few months, we hope to share the results of these consumer tests with you and roll out TRUSTe's own version of an NTIA compliant mobile short notice.

In the end, is the NTIA code a win for consumers and the app developer community? Absolutely.

The current draft of the NTIA code builds on the "Transparency" principle in the Obama Administration's Consumer Privacy Bill of Rights, which gives consumers the right to access "easily understandable information about privacy and security practices." The mobile notices being contemplated by the NTIA code will not only inform, but also educate consumers about the types of data being collected by a mobile application, and with whom that data is being shared. That's why testing will be such an integral part of this process.

The NTIA code will also provide much needed guidance to the app developer community, by establishing a self-regulatory standard that this community can build and improve upon. The fact that the NTIA code was developed through the Multistakeholder process gives it credibility with a wide range of audiences – academic, advocacy, and industry – all of whom actively contributed to and participated in the process that resulted in the current version of the NTIA code.

In closing, I thought I would provide a quick rundown on what’s currently required of app developers who want to provide consumers with an NTIA-compliant mobile short form notice.

The mobile app’s short form privacy policy should inform the consumer whether or not the app collects the following types of data:

  • Biometrics (information about your body, including fingerprints, facial recognition, signatures and/or voice print)
  • Browser History (a list of websites visited)
  • Phone or Text Log (a list of the calls or texts made or received)
  • Contacts (a list of contacts, social networking connections or their phone numbers, postal, email and text addresses)
  • Financial Info (credit, bank and consumer-specific financial information such as transaction data)
  • Health, Medical or Therapy Info (health claims and other information used to measure health or wellness)
  • Location (precise past or current location of where a user has gone)
  • User Uploaded Files (files stored on the device that contain your content, such as calendar, photos, text, or video)

The app's privacy policy must also inform consumers if the app shares the above-referenced data categories or personal data with third parties such as:

  • Ad Networks (companies that display ads to you through apps)
  • Carriers (companies that provide mobile connections)
  • Consumer Data Resellers (companies that sell consumer information to other companies for multiple purposes including offering products and services that may interest you)
  • Data Analytics Providers (companies that collect and analyze your data)
  • Government Entities (any sharing with the government except where required by law or expressly permitted in an emergency)
  • Operating Systems and Platforms (software companies that power your device, app stores, and companies that provide common tools and information for apps about app consumers)
  • Other Apps (other apps of companies that the consumer may not have a relationship with)
  • Social Networks (companies that connect individuals around common interests and facilitate sharing)
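The open "nutrition label vs. ingredient" question discussed earlier can be made concrete with a small sketch that renders the data-collection portion of a short-form notice from the categories above, in either style. The function and parameter names are invented for illustration, and the category strings are abbreviated.

```python
# Abbreviated versions of the NTIA data categories listed above.
NTIA_CATEGORIES = [
    "Biometrics", "Browser History", "Phone or Text Log", "Contacts",
    "Financial Info", "Health, Medical or Therapy Info", "Location",
    "User Uploaded Files",
]

def short_form_notice(collected: set, style: str = "ingredient") -> list:
    """Render the data-collection portion of a short-form notice.
    'ingredient' lists only what the app actually collects;
    'nutrition label' lists every category with an explicit flag."""
    if style == "ingredient":
        return [c for c in NTIA_CATEGORIES if c in collected]
    return [
        f"{c}: {'collected' if c in collected else 'not collected'}"
        for c in NTIA_CATEGORIES
    ]

app_collects = {"Location", "Contacts"}
print(short_form_notice(app_collects))  # ['Contacts', 'Location']
```

The trade-off the usability testing would probe is visible in the two return paths: the ingredient style is shorter and app-specific, while the nutrition-label style gives a fixed layout that lets consumers compare apps at a glance.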